New Tech! How I made PRO Level VFX in a BARN!

May 22, 2024
37,225 views

This video is an overview of Greenscreen Virtual Production, using Lightcraft Jetset, Blender, and Nuke.
This system also exports to Unreal Engine - and Live Link preview is coming soon.
This combination of software, along with practical FX and some tricky compositing techniques, proves that virtual production doesn't require expensive LED stages or millions of dollars. If you have an idea, you can make it happen.
Big thanks to the team at Lightcraft for sponsoring this video.
00:00 Introduction
00:32 Final Shot
00:50 Behind the Scenes Intro
01:34 Set Building & Scene Planning
03:13 Practical FX
05:05 Lightcraft Jetset Virtual Production
08:22 Conclusion
Download Jetset:
lightcraft.pro/
Want to make your VFX shots look amazing?
You need to learn compositing to get there.
Follow our Pro-Level Nuke Compositing Training Series:
www.compositingacademy.com/nu...
Need some smoke assets or VFX assets?:
www.compositingacademy.com/vf...
-------||| Alex's Gear (Paid Links):
Sony FX3:
amzn.to/3x4rI7U
FX3 Charger and Batteries:
amzn.to/43BpJ7w
amzn.to/3ISqqzv
Sony Lens (Zeiss Distagon 35mm):
amzn.to/3x9LAGV
Sony Lens 2 (50mm 1.4 GM Lens):
amzn.to/3vr10pH
--------
Ronin RS3 Gimbal:
amzn.to/3TRrYA8
RS3 SmallRig Attachments for Attaching Atomos to Rig:
SmallRig Adapter: amzn.to/3IQT2sX
SmallRig Attachment: amzn.to/3VEPwJI
SmallRig Arm to Hold Screen: amzn.to/3PAIEtl
-------------
Atomos Ninja V (for ProRes RAW out of FX3):
amzn.to/4csJQbZ
SmallRig HDMI (Atomos to FX3):
amzn.to/4csJRwz
Cable to ingest footage:
amzn.to/4cuI8a0
Ninja V Extra Batteries:
amzn.to/43zahIX
AtomX 1TB SSD for Ninja V (you'll need it for ProRes RAW):
amzn.to/4atAURW
-------
LIGHTS:
BIG LIGHT:
Amaran 200xS:
amzn.to/49b7Ptn
C-Stand to Hold Light:
amzn.to/3vrqqU2
Amaran 200xS softbox:
amzn.to/4aveYpA
SMALL LIGHTS (these are great):
amzn.to/43woxSG
-------
GREENSCREEN:
amzn.to/3PxLzTx
POP-UP GREENSCREEN:
amzn.to/3TQKFUJ
Tracking Markers (green duct tape):
amzn.to/3viN558
------------
MISC
Camera Bag:
amzn.to/4cwNjWT
Backup Storage Devices:
HDD: amzn.to/3TRUfWj
HDD Reader: amzn.to/3IUfdyo
Laser Measurement for Set Measurement:
amzn.to/3vtg4mQ
-------
VIRTUAL PRODUCTION ATTACHMENT LINKS COMING SOON.

Comments
  • Thanks for watching and sharing! Let me know what you guys thought about this as well - I'm curious to hear others' thoughts! If you want to learn compositing, check out our course below: www.compositingacademy.com/nuke-compositing-career-starter-bundle

    @CompositingAcademy • a month ago
  • this may be one of the best VFX breakdowns I've ever seen!! So awesome to see how you made this using so many different techniques and tools! BRAVO!!!!

    @TheDavidTurk • a month ago
  • This is really exciting and gives me hope at getting into compositing footage. I love that you can see your scene in real time, what an absolute game changer. Thank you for all of your work put into this video. The scene you created was so amazing and detailed as well, very impressive!

    @focusmedia2465 • a month ago
  • Great work on this breakdown, Alex. I'm so impressed by what you managed to achieve. It's an amazing system, isn't it?

    @JoshuaMKerr • a month ago
  • This is really great Alex, stuff we really don't learn even while working in VFX. Super proud and happy to see you from MPC and watching you here on KZhead. Keep making. Just making this made me fall in love with movies again. Thank you so much and good luck with the rest of the videos. Looking forward to them.

    @snehalkm • a month ago
  • I did the exact same setup last year. Connected my iPhone with a Live Link camera and attached it over my DSLR, so my DSLR shot live action while the iPhone captured Unreal footage. Combined together they gave an amazingly well-made product.

    @dakshpandya6559 • a month ago
  • Absolutely amazing!!

    @VFXCentral • a month ago
  • super awesome!! really nice seeing the BTS

    @NildoE • a month ago
  • This is really amazing! Thanks so much as always Alex for sharing such interesting and exciting VFX techniques 😊 Would definitely love to try out this virtual production workflow.

    @lucywallace • a month ago
  • Yea this was great, thanks so much for the thoughtful explanation of the entire process. Instant sub!

    @buddyfx7026 • a month ago
  • Amazing, thanks! Looking forward to more.

    @moisesdelcastillo6703 • a month ago
  • Thank you, amazing video and tutorials!

    @pietromaximoff4365 • a month ago
  • Leveling up!

    @hellaocd • a month ago
  • Nice work 👌👌

    @marktech2378 • a month ago
  • the shot works great. boom and bam.

    @JuJu-he8ck • a month ago
  • Nice work and vid, thanks!

    @2artb • a month ago
  • Amazing

    @Gireshahid33 • a month ago
  • Nice EmberGen info @ 4:00 👍

    @JasonKey3D • a month ago
  • Awesome! I'm actually trying to figure out a solution for a complicated green screen shot at the moment. Very inspiring. I subbed.

    @DEADIKATED • a month ago
  • Awesome

    @jimmahbee • a month ago
  • Fantastic! Manipulating light to our advantage in terms of saving cost and time is something rare these days!

    @NirmalVfxArtist • a month ago
  • For an aspiring VFX artist such as myself, this is really awesome content to learn from. Thanks Alex! 🙌

    @GabrielMendezC • a month ago
  • Well, this video has motivated me once again after all the burnout from the writers' strike and the extra pressure of working and learning side by side. So good to see that with such a small team and few assets we can create such a stunning shot.

    @whypee07 • a month ago
    • This feeling of 'I can do this' is what I most wanted to make happen when building Jetset. It means a lot to me that it gave you that!

      @eliotmack • a month ago
  • Looks cold in that barn.

    @VFXforfilm • a month ago
    • it was terrible, batteries kept dying, haha

      @CompositingAcademy • a month ago
  • Cool! It would be interesting to hear what difficulties and limitations your team encountered when using this pipeline.

    @ChronicleMove • a month ago
    • Detailed in a few of the other responses. Mainly, if you want to refine one of the real-time tracks there's a workflow they've developed, and it ended up working pretty well. That was probably the biggest thing we worked together to figure out, but they're tooling up a gizmo that essentially does a refinement workflow in either Nuke or SynthEyes.

      Another hurdle was figuring out the lighting - the app can load in a model but not lighting (they're adding this feature very soon though, if you have a workstation on set). Mainly I just screenshotted the Blender scenes from various angles that I knew I would be shooting, and moved the lights around accordingly. We were in a barn in some cold temperatures so I wouldn't bring a workstation there - but I can imagine this workflow will be insane when you have Unreal Engine live-linked into your viewfinder. They're also adding the ability to stream an Eevee viewport into the phone if you want to use Blender instead of Unreal. It gives you a great idea of the lighting and how to match it for the composite.

      Another factor is that you need enough light for the iPhone to see features and stay stable, so I'd imagine this wouldn't work in pure darkness - but even this shoot was back-lit, so I think you just need to plan accordingly. I was already stretching it here and it worked.

      They have some other features they're updating as well. Currently there's a feature called 'infinite greenscreen' which essentially garbage-mattes out the edges that aren't greenscreen. It uses auto-detection, but on an uneven greenscreen it didn't work as well, so they're going to change the approach to just snapping corners and then garbage-matting the outside away. This is nice to have, but I still had no problem shooting the scene since probably 80% of it was in the greenscreen area.

      For orienting the scene they also have a printed marker you can use. This is really useful for flipping the scene around etc. without having to position the origin by hand in the app. Basically you just aim the app at a piece of paper with a QR code, and it orients the scene to that marker.

      @CompositingAcademy • a month ago
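
As an aside on the 'infinite greenscreen' corner-snapping idea described in the reply above: given four user-picked corners of an uneven greenscreen, a filled polygon keeps the screen area and garbage-mattes everything outside. Here is a minimal sketch using OpenCV - the function name and corner-picking input are assumptions for illustration, not Lightcraft's implementation:

```python
import numpy as np
import cv2  # opencv-python

def corner_garbage_matte(height: int, width: int,
                         corners_xy: np.ndarray) -> np.ndarray:
    """Keep only the quad snapped to the greenscreen corners.

    corners_xy: (4, 2) pixel (x, y) corners in order around the quad.
    Returns a uint8 matte: 255 inside the screen area, 0 outside
    (the 'garbage' to discard). Toy illustration, not Jetset's code.
    """
    matte = np.zeros((height, width), dtype=np.uint8)
    cv2.fillPoly(matte, [corners_xy.astype(np.int32)], 255)
    return matte

# Four corners of an uneven screen in a 1080p frame, clockwise.
corners = np.array([[210, 130], [1690, 95], [1745, 975], [185, 1010]])
matte = corner_garbage_matte(1080, 1920, corners)  # 255 = keep, 0 = garbage
```
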
  • Game is changing! GREAT job Alex

    @malcolmfrench • a month ago
  • Epic

    @SeanAriez • a month ago
  • Really great advice for people who have a friend who owns an ABANDONED BARN. Lack of physical space is the toughest obstacle to my plans. Can't even use my garage, because it's full of someone else's stuff. I'll call you back when I find my barn.

    @johnwoods9380 • a month ago
    • When there's a will, there's a way! Worst case, hang up a greenscreen outside somewhere on a non-windy day and shoot at night. There's always a workaround - the barn was lucky, but we didn't even plan to use it originally. It was also quite cold and batteries died a lot, so workarounds come with their own problems. My belief, though, is that constraints create creativity.

      @CompositingAcademy • a month ago
  • This is exactly what I need. Now if you have Unreal Engine tutorials, I'm subbing lol

    @SHVWNCOOPER • a month ago
    • Later this year, Unreal will come into the picture.

      @CompositingAcademy • a month ago
  • Great video! The final shots look awesome! Love that the toxic goo was practical. How solid was the tracking data out of JetSet? Did you have to clean up or re-track or was it sufficiently accurate?

    @iAmBradfordHill • a month ago
    • The tracks are pretty good. For the hips-up, out-of-focus shot, I ended up just using the real-time track out of the box. For the one where the feet are really prominently featured, I wanted sub-pixel accuracy and wanted to refine it.

      I worked with them to figure out a workflow that essentially "snaps" (orients/scales) any post track you do to the real-time camera. When you do a Nuke track normally, it's not at real-world scale and it's not oriented to your CG set at all - so the "refined" track workflow is: do your track in post, then hit a Python script button to "snap" that camera to the real-time camera, where we know the scale and orientation are good in world space. They're working on a Nuke (or SynthEyes) gizmo to wrap that workflow up, but it worked really well.

      Orienting one camera is one thing, but once you start having sequences this is a big time saver. Additionally, you'll probably have some shots where the real-time track works as-is, so you can literally just start rendering / compositing.

      @CompositingAcademy • a month ago
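
For readers curious what the "snap" step in the reply above amounts to mathematically: given per-frame camera positions from the post track and from the real-time track, a least-squares similarity transform (Umeyama's method) recovers the scale, rotation, and translation that move the post camera into world space. This is a hedged sketch of the idea, not Lightcraft's actual gizmo or script; all names here are hypothetical:

```python
import numpy as np

def snap_to_realtime(post_pos: np.ndarray, live_pos: np.ndarray):
    """Solve live ~= s * R @ post + t in the least-squares sense (Umeyama).

    post_pos, live_pos: (N, 3) per-frame camera positions from the post
    track and the real-time track. Returns scale s, rotation R (3x3),
    and translation t (3,). Sketch only; names are hypothetical.
    """
    n = len(post_pos)
    mu_p, mu_l = post_pos.mean(axis=0), live_pos.mean(axis=0)
    P, L = post_pos - mu_p, live_pos - mu_l
    # SVD of the cross-covariance gives the optimal rotation.
    U, S, Vt = np.linalg.svd(L.T @ P / n)
    D = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        D[2, 2] = -1.0                      # avoid a mirrored solution
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) * n / (P ** 2).sum()   # world scale
    t = mu_l - s * R @ mu_p
    return s, R, t

# Usage: snap a refined post track into world space in one step.
# s, R, t = snap_to_realtime(post_positions, live_positions)
# snapped_positions = s * post_positions @ R.T + t
```
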
    • @CompositingAcademy Thanks for the insight into this! That's great to hear that you could use the real-time tracking for several shots. I was curious about the idea of somehow using the real-time track to refine or orient the post track. When you mentioned scanning the set to have a model of it in post, my mind went to the SynthEyes tool that uses photogrammetry to make a model to improve tracking. Sounds like this workflow is something similar. Very cool! I can't wait to use this app and workflow myself. Hoping to shoot a project with it sometime this year.

      @iAmBradfordHill • a month ago
    • Very similar to that! It's especially useful if you have more takes. Imagine you had 10 camera angles pointing at a CG scene from different positions - aligning and scaling all of those would normally be a painstaking process. Interestingly, they also export some extra data for post workflows, like an AI matte that you can use to garbage out the post tracker if you want to (basically to roto out the moving person, etc.). What's also interesting is that this can be used for CG filmmaking, or for CG objects placed into real scenes. I didn't go into it on this project, but those are also possible here.

      @CompositingAcademy • a month ago
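
As a concrete picture of how such an exported AI matte could gate a post tracker, here is a small sketch that drops candidate 2D features landing on the matted-out actor before solving. The matte format and function names are assumptions for illustration; the actual export format isn't specified here:

```python
import numpy as np

def drop_matted_features(features_xy: np.ndarray,
                         holdout: np.ndarray) -> np.ndarray:
    """Remove 2D tracking features that fall on a holdout matte.

    features_xy: (N, 2) pixel coordinates (x, y) of candidate features.
    holdout: (H, W) mask, nonzero where the matte flags the moving actor.
    Format and names are assumptions for illustration only.
    """
    h, w = holdout.shape
    xs = np.clip(np.rint(features_xy[:, 0]).astype(int), 0, w - 1)
    ys = np.clip(np.rint(features_xy[:, 1]).astype(int), 0, h - 1)
    keep = holdout[ys, xs] == 0      # keep only features off the actor
    return features_xy[keep]

# Matte marks a square where the actor stands in a 1080p frame.
matte = np.zeros((1080, 1920), dtype=np.uint8)
matte[400:900, 800:1200] = 255
pts = np.array([[100.0, 100.0], [1000.0, 600.0], [1800.0, 950.0]])
print(drop_matted_features(pts, matte))  # the middle feature is dropped
```
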
    • @iAmBradfordHill That's exactly the method we're setting up for Nuke and SynthEyes. For Nuke we can re-orient a solved track to match the original scale & location of the Jetset track, and solve the usual alignment & scale problems encountered with single-camera post tracking. SynthEyes will be extremely interesting, as we can 'seed' the track and then tell SynthEyes to refine it with a variety of tools (soft and hard axis locks, survey points to 3D geo, etc.). The Jetset live tracks are good enough that we want to use them as a base for a final subpixel solve when the shot demands it.

      @eliotmack • a month ago
    • @eliotmack That sounds like the best workflow to me. Take your live track and refine it, instead of having to start all over. All that data is invaluable; even if it isn't sub-pixel, I would think it has to be helpful when refining to get sub-pixel accuracy. I'll keep an eye out for that SynthEyes update. I really want to get out and play with Jetset myself!

      @iAmBradfordHill • a month ago
  • WOW, the revolution has begun.

    @otisfuse • a month ago
  • ❤👌💯🔥 please more #UnrealEngine #VirtualProduction

    @violentpixelation5486 • a month ago
  • This is really cool. Only thing holding me back from being able to do this is not having a huge barn to shoot in.

    @HeckuleStudios • a month ago
    • Outdoors is a good workaround too! Hang up a greenscreen on the back of a garage or any wall, and shoot at night if you need to control the lighting.

      @CompositingAcademy • a month ago
  • Fuck yes Alex! Great work.

    @trizvfx • a month ago
  • Great workflow, quick question: why are green tracking points used on this green screen? Isn't it better to use a different colour, or is it because of this workflow?

    @AlejandroGarciaMontionargonaut • 27 days ago
    • In this case the green markers are used because if the character passes in front of them, they can still easily be keyed out - but at the same time there's enough contrast to track the pattern. I believe the phone app also tracks better if there are features - but mainly I put them there in case I wanted to track one of the shots in Nuke afterwards with a more refined track. In darker lighting conditions sometimes pink tape is used, because it's very bright and creates a lot of contrast against the green. However, pink isn't keyable, so if an actor walks past it you'll have to paint or rotoscope out the marker.

      @CompositingAcademy • 27 days ago
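
To make the marker-color reasoning above concrete, a toy screen-difference key shows why green tape vanishes in the matte while pink tape would survive as a hole needing paint or roto. This is a deliberately crude key for illustration; production keyers are far more sophisticated:

```python
import numpy as np

def toy_green_matte(rgb: np.ndarray) -> np.ndarray:
    """Crude screen-difference key: alpha drops wherever green clearly
    dominates red and blue. rgb: float image in [0, 1], shape (H, W, 3).
    Illustration only, not a production keyer.
    """
    g = rgb[..., 1]
    rb = np.maximum(rgb[..., 0], rgb[..., 2])
    screen = np.clip(3.0 * (g - rb), 0.0, 1.0)  # 1 = screen, 0 = subject
    return 1.0 - screen                         # foreground alpha

# Screen, green tape marker, pink tape marker (R, G, B):
pixels = np.array([[[0.10, 0.70, 0.10],    # screen      -> alpha 0
                    [0.15, 0.55, 0.15],    # green tape  -> alpha 0 (keys out)
                    [0.90, 0.30, 0.50]]])  # pink tape   -> alpha 1 (needs roto)
print(toy_green_matte(pixels))             # [[0. 0. 1.]]
```
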
  • Which Accsoon SeeMo can I use?

    @themightyflog • an hour ago
  • I swear this tech didn't exist a couple of months ago - I was trawling the internet for solutions and couldn't find anything. Ended up having to compile 3 After Effects camera tracks across one 2-minute clip :I

    @tomcattermole1844 • a month ago
    • Oh yeah, for long camera tracks this is going to be really interesting. I didn't even think about that. It was also interesting that I could use the real-time track on an out-of-focus background shot; those shots are usually much harder to track.

      @CompositingAcademy • a month ago
    • You're correct! We only introduced this in February at the British Society of Cinematographers' show, so it's very new. We like long camera tracks!

      @eliotmack • a month ago
    • @eliotmack Kudos. I was scratching my head trying to figure out what piece of the puzzle was missing to make something like this possible, because it felt like all the hardware you'd need can be found in a modern phone anyway. As someone who wants to push concepts as much as possible with smaller crews/budgets, this is going to be nothing short of a life saver.

      @tomcattermole1844 • a month ago
    • @tomcattermole1844 I'm actually working on a grad project right now where I'm going to have to do a lot of camera tracking. Do you have any pointers on how I could use this method with just the iPhone? Or should I try to get a cine camera as well 😅

      @AstroMelody_TV • a month ago
  • Are courses n101 - 104 currently discounted? I really want to buy this course 😌

    @JungNguyen09 • a month ago
  • How accurate is the track output of Jetset? Would you have to do a normal 3D track in post still or can you use the app's track for final pixel?

    @LFPAnimations • a month ago
    • The tracks are pretty good. For the hips-up, out-of-focus shot, I ended up just using the real-time track out of the box. For the one where the feet are really prominently featured, I wanted sub-pixel accuracy and wanted to refine it.

      I worked with them to figure out a workflow that essentially "snaps" (orients/scales) any post track you do to the real-time camera. When you do a Nuke track normally, it's not at real-world scale and it's not oriented to your CG set at all - so the "refined" track workflow is: do your track in post, then hit a Python script button to "snap" that camera to the real-time camera, where we know the scale and orientation are good in world space. They're working on a Nuke (or SynthEyes) gizmo to wrap that workflow up, but it worked really well.

      Orienting one camera is one thing, but once you start having sequences this is a big time saver. Additionally, you'll probably have some shots where the real-time track works as-is, so you can literally just start rendering / compositing.

      @CompositingAcademy • a month ago
  • It looks like you were using a gimbal with the camera... did that cause any problems with Lightcraft Jetset? Could you use it with a DJI Ronin?

    @WhereInTheWorldIsGinaVee • a month ago
    • Nope, no problem - this combination was awesome! I balanced the gimbal with the iPhone and attachment on top. This setup works great with the gimbal. This is with the Ronin RS3. You basically check the iPhone to see your CG, but your final image comes out of the Ninja V. Obviously it's two different cameras so the view is slightly different, but Jetset gives you a super clear idea of how you're framed up against your CG with your actor.

      @CompositingAcademy • a month ago
  • Is the ProRes RAW any good? Does it keep more data that is useful for VFX?

    @rossdanielart • a month ago
    • Super useful for keying, and it also gives more flexibility with grading. You basically transcode the ProRes RAW into ProRes 4444 or directly into EXRs; it really helps. It's also just less compressed overall, so everything has really crisp detail.

      @CompositingAcademy • a month ago
  • Ian Hubert has been doing things like this for over a decade

    @ninjanolan6328 • a month ago
  • How accurate/usable is the track that you get from this workflow? Is it something just good enough for look dev, or do you find it good enough / comparable to a track you might be able to solve out of something like SynthEyes?

    @ryanansen • a month ago
    • From the tests I did, sometimes it was good enough for a final track; in other cases I wanted to refine it, especially when the feet were very prominent. They have a refined-tracking workflow if you want to use SynthEyes or Nuke - it snaps the post track to the real-time track. In this way it saves you time on orienting / scaling / positioning the new camera in world space.

      @CompositingAcademy • a month ago
  • Which Accsoon SeeMo can be used? I see a regular, a Pro, and a 4K version.

    @StudioWerkz • a month ago
    • All of them. The standard SeeMo is HDMI and the Pro is both HDMI and SDI, but they work the same.

      @CompositingAcademy • a month ago
  • VFX supervisor here. Nice video and nice concepts, but in no way is this called virtual production (an outdated term for volume production). The whole point is to avoid the green screen, as you can see by those 2 unjustified specular lights on the white bucket the actor is carrying. This is just traditional green screen work with on-set tracking data. You need the LED panels to create your light environment and volume correctly.

    @unspecialist • 27 days ago
    • I would disagree that this isn't virtual production - traditional greenscreen doesn't allow you to see what you're filming. Personally I think it helps clients understand what we're doing (shooting, and seeing the result). This app / company also markets itself with that term for that reason. I don't think LED volume companies can claim the term virtual production - although they would like to, and have poured millions into doing so.

      Also, sure, there might be a spec highlight, but these shots are impossible to do on an LED stage without compositing. There are foreground elements, a replaced floor, rack focus, a virtual camera-move extension, etc. This is why I chose this environment: it plays to the strengths of fully CG scenes in a contained space, while also costing 100x less, with arguably a better result than an LED volume.

      Personally I think LED stages are a cross-over technology to something much better - most likely virtual production seen through VR headsets on set, while the greenscreen (if you'll even need one) is replaced live or in pre-vis. Smaller, more cost-effective panels would be interesting for reflections, definitely. I think that's cool. But from a first-principles and even physics standpoint, there are a lot of limitations of LED virtual production that aren't often discussed honestly.

      Also, you can get a lot of realistic lighting without needing LED panels, which has been done for years; the only time you need LED panels is if you have many obviously reflective objects. That includes greenscreen outside - you can't get realistic direct sunlight from LED panels. The best mix is actually greenscreen projected on an LED stage with a virtual environment around the edges, but this still makes your cost ridiculously high for an arguably diminishing return, unless you're filming chrome characters.

      @CompositingAcademy • 27 days ago
  • Better than LED screens, I must say.

    @DannyshRosli • a month ago
    • It all depends on the lighting - you can get really bad results on greenscreen if it's lit wrong too. I think LED stages could be good for very reflective scenes or characters (like the Mandalorian - he's chrome), but they're limiting (and very expensive) in a number of ways that aren't often discussed.

      @CompositingAcademy • a month ago
  • Any tutorials on Jetset?

    @themightyflog • a month ago
    • Possibly! If more people are asking for it, it's something I might do.

      @CompositingAcademy • a month ago
    • @CompositingAcademy I would love to see your lighting tutorials. For real, everyone but you seems like they're still stuck on green screen.

      @themightyflog • a month ago
    • Good idea. I'll probably talk more about that in upcoming tutorials on these shots. Lighting & compositing is 100% the reason people aren't getting the results they want. It's also the reason a lot of CG environments look super video-gamey - people mainly don't know how to control contrast & light.

      @CompositingAcademy • a month ago
    • @CompositingAcademy A tutorial on that would be amazing. It's the one setback I have when doing anything virtual. Even in UE with Megascans it still looks like a video game, plus my compositing skills need some work.

      @orbytal1758 • a month ago
  • @CompositingAcademy How can I become a VFX artist and a compositing artist? Can you make a video on that?

    @aidenzacharywessley3808 • a month ago
    • Hey Aiden, the best way is to learn the fundamentals and build a demo reel to prove to employers that you have the skills. If you're interested, we've built a really good path for beginners who want to go professional. There are a bunch of projects included, and the footage can be used for reels as well: www.compositingacademy.com/nuke-compositing-career-starter-bundle All the best!

      @CompositingAcademy • a month ago
  • I have an FX3 too, but I use it with a PortKeys LH5P II and a Zhiyun Crane 4 - will it work? Do I absolutely need ProRes RAW to do virtual production like that? And can I do it with Unreal Engine and DaVinci Resolve?

    @roxanehamel1753 • a month ago
    • You don't necessarily need ProRes RAW. It helps with getting super clean keys and small details, but if you use one of the better codecs for internal recording on the FX3, those will still work pretty well - just make sure to shoot 10-bit. You can use Unreal as well. Right now you can import any geometry, but they're also building a live link to see your Unreal scene live if you hook up wired to a workstation. Either way you choose to work, you can get the camera data + plate data into Unreal afterwards and it will align with your scene.

      @CompositingAcademy • a month ago
    • @CompositingAcademy Thanks for the info. And you're using an iPhone - can a Google Pixel 8 work too?

      @roxanehamel1753 • a month ago
    • @roxanehamel1753 Right now Jetset uses iOS devices, but even a used iPhone 12/13/14 Pro will work great. The onboard LiDAR improves the 3D tracking, and makes 3D scene scanning possible.

      @eliotmack • a month ago
  • This is all impressive until the client says the barn limits their idea and they want to shoot something in Times Square lol

    @SortOfEggish • a month ago
    • Yeah, you would need a bigger stage. This barn space is equivalent to a smaller greenscreen stage - it's about the same size as a few you can rent. This workflow would still work on a wrap-around stage, though, if you need to pan more.

      @CompositingAcademy • a month ago
  • Is the Accsoon SeeMo or the Accsoon SeeMo Pro good to use?

    @santhirabalasinthujan9170 • a month ago
  • You iPhone people are so annoying, but you sure get all the cool toys these days 😅

    @lFunGuyl • 11 days ago
  • Don't know. It's subscription... and I am cheap LOL

    @keithtam8859 • a month ago
  • How much are you earning with this setup?

    @travelstories2529 • a month ago
  • iPhone is trash... is there Android software? And I wish UE would add MP4 support... 😢

    @kanohane • a month ago
  • Meh. I mean, cool results, but I don't see the value-add of Jetset.

    @jinchoung • a month ago
    • It helps a lot when you're filming - framing up to things that don't exist is pretty unique. I used to do a lot of photography and I liked moving around and finding interesting angles; you can't do that traditionally, which is why I think a lot of greenscreen work in the past is only background / distant stuff.

      @CompositingAcademy • a month ago
  • Dude shoots GS and thinks this is VP. It's well done, but not correct...

    @HTOP1982 • a month ago
    • The term "virtual production" came well before LED walls. But now LED walls has "taken over" the term virtual production. We're not ready to give up that fight. We think the Lightcraft Jetset approach is going to be so accessible to a much wider audience, that in the fullness of time, "virtual production" will go back to its original meaning which is any way to combine live action with a synthetic background.

      @billwarner4641@billwarner4641Ай бұрын