Virtual Production Project Tutorial from scratch with Unreal Engine and RETracker Bliss!

16,935 views

Greg Corson

1 day ago

This tutorial starts from scratch and builds a project that replicates my "Neo's apartment from the Matrix" demo. I'm using Unreal Engine 5 and the new RETracker Bliss as my camera tracker. This setup will also work with other camera trackers, or with no camera tracking at all if you want.
To buy RETracker Bliss visit www.retracker.co/
With Unreal Engine 5 the whole process takes about an hour; once you have some experience you can do it even quicker! Use the table of contents below to skip around.
If you have an old graphics card that does not support Lumen and Nanite in UE5, this project will still work; just don't enable them in the project settings.
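For reference, here is a minimal sketch of what disabling Lumen can look like in Config/DefaultEngine.ini. This is an illustration, not a file from the video; the console variables are standard UE5 renderer settings, but verify them against your engine version:

[/Script/Engine.RendererSettings]
; 0 = no dynamic global illumination (Lumen GI off)
r.DynamicGlobalIlluminationMethod=0
; 2 = screen-space reflections instead of Lumen reflections
r.ReflectionMethod=2

Nanite is enabled per static mesh rather than by a single project switch, so on older hardware you can simply leave your meshes as regular static meshes.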
00:00 Intro, what we will be doing
00:26 RETracker Bliss offer and disclosure
01:21 Getting Started using Film & Video Template
02:10 Add plugins to the project
03:00 Enable important plugins
04:03 Set project settings
05:30 Build a basic scene
08:35 Center Stage, install ARUCO assets, mark center stage
10:57 Hook up your camera
14:17 Setup Composure Composite
15:35 Add a camera
17:00 Finish building composite
21:44 Setup chroma key
22:57 Camera Calibration Introduction
24:23 Setup FIZ blueprint for calibration
28:20 Adding a LiveLink tracker
31:19 Connect LiveLink to our CineCamera Actor
33:00 Print calibration board and Aruco
35:16 Create calibration board actor in Unreal
36:40 Create and setup Lens File
40:45 Setup Lens FIZ curves
42:26 Shoot calibration images
45:00 Checking Calibration Results
47:30 If you need to do calibration again
47:47 Add tracker offset (nodal point)
49:15 Set Worldpose
50:45 Preview the final composite
51:20 Removing the "bulls-eyes" on the screen
51:45 Clean up the chromakey (green screen)
53:24 View the Final Composite
53:40 Set tracking delay
55:28 Timed Data Monitor
56:32 Importing the Neo's apartment set
58:37 Workaround for a bug in Unreal 5.0.3
1:01:10 Setup cg_element to render the set
1:02:05 Set worldpose for this set
1:03:49 Check out the final result
1:04:15 Tuning the lighting and layout for your studio
1:04:38 RETracker Bliss Discount Offer
1:06:44 Tutorial Summary: So Much Faster!
1:07:15 Support the channel, Like and Subscribe
FREE Software you need
Unreal Engine www.unrealengine.com/
Resources for Unreal (Aruco tag and calibration board to print): download Aruco.Asset.Pack.zip from github.com/MiloMindbender/Liv...
Live Link Plugin for RETracker Bliss (ONLY needed if you have Bliss hardware)
github.com/MiloMindbender/Liv...
#virtualproduction #unrealengine #RETrackerBliss

Comments: 196
@howtotaiwan1384 (1 year ago)
Man, that's so cool that you are spending the time to share this!!
@themidstream (1 year ago)
Thank you for sharing immense value.
@DouglasDoNascimento (1 year ago)
Great tutorial. Thank you.
@mariorodriguez8627 (1 year ago)
Great work!! Thank you!
@shudheerji (1 year ago)
Thanks a lot for the detailed tutorial.
@tomreiner8676 (1 year ago)
Very interesting! Recently I experimented a bit with nDisplay, which was very fun.
@Creating_Space (1 year ago)
thank you thank you thank you. Perfect timing as I just hung my green screen this weekend.
@GregCorson (1 year ago)
Let me know how you get on and if you have any questions. Also if you look in the older posts on my channel there are some nice tricks for hanging green screens using magnets or "contractor poles" for a footprint that's much easier to deal with than the big tripod stands.
@G.I_BRO_SHOW (2 months ago)
Bro you the best, this video should have a million likes, keep up the great work. P.S. I subbed.
@JWPanimation (2 months ago)
Thanks!
@JamesCardonatweekaholic (11 months ago)
Thank you!!
@SveinWroldsen (1 year ago)
Thank you, Greg. This is a very helpful tutorial. I wish someone had tested the HTC Vive against REtracker Bliss. I also wonder why Unreal needs to make all this so complicated. They are so clever in everything they do. Why not have a ready-made template? Someone might say they already have one. Well, kind of. It only gets you part of the way. So keep doing this, Greg!
@GregCorson (1 year ago)
Unreal does have some templates, but it is hard to make them for Virtual Production because everybody has different hardware, cameras and trackers. I had one for 4.27, but most people had to make a lot of changes before they could use it, so for 5.0 I thought a start-to-finish tutorial would be better. It takes almost the same amount of time to build the setup from scratch as it would to modify a template, unless you are just lucky and the template matches your hardware exactly. What takes the most time is getting all the hardware connected and your lens distortion/nodal offset calibrated; even with a template you still have to do that. I have used both Bliss and Vive; the Bliss is much better as a camera tracker and is easier to set up than the Vive. Vive also has problems with jitter whenever the camera is stationary or moving slowly. The Vive is nice if you want to track multiple objects, though. I have attached virtual objects to Vive trackers so the presenter can pick up the tracker in the studio and on-screen it looks like they are holding the virtual object.
@lukastemberger (9 months ago)
@@GregCorson This is awesome, thank you! Have you tried Unreal's Live Link VCam app as the source of tracking data?
@GregCorson (8 months ago)
Technically, you may be able to do this, but last time I checked the setup wasn't exactly straightforward. They have made a number of updates to VCam since UE5, so maybe it is easier now, but I haven't tried it. You would still have to have a camera sending video to UE, as I don't think the VCam app does that on its own.
@hafizmtariq (1 year ago)
Great job. Subscribed and liked.
@SocailInteruption (1 month ago)
Great content and delivery. Thank you for all these videos; they make me feel supported enough to bring VP into my work. I've been considering whether it's a way to bring augmented reality into a live theatre production (think a Dogville-type set) where the audience can move a screen/camera around to uncover the set. Just wondered if you had any thoughts on that at all, and insight into how latency might affect this if the audience is viewing the real and filmed actor via Unreal. Thanks again.
@ArcanePacific (1 year ago)
Thank you Greg! It seems now we have to add a "Lens" component to the cine camera. The lens file now looks for a Lens component. (In the lower left corner of the lens file window, it used to say "Selected LiveLink Subject" and now says "Lens Component".) Without adding this "Lens" component in the details panel of the cine camera, it would not allow me to do any calibration. I wonder if that is a recent update to Unreal... So basically I have added 3 components: 2 LiveLink controllers plus a Lens component.
@GregCorson (1 year ago)
Yes, in 5.1 they added this "Lens" component; you have to add it to get distortion to work right.
@sinanarts (1 year ago)
You have no idea how useful this post is for us 5.1 users 🙂 Thank you so much.
@shudheerji (1 year ago)
@@GregCorson Please do a small tutorial for 5.1, because up to the lens file (lens calibration) everything is perfect.
@sinanarts (1 year ago)
@@GregCorson Greg, I did all the steps in 5.0, and after completing a successful test and calibration I ran 5.1 and opened a copy of the same project in it. The Lens component was auto-added to the CineCamera by itself. As it is, is there anything I need to input in there? It seems to work; just double-checking with you.
@catchytrouble (1 year ago)
So what should be done about that? I am stuck at the calibration step and my lens file shows "the tracked camera does not have a lens component or the lens file assigned in that lens component does not match this one" after connecting it to FIZ.
@arjungowda6452 (1 year ago)
Thank y ❤️u
@WatermanSurin (10 months ago)
Hey Greg, when using a prime lens, would you set up a particular kind of curve for focal length (i.e. just one value)? Or would you still leave it blank like you have done here?
@GregCorson (9 months ago)
For prime/non-zoom lenses, I believe in this tutorial I just set up a "curve" that is a constant value like 35mm or whatever the lens is. You should also check if your lens "breathes" (changes FOV when you adjust focus). A good prime won't do this, but some do. If yours does, you may want to calibrate at several different focus settings.
@martin_minds (1 year ago)
Thanks sooo much. How could anybody set this up without having seen your video ;-) Are you happy with your Bliss tracker? All the best!
@GregCorson (1 year ago)
The bliss is working very well for me.
@anfproduction72 (1 year ago)
cool
@think.feel.travel (1 year ago)
Hi Greg, this is great. Did you also try or have some knowledge about the Vive Mars system for virtual production? In your opinion, which one is better in terms of accuracy and less jitter? Thank you.
@GregCorson (1 year ago)
Sorry but I have not tried the Vive Mars system so I can't report on it.
@otegadamagic (1 year ago)
This is awesome and very easy to follow along with. Can you do one for those using only the RealSense T265?
@GregCorson (1 year ago)
I don't have a realsense T265 setup working, but the project would be very similar once you have tracking information coming into Unreal. Nothing in this tutorial is dependent on the type of tracking system you use. If your tracking comes in through livelink, the setup would be almost exactly as shown here. If the tracking comes in some other way, you just need to be able to apply the tracking to your camera object, then the rest of the process should be the same.
@otegadamagic (1 year ago)
@@GregCorson ok then. Thanks so much
@boonboon8007 (1 year ago)
Great tutorial! It would be great if you could do a setup tutorial with Aximmetry. Aximmetry has a 'free' community version with a small watermark.
@GregCorson (1 year ago)
I'm currently working only with Unreal, I know Aximmetry has a better keyer and some other nice features, but I simply don't have enough free time to split my attention between two different platforms (not to mention all the others out there for VP). I'm busy enough trying to keep up with just Unreal! ;-)
@boonboon8007 (1 year ago)
@@GregCorson Understood. Thank you for all the tutorials. Much appreciated.
@Cronogeo (19 days ago)
Hello! I'm following your tutorial but using UE 5.4, and I have some trouble with the LiveLink component FIZ. In my case the Camera role doesn't detect a single controller. I'm aware that I need to add a Lens component to the camera, but it doesn't work. The live link does work, though; the camera moves synchronised with my phone, so I'm not sure what I did wrong.
@crisislab (1 year ago)
I think the link for the Aruco asset is wrong (it's the same link as the Bliss plugin). Do you know where I can find that same Aruco tag asset to follow along?
@GregCorson (1 year ago)
Hi, sorry for the confusion; that github link points at the releases page for my plugin, and the file you are looking for is down a bit in the "release 2" section. However, I have had some problems with distributing Unreal assets that way because of the constant changes Epic has been making to Unreal 5. So what I would recommend is that you build the aruco asset yourself in whatever version of Unreal you are using. I show how to do this in this video: kzfaq.info/get/bejne/gJigoc5qnOCtdJs.html. It only takes a few minutes to do. If you can't get the "calibration points" to appear, make sure the camera calibration plugin (included with Unreal) is enabled, then they should work.
@ahmedhamed8116 (1 year ago)
Thank you Greg. I've been learning a lot from your videos. I'm using the Elgato Cam Link 4K and I can't use the media bundle option. The only way I can do it is through the media player, and when I insert the FIZ data it doesn't seem to work. I'm also using the Rift S controller as a tracker. Any help would be appreciated.
@benz.1730 (1 year ago)
I had the Cam Link 4K too, and it seemed not to work properly with UE. I was recommended the Blackmagic DeckLink 8K Pro, which worked fine and is officially supported by UE. It's of course much more expensive. Maybe there are other cards, but I don't know which.
@neilcarman4002 (1 year ago)
Greg, great video, thank you. We attempted this in 5.1.1, but when we add two LiveLinkComponentControllers, the one for Virtual FIZ cancels out the other one that is supposed to bring in live positional/transform data! It has something to do with the fact that Epic split off the lens file part of the LiveLinkComponent into its own "Lens" component. We had to revert to 5.0.3, since your workflow here does indeed work for that version. I wonder what you're able to find out.
@GregCorson (1 year ago)
Not really sure why you are having trouble with this. I have a 5.1 project with LiveLink controllers for FIZ and tracking, and also a lens component. The LiveLink components both have the "component to control" set to the camera, and so does the lens. The last time I used this it seemed to be fine. I am going to try and shoot something over the weekend; I'll see if I can confirm that everything still works.
@deejay8681 (1 year ago)
Bliss looks awesome. Is it possible to composite virtual objects in front of the video, or is it currently limited to just the background?
@GregCorson (1 year ago)
Yes, with Unreal you can have multiple layers of compositing, so you can have CG stuff in the foreground or background. If you look through some of my other videos you will see examples of this; in the most recent ones I've composited a robot standing in my studio.
@pomeloyou (1 year ago)
Thank you Greg, it's a great tutorial. I followed it step by step and have a question about 50:24: how do I fix or avoid the slippage when moving? Is there any way I can get a more accurate result?
@GregCorson (1 year ago)
Slippage is almost always caused by an incorrect nodal offset. If you are seeing a lot of slippage, you may want to try re-doing your lens calibration and nodal offset as described in this and other videos of mine. Manually measuring your nodal point can also give you a double check to see if the automated process is putting out values that are close. Besides this, in the CG layer of your composure setup there is a box to check to apply lens distortion; this needs to be checked. If things only go out of alignment when you move the camera and they line up correctly when movement stops, you might have the tracking delay set wrong. If you can give me more details about the exact problem you are having and possibly post a sample video, I can be of more help.
@wjm123 (1 year ago)
Thanks for the tutorial! I don't have a tracker yet, but the aruco section for aligning the camera may be useful. I believe that will work with any kind of camera input, such as a USB webcam or my ATEM Mini, and doesn't need either an AJA or a DeckLink card?
@GregCorson (1 year ago)
You can use this without a tracker by manually positioning the camera. Measure the position of the real-world camera with a tape measure and put matching measurements into the CineCamera. You can also line things up by eye. Since you won't be moving the camera while you shoot, the alignment is not so critical. You should be able to use the aruco method to align the camera with a tag in the studio. Of course, if you put the tag on the floor, the camera will be looking down and you will need to tilt the real and CineCamera up by hand. You might also put the aruco on a stand and use it like a target for where you want the camera to point; I'll have to give this a try. Any kind of input device like a USB camera, ATEM Mini or digitizer card will work. The only thing you lose with these is some of the more professional features and maybe some advanced video modes. But they should be fine for a lot of uses; a number of my earlier videos used a webcam. The only issue I have found is that if you set UE 5.0.3 to use DirectX 12, a lot of USB video devices won't work. You need to set it to use DirectX 11, then they should. The downside is that I believe DirectX 12 is required to use Lumen and Nanite (the new lighting and geometry system). I already reported this bug to Epic several months back and they told me to expect a fix in the next release.
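For anyone hunting for that switch: the Default RHI lives under Project Settings > Platforms > Windows, which writes to Config/DefaultEngine.ini. A minimal sketch of the resulting entry (an illustration; double-check the key names in your engine version):

[/Script/WindowsTargetPlatform.WindowsTargetSettings]
; Force DirectX 11 so USB capture devices work in UE 5.0.3
DefaultGraphicsRHI=DefaultGraphicsRHI_DX11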
@wjm123 (1 year ago)
@@GregCorson Thanks for the info. Definitely gonna give it a try; I was trying to do what these guys were doing at 6:21 kzfaq.info/get/bejne/ereEY7SqnZuanXU.html, which can help easily align the camera to the scene in a single click.
@kennytan5570 (1 year ago)
Hi Greg, thanks for the great work!!! I'm having an issue with lens calibration. I keep getting this error message from UE: "Only spherical distortion is currently supported. Please update the distortion model used by the camera." Do you happen to know any solution for this?
@GregCorson (1 year ago)
I think the issue is in your lens file. There is a "lens info" field there which should be set to spherical, as it's the only model Unreal supports. If that isn't the issue, look for a similar field in your CineCamera.
@tommywright (1 year ago)
At 43:13, you have a new CameraCalib_ stack with a cg and plate... did you build that outside the video or something? I don't have that stuff when reaching this point.
@tommywright (1 year ago)
Oh, never mind. You need to have the lens file open to see those in the Outliner. :P
@GregCorson (1 year ago)
Yes, the lens calibrator creates that stuff when you open it and removes it when you close it.
@kracowska (1 year ago)
Hello Greg, thank you so much for your work and enthusiasm. I've been following your work for a few years now, so thanks again for the inspiration. I've been using REtracker since the first T265, and I'm now looking to create my own custom ArUco markers within Unreal (like a dozen of them), so I wonder if you could recommend some documentation on how to achieve this? I've been looking online unsuccessfully. Have a nice day / Raphael / Paris, France
@GregCorson (1 year ago)
Hi, in this video, kzfaq.info/get/bejne/gJigoc5qnOCtdJs.html, about halfway through, I go through the process of creating a custom aruco actor in Unreal. The main thing to get right is that the names of the calibration points have to match up with the Aruco you are using; there is more info in the video. Please come back and ask more questions if this does not solve it for you. Also check out a friend of mine in Paris who is also doing tutorials in French! kzfaq.info
@allanfocus2705 (10 months ago)
Hello Greg, thanks for the video. I have an issue with the 'Lens Calibration' plugin; it doesn't appear in the list of plugins, so I can't activate it. I'm using UE5 Early Access. Thanks for the help.
@GregCorson (10 months ago)
The plugin is listed as "Camera Calibration"; as far as I know it has been available as far back as Unreal 4.27. If you are using the original UE5 Early Access, this is very old and was missing some virtual production features; we are up to the 5.3 release now, but any of the "final" releases should have the camera calibration plugin. Don't confuse it with the "lens distortion" plugin, which is old and will eventually be discontinued. If you update to a more recent UE5 and still can't find the plugin, then there is something wrong. Also remember, after you have set up and calibrated your lens, you have to go into each of your CG composure passes, turn on lens distortion and add the name of your lens file; this was added in 5.1.
@mingli1492 (7 days ago)
Hey guys, I've been experimenting with UE 5.4.2 today. After setting everything up and migrating my lens files over from an existing project created with 5.1, I wanted to set my camera alignment with the Aruco tag method, only to repeatedly get an error message stating: "Failed to resolve a camera position given the data in the calibration rows. Please retry the calibration." It works fine with the same lens files in the original 5.1 project. Has anyone else experienced this?
@khanmichael (6 months ago)
Thanks for the video Greg! Do you know if any promos are still available?
@GregCorson (6 months ago)
Check out www.retracker.co/ for prices. There are currently 3 different licenses available (same tracking performance, different levels of support), and there are sometimes discounts for things like Black Friday, Prime Day and other events. I've asked REtracker where they usually announce these things; in the past it has been on the website or in Facebook virtual production groups.
@AdamSmith-zq5sr (1 year ago)
So I just wanted to make sure I did the right calculations. If anyone else is also using a BMPCC4K with a 0.71x Metabones Speed Booster and a Sigma 18-35 lens, any help would be appreciated. As far as I know, what I have been using was a sensor size of 18.96 x 10, and for the lens I'm using it at 35mm, so for the calculation I took 35 x 0.71, which came out to roughly 25mm. Would this be correct? I appreciate all of your help, and I thank you Greg again for all the great tutorials. It would be great if you could maybe make a really quick update though on adding the lens component within 5.1. I'm still not sure why I cannot get it right in 5.1, so I've been using 5.0.3. Cheers
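For what it's worth, the focal-reducer arithmetic in this comment checks out: effective focal length = marked focal length × speed booster factor, so 35 mm × 0.71 = 24.85 mm, which rounds to the 25 mm used above.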
@Hub-Images (1 year ago)
Hi Greg, how do you set the nodal offset point with the REtracker Bliss? Thank you for all the tutorials you've made.
@GregCorson (1 year ago)
Funny you should ask. We are just finishing up a way to do this, just doing some final testing and documentation on it. I should have a tutorial up soon.
@GregCorson (1 year ago)
The tutorial for setting the nodal point with RETracker bliss is up now, it's very accurate and easy. kzfaq.info/get/bejne/ec6qeLuWmJaxeZ8.html
@AlejandroGuerrero (1 year ago)
THANKS for this video, this is the BEST tutorial out there. At 37:30, under LiveLinkComponentControllerFIZ, Transform Role, LiveLink, Camera Role, I get "No controllers were found for this role". It looks like the issue appears with UE 5.1, but in 5.0.3 it works, so it looks like the method changed. As stated below by ArcanePacific: it seems now that for the calibration in 5.1 we have to add a "Lens" component to each CineCameraActor we want to calibrate. In the lower left corner of the lens file window, it used to say "Selected LiveLink Subject" and now says "Lens Component". Without adding this "Lens" component in the details panel of the cine camera, it would not allow you to do any calibration. So basically add 3 components: 2 LiveLink controllers and 1 Lens. FORUM THREAD: forums.unrealengine.com/t/livelink-cannot-assign-camera-role-no-controllers-were-found-for-this-role/740889/3
@GregCorson (1 year ago)
Yes, this is right. The Lens component is new in 5.1; I need to do a short tutorial about this. Also, to get the right match between your live and composure cameras, you need to check the "apply lens distortion" box in your composure CG layer and point it at the Lens component.
@WatermanSurin (10 months ago)
Hey Greg, I ran into a problem at the calibration stage. Under CineCameraActor, in the LiveLinkComponentControllerFIZ settings, there are no options populating in the Camera Role section. Any guidance would be hugely appreciated.
@WatermanSurin (10 months ago)
In the list of controllers where the LiveLink camera controller should be, it is just showing "No controllers were found for this role". I'm using HTC Vive trackers; they are connected and receiving data, and I have connected the LiveLinkComponentControllerFIZ's subject representation to the virtual subject (Virtual FIZ) in LiveLink.
@GregCorson (9 months ago)
Not certain that this is the issue, but when they upgraded to 5.1 they added another component for handling the lens distortion. This has to be added to your camera. I don't have a reference handy right now, but I believe it was just called "Lens" and has places to put the lens file information. Let me know if you can't find it.
@topgunmaverick9281 (1 year ago)
WOW, great!!! Thank you. I have SteamVR + Vive tracker + HTC Vive base station. Can I follow the steps in this video for calibration too?
@GregCorson (1 year ago)
Yes, you can use almost the same methods with a Vive. Please have a look through the other videos on the channel; there is one specifically for the VIVE. It was done on Unreal 4.27, but the process has not changed.
@topgunmaverick9281 (1 year ago)
@@GregCorson Thank you so much 😊
@sybexstudio (5 months ago)
Hi Greg, I could not get the camera feed to work on my Blackmagic DeckLink 8K. I'm getting "Could not find the device 0" in UE, and "failed to capture media source. The data specified for the media is invalid, inconsistent or not supported". If I use the Blackmagic input source in UE I can see it in the media player, but I can't select it from the capture device. Any other options?
@GregCorson (5 months ago)
I don't have a Blackmagic card, so I can't test this. However, I have noticed that the Unreal drivers are very sensitive to having the right video format set. Sometimes they won't work if the format is a little off, so try double-checking the format to make sure it matches what your camera is putting out: FPS, resolution, everything. This might help; I know many people use the Blackmagic 8K without any problems.
@VictoRs107 (1 year ago)
Good evening. I haven't watched the tutorial yet, but I definitely will. Greg, I heard that you can use just the T265 tracker itself with the Aximmetry program? Have you already tested such a setup?
@GregCorson (1 year ago)
I have not done any real work with Aximmetry. It is a great product with nice features, but I'm sticking to Unreal Engine because it's free and I want my tutorials and such to be accessible to anyone, even if they don't have the budget to buy tools. It is possible to use a T265 for tracking; there are a number of different projects on github and elsewhere that do this. The problem is that these are DIY coding projects, so it takes some smarts to get them working, but it can be done if you have the time. The other problem is that the T265 is a discontinued product which is getting harder and harder to buy. I spent a lot of time looking for cheap tracking solutions but finally decided my time was better spent pushing out tutorials to help people actually do virtual production.
@VictoRs107 (1 year ago)
@@GregCorson Possibly, but right now the price for REtracker Bliss is €3,128.00. This is too much for me. I don't know how these guys did it without this system, only in conjunction with Aximmetry and a T265: kzfaq.info/get/bejne/j-CKh6SfsKrOmYE.html
@ubong120 (1 year ago)
@@GregCorson REtracker is compatible with the T265, so you can connect to LiveLink through it. The only problem is that it is a $2000 solution, but you can ask rasavvi, the owner, for a trial.
@xujerry7791 (1 year ago)
Many thanks Greg! Such a helpful tutorial! But I'm still wondering if there's any other way to fix tracking delay, having been testing Bliss for a while. Would you try the genlock port of Bliss in UE?
@GregCorson (1 year ago)
I'm not sure what you mean by "fix tracking delay"? There will always be a delay; you just need to do the setup correctly and the tracking will remain synced up with the video. If you mean something like AUTOMATICALLY setting the right delay, the only method I know of to do this is to have UE, your tracking system and camera synced up to the same timecode signal through something like a Tentacle Sync. Unfortunately not all tracking systems support timecode (Vive doesn't; Bliss doesn't right now, but may in the future).
@xujerry7791 (1 year ago)
@@GregCorson Yes, I have tried manually setting up the tracking delay in different projects, and it's hard to match the signal perfectly, especially in complicated scenes. It's quite a big problem for real-time compositing. I know there's no hardware genlock port on Bliss, but there is a software genlock port. So I'm wondering if there are any possible ways to do something in software (both in the Bliss setup and UE) to help match the signals.
@GregCorson (1 year ago)
Really the only way to automatically set up the tracking delay is with timecode hardware. You have timecode going into your camera and also going into your tracker (whether it is Bliss or something else). When the video and tracking arrive at Unreal, it will automatically sync them up. This isn't "genlock". External genlock just causes the video and tracking to generate data at exactly the same time. Each one will still be delayed by some amount which you will have to figure out, but with genlock the delay will always be a whole number of frames. The trick to manually getting the delay right is practice. Basically you just point everything at a scene. Have something in the CG and the real-world set that line up; for example, have an X on the floor in the CG scene and put a matching tape X on the floor in the real set. Give the camera a sudden motion and watch to see if the CG or the background moves first. Then adjust the delay and repeat till they move at the same time. In most cases your delay will be some multiple of the camera frame time, i.e. if you are shooting at NTSC frame rates, each frame is 1/29.97 or 0.03336 second. You can start by trying one frame of delay, then two and so on till you zero in on the smallest delay, then try adjusting by smaller amounts from that. It is possible that your tracking system might be over-filtering the tracking, which can also make things impossible to sync up. After you get your delay set up, try shaking the camera. The CG and video should stay synced. If the delay is wrong, the movement will lag behind; if the tracking is over-filtered, it will swirl all over the place, not just delayed but completely wrong. Bliss doesn't do this with its default settings, but if you have manually messed with the filter settings it might. Some lower-cost USB digitizers and cameras do not output a constant frame rate; this is bad for virtual production because the frame rate and tracking delay will keep changing.
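To make that arithmetic concrete, here is a tiny sketch (an illustration, not from the thread) of the whole-frame delay values you would try first at NTSC rates:

#include <cstdio>

// Hypothetical helper: convert a whole-frame delay guess at a given frame
// rate into the seconds value you dial into Unreal's tracking delay.
static double FrameDelaySeconds(int Frames, double Fps) { return Frames / Fps; }

int main()
{
    // At 29.97 fps: 1 frame = 0.0334 s, 2 = 0.0667 s, 3 = 0.1001 s.
    for (int F = 1; F <= 3; ++F)
        std::printf("%d frame(s) = %.4f s\n", F, FrameDelaySeconds(F, 29.97));
    return 0;
}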
@sinanarts (1 year ago)
Dear Greg, I upgraded my tracking device to an Intel one now, and the placement of the new device is a few cm behind the previous one. I mean, with the Vive (old tracker) it was 12 cm above, but now it's 10.5 cm up and 3 cm back. Should I revise the nodal setting accordingly or leave it as is?
@GregCorson (1 year ago)
If the origin of the tracker is now in a different place, you will have to adjust the nodal offset accordingly or you will have tracking errors. The amount of error may not be that much, but it will probably be visible. Also be sure to check where the actual tracking center of the Intel is; if I remember correctly it is NOT at the tripod screw.
@sinanarts (1 year ago)
@@GregCorson I am having an issue; can you say what the reason may be? Is my exceeded video memory causing such low quality in the Composure window? Please see the linked JPG: drive.google.com/file/d/1sWKhOG2rSIDAmaKIKT82m3fVjkMeZRNc/view?usp=share_link
@GregCorson (1 year ago)
You have basically run out of memory on your video card. This could be because you just have a lot of high resolution things in your unreal/composure setup, or you might have another program running on your computer at the same time as unreal that is also using video memory. What GPU card do you have? You can see how much GPU memory is being used by opening up the windows task manager and looking at the performance tab for the GPU. Normally, before you start running anything, you won't be using that much. On my machine about 4gb is used before launching unreal. You can watch this gauge as you startup and use unreal, check to see what you are doing when it hits the maximum, that may give a clue as to what is using all your memory.
@olivermoore6881 (1 year ago)
Thank you very much for your tutorial. Can you make a tutorial on making lens files for zoom lenses? I now have a PTZ camera; I can get FreeD data through LiveLink, with zoom and focus values from 0 to 1.
@GregCorson (1 year ago)
Unfortunately, I don't have a lens encoder, so I haven't been able to try calibrating my zoom at different focal lengths. The basic process is the same, you just do it more times, i.e. zoom the lens all the way out, do a distortion and nodal calibration, then zoom it all the way in and do it again. Depending on how much your nodal point and distortion change with zoom, you may have to do this at several more zoom settings to get something usable. I have heard that for some lenses 2 or 3 settings are enough; for others it takes more.
@MrJonathanwolff (1 year ago)
Hey Greg, thank you so much for this. I just wondered if you had any opinions on the pros and cons of the REtracker Bliss compared to the Vive Mars (other than price). Many thanks for all your great tutorials 😊
@GregCorson (1 year ago)
I haven't used the Mars system, it's based on VIVE trackers which originally had a lot of problems with tracking jitter. I'm told Mars has solved this and fixed a number of other issues with using something designed for gaming in Virtual Production. But I can't say for sure till I get my hands on one to try. Probably the main advantage of VIVE/Mars is that it can track other objects like hand controllers, props...etc as well as the camera. I'm actually using RETracker Bliss and so far I'm finding it stable, accurate and easy to use. It's fairly new and originally had some issues but over the last few months the manufacturer has fixed nearly all of them. The main advantage is that there is little or no setup required. You can mount this little box on your camera and be ready to shoot very quickly and it works indoors and outdoors.
@MrJonathanwolff (1 year ago)
@@GregCorson Thank you so much for the comprehensive reply. Great points and very useful. I just saw the Vive Mars in action today with a studio in London, and they were still getting noticeable jitter on static closeups, which surprised me. The flexibility of Bliss is a great point. What I was slightly unsure about is that you would have to get a number of REtrackers if you wanted more than one camera, while the Mars has the ability for 3. But then, I was told Unreal can only take one live tracked stream in (I believe), unless you use a number of computers, one for each, and sync them together somehow, maybe in Aximmetry? So it's not such a deal breaker. Anyway, thanks again Greg; your generosity to the VP community is much appreciated 🤓
@GregCorson (1 year ago)
Almost all trackers perform better in longer shots than in closeups or zoomed-in shots. There is just more error in close shots that is hard to get rid of. In the past, the solution has been to set up the camera and just turn tracking off for a close static shot. As for multiple cameras, yes, you would need a Bliss for each camera. I'm not experienced in Aximmetry, so I don't know how/if it handles multiple cameras. I have set up multiple cameras in Unreal before and it works, but it is expensive because all the views are rendering at the same time. I was not able to find a way to shut off the views that aren't visible (i.e. a view that is not being output), so every camera slowed things down. Using a separate PC for each camera is probably the best solution for multicam for now, until this issue can be solved. But if your 3D set is simple enough and your target frame rate is low enough, one PC can handle two or more cameras.
@MrJonathanwolff (1 year ago)
As always, many thanks for such great advice Greg. Very much appreciated. I will hopefully be experimenting soon with one tracked and 2 static cameras at the same time. If I discover anything useful I will let you know. Best wishes, Jonathan
@GregCorson (1 year ago)
The thing that always caused problems for me in UE4 was that all my multiple camera views were always rendering, even when they were not on-screen. Not sure if there is a solution to this in UE5 or not; let me know if you find one.
@tommywright (1 year ago)
What if you have the RealSense T265 with REtracker? Can that go straight into Unreal, or do you have to go through Aximmetry?
@GregCorson (1 year ago)
I have not used the T265 with unreal, so I can't really tell you about this.
@tommywright (1 year ago)
@@GregCorson Marwan helped me out with this; there is a LiveLink REtracker version, which was exactly what I was looking for.
@mmharvey67 (1 year ago)
Hi Greg, I can't thank you enough for all of your tutorials. The only problem I have is that I am using an Elgato Cam Link 4K, and when I press play it stops receiving video from the camera and I have to go back and reset the media player. It feels like I am missing a setting because I am using a USB capture device rather than a media bundle. My other question is: how will I know if I don't have the sensor information for my camera correct? I've made my best estimate based on the information available, but what should I look for in Unreal if the sensor information I have entered does not match my actual sensor? As always, thank you for all of this great work.
@GregCorson (1 year ago)
A while back there were some issues with USB video capture devices and Unreal 5.0. I thought this was eventually resolved, though; you may want to try the latest 5.1.1 version of Unreal. If you still have a problem, please let me know and I'll try to check it out here. As far as the sensor information goes, if you have it really wrong, you won't be able to get a calibration. That is, when you calibrate your lens you will get a large error value, or the video will look even more distorted. The most important thing is to get the horizontal/vertical ratio correct. A lot of cameras have 4:3-ratio sensors but output 16:9 video, cropping some off the top and bottom. Your best bet, if you can't find out how your sensor is cropped, would be to take the sensor width and divide it by 1.77777 (the 16:9 ratio) to get the vertical size. This should give you a good estimate. However, some cameras do crop the sensor for video in both the horizontal and vertical directions; you really need to find out from the manufacturer if yours does this, because it actually shrinks the overall sensor size and changes the effective focal length of the lens. dpreview.com has reviews of most common cameras and tells how the sensor is cropped; try looking there.
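As a quick sketch of that estimate (the 23.5 mm width is a made-up example; substitute the value from your camera's spec sheet):

#include <cstdio>

int main()
{
    // Estimate the active 16:9 sensor height from the full sensor width.
    const double WidthMm = 23.5;        // hypothetical sensor width in mm
    const double Aspect  = 16.0 / 9.0;  // 1.7777..., the 16:9 ratio
    std::printf("estimated height: %.2f mm\n", WidthMm / Aspect); // ~13.22 mm
    return 0;
}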
@mmharvey67 (1 year ago)
@@GregCorson I am using Unreal 5.1.1 and the USB capture was definitely misbehaving for me anyway. Along with quitting when I pressed play, it would develop a significant delay over time (during the camera calibration process) and I would have to go to the media source and select it again to fix it. It would also randomly seem to stop receiving signal even though the camera was still on. It may have been something in my machine, but it is a pretty capable computer. As far as the sensor goes, that helps a great deal. I think my best course of action is to set the camera to a 4:3 aspect ratio, use the entire sensor size, and see how the calibration works. I can use that as a benchmark when I try a 16:9 ratio with the math above. The other big question I have is that the camera I am using is a Micro Four Thirds format, which according to the manufacturer changes the effective focal length of my lens from 25mm to 50mm. I'm unsure which value to use in the lens file in Unreal. I can't thank you enough for the tutorial and the fact that you respond to questions as well. I have learned more than you can imagine from all of this amazing work. Thank you!
@GregCorson (1 year ago)
I believe if you set the sensor size correctly, the true (as marked on the lens) focal length will be handled correctly. You will see in Unreal that if you set the sensor size to full-frame DSLR you get a wider angle than if you set one of the smaller sizes like Micro 4/3. This even works with phone cameras, which have tiny sensors and lenses around 4mm focal length. When calibrating the camera/lens it is usually a good idea to have the camera in the mode you will use for shooting video, just in case it internally crops the sensor in some way... so it's probably best to calibrate in 16:9 if that is what you will be shooting in. I will try to give USB input a check here; I just have to find where my old Cam Link got to.
@terryriegel (1 year ago)
The checkerboard should not have the same number of horizontal and vertical squares. Mine did and it caused the orientation (landscape/portrait) to be unpredictable.
@GregCorson (1 year ago)
Thanks for the tip! Mine usually don't have the same horizontal and vertical size because I print on standard sized paper which is not square.
@sinanarts (1 year ago)
Once again a gem from Greg, thanks. Can you please give us a sample tutorial using an HTC as the tracker and a live keyed video in a UE5 scene? I mean a walkthrough specific to these devices. Or did you already do one for UE5? Thank you for your precious time.
@GregCorson (1 year ago)
Actually, the setup for using a VIVE is almost the same as this. VIVE also connects by LiveLink just like the Bliss does, so it should just work if you connect to the VIVE during the LiveLink part. If you have a second VIVE tracker, you can use the method shown in kzfaq.info/get/bejne/fpukZaWouq3Vm30.html to measure the nodal offset also. The first part of that video is identical to the method shown here, but at the end it shows how to use a second VIVE to set the nodal offset. With VIVE, setting the worldpose is optional. If you do a "room setup" with VIVE, then the center of the stage will be wherever the headset was when you did it. If you don't have a headset, the method shown here should work. Be aware, though, that the method shown here sets the worldpose only for the camera. If you are using additional Vive trackers in the same scene to track talent or props, you probably would not want to set the worldpose with an Aruco.
@sinanarts (1 year ago)
@@GregCorson Dear Greg, I did it all and it's up and running, but my CG layer is out of focus.
@GregCorson (1 year ago)
If your CG layer is out of focus and you did not use FIZ, you can adjust the focus inside of the cinecamera actor. Down inside it you will find a section for lens and focus where you can adjust focus distance, f-stop and other camera parameters by hand. In the setup I detailed in the tutorial, all the camera parameters come from the FIZ and calibrator and get pushed into the camera automatically. If you didn't use the calibrator/FIZ you will need to adjust the parameters in the camera by hand to match your real world camera. This means things like sensor size, focus, focal length, f-stop...etc.
@sinanarts (1 year ago)
@@GregCorson Thank you man, thank you. I hate HTC Vive controllers as a tracker, as there is always a tiny glide no matter what :-( I hope that soon I can afford the REtracker Bliss 🙏🏻⭐️
@GregCorson (1 year ago)
Vive trackers will work, but there is always some jitter. If you are getting slippage between your real and CG elements when you move, this is probably because the offset between your lens nodal point and tracker is not correct. If there is a slight lag between the real and CG when you start or stop moving it is probably an error in your tracking delay. Also be careful about adding filtering to smooth out the jitter in vive trackers. A little filtering can be beneficial but if you add too much it can cause the CG and real cameras to get out of sync when moving.
@waifongthng5982 (1 year ago)
Thank you for the great tutorial. I was able to get everything to work, and also did the lens calibration using a Vive tracker to find the nodal offset. However, I can't get the last part to work properly. I add the CineCamera actor as a source in the Take Recorder, but in the Sequencer the orientation is tilted down and the pan axis becomes the roll axis. Please help me out...
@GregCorson (1 year ago)
To get an accurate nodal offset, your lens distortion calibration has to be correct. Unfortunately the only way to check this is to do several calibrations and see that they come out close to the same. Once you have the lens calibration, getting the correct nodal offset requires doing the calibration with the vive tracker in a number of different places. For example, some close up, some at a distance, some in the center of the frame and some at the edges. Using VIVE trackers I would recommend a minimum of 6 positions, more would be better. I would advise using one of the other methods on my channel for approximating the nodal and offset measurements. This way you will know roughly what they should be when doing the nodal offset calibration in Unreal. In my experience, when Unreal comes up with the wrong values they are usually wrong by a lot, so they are not too hard to spot. It sounds like this might be what's happening to you. When I was first testing the VIVE method I did all the samples with the VIVE at the same distance from the camera and got pretty wildly inaccurate results, this stopped when I started doing the samples at a variety of distances.
@waifongthng5982 (1 year ago)
@@GregCorson I was able to get the nodal offset pretty accurate, as there is no drifting when I move the tracker. I was also able to get the aruco mark working properly, and I've done the composure as well, with a media plate and CG element. My problem is I can't get the Take Recorder working properly. I only add the cine camera actor into the Take Recorder, but once I record and then play back in the Sequencer, all the axes are wrong. Do you have any tips for the Take settings?
@waifongthng5982 (1 year ago)
Do you have any tips for take recorder settings?*
@GregCorson (1 year ago)
I'm not really sure why Take Recorder would do that; it sounds as if it is recording only the camera's local transform rather than its world position. Maybe you also need to record the camera's parent actor so you have the full transform being recorded? Also, it is possible to record the LiveLink streams in Take Recorder. If you do this, you don't record the camera; you just play the LiveLink data back, so all the things controlled by LiveLink will move the way they did when you made the recording.
@martin_minds (1 year ago)
Are you going to get a new promo deal with REtracker? Thanks
@GregCorson (1 year ago)
If there is a new one I will post it. Right now I think they are releasing an indie version for Unreal Engine only that is less expensive.
@howtotaiwan1384 (1 year ago)
Hello Greg! I am using the Quest controllers as a tracker, and so far I haven't been able to calibrate the camera. Specifically, it seems like the FIZ variables are not getting fed from the tracker. Any idea on how to do it?
@GregCorson (1 year ago)
Hi, I don't have a Quest so I can't be of much help here. As for the FIZ values, they don't come directly from the controller/tracker. You need to make a separate FIZ actor and hook it to your camera as I did in this video. If that still doesn't work, let me know.
@pavelpereyarenets (1 year ago)
@@GregCorson Yes, even with the Oculus 2 properly connected, and despite the fact that everything is shown and the camera is spinning, the data is unfortunately not received. It is not possible to calibrate the camera, and comparing the Aruco marks also fails. Greg, you do magic; tell me how to get around this limitation and match the camera.
@howtotaiwan1384 (1 year ago)
@@pavelpereyarenets Hello! Did you find a way around it? I am also working with the Oculus Quest.
@sinanarts (1 year ago)
Dear Greg, I can't thank you enough for your invaluable efforts in enlightening us. Thank you. I have a minor issue I hope you can help with. I set the nodal point, and with the Aruco set the worldpose, and in the lens file viewport the real aruco and virtual aruco fit to the millimetre. But in the Composure window, and also when I begin Play, there is a few cm offset EVEN WITHOUT MOVING. They are not sharply aligned in Composure, but in the lens file it looks perfect. Why might this be?
@GregCorson (1 year ago)
It is probably this checkbox. They added it a while ago and we didn't know about it till we started getting good nodal offset measurements. It may look a bit different between 5.0 and 5.1, because in 5.1 you add a "lens distortion" component to the camera. Either way, you have to set this up in the cg_element of your composure or the lens distortion won't be applied: yt3.ggpht.com/GYdHbpPNg_DPqvjx6v3MMf8yDfpmoEyu86mzOZiBM4BymevMY7FZ5PHZbxD8sCDZS9i-7TxjNxt8=s494-nd-v1
@sinanarts (1 year ago)
@@GregCorson Will try first thing in the morning here 😁 And one more thing: everything is set up and the Composure screen shows without problems, but once I click Play with the "in an external viewport" option, the camera shows up completely elsewhere. What might be the reason? And one last thing, Greg: would you consider starting a Discord channel to make your priceless efforts more accessible and categorised? Have you had such thoughts?
@sinanarts (1 year ago)
@@GregCorson This worked, thank you. Two problems though: 1) On my Composure screen the CG objects are flickering: drive.google.com/file/d/1H3i1vHxNi3kgPODwsB-_KQqURLl4PFBO/view?usp=share_link 2) I want to hit Begin Play and have myself in the scene; at the moment only the CG background renders in play mode.
@GregCorson (1 year ago)
The flickering you see around the edges of objects seems to be because they are rendering at a lower resolution than they should; I'm not sure why. Any small amount of jitter in your camera tracking will make edges move like this if the resolution is low and anti-aliasing is not being done. As for your live camera not showing up in the composite, have you looked at the media plate to make sure the video is visible there? If it is, the compositing material in your composure setup might be wrong.
@Tauron211 (1 year ago)
Hi, I have a question. I have an HTC Vive tracker in UE 5.1 and followed the video step by step. Everything is calibrated, but when I move my camera after calibration, like you do at the 50:26 timecode, my UE surroundings with the aruco board don't follow the real camera movements; they stay in the same position. What did I miss?
@GregCorson (1 year ago)
Not sure why this happens; it sounds like the tracking data from your VIVE might not be coming through correctly. Check your LiveLink to make sure it is all green and that the VIVE hasn't gone to sleep on you. Also check the LiveLink component in your camera to make sure data is getting through. If it is, the location/rotation numbers should be constantly changing.
@Tauron211 (1 year ago)
@@GregCorson Yep, thanks, the tracker had a yellow color in LiveLink during that time. But I have another problem now. When you rotate your stationary camera in real life, it also rotates in UE without moving; in my case the entire camera in UE moves instead of rotating when I rotate the real camera, like somebody holding it in their hands and stepping to the side. Did I miss some checkbox to fix this and only have rotating movements, or is it the tracker's problem?
@GregCorson (10 months ago)
Generally you actually want to apply all tracker movement to the camera, not just rotation. I would suggest checking your setup to make sure it is handling horizontal motion correctly. Depending on the VIVE setup you use, it can end up applying horizontal motion to the wrong axis (i.e. left-right motion might go forward-back). Check that all 6 axes of motion are going the right way. If they aren't, you can flip the axes around in the LiveLink window. Generally, though, if you use the nodal point setup as shown in this video, it will figure this out for you and you don't have to set anything.
@nate-plays (1 year ago)
I have followed this tutorial, but I cannot get the Aruco to detect. I get a message that says "Could not find a cached set of camera data (e.g FIZ). Check the Lens Component to make sure it has valid evaluation points." I followed the FIZ and lens file bit, but I feel I'm missing something silly.
@GregCorson (1 year ago)
Not sure why this would happen if you were able to make a lens file in the lens calibrator. One thing, though: in 5.1 they moved where the lens calibration stuff is hooked up. Now you have to add a "Lens" component to your camera and point it to the lens file. Unreal was automatically adding the component when you updated a 5.0 project, but if you started with 5.1 you have to add it manually. This could be the problem, but I have to admit I have never seen that error message before, so I'm not really sure.
@nate-plays (1 year ago)
@@GregCorson This was where I was lost. Thank you for clarifying; that really helps. :)
@eliteartisan6733 (1 year ago)
At exactly 40:25 you say, "You can already see that the RAW FIZ information that we had in our FIZ object is already coming through," but mine doesn't show the numbers I input in the FIZ object.
@GregCorson (11 months ago)
Hi, sorry about this; this tutorial was done with UE 5.0, and when they upgraded to 5.1 they changed how the lens file works. A lot of people didn't notice because when you updated a project from 5.0 it automatically added the required component. If you're starting from 5.1 or 5.2 you need to add it yourself. Click on your CineCamera actor and add a "Lens" component to it. Set this to your lens file and the FIZ data should show up in the calibrator. There was also one other addition: in all of your CG elements in composure there is now a lens distortion section. Find this, check "apply distortion", and for the lens component click the pop-up and select the "Lens" component in your cine camera actor. I wish Unreal would document changes like this better; it caused many people problems (including me). Right now I'm building a project for a new tutorial that will mention all this.
@eliteartisan6733 (11 months ago)
@@GregCorson No need to apologize at all. You're such a good guy and I'm just happy you exist.
@ubong120 (1 year ago)
What camera are you using? I assume it's an HDMI camera. Are you using an HDMI to SDI converter?
@GregCorson (1 year ago)
My AJA card is HDMI in, but I can't recommend it because the price recently doubled for some reason; a Blackmagic card and converter is much cheaper. I currently use a Sony A7R4 mirrorless camera. It's a good camera, but one intended for video with SDI would be better.
@eliteartisan6733 (1 year ago)
I got stuck at the point where you aim the camera at the real aruco marker on the floor and set "Apply to camera parent"; it gets aligned with the virtual aruco. But the moment I pan the camera, only the camera moves and the virtual set doesn't. I know my camera's entrance pupil is wrong, but even so, both worlds should be moving, albeit out of sync. Isn't that so?
@GregCorson (1 year ago)
If the virtual set is not moving, you have some problem with your camera tracker. What are you using? Vive? Bliss? Something else? When the virtual set doesn't move, it means the LiveLink component in the camera actor is not set up right, or the LiveLink system isn't connected to the tracker or has lost connection to it. This starts around 28 minutes into the video. If your set isn't working, look at LiveLink and make sure your tracker is showing up and has a green light. In the case of a VIVE, they sometimes just shut themselves off if they have been sitting still for a while. Also remember that every time you quit out of Unreal and come back in, you have to reconnect to your LiveLink device, unless you have set up a default LiveLink configuration that automatically loads.
@eliteartisan6733 (1 year ago)
@@GregCorson Hi Mr Corson. I can't express enough how helpful you've been. Thank you so much. Now, about my tracker: I use Antilatency. I am really stuck on the nodal point of my camera. I watched three of your videos about it, and watched other channels, but I just can't get it done. I saw one of your videos where you put the camera on a slider, then stick a small colored paper on the window. In your video, when the pivot is ahead of the nodal point the panning goes to one side, and once you've gone past the nodal point the panning goes to the other side. However, no matter how far or how close I slide, the panning always goes to my left.
@GregCorson
@GregCorson 10 ай бұрын
Sorry for the delay getting back to you. The nodal point of your lens will almost always be between the front of the lens and the back of it. If you move the pivot all the way to the front of the lens and then all the way to the back, the mark on the window should shift in opposite directions when you check the alignment. The only thing I can think of is that maybe your camera/slider is not level? It should be perfectly horizontal when you use this technique; a cheap carpenter's level works well for this. This method should pretty much always work: kzfaq.info/get/bejne/lcB0ksx-0dXbdqc.htmlsi=hD4Fxoq7ZdeIjgjw
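A quick way to see why the shift direction has to flip: the toy model below (an editorial sketch, not from the video; all distances are made up for illustration) pans a camera about a pivot and compares the apparent bearing of a near mark against a far background, with the nodal point sitting in front of versus behind the pivot.

```cpp
// Toy parallax model for the slider/nodal-point test described above.
// The camera pans by `yaw` about a pivot at the origin; the lens's nodal
// point sits `d` cm in front of (+) or behind (-) that pivot along the
// optical axis. We compare the apparent bearing of a near mark against a
// far background; the sign of the relative shift flips as d crosses zero.
#include <cmath>
#include <cstdio>

const double PI = 3.14159265358979;

// Bearing of a point at (x, 0), seen from the panned nodal point,
// measured relative to the camera's optical axis.
static double bearing(double x, double d, double yaw)
{
    const double nx = d * std::cos(yaw); // nodal point after panning
    const double ny = d * std::sin(yaw);
    return std::atan2(0.0 - ny, x - nx) - yaw;
}

int main()
{
    const double yaw   = 10.0 * PI / 180.0; // 10-degree pan
    const double nearX = 100.0;             // mark on the window, 1 m away
    const double farX  = 10000.0;           // background, 100 m away

    for (double d : {+5.0, -5.0}) // nodal point 5 cm ahead of vs behind the pivot
    {
        const double shift = bearing(nearX, d, yaw) - bearing(farX, d, yaw);
        std::printf("d = %+4.1f cm -> near mark shifts %+.3f deg vs background\n",
                    d, shift * 180.0 / PI);
    }
    return 0;
}
```

With these numbers, the mark drifts roughly half a degree one way for d = +5 and the other way for d = -5, which is the flip the test relies on.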
@PhilipLemon
@PhilipLemon Жыл бұрын
In Unreal 5.1, in the lens component setup (around 37:30), the Camera Role options appear to have changed: the Camera Role drop-down doesn't have the LiveLink Camera Controller option, it just says "No controllers were found for this role". Am I missing a newly required step?
@GregCorson
@GregCorson Жыл бұрын
Unreal 5.1 made a few changes. You now have to add a "lens component" and that's where you add the lens file.
@PhilipLemon
@PhilipLemon Жыл бұрын
@@GregCorson Ah. Thanks. All these changes make learning a moving target for a newbie. Still trying to figure out how the FIZ virtual source is driven by a tracker. Hopefully I can decipher the doco. Thanks for this great video.
@GregCorson
@GregCorson Жыл бұрын
To use the FIZ in real time you need lens encoders. This is hardware that attaches to the lens rings and reads the position of the focus, iris, and zoom in real time. Unfortunately, this hardware is still rather more expensive than it should be. It basically sends the information to Unreal over another LiveLink channel, which feeds into the CineCamera. The only issue is that if the camera has auto exposure, autofocus, or power zoom, there might not be a lens ring to attach one of these encoders to. Most professional cine lenses have rings, but a lot of consumer lenses don't.
@PhilipLemon
@PhilipLemon Жыл бұрын
@@GregCorson thanks. I have a couple of vive trackers rigged on manual follow focuses (I use cine primes) so I guess I need to write a blueprint.
@GregCorson
@GregCorson Жыл бұрын
A number of people have rigged Vive trackers to manual follow-focus devices and it works, but you need to write your own blueprint to calibrate them. I haven't tried this, so I don't have one.
@AdamSmith-pn5hk
@AdamSmith-pn5hk Жыл бұрын
Hey Greg, so I’ve been making progress slowly but surely. I’m now stuck at 1:02 in the tutorial, where you start to add the layers into the Matrix map. My issue is that my lens calibration layers will not show up (camera_calib_sony24-70). And not only that, when I return to my original scene they have also disappeared. If you have any idea why this is happening, that would be fantastic, because I’m stuck at the moment. Cheers
@GregCorson
@GregCorson Жыл бұрын
Not really sure what the problem is; 1:02 doesn't seem to point to the spot in the video where you are having trouble. Also, you understand that you can't use my lens files? You have to make your own or they won't match your lens and camera. If you can send a correct timestamp so I know where you are having trouble, I'll see what I can do.
@AdamSmith-pn5hk
@AdamSmith-pn5hk Жыл бұрын
@@GregCorson Yep, I’m using my own lens file just wanted to put the name of yours so you had an idea where I was. It was 1 hour and 2 minutes I believe but I figured it out.
@AdamSmith-pn5hk
@AdamSmith-pn5hk Жыл бұрын
@@GregCorson Is it possible within unreal to crop the green screen footage in real time so I can basically remove my room other than the green screen background? Cheers
@GregCorson
@GregCorson Жыл бұрын
You are looking for what's called a "garbage matte". This can be as simple as using a couple of planes to mask out the part of your video feed you don't want; if you look in my older videos you can see examples of it. If your setup is simple, you can just create a plane in your CG world that is the same size and position as your greenscreen and assign it to a matte layer in Composure. It will render as a black/white mask. Put this in front of your video layer and all the off-greenscreen stuff will get masked out. Just like your CG layers, this layer will move as the camera moves, so it lets you pan all over the place and not catch anything off-greenscreen.
@AdamSmith-pn5hk
@AdamSmith-pn5hk Жыл бұрын
@@GregCorson Thanks again, Greg. Don’t know what id do without your help! Your tutorials are fantastic so please keep ‘em coming!
@benshen9600
@benshen9600 Жыл бұрын
Wait for the next version: hardware genlock for the DS80.
@adorablemarwan
@adorablemarwan Жыл бұрын
You do not need a hardware genlock. LivelinkBliss syncs all the timestamps coming from Bliss to Unreal's timestamps.
@JDARRASVP
@JDARRASVP Жыл бұрын
Hello Greg, nice tutorial. I understand that you can genlock this tracking system, can you confirm? Also, I have a problem with Composure when I want to go into Play mode. Do you have a solution for this?
@GregCorson
@GregCorson Жыл бұрын
If you are using Aximmetry, there is a system for genlocking RETracker. In Unreal the tracking comes in at 500 Hz, and so far I haven't seen any need for genlocking. When RETracker is running with LiveLink in Unreal, all the tracking data is timestamped, and when it arrives Unreal selects the tracking sample closest to the time the video frame was captured. As for Play mode, I have never had any trouble with it; it just works for me, and Play mode normally runs smoother too. Bring up the Timed Data Monitor and check to make sure you are running fast enough that you aren't dropping frames. Also make sure no other windows are open before you press Play. If you can be more specific about what problem you are having with Composure, I could help more.
@JDARRASVP
@JDARRASVP Жыл бұрын
@@GregCorson Thanks a lot for your answer. Concerning Play mode, no doubt you don't encounter any problems because you use the "Player Viewport Compositing Output" and not the "Compositing Media Capture Output"? In viewport mode, your composite doesn't go out through your AJA card, right? What I want is to use Composure in production while continuing to play the animations contained in my blueprints.
@GregCorson
@GregCorson Жыл бұрын
In my case, the AJA card I have does not have video output, so I haven't used this mode. I know plenty of others who do it with Blackmagic cards, though, and they don't seem to be having any Composure problems. Sorry I can't offer more help, but I have no way to test this out.
@JDARRASVP
@JDARRASVP Жыл бұрын
@@GregCorson Hehehe, I fixed my problem! I did it like you, using the "Player Viewport Compositing Output", and now everything is OK. Great! I was blocked by a simple problem. Thanks for your help and your videos. Bye. JD
@duchmais7120
@duchmais7120 Жыл бұрын
Sometimes one can get sick and tired of virtual production: HTC Vive trackers, Intel RealSense trackers, Antilatency trackers, HTC Vive Mars, the OpenXR branch... what's the next VP tracker? My advice: don't invest in anything until something solid comes out. I always seem to burn my fingers on a new update, tracker, or change to the pipeline; it's never-ending and so complex.
@GregCorson
@GregCorson Жыл бұрын
There are actually a number of solid tracking solutions out there; the problem is that they are all quite expensive (major-motion-picture expensive) and require a lot of technical support to get running spot-on. If you want to do virtual production on a budget, you have to be willing to put up with some issues, like Vive jitter and other such things, learn the problems, and pick a solution that will work with the kind of things you are trying to shoot. Unfortunately, camera tracking is a difficult problem in real time. I have heard reports of people having issues of one kind or another with almost every tracking system, even the very expensive ones. It can take a bit of persistence to find the solution that works for you and get it fine-tuned, but it is slowly getting better. If you want spot-on camera tracking and don't need real time, try the camera tracker inside blender.org. It can be a bit of a devil to learn, but once you get it going it gives excellent results; it just doesn't do it in real time.
@AdamSmith-pn5hk
@AdamSmith-pn5hk Жыл бұрын
Any chance you could explain what the "max number of frames buffered" is when you're setting up Live Link? I understand you set it at 200. Would this have anything to do with choppiness in camera movement?
@GregCorson
@GregCorson Жыл бұрын
LiveLink tries to sync the tracking data with the video. The tracking data usually has less latency, so to match the video you have to delay it, usually by 1-4 frame times. With Bliss the tracking data comes in at 500 Hz, so to be able to delay it properly you need 200 frames of data. That's all this number is: how much "old tracking" is stored. Check the Timed Data Monitor; if it does not show a green synchronized light, you do not have this number set high enough. If you are getting a green light, everything is in sync. Camera choppiness usually comes from your computer not being able to keep up with the rendering rate you have requested.
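A back-of-envelope check of that buffer size (an editorial illustration; the 60 fps video rate is an assumption):

```cpp
// How much pose history 200 buffered samples gives at Bliss's 500 Hz,
// versus how much delay is actually needed to match the video.
#include <cstdio>

int main()
{
    const double trackerHz      = 500.0; // Bliss tracking rate
    const int    bufferedFrames = 200;   // "max number of frames buffered"
    const double videoFps       = 60.0;  // assumed video rate
    const double delayFrames    = 4.0;   // upper end of the typical 1-4 frame delay

    const double historySec = bufferedFrames / trackerHz; // pose history stored
    const double neededSec  = delayFrames / videoFps;     // delay needed for sync
    std::printf("buffer holds %.3f s of tracking, sync needs about %.3f s\n",
                historySec, neededSec);
    // 0.400 s of history comfortably covers the ~0.067 s of delay,
    // which is why 200 is a safe setting.
    return 0;
}
```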
@AdamSmith-pn5hk
@AdamSmith-pn5hk Жыл бұрын
@@GregCorson Thanks Greg. Another quick question: say I put a cylinder in the scene at the same spot my talent will be standing, and I want the focus tracking locked on that cylinder at all times. How would I go about doing that? For some reason, with the camera setup in Unreal I am unable to change the focus; each time I try to use manual focus, it will not change. Thanks for the great help once again. Cheers
@GregCorson
@GregCorson Жыл бұрын
If your setup has a FIZ object, then that is controlling the camera's focus, iris, and zoom settings. You need to change them in the FIZ object, or disconnect the FIZ so you can set them inside the camera using the autofocus setting. The FIZ also feeds through into the camera distortion settings that are applied to your CG layer in Composure, so disconnecting it might cause problems there. A safer approach might be to add some blueprint code to your FIZ that figures out the distance between your camera and the target object, then sets the focus value in the FIZ automatically.
@AdamSmith-zq5sr
@AdamSmith-zq5sr Жыл бұрын
@@GregCorson Would it be possible to create a new CineCamera or virtual camera and somehow parent it to the camera that my FIZ info is on? That would allow me to paste the information into the new camera's lens settings to control where focus is pulling. Then my new camera would follow the same movement as my virtual FIZ camera, right?
@GregCorson
@GregCorson Жыл бұрын
I think the best approach is to have something update the FIZ for you. A bit of blueprint code could measure the distance from your camera to the subject you are tracking and reset the FIZ in real time to match. I haven't had time to check, but there may be some other way to disable the FIZ and use the auto tracking focus, or to feed the focus data back into the FIZ.
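A minimal sketch of that idea in Unreal C++ (an assumption-laden illustration, not code from the tutorial): an actor measures the camera-to-target distance every tick and writes it into the focus distance. The class name and properties here are hypothetical, and if a FIZ object is driving your camera you would push the value into the FIZ instead, so Composure's distortion stays in sync.

```cpp
// FocusDriver.h - hypothetical helper actor; assign TrackedCamera and
// FocusTarget in the editor. Requires the "CinematicCamera" module.
#include "CineCameraActor.h"
#include "CineCameraComponent.h"
#include "GameFramework/Actor.h"
#include "FocusDriver.generated.h"

UCLASS()
class AFocusDriver : public AActor
{
    GENERATED_BODY()
public:
    AFocusDriver() { PrimaryActorTick.bCanEverTick = true; }

    UPROPERTY(EditAnywhere) ACineCameraActor* TrackedCamera = nullptr;
    UPROPERTY(EditAnywhere) AActor* FocusTarget = nullptr;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (!TrackedCamera || !FocusTarget) return;

        UCineCameraComponent* Cam = TrackedCamera->GetCineCameraComponent();

        // Camera-to-subject distance in Unreal units (cm).
        const float Dist = FVector::Dist(Cam->GetComponentLocation(),
                                         FocusTarget->GetActorLocation());

        // Drive manual focus every frame. If a FIZ object feeds this camera,
        // write the distance into the FIZ instead of setting it here.
        Cam->FocusSettings.FocusMethod = ECameraFocusMethod::Manual;
        Cam->FocusSettings.ManualFocusDistance = Dist;
    }
};
```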
@tommywright
@tommywright Жыл бұрын
How are you taking the calibration images at 44:40?
@GregCorson
@GregCorson Жыл бұрын
You just click in the window to take a new shot.
@tommywright
@tommywright Жыл бұрын
@@GregCorson It doesn't want to recognize my checkerboard. I can put it right up to the camera and it doesn't read it as the correct number of squares. :(
@GregCorson
@GregCorson Жыл бұрын
Generally, recognizing the checkerboard is quite reliable as long as the calibrator sees a clear, sharp image of it without glare or reflections. You do have to give it the correct number of squares, though. For whatever reason, the rows/columns count doesn't include the last square, which can be confusing: if you had something like a standard chessboard that is 8x8, you would tell Unreal 7x7.
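The "don't count the last square" rule comes from the chessboard detector counting interior corners rather than squares (Unreal's calibrator uses OpenCV under the hood). A standalone OpenCV sketch of the same convention, with a placeholder image file name:

```cpp
// For a printed board of 8x8 squares, the detector expects 7x7 interior
// corners, i.e. pattern size = squares minus one in each direction.
#include <opencv2/calib3d.hpp>
#include <opencv2/imgcodecs.hpp>
#include <cstdio>
#include <vector>

int main()
{
    cv::Mat img = cv::imread("board.png", cv::IMREAD_GRAYSCALE); // placeholder
    if (img.empty()) { std::printf("could not read image\n"); return 1; }

    std::vector<cv::Point2f> corners;
    const bool found = cv::findChessboardCorners(img, cv::Size(7, 7), corners);
    std::printf("checkerboard %s (%zu corners)\n",
                found ? "detected" : "not detected", corners.size());
    return 0;
}
```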
@xewi60
@xewi60 Жыл бұрын
I don't know why my lens calibration is always catastrophic. What am I doing wrong?
@GregCorson
@GregCorson Жыл бұрын
Hi, you are not giving me much information to work with. If you are talking about distortion calibration, the most common thing that will mess it up is putting in the wrong dimensions for your camera sensor; that can throw everything off. Since some cameras don't use the entire sensor for video, you have to make sure to put in the size of the area actually being used. You can usually find this in the camera manual or on the internet, and for some cameras this area is different at different resolutions (i.e. 1080 might be different than 4K). If you don't get a good distortion calibration, do not bother with the nodal point stuff; it requires a good distortion calibration to work. I can give you better answers if you tell me more about your setup and what goes wrong when you try to calibrate.
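To see why the sensor dimensions matter so much: the solver relates pixels to angles through the sensor size, so the recovered field of view (and the distortion fit built on top of it) scales with whatever width you enter. A small illustration with made-up numbers:

```cpp
// Horizontal FOV implied by the same lens under two assumed sensor widths.
#include <cmath>
#include <cstdio>

const double PI = 3.14159265358979;

// hfov = 2 * atan(sensorWidth / (2 * focalLength))
static double hfovDeg(double widthMm, double focalMm)
{
    return 2.0 * std::atan(widthMm / (2.0 * focalMm)) * 180.0 / PI;
}

int main()
{
    const double focalMm    = 24.0; // assumed lens
    const double trueWidth  = 35.7; // e.g. a full-frame video crop (illustrative)
    const double wrongWidth = 23.5; // APS-C width entered by mistake

    std::printf("true HFOV %.1f deg, HFOV with wrong width %.1f deg\n",
                hfovDeg(trueWidth, focalMm), hfovDeg(wrongWidth, focalMm));
    // The solver's geometry is skewed by this mismatch (73 vs 52 degrees),
    // so the distortion coefficients it fits come out wrong too.
    return 0;
}
```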
@xewi60
@xewi60 Жыл бұрын
@@GregCorson Hi! Thank you for your assistance! The issue likely lies in the dimensions of the camera sensor, as I have struggled to find precise values. The only measurement provided, even in the lens manual (Canon HJ14x4.3), is 2/3 inch, and this value is open to interpretation. Do you know of another place to find such values besides the manual? I thought the issue might lie in the nodal point stuff, but as you mentioned, that requires an accurate distortion calibration to work, which I don't have.
@GregCorson
@GregCorson Жыл бұрын
I think the Canon HJ14x4.3 is a lens, not a camera; you need to look up the camera to find the sensor size. One place to look is dpreview.com: if they have reviewed the camera, they usually post a pretty comprehensive list of sensor modes for video.