You are amazing for uploading this!!! Thank you so, so much :) When I try to replicate this, though, I don't get the Add to Project button. Did you also have to download the regular Unreal Engine along with the Unreal Editor for Aximmetry? I notice you have the "Launch Unreal Engine 5.2.1" button at the top, but I don't have any versions in my launcher.
@hyperbolicfilms 2 days ago
Maaaaaybe you need Unreal Engine. I've always had it installed, so that could be the case. It's almost like the Epic Games Launcher only sort of knows that UE for Aximmetry is installed.
@SitinprettyProductions 2 days ago
@@hyperbolicfilms Thanks!! Apparently only some of the assets in the marketplace can be added to a project. Admittedly, I was trying to add a different environment that took up less space. I'll try the one you added and see if it makes a difference.
@Abdullah-ku6er 15 days ago
Nice work... promising!
@robhulson 15 days ago
Oh thank God, I was five seconds in thinking, "46 minutes might be too much of this guy." Glad to be wrong!
@3DGraphicsFun 17 days ago
*Nice clothing!!!* Cheers! *IcloneFun🤗🤗🤗*
@3DGraphicsFun 19 days ago
*Nice collection!* Refreshing to see some great designs and patterns. They look awesome! ...Cheers! *IcloneFun🤗*
@hyperbolicfilms 18 days ago
Thanks! I am hoping to build up a big library of different kinds of clothing so I don't reuse the same items over and over.
@3DGraphicsFun 18 days ago
@@hyperbolicfilms Thanks. I have collected probably several thousand outfits over my last 5 years using iClone, so it's nice to see refreshing stuff! *Hugs!🤗🤗🤗*
@michaelbarlow1700 20 days ago
Thanks, please do more of these videos, great stuff 👍
@carlosguimaraes7202 24 days ago
Thank you very much :)
@nguyenphuoc7372 27 days ago
I have set up the network layers correctly, but Aximmetry Eye cannot connect to the PC.
@hyperbolicfilms 26 days ago
I'm not sure I have an answer on that. You can try at the Aximmetry forum, they usually respond within a day or two. my.aximmetry.com/posts
@huseenmorad3944 1 month ago
The video I posted previously was the beginning of this application
@DreamTv_243 1 month ago
Hello, and thanks a lot for this video. It's helping us get started with Aximmetry using a phone as a tracker before we invest in a more professional solution like the VIVE MARS. However, I would like to know: what calibration material do you use, and where do we find it? Thanks a lot!
@baaotv5522 1 month ago
Thank you very much, you are the first person to give detailed instructions about it 🥰🥰
@hyperbolicfilms 1 month ago
I am glad you liked it. I am just recording a video today on how to use the iPhone as a tracker with a cinema or mirrorless camera. It's basically the same as this, but with a few extra steps.
@Doitanyway5 1 month ago
Hi, thanks for the video. I was wondering, what are your system specs? Or what do you think are the minimum requirements to do exactly this? I tried finding system requirements, but it basically says "the best GPU you can get," and I'm trying to work within a budget.
@hyperbolicfilms 1 month ago
My system was top of the line in 2020 but is fairly dated now: Intel i9-10900KF @ 3.7 GHz, 32 GB RAM, Nvidia GeForce RTX 3080. Since doing this video, the largest output size I've been able to record without dropping frames is 2560x1440. UHD is a bit too much; HD is no problem. The GPU carries the heaviest load according to Aximmetry's Processor Load info. If you want to shoot at UHD, you'll probably need a 4080 or faster, but it's hard to know without testing it.
@hyperbolicfilms 1 month ago
I should also say that it is probably scene dependent. The scene I've been testing with is fairly complex, so that might be eating extra GPU.
@aldermediaproductions695 1 month ago
Any chance of redoing with 5.4? Hopefully they’ve simplified the process. There are like 75 steps to put a thing in front of a camera. I’ll never get Unreal.
@hyperbolicfilms 1 month ago
I haven't yet tried the new VCam app to see if it has potential in this old-style way of doing things. I did just release a new tutorial using Aximmetry, with which you can now very easily use an iPhone for both the camera and the tracking. I'm still testing a DSLR/mirrorless camera with the iPhone as a tracker, but my initial tests looked promising. I'll have a tutorial on that as soon as I'm sure of the workflow. Aximmetry uses Unreal Engine to work, but really takes care of the camera input and keying aspects. There is also a new subscription-based app called JetSet from Lightcraft Pro that lets you use an iPhone for virtual production.
@aldermediaproductions695 1 month ago
@@hyperbolicfilms Thanks for the detailed reply. I saw the JetSet video and it looks pretty promising, although I don't know if the price is right. Might be worth a look, though. I'd tried their older app, which was basically FaceTime AR and wasn't very good, but hopefully they've ironed it out. I hope that someday there's a simplified method, and also maybe some Unreal slimming-down, i.e. a "beginner mode" that works but offers only a small percentage of the options, because it's pretty overwhelming.
@PHATTrocadopelus 1 month ago
Awesome
@JonHuarte 1 month ago
Great video! Thanks
@G.I_BRO_SHOW 1 month ago
Thanks for the knowledge, much appreciated. Cheers!
@user-gl2vj2ff5y 2 months ago
cool!
@ahmedusef-cd6pt 2 months ago
Awesome tutorial, thank you, sir. How can I use it over Wi-Fi? I'm trying to find a way to connect my iPhone to my computer using a static IP, but I'm failing. I would be very thankful if you could help me.
@hyperbolicfilms 2 months ago
I would test it first over your normal Wi-Fi network and see how it works. The official Aximmetry video on Eye suggested a private network, but if your network is not too busy you may be fine without one. If you did want a private Wi-Fi network just for the computer and iPhone, you could get a second little Wi-Fi router and link both devices to it. I don't think you'd need a static IP for that. You'll just have to run ipconfig on Windows to see the computer's IP, and check your iPhone's Wi-Fi settings to see what its IP is. I use automatic IP addresses to record facial mocap in iClone all the time and I don't have any problems.
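(Editor's note, not from the original comment: the "check the computer's IP" step above can also be sketched in a few lines of Python, as an alternative to running ipconfig. The helper name `local_ip` and the probe address `192.0.2.1` are illustrative choices; the point is only to read which LAN address your PC would use, so you can confirm the PC and the iPhone sit on the same subnet, e.g. both `192.168.1.x`.)

```python
import socket

def local_ip() -> str:
    """Return this machine's LAN IP address (best effort)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # A UDP connect() sends no packets; it only selects the network
        # interface that would route to this (documentation-only) address.
        s.connect(("192.0.2.1", 80))
        return s.getsockname()[0]
    except OSError:
        # No usable route (e.g. an offline machine): fall back to loopback.
        return "127.0.0.1"
    finally:
        s.close()

print(local_ip())  # compare this subnet with the IP in the iPhone's Wi-Fi settings
```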
@G.I_BRO_SHOW 2 months ago
Legend 👍🏼
@hyperbolicfilms 2 months ago
You can now do this much more easily than the way I was doing it in Aximmetry, with their new Eye app.
@eberevivianakuche6368 2 months ago
When you talked about your difficulty installing Stable Diffusion, I was laughing; I could 100% relate.
@feedyourmind1 3 months ago
😂😂 thanks Rodrigo
@dwainmorris7854 3 months ago
Yes, you're absolutely correct, it has too much censorship. That's why I'm going to go ahead and pay that $2,000 to buy a gaming computer so I can run Stable Diffusion on it directly. I'm tired of being told no. It won't let you do fight scenes, and it won't let the figures hold weapons, which is a must if you're going to make comic books. If you're going to make superhero comic books, you're going to have to show some violence, and Midjourney, as well as others like Leonardo, won't let you do it.
@itanrandel4552 4 months ago
Excuse me, how do I add the speech bubbles?
@CessarVillarreal 4 months ago
It burns my eyes to see people using ChatGPT or SD with the white theme 😂😂
@crimson_fire_Dragon1 4 months ago
Darn, so you need to pay for Midjourney and still can't do what you want.
@artbywaqas 4 months ago
@hyperbolicfilms Hi, I'm also using iClone for facial mocap. How did you end up recording the dialogue? Tech support told me you have to use a wired mic connected to your PC, and that the Live Face app will only record motion, not sound. I'm also trying to record body and face mocap at the same time. I'm using Rokoko Vision for body mocap, which is also AI-based and not live. BTW, Rokoko's Head Rig is quite good and affordably priced at $295.
@alibaba-wy1iv 5 months ago
How do you make every generation the same character? I heard some people use their own LoRA training. Can you make a tutorial for it? Thanks!
@WalidDingsdale 5 months ago
I enjoy listening to your reflections, comments, and other talks about AI's implications from a professional artist's perspective. Keep going!
@WalidDingsdale 5 months ago
Thanks for sharing this amazing walkthrough combining SD and PS. This is my first comic art class, and it will interest many. Keep going with more professional comic insights and ideas.
@Lalambz 5 months ago
Let's go!!! :))
@todosmiros8119 5 months ago
I want to make Black people. Is everyone in these things white/Asian-looking?
@Elaneor 6 months ago
When I found this video, I hoped the author would use the IP-Adapter from the ControlNet extension...
@myst1049 6 months ago
Hi, I'll probably go unnoticed, but can you make an anime-styled comic using Stable Diffusion, like those Korean manhwa? Comics with beautiful 2D characters always have more viewers. Many people read Chinese manhua just because of an OP MC; the storyline is usually an overused plot. Anyway, I'd like it if you made a series for beginners like me.
@godofdream9112 6 months ago
Sir, please make more videos on this topic... I want to make a comic.
@d.banksdesigns1995 7 months ago
AWESOME VIDEO SIR. EXTREMELY HELPFUL.
@lifestoryentertainment 7 months ago
Your voice work is fantastic!!!!!!
@ka9648 7 months ago
😂
@MrSka7cis 7 months ago
Thanks for sharing. I just 3D-printed this binaural mic setup using stereo clippy mics. I wasn't sure how you mounted the mics internally, so I just taped them in to test it out. It is working fantastically. I could enlarge it to fit the 12 mm diameter rather than the 7.5 mm diameter of the Polsen OLM-20 mics, but then the ears would be too big.
@Xandercorp 7 months ago
My dude, you don't have to wait 5 minutes to load one of those... Why is your install so slooooow?
@fadimantium 7 months ago
Amazing!
@petroglyphsentertainment8498 7 months ago
Which basic iPhone model can I use for motion capture?
@hyperbolicfilms 7 months ago
I'm not sure, MoveAI has changed what services are available now. Their MoveOne app needs an iPhone, but I think your best bet is to download the app from the App Store and test out if it works on your device. www.move.ai/single-camera
@petroglyphsentertainment8498 7 months ago
Can I use an iPhone SE 1st gen for motion capture?
@ascarselli 8 months ago
Thanks for the video. I had a feeling this process would require a little bit of external editor work. Hopefully Reallusion will implement a quick solution in future iterations.
@TheSoleProprietor 8 months ago
Sometimes the advanced tools don't give me the option of a batch count, just everything else, like choosing the model, adjusting the random seed number, etc.
@TheSoleProprietor 8 months ago
I made the mistake of trying to prompt a scene where I square off against Bruce Lee as a 12-year-old challenger... and it banned me! Sometimes these stupid AI engines get their minds in the gutter and misinterpret prompts about minors as child porn. While that is understandable, being deliberately offensive was not what I intended. Rather than bugging the administrator to get unbanned, I don't log in to my account anymore and use SD XL 1.0 anonymously. I would still like to learn some of the AI tweaking tools, such as the random seed number, Base Guidance, and negative prompts, and how to use them more effectively.
@C4DRS4U 8 months ago
Cool little horror film, appropriately atmospheric.
@hyperbolicfilms 8 months ago
Glad to hear that! The last (live action) horror film I made really lacked in that department.
@kissler101 8 months ago
Very cool... I liked it. Great dialogue and good pacing.
@hyperbolicfilms 8 months ago
Thanks!
@arielmorandy8189 9 months ago
I think you really need to practice Stable Diffusion before making images. You are not using SD properly. You need to use img2img, not txt2img. You cannot refine an image by trying to have one single prompt generate it. Try looking at Sebastian Kampf's tutorials.
@rachelleventhal4718 9 months ago
Love the story! I agree that the lip sync was good. How did you get the 3/4-angle lip sync to work?
@hyperbolicfilms 8 months ago
I was using Wav2Lip, which seems to get the lips right even on a profile shot. It takes quite a while to process and upscale a shot, but the results are great.
@gamersmania8494 8 months ago
@@hyperbolicfilms How do you upscale the video to decent quality after Wav2Lip degrades it?
@hyperbolicfilms 8 months ago
There is a Wav2Lip + GAN version that does the upscaling and cleaning automatically. I used that in a Google Colab ($14 a month) to be able to run it somewhat fast, but it was usually 10 minutes per clip and a lot of clicking.
@NiccoWargon 9 months ago
This was a good story. It also had the best lip-sync I've seen in any of the competitors yet. Nice work!