Sonic Flowers - TouchDesigner x StableDiffusion Tutorial 1

74,438 views

bileam tschepe (elekktronaut)

1 day ago

In this tutorial we're having a first look at how to integrate the image-generation AI tool #stablediffusion into TouchDesigner. We're creating an audio-reactive texture in an independent component so we can do frame-by-frame animation without worrying about losing frames. TD is running in real-time, but the generation process itself is not (yet) real-time, as one frame takes 5-10 seconds to create.
Please keep in mind that this is very experimental and just one way of doing it (my current approach). I'm very open to hearing about ways of improving this or ideas on how to expand it. See this more as a starting point and inspiration than a perfectly refined technique.
Make sure that you download the automatic1111 WebUI (below) and some models to work with. The models should be placed inside "stable-diffusion-webui\models\Stable-diffusion".
Only tested on Windows, running on an Nvidia RTX 3070 Notebook version.
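To make the moving parts a bit more concrete: under the hood the SD_API component talks to a locally running automatic1111 WebUI, which typically needs to be started with the --api flag. Below is a minimal, standalone Python sketch of such a text-to-image request (this is not dotsimulate's component and not the exact settings from the video; the port, prompt and parameter values are placeholders):

import base64
import requests

# Default local address of the automatic1111 WebUI when launched with --api
url = "http://127.0.0.1:7860/sdapi/v1/txt2img"

payload = {
    "prompt": "ultrarealistic surreal flowers, ultra detailed, generative art",
    "steps": 20,     # fewer steps = faster frames, rougher detail
    "width": 512,
    "height": 512,
}

r = requests.post(url, json=payload, timeout=120)
r.raise_for_status()

# The WebUI returns base64-encoded PNGs in the "images" list
with open("frame_0001.png", "wb") as f:
    f.write(base64.b64decode(r.json()["images"][0]))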
dotsimulate's SD_API: / sd-api-1-22-85238082
automatic1111: github.com/AUTOMATIC1111/stab...
Models: civitai.com/
Parameters explained: blog.openart.ai/2023/02/13/th...
Create your own API: • DIY Stable Diffusion A...
IG Post: reelCtue_S...
The prompt used for this example:
ultrarealistic surreal flowers, ultra detailed, texture, generative art, focus, wes anderson, kodak, light and shadow
When you're working with animation in the independent comp, it makes sense to use me.time.seconds instead of e.g. absTime.seconds, so the animation follows the comp's own timeline.
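As a tiny illustration of that tip (operator names and values here are placeholders): an expression inside the independent comp should reference the comp's own clock, not the global one.

# Global clock: keeps advancing even while the independent comp's timeline is paused
absTime.seconds

# Clock of the component the operator lives in: only advances when the comp's own
# timeline plays or is stepped, which is what frame-by-frame rendering needs
me.time.seconds

# e.g. as a Translate expression on a Noise TOP inside the comp:
me.time.seconds * 0.2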
-----
00:00 Intro & Overview
02:40 Independent Comp & Audio Setup
07:50 Audioreactive Circles
16:00 Circles Processing
18:30 SD API Component
27:54 Feedback Loop
32:58 Frame by Frame
36:52 Exporting
40:34 Additional Info
43:16 Outro
-----
Files, exclusive content and more:
/ elekktronaut
Special thanks are included at the end of the video ❤
If you have any questions, feel free to ask.
IG: @elekktronaut
/ elekktronaut

Comments: 119
@dotsimulate 1 year ago
Such an amazing overview and powerful approach with the independent time component. This is awesome! Thank you!
@elekktronaut 1 year ago
thanks man, appreciate your work ❤️
@gozearbus1584 10 months ago
How do you show the actual result?
@Raul-ym4ly 1 year ago
YES FINALLY stable diffusion, thank you sooo much man!!!! Love your channel
@kiksu1 1 year ago
I've been waiting for this! You are a genius madman, thank you! 😄
@BasEkkelenkamp 1 year ago
This one is big!! Awesome and easy solution to a pretty complicated problem❤
@HYBE02 1 year ago
I really appreciate your effort. Many, many thanks.
@qde2 1 year ago
omg i know what im doing tonight! great job as always!
@marcoaccardi 9 months ago
Great video! waiting for pt 2
@excido7107 4 months ago
You my friend are a legend! I followed your video to the letter and finally actually understand TD a lot more! Thank you
@travhennekam16 1 year ago
gonna do this after work, thank you brother!
@marcovioletvianello 5 months ago
Great tutorial, thanks!!
@Nanotopia 1 year ago
Thank you for sharing this 💖
@Data_Core_ 1 year ago
Very very nice 👌
@SMAWxyz 5 months ago
thank you for this, wonderful! There doesn't seem to be a currentframe channel anymore, only one named framecount. The issue is that after every rendered frame the value goes back to 0, meaning it pulses twice on every render, which also means the independent base is cooking twice for every Stable Diffusion cook. Do you have any ideas there?
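One way to guard against that double pulse (a sketch only, not something from the video; the time COMP path is whatever you named yours) is to ignore the reset back to 0 in the CHOP Execute and only step when the value actually increases:

def onValueChange(channel, sampleIndex, val, prev):
    # framecount dropping back to 0 gives val < prev, so only a real increase steps the timeline
    if val > prev:
        op('independant/local/time').frame += 1
    return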
@Qwert_Zuiop 9 months ago
Awesome tutorial, thank you so much! I am not sure, but I think something people could stumble over is your writing of "independent" as "independant" and then getting the referencing in the code wrong at the end?
@iloveallvideos 5 months ago
LETSSS GOOOOOOOOOOO
@Luigih12 1 year ago
King
@jarygergely2074 9 months ago
I don't see the progress bar in the terminal and don't get the image, only when I generate the image from Microsoft Edge, which opens when I launch the webui. Can somebody help me with this?
@metrognomed 10 months ago
Is anyone else having trouble with Image-to-Image in SD_API? Text-to-Image works fine for me, but Image-to-Image does not seem to do anything. I don't see a progress bar in the terminal, for example.
@tinglesound6621 5 months ago
Hi! Sorry but I can't figure out how to create a custom parameter for the resample CHOP like you have in your simpleresample component. In this case, it seems like I need to edit the FFT size in the AudioSpectrum CHOP? Sorry if this is such a noob question but I have no clue how to do that!
@soprano3317 5 months ago
@elekktronaut I have the same question; I have been following your tutorials - do you have any other tutorial where you built the simple_resampler? Thanks
@yaraalhusaini2551 1 month ago
Did you ever figure this out? I am stuck on the same thing!
@bridgetteteare3856 2 days ago
I'm looking for help on this part too, any luck?
@nico3144 10 months ago
Question: whenever I hit Launch WebUI, it doesn't want to launch the WebUI in the command panel, it says invalid syntax error. Pls help
@ph0enixr 7 months ago
Awesome tutorial! I got everything working, but with the same settings as in the video I get runaway feedback where the background noise and any bright spots eventually go to white and start growing. Any suggestions on how to limit that, or maybe I missed some setting? Thanks!
@ph0enixr 7 months ago
I think I fixed it, I was over-sharpening. Thanks again!
@clee6030 2 months ago
Hi! Thank you for the great video :) I've completed all the steps and I generated 1000 frames over night, but I'm confused on how to create an mp4 file from the generated frames. What would you recommend that I do?
@mendoziya 1 month ago
Great video!!!! Which graphics card are you using?
@ysy69 6 months ago
This video is a gift and the best way to start 2024. Thank you. Do you know if the SD_API supports LoRAs?
@irgendwaer3000 10 days ago
I did some testing and for me the Select TOP you put in to smooth the flickering ended up giving me more flickering. Sometimes the AI is also triggered by the very low opacity input and "jumps" back to generate some elements at the "old" positions of the blobs, or gets stuck at positions. I tried to set up a feedback loop which fades out the fed-back animation over time, but couldn't figure out how.
@dariayakubovska1877 3 months ago
super klas masterklas kvas
@JiaCUI-gd3qu 6 months ago
How do I connect the local stable-diffusion-webui with TouchDesigner? I don't know how to build the SD-API in this video.
@AnyaTran 11 months ago
amazing tutorial!!! Though I have a question - when the time is triggered to play, instead of moving by just 1 frame, I think it jumps more (e.g. from 00:01:13:13 to 00:01:59:01). How can I fix that?
@elekktronaut 11 months ago
thanks! hmm that's odd. you can try a different approach someone on discord shared, which might be better anyway. so in the chop execute you'd use this expression: op('local/time').frame += 1
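For anyone wondering where that one-liner goes: it belongs in the onValueChange callback of the CHOP Execute DAT, roughly like this (a sketch; the 'local/time' path assumes the DAT sits inside the independent comp, so adjust it to your network):

def onValueChange(channel, sampleIndex, val, prev):
    # advance the independent comp's own timeline by one frame per generated image
    op('local/time').frame += 1
    return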
@Schall-und-Rauch 6 months ago
@@elekktronaut So I deleted the entire content of chopexec2 and added:
def onValueChange(channel, sampleIndex, val, prev):
    op('independent/local/time').frame += 1
    return
Then deactivated Off to On and On to Off and activated Value Change. At least the timeline moves on in frames now; unfortunately it jumps two at a time, but that seems good enough for me.
@spacefordigitalvisualresea8031 5 months ago
I guess you're not on 24 fps, right? It's somehow connected to the fps but I don't know why.
@AnaMariaPires8915 21 days ago
heyy I got stuck on the scale instancing part of this tutorial. I followed your 'simple-resample' chop tutorial, however I got stuck on what to specify for 'scale x' and 'scale y' under 'Scale OP'.
@electromagneticgoldstar7903 1 year ago
this is incredible, so much information here! thank you! *is working fine on Mac M1
@louisfievet9341 1 year ago
4 real ?!! OMGGG
@electromagneticgoldstar7903 1 year ago
@@louisfievet9341 for real!
@louisfievet9341 1 year ago
@@electromagneticgoldstar7903 Hi! I was wondering if you've had any problems? I did the whole GitHub process for Mac and I also got the famous sentence "To create a public link, set `share=True` in `launch()`", which I guess means everything is ok, like Bileam said. But unfortunately I can't create an image :( (Got an M1 too)
@mateuszsarapata 1 year ago
Damn, how did you do that? I'm trying to install AUTOMATIC1111 on my Mac and when I try to run ./webui.sh in the terminal there's one message over and over again - "Stable diffusion model failed to load" :/
@bardoof 2 months ago
Obviously, you are lying. Please do not misguide people here
@ThomasMYoutube 19 days ago
when using image to image, how do you change the input resolution?
@Schall-und-Rauch 6 months ago
In case anybody ran into the same problem: I couldn't launch the webui from the API node around 19:20. I had installed the WebUI following the instructions "Install and Run on NVidia GPUs" - Automatic Installation - Windows (method 1). Then I deleted the sd.webUI folder and followed the instructions "Windows (method 2)", first installing Python and then git, and it worked immediately.
@MikeHancho663 18 days ago
BUMP! Solved my issue thank you SO much!!!
@RussellKlimas 10 months ago
So I'm really close but on the CHOP Exec 2 I keep having the error of Cannot find function named OnValueChange (project1/chopexec2). I've gone over the tutorial a couple times and can't figure out what I'm doing wrong.
@Schall-und-Rauch 6 months ago
OnValueChange was deleted by him, so it shouldn't be in yours anymore.
@nime1575 1 year ago
Awesome tutorial, thank you! For me the workflow fails to run smoothly though; it seems there is a problem with the frame-by-frame workflow. When I use SD to proceed on currentframe, it doesn't always send the signal to play the independent timeline. When I hook up a timer to count up the currentframe, it updates as expected. Might be because I run on a Mac M1. Maybe it's some lag issue in TD. I will try this on a faster Windows PC soon.
@elekktronaut 1 year ago
Thanks! Interesting. There's another hacky way of doing this: use an Info CHOP on the texture coming out of SD_API and use the total_cooks channel instead of the currentframe. Or maybe you've got to use a Trigger CHOP to make sure it's really hitting one; maybe you're dropping frames somewhere. Is realtime turned off?
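A sketch of that Info CHOP variant (operator names are placeholders): point an Info CHOP at the TOP coming out of SD_API, watch it with a CHOP Execute DAT, and step the timeline whenever total_cooks goes up:

def onValueChange(channel, sampleIndex, val, prev):
    # total_cooks increments once per finished SD image, so each increase = one new frame
    if channel.name == 'total_cooks' and val > prev:
        op('independant/local/time').frame += 1   # adjust the path to your own naming
    return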
@nime1575 1 year ago
@@elekktronaut lets gooo, turning off realtime fixed it. Also, running on a faster setup works but might drop frames. Thanks again!
@elekktronaut 1 year ago
@@nime1575 cool! frame drops shouldn't happen when realtime's turned off, that's literally why you turn it off 😌
@melihg.kutsal7566 1 year ago
@@nime1575 I was having the same issue; thanks to your comment I figured out that I also hadn't turned off realtime. 🥲
@massakalli 9 months ago
Hello. Thank you so much for the video. For me, whenever I link the Audio CHOP to Moviefileout, it prevents the local timeline from advancing for some reason. And audio stops moving forward. Can you help?
@massakalli 9 months ago
Upon further inspection, this only seems to be happening with "Stop-frame Movie". The audio file works fine with other types of export.
@spacefordigitalvisualresea8031 5 months ago
When I change the settings to Movie instead of Stop-Frame Movie it's completely broken. How did you manage to make that run? @@massakalli
@AngelsEgg9 1 month ago
Because I am trying this now with a newer version of dotsimulate's SD API, when I grab a null from the API it doesn't have a currentframe element in it, just: Streamactive, framecount, and fps. :( How do I go about this now?
@NicholasCarn 7 months ago
Thanks for this :) strange but for me the frame by frame seems to skip forward 2 frames at a time for some reason and I seem to have duplicate frames in the final movie. Not an unfixable issue as I can workaround it by rendering individual frames and removing the duplicates, but I can't figure out why it's doing that yet...
@Schall-und-Rauch 6 months ago
I have the same problem, just wanna push your question up a bit.
@spacefordigitalvisualresea8031 5 months ago
pay attention to have your fps at 24
@istarothe 2 months ago
I need some help. I have recorded, but the recording almost speeds past the frames, and the sound is sped up, basically like a screech. Realtime is turned off.
@user-qr3hh1lp4u 3 months ago
I'm wondering how to make this using ComfyUI for TD?
@Schall-und-Rauch 6 months ago
Has anyone figured out how to have less change between the individual frames, so the movie looks more fluid? None of the parameters I tried in Transform and Level really worked.
@abrandtneris 6 months ago
did you figure this out? have the same question
@spacefordigitalvisualresea8031 5 months ago
You should read into ControlNets. They control the change between images. @@Schall-und-Rauch
@baoqiancheng8224 7 months ago
Hi, I'm a beginner to TouchDesigner, I love this tut, can you please explain what this simple-resample is?
@user-lp1mn6hf4g 6 months ago
me too
@soundswhile9529 6 months ago
@@user-lp1mn6hf4g it's just a resample chop. Make sure to turn time slice off.
@blackleatherboots 5 months ago
Hi, I am new as well, and am wondering how to specifically change the number of samples using a resample chop. When he types "60" into the "num samples" field at 8:33, I don't know how I would resample to 60 using a default resample chop. @@soundswhile9529
@soprano3317 5 months ago
@@soundswhile9529 thank you!!
@connergriffith3601 1 year ago
Sorry for the naive question: would this in theory mean each generated frame is a Stable Diffusion token (so rendering would cost money, basically)? Thank you :)
@aulerius 11 months ago
Yes if you're accessing a paid cloud backend, but afaik the automatic1111 is meant for local use on your own hardware, which is free as long as your hardware can handle it.
@elekktronaut 11 months ago
exactly. no tokens involved, this is running locally and there's no limit but your gpu. it's completely free (except for the patreon support for dotsimulate, but there's alternatives as well) :)
@apoca07 9 months ago
I did everything and it works fine, but I have a problem: when I activate Keep Generating, the image input does not change but the frames advance; when I generate with the Generate Frame Pulse button it works. Where should I check?
@apoca07 9 months ago
NVM, turning realtime OFF fixed it!!!!! I need to read more lol
@KaleidoKurt 2 months ago
Getting stuck at 5:06: when I change the rate to cookrate(), the Rate field goes to 0.01 and turns red. Anyone else experiencing this?
@geoseatooo 6 months ago
Followed all steps, but the independent timeline still doesn't advance once an image is generated. The only difference I can see is that the newer version of SD_API doesn't have anything linked to the (current frame) parameter by default (within the SD API); your version however does have something linked there. Would you mind sharing what your expression is there?
@Schall-und-Rauch 6 months ago
As suggested by Bileam in another comment, I deleted the entire content of chopexec2 and added:
def onValueChange(channel, sampleIndex, val, prev):
    op('independent/local/time').frame += 1
    return
Then deactivated Off to On and On to Off and activated Value Change. At least the timeline moves on in frames now; unfortunately it jumps two at a time, but that seems good enough for me. Can you follow?
@visionz5776 4 months ago
Hi, have you solved that? 😢 I've got the same problem as you, and I also see the independent time value down below shows red, but in the tutorial it's white.
@simarimbunetaliasidabutar4476 5 months ago
Please, can you continue this video, combining it with a Kinect?
@HeLevi 10 months ago
Hi bileam, I followed every single step, but my independent timeline didn't seem to move. I put a Count CHOP after the Delay CHOP and it seems like every move was detected but made no difference to the Time COMP. I downloaded the exact same model you were using, but my flower is so plain, with no details like leaves, stems and textures.
@Qwert_Zuiop 9 months ago
Maybe you were calling the Base component "independAnt" like he is and then wrote "independEnt" in the code, or the other way around? That was something I stumbled over...
@adrianarvidsson1384 5 months ago
At first it's not supposed to move.
@triangulummapping4516 2 months ago
Which Windows requirements do I need to run all this?
@TheGladScientist 1 year ago
nice technique! quick question: if using 24FPS video, why resample the audio to 60? as a sidenote: FlowFrames and Video2X are great free alternatives to Da Vinci and Topaz :)
@elekktronaut 1 year ago
thanks, also for the recommendation! the resampling defines the amount of circles for instancing :)
@TheGladScientist 1 year ago
@@elekktronaut ahhh, missed that bit (admittedly watching at 1.5x speed lol), thanks for clarifying!
@kiksu1 1 year ago
There is also the Deforum AI video making extension for Automatic1111 which has a video upscaler and it does interpolation too 👍🏻
@TheGladScientist 1 year ago
@@kiksu1 definitely. would be verrry interesting to extend the SD COMP to also support Deforum, Warp, and/or TemporalNet from within TD
@aryansingh5470 6 months ago
For some reason my frames aren't updating while recording... it records and updates a few frames correctly, then overlaps the frames without updating the base noise, and the audio also glitches. Any idea why that's happening?
@secilkurtulus9368 3 months ago
I have the same problem, did you manage to fix it?
@istarothe 2 months ago
Same here, my audio basically glitches and generations just fly past.
@spacefordigitalvisualresea8031 5 months ago
Everything works fine for me but the audio sounds laggy. I have the audio fixed to the timeline and realtime is off. Any ideas where to look for the error?
@francescbecerracabrera2837 3 months ago
Same problem here
@spacefordigitalvisualresea8031 3 months ago
A friend told me he had to adjust the animation speed of a noise to fix it, but I haven't touched the project since then. @@francescbecerracabrera2837
@peacekulture 5 months ago
I am stuck at the point of launching the SD Webui from TD. MacBook Pro M2 Max. The Webui launches fine from Terminal but when I put the path into the SD_API SD Webui Folder address in the API Settings, nothing launches when I press the Pulse button. Has anyone else had any similar issues or could point me in the right direction? Eternal thank you.
@clee6030 3 months ago
Hitting the same issue! Did you figure out how to get unblocked?
@time_itself 3 months ago
I believe that Streamdiffusion is only compatible with windows and NVIDIA cards; you might be SOL
@leotromano 2 months ago
also curious if you had any luck?
@xiaotingtan3369 4 days ago
I ran into the same issue, did you figure it out?
@digitalflick 1 year ago
got stuck on the simple resample chop, how do i create that?
@elekktronaut 1 year ago
it's a custom .tox you can find on my patreon but it's really just a resample chop. look at the example here: kzfaq.info/get/bejne/hLB1Z5t6u-Cyhpc.html
@digitalflick 1 year ago
@@elekktronaut thanks! found it!
@pierreleveille515 6 months ago
Hello, I love your tutorials! I have a question: if my Stable Diffusion takes 6 minutes to generate an image (I have an AMD GPU that can't use the Nvidia features, and I've been researching that for a complete day lol), do you think it is still possible to do your tutorial?
@hecosmos1996 11 months ago
Why is the SD_API link not working?
@elekktronaut 11 months ago
what do you mean?
@seulkireadlim2869 4 months ago
I'm a student. Where is the SD API in TouchDesigner? PLEASE, I want details, please.
@brunotripodi 5 months ago
Hi! I need help, I don't know why this message pops up when I press "Launch webUI": Couldn't launch python exit code: 9009 stderr: No se encontró Python; ejecuta sin argumentos para instalar desde Microsoft Store o deshabilita este acceso directo en Configuración > Administrar alias de ejecución de la aplicación. (Python was not found; run without arguments to install from the Microsoft Store, or disable this shortcut in Settings > Manage app execution aliases.) Launch unsuccessful. Exiting. Presione una tecla para continuar . . . (Press a key to continue . . .)
@L3K1P 2 months ago
Same here. Have you found a solution?
@mustaTraceur 5 months ago
Do you know if this works on Mac?
@ericgoldstein2051 3 months ago
it does, but image generation takes about 1:30 on my M1 at 25 samples
@ans-mf6ki 2 months ago
Please help: "python3.10 executable not found."
@mobioboris1 9 months ago
Hello! Thanks for this tutorial. To challenge myself I chose to create my own API with the tutorial you linked: kzfaq.info/get/bejne/atGYlq-nrNDTiGw.html As a beginner I see there is a huge difference between the API of the Interactive & Immersive HQ and the one from DotSimulate - I mean a lot of parameters we don't have. Also, the .toe created in the tutorial is a BASE, like a closed box. How can we connect other operators to it? Thanks.
@prismatic.visuals 1 year ago
this is amazing, thank you! Found a way to trigger the next frame of the independent component that's maybe a bit simpler, since it can be run as a single script:
timeOp.par.play = 1
run("timeOp.par.play = 0", delayFrames=2)
@vsevolodtaran4818 11 months ago
Could you please add more information about your tip? Your syntax is different from what is shown in the video: op('independant/local/time').par.play = 1
@rudolf_II 9 months ago
Please give more information on where to set up the script. Thanks.
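Putting the two snippets from this thread together, a self-contained version that could run from wherever you currently react to a newly generated frame might look like this (a sketch; the 'independant/local/time' path follows the naming used in the video, adjust it to yours):

timeOp = op('independant/local/time')
timeOp.par.play = 1
# pause again two frames later so the timeline only advances by a single step;
# the full path is repeated inside run() because the deferred string executes
# later, outside this script's local scope
run("op('independant/local/time').par.play = 0", delayFrames=2)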
@Markdood88 1 year ago
This one's been on my wishlist for a while now! 🥹 So happy to finally see a way to connect that webui with TD!!!