Howdy all,
tl;dr: is using 'TIME' in a shader pointless when passing your own time from a GDScript's _process() seems to do a better job?
Very new here, and I've only recently started tinkering with some of the shader-based freebies on the asset library, in particular Water and Grass. The issue is that I notice a huge performance dip when including such shaders (a drop from ~145 fps to ~45 fps, which compounds the more assets are involved).
I've read about poor integrated GPU performance in general and how 4.0 will likely improve this; however, after tinkering I noticed that if I pass my own 'process_time' variable from the parent GDScript's _process(delta) function to the shader as a uniform, all of the performance comes back.
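For reference, here's roughly what I'm doing (just a sketch, Godot 3.x; the exported variable and the 'process_time' uniform name are simply what I picked):

```
extends Spatial

# Assumes the water/grass ShaderMaterial is assigned to this slot in the editor.
export(ShaderMaterial) var water_material

var process_time := 0.0

func _process(delta):
    # Accumulate time on the CPU and push it to the shader every frame,
    # instead of relying on the shader's built-in TIME.
    process_time += delta
    water_material.set_shader_param("process_time", process_time)
```

On the shader side I just declare `uniform float process_time;` and use it wherever the original shader used TIME.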
I guess my question is: is this normal? And is my solution viable/scalable as a fix for my shader performance woes, or is it likely to bite me in the ass down the line?
I'm not sure of the reason behind the difference, but my guess is that a shader's 'TIME' updates as fast as possible, or has a high overhead for some other reason, while the GDScript _process(delta) only runs once per frame (roughly 1/60th of a second) and so causes less strain?
Any clues or links to documentation I have missed would be great.
Thanks