Physics-based ripple shader

By Quintin Yu

The goal

My goal was to create water ripples in a shader based on the impact of objects. A ripple should start at the point of contact between the water and an object, and the size and height of the ripples should depend on the size and speed of that object. In the gif below you can see ripples in the water after something is tossed in.

Source: https://www.youtube.com/watch?v=T9QwiBFN9gI&ab_channel=IRISEarthquakeScience

Creating ripples

In the first few weeks of my research, I was unable to find a good source on creating a ripple shader in HLSL; all the sources I could find were made in Shader Graph or on Shadertoy, and I was unable to translate their code to HLSL. I was, however, able to find really good examples of what I wanted to create, such as the following WebGL shader: https://madebyevan.com/webgl-water/

WebGL Water by Evan Wallace

The first ripple shader I was able to create came from Freya Holmér's tutorial about shader basics, which I was watching to get a better understanding of shaders, in the hope it would help me translate Shadertoy and WebGL shaders to HLSL.

Source: Freya Holmér: https://www.youtube.com/watch?v=kfM-yu0iQBk&list=WL&index=20&t=1831s&ab_channel=FreyaHolm%C3%A9r

This ripple effect is created by centering the UV of the mesh. From this centre we calculate the radial distance, which is the distance between the centre and the current UV position. Freya then uses a cosine wave over this distance to create the ripple effect seen in the image above. Last but not least we add some dampening/dispersion by multiplying the ripple height by one minus the radial distance, so the waves fade out toward the edge. As seen in the code below.

//Code from Freya Holmér: Shader Basics, Blending & Textures • Shaders for Game Devs [Part 1]
float GetRipple(float2 uv) {
     float2 uvsCentered = uv * 2 - 1;            // remap UVs from [0,1] to [-1,1]
     float radialDistance = length(uvsCentered); // distance from the centre

     // TAU = 2 * pi; a cosine wave travelling outward over time, remapped to [0,1]
     float ripple = cos( (radialDistance - _Time.y * 0.1) * TAU * 5) * 0.5 + 0.5;
     ripple *= 1 - radialDistance;               // dampen the wave toward the edge

     return ripple;
}

While this code works for a basic ripple, it's not the end product I want to end up with, for two reasons. The first is that it's all based on UVs, not on world coordinates. The second is that the waves are unable to bounce off the edges or off objects that might obstruct them.

After being stuck for a few weeks in my research, I decided to take a step back and look into how ripples work in real life and what the mathematics behind them are, in the hope of getting a better understanding of what I had to do. When looking at ripples in reality you need to look at propagating waves. Propagating in this context means to travel through a space or material (Propagation definition). When looking at a basic propagating wave there doesn't seem to be a standard: the formula can be written with either a sine (η(x, t) = a·sin(θ(x, t))) or a cosine (y = a·cos(ωt + δ)). From what I could find and see, it doesn't make a big difference whether you use the sine or the cosine form to make the ripples. Freya's shader, for example, used the cosine form.
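Written out (my notation; a is the amplitude, k the wave number, ω the angular frequency and δ a phase offset, so both forms describe the same kind of travelling wave):

\[
\eta(x,t) = a\,\sin(kx - \omega t) \qquad \text{or} \qquad y(t) = a\,\cos(\omega t + \delta)
\]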

Looking further at waves in reality, we learn that two types of waves are created when water is disturbed by an object entering its surface: capillary waves and gravity waves. The former are shaped by the surface tension of the liquid, while the latter are driven by gravity trying to restore equilibrium at the surface. In the image below you can see the difference between the two.

Source: https://www.researchgate.net/figure/Capillary-Waves-and-Gravity-Waves_fig53_333672415
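For reference, the standard deep-water dispersion relations make this difference concrete (these are textbook results, not taken from the figure; σ is the surface tension, ρ the density and k the wave number):

\[
\omega^2 = g\,k \;\;\text{(gravity waves)} \qquad \omega^2 = \frac{\sigma}{\rho}\,k^3 \;\;\text{(capillary waves)}
\]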

In the shader I am going to create, I am focusing on gravity waves to keep it simple for myself. Future improvements would include giving the ripple both capillary and gravity waves. Other improvements will be listed at the end of this blog.

Having taken a step back, I was able to find a water shader that did exactly what I wanted. This shader was created by a Unity user named Simplestar-Game (Shader: Simple Interactive Water). After a small email exchange with Simplestar-Game, I understood that I should switch from normal shaders in URP to compute shaders in the built-in 3D pipeline of Unity, because for my goals there were better sources on water ripples for compute shaders than for normal shaders. The pipeline change was necessary due to issues with the kernels needed for compute shading not being detected in the URP version of Unity. I was unable to find the reason behind this issue.

Creating the compute shader

To create my shader I started with a simple gray tessellated plane, which I got from the shader example in the earlier mentioned video by Freya Holmér. Tessellation in this context means subdividing a mesh into more triangles and vertices. By increasing the tris and verts you can get more detailed deformation from your mesh.

A compute shader works differently from a normal shader. The biggest differences are the following:
1) You don't add the shader to a material but access it via a C# script.
2) A compute shader runs outside the normal rendering pipeline. You assign kernels and thread groups on which the shader has to run, which lets the GPU process many texels in parallel; a normal shader, in contrast, is tied to the vertices and fragments of whatever the pipeline is currently drawing.
3) All properties have to be set from the C# script and sent to the shader, as will be shown later. You are still able to create your own variables inside of the compute shader, of course.

The compute shader I ended up with is heavily based on a video created by Spontaneous Simulations.

To start the creation of the shader you need to make three textures and enable random write on them. If you don't enable this, the compute shader will be unable to write to the textures and therefore unable to deform the mesh.

void Start()
{
    // Three states of the simulation: previous (Nm1), current (N) and next (Np1)
    InitializeTexture(ref NState);
    InitializeTexture(ref Nm1State);
    InitializeTexture(ref Np1State);

    waveMat.mainTexture = NState;
}

void InitializeTexture(ref RenderTexture tex)
{
    // Signed 16-bit channels so the wave troughs can go negative
    tex = new RenderTexture(resolution.x, resolution.y, 1, UnityEngine.Experimental.Rendering.GraphicsFormat.R16G16B16A16_SNorm);
    tex.enableRandomWrite = true; // required so the compute shader can write to it
    tex.Create();
}

In this code snippet, you can notice the use of UnityEngine.Experimental.Rendering.GraphicsFormat.R16G16B16A16_SNorm. This gives us signed 16-bit RGBA channels, which we need for the waves: the troughs of the ripples go into the negatives, and the SNorm format allows those negative values instead of clamping them to zero.

Every frame the textures are shifted along, so that the current state becomes the previous state and the freshly computed state becomes the current one, before they are sent to the compute shader.

void Update()
{
    // Shift the states: current becomes previous, newly computed becomes current
    Graphics.CopyTexture(NState, Nm1State);
    Graphics.CopyTexture(Np1State, NState);

    // Send the textures and properties to kernel 0 (the wave simulation)
    waveCompute.SetTexture(0, "NState", NState);
    waveCompute.SetTexture(0, "Nm1State", Nm1State);
    waveCompute.SetTexture(0, "Np1State", Np1State);
    waveCompute.SetVector("resolution", new Vector2(resolution.x, resolution.y));
    waveCompute.SetFloats("dispersion", dispersion);
    waveCompute.Dispatch(0, resolution.x / 8, resolution.y / 8, 1);

    waveMat.mainTexture = NState;
}

By accessing waveCompute you can set the properties via code. As mentioned earlier, this is the only way to set the properties of a compute shader, as they are otherwise inaccessible. At the end of the Update you can see a Dispatch call. This has to be placed in Update because the shader won't be run again otherwise. The method takes four parameters, of which the last three belong together:
1) The kernel index; a kernel in a compute shader is a method.
2, 3 & 4) These divide the work into thread groups so the GPU can run the texels in parallel. The second and third parameters are the resolution divided by the thread counts you have set in the compute shader. The fourth parameter can be set to 1, considering we use a flat plane.

In the last line we set the texture of the material equal to the NState texture that was created in the Start method.
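One thing to watch out for here: the integer division resolution.x / 8 silently skips texels whenever the resolution is not a multiple of the thread-group size. A minimal sketch of a safer dispatch, assuming the same field names as above (this is not in my original script):

    // Round the group counts up so no texels are skipped
    int groupsX = Mathf.CeilToInt(resolution.x / 8f);
    int groupsY = Mathf.CeilToInt(resolution.y / 8f);
    waveCompute.Dispatch(0, groupsX, groupsY, 1);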

As explained in my goal, I want ripples to be created when an object collides with the water plane. The biggest problem I had here is that to place the ripples I needed the UV coords instead of the world position. After searching around I found that a raycast can return textureCoord, which translates to the UV coords (note that this only works when the hit collider is a MeshCollider). In the collision method you can access the contact point between the colliding objects. This returns the world position, from which you can then cast a ray back toward the surface along the contact normal, as created in the code snippet below.

private void OnCollisionEnter(Collision other) {

    foreach(ContactPoint contact in other.contacts){
        RaycastHit colHit = new RaycastHit();

        // Start one unit away along the negative normal and cast back toward the contact point
        Ray ray = new Ray(contact.point - contact.normal, contact.normal);
        if(Physics.Raycast(ray, out colHit)){
            //Debug.Log(colHit.textureCoord);
            //Debug.Log(colHit.collider.name);

            // Send the hit UV and the object's mass to kernel 1 (PlaceRipplle)
            waveCompute.SetTexture(1, "NState", NState);
            waveCompute.SetTexture(1, "Nm1State", Nm1State);
            waveCompute.SetTexture(1, "Np1State", Np1State);
            waveCompute.SetVector("effect", effect);
            waveCompute.SetFloat("objectWeight", other.gameObject.GetComponent<Rigidbody>().mass);
            waveCompute.SetVector("rippleOrigin", new Vector2(resolution.x * colHit.textureCoord.x, resolution.y * colHit.textureCoord.y));
            waveCompute.Dispatch(1, resolution.x / 8, resolution.y / 8, 1);

            // Respawn the object above the water so it keeps generating ripples
            other.transform.position = new Vector3(other.transform.position.x, Random.Range(6, 12), other.transform.position.z);
        }
    }
}

The collision sometimes created two ripples at the same time, with the second ripple placed at the UV coordinate (0,0). To track down this bug I used the two commented Debug.Log lines in the snippet above. They showed that the raycast sometimes hit the colliding object instead of the water mesh. By putting the colliding objects on the Ignore Raycast layer in Unity I was able to fix this bug.
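An alternative fix would be to restrict the raycast to the water's own layer with a layer mask, so the colliding object can never block the ray. A small sketch, assuming a hypothetical serialized LayerMask field (not part of my original script):

    [SerializeField] LayerMask waterMask; // hypothetical: contains only the water layer

    // Only colliders on the water layer can be hit by the ray
    if (Physics.Raycast(ray, out colHit, Mathf.Infinity, waterMask)) { ... }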

After setting the parameters for the shader I dispatch it again, this time on the kernel with index 1 instead of kernel 0. What the kernels do, I will explain next.

To create a new kernel in the compute shader you declare #pragma kernel [kernel name]. The kernel name has to match the method name exactly; if it doesn't, it will not be recognized as a kernel.

#pragma kernel CSMain
#pragma kernel PlaceRipplle
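On the C# side, the kernel indices 0 and 1 I dispatch on correspond to the order of these declarations. A small sketch of how they could instead be looked up by name with Unity's FindKernel, which is a bit more robust against reordering (not part of my original script):

    // Look the kernels up by the names declared with #pragma kernel
    int simKernel = waveCompute.FindKernel("CSMain");          // index 0
    int rippleKernel = waveCompute.FindKernel("PlaceRipplle"); // index 1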

Each kernel you create in a compute shader has to have GPU thread counts assigned to it via the numthreads attribute, and receives the ID of the thread it is running on via the SV_DispatchThreadID parameter. This ID tells the kernel which texel it is currently working on.

[numthreads(8,8,1)]
void CSMain (uint3 id : SV_DispatchThreadID)
{
    // Height of this texel in the current (n) and previous (n-1) state
    float ns_ij = NState[id.xy].x;
    float nm1s_ij = Nm1State[id.xy].x;

    // Heights of the four neighbouring texels in the current state
    float ns_ip1j = NState[id.xy + uint2(1, 0)].x;
    float ns_ijp1 = NState[id.xy + uint2(0, 1)].x;
    float ns_im1j = NState[id.xy - uint2(1, 0)].x;
    float ns_ijm1 = NState[id.xy - uint2(0, 1)].x;

    // Discretized 2D wave equation, then dampened by the dispersion factor
    float newWaveHeight = ns_ij * 2 - nm1s_ij + 0.25 * (ns_ip1j + ns_im1j + ns_ijp1 + ns_ijm1 - 4 * ns_ij);
    newWaveHeight = newWaveHeight * dispersion;

    Np1State[id.xy] = float4(newWaveHeight, newWaveHeight, newWaveHeight, 1);
}

The ns_ variables hold the wave heights used to calculate the waves: ns_ij is the current texel, and the variants containing p1 and m1 are its four neighbours (plus or minus one texel in x or y). At the edges of the texture these neighbour reads fall outside the texture, which effectively forms the boundary of the simulation; once a ripple hits such a boundary, the wave reflects and comes back.
The newWaveHeight line calculates the new wave height. This is based on the formula shown in a video by Haroon Stephan.

(Haroon Stephan, 2016)

The wave-height symbol in this formula is replaced with the ns_ variables, and the +1/−1 indices in the formula show up as p1/m1 in the variable names. In the formula, the spatial terms are divided by Δx² and Δy². This is not needed here, because the grid spacing between the texels addressed by id.xy is exactly 1, so those divisions drop out and only the constant factor remains.
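Reconstructed from the kernel above (my notation; with unit grid spacing the whole factor collapses into the constant 0.25 used in the code):

\[
\eta_{i,j}^{t+1} = 2\,\eta_{i,j}^{t} - \eta_{i,j}^{t-1} + \frac{c^2\,\Delta t^2}{\Delta x^2}\left(\eta_{i+1,j}^{t} + \eta_{i-1,j}^{t} + \eta_{i,j+1}^{t} + \eta_{i,j-1}^{t} - 4\,\eta_{i,j}^{t}\right)
\]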

[numthreads(8, 8, 1)]
void PlaceRipplle(uint3 id : SV_DispatchThreadID){
    // The initial amplitude scales with the mass of the colliding object
    float TheGreatestWaveHeight = effect.z * objectWeight;

    // Only the texel at the ripple origin gets displaced
    if (id.x == floor(rippleOrigin.x) && id.y == floor(rippleOrigin.y)) {
        Np1State[id.xy] = float4(TheGreatestWaveHeight, TheGreatestWaveHeight, TheGreatestWaveHeight, 1);
    }
}

The second kernel is used to place a ripple on the mesh. I separated the kernels because I originally also had the option to create a wave at the place where you clicked with the mouse. In the TheGreatestWaveHeight calculation you can see the object's weight. This is how I try to simulate the ripples in a realistic manner: heavy objects create bigger waves than light objects. This results in the following example gif, where the redder an object is, the heavier it is.

As is visible in the gif above, the ripples are now created. There is, however, one problem: the ripples are 2D, not 3D. To make them 3D you need to create a normal HLSL shader and place it on a material that is applied to the mesh.

In this shader you only need to add a few things, which are shown in the following snippet:

v2f vert (appdata v)
{
    v2f o;

    // Sample the wave texture at this vertex's UV and displace the vertex upward
    fixed4 vertexDisplacement = tex2Dlod(_MainTex, float4(v.uv.x, v.uv.y, 0, 0));
    v.vertex.y += vertexDisplacement.x;

    o.vertex = UnityObjectToClipPos(v.vertex);
    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
    return o;
}

fixed4 frag (v2f i) : SV_Target
{
    fixed4 col = tex2D(_MainTex, i.uv);
    col *= _Emision; // emission so the ripples stay visible

    // Blend from blue (flat water) to white (wave crests)
    col = lerp(float4(0, 0, 1, 0), float4(1, 1, 1, 0), col.x);
    return col;
}

In the vert method the vertex displacement is done to make the ripples 3D. The _MainTex of the material is set in the Update method of the WaveManager script shown earlier in this blog, by the line waveMat.mainTexture = NState;. We sample this texture at the vertex's UV with tex2Dlod and add the red channel (the x component) to the vertex's y position so the mesh gains height. You don't have to use the x value; the y value works just as well, since the compute shader writes the same wave height into all three colour channels.

In the fixed4 frag method, the last important pieces of code are added. First I multiply in some emission so that the ripples are visible, and then I lerp the colour of the mesh toward blue to make it look more like water. After you apply the shader to a material and place the material on the mesh, you get the following effect, which is the current result of my R&D project.

From my Unity project

Future shader improvements

In the future there are several things I wish to improve about my shader. These improvements are as follows:
1) Make the water look realistic – The water right now is just a blue plane with an emission level to make the ripples visible to the user.
2) Add obstacles/terrain – At this moment the waves only reflect off the sides of the plane. If I want to make the ripples realistic, they should also reflect off objects that are in the water.
3) Make the starting ripple size adjustable – The code that currently creates the ripples has a fixed start size. This should be made adjustable (see the sketch below), which might lead to a change in the base ripple formula that is used.
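As a first step toward the third improvement, the impulse sent to the compute shader could take the impact speed into account as well as the mass. A hypothetical sketch using Unity's Collision.relativeVelocity (the scaling factor is made up and would need tuning):

    // In OnCollisionEnter: scale the ripple by mass AND impact speed (hypothetical)
    float impact = other.gameObject.GetComponent<Rigidbody>().mass * other.relativeVelocity.magnitude * 0.1f;
    waveCompute.SetFloat("objectWeight", impact);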

Lessons learned

A big takeaway from this research project is that I should attack a problem from different angles sooner. At the start of the project I had set my eyes on working with shaders and tried to find something that could help me. Peers did send me sources, but those I had already found. After the second week, or maybe even the first, in which I had barely made any progress, I should have tried to look from a different angle. This time I changed my angle too late, which left me with a shader that works but is not visually appealing. Something else I realize now that I am finished is that I should have asked for a teacher's example earlier; it's okay to use one as a source to better understand what I am doing. This time I waited until I got stuck before asking, and the teacher then got sick the moment I wanted to ask, which could have been avoided had I asked earlier. In the end it would have been the same as me looking at any other existing project out there.

As for the shaders, I did learn a lot about them. It's interesting how much is possible with shaders as long as you understand the maths behind them. Maths being one of my weaker sides definitely made this project a lot more challenging for me. While I ended up using compute shaders instead of a normal HLSL shader, I do think the knowledge translates well enough, as the main differences are where the code runs in the pipeline and how it is set up. In theory I could recreate the ripples I have now in a normal HLSL shader instead of a compute shader.

Resit

Summary

For the resit, I focused on making the shader look good by adding some textures and light simulation.

I added three things: a simple water texture I got from the Unity Asset Store; a foam line that is placed at the intersection between an object and the water; and a depth effect that makes the water look darker the deeper you look into it, which makes use of some of Unity's built-in features.

The light simulations added are diffuse lighting and specular lighting. Diffuse light makes the shader go dark when the light source is not pointed at the surface, and specular light creates a bright reflection spot on the water where the light source hits it.

The result of adding these textures and lights is seen below. This is a major improvement compared to the basic blue plane I had in my original delivery.

A gif showing the finished shader

Light

Both diffuse and specular light share the same basics: we compare the direction of the light source with a vertex normal. The image below is a visualization of how this works.

Freya Holmer Healthbars, SDFs & Lighting • Shaders for Game Devs [Part 2]

To make light work in Unity we need to know the vertex normal and the light direction. From these two vectors we can calculate the dot product, and we clamp the result to a minimum of 0 with max(0, …). We clamp at 0 because any lower value is meaningless here: 0 already means no light is hitting the surface, so the vertex should be "dark".

https://amirazmi.net/dot-products-in-games-and-their-use-cases/

The resulting dot product between the light direction and the vertex normal can then be used to calculate the diffuse light. On top of this, I multiply a diffuse colour with the light colour before multiplying both with the clamped dot product.
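As a formula (my notation for the snippet below, with N the normal and L the light direction):

\[
I_{\text{diffuse}} = C_{\text{diffuse}} \cdot C_{\text{light}} \cdot \max(0,\, N \cdot L)
\]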

//Diffuse
float3 N = normalize(i.normal);                        // vertex normal
float3 lightDir = normalize(_WorldSpaceLightPos0.xyz); // direction of the main light
float NdotL = max(0.0, dot(i.worldNormal, lightDir));  // clamped lambert term
float3 diffuse = _DiffuseColor * _LightColor0.rgb * NdotL;

Specular light works on a similar principle, but also takes the camera position into account. It produces an extra-bright light spot on the surface. The spot is calculated from the vertex normal and the half vector: the normalized sum of the light direction and the view direction (the normalized vector from the vertex's world position to the camera). This is visually simplified in the image below.

Freya Holmer Healthbars, SDFs & Lighting • Shaders for Game Devs [Part 2]

In code, it is as follows:

//Specular
float3 V = normalize(_WorldSpaceCameraPos - i.wPos); // view direction
float3 H = normalize(lightDir + V);                  // half vector between light and view

float3 lambert = saturate(dot(N, lightDir));

// (lambert > 0) masks out highlights on surfaces facing away from the light
float3 specularLight = saturate(dot(H, N)) * (lambert > 0);
float specularExponent = exp2( _Gloss * 11) + 2;     // remap gloss to a usable exponent

specularLight = pow( specularLight, specularExponent) * _SpecColor * _Gloss * _Shininess;

In the code above we see three variables that change the way the specular light is shown: _Gloss, _SpecColor and _Shininess. Apart from the variables, there is a somewhat confusingly named HLSL function, saturate, used to calculate the lambert term; saturate simply clamps a value between 0 and 1. It is used again in the calculation of the specular light. There we also find a strange-looking piece of code, (lambert > 0). This comparison evaluates to 1 when lambert is greater than 0 and to 0 otherwise, so the multiplication acts like an if statement that switches the highlight off on surfaces facing away from the light.

Changing _Gloss makes the light point more focused.

_SpecColor changes the light colour.

_Shininess changes how bright the reflection of the specular light is on the surface.
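Put together, the highlight follows the Blinn-Phong form (my notation for the snippet above, with g for _Gloss and s for _Shininess):

\[
I_{\text{spec}} = \big(\operatorname{saturate}(H \cdot N)\big)^{\,2^{11g} + 2} \cdot C_{\text{spec}} \cdot g \cdot s
\]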

To add the light, we multiply the visual texture by a float4 whose first three components contain the specular plus diffuse light; the last component is 1 for the alpha. The reason we add the diffuse to the specularLight instead of multiplying is that we want a distinct light spot. If you multiply the two light types you won't get a light spot but a general light increase over the entire shader.

//Add light to the visual texture
                secCol *= float4(specularLight + diffuse, 1);

Adding Depth

To simulate depth in Unity we can use some of its built-in helpers that work with the camera rendering the scene. One of these is ComputeScreenPos, to which we pass the clip-space vertex position.

// Calculate depth based on screen position (Camera)
                o.screenPos = ComputeScreenPos(o.vertex);
                COMPUTE_EYEDEPTH(o.screenPos.z);

Conceptually, for every pixel the camera knows two distances: the distance to the water surface itself (stored in screenPos.z by COMPUTE_EYEDEPTH) and the distance to whatever opaque geometry lies behind the water (stored in the camera's depth texture). The difference between the two tells us how deep the water is along that view ray.

//compute depth
                float sceneZ = LinearEyeDepth(SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.screenPos)));
                float depth = sceneZ - i.screenPos.z;

                // fade with depth
                fixed depthFading = saturate((abs(pow(depth, _DepthPow))) / _DepthFactor);
                secCol *= depthFading * _DepthColor;

As seen above, using Unity's function LinearEyeDepth we read the camera's depth texture at the water pixel's screen position and convert it to a linear eye-space distance, sceneZ. We then subtract the water surface's own eye depth, i.screenPos.z, from it. The result is the distance between the water surface and the geometry behind it, which is our depth.

The calculated depth is then raised to a power controlled by the variable _DepthPow. Of the resulting number we take the absolute value, which makes negative numbers positive (negative 10 becomes positive 10). The absolute value then gets divided by a variable called _DepthFactor, which controls at what depth the water should change colour. The resulting value finally gets "saturated", which in HLSL means it gets clamped between 0 and 1. This makes the transition between deep and shallow water smoother.
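In formula form (my notation for the snippet above, with d the computed depth, p for _DepthPow and f for _DepthFactor):

\[
\text{fade} = \operatorname{saturate}\!\left(\frac{\lvert d^{\,p}\rvert}{f}\right)
\]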

Lastly, this depthFading (together with _DepthColor) is applied to the visual texture secCol by multiplication.

After adding the depth I combine the main texture and the visual texture, which is done by lerping. With lerp(col, 1, secCol) the colour is blended from the main texture toward white, using the lit visual texture as the blend factor. The reason we merge the textures here and not earlier is so that the lighting is also applied on top of the depth; even in deep oceans you can have a white spot where the sun reflects.

//Add first texture on top of visual texture to make the light work
                secCol = lerp(col, 1, secCol);

Light reflecting off of an ocean

Foam

For the foam, we reuse the depth variable. Where the depth is close to 0 we know that something intersects the water surface, and thus we want to apply foam at that place.

// Foam Line
                fixed intersect = saturate((abs(depth)) / _IntersectionThreshold);
                secCol += _EdgeColor * pow(1 - intersect, 4) * _IntersectionPow;

For the foam there are two important variables: _IntersectionThreshold, which controls how far out the foam reaches, and _IntersectionPow, which controls how strongly the foam fills that area.

After adding the foam I added the light again so that the foam doesn’t disappear completely at nighttime. This is a visual choice I made because it looked nice.

//Add light to the second texture
                secCol *= float4(specularLight + diffuse, 1) + 0.3;
                
                return float4(secCol.xyz, _Color.a);

Lastly, we return a float4 filled with the texture colours and the alpha value from the main colour in our shader.

Conclusion

My main issue at the start was getting started and building understanding. Now that I understand shaders better, it was much easier to apply new principles to my shader. What I enjoyed most about this little project is that even though I am bad at maths, with just a basic understanding I can achieve some really fun and good-looking shaders. Even if I might not use the ripples I created this time, other components of the shader can be reused in several ways. With this, I leave a short video of the end result cycling through the day-night cycle.


Bibliography

@aa_debdeb. (2018, 10 9). [Unity] Wave equation simulation with a compute shader. Retrieved from Qiita: https://qiita.com/aa_debdeb/items/1d69d49333630b06f6ce

Aj_. (2018, 2 13). Water ripples. Retrieved from Shadertoy: https://www.shadertoy.com/view/4ddyDn

AmirAzmi. (n.d.). Dot products in games and their use cases. Retrieved from https://amirazmi.net/dot-products-in-games-and-their-use-cases/

Bhaskar, D. (n.d.). What is the equation describing a ripple in water? Retrieved from Quora: https://www.quora.com/What-is-the-equation-describing-a-ripple-in-water

Enrico, M. (2020, 9 28). Custom shaders with depth sampling. Retrieved from Edraflame: https://www.edraflame.com/blog/custom-shader-depth-texture-sampling/

Holmér, F. (2021, 2 26). Shader Basics, Blending & Textures • Shaders for Game Devs [Part 1]. Retrieved from YouTube: https://www.youtube.com/watch?v=kfM-yu0iQBk&list=WL&index=20&t=1831s&ab_channel=FreyaHolm%C3%A9r

Karmakar, T. K. (n.d.). 2: Capillary Waves and Gravity Waves. Retrieved from ResearchGate: https://www.researchgate.net/figure/Capillary-Waves-and-Gravity-Waves_fig53_333672415

Lunar, B. (2021, 7 19). Ripple Wave Vertex Shader Graph – Easy Unity Tutorial. Retrieved from YouTube: https://www.youtube.com/watch?v=QsLkb1aOkb8&ab_channel=BinaryLunar

nethe550. (2022, 8 16). Simple Water Ripple. Retrieved from Shadertoy: https://www.shadertoy.com/view/NlfcDj

Play, P. (2015, 4 4). Ripple Water Shader – Unity CG/C# Tutorial. Retrieved from YouTube: https://www.youtube.com/watch?v=UfX9dzhBhg0&ab_channel=PeerPlay

Rutvik_Tak. (2022, 6 8). Water-Ripple Effect. Retrieved from Shadertoy: https://www.shadertoy.com/view/fsGcWz

Simplestar-game. (2022, 9 14). Simple Interactive Water. Retrieved from Unity Asset Store: https://assetstore.unity.com/packages/2d/textures-materials/water/simple-interactive-water-162033

Simulations, S. (2022, 4 29). Use compute shaders to create water effect in Unity. Now you can simulate ripples of a moving boat! Retrieved from YouTube: https://www.youtube.com/watch?v=4CNad5V9wD8&t=1653s&ab_channel=SpontaneousSimulations

Stephan, H. (2016, 4 21). Lab12_2: Wave Equation 2D. Retrieved from YouTube: https://www.youtube.com/watch?v=O6fqBxuM-g8&t=3s&ab_channel=HaroonStephen

Tomé, S. M. (2018, 4 12). Ripples from a Splashing Drop. Retrieved from Institute of Mathematics & its Applications: https://ima.org.uk/9278/ripples-from-a-splashing-drop/

Tuts, A. (2020, 11 2). Procedural Water Ripples in Unity using Shader Graph Only. Retrieved from YouTube: https://www.youtube.com/watch?v=LaHvayJphdU&ab_channel=AETuts

Wallace, E. (2011, 8 15). WebGL Water. Retrieved from madebyevan: https://madebyevan.com/webgl-water/

Wikipedia. (n.d.). Capillary wave. Retrieved from Wikipedia: https://en.wikipedia.org/wiki/Capillary_wave#Gravity_wave_regime

Wikipedia. (n.d.). Dispersion (water waves). Retrieved from Wikipedia: https://en.wikipedia.org/wiki/Dispersion_(water_waves)

Wikipedia. (n.d.). Gravity wave. Retrieved from Wikipedia: https://en.wikipedia.org/wiki/Gravity_wave

WordPress. (2018). Real-time Water Shader in Unity. Retrieved from WordPress: https://unitywatershader.wordpress.com/

