## The goal

My goal was to create water ripples in shaders based on the impact of objects. A ripple should start at the point of contact between the water and an object, and the size and height of the ripples should change based on the size and speed of that object. In the gif below you can see ripples in the water after something is tossed into it.

## Creating ripples

In the first few weeks of my research, I was unable to find a good source on creating a ripple shader in HLSL; all the sources I could find used Shader Graph or Shadertoy instead, and I was unable to translate their code to HLSL. I did find really good examples of what I wanted to create, such as the following WebGL shader: https://madebyevan.com/webgl-water/

The first ripple shader I was able to create came from Freya Holmér's tutorial about shader basics, which I was watching to get a better understanding of shaders, in the hope it would also help me translate Shadertoy and WebGL shader languages to HLSL.

This ripple effect is created by centering the uv of the mesh. From this centre we calculate the radial distance, which is the distance from the centre to each point on the plane. Freya then uses a cosine wave over this radial distance to create the ripple effect seen in the image above. Last but not least, some dampening/dispersion is added by scaling the ripple height by a factor based on the radial distance, as seen in the code below.

``````//Code from Freya Holmér: Shader Basics, Blending & Textures • Shaders for Game Devs [Part 1]
#define TAU 6.28318530718

float GetRipple(float2 uv) {
    // Remap uvs from [0,1] to [-1,1] so the centre of the plane is (0,0)
    float2 uvsCentered = uv * 2 - 1;
    float radialDistance = length(uvsCentered);

    float ripple = cos((radialDistance - _Time.y * 0.1) * TAU * 5) * 0.5 + 0.5;
    // Dampen the wave towards the edges of the plane
    ripple *= 1 - radialDistance;

    return ripple;
}``````

While this code works for a basic ripple, it's not the end product I want, for several reasons. The first is that it is all based on uvs, not on world coordinates. The second problem is that the waves are unable to bounce off the edges or off objects that might obstruct them.

After being stuck for a few weeks in my research, I decided to take a step back and look at how ripples work in real life and what the mathematics behind them are, in the hope of getting a better understanding of what I had to do. When looking at ripples in reality you need to look at propagating waves. Propagating in this context means to travel through a space or material (Propagation definition). When looking at a basic propagating wave there doesn't seem to be a single standard formula: it can contain either a sine (η(x, t) = α sin(θ(x, t))) or a cosine (y = a cos(ωt + δ)). From what I could find it doesn't make a big difference whether you use the sine or the cosine form to make the ripples. Freya's shader, for example, used the cosine form.
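Picking sine or cosine only shifts the phase of the wave; a quick Python check (my own illustration, not from any of the sources) confirms the two forms trace the same shape:

```python
import math

# A propagating wave sampled with the sine form and the cosine form.
# cos(theta) is just sin(theta + pi/2), so both formulas describe the
# same curve with a phase offset.
def wave_sin(a, theta):
    return a * math.sin(theta)

def wave_cos(a, theta):
    return a * math.cos(theta)

for theta in [0.0, 0.5, 1.3, 2.0]:
    assert abs(wave_cos(1.0, theta) - wave_sin(1.0, theta + math.pi / 2)) < 1e-12
```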

Looking further at waves in reality, we learn that two types of waves are created when water is disturbed by an object entering its surface: capillary waves and gravity waves. The former are waves governed by the surface tension of the liquid, while the latter are waves created as gravity tries to restore the surface to equilibrium. In the image below you can see the difference between the two.

In the shader I am going to create, I am focusing on gravity waves to keep it simple for myself. Future improvements would include making the ripple show both capillary waves and gravity waves. Other improvements will be listed at the end of this blog.

Having taken a step back, I was able to find a water shader that did exactly what I wanted. This shader was created by a Unity user named Simplestar-Game (Shader: Simple Interactive Water). After a small email exchange with Simplestar-Game, I understood that I should switch from normal shaders in URP to compute shaders in Unity's built-in 3D pipeline, because for my goals there were better sources on water ripples for compute shaders than for normal shaders. The pipeline change was necessary because the kernels needed for compute shading were not being detected in the URP version of Unity; I was unable to find the reason behind this issue.

To create my shader I started with a simple gray tessellated plane, which I got from the shader example in the earlier mentioned video by Freya Holmér. Tessellation in this context means subdividing a plane or object, doubling its number of triangles and verts. By increasing the tris and verts you can get more detailed deformation from your mesh.

A compute shader works differently from a normal shader. The biggest differences are the following:
1) You don't add the shader to a material but access it via a C# script
2) Compute shaders run general-purpose work on the GPU. To make use of the GPU you need to assign kernels and the threads on which the shader has to run. This allows you to run many tasks in parallel, which increases performance a lot compared to doing the same work in regular C# code on the CPU.
3) All properties have to be created in the C# script and sent to the shader from there, as will be shown later. You are of course still able to create your own variables inside the compute shader.

The compute shader I ended up with is heavily based on a video created by Spontaneous Simulations.

To start the creation of the shader you need to make three textures and enable random write on them. If you don't enable this, the compute shader will be unable to write to the textures.

``````void Start()
{
    InitializeTexture(ref NState);
    InitializeTexture(ref Nm1State);
    InitializeTexture(ref Np1State);

    waveMat.mainTexture = NState;
}

void InitializeTexture(ref RenderTexture tex)
{
    tex = new RenderTexture(resolution.x, resolution.y, 1, UnityEngine.Experimental.Rendering.GraphicsFormat.R16G16B16A16_SNorm);
    tex.enableRandomWrite = true;
    tex.Create();
}``````

In this code snippet you can notice the use of UnityEngine.Experimental.Rendering.GraphicsFormat.R16G16B16A16_SNorm. This format gives us four 16-bit channels for the RGBA values, which we need for the waves. Because it is a signed normalized (SNorm) format it also allows negative values, which we need for the troughs of the ripples.
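To see why a signed format matters, here is a small Python sketch (my own illustration, not Unity code) of how a 16-bit SNorm channel maps stored integers to the [-1, 1] range, so wave troughs below zero survive the round trip:

```python
def snorm16_encode(value):
    """Quantize a float in [-1, 1] to a 16-bit signed normalized integer."""
    value = max(-1.0, min(1.0, value))
    return round(value * 32767)

def snorm16_decode(stored):
    """Map the stored integer back into the [-1, 1] range."""
    return max(stored / 32767.0, -1.0)

# A negative wave trough survives the round trip; an unsigned (UNorm)
# format could only represent values from 0 upwards.
trough = -0.25
stored = snorm16_encode(trough)
assert abs(snorm16_decode(stored) - trough) < 1e-4
```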

Each frame, these three textures are cycled into each other and sent to the compute shader.

``````void Update()
{
    Graphics.CopyTexture(NState, Nm1State);
    Graphics.CopyTexture(Np1State, NState);

    waveCompute.SetTexture(0, "NState", NState);
    waveCompute.SetTexture(0, "Nm1State", Nm1State);
    waveCompute.SetTexture(0, "Np1State", Np1State);
    waveCompute.SetVector("resolution", new Vector2(resolution.x, resolution.y));
    waveCompute.SetFloats("dispersion", dispersion);
    waveCompute.Dispatch(0, resolution.x / 8, resolution.y / 8, 1);

    waveMat.mainTexture = NState;
}``````

By accessing waveCompute you can set the properties via code. As mentioned earlier this is the only way to set the properties of a compute shader, as they are otherwise inaccessible. Near the end of Update you can see a Dispatch call. This has to be placed in Update because the shader won't be updated otherwise. The method takes four parameters, of which the last three belong together:
1) The kernel, which in the case of the compute shader is a method.
2, 3 & 4) These divide the mesh into sections so the shader can run those parts in parallel. The second and third parameters have to be divided by the thread count you have set in the compute shader. The fourth parameter can be set to 1, considering we use a flat plane.
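The thread-group maths behind those parameters can be sketched in a few lines of Python (my own illustration, assuming the [numthreads(8,8,1)] used later and a square texture resolution):

```python
def dispatch_groups(resolution, threads_per_group):
    """Thread groups needed per axis. The resolution should divide evenly
    by the thread count; otherwise round up so every texel is covered."""
    return -(-resolution // threads_per_group)  # ceiling division

# A 256x256 texture with [numthreads(8,8,1)] needs 32x32 groups...
assert dispatch_groups(256, 8) == 32
# ...and 32 groups of 8 threads covers all 256 texels on that axis.
assert dispatch_groups(256, 8) * 8 == 256
```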

The last line of Update sets the texture of the material equal to the texture NState that was created in the Start method.

As explained in my goal, I want ripples to be created when an object collides with the water plane. The biggest problem I had here is that to place the ripples I needed the UV coordinates instead of the world position. After searching around I found that a raycast can return textureCoord, which translates to the UV coordinates. In the collision method you can access the contact point between the colliding objects. This gives the world position, from which you can then create a downwards raycast, as shown in the snippet below.

``````private void OnCollisionEnter(Collision other)
{
    foreach (ContactPoint contact in other.contacts)
    {
        RaycastHit colHit = new RaycastHit();

        Ray ray = new Ray(contact.point - contact.normal, contact.normal);
        if (Physics.Raycast(ray, out colHit))
        {
            //Debug.Log(colHit.textureCoord);
            //Debug.Log(colHit.collider.name);

            waveCompute.SetTexture(1, "NState", NState);
            waveCompute.SetTexture(1, "Nm1State", Nm1State);
            waveCompute.SetTexture(1, "Np1State", Np1State);
            waveCompute.SetVector("effect", effect);
            waveCompute.SetFloat("objectWeight", other.gameObject.GetComponent<Rigidbody>().mass);
            waveCompute.SetVector("rippleOrigin", new Vector2(resolution.x * colHit.textureCoord.x, resolution.y * colHit.textureCoord.y));
            waveCompute.Dispatch(1, resolution.x / 8, resolution.y / 8, 1);

            other.transform.position = new Vector3(other.transform.position.x, Random.Range(6, 12), other.transform.position.z);
        }
    }
}``````

The collision sometimes created two ripples at the same time, with the second ripple placed at UV coordinate (0,0). To solve this bug I used the two commented Debug.Log lines in the snippet above. These showed that the raycast sometimes hit the colliding object instead of the water mesh. By setting the colliding objects to the Ignore Raycast layer in Unity I was able to fix this bug.

After setting the parameters for the shader, I Dispatch the shader again, this time on the kernel at index 1 instead of kernel 0. What the kernels do I will explain next.

To create a new kernel in the compute shader you need to add #pragma kernel [Kernel name]. The kernel name has to be the same as the method name; if they are not the same, the kernel will not be recognized.

``````#pragma kernel CSMain
#pragma kernel PlaceRipplle``````

Each method you create in a compute shader has to have GPU threads assigned to it and receives a thread ID via the method parameter. The kernel indices used in the C# script (0 and 1 here) follow the order of the #pragma kernel declarations.

``````[numthreads(8,8,1)]
void CSMain (uint3 id : SV_DispatchThreadID)
{
    float ns_ij = NState[id.xy].x;
    float nm1s_ij = Nm1State[id.xy].x;
    float ns_ip1j = NState[id.xy + uint2(1, 0)].x;
    float ns_ijp1 = NState[id.xy + uint2(0, 1)].x;
    float ns_im1j = NState[id.xy - uint2(1, 0)].x;
    float ns_ijm1 = NState[id.xy - uint2(0, 1)].x;

    float newWaveHeight = ns_ij * 2 - nm1s_ij + 0.25 * (ns_ip1j + ns_im1j + ns_ijp1 + ns_ijm1 - 4 * ns_ij);
    newWaveHeight = newWaveHeight * dispersion;

    Np1State[id.xy] = float4(newWaveHeight, newWaveHeight, newWaveHeight, 1);
}``````

The variables starting with ns_ hold the wave heights used in the calculation. The p1 and m1 parts refer to the neighbouring texels, which also form the boundary of the shader: once a ripple hits one of these boundaries, the wave reflects and comes back.
The newWaveHeight line calculates the new wave height, based on the formula shown in a video by Haroon Stephan.

The lowercase delta in this formula is replaced with the ns_ variables, and the +1/-1 in the formula shows up in the variables as p1/m1. The formula also divides the separate parts by Δx² and Δy², but this is not needed here: by sampling the neighbours through id.xy the grid spacing is implicitly one texel.
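The same update rule the CSMain kernel applies per texel can be sketched in plain Python (my own illustration of the formula, not the shader itself):

```python
def wave_step(n, nm1, dispersion=0.98):
    """One step of the discretized wave equation on a 2D height grid.
    n holds the current heights, nm1 the previous ones. Edge texels are
    left untouched, which is what makes waves reflect off the boundary."""
    np1 = [row[:] for row in n]
    for y in range(1, len(n) - 1):
        for x in range(1, len(n[0]) - 1):
            laplacian = n[y][x + 1] + n[y][x - 1] + n[y + 1][x] + n[y - 1][x] - 4 * n[y][x]
            np1[y][x] = (2 * n[y][x] - nm1[y][x] + 0.25 * laplacian) * dispersion
    return np1

# Drop a single "ripple" in the middle of a flat 5x5 grid:
flat = [[0.0] * 5 for _ in range(5)]
n = [row[:] for row in flat]
n[2][2] = 1.0

np1 = wave_step(n, flat, dispersion=1.0)
np2 = wave_step(np1, n, dispersion=1.0)

# After one step the four direct neighbours have risen symmetrically...
assert np1[2][1] == np1[2][3] == np1[1][2] == np1[3][2] == 0.25
# ...and after two steps the central peak has collapsed outwards.
assert np2[2][2] < np1[2][2]
```

A dispersion value below 1 drains a little energy every step, which is what eventually flattens the water again.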

``````[numthreads(8, 8, 1)]
void PlaceRipplle (uint3 id : SV_DispatchThreadID)
{
    float TheGreatestWaveHeight = effect.z * objectWeight;
    if (id.x == floor(rippleOrigin.x) && id.y == floor(rippleOrigin.y))
    {
        Np1State[id.xy] = float4(TheGreatestWaveHeight, TheGreatestWaveHeight, TheGreatestWaveHeight, 1);
    }
}``````

The second kernel is used to place the ripple on the mesh. I separated these because I originally also had the option to create a wave at the position you clicked with the mouse. The objectWeight variable is how I try to make the ripples more realistic: heavy objects create bigger waves than light objects. This results in the following example gif, where the redder an object is, the heavier it is.

As is visible in the gif above, the ripples are now created. There is however one problem: the ripples are 2D, not 3D. To make the ripples 3D you need to create a normal HLSL shader and place it on a material applied to the mesh.

In this shader you only need to add a few things, which are shown in the following snippet:

``````v2f vert (appdata v)
{
    v2f o;

    fixed4 vertexDisplacement = tex2Dlod(_MainTex, float4(v.uv.x, v.uv.y, 0, 0));
    v.vertex.y += vertexDisplacement.x;

    o.vertex = UnityObjectToClipPos(v.vertex);
    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
    return o;
}

fixed4 frag (v2f i) : SV_Target
{
    fixed4 col = tex2D(_MainTex, i.uv);
    col *= _Emision;

    col = lerp(float4(0, 0, 1, 0), float4(1, 1, 1, 0), col.x);
    return col;
}``````

The tex2Dlod sample and the line below it perform the vertex displacement that makes the ripples 3D. The _MainTex of the material was set earlier, in the Update method of the WaveManager script, with the line waveMat.mainTexture = NState;. We sample this texture at the vertex uv and add the sampled x channel to the vertex height. You don't have to use the x channel; the y channel works just as well, since the wave height was written to all colour channels, so this changes nothing for the shader.

In the fixed4 frag method, the last important pieces of code are added. The col *= _Emision; line adds some emission so that the ripples are visible, and the lerp changes the colour of the mesh to blue to make it look more like water. After you apply the shader to a material and place the material on the mesh, you get the following effect, which is the current result of my R&D project.

In the future there are several things I wish to improve about my shader. These improvements are as follows.
1) Make the water look realistic - The water right now is just a blue plane with an emission level to make the ripples visible to the user.
2) Add obstacles/terrain - At this moment the waves only reflect off of the sides of the plane. If I want to make the ripples realistic they should also reflect from objects that are in the water.
3) Make the starting ripple size adjustable - The code that currently creates the ripples has a fixed start size. This should be made adjustable, which might lead to a change in the base ripple formula that is used.

## Lessons learned

As for the shaders, I did learn a lot about them. It's interesting how much is possible with shaders as long as you understand the maths behind them. Math being one of my weaker points definitely made this project a lot more challenging for me. While I ended up using compute shaders instead of a plain HLSL shader, I do think the work translates well between them; the main differences are how they are invoked and written, and in theory I could recreate the ripples I have now in a regular HLSL shader instead of a compute shader.

# Resit

## Summary

For the resit, I focused on making the shader look good by adding some textures and light simulation to the shader.

I added three textures: a simple water texture I got from the Unity Asset Store, a foam line that is placed at the intersection between an object and the water, and a depth texture that makes the water look darker the deeper you go, which makes use of some of Unity's built-in features.

The light simulations added are diffuse lighting and specular lighting. Diffuse light darkens the shader when the light source is not pointed at the surface, and specular light creates a reflection spot on the water where the light source points.

The result of these additions of the textures and light is seen below. This is a major improvement compared to the basic blue plane I had in my original delivery.

## Light

Both diffuse and specular light have the same basis: we look at the direction of the light source and a vertex normal. The image below is a visualization of how this works.

To make light work in Unity we need to know the vertex normal and the light direction. From these two vectors we calculate the dot product, and then take the maximum of that dot product and 0. We clamp at 0 because any lower value is unneeded: 0 already means there is no light and thus the vertex should be "dark".

The resulting dot product of the light direction and the vertex normal can then be used to calculate the diffuse light. To add to this, I multiply a diffuse colour with the light colour before multiplying it with the clamped dot product.

``````//Diffuse
float3 N = normalize(i.normal);
float3 lightDir = normalize(_WorldSpaceLightPos0.xyz);
float NdotL = max(0.0, dot(N, lightDir));
float3 diffuse = _DiffuseColor * _LightColor0.rgb * NdotL;``````
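The diffuse term boils down to a clamp and a few multiplications; here is the same maths in plain Python (my own sketch, with made-up colour values):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def diffuse(normal, light_dir, diffuse_color, light_color):
    """Lambertian diffuse: clamp N.L at 0, then tint by both colours."""
    n_dot_l = max(0.0, dot(normal, light_dir))
    return [d * l * n_dot_l for d, l in zip(diffuse_color, light_color)]

white = [1.0, 1.0, 1.0]
blue = [0.2, 0.4, 1.0]

# Light shining straight down onto an upward-facing surface: full brightness.
assert diffuse([0, 1, 0], [0, 1, 0], blue, white) == blue
# Light coming from below the surface: clamped to black, not negative.
assert diffuse([0, 1, 0], [0, -1, 0], blue, white) == [0.0, 0.0, 0.0]
```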

Specular light works on a similar principle, but also takes the camera position into account. It produces an extra-bright light spot on the surface, using the vertex normal together with the normalized light direction and the normalized direction from the vertex world position to the camera. This is visually simplified in the image below.

In code, it looks as follows:

``````//Specular
float3 V = normalize(_WorldSpaceCameraPos - i.wPos);
float3 H = normalize(lightDir + V);

float3 lambert = saturate(dot(N, lightDir));

float3 specularLight = saturate(dot(H, N)) * (lambert > 0);
float specularExponent = exp2( _Gloss * 11) + 2;

specularLight = pow( specularLight, specularExponent) * _SpecColor * _Gloss * _Shininess;``````

In the code above we see three variables that change how the specular light is shown: _Gloss, _SpecColor and _Shininess. Apart from the variables, there is the oddly named HLSL intrinsic saturate, used to calculate the lambert term. Saturate clamps a value between 0 and 1. It is used again in the calculation of the specular light, where we also find a strange piece of code: (lambert > 0). This is more or less an if statement, so the specular contribution only survives when lambert is bigger than 0.

Changing _Gloss makes the light point more focused.

_SpecColor changes the light colour.

_Shininess changes how bright the reflection of the specular light is on the surface.
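The half-vector trick from the snippet can be reproduced in Python; this sketch (my own, with made-up gloss values, and leaving out the colour and shininess factors) shows the highlight peaking when the half vector lines up with the normal:

```python
import math

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return [x / length for x in v]

def specular(normal, light_dir, view_dir, gloss):
    """Blinn-Phong style specular: brightness depends on how well the
    half vector between light and view lines up with the surface normal."""
    h = normalize([l + v for l, v in zip(light_dir, view_dir)])
    n_dot_h = max(0.0, sum(a * b for a, b in zip(normal, h)))
    exponent = 2 ** (gloss * 11) + 2
    return n_dot_h ** exponent

n = [0.0, 1.0, 0.0]
# Viewer and light mirrored around the normal: the half vector equals
# the normal, so the highlight is at full strength.
l = normalize([1.0, 1.0, 0.0])
v = normalize([-1.0, 1.0, 0.0])
assert abs(specular(n, l, v, gloss=1.0) - 1.0) < 1e-9
# Grazing light: with a high exponent the highlight falls off to nothing.
assert specular(n, normalize([1.0, 0.1, 0.0]), v, gloss=1.0) < 1e-3
```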

To add light, we apply it to the visual texture by multiplying the texture with a float4 whose first three components contain the specular + diffuse light; the last component is filled with 1 for the alpha. The reason we add the diffuse to the specularLight instead of multiplying is that we want a distinct light spot: multiplying the two light types would give a general light increase over the entire shader instead.

``````//Add light to the visual texture
secCol *= float4(specularLight + diffuse, 1);``````

To simulate depth in Unity we can use some of its built-in methods that make use of the main camera. We pass the clip-space vertex to ComputeScreenPos, which gives us the vertex position as seen from the main camera in the scene.

``````// Calculate depth based on screen position (Camera)
o.screenPos = ComputeScreenPos(o.vertex);
COMPUTE_EYEDEPTH(o.screenPos.z);``````

Conceptually, the camera shoots a sort of ray towards the vertex. Where it hits the water surface is taken as the base height; the ray then continues through the vertex until it collides with the geometry behind it.

``````//compute depth
// Sample the camera's depth texture and convert it to eye-space depth
float sceneZ = LinearEyeDepth(SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.screenPos)));
float depth = sceneZ - i.screenPos.z;

fixed depthFading = saturate((abs(pow(depth, _DepthPow))) / _DepthFactor);``````

As seen above, using Unity's function LinearEyeDepth we read a generated depth texture, which tells us how far away the scene behind the water is. From this sceneZ we then subtract i.screenPos.z, the depth of the water surface itself, and the difference is our water depth.

The calculated depth is then raised to a power controlled by the variable _DepthPow. From the resulting number we take the absolute value, which makes all numbers positive: negative 10, for example, becomes positive 10. The absolute value then gets divided by a variable called _DepthFactor, which controls at what depth the water should change colour. The resulting value finally gets "saturated", which in the context of shaders means it gets clamped between 0 and 1. This makes the transition between deep and shallow water smoother.
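The whole chain is just a power, an absolute value, a division and a clamp; in Python (my own sketch, with made-up parameter values) it looks like this:

```python
def saturate(x):
    """HLSL's saturate: clamp a value to the [0, 1] range."""
    return max(0.0, min(1.0, x))

def depth_fading(depth, depth_pow=1.0, depth_factor=2.0):
    """Darken water with depth: raise to a power, take the absolute
    value, scale by a factor, then clamp to [0, 1]."""
    return saturate(abs(depth ** depth_pow) / depth_factor)

# Shallow water fades only a little...
assert depth_fading(0.5) == 0.25
# ...while deep water clamps to fully faded, however deep it gets.
assert depth_fading(10.0) == 1.0
assert depth_fading(100.0) == 1.0
```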

Lastly, this depthFading is applied to the visual texture, called secCol, by multiplying the texture with it.

After adding the depth I combine the main texture and the visual texture by lerping. In this case it doesn't matter which texture you place first, as long as the second parameter is 1; the third parameter controls how strongly the textures are blended. The reason we merge the textures here and not earlier is so that the lighting is also applied to the depth: even in deep oceans you can have a white spot where the sun reflects.

``````//Add first texture on top of visual texture to make the light work
secCol = lerp(col, 1, secCol);``````

## Foam

For the foam, we reuse the depth variable. A depth of 0 means something intersects the water surface at that place, and thus that is where we want to apply the foam.

``````// Foam Line
fixed intersect = saturate((abs(depth)) / _IntersectionThreshold);
secCol += _EdgeColor * pow(1 - intersect, 4) * _IntersectionPow;``````

For the foam there are two important variables: _IntersectionThreshold, which controls how far out the foam should reach, and _IntersectionPow, which controls how strong the foam is in that area.
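The same saturate pattern drives the foam line; a Python sketch (my own, with made-up threshold values, and leaving out the edge colour) shows how foam is strongest exactly at the intersection (depth 0) and falls off quickly:

```python
def saturate(x):
    return max(0.0, min(1.0, x))

def foam_amount(depth, threshold=0.5, strength=1.0):
    """Foam fades out over `threshold` units from the intersection;
    the fourth power makes the falloff steep near the edge."""
    intersect = saturate(abs(depth) / threshold)
    return (1 - intersect) ** 4 * strength

# Right at the intersection (depth 0) foam is at full strength...
assert foam_amount(0.0) == 1.0
# ...and beyond the threshold there is no foam at all.
assert foam_amount(0.5) == 0.0
assert foam_amount(2.0) == 0.0
# Partway out, the foam has already dropped off steeply.
assert 0.0 < foam_amount(0.25) < 0.1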

After adding the foam I added the light again so that the foam doesn't disappear completely at nighttime. This is a visual choice I made because it looked nice.

``````//Add light to the second texture
secCol *= float4(specularLight + diffuse, 1) + 0.3;

return float4(secCol.xyz, _Color.a);``````

Lastly, we return a float4 filled with the texture colours and the alpha value from the main colour in our shader.

## Conclusion

My main issue at the start was getting started and building up understanding. Now that I understand shaders better, it was easier for me to apply new principles to my shader. What I enjoyed most about this little project is that even though I am bad at maths, with just a basic understanding I can achieve some really fun and good-looking shaders. Even if I might not use the ripples I created this time, other components of the shader can be reused in several ways. With this, I leave a short video of the end result cycling through the day cycle.

## Bibliography

Karmakar, T. K. (n.d.). Capillary Waves and Gravity Waves [Figure]. ResearchGate. Retrieved from https://www.researchgate.net/figure/Capillary-Waves-and-Gravity-Waves_fig53_333672415

Simplestar-game. (2022, September 14). Simple Interactive Water. Unity Asset Store. Retrieved from https://assetstore.unity.com/packages/2d/textures-materials/water/simple-interactive-water-162033

Tomé, S. M. (2018, April 12). Ripples from a Splashing Drop. Institute of Mathematics & its Applications. Retrieved from https://ima.org.uk/9278/ripples-from-a-splashing-drop/