Creating a cartoon water shader for the web. Part 1

In my tutorial “Creating Shaders,” I focused mainly on fragment shaders, which are enough to implement any of the 2D effects and examples on ShaderToy. But there is a whole category of techniques that require vertex shaders. In this tutorial, I will walk through creating a stylized cartoon water shader and introduce you to vertex shaders along the way. I will also talk about the depth buffer, and how to use it to get additional information about the scene and to create lines of sea foam.

Here is what the finished effect will look like. An interactive demo can be viewed here.


This effect consists of the following elements:

  1. A translucent water mesh with subdivided polygons and offset vertices to create waves.
  2. Static lines of water on the surface.
  3. Simulated buoyancy of boats.
  4. Dynamic foam lines around the boundaries of objects in the water.
  5. Post-processing to create distortion of everything that is under water.

What I like about this effect is that it touches on many different computer graphics concepts, so it lets us draw on ideas from previous tutorials while developing techniques that we can apply to new effects.

In this tutorial, I will use PlayCanvas, simply because it is a convenient free web-based IDE, but you can easily apply everything to any other WebGL environment. A Three.js version of the source code is provided at the end of the article. I will assume that you are already comfortable with fragment shaders and the PlayCanvas interface. You can refresh your knowledge of shaders here, and get to know PlayCanvas here.

Environment setup


The goal of this section is to set up our PlayCanvas project and add a few environment objects that the water will affect.

If you do not have a PlayCanvas account, sign up and create a new blank project. By default, your scene should contain a couple of objects: a camera and a light source.


Insert models


An excellent resource for finding 3D models for the web is the Google Poly project. I took the boat model from there. After downloading and unpacking the archive, you will find .obj and .png files inside.

  1. Drag both files into the Assets window of the PlayCanvas project.
  2. Select the automatically created material and select the .png file as its diffuse map.


You can now drag Tugboat.json into the scene and delete the Box and Plane objects. If the boat looks too small, you can increase its scale (I set the value to 50).


Similarly, any other models can be added to the scene.

Orbiting camera


To set up an orbiting camera, we will copy the script from this PlayCanvas example. Follow the link and click Editor to open the project.

  1. Copy the contents of mouse-input.js and orbit-camera.js from that example project into files with the same names in your project.
  2. Add a Script component to the camera.
  3. Attach two scripts to the camera.

Tip: to keep the project organized, you can create folders in the Assets window. I put the two camera scripts in Scripts/Camera/, my model in Models/, and the material in Materials/.

Now when you launch the game (the play button in the upper right part of the scene window) you should see the boat, and you can inspect it by orbiting the camera around it with the mouse.

Subdivision of water surface polygons


The purpose of this section is to create a subdivided mesh that will be used as the surface of the water.

To create the surface of the water, we will adapt part of the code from the terrain generation tutorial. Create a new script called Water.js. Open it for editing and create a new function, GeneratePlaneMesh, that looks like this:

    Water.prototype.GeneratePlaneMesh = function(options) {
        // 1 - Use default values if no options were provided
        if (options === undefined) options = { subdivisions: 100, width: 10, height: 10 };

        // 2 - Generate the positions, UVs and indices
        var positions = [];
        var uvs = [];
        var indices = [];
        var row, col;
        var normals;

        for (row = 0; row <= options.subdivisions; row++) {
            for (col = 0; col <= options.subdivisions; col++) {
                var position = new pc.Vec3(
                    (col * options.width) / options.subdivisions - (options.width / 2.0),
                    0,
                    ((options.subdivisions - row) * options.height) / options.subdivisions - (options.height / 2.0)
                );
                positions.push(position.x, position.y, position.z);
                uvs.push(col / options.subdivisions, 1.0 - row / options.subdivisions);
            }
        }

        for (row = 0; row < options.subdivisions; row++) {
            for (col = 0; col < options.subdivisions; col++) {
                indices.push(col + row * (options.subdivisions + 1));
                indices.push(col + 1 + row * (options.subdivisions + 1));
                indices.push(col + 1 + (row + 1) * (options.subdivisions + 1));
                indices.push(col + row * (options.subdivisions + 1));
                indices.push(col + 1 + (row + 1) * (options.subdivisions + 1));
                indices.push(col + (row + 1) * (options.subdivisions + 1));
            }
        }

        // Compute the normals
        normals = pc.calculateNormals(positions, indices);

        // Create the node and material
        var node = new pc.GraphNode();
        var material = new pc.StandardMaterial();

        // Create the mesh
        var mesh = pc.createMesh(this.app.graphicsDevice, positions, {
            normals: normals,
            uvs: uvs,
            indices: indices
        });

        var meshInstance = new pc.MeshInstance(node, mesh, material);

        // Add the model component to this entity
        var model = new pc.Model();
        model.graph = node;
        model.meshInstances.push(meshInstance);

        this.entity.addComponent('model');
        this.entity.model.model = model;
        this.entity.model.castShadows = false; // We don't want the water surface casting shadows
    };
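If the two nested index loops look opaque, here is a minimal standalone sketch (plain JavaScript, independent of PlayCanvas, with a made-up function name) of how they stitch the vertex grid into triangles:

```javascript
// Standalone sketch of the index-generation loop from GeneratePlaneMesh.
// For `subdivisions` cells per side there are (subdivisions + 1)^2 vertices,
// and every cell is split into two triangles (6 indices).
function generateIndices(subdivisions) {
    var indices = [];
    for (var row = 0; row < subdivisions; row++) {
        for (var col = 0; col < subdivisions; col++) {
            // First triangle of the cell
            indices.push(col + row * (subdivisions + 1));
            indices.push(col + 1 + row * (subdivisions + 1));
            indices.push(col + 1 + (row + 1) * (subdivisions + 1));
            // Second triangle of the cell
            indices.push(col + row * (subdivisions + 1));
            indices.push(col + 1 + (row + 1) * (subdivisions + 1));
            indices.push(col + (row + 1) * (subdivisions + 1));
        }
    }
    return indices;
}

// A single cell (subdivisions = 1) yields two triangles over vertices 0..3
console.log(generateIndices(1)); // [0, 1, 3, 0, 3, 2]
```

With subdivisions = 100 this produces 100 × 100 × 6 = 60,000 indices, which is why the plane ends up with so many triangles.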

Now we can call it in the initialize function:

    Water.prototype.initialize = function() {
        this.GeneratePlaneMesh({ subdivisions: 100, width: 10, height: 10 });
    };

Now when you launch the game you should see just a flat plane. But it is not simply a flat plane: it is a mesh made up of thousands of vertices. As an exercise, try to verify this yourself (it is a good excuse to study the code you just copied).

Task 1: Move the Y coordinate of each vertex by a random value so that the plane looks like the image below.


Waves


The purpose of this section is to give the water surface its own custom material and create animated waves.

To get the effects we need, we have to set up a custom material. Most 3D engines ship with a set of pre-built shaders for rendering objects and a way to override them. Here is a good reference on how this is done in PlayCanvas.

Shader attachment


Let's create a new function, CreateWaterMaterial, which sets up a new material with a custom shader and returns it:

    Water.prototype.CreateWaterMaterial = function() {
        // Create a new blank material
        var material = new pc.Material();
        // A name makes it easier to identify when debugging
        material.name = "DynamicWater_Material";

        // Get the shader sources.
        // A fragment shader must declare a float precision.
        var gd = this.app.graphicsDevice;
        var fragmentShader = "precision " + gd.precision + " float;\n";
        fragmentShader = fragmentShader + this.fs.resource;

        var vertexShader = this.vs.resource;

        // The shader definition used to create the new shader
        var shaderDefinition = {
            attributes: {
                aPosition: pc.gfx.SEMANTIC_POSITION,
                aUv0: pc.SEMANTIC_TEXCOORD0,
            },
            vshader: vertexShader,
            fshader: fragmentShader
        };

        // Create the shader from the definition
        this.shader = new pc.Shader(gd, shaderDefinition);

        // Apply the shader to this material
        material.setShader(this.shader);

        return material;
    };

This function reads the vertex and fragment shader code from script attributes. So let's define those attributes at the top of the file (after the pc.createScript line):

    Water.attributes.add('vs', {
        type: 'asset',
        assetType: 'shader',
        title: 'Vertex Shader'
    });

    Water.attributes.add('fs', {
        type: 'asset',
        assetType: 'shader',
        title: 'Fragment Shader'
    });

Now we can create these shader files and attach them to our script. Go back to the editor and create two shader files: Water.frag and Water.vert. Attach these shaders to the script as shown in the figure below.


If the new attributes are not displayed in the editor, then click the Parse button to update the script.

Now paste this base shader into Water.frag :

    void main(void)
    {
        vec4 color = vec4(0.0, 0.0, 1.0, 0.5);
        gl_FragColor = color;
    }

And this one into Water.vert :

    attribute vec3 aPosition;

    uniform mat4 matrix_model;
    uniform mat4 matrix_viewProjection;

    void main(void)
    {
        gl_Position = matrix_viewProjection * matrix_model * vec4(aPosition, 1.0);
    }

Finally, go back to Water.js and use our new material instead of the standard one. That is, instead of:

 var material = new pc.StandardMaterial(); 

insert:

 var material = this.CreateWaterMaterial(); 

Now, after starting the game, the plane should have a blue color.


Hot reloading


So far we have only set up placeholder shaders for our new material. Before writing the real effects, I want to set up automatic code reloading.

Uncommenting the swap function in any script file (for example, in Water.js) enables hot reloading. Later we will see how to use it to maintain state even while the code is updated live. For now, we just want to re-apply the shaders after making changes. Shaders are compiled before they run in WebGL, so to do this we need to re-create our material.

We will check whether the contents of our shader code have changed, and if so, re-create the material. First, let's save the current shaders in initialize :

    // initialize code called once per entity
    Water.prototype.initialize = function() {
        this.GeneratePlaneMesh();

        // Save the current shader sources
        this.savedVS = this.vs.resource;
        this.savedFS = this.fs.resource;
    };

And in update we check if there are any changes:

    // update code called every frame
    Water.prototype.update = function(dt) {
        if (this.savedFS != this.fs.resource || this.savedVS != this.vs.resource) {
            // The shader code has changed, so re-create the material
            var newMaterial = this.CreateWaterMaterial();

            // Apply the new material to the mesh
            var model = this.entity.model.model;
            model.meshInstances[0].material = newMaterial;

            // Save the new shader sources
            this.savedVS = this.vs.resource;
            this.savedFS = this.fs.resource;
        }
    };

Now, to verify that it works, run the game and change the plane's color in Water.frag to a nicer blue. After you save the file, the color should update without reloading or restarting! Here is the color I chose:

 vec4 color = vec4(0.0,0.7,1.0,0.5); 

Vertex Shaders


To create waves, we need to move every vertex of our mesh on every frame. This might sound very inefficient, but every vertex of every model is already transformed on every rendered frame. That is exactly what the vertex shader does.

If you think of a fragment shader as a function that runs for every pixel, takes a position, and returns a color, then a vertex shader is a function that runs for every vertex, takes a position, and returns a position.

The default vertex shader takes a position in world space and returns a position on the screen. Our 3D scene is defined in x, y, and z coordinates, but a monitor is a flat two-dimensional plane, so we project the 3D world onto the 2D screen. The view, projection, and model matrices handle this projection, so we will not cover it in this tutorial. But if you want to understand exactly what happens at each stage, here is a very good guide.

That is, this line:

 gl_Position = matrix_viewProjection * matrix_model * vec4(aPosition, 1.0); 

takes aPosition, the 3D world position of a particular vertex, and converts it into gl_Position, the final position on the 2D screen. The prefix "a" in aPosition indicates that the value is an attribute. Recall that a uniform variable is a value we can define on the CPU and pass to the shader, and it keeps the same value for all pixels/vertices. An attribute value, on the other hand, comes from an array supplied by the CPU, and the vertex shader is invoked once for each value in that attribute array.
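A rough mental model of the difference, sketched in plain JavaScript (all names here are made up for illustration, not a real GPU API): the "shader" function runs once per attribute element, while every invocation sees the same uniforms.

```javascript
// A mental model of how attributes and uniforms feed a vertex shader.
// Illustrative only: real shaders run in parallel on the GPU.
function runVertexShader(shaderFn, attributeArray, uniforms) {
    // The shader is invoked once per attribute value...
    return attributeArray.map(function(aPosition) {
        // ...while every invocation sees the same uniforms
        return shaderFn(aPosition, uniforms);
    });
}

// A toy "shader" that lifts each vertex by a uniform offset
function toyShader(aPosition, uniforms) {
    return { x: aPosition.x, y: aPosition.y + uniforms.uOffset };
}

var positions = [{ x: 0, y: 0 }, { x: 1, y: 0 }, { x: 2, y: 0 }];
var out = runVertexShader(toyShader, positions, { uOffset: 5 });
console.log(out[2]); // { x: 2, y: 5 }
```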

You can see that these attributes are configured in the shader definition that we defined in Water.js:

    var shaderDefinition = {
        attributes: {
            aPosition: pc.gfx.SEMANTIC_POSITION,
            aUv0: pc.SEMANTIC_TEXCOORD0,
        },
        vshader: vertexShader,
        fshader: fragmentShader
    };

PlayCanvas takes care of setting up and passing the array of vertex positions for aPosition when it sees this semantic, but in general we could pass any data array to the vertex shader.

Moving vertices


Suppose we want to compress the entire plane by multiplying all x values ​​by 0.5. Do we need to change aPosition or gl_Position ?

Let's first try aPosition . We cannot modify an attribute directly, but we can make a copy:

    attribute vec3 aPosition;

    uniform mat4 matrix_model;
    uniform mat4 matrix_viewProjection;

    void main(void)
    {
        vec3 pos = aPosition;
        pos.x *= 0.5;

        gl_Position = matrix_viewProjection * matrix_model * vec4(pos, 1.0);
    }

Now the plane should look more like a rectangle. Nothing strange so far. But what happens if we instead try to change gl_Position ?

    attribute vec3 aPosition;

    uniform mat4 matrix_model;
    uniform mat4 matrix_viewProjection;

    void main(void)
    {
        vec3 pos = aPosition;
        //pos.x *= 0.5;

        gl_Position = matrix_viewProjection * matrix_model * vec4(pos, 1.0);
        gl_Position.x *= 0.5;
    }

It may look the same until you start moving the camera. We are now changing screen-space coordinates, which means the result depends on how we look at the plane.

So we can move vertices either way, but it is important to distinguish between working in world space and in screen space.

Task 2: can you move the entire surface of the plane a few units up (along the Y axis) in the vertex shader without distorting its shape?

Task 3: I said that gl_Position is two-dimensional, but gl_Position.z also exists. Can you run some tests to find out whether this value affects anything, and if so, what it is used for?

Add time


The last thing we need before creating moving waves is a uniform variable to use as time. Let's declare a uniform in the vertex shader:

 uniform float uTime; 

Now, to pass it to the shader, let's go back to Water.js and define a time variable in initialize:

    Water.prototype.initialize = function() {
        this.time = 0; // Keep track of time for the shader

        this.GeneratePlaneMesh();

        // Save the current shader sources
        this.savedVS = this.vs.resource;
        this.savedFS = this.fs.resource;
    };

Now we use material.setParameter to pass the variable to the shader. First we set the initial value at the end of the CreateWaterMaterial function:

    // Create the shader from the definition
    this.shader = new pc.Shader(gd, shaderDefinition);

    // Set the initial time and keep a reference to the material
    material.setParameter('uTime', this.time);
    this.material = material;

    // Apply the shader to this material
    material.setShader(this.shader);

    return material;

Now, in the update function, we can increment the time and access the material through the reference we just created:

    this.time += 0.1;
    this.material.setParameter('uTime', this.time);

Finally, in the swap function, we copy the old time value so that even after changing the code it continues to increase, without being reset to 0.

    Water.prototype.swap = function(old) {
        this.time = old.time;
    };

Now everything is ready. Run the game to make sure there are no errors. Then let's move our plane as a function of time in Water.vert :

    pos.y += cos(uTime);

And our plane should start moving up and down! Since we now have a swap function, we can also update Water.js without restarting. Try changing the time increment to verify that it works.
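As an aside, incrementing by a fixed 0.1 per frame ties the animation speed to the frame rate. A sketch of a frame-rate-independent alternative, using the dt argument that update already receives (the helper name here is made up for illustration):

```javascript
// Accumulate elapsed seconds instead of a fixed per-frame step.
// `dt` is the frame time in seconds, as passed to update(dt) by PlayCanvas.
function advanceTime(currentTime, dt) {
    return currentTime + dt; // time advances at the same rate on any frame rate
}

var t = 0;
t = advanceTime(t, 1 / 60);
t = advanceTime(t, 1 / 30); // a slow frame advances time by more
console.log(t.toFixed(3)); // "0.050"
```

In the tutorial's update function this would amount to `this.time += dt;`, though the fixed 0.1 step works fine for experimenting.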


Task 4: Can you move the vertices so that they look like the waves in the image below?


I will mention that I covered various ways of creating waves in detail here. That article is about 2D, but the math applies to our case as well. If you just want to see the solution, here is the gist.

Translucency


The purpose of this section is to create a translucent water surface.

You may have noticed that the color returned in Water.frag has an alpha value of 0.5, yet the surface is still opaque. In many ways, transparency remains an unsolved problem in computer graphics. A cheap way to approximate it is to use blending.

Normally, before a pixel is drawn, its depth value (its position along the Z axis) is compared against the value in the depth buffer to decide whether the current pixel on screen should be overwritten or the new pixel discarded. This is what makes it possible to render a scene correctly without having to sort objects from back to front.
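The depth test just described can be sketched as a tiny function (illustrative only; on the GPU this happens in fixed-function hardware, not in script):

```javascript
// Sketch of the default depth test: draw the incoming fragment only if it
// is closer to the camera (smaller depth) than what is already stored.
function depthTest(fragmentDepth, bufferDepth) {
    if (fragmentDepth < bufferDepth) {
        return { draw: true, newBufferDepth: fragmentDepth };
    }
    return { draw: false, newBufferDepth: bufferDepth };
}

console.log(depthTest(0.3, 0.7).draw); // true  - fragment is closer, overwrite
console.log(depthTest(0.9, 0.7).draw); // false - fragment is behind, discard
```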

With blending, instead of simply discarding or overwriting a pixel, we can combine the color of the pixel already drawn (the destination) with the pixel we are about to draw (the source). A list of all the blending functions available in WebGL can be found here.

For the alpha channel to work as we expect, we want the combined color to be the source multiplied by alpha, plus the destination multiplied by one minus alpha. In other words, if alpha = 0.4, the final color should be:

 finalColor = source * 0.4 + destination * 0.6; 

In PlayCanvas, this is exactly the operation that pc.BLEND_NORMAL performs.
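As a sanity check, the blend equation above can be written out in plain JavaScript for a single color channel (a toy function, not part of any API):

```javascript
// The "normal" alpha blend for one color channel:
// result = source * alpha + destination * (1 - alpha)
function blendChannel(source, destination, alpha) {
    return source * alpha + destination * (1 - alpha);
}

// With alpha = 0.4, drawing white (1.0) over black (0.0) gives 0.4
console.log(blendChannel(1.0, 0.0, 0.4)); // 0.4
```

This is why a half-transparent blue over a dark background still reads as blue: 50% of the water color mixed with 50% of whatever was behind it.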

To enable it, simply set the material property inside CreateWaterMaterial :

 material.blendType = pc.BLEND_NORMAL; 

If you run the game now, the water will be translucent! It is not perfect yet, though. A problem appears where the translucent surface overlaps itself, as shown below.


We can fix this by using alpha to coverage, a multisampling technique for handling transparency, instead of blending:

    //material.blendType = pc.BLEND_NORMAL;
    material.alphaToCoverage = true;

But it is only available in WebGL 2. For the rest of this tutorial, I will stick with blending for the sake of simplicity.

Summary


We set up the environment and created a translucent water surface with waves animated in the vertex shader. In the second part of the tutorial, we will look at the buoyancy of objects, add static lines to the water surface, and create foam lines along the boundaries of objects that intersect the surface.

In the third (last) part, we will consider the application of the post-processing effect of underwater distortion and consider ideas for further improvement.

Source


The complete PlayCanvas project can be found here. A port of the project to Three.js is also available in our repository.

Source: https://habr.com/ru/post/416953/

