
Understanding WebGL: GPU Basics, Shaders, and Practical Code Examples

This article introduces WebGL fundamentals for frontend developers, explaining GPU versus CPU, GLSL shaders, and how JavaScript prepares data, followed by step‑by‑step code examples of fragment and vertex shaders, custom primitives, and using the gl‑renderer library to render graphics.

ByteFE

WebGL is a niche area within frontend development that many developers find difficult: unlike traditional high‑level web APIs, it is based on the low‑level OpenGL ES API, and most of its work executes on the GPU.

To grasp WebGL you need to understand the distinction between CPU and GPU, learn the GLSL shading language, and become familiar with basic shader concepts.

The CPU processes tasks sequentially on a handful of cores, while the GPU contains thousands of small processing units that can shade every pixel of an image in parallel, making it ideal for graphics rendering.

WebGL’s rendering pipeline uses JavaScript to prepare data (often via TypedArray and ArrayBuffer), sends it to the GPU, runs vertex and fragment shaders, and finally writes the result to a framebuffer for display.
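As a concrete illustration of that data‑preparation step, the sketch below packs triangle vertex coordinates into a Float32Array; the coordinate values are illustrative, and in a real program the buffer would then be uploaded to the GPU:

```javascript
// A minimal sketch of the CPU-side data-preparation step: vertex
// coordinates are packed into a Float32Array, a TypedArray view over
// a contiguous ArrayBuffer that can be handed to the GPU as-is.
const positions = new Float32Array([
  -1, -1, // bottom-left vertex
   0,  1, // top vertex
   1, -1, // bottom-right vertex
]);

// Each float occupies 4 bytes, so 6 floats take 24 bytes.
console.log(positions.length);     // 6
console.log(positions.byteLength); // 24

// In raw WebGL this buffer would then be uploaded with something like:
// gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);
```

The point of TypedArrays here is that they give JavaScript a fixed‑layout binary buffer, which is exactly what the GPU expects.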

Using the open‑source gl-renderer library, the article demonstrates a simple fragment shader that fills the canvas with red:

// GlRenderer comes from the open-source gl-renderer library.
const canvas = document.querySelector('canvas');
const renderer = new GlRenderer(canvas, {webgl2: true});

// A fragment shader that paints every pixel solid red.
const fragment = `#version 300 es
precision highp float;
out vec4 FragColor;
void main() {
  FragColor = vec4(1, 0, 0, 1);
}`;

const program = renderer.compileSync(fragment);
renderer.useProgram(program);
renderer.render();

The fragment shader code is explained line by line, covering the version declaration, precision setting, output variable, and the main function that sets the color.

A second example adds a uniform vec2 resolution to color each pixel based on its normalized coordinates:

// Same setup as before, but with a `resolution` uniform so each pixel
// can normalize its own coordinate into the [0, 1] range.
const fragment = `#version 300 es
precision highp float;
out vec4 FragColor;
uniform vec2 resolution;
void main() {
  vec2 st = gl_FragCoord.xy / resolution;
  FragColor = vec4(st, 0, 1);
}`;
// Uniform values are set from JavaScript through the renderer's uniforms map.
renderer.uniforms.resolution = [canvas.width, canvas.height];
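What the shader computes for each fragment can be mimicked on the CPU; `pixelColor` below is a hypothetical helper (not part of gl-renderer) that normalizes a pixel coordinate exactly as `gl_FragCoord.xy / resolution` does:

```javascript
// CPU-side illustration of the per-pixel computation above: normalize
// the coordinate into [0, 1] and use it as the red/green channels.
// `pixelColor` is an illustrative helper, not a gl-renderer API.
function pixelColor(x, y, width, height) {
  const st = [x / width, y / height]; // vec2 st = gl_FragCoord.xy / resolution;
  return [st[0], st[1], 0, 1];        // FragColor = vec4(st, 0, 1);
}

// The bottom-left pixel is black; the top-right tends toward yellow.
console.log(pixelColor(0, 0, 400, 300));     // [0, 0, 0, 1]
console.log(pixelColor(400, 300, 400, 300)); // [1, 1, 0, 1]
```

This is why the rendered canvas shows a smooth red/green gradient: the color varies linearly with the normalized pixel position.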

The article then shows how to render a triangle using the library’s default vertex shader, which simply passes each vertex position through to gl_Position:

#version 300 es
precision highp float;
precision highp int;

in vec3 a_vertexPosition;

void main() {
  gl_PointSize = 1.0;
  gl_Position = vec4(a_vertexPosition, 1);
}
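To get a triangle on screen, geometry has to be supplied through the library’s mesh‑data API; the sketch below assumes gl-renderer’s `setMeshData` method with its positions/cells layout, and the clip‑space coordinates chosen here are illustrative:

```javascript
// Sketch of supplying triangle geometry, assuming gl-renderer's
// setMeshData convention: `positions` holds per-vertex attributes
// (matching the vec3 a_vertexPosition above) and `cells` lists which
// vertex indices form each primitive. Coordinates are illustrative.
const meshData = {
  positions: [
    [-1, -1, 0], // bottom-left corner, in clip space
    [0, 1, 0],   // top
    [1, -1, 0],  // bottom-right
  ],
  cells: [[0, 1, 2]], // one triangle referencing the three vertices
};

// renderer.setMeshData([meshData]);
// renderer.render();
```

Clip space runs from -1 to 1 on both axes, so this triangle spans the full canvas.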

It also demonstrates a custom vertex shader that scales the vertex positions by 0.5, reducing the triangle’s size:

const vertex = `#version 300 es
precision highp float;
precision highp int;

in vec3 a_vertexPosition;

void main() {
  gl_PointSize = 1.0;
  gl_Position = vec4(0.5 * a_vertexPosition, 1);
}`;
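The effect of the `0.5 *` factor can be checked with plain arithmetic: each clip‑space vertex moves halfway toward the origin. The helper name below is illustrative, not part of any API:

```javascript
// What `gl_Position = vec4(0.5 * a_vertexPosition, 1)` does to each
// vertex: a uniform scale about the clip-space origin.
// `scaleVertex` is an illustrative helper, not a gl-renderer API.
function scaleVertex([x, y, z], s) {
  return [s * x, s * y, s * z];
}

console.log(scaleVertex([-1, -1, 0], 0.5)); // [-0.5, -0.5, 0]
console.log(scaleVertex([0, 1, 0], 0.5));   // [0, 0.5, 0]
```

Because every vertex is scaled by the same factor, the triangle keeps its shape but shrinks to half its width and height.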

Finally, the article changes the primitive mode to LINE_STRIP to draw connected lines instead of a filled triangle, illustrating how different draw modes affect the output.
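WebGL’s primitive modes are plain GLenum constants defined by the OpenGL ES specification; the standard values are listed below, with `LINE_STRIP` connecting each vertex to the next by a line segment while `TRIANGLES` fills every vertex triple:

```javascript
// Standard WebGL primitive-mode constants (GLenum values from the
// OpenGL ES specification). Switching the draw mode changes how the
// same vertex list is assembled into points, lines, or triangles.
const DRAW_MODES = {
  POINTS: 0x0000,         // one point per vertex
  LINES: 0x0001,          // independent segments: (v0,v1), (v2,v3), ...
  LINE_LOOP: 0x0002,      // LINE_STRIP plus a closing segment back to v0
  LINE_STRIP: 0x0003,     // connected segments: v0-v1, v1-v2, ...
  TRIANGLES: 0x0004,      // independent triangles: (v0,v1,v2), (v3,v4,v5), ...
  TRIANGLE_STRIP: 0x0005, // each new vertex forms a triangle with the previous two
  TRIANGLE_FAN: 0x0006,   // every triangle shares the first vertex
};

console.log(DRAW_MODES.LINE_STRIP); // 3
console.log(DRAW_MODES.TRIANGLES);  // 4
```

With three vertices, LINE_STRIP draws only the two edges v0–v1 and v1–v2, which is why the result is an open outline rather than a filled triangle.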

The piece ends by noting that the default full‑canvas drawing area will be explored in the next tutorial.

Tags: frontend, graphics, JavaScript, GPU, WebGL, shaders
Written by ByteFE

Cutting‑edge tech, article sharing, and practical insights from the ByteDance frontend team.