Introduction
I wanted to try out WebGPU, so I looked into how to set up a development environment.
I got the Hello Triangle sample from WebGPU Samples running on a local server set up with Vite.
I also set things up so the project can be developed in TypeScript.
Setting Up the Development Environment
Creating a Working Folder
Create a working folder and move into it.
mkdir /path/to/work-dir
cd /path/to/work-dir
Development Environment
The development environment is Vite.
Please copy the following package.json.
{
"type": "module"
}
Then create package.json (pbpaste outputs the clipboard contents; it is a macOS command).
pbpaste > package.json
Install the necessary modules.
npm i -D typescript @webgpu/types vite
TypeScript
Please copy the following tsconfig.json.
Specifying @webgpu/types in compilerOptions.types lets TypeScript recognize the WebGPU API, and vite/client covers the ?raw shader imports used later.
{
"compilerOptions": {
"target": "es2016",
"module": "commonjs",
"esModuleInterop": true,
"forceConsistentCasingInFileNames": true,
"strict": true,
"skipLibCheck": true,
"types": ["@webgpu/types"]
},
"include": ["src"]
}
Then create tsconfig.json.
pbpaste > tsconfig.json
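As a quick sanity check (this file is not needed for the sample, and checkWebGPUTypes is a hypothetical name), the WebGPU API should now type-check as ambient globals without any imports:
// Hypothetical sanity check: with "@webgpu/types" in compilerOptions.types,
// navigator.gpu and the GPU* interfaces are available as global types
async function checkWebGPUTypes(): Promise<void> {
  const adapter: GPUAdapter | null = await navigator.gpu.requestAdapter()
  const device: GPUDevice | undefined = adapter ? await adapter.requestDevice() : undefined
  console.log(device?.label)
}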
index.html
Please copy the following index.html.
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>WebGPU</title>
</head>
<body>
<canvas></canvas>
<script type="module" src="/src/main.ts"></script>
</body>
</html>
Then create index.html.
pbpaste > index.html
src Folder
Create a folder to place your code in.
mkdir src
src/main.ts
Please copy the following code.
I have also left notes in the comments, so check them if you like.
// shader import requires ?raw
import vertexShader from './vertex.wgsl?raw'
import fragmentShader from './fragment.wgsl?raw'
main()
// Initializing WebGPU requires asynchronous processing
async function main() {
const canvas: HTMLCanvasElement = document.querySelector('canvas')!
// Request an adapter to access GPU hardware for WebGPU
const adapter = await navigator.gpu.requestAdapter()
if (!adapter) {
// Handling for environments not supporting WebGPU, part A
throw new Error('WebGPU is not supported: no GPU adapter was returned')
}
// Request a device for executing GPU commands and allocating memory
// The device will not be nullish (according to type definitions)
const device = await adapter.requestDevice()
// Get the WebGPU context
const context: GPUCanvasContext | null = canvas.getContext('webgpu')
if (!context) {
// Handling for environments not supporting WebGPU, part B
throw new Error('WebGPU is not supported: could not get a webgpu context')
}
const { devicePixelRatio } = window
canvas.width = devicePixelRatio * canvas.clientWidth
canvas.height = devicePixelRatio * canvas.clientHeight
// Get the canvas color format recommended by WebGPU
const presentationFormat = navigator.gpu.getPreferredCanvasFormat()
// Associate the GPU device and context
context.configure({
// Specify the GPU device to use
device,
// Specify the color format
format: presentationFormat,
// Specify the alpha blending mode
alphaMode: 'premultiplied',
})
// Create a render pipeline
// vertex|fragment shaders are combined, defining the rendering process
const pipeline = device.createRenderPipeline({
// Automatically set the pipeline layout
layout: 'auto',
// Configuration for vertex shader
vertex: {
module: device.createShaderModule({
code: vertexShader,
}),
},
// Configuration for fragment shader
fragment: {
module: device.createShaderModule({
code: fragmentShader,
}),
// Configuration for render targets
targets: [
{
// Specify the color format for render target
format: presentationFormat,
},
],
},
primitive: {
// Specify the topology of primitives to draw
topology: 'triangle-list',
},
})
function frame() {
// Create a command encoder to batch multiple GPU commands
const commandEncoder = device.createCommandEncoder()
// Get the texture to display rendering results
const textureView = context.getCurrentTexture().createView()
const renderPassDescriptor: GPURenderPassDescriptor = {
colorAttachments: [
{
view: textureView,
// Specify the background
clearValue: { r: 0.0, g: 0.0, b: 0.0, a: 1.0 },
// Specify the load operation
// 'load': Retains the drawing result from the previous frame
// 'clear': Initializes with the color specified in
// GPURenderPassDescriptor.colorAttachments.clearValue
// before drawing a new frame
loadOp: 'clear',
// Specify the store operation
// In this case, it is set to save the rendering result
storeOp: 'store',
},
],
}
// Begin recording GPU commands
const passEncoder = commandEncoder.beginRenderPass(renderPassDescriptor)
// Set the render pipeline
// This specifies the shaders and rendering settings to use
passEncoder.setPipeline(pipeline)
// Issue a draw call for 3 vertices (one triangle)
passEncoder.draw(3)
// End recording GPU commands
passEncoder.end()
// Submit the recorded commands to the GPU queue for execution
device.queue.submit([commandEncoder.finish()])
requestAnimationFrame(frame)
}
requestAnimationFrame(frame)
}
Then create src/main.ts.
pbpaste > src/main.ts
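If you want to factor the availability checks out of main() and also watch for device loss, a helper along the following lines is one option. This is only a sketch: requestDeviceOrFail is a hypothetical name, and the 'gpu' in navigator check and the device.lost logging are extras that the sample does not include.
// Hypothetical helper (not part of the sample): reports why initialization
// failed and logs device loss; only standard WebGPU APIs are used
async function requestDeviceOrFail(): Promise<GPUDevice> {
  if (!('gpu' in navigator)) {
    // The browser does not expose the WebGPU API at all
    throw new Error('WebGPU is not supported in this browser')
  }
  const adapter = await navigator.gpu.requestAdapter()
  if (!adapter) {
    // The API exists, but no suitable GPU adapter was returned
    throw new Error('No appropriate GPUAdapter was found')
  }
  const device = await adapter.requestDevice()
  // device.lost resolves if the device is lost (e.g. after a driver reset)
  device.lost.then((info) => console.warn('WebGPU device lost:', info.message))
  return device
}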
src/vertex.wgsl
Shaders for WebGPU are written in the WebGPU Shading Language (WGSL).
The syntax feels somewhat similar to Rust.
Please copy the following code.
I have also left notes in the comments, so check them if you like.
// The @vertex attribute indicates that this function is a vertex shader
@vertex
fn main(
// @builtin(vertex_index) is a built-in variable representing the vertex index
// This is used to identify vertices from the vertex buffer
@builtin(vertex_index) VertexIndex : u32
// Returns a 4-dimensional vector representing clip space coordinates
) -> @builtin(position) vec4f {
// pos is an array containing three 2-dimensional vectors
// These vectors represent the vertices of a triangle
var pos = array<vec2f, 3>(
vec2(0.0, 0.5),
vec2(-0.5, -0.5),
vec2(0.5, -0.5)
);
// Use the vertex index to retrieve the corresponding vertex coordinates from the array
// Convert the retrieved 2-dimensional vector to a 4-dimensional vector and return it
// The z-coordinate is set to 0.0, and the w-coordinate to 1.0
// This is necessary to place the triangle in clip space
return vec4f(pos[VertexIndex], 0.0, 1.0);
}
Then create src/vertex.wgsl.
pbpaste > src/vertex.wgsl
src/fragment.wgsl
Please copy the following code.
I have also left notes in the comments, so check them if you like.
// The @fragment attribute indicates that this function is a fragment shader
@fragment
// Returns a 4-dimensional vector to be written to color attachment index 0
fn main() -> @location(0) vec4f {
// Red color
return vec4(1.0, 0.0, 0.0, 1.0);
}
Then create src/fragment.wgsl.
pbpaste > src/fragment.wgsl
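As a side note, the two shaders could also live in a single WGSL file. The following is only a sketch of that alternative, not what this article builds: vs_main and fs_main are hypothetical entry point names (two functions in one module cannot both be called main), while the article's version can omit entryPoint because each module contains a single entry point.
// Sketch of an alternative setup: one shader module containing both stages.
// vs_main / fs_main are hypothetical names; entryPoint selects the WGSL
// function to use for each stage
function createPipelineFromSingleModule(
  device: GPUDevice,
  format: GPUTextureFormat,
  shaderSource: string,
): GPURenderPipeline {
  const module = device.createShaderModule({ code: shaderSource })
  return device.createRenderPipeline({
    layout: 'auto',
    vertex: { module, entryPoint: 'vs_main' },
    fragment: {
      module,
      entryPoint: 'fs_main',
      targets: [{ format }],
    },
    primitive: { topology: 'triangle-list' },
  })
}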
Running the Application
npx vite
Open the local URL that Vite prints (http://localhost:5173 by default) and a red triangle will be displayed in the browser.
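Note that main.ts sizes the canvas only once, at startup. If you want the drawing buffer to follow later size changes, a helper along these lines could be added; observeCanvasSize is a hypothetical name and the sample does not require it.
// Hypothetical addition: keep the canvas drawing buffer in sync with its
// displayed size; getCurrentTexture() returns a texture of the new size on
// the next frame
function observeCanvasSize(canvas: HTMLCanvasElement): void {
  const observer = new ResizeObserver((entries) => {
    for (const entry of entries) {
      const { width, height } = entry.contentRect
      canvas.width = Math.max(1, Math.floor(width * devicePixelRatio))
      canvas.height = Math.max(1, Math.floor(height * devicePixelRatio))
    }
  })
  observer.observe(canvas)
}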
Conclusion
Although it is simple, we now have an environment for developing with WebGPU.
As with Shadertoy, I plan to deepen my understanding of WebGPU by drawing pictures with shaders.
PR
VOTE is a web app for voting on binary choices, developed by Blue Inc.
Feel free to play with it on a variety of topics, from technical discussions to everyday choices.