Lightweight WebGPU Rendering System Example <42> - VSM Shadow Implementation (Source Code)

This article walks through example <42> of a lightweight WebGPU rendering system: implementing VSM (variance shadow map) shadows, with full source code. Hopefully it is a useful reference for developers working on similar problems.

The main steps of the forward real-time VSM shadow implementation are:

1. Encode the depth data and write it to an RTT (render-to-texture target).

2. Run the occlusion-information blur filter sampling vertically and then horizontally, writing each pass to its own RTT (the formula sketch after this list summarizes what these passes estimate).

3. Apply the result of the previous step (the RTT) to the materials that should receive shadows.
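The statistical idea behind these passes is the standard VSM derivation; it is not spelled out in the original text, but it is exactly what the shader code below computes. The blur passes estimate the mean and variance of the occluder depth d around each shadow-map texel,

    μ = E[d],  σ² = E[d²] − μ²

and the receiving material bounds the probability that a receiver at depth t is unoccluded using the one-sided Chebyshev inequality,

    P(d ≥ t) ≤ σ² / (σ² + (t − μ)²)   for t > μ

In the code, mean and squared_mean accumulate E[d] and E[d²], std_dev is σ, and softness_probability in VSMShadow() is this bound with t = compare; the extra remapping with 0.3 and 0.95 only reduces light bleeding.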

For the concrete code, see the implementation source attached at the end of this article.

GitHub address of this example's source code:

https://github.com/vilyLei/voxwebgpu/blob/feature/rendering/src/voxgpu/sample/BaseVSMShadowTest.ts

Running result of the current example:

The main WGSL shader code:

Depth encoding:

struct VertexOutput {
    @builtin(position) Position: vec4<f32>,
    @location(0) projPos: vec4<f32>,
    @location(1) objPos: vec4<f32>
}
@vertex
fn vertMain(@location(0) position: vec3<f32>) -> VertexOutput {
    let objPos = vec4(position.xyz, 1.0);
    let wpos = objMat * objPos;
    var output: VertexOutput;
    let projPos = projMat * viewMat * wpos;
    output.Position = projPos;
    output.projPos = projPos;
    output.objPos = objPos;
    return output;
}
const PackUpscale = 256. / 255.;     // fraction -> 0..1 (including 1)
const UnpackDownscale = 255. / 256.; // 0..1 -> fraction (excluding 1)
const PackFactors = vec3<f32>(256. * 256. * 256., 256. * 256., 256.);
const UnpackFactors = UnpackDownscale / vec4<f32>(PackFactors, 1.0);
const ShiftRight8 = 1. / 256.;
fn packDepthToRGBA(v: f32) -> vec4<f32> {
    var r = vec4<f32>(fract(v * PackFactors), v);
    let v3 = r.yzw - (r.xyz * ShiftRight8);
    r = vec4<f32>(v3.x, v3);
    return r * PackUpscale;
}
@fragment
fn fragMain(@location(0) projPos: vec4<f32>, @location(1) objPos: vec4<f32>) -> @location(0) vec4<f32> {
    let fragCoordZ = 0.5 * projPos[2] / projPos[3] + 0.5;
    var color4 = packDepthToRGBA(fragCoordZ);
    return color4;
}
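The depth pass renders into an ordinary RGBA8 color RTT, so the [0,1] depth value is spread across the four 8-bit channels with the classic fixed-point packing (the same constants three.js uses in packDepthToRGBA). For orientation, here is a CPU-side sketch of the corresponding unpack, e.g. for checking read-back pixel values; this helper is not part of the example source:

// Decode a packed depth value from one RGBA8 pixel (channel bytes in [0, 255]).
// Mirrors unpackRGBAToDepth() in the WGSL above: dot(rgba, UnpackFactors).
function unpackRGBAToDepthCPU(r: number, g: number, b: number, a: number): number {
    const UnpackDownscale = 255 / 256;
    const PackFactors = [256 * 256 * 256, 256 * 256, 256, 1];
    const v = [r / 255, g / 255, b / 255, a / 255]; // normalize bytes to [0, 1]
    return UnpackDownscale * (
        v[0] / PackFactors[0] + v[1] / PackFactors[1] + v[2] / PackFactors[2] + v[3] / PackFactors[3]
    );
}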

Vertical and horizontal occlusion-information blur filter sampling:

struct VertexOutput {
    @builtin(position) Position: vec4<f32>,
    @location(0) uv: vec2<f32>
}
@vertex
fn vertMain(@location(0) position: vec3<f32>, @location(1) uv: vec2<f32>) -> VertexOutput {
    var output: VertexOutput;
    output.Position = vec4(position.xyz, 1.0);
    output.uv = uv;
    return output;
}
const PackUpscale = 256. / 255.;     // fraction -> 0..1 (including 1)
const UnpackDownscale = 255. / 256.; // 0..1 -> fraction (excluding 1)
const PackFactors = vec3<f32>(256. * 256. * 256., 256. * 256., 256.);
const UnpackFactors = UnpackDownscale / vec4<f32>(PackFactors, 1.0);
const ShiftRight8 = 1. / 256.;
fn packDepthToRGBA(v: f32) -> vec4<f32> {
    var r = vec4<f32>(fract(v * PackFactors), v);
    let v3 = r.yzw - (r.xyz * ShiftRight8);
    return vec4<f32>(v3.x, v3) * PackUpscale;
}
fn unpackRGBAToDepth(v: vec4<f32>) -> f32 {
    return dot(v, UnpackFactors);
}
fn pack2HalfToRGBA(v: vec2<f32>) -> vec4<f32> {
    let r = vec4(v.x, fract(v.x * 255.0), v.y, fract(v.y * 255.0));
    return vec4<f32>(r.x - r.y / 255.0, r.y, r.z - r.w / 255.0, r.w);
}
fn unpackRGBATo2Half(v: vec4<f32>) -> vec2<f32> {
    return vec2<f32>(v.x + (v.y / 255.0), v.z + (v.w / 255.0));
}
const SAMPLE_RATE = 0.25;
const HALF_SAMPLE_RATE = 0.125;
@fragment
fn fragMain(@location(0) uv: vec2<f32>) -> @location(0) vec4<f32> {
    var mean = 0.0;
    var squared_mean = 0.0;
    let resolution = viewParam.zw;
    let fragCoord = resolution * uv;
    let radius = param[3];
    let c4 = textureSample(shadowDepthTexture, shadowDepthSampler, uv);
    var depth = unpackRGBAToDepth(c4);
    for (var i = -1.0; i < 1.0; i += SAMPLE_RATE) {
        #ifdef USE_HORIZONAL_PASS
        let distribution = unpackRGBATo2Half(textureSample(shadowDepthTexture, shadowDepthSampler, (fragCoord.xy + vec2(i, 0.0) * radius) / resolution));
        mean += distribution.x;
        squared_mean += distribution.y * distribution.y + distribution.x * distribution.x;
        #else
        depth = unpackRGBAToDepth(textureSample(shadowDepthTexture, shadowDepthSampler, (fragCoord.xy + vec2(0.0, i) * radius) / resolution));
        mean += depth;
        squared_mean += depth * depth;
        #endif
    }
    mean = mean * HALF_SAMPLE_RATE;
    squared_mean = squared_mean * HALF_SAMPLE_RATE;
    let std_dev = sqrt(squared_mean - mean * mean);
    var color4 = pack2HalfToRGBA(vec2<f32>(mean, std_dev));
    return color4;
}
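The same shader source is used for both blur passes; only the USE_HORIZONAL_PASS branch differs. In the TypeScript example at the end of this article this is wired up roughly as in the condensed sketch below (the variable names shadowDepthRTT and occVRTT stand in for this.mShadowDepthRTT and this.mOccVRTT; the full code is in applyBuildDepthOccVRTT()/applyBuildDepthOccHRTT() further down):

// Vertical pass: reads the packed-depth RTT, writes the packed (mean, std_dev) distribution.
const vMaterial = new ShadowOccBlurMaterial();
vMaterial.property.setShadowRadius(2.0);   // blur radius in texels
vMaterial.property.setViewSize(512, 512);  // shadow-map resolution
vMaterial.addTextures([shadowDepthRTT]);   // input: depth RTT from the first pass

// Horizontal pass: reads the vertical-pass RTT and blurs the distribution again.
const hMaterial = new ShadowOccBlurMaterial();
hMaterial.property.setShadowRadius(2.0);
hMaterial.property.setViewSize(512, 512);
hMaterial.property.toHorizonalBlur();      // presumably selects the USE_HORIZONAL_PASS shader variant
hMaterial.addTextures([occVRTT]);          // input: result of the vertical pass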

Applying the result to shadow-receiving materials (example usage):


struct VertexOutput {
    @builtin(position) Position: vec4<f32>,
    @location(0) uv: vec2<f32>,
    @location(1) worldNormal: vec3<f32>,
    @location(2) svPos: vec4<f32>
}
@vertex
fn vertMain(@location(0) position: vec3<f32>, @location(1) uv: vec2<f32>, @location(2) normal: vec3<f32>) -> VertexOutput {
    let objPos = vec4(position.xyz, 1.0);
    let wpos = objMat * objPos;
    var output: VertexOutput;
    let projPos = projMat * viewMat * wpos;
    output.Position = projPos;
    // output.normal = normal;
    let invMat33 = inverseM33(m44ToM33(objMat));
    output.uv = uv;
    output.worldNormal = normalize(normal * invMat33);
    output.svPos = shadowMatrix * wpos;
    return output;
}
fn pack2HalfToRGBA(v: vec2<f32>) -> vec4<f32> {
    let r = vec4(v.x, fract(v.x * 255.0), v.y, fract(v.y * 255.0));
    return vec4<f32>(r.x - r.y / 255.0, r.y, r.z - r.w / 255.0, r.w);
}
fn unpackRGBATo2Half(v: vec4<f32>) -> vec2<f32> {
    return vec2<f32>(v.x + (v.y / 255.0), v.z + (v.w / 255.0));
}
fn texture2DDistribution(uv: vec2<f32>) -> vec2<f32> {
    let v4 = textureSample(shadowDepthTexture, shadowDepthSampler, uv);
    return unpackRGBATo2Half(v4);
}
fn VSMShadow(uv: vec2<f32>, compare: f32) -> f32 {
    var occlusion = 1.0;
    let distribution = texture2DDistribution(uv);
    let hard_shadow = step(compare, distribution.x); // hard shadow
    if (hard_shadow != 1.0) {
        let distance = compare - distribution.x;
        let variance = max(0.00000, distribution.y * distribution.y);
        var softness_probability = variance / (variance + distance * distance); // Chebyshev's inequality
        softness_probability = clamp((softness_probability - 0.3) / (0.95 - 0.3), 0.0, 1.0); // 0.3 reduces light bleed
        occlusion = clamp(max(hard_shadow, softness_probability), 0.0, 1.0);
    }
    return occlusion;
}
fn getVSMShadow(shadowMapSize: vec2<f32>, shadowBias: f32, shadowRadius: f32, shadowCoordP: vec4<f32>) -> f32 {
    var shadowCoord = vec4<f32>(shadowCoordP.xyz / vec3<f32>(shadowCoordP.w), shadowCoordP.z + shadowBias);
    let inFrustumVec = vec4<bool>(shadowCoord.x >= 0.0, shadowCoord.x <= 1.0, shadowCoord.y >= 0.0, shadowCoord.y <= 1.0);
    let inFrustum = all(inFrustumVec);
    let frustumTestVec = vec2<bool>(inFrustum, shadowCoord.z <= 1.0);
    var shadow = VSMShadow(shadowCoord.xy, shadowCoord.z);
    if (!all(frustumTestVec)) {
        shadow = 1.0;
    }
    return shadow;
}
@fragment
fn fragMain(@location(0) uv: vec2<f32>, @location(1) worldNormal: vec3<f32>, @location(2) svPos: vec4<f32>) -> @location(0) vec4<f32> {
    var color = vec4<f32>(1.0);
    var shadow = getVSMShadow(params[1].xy, params[0].x, params[0].z, svPos);
    let shadowIntensity = 1.0 - params[0].w;
    shadow = clamp(shadow, 0.0, 1.0) * (1.0 - shadowIntensity) + shadowIntensity;
    var f = clamp(dot(worldNormal, params[2].xyz), 0.0, 1.0);
    if (f > 0.0001) {
        f = min(shadow, clamp(f, shadowIntensity, 1.0));
    } else {
        f = shadowIntensity;
    }
    var color4 = vec4<f32>(color.xyz * vec3(f * 0.9 + 0.1), 1.0);
    return color4;
}
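One detail worth calling out: svPos is the world position transformed by shadowMatrix, which the TypeScript code below builds from the shadow camera's view-projection matrix combined with a bias transform of scale (0.5, −0.5, 0.5) and translation (0.5, 0.5, 0.5). After the perspective divide in getVSMShadow() this amounts to, roughly,

    shadowCoord.xy = (0.5·x_ndc + 0.5, −0.5·y_ndc + 0.5)   (light-space NDC remapped to texture UV)
    shadowCoord.z  ≈ 0.5·z_ndc + 0.5                       (same encoding as fragCoordZ = 0.5·z/w + 0.5 in the depth pass)

so the comparison value and the stored depth use the same encoding. Treat this as a sketch: the exact composition order depends on how Matrix4.append multiplies in this engine.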

This example is built on top of this rendering system. The TypeScript source code of the current example is as follows:

export class BaseVSMShadowTest {
    private mRscene = new RendererScene();
    private mShadowCamera: Camera;
    private mDebug = false;

    initialize(): void {
        this.mRscene.initialize({
            canvasWith: 512,
            canvasHeight: 512,
            rpassparam: { multisampleEnabled: true }
        });
        this.initScene();
        this.initEvent();
    }

    private mEntities: Entity3D[] = [];

    private initScene(): void {
        let rc = this.mRscene;
        this.buildShadowCam();

        let sph = new SphereEntity({
            radius: 80,
            transform: {
                position: [-230.0, 100.0, -200.0]
            }
        });
        this.mEntities.push(sph);
        rc.addEntity(sph);

        let box = new BoxEntity({
            minPos: [-30, -30, -30],
            maxPos: [130, 230, 80],
            transform: {
                position: [160.0, 100.0, -210.0],
                rotation: [50, 130, 80]
            }
        });
        this.mEntities.push(box);
        rc.addEntity(box);

        let torus = new TorusEntity({
            transform: {
                position: [160.0, 100.0, 210.0],
                rotation: [50, 30, 80]
            }
        });
        this.mEntities.push(torus);
        rc.addEntity(torus);

        if (!this.mDebug) {
            this.applyShadow();
        }
    }

    private mShadowDepthRTT = { uuid: "rtt-shadow-depth", rttTexture: {}, shdVarName: 'shadowDepth' };
    private mOccVRTT = { uuid: "rtt--occV", rttTexture: {}, shdVarName: 'shadowDepth' };
    private mOccHRTT = { uuid: "rtt--occH", rttTexture: {}, shdVarName: 'shadowDepth' };

    private applyShadowDepthRTT(): void {
        let rc = this.mRscene;
        // rtt texture proxy descriptor
        let rttTex = this.mShadowDepthRTT;
        // define a rtt pass color colorAttachment0
        let colorAttachments = [
            {
                texture: rttTex,
                // white clear background color
                clearValue: { r: 1, g: 1, b: 1, a: 1.0 },
                loadOp: "clear",
                storeOp: "store"
            }
        ];
        // create a separate rtt rendering pass
        let rPass = rc.createRTTPass({ colorAttachments });
        rPass.node.camera = this.mShadowCamera;

        let extent = [-0.5, -0.5, 0.8, 0.8];
        const shadowDepthShdSrc = {
            shaderSrc: { code: shadowDepthWGSL, uuid: "shadowDepthShdSrc" }
        };
        let material = this.createDepthMaterial(shadowDepthShdSrc);
        let es = this.createDepthEntities([material], false);
        for (let i = 0; i < es.length; ++i) {
            rPass.addEntity(es[i]);
        }

        // display the rendering result
        extent = [-0.95, -0.95, 0.4, 0.4];
        let entity = new FixScreenPlaneEntity({ extent, flipY: true, textures: [{ diffuse: rttTex }] });
        rc.addEntity(entity);
    }

    private applyBuildDepthOccVRTT(): void {
        let rc = this.mRscene;
        // rtt texture proxy descriptor
        let rttTex = this.mOccVRTT;
        // define a rtt pass color colorAttachment0
        let colorAttachments = [
            {
                texture: rttTex,
                // white clear background color
                clearValue: { r: 1, g: 1, b: 1, a: 1.0 },
                loadOp: "clear",
                storeOp: "store"
            }
        ];
        // create a separate rtt rendering pass
        let rPass = rc.createRTTPass({ colorAttachments });

        let material = new ShadowOccBlurMaterial();
        let ppt = material.property;
        ppt.setShadowRadius(this.mShadowRadius);
        ppt.setViewSize(this.mShadowMapW, this.mShadowMapH);
        material.addTextures([this.mShadowDepthRTT]);

        let extent = [-1, -1, 2, 2];
        let rttEntity = new FixScreenPlaneEntity({ extent, materials: [material] });
        rPass.addEntity(rttEntity);

        // display the rendering result
        extent = [-0.5, -0.95, 0.4, 0.4];
        let entity = new FixScreenPlaneEntity({ extent, flipY: true, textures: [{ diffuse: rttTex }] });
        rc.addEntity(entity);
    }

    private applyBuildDepthOccHRTT(): void {
        let rc = this.mRscene;
        // rtt texture proxy descriptor
        let rttTex = this.mOccHRTT;
        // define a rtt pass color colorAttachment0
        let colorAttachments = [
            {
                texture: rttTex,
                // white clear background color
                clearValue: { r: 1, g: 1, b: 1, a: 1.0 },
                loadOp: "clear",
                storeOp: "store"
            }
        ];
        // create a separate rtt rendering pass
        let rPass = rc.createRTTPass({ colorAttachments });

        let material = new ShadowOccBlurMaterial();
        let ppt = material.property;
        ppt.setShadowRadius(this.mShadowRadius);
        ppt.setViewSize(this.mShadowMapW, this.mShadowMapH);
        material.property.toHorizonalBlur();
        material.addTextures([this.mOccVRTT]);

        let extent = [-1, -1, 2, 2];
        let rttEntity = new FixScreenPlaneEntity({ extent, materials: [material] });
        rPass.addEntity(rttEntity);

        // display the rendering result
        extent = [-0.05, -0.95, 0.4, 0.4];
        let entity = new FixScreenPlaneEntity({ extent, flipY: true, textures: [{ diffuse: rttTex }] });
        rc.addEntity(entity);
    }

    private createDepthMaterial(shaderSrc: WGRShderSrcType, faceCullMode = "none"): WGMaterial {
        let pipelineDefParam = {
            depthWriteEnabled: true,
            faceCullMode,
            blendModes: [] as string[]
        };
        const material = new WGMaterial({
            shadinguuid: "shadow-depth_material",
            shaderSrc,
            pipelineDefParam
        });
        return material;
    }

    private createDepthEntities(materials: WGMaterial[], flag = false): Entity3D[] {
        const rc = this.mRscene;
        let entities = [];
        let ls = this.mEntities;
        let tot = ls.length;
        for (let i = 0; i < tot; ++i) {
            let et = ls[i];
            let entity = new Entity3D({ transform: et.transform });
            entity.materials = materials;
            entity.geometry = et.geometry;
            entities.push(entity);
            if (flag) {
                rc.addEntity(entity);
            }
        }
        return entities;
    }

    private mShadowBias = -0.0005;
    private mShadowRadius = 2.0;
    private mShadowMapW = 512;
    private mShadowMapH = 512;
    private mShadowViewW = 1300;
    private mShadowViewH = 1300;

    private buildShadowCam(): void {
        const cam = new Camera({
            eye: [600.0, 800.0, -600.0],
            near: 0.1,
            far: 1900,
            perspective: false,
            viewWidth: this.mShadowViewW,
            viewHeight: this.mShadowViewH
        });
        cam.update();
        this.mShadowCamera = cam;

        const rsc = this.mRscene;
        let frameColors = [[1.0, 0.0, 1.0], [0.0, 1.0, 1.0], [1.0, 0.0, 0.0], [0.0, 1.0, 1.0]];
        let boxFrame = new BoundsFrameEntity({ vertices8: cam.frustum.vertices, frameColors });
        rsc.addEntity(boxFrame);
    }

    private initEvent(): void {
        const rc = this.mRscene;
        rc.addEventListener(MouseEvent.MOUSE_DOWN, this.mouseDown);
        new MouseInteraction().initialize(rc, 0, false).setAutoRunning(true);
    }

    private mFlag = -1;

    private buildShadowReceiveEntity(): void {
        let cam = this.mShadowCamera;

        let transMatrix = new Matrix4();
        transMatrix.setScaleXYZ(0.5, -0.5, 0.5);
        transMatrix.setTranslationXYZ(0.5, 0.5, 0.5);
        let shadowMat = new Matrix4();
        shadowMat.copyFrom(cam.viewProjMatrix);
        shadowMat.append(transMatrix);

        let material = new ShadowReceiveMaterial();
        let ppt = material.property;
        ppt.setShadowRadius(this.mShadowRadius);
        ppt.setShadowBias(this.mShadowBias);
        ppt.setShadowSize(this.mShadowMapW, this.mShadowMapH);
        ppt.setShadowMatrix(shadowMat);
        ppt.setDirec(cam.nv);
        material.addTextures([this.mOccHRTT]);

        const rc = this.mRscene;
        let plane = new PlaneEntity({
            axisType: 1,
            extent: [-600, -600, 1200, 1200],
            transform: { position: [0, -1, 0] },
            materials: [material]
        });
        rc.addEntity(plane);
    }

    private applyShadow(): void {
        this.applyShadowDepthRTT();
        this.applyBuildDepthOccVRTT();
        this.applyBuildDepthOccHRTT();
        this.buildShadowReceiveEntity();
    }

    private mouseDown = (evt: MouseEvent): void => {
        this.mFlag++;
        if (this.mDebug) {
            if (this.mFlag == 0) {
                this.applyShadowDepthRTT();
            } else if (this.mFlag == 1) {
                this.applyBuildDepthOccVRTT();
            } else if (this.mFlag == 2) {
                this.applyBuildDepthOccHRTT();
            } else if (this.mFlag == 3) {
                this.buildShadowReceiveEntity();
            }
        }
    };

    run(): void {
        this.mRscene.run();
    }
}
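A minimal usage sketch, assuming a standard browser entry point (the class itself only exposes initialize() and run(), as shown above):

// Hypothetical entry point: construct the demo, build the scene and passes, start the render loop.
const demo = new BaseVSMShadowTest();
demo.initialize();
demo.run();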

That concludes this article on the lightweight WebGPU rendering system example <42>: the VSM shadow implementation with source code. We hope it is helpful to other developers!


