GLSL Tutorial
Introduction
In this tutorial shader programming using GLSL will be covered. There is an introduction to the specification, but reading the OpenGL 2.0 and GLSL official specs is always recommended if you get serious about this. It is assumed that the reader is familiar with OpenGL programming, as this is required to understand some parts of the tutorial.
GLSL stands for GL Shading Language, often referred to as glslang, and was defined by the Architecture Review Board of OpenGL, the governing body of OpenGL.
I won't go into disputes, or comparisons, with Cg, Nvidia's proposal for a shading language that is also compatible with OpenGL. The only reason I chose GLSL and not Cg for this tutorial is GLSL's closeness to OpenGL.
Before writing shaders, in any language, it is a good idea to understand the basics of the graphics pipeline. This will provide a context to introduce shaders, what types of shaders are available, and what shaders are supposed to do. It will also show what shaders can't do, which is equally important.
After this introduction the OpenGL setup for GLSL is discussed. The necessary steps to use a shader in an OpenGL application are discussed in some detail. Finally it is shown how an OpenGL application can feed data to a shader, making it more flexible and powerful.
Some basic concepts such as data types, variables, statements and function definitions are then introduced.
Please bear in mind that this is work in progress and therefore bugs are likely to be present in the text or demos. Let me know if you find any bugs, regardless of how insignificant, so that I can clean them up. Also, suggestions are more than welcome. I hope you enjoy the tutorial.
Pipeline Overview
The following figure is a (very) simplified diagram of the pipeline stages and the data that travels amongst them. Although extremely simplified, it is enough to present some important concepts for shader programming. In this subsection the fixed functionality of the pipeline is presented. Note that this pipeline is an abstraction and does not necessarily match any particular implementation in all its steps.
Vertex Transformation
In here a vertex is a set of attributes such as its location in space, as well as its color, normal, and texture coordinates, amongst others. The inputs for this stage are the individual vertex attributes. Some of the operations performed by the fixed functionality at this stage are:
•Vertex position transformation
•Lighting computations per vertex
•Generation and transformation of texture coordinates
Primitive Assembly and Rasterization
The inputs for this stage are the transformed vertices, as well as connectivity information. This latter piece of data tells the pipeline how the vertices connect to form a primitive. It is in here that primitives are assembled.
This stage is also responsible for clipping operations against the view frustum, and back face culling.
Rasterization determines the fragments, and pixel positions of the primitive. A fragment in this context is a piece of data that will be used to update a pixel in the frame buffer at a specific location. A fragment contains not only color, but also normals and texture coordinates, amongst other possible attributes, that are used to compute the new pixel's color.
The output of this stage is twofold:
•The position of the fragments in the frame buffer
•The interpolated values for each fragment of the attributes computed in the vertex transformation stage
The values computed at the vertex transformation stage, combined with the vertex connectivity information, allow this stage to compute the appropriate attributes for the fragment. For instance, each vertex has a transformed position. When considering the vertices that make up a primitive, it is possible to compute the position of the fragments of the primitive. Another example is the usage of color. If a triangle has vertices with different colors, then the color of the fragments inside the triangle is obtained by interpolation of the triangle's vertex colors, weighted by the relative distances of the vertices to the fragment.
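As an illustration of that weighting (the interpolation itself is performed by the rasterizer, not by user code), the interpolated color can be written as a GLSL-style function of hypothetical barycentric weights b, with b.x + b.y + b.z == 1.0:
// Illustration only: the rasterizer performs this interpolation internally.
// c0, c1, c2 are the triangle's vertex colors; b holds the fragment's
// barycentric weights with respect to those three vertices.
vec4 interpolatedColor(vec4 c0, vec4 c1, vec4 c2, vec3 b)
{
    return b.x * c0 + b.y * c1 + b.z * c2;
}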
Fragment Texturing and Coloring
Interpolated fragment information is the input of this stage. A color has already been computed in the previous stage through interpolation, and in here it can be combined with a texel (texture element) for example. Texture coordinates have also been interpolated in the previous stage. Fog is also applied at this stage. The common end result of this stage per fragment is a color value and a depth for the fragment.
Raster Operations
The inputs of this stage are:
•The pixel's location
•The fragment's depth and color values
The last stage of the pipeline performs a series of tests on the fragment, namely:
•Scissor test
•Alpha test
•Stencil test
•Depth test
If successful, the fragment information is then used to update the pixel's value according to the current blend mode. Notice that blending occurs only at this stage because the Fragment Texturing and Coloring stage has no access to the frame buffer. The frame buffer is only accessible at this stage.
Visual Summary of the Fixed Functionality
The following figure presents a visual description of the stages presented above:
Replacing Fixed Functionality
Recent graphics cards give the programmer the ability to define the functionality of two of the stages described above:
•Vertex shaders may be written for the Vertex Transformation stage.
•Fragment shaders replace the Fragment Texturing and Coloring stage's fixed functionality.
In the next subsections the programmable stages, hereafter the vertex processor and the fragment processor, are described.
Vertex Processor
The vertex processor is responsible for running the vertex shaders. The input for a vertex shader is the vertex data, namely its position, color, normal, etc., depending on what the OpenGL application sends.
The following OpenGL code would send to the vertex processor a color and a vertex position for each vertex:
glBegin(...);                   // primitive type goes here, e.g. GL_TRIANGLES
    glColor3f(0.2, 0.4, 0.6);   // color attribute for the next vertex
    glVertex3f(-1.0, 1.0, 2.0); // issuing the vertex sends it down the pipeline
    glColor3f(0.2, 0.4, 0.8);
    glVertex3f(1.0, -1.0, 2.0);
glEnd();
In a vertex shader you can write code for tasks such as:
•Vertex position transformation using the modelview and projection matrices
•Normal transformation, and if required its normalization
•Texture coordinate generation and transformation
•Lighting per vertex or computing values for lighting per pixel
•Color computation
There is no requirement to perform all the operations above; your application may not use lighting, for instance. However, once you write a vertex shader you are replacing the full functionality of the vertex processor, hence you can't perform normal transformation and expect the fixed functionality to perform texture coordinate generation. When a vertex shader is used it becomes responsible for replacing all the needed functionality of this stage of the pipeline.
As can be seen in the previous subsection, the vertex processor has no information regarding connectivity, hence operations that require topological knowledge can't be performed in here. For instance it is not possible for a vertex shader to perform back face culling, since it operates on vertices and not on faces. The vertex processor processes vertices individually and has no knowledge of the remaining vertices.
The vertex shader is responsible for writing at least one variable: gl_Position, usually transforming the vertex with the modelview and projection matrices.
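As an illustration, a minimal vertex shader that does nothing beyond this requirement could look like the sketch below, written against the classic GLSL built-ins gl_Vertex and gl_ModelViewProjectionMatrix:
// Minimal vertex shader: writes only gl_Position, transforming the
// incoming vertex by the combined modelview and projection matrices.
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}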
A vertex processor has access to OpenGL state, so it can perform operations that involve lighting, for instance, and use materials. It can also access textures (only available in the newest hardware). There is no access to the frame buffer.
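For instance, a hypothetical shader along the lines of the sketch below (assuming light 0 is set up as a directional light, so that its position holds the light direction) could read the fixed-function light and material state to compute a simple per-vertex diffuse color:
// Sketch only: reads fixed-function state (light 0 and the front material)
// to compute a simple diffuse color per vertex.
void main()
{
    vec3 normal   = normalize(gl_NormalMatrix * gl_Normal);
    vec3 lightDir = normalize(vec3(gl_LightSource[0].position));
    float NdotL   = max(dot(normal, lightDir), 0.0);

    gl_FrontColor = NdotL * gl_FrontMaterial.diffuse * gl_LightSource[0].diffuse;
    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;
}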
Fragment Processor
The fragment processor is where the fragment shaders run. This unit is responsible for operations like:
•Computing colors and texture coordinates per pixel
•Texture application
•Fog computation
•Computing normals if you want lighting per pixel
The inputs for this unit are the interpolated values computed in the previous stage of the pipeline, such as vertex positions, colors, normals, etc.
In the vertex shader the values are computed for each vertex. Now we're dealing with the fragments inside the primitives, hence the need for the interpolated values.
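As a counterpart to the minimal vertex shader shown earlier, a minimal fragment shader could simply forward the interpolated color to the output. The sketch below assumes the vertex stage passes a per-vertex color through (for example by writing the built-in gl_FrontColor), so that the built-in gl_Color holds the interpolated value:
// Minimal fragment shader: writes the interpolated color, available
// through the built-in gl_Color, to the fragment's output color.
void main()
{
    gl_FragColor = gl_Color;
}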
