## Ray tracing, Patterns, and the Web

(Caveat before I even say anything else: the ray tracer is unoptimized and really stresses the browser’s Javascript implementation. I recommend using Google Chrome for viewing it as Chrome is notably faster at the moment…)

I managed to reorganize the Javascript-based ray tracer such that it can be used as a jQuery plug-in: transform any CANVAS element into a ray-traced scene. I then incorporated the 2D patterns library I’ve been working on for LxLang into the page. The result? An HTML page with multiple embedded ray tracers, rendering the various 2D procedural patterns onto a ray-traced sphere. It’s really exciting to see all these technologies working together.

Next up, I might work on using the HTML5 File API to add some image caching – the page load time for the below is a bit ridiculous.

Click here to view the ray tracing page. Or click the image below to get a static image of the results (admittedly, a much faster approach).

Written by arthur

January 7th, 2012 at 2:41 pm

Posted in lxengine

## 2D Patterns

Been working on a small Javascript library for generating 2D procedural patterns (with the ShaderBuilder in mind).

See the progress here or click on the image below.

Written by arthur

December 30th, 2011 at 2:16 pm

Posted in lxengine

## Ray Tracing (Yet Again)

I obviously must enjoy writing basic ray tracers…

My recent realization has been that developing in Javascript – or, more precisely, in a browser-enabled language – makes a lot of sense for me at this point.  Immediately demonstrable results, along with a flexible, dynamic language for experimentation, offer higher utility for my goals than highly optimized offline compilation.  At least for now.

So, I’m working on rewriting the LxEngine ray-tracer and ShaderBuilder in Javascript.  I’m rather excited about the possibilities that a dynamic language will open.  In any case, it’s still at a very early stage but I’m excited by the rapid progression:

Written by arthur

December 27th, 2011 at 7:30 pm

Posted in lxengine

## Experimenting with Perlin Noise

### Update

Here’s another interesting noise pattern and the generating function, possibly useful as the basis for a stylized rock wall texture:

    function dim_sqr(s, t)
    {
        var f = Math.fract([s, t]);
        var d = Math.abs([f[0] - .5, f[1] - .5]);
        return d[0] * d[1] * 4;
    }

    function f8(s, t)
    {
        s += Math.noise3d(s, t, .4);
        t += Math.noise3d(s, t, .4);
        var v = dim_sqr(s, t);
        return [v, v, v];
    }

Slightly more interesting rocks?

    function f9(s, t)
    {
        s += Math.noise3d(s, t, .4);
        t += Math.noise3d(s, t, .4);
        var v = Math.checker_dim(s, t);
        s += Math.noise3d(s, t, 22.4);
        t += Math.noise3d(s, t, 22.4);
        var u = Math.spot_dim(.4, [s, t]);
        var w = Math.max(u, v);

        return [w, w, w];
    }

### Original Post

I implemented a Perlin noise function quite some time ago (nothing fancy – more or less retyped from Perlin’s revised algorithm as described here). I recently exposed the noise3d function to the Javascript Math object in LxEngine and played with some procedural textures generated from Javascript fragments (as described in the last post).
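Since Math.noise3d is an LxEngine extension rather than part of standard JavaScript, here is a minimal hash-based value-noise stand-in for anyone who wants to run these fragments outside the engine. It is *not* Perlin’s gradient noise and not the engine’s implementation – just a sketch that is deterministic, continuous, and bounded in [0, 1), which is all the texture fragments in this post rely on:

```javascript
// Minimal hash-based 3D value noise, usable as a stand-in for the
// engine's Math.noise3d when running these fragments outside LxEngine.
function hash3(x, y, z) {
    // Deterministic integer hash of a lattice point, mapped into [0, 1).
    var h = (Math.imul(x, 374761393) + Math.imul(y, 668265263)
           + Math.imul(z, 1103515245)) | 0;
    h = Math.imul(h ^ (h >>> 13), 1274126177);
    h = h ^ (h >>> 16);
    return (h >>> 0) / 4294967296;
}

function fade(t) {
    // Perlin's quintic fade curve: 6t^5 - 15t^4 + 10t^3.
    return t * t * t * (t * (t * 6 - 15) + 10);
}

function lerp(a, b, t) { return a + (b - a) * t; }

Math.noise3d = Math.noise3d || function (x, y, z) {
    var xi = Math.floor(x), yi = Math.floor(y), zi = Math.floor(z);
    var u = fade(x - xi), v = fade(y - yi), w = fade(z - zi);
    // Trilinearly interpolate the eight lattice-corner hash values.
    return lerp(
        lerp(lerp(hash3(xi,     yi,     zi), hash3(xi + 1, yi,     zi), u),
             lerp(hash3(xi,     yi + 1, zi), hash3(xi + 1, yi + 1, zi), u), v),
        lerp(lerp(hash3(xi,     yi,     zi + 1), hash3(xi + 1, yi,     zi + 1), u),
             lerp(hash3(xi,     yi + 1, zi + 1), hash3(xi + 1, yi + 1, zi + 1), u), v),
        w);
};
```

The `||` guard means the stand-in only takes effect where the engine has not already supplied its own noise3d.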

I’m sure with a bit of research I could find known algorithms for generating some good patterns from variations on noise, but below are some textures generated simply by experimentation. Note that all the textures are generated in an (s, t) domain of (0, 0) – (8, 8). The textures are all generated from continuous functions but, as the images make obvious, they are not seamless over the given domain.

### Functions

The images, from left to right and then top to bottom, correspond to the functions f1 through f7 below:

    function f1(s, t)
    {
        var v0 = Math.noise3d(s, t, .10);
        var v1 = Math.noise3d(2.1 * s, 2.1 * t, 88.22);
        var v2 = Math.noise3d(2 * s, 2 * t, .34);
        var v = Math.max(v0, v1, v2);
        var n = Math.min(v0, v1, v2);
        v = Math.mix(v, n, .5);
        return [v, v, v];
    }

    function f2(s, t)
    {
        var v0 = Math.noise3d(s, t, .10);
        var v1 = Math.noise3d(2.1 * s, 2.1 * t, 88.22);
        var v2 = Math.noise3d(2 * s, 2 * t, .34);
        var v = Math.max(v0, v1, v2);
        return [v, v, v];
    }

    function f3(s, t)
    {
        var v = Math.noise3d(s, t, .10);
        v = .5 - Math.pow(Math.abs(v - .5), .5);
        return [v, v, v];
    }

    function f4(s, t)
    {
        var v0 = Math.noise3d(s, t, .10);
        var v1 = Math.noise3d(2.1 * s, 2.1 * t, 88.22);
        var v2 = Math.noise3d(2 * s, 2 * t, .34);
        var v = Math.max(v0, v1, v2);
        var n = Math.min(v0, v1, v2);
        v = Math.mix(v, n, .5);
        v = .5 + Math.sign(v - .5) * Math.pow(Math.abs(v - .5), .5);
        return [v, v, v];
    }

    function f5(s, t)
    {
        var v0 = Math.noise3d(s, t, .10);
        var v1 = Math.noise3d(2.1 * s, 2.1 * t, 88.22);
        var v2 = Math.noise3d(2 * s, 2 * t, .34);
        var v = Math.max(v0, v1, v2);
        var n = Math.min(v0, v1, v2);
        v = Math.mix(v, n, .5);
        v *= .5 + Math.pow(Math.abs(Math.fract(Math.mix(Math.fract(4 * s), v0, v1)) - .5), 1);
        return [v, v, v];
    }

    function f6(s, t)
    {
        var r = 1.62 * Math.noise3d(s, t, .31);
        var s1 = s * Math.cos(r) + t * Math.sin(r);
        var t1 = s * Math.sin(r) - t * Math.cos(r);
        var v = Math.mix((Math.sin(s1 * 6.28) + 1) / 2, (Math.cos(t1 * 6.28) + 1) / 2, .5);
        return [v, v, v];
    }

    function f7(s, t)
    {
        var v = Math.max( f6(s, t)[0], f6(s * 2, t * 2)[0] );
        return [v, v, v];
    }

Written by arthur

August 13th, 2011 at 12:12 pm

Posted in lxengine

## Rendering Voxels

The image below is not in fact of a checker procedural material, but rather is a first image from some work towards adding voxel rendering to LxEngine.  As version 0, it’s more or less simply rendering an array of cubes.

A single cell of 16x16x4 voxels

Here’s a rough change list:

• Added support for solid colored materials (i.e. no lighting, just color)
• Added support for specifying the light set and camera to use for a particular pass (previously this had to be specified per item being rendered)
• Improved the OpenGL error checking in the rasterizer a bit

### Update: 2011.06.25

Procedurally generated voxel "world"

The sample has been updated to support multiple voxel cells with a procedurally generated (noise-based) height function.  The rendering has been rewritten to create a single mesh for each cell which – as anticipated – is orders of magnitude faster than the original naive voxel-by-voxel rendering algorithm.

Building the mesh…

The algorithm for building the mesh is quite simple:

• For each voxel in the cell (i.e. each of the 16x16x4 blocks)…
• Check if the block above is empty or there is no block (i.e. the block is on the top boundary); if so add a quad for the top of the block to the vertex buffer
• Repeat the prior step for the other five sides of the block
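The steps above can be sketched as follows (function and parameter names are illustrative, not LxEngine’s actual API). The cell is an occupancy grid, and a quad is emitted only when the neighboring voxel on that side is empty or outside the cell:

```javascript
// The six axis-aligned neighbor offsets, one per face of a block.
var NEIGHBORS = [
    [ 1, 0, 0], [-1, 0, 0],
    [ 0, 1, 0], [ 0, -1, 0],
    [ 0, 0, 1], [ 0, 0, -1],
];

// Build the quad list for one cell of sx * sy * sz voxels.
// `cell` is a flat boolean array in x-major, then y, then z order.
function buildCellMesh(cell, sx, sy, sz) {
    var quads = [];
    function solid(x, y, z) {
        if (x < 0 || y < 0 || z < 0 || x >= sx || y >= sy || z >= sz)
            return false;                 // outside the cell counts as empty
        return cell[(z * sy + y) * sx + x];
    }
    for (var z = 0; z < sz; ++z)
        for (var y = 0; y < sy; ++y)
            for (var x = 0; x < sx; ++x) {
                if (!solid(x, y, z))
                    continue;
                for (var i = 0; i < 6; ++i) {
                    var n = NEIGHBORS[i];
                    // Emit a quad only for faces exposed to empty space.
                    if (!solid(x + n[0], y + n[1], z + n[2]))
                        quads.push([x, y, z, i]);
                }
            }
    return quads;
}
```

For a completely solid 16x16x4 cell this emits 768 quads (3,072 vertices at four per quad) rather than the 6,144 faces of the naive approach.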

This has two major advantages:

The entire cell is treated as a single vertex buffer.  This means one draw call for all 16x16x4 blocks, rather than 1024 draw calls.  Graphics hardware operates far more efficiently on large batches; thus, the single draw call for the whole cell takes roughly as much time as each of the 1024 individual calls did when drawing the blocks one by one.  (Combining the buffer, of course, means the same shader and parameters must be used for the entire buffer.  The solution for textured blocks – not yet implemented – will be to use a texture atlas containing all the textures so that texture parameters do not have to be reset.)

The second advantage is that all the interior blocks of a cell are discarded implicitly.  In the naive algorithm, every block was drawn, regardless of whether it was completely obscured.  Due to the grid nature of the cell, checking neighbors is sufficient to determine whether a particular face can ever be visible.  Since “most” blocks in the anticipated data sets are fully interior, this potentially trims out a sizeable chunk of obscured vertices and faces.  In a “solid” cell of 16x16x4 blocks, there are potentially 24,576 vertices and 6,144 faces (assuming each face requires a unique set of vertex parameters for each of its vertices); with the interior faces stripped out, this becomes 768 faces with 3,072 vertices.  That’s 12.5% of the original number of vertices and faces.  If vertices can be shared between faces, the number of vertices drops to 768 as well, or 3.1% of the original count.

Written by arthur

June 24th, 2011 at 12:01 pm

Posted in lxengine

## Progress Update

Nothing technically astounding to report: only some regular progress.

I’ve been moving away from OGRE and towards coding my own OpenGL-based renderer. Why? Because I realized that, as much as I want LxEngine to utilize the work of others and avoid reinventing the wheel, this is also currently a hobby project and, bottom-line, I enjoy writing custom graphics code.

That out of the way, I’ve been very interested in terrain lately. Between the Lithosphere demo (see earlier blog entry) and unfortunately downloading and instantly getting a minor addiction to Minecraft, I’m fascinated by what a procedural terrain engine theoretically could do.

Step 1: Height maps. These seem to have a nice balance between simplicity (e.g. path-finding and collision detection) and capacity to display something interesting. Here’s a heightmap generated from a sin + cos wave with a checker pattern and fake diffuse lighting:

Video was probably overkill for such a simple example, but I thought I’d use this as an excuse to try out CamStudio (a free, open source screen recording app).  Pretty simple to use.  I recommend it.
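The sin + cos height function and checker pattern described above can be sketched as follows (names are hypothetical, not LxEngine’s API; the fake diffuse lighting is omitted):

```javascript
// Illustrative height function: smooth rolling hills in [-2, 2].
function height(s, t) {
    return Math.sin(s) + Math.cos(t);
}

// Alternating 1x1 checker tiles: returns 1 or 0 per tile.
function checker(s, t) {
    return (Math.floor(s) + Math.floor(t)) % 2 === 0 ? 1 : 0;
}
```

Evaluating height(s, t) over a regular grid and coloring each vertex by checker(s, t) reproduces the kind of heightmap shown in the video.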

### Update

A bit more work and some head-scratching to really understand Perlin noise, and the result is this heightmap:

Terrain generated from a Perlin Noise function

Thanks to these pages for the information on Perlin noise:

### Update 2

Added automatic generation of additional tiles and a cheap GLSL fog effect:

Perlin Noise-based Terrain with Fog

Written by arthur

December 22nd, 2010 at 1:17 pm

Posted in lxengine

## Effects Sub-Project Complete


One more shader fragment for the tile procedural and an updated data file later:

GLSL rasterizer matching the ray-traced result fairly accurately.

### Definition of “Complete”

According to the objectives of this sub-project, it’s fair to say the rasterized and ray-traced results match sufficiently to call this sub-project done.  The remaining mismatches are primarily: (1) lack of shadows, (2) a light position mismatch, (3) color tweaking.  Regarding the lack of rasterized soft area-light shadows: that is a sizable sub-project in itself, and I’d like to pursue some other areas of the code before embarking on it.  Regarding the light position mismatch: that’s a defect in the ray-tracer code, and there’s no real learning value in attempting to compensate for it.  Regarding color tweaking: it’s close enough that I’m satisfied with the results.

There are plenty of areas for further improvement (there always are!) such as bitmap texture support, light shaders, nested uv spaces, shared variables, etc.  However, in the interest of exploring different areas of graphics code and based on the aims of this particular effort, I’m labeling this sub-project complete.

### A complicated, nested procedural example

I’m fairly happy with the system’s capabilities.  For example, the foremost sphere in the image below demonstrates a fairly complicated setup.  The diffuse channel contains a nested procedural: an outer checker pattern defines an inner checker pattern for one set of tiles and a spot pattern for the others.  Furthermore, the UV scaling varies at each level (the inner checker is 2×2, the inner spot is 3×4).  Lastly, the specular channel is completely independent, as it defines a checker pattern of its own with a completely different UV mapping.

Complex, nested procedurals in GLSL

Here’s what the corresponding data file for the nested procedural sphere looks like (in the JSON scene format):

    {
        description : "Nested checker",
        type        : "plyfile",
        content     : "sphere.ply",
        style : {
            scale     : 0.00392156863,
            translate : [ 0.0, -1.25, 0.5 ],
            exponent  : 128,
            material : {
                phong : {
                    diffuse : {
                        checker : {
                            primary : {
                                checker : {
                                    uv        : { spherical : { scale : [ 24, 9 ] } },
                                    primary   : [ .4, 0, 1 ],
                                    secondary : [ .2, .6, .3 ],
                                },
                            },
                            secondary : {
                                spot : {
                                    uv        : { spherical : { scale : [ 36, 24 ] } },
                                    primary   : [ .5, 1, 1 ],
                                    secondary : [ 0, 0, .1 ],
                                },
                            },
                        },
                    },
                    specular : {
                        checker : {
                            uv        : { spherical : { scale : [ 32, 16 ] } },
                            primary   : [ 1, 1, 1 ],
                            secondary : [ 0, 0, 1 ],
                        },
                    },
                },
            },
        },
    }


### Effect System Support

A quick look at some of what the system currently supports:

• Pluggable shader fragments
    • Current Mappers: cube, spherical, spherical-xy
    • Current Procedurals: checker, spot, tile
    • Current Base Shaders: phong
• Fully dynamic material shader generation
    • Shaders can be nested arbitrarily (up to the limit of what the video card can support for a single shader)
    • Objects can share the same procedural and mapping for diffuse and specular, or have unique definitions.  The correct shader will be generated automatically.
    • If two materials have the same structure but different parameters, the system automatically reuses the existing shader and simply changes the uniform variables.
• Inherited stylization in the scene graph
    • Child nodes automatically inherit any parent attribute, unless the child overrides it.  Example: the three stacked cubes share a parent that defines the stylization; the children define only the transform and geometry.
    • Shader fragment definitions can automatically pick up these style parameters.  Example: the checker procedural automatically grabs the current primary and secondary colors defined by the parent if it is not given explicit values to use.
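The shader-reuse behavior (same structure, different parameters, one compiled shader) can be sketched as a cache keyed on the material description with all leaf parameter values stripped out. All names here are illustrative, not the actual LxEngine implementation:

```javascript
// Reduce a material description to a structural key: leaf values
// (colors, scales, etc.) collapse to a placeholder, so two materials
// that differ only in parameters produce the same key.
function structureKey(node) {
    if (node === null || typeof node !== "object" || Array.isArray(node))
        return "@";                        // leaf value -> placeholder
    var keys = Object.keys(node).sort();
    return "{" + keys.map(function (k) {
        return k + ":" + structureKey(node[k]);
    }).join(",") + "}";
}

var shaderCache = {};
var compiles = 0;

function getShader(material) {
    var key = structureKey(material);
    if (!shaderCache[key]) {
        compiles++;                        // this is where GLSL generation
        shaderCache[key] = { key: key };   // and compilation would happen
    }
    return shaderCache[key];               // reused; only uniforms change
}
```

Materials with the same shape but different colors then hit the same cache entry, and only the uniform variables need to be rebound per object.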

Written by arthur

May 1st, 2010 at 10:57 am

Posted in lxengine