I spent some time tonight following up on my previous post about the memory leaks I was encountering with V8. Stepping through in the debugger, I was able to pin down a few problematic areas, primarily static variables holding on to memory. I went ahead and pushed the changes up to my fork of the V8 repo on GitHub for anyone who is interested. With those changes in place, I am no longer seeing any memory leaks with this very simple program:

#include <crtdbg.h>
#include <v8.h>

int main(int argc, char* argv[])
{
    // Tell the MSVC debug CRT to report any heap allocations still live at exit
    _CrtSetDbgFlag(_CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF);
    v8::V8::Initialize();
    v8::V8::Dispose();
    return 0;
}

Progress? Of course, if I remove either of those V8 calls then I see a bunch of memory leaks get spewed out again... So, that sucks. I am not sure how much more time I will spend on this, though, since tracking down leaks in V8 isn't exactly the focus of this project and it seems like a potentially deep rabbit hole. It does make it more painful to separate out any memory leaks I am causing, though.

Coding: V8 Memory Leak?

I haven't been paying enough attention to this site as of late, but I have been keeping busy. Recently I have returned to working on my hobby engine "Plaster". I have primarily been working on the interface between C++ and JavaScript and writing some wrapper code for working with V8. I know there are a few libraries out there already that look pretty good for this, but really I wanted the experience of interacting more directly with V8.

This has been going pretty well, and I am more or less happy with how it is shaping up (there are a few small things I would still like to change). That is, it was going pretty well until I turned on leak tracking (using Visual Leak Detector)... and then bam, tons of spew about all kinds of memory leaks. Bummer.

Diving into it further, I ended up stripping things down to try to come up with simple test cases that would cause memory leaks, and eventually found that simply entering main and making a call to v8::V8::Dispose before returning (i.e. not touching my code at all) was enough to see a memory leak. I did some searching around and found an entry for this on the V8 issue tracker, but there hasn't been any movement on it since May 2.

I'll be digging into it a little further later on, but I wanted to throw this out here in case anyone reading this has experience on the subject and can shed some light in the meantime.

Demo.

This is a fluid simulation written in C++ and made available on the web via Google's Native Client. I originally wrote an implementation of the paper "Particle-based viscoelastic fluid simulation" just to get my feet a little wet with Native Client. I revisited it with the intention of just optimizing that implementation, but ended up rewriting it and going with a different methodology after going down the fluid simulation rabbit hole a bit.

Rather than produce yet another explanation of Navier-Stokes and all the bits of math behind fluid simulation I will just link to a few different resources I found useful:

My simulation is to a large extent based on the work in the last few links; I started down this path after seeing Grant Kot's fantastic demo and fluid sim videos (which do a bit more than what I am doing here). The simulation works by tracking individual particles within the context of a fixed grid and "spreading" the particle values into the grid using a biquadratic interpolation scheme. From the grid I can then easily calculate my fluid forces and use these to update the individual particles in the simulation. It is much faster than what I was doing before and works pretty well - although the fluid does end up compressing more than it should, and the gridding seems to make things a bit more viscous. To be sure, I am not really going so much for accuracy here as for something believable that could be dropped into a game to have some fun with.

As for the actual simulation steps they proceed roughly as follows:

  • Clear all grid cells
  • For each particle
    • Determine grid cell and the 8 surrounding cells
    • Calculate weights for biquadratic interpolation at particle position
    • Add particle mass and velocity to surrounding grid cells based on interpolation weights
  • For each particle
    • Calculate interpolated mass and velocity gradients at particle position
    • Use calculated mass and velocity gradient to determine pressure and viscosity forces
    • Add these forces back into the grid cells
  • For each particle
    • Calculate interpolated acceleration at particle position
    • Add acceleration due to gravity
    • Update particle velocity
    • Add particle velocity back into the grid cells
  • For each particle
    • Calculate interpolated velocity at particle position
    • Update particle position and velocity with interpolated velocity
Beyond that I also "cheat" a little and introduce a bit of a force near collision surfaces to push particles away.

For more detail you can check out the source code I have made available here.

Demo

This is a demonstration of collision detection against a signed distance field, running in Google's Native Client. It is an evolution of the last demo I put together of a fluid simulation running in Native Client. Aside from working on making that demo faster I also wanted to give it some more interesting physics by putting the fluid in a more complicated environment than a box. This is a result of working toward that.

So what is actually going on here? To begin with, a signed distance field is a scalar field that gives, at each point, the distance to the nearest point on some surface (or set of surfaces). Points that lie outside of an object have a positive distance to the surface, while points that lie inside an object have a negative distance.

Using a signed distance field then makes it very easy to check if a particle is colliding with our environment - simply sample the distance field at the particle position, and if the sampled distance is negative we have a collision (ignoring the potential for tunneling for the time being). Once a collision is detected we can then backtrack along the particle trajectory until we find where the particle collided with the surface. To calculate the collision response we then need to determine the surface normal - this is again easy, as we simply calculate the gradient of the distance field and normalize the result.
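The test-and-normal part of this can be sketched as follows. This is a minimal sketch under assumed names, not the demo's source: the field is sampled with bilinear filtering, and the normal comes from a central-difference gradient of the sampled field:

```cpp
#include <cmath>

// Illustrative low-resolution field of signed distances, in cell units.
const int SDF_W = 32, SDF_H = 32;
float sdf[SDF_W][SDF_H];

// Bilinearly filtered sample of the field at (x, y) in sample-grid coordinates.
float SampleSDF(float x, float y) {
    int ix = (int)std::floor(x), iy = (int)std::floor(y);
    float fx = x - ix, fy = y - iy;
    float d00 = sdf[ix][iy],     d10 = sdf[ix + 1][iy];
    float d01 = sdf[ix][iy + 1], d11 = sdf[ix + 1][iy + 1];
    return (1 - fx) * ((1 - fy) * d00 + fy * d01)
         +      fx  * ((1 - fy) * d10 + fy * d11);
}

// Returns true if (x, y) is inside the environment geometry, and if so
// fills in the outward surface normal from the normalized field gradient.
bool CheckCollision(float x, float y, float& nx, float& ny) {
    if (SampleSDF(x, y) >= 0.0f)
        return false;                        // outside (or exactly on) the surface
    // The gradient of the distance field points toward increasing distance,
    // i.e. out of the object, so normalizing it gives the collision normal.
    const float h = 0.5f;
    nx = SampleSDF(x + h, y) - SampleSDF(x - h, y);
    ny = SampleSDF(x, y + h) - SampleSDF(x, y - h);
    float len = std::sqrt(nx * nx + ny * ny);
    if (len > 0.0f) { nx /= len; ny /= len; }
    return true;
}
```

One nice property, as noted above, is that none of this cares how coarse the field is - the bilinear filter smooths a 32x32 field into perfectly usable distances and normals.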

Aside from making the collision detection process rather simple, signed distance fields are also attractive because you can get pretty good results with a pretty low-resolution field. In this demo the distance field is only 32x32 samples, or 1/16 the resolution we are rendering at. I have included some visualization options so you can see the distance field both with point sampling and with bilinear filtering (CPU rendered, so it is a bit slow). Working on this, I can see why people find this a useful technique for text rendering (see this publication from Valve).

Next up will be re-integrating an optimized fluid simulation and then moving the rendering over to the GPU.

Demo

I have been wanting to play around with Google's Native Client for a while now and decided to just dive into it today. After going through the example apps a little bit, I wanted to do something a little more meaty. Inspired by this demo, I went ahead and read this paper and began to implement the simulation it proposes inside of Native Client. The result is still somewhat sub-optimal and I still have not integrated the spring model put forth in the paper, but you can play with the current version of it here (obviously it only works in Chrome). I know it is not terribly pretty to look at, but I'm hoping to spend some more time next weekend optimizing it and prettying it up some.