opengl, geforce cards and my knack for finding strange bugs

I'm starting to believe strongly in my unlucky fate when it comes to programming. It doesn't matter which language it is, I always find the most unexpected bugs. For example, one of my favourites was a "stream closed" exception that appeared while loading a .jar file which included a certain crypt class, using a certain version of the JDK. An unhandled exception was thrown during loading, and then the unpackager couldn't finish reading the contents of the file because the stream was closed. Quite interesting! Anyway, this time the search for a solution is proving much harder. I'm facing some strange OpenGL behaviour when running my demo <a href="http://soledadpenades.com/projects/demoscene/blue-tuesday-by-xplsv/">blue tuesday/xplsv</a> on GeForce cards. When I run it on my computer (with an ATI Radeon 9000 Mobility) everything goes fine. But when running it on GeForce cards, I get depressed: a sweet and pretty spiral that falls slowly behind the credit titles simply gets hidden by a PLANE! I don't understand why, because there are other scenes in the demo which feature "something behind a plane", and there the plane does not overlap and hide everything as it does in this case.

Let me explain how this scene works:

1. Clear everything (both the colour and depth buffers).
2. Draw the planes with the credits.
3. Copy the result to a texture.
4. Clear the screen again (colour and depth buffers; I wasn't clearing the depth buffer at this step originally, but I thought, and some people suggested, that it could be the cause of the problem. In any case it had no effect, neither on my computer nor on GeForce cards).
5. Draw the spiral. I do PushMatrix and so on; it's not a problem of matrix transformations, because in that case I wouldn't be able to see anything on my computer either.
6. Redraw the previous texture, somewhat distorted, by mapping it onto a GL_QUADS grid and applying the appropriate texture coordinates to each quad. (The factors that determine the level of distortion deserve another blog entry; they don't matter much here.) To draw the quads I set up an ortho view, which defines a frustum with a certain Z coordinate range, and all of the quads are drawn so that their Z values fall inside that range. So they are inside the frustum, and they show up on both ATI and GeForce with no problems (months ago I did have problems with planes at certain Z values not showing up, on GeForce cards only).

The problem is that somehow, and I don't have a clue why, even with blending enabled the quads hide everything behind them (on GeForce, of course). So the spiral simply does not show. The alpha value I set for glClear is 0, so when the buffer is read for copying into the texture, areas where nothing was drawn should have alpha 0. Those zones should be transparent and show whatever is behind them. But it does not work. I have tried other things, like making the alpha test pass only above a 0.5 alpha level... that doesn't work on GeForce either. It feels like the drivers are ignoring me.
I don't know what to do about it. I'd love to solve it, so I could finally finish the demo and upload a final version somewhere. Any clue? Send it to supersole aaaaaaaaat gmail.com (by the way, does anybody need invitations? I still have plenty...)