[cairo] Possible Memory Leak Using Glitz Surfaces

Charles Tuckey ctuckey at verano.com
Fri Jul 23 12:01:05 PDT 2004


More news. We have set up four test systems that all show the memory leak:
- RedHat 9, NVidia driver 5336
- RedHat 9, NVidia driver 6106
- RedHat 9, ATI driver, Mesa 4.0.4 GL libraries
- RedHat Enterprise 2.1, XiG Summit MX Platinum driver for a Matrox Quad 
G450 card
All four systems exhibit the leak when running the test program. The NVidia 
systems leak on the order of tens of megabytes per second, the XiG system 
about 5 megabytes per second, and the ATI system about 1 KB per second.

Through an extremely serendipitous happenstance we found that if we 
added this line of code:
       glitz_texture_fini(gl, src_texture);
to the _glitz_composite_direct function in glitz.c (diff is below), then 
the memory leak disappears on all four of our test systems.

***************
*** 281,286 ****
--- 281,288 ----

     glitz_surface_pop_current (dst);

+   glitz_texture_fini(gl, src_texture);
+
     return 1;
   }

We noticed that new textures were being created but that 
glitz_texture_fini was never called on them, hence the fix (or hack) 
above. Is this the proper way of dealing with this problem?
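
For what it's worth, our (possibly naive) understanding is that each glitz 
texture wraps an OpenGL texture object, so skipping the fini step means the 
underlying GL texture is never deleted and the driver's memory keeps growing. 
A purely illustrative C sketch of that pattern (plain OpenGL, not the actual 
glitz internals; assumes a current GL context):

    #include <GL/gl.h>

    static void draw_once_leaky(void)
    {
        GLuint tex;
        glGenTextures(1, &tex);             /* new texture every call */
        glBindTexture(GL_TEXTURE_2D, tex);
        /* ... upload glyph data and draw ... */
        /* no glDeleteTextures: the driver keeps this texture forever */
    }

    static void draw_once_fixed(void)
    {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        /* ... upload glyph data and draw ... */
        glDeleteTextures(1, &tex);          /* the missing cleanup step */
    }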

I admit my knowledge of OpenGL is almost nonexistent. But some questions 
I have are:
- why do textures get created every time cairo_show_text is called?
- why do we need textures at all? Won't the default texture do?

Thanks for your help on this one. We still have a GL-related memory leak 
in our application (which is under development), but demonstrating it will 
require a new test program. So, I'll be back. :)

charlie

P.S. If anyone would like the output of glxinfo and/or xdpyinfo from any 
of the above four systems, just let me know.

ct

David Reveman wrote:
> On Tue, 2004-07-20 at 17:27 -0600, Charles Tuckey wrote:
> 
>>Hi,
>>
>>I have found an extremely large memory leak that occurs when using cairo 
>>with glitz surfaces. I have attached a very simple program that 
>>demonstrates the leak. I've run the program (and demonstrated the leak) 
>>against two different video cards: one is an NVidia card using the 
>>latest NVidia driver and the other is a Matrox card using an Xi Graphics 
>>driver. The NVidia card is on a RedHat 9 system and the Matrox card is 
>>on a Redhat Enterprise 2.1 system.
>>
>>The leak does not occur using pixman surfaces. Also, if the glitz 
>>surfaces are destroyed after each use, instead of using the same glitz 
>>surface all the time, the leak slows down by several orders of magnitude 
>>but is still quite noticeable.
>>
>>I used valgrind to try and track down the source of the leak on the 
>>NVidia system. It does not appear to be in either cairo or glitz but in 
>>the GL driver supplied by NVidia for the video card. This makes me 
>>wonder if it is in fact a leak, or if I am using glitz incorrectly.
>>
>>Any help or advice on this issue would be appreciated.
> 
> 
> ok, I looked at your test program and there are some minor problems.
> 
> You shouldn't use XCreateSimpleWindow to create windows for GLX
> rendering. You must first find a valid visual and then use that visual
> to create the window. glitz_glx_find_standard_format will find a valid
> visual for you, just use glitz_glx_get_visual_info_from_format to get a
> pointer to an XVisualInfo structure which you can use to create the
> window. I've attached a patch for this.
> 
> It's not impossible that this is somehow related to your memory leak.
> Running your test program with software mesa shows no memory leak.
>  
> -David
> 
> 
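
Following up on David's point about creating the window from a proper 
visual: below is a rough sketch of how we understand that setup. 
glXChooseVisual stands in here for the XVisualInfo that 
glitz_glx_get_visual_info_from_format would hand back, so the 
glitz-specific part is only an assumption on our part, not taken from 
the attached patch.

    #include <X11/Xlib.h>
    #include <GL/glx.h>

    /* Create a GLX-capable window from an explicit visual rather than
     * with XCreateSimpleWindow (which uses the root window's default
     * visual). */
    static Window create_gl_window(Display *dpy, int width, int height)
    {
        int screen = DefaultScreen(dpy);
        Window root = RootWindow(dpy, screen);
        int attribs[] = { GLX_RGBA,
                          GLX_RED_SIZE, 8, GLX_GREEN_SIZE, 8,
                          GLX_BLUE_SIZE, 8,
                          GLX_DOUBLEBUFFER, None };
        XVisualInfo *vi = glXChooseVisual(dpy, screen, attribs);
        XSetWindowAttributes swa;

        if (vi == NULL)
            return None;                    /* no suitable visual found */

        /* The colormap, depth and Visual must all come from the chosen
         * visual, otherwise the server rejects the window. */
        swa.colormap = XCreateColormap(dpy, root, vi->visual, AllocNone);
        swa.event_mask = ExposureMask | StructureNotifyMask;

        return XCreateWindow(dpy, root, 0, 0, width, height, 0,
                             vi->depth, InputOutput, vi->visual,
                             CWColormap | CWEventMask, &swa);
    }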


