memory issue in dbreeze

Jul 30, 2013 at 1:36 PM
hi,
I ran some tests on dbreeze over the past few days to check whether there is any memory leak or other memory problem.

What I did was insert about 20,000 - 30,000 records every 30 seconds. Sometimes after roughly 30 minutes my test program suddenly started making big memory jumps, and sometimes, after I crashed the test program and immediately restarted the same test, memory started to grow very fast. Because of this I think that in some cases there is a problem that makes dbreeze use memory without releasing it. I tried to figure it out, but I couldn't, because my profiler didn't behave well after I took a memory snapshot of my program.

I would appreciate it if you could test this yourself; it should be very simple, just making big inserts into the db every 30 seconds. The more entries you make, I think, the sooner you will see the leak.
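
A minimal sketch of how memory could be sampled alongside the inserts (this helper is just an illustration, not the code from my actual test):

// Illustrative helper: sample the managed heap and the process working set,
// e.g. once per insert batch, to see whether memory keeps growing.
using System;
using System.Diagnostics;

static class MemoryProbe
{
    public static void Report()
    {
        long managed = GC.GetTotalMemory(true);                      // managed heap after a full collect
        long workingSet = Process.GetCurrentProcess().WorkingSet64;  // whole-process working set

        Console.WriteLine("{0} managed: {1:N0} B, working set: {2:N0} B",
            DateTime.Now.ToString("dd.MM.yyyy HH:mm:ss"), managed, workingSet);
    }
}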

If there are any questions I will be happy to reply.

Hope to solve it as soon as possible.

don.
Jul 30, 2013 at 1:46 PM
In our practice, we usually attach testing code to such reports.
So, please, post your testing code here in the forum and describe the conditions under which it runs.
Jul 30, 2013 at 6:07 PM
Edited Jul 31, 2013 at 8:21 AM
So, here is my small experiment:
System.Timers.Timer tmrxxx = new System.Timers.Timer();
int i = 0;

private void testF_006()
{
    tmrxxx.Elapsed += new System.Timers.ElapsedEventHandler(tmrxxx_Elapsed);
    tmrxxx.Interval = 1000 * 25;

    tmrxxx.Start();
}

void tmrxxx_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    tmrxxx.Enabled = false;
    int j = 0;

    using (var tran = engine.GetTransaction())
    {
        while (j < 20000)
        {
            tran.Insert<int, int>("t1", i, 1);
            tran.Insert<int, int>("t2", i, 1);

            i++;
            j++;
        }

        tran.Commit();
    }

    Console.WriteLine("{0} Inserted: {1}", DateTime.Now.ToString("dd.MM.yyyy HH:mm:ss"), i.ToString());

    tmrxxx.Enabled = true;
}
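
The snippet assumes an engine field that is created elsewhere; a minimal sketch of that setup (the storage folder is just an example) would be:

using DBreeze;

// Assumed initialization of the 'engine' field used in the test above.
DBreezeEngine engine = new DBreezeEngine(@"D:\temp\DBreezeTest");
// ... run the timer test ...
// engine.Dispose(); // call once at shutdown to release the storage files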
Memory: the process starts at 34 MB and after 40 minutes takes 20 MB of RAM.

Output:
30.07.2013 18:26:31 Inserted: 20000
30.07.2013 18:26:57 Inserted: 40000
30.07.2013 18:27:23 Inserted: 60000
30.07.2013 18:27:48 Inserted: 80000
...
30.07.2013 19:03:46 Inserted: 1760000
30.07.2013 19:04:12 Inserted: 1780000
30.07.2013 19:04:38 Inserted: 1800000
30.07.2013 19:05:04 Inserted: 1820000
30.07.2013 19:05:29 Inserted: 1840000
30.07.2013 19:05:55 Inserted: 1860000
30.07.2013 19:06:21 Inserted: 1880000
30.07.2013 19:06:46 Inserted: 1900000
... still takes 20MB of RAM
30.07.2013 19:44:02 Inserted: 3640000
30.07.2013 19:44:28 Inserted: 3660000
30.07.2013 19:44:54 Inserted: 3680000
30.07.2013 19:45:20 Inserted: 3700000

I don't see a problem for now.
Aug 5, 2013 at 11:28 PM
Edited Aug 5, 2013 at 11:29 PM
I've been running 70 machines now (millions of records), which collect data constantly (and this is just a test).
These are connected to collector machines that gather data from them, which in turn are connected
to a super-collector machine that collects data from the collectors.

Every single one of those hosts in this 3-tier topology uses dbreeze as storage and a protobuf-net serializer.
Most of these machines, including those that hold the most data (collectors and super-collectors), have been
running for approximately 2 months of testing now. I haven't noticed any significant memory usage by dbreeze.

Most of the time the working set is about 100 MB, which also includes memory buffers for my custom network routing protocol and other services (so dbreeze is just a small part of these 100 MB of memory).

How do you serialize data? Maybe your tests are the cause of the hog? Also, if I understood correctly, it takes you 30 min
for 30k records? On an average machine it usually takes me a couple of seconds, and I use guids as keys (though sorted).
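
For reference, a rough sketch of the kind of insert I mean; the Reading type and table name are just illustrative, serialization is protobuf-net, the key is a guid stored as bytes, and engine is an already-initialized DBreezeEngine:

using System;
using System.IO;
using DBreeze;
using ProtoBuf;

[ProtoContract]
class Reading
{
    [ProtoMember(1)] public string Sensor { get; set; }
    [ProtoMember(2)] public double Value { get; set; }
}

// ...
using (var tran = engine.GetTransaction())
{
    var reading = new Reading { Sensor = "cpu", Value = 0.42 };

    byte[] payload;
    using (var ms = new MemoryStream())
    {
        Serializer.Serialize(ms, reading);   // protobuf-net serialization
        payload = ms.ToArray();
    }

    // Guid-based key stored as bytes; a sequential/sorted guid scheme would
    // keep inserts closer to append-only.
    tran.Insert<byte[], byte[]>("readings", Guid.NewGuid().ToByteArray(), payload);
    tran.Commit();
}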
Aug 6, 2013 at 10:22 AM
krome wrote:
I've been running 70 machines now (millions of records), ...
Such structures...that's what I love :)
Aug 8, 2013 at 5:13 PM
Yup, it will be 1000 machines soon :)
For the moment no problems.
Aug 8, 2013 at 8:14 PM
SkyNET is coming...