Developer forum

Forum » Development » SQL server fills up

SQL server fills up

Matthijs Hofstede
Reply

We have a major issue with our Cook&Co site (http://www.cookandco.nl). Our SQL server (MS SQL 2008 R2) slowly fills up.

Memory remains relatively low, then suddenly increases to the maximum of 8 GB and the machine stops.

 

We have had to reboot the site 2–3 times a day over the last few days. Are any other sites having these problems? We are using the new E-comm power pack.

 

Does anyone have any other ideas about what the problem could be?


Replies

 
Imar Spaanjaars Dynamicweb Employee
Reply
Hi Matthijs,

Is your custom import procedure running on the same site / server? I've seen the behavior you're describing on my own sites that run an intensive import.

Otherwise, you should report this as a case so someone from the DW team can look at it.

Kind regards,

Imar
 
Jurgen van Kreij
Reply
Thanks for your reaction.
We only run short product updates: changes to prices and stock, using the DW API.
There seems to be no relation between these processes and the crash.
Yesterday we installed the latest cumulative update (CU8) for SQL Server 2008 R2. It didn't help.

Any ideas where to look?

Jurgen
 
Imar Spaanjaars Dynamicweb Employee
Reply
Hi Jurgen,

As I hinted in my previous post, you may want to open a support case. There might be an issue with Dynamicweb: I can still see far too many queries being executed for the manufacturer filter, which indicates either that that problem hasn't been fixed, or that you haven't upgraded the solution yet in case it was fixed (I'm not sure which).

Also, make sure you rule out your custom code. Check the connections to SQL Server and verify that they are closed correctly, and check that you aren't caching data excessively.
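One way to check the connection side from the database itself (a sketch, assuming you can query the server's DMVs, which are available on 2008 R2) is to count open sessions per host and program; if the count for your web server keeps growing between runs, connections are probably being leaked:

```sql
-- Count open user sessions per host/program.
-- Run this a few times and watch whether the numbers keep growing.
SELECT
    host_name,
    program_name,
    COUNT(*) AS session_count
FROM sys.dm_exec_sessions
WHERE is_user_process = 1
GROUP BY host_name, program_name
ORDER BY session_count DESC;
```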

Finally, you could use SQL Server's standard tools, or a more advanced tool such as Red Gate's memory or performance profilers, to see what's causing the issue.
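As a starting point with the built-in tools, the memory clerk DMV shows which internal components of SQL Server hold the most memory. A rough example (the column names below are the SQL Server 2008 R2 ones; later versions replaced `single_pages_kb`/`multi_pages_kb` with `pages_kb`):

```sql
-- Top 10 memory consumers inside SQL Server, by clerk type
-- (2008 R2 column names).
SELECT TOP 10
    type,
    SUM(single_pages_kb + multi_pages_kb
        + virtual_memory_committed_kb) AS total_kb
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY total_kb DESC;
```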

Imar
 
Morten Snedker
Reply
Hi Jurgen,

You may also want to take a look at the recovery model in combination with your backup routine. During a large import the transaction log grows (often to a size much bigger than the actual database), and if the backup routine is not set up properly, the transaction log is never truncated. Together with the database file, that log file counts as space used.
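A quick way to inspect this (a sketch, assuming sysadmin-level access) is to check the recovery model of each database and how full each transaction log is:

```sql
-- Recovery model per database (SIMPLE / FULL / BULK_LOGGED)
SELECT name, recovery_model_desc
FROM sys.databases;

-- Log file size and percentage in use, for every database
DBCC SQLPERF(LOGSPACE);
```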

If you want to dig down to the table sizes, and you have access to the database, you can execute the following script, which reports the size of each table:

declare @id	int	-- id of the current table
declare	@pages	int	-- data pages counted for the current table

create table #spt_space
(
	objid		int null,
	rows		int null,
	reserved	dec(15) null,
	data		dec(15) null,
	indexp		dec(15) null,
	unused		dec(15) null
)

set nocount on

-- Create a cursor to loop through the user tables
-- (sysobjects/sysindexes are legacy compatibility views,
-- but they still work on SQL Server 2008 R2)
declare c_tables cursor for
select	id
from	sysobjects
where	xtype = 'U'

open c_tables

fetch next from c_tables
into @id

while @@fetch_status = 0
begin

	/* Code from sp_spaceused */
	insert into #spt_space (objid, reserved)
		select objid = @id, sum(reserved)
			from sysindexes
				where indid in (0, 1, 255)
					and id = @id

	select @pages = sum(dpages)
			from sysindexes
				where indid < 2
					and id = @id
	select @pages = @pages + isnull(sum(used), 0)
		from sysindexes
			where indid = 255
				and id = @id
	update #spt_space
		set data = @pages
	where objid = @id


	/* index: sum(used) where indid in (0, 1, 255) - data */
	update #spt_space
		set indexp = (select sum(used)
				from sysindexes
				where indid in (0, 1, 255)
				and id = @id)
			    - data
		where objid = @id

	/* unused: sum(reserved) - sum(used) where indid in (0, 1, 255) */
	update #spt_space
		set unused = reserved
				- (select sum(used)
					from sysindexes
						where indid in (0, 1, 255)
						and id = @id)
		where objid = @id

	update #spt_space
		set rows = i.rows
			from sysindexes i
				where i.indid < 2
				and i.id = @id
				and objid = @id

	fetch next from c_tables
	into @id
end


select top 25
	Table_Name = (select left(name,25) from sysobjects where id = objid),
	rows = convert(char(11), rows),
	reserved_KB = ltrim(str(reserved * d.low / 1024.,15,0) + ' ' + 'KB'),
	data_KB = ltrim(str(data * d.low / 1024.,15,0) + ' ' + 'KB'),
	index_size_KB = ltrim(str(indexp * d.low / 1024.,15,0) + ' ' + 'KB'),
	unused_KB = ltrim(str(unused * d.low / 1024.,15,0) + ' ' + 'KB')
		
from 	#spt_space, master.dbo.spt_values d
where 	d.number = 1
and 	d.type = 'E'
order by reserved desc

drop table #spt_space
close c_tables
deallocate c_tables

Hope it helps.

Regards /Snedker

 
Jurgen van Kreij
Reply
Hello Morten,

Thanks for your reaction.

I think you are referring to the size on disk. That is not a problem (yet?); I mean the extreme RAM usage.
Or did I misunderstand you?

We are no longer sure that the SQL server is the problem. It seems the IIS worker process consumes all the RAM.
For now we prevent SQL Server problems by limiting the assigned memory to 6 GB for the SQL server and 4 GB for the application pool.
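For reference, the SQL Server side of this cap can be set with sp_configure (a sketch; the value is in MB, so 6 GB = 6144):

```sql
-- Cap SQL Server's memory at 6 GB (value is in MB)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 6144;
RECONFIGURE;
```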

Kind regards,
Jurgen


 

You must be logged in to post in the forum