I have a customer with an msdb database that had grown to 40 GB. After looking into it, I found that the sysmail_mailitems table (Database Mail history) accounted for most of that space.
There are some “fixes” out there that delete rows based on sent_date. But when the table is this large, that will not work: there is no index on sent_date, so every delete scans the clustered index looking for qualifying sent_dates. Any attempt to create an index on sent_date takes a long time and causes blocking on INSERTs, and let’s face it, if the table got that large, there are definitely still inserts coming in. In a 24/7 environment, we’re in a dilemma. Of course, a TRUNCATE TABLE would be efficient, but it wipes out all of the mail history, including the recent rows you probably want to keep.
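For reference, the conventional approach those “fixes” are built around is Microsoft’s own cleanup procedure, which filters on sent_date; a minimal sketch (the cutoff date here is just an example):

```sql
-- Conventional Database Mail cleanup: the built-in procedure deletes by sent_date.
-- On a 40 GB sysmail_mailitems with no index on sent_date, this scans the
-- clustered index and can run (and block) for a very long time.
EXEC msdb.dbo.sysmail_delete_mailitems_sp
    @sent_before = '2012-06-01';  -- example cutoff date
```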
The other problem is that, as a highly paid DBA, I don’t have time to babysit a process that deletes small batches of records all day long!
However, I came up with a good solution. The sent_date values increase along with mailitem_id, which is the indexed clustered key. Schedule the SQL below in a job that runs about every 2 minutes; each run deletes the 20,000 oldest records. The @DateToStop variable ensures it will not delete email history more recent than that date.
/* Created by GlutenFreeSQL, Daniel Gras, 1/17/2013 */
DECLARE @DateToStop datetime;
SET @DateToStop = '2012-06-01';

-- Oldest sent_date still in the table. mailitem_id is the clustered key and
-- sent_date increases along with it, so order by the key to get the oldest row.
DECLARE @CurrentMinDate datetime;
SET @CurrentMinDate = (SELECT TOP (1) sent_date
                       FROM msdb.dbo.sysmail_mailitems
                       ORDER BY mailitem_id);

IF @CurrentMinDate < @DateToStop
BEGIN
    -- Delete the 20,000 oldest rows, seeking on the clustered key
    -- instead of scanning for sent_date.
    DELETE msdb.dbo.sysmail_mailitems
    WHERE mailitem_id < (SELECT MIN(mailitem_id) + 20000
                         FROM msdb.dbo.sysmail_mailitems);

    -- Progress check: oldest remaining sent_date, and the rough row range left.
    SELECT TOP (1) sent_date
    FROM msdb.dbo.sysmail_mailitems
    ORDER BY mailitem_id;

    SELECT MAX(mailitem_id) - MIN(mailitem_id)
    FROM msdb.dbo.sysmail_mailitems;
END
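To run this every 2 minutes without babysitting it, the script can be wrapped in a SQL Agent job. A sketch, assuming the job, step, and schedule names shown here (they are illustrative, not required):

```sql
-- Schedule the batched cleanup as a SQL Agent job that fires every 2 minutes.
-- The job/step/schedule names are assumptions; the step command should contain
-- the batched DELETE script from this post.
USE msdb;

EXEC dbo.sp_add_job
    @job_name = N'Purge sysmail_mailitems';          -- hypothetical job name

EXEC dbo.sp_add_jobstep
    @job_name      = N'Purge sysmail_mailitems',
    @step_name     = N'Delete oldest 20000 rows',
    @subsystem     = N'TSQL',
    @database_name = N'msdb',
    @command       = N'/* paste the batched DELETE script here */';

EXEC dbo.sp_add_jobschedule
    @job_name             = N'Purge sysmail_mailitems',
    @name                 = N'Every 2 minutes',
    @freq_type            = 4,    -- daily
    @freq_interval        = 1,
    @freq_subday_type     = 4,    -- unit: minutes
    @freq_subday_interval = 2;    -- every 2 of them

-- Target the local server so the job actually runs.
EXEC dbo.sp_add_jobserver
    @job_name = N'Purge sysmail_mailitems';
```

Once the oldest remaining sent_date passes @DateToStop, the IF guard makes every subsequent run a no-op, so the job can simply be disabled or dropped at that point.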