I have a question about running garbage collection ("Cleanup Submitted Tasks") from the User Console versus from a SQL client.

We would like to schedule garbage collection to clean up the taskpersistence database. We tried running "Cleanup Submitted Tasks" from the User Console, but it is too slow: it cleans roughly 650 tasks per hour.

We also tried running garbage collection from a SQL client, and through a batch script that uses 'sqlcmd' to invoke the stored procedure. That performs better, cleaning roughly 1500 tasks per hour.

However, while the stored procedure is executing its cleanup, we cannot run any other queries. Even a simple SELECT against the 'tasksession12_5' table is left in an executing state indefinitely in the SQL client. It appears that running the garbage collection stored procedure from a SQL client locks the table, whereas running "Cleanup Submitted Tasks" from the User Console does not.
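To confirm that the hung SELECT is lock blocking (rather than just a slow query), you can ask SQL Server which session is blocking which. This is a generic SQL Server diagnostic sketch, not specific to the taskpersistence schema:

```sql
-- List requests that are currently blocked and identify the blocker.
-- blocking_session_id = 0 means the request is not blocked.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;
```

If the blocked SELECT shows the GC stored procedure's session as its blocker, the table-lock theory is confirmed.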
set @cutoff_time1 = (select DATEADD(day, DATEDIFF(day, 14, getdate()), 0))  -- midnight, 14 days ago
@ime_id = 1,
@cutoff_time = @cutoff_time1,  -- anything older than 14 days is cleaned out
@audit_timeout_days = 1,
@chunk_rec_count = 500
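For context, the full invocation looks something like the sketch below. The procedure name here is hypothetical (substitute the actual garbage-collection stored procedure shipped with your taskpersistence database); the parameter values are the ones from the fragment above:

```sql
DECLARE @cutoff_time1 datetime;
-- Midnight 14 days ago: DATEDIFF counts days from day 14 (1900-01-15) to now,
-- and re-adding that to day 0 (1900-01-01) lands on today's midnight minus 14 days.
SET @cutoff_time1 = (SELECT DATEADD(day, DATEDIFF(day, 14, GETDATE()), 0));

EXEC dbo.gc_cleanup_submitted_tasks     -- hypothetical name; use your GC procedure
     @ime_id             = 1,
     @cutoff_time        = @cutoff_time1,  -- rows older than 14 days are cleaned out
     @audit_timeout_days = 1,
     @chunk_rec_count    = 500;
```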
Does anyone have any input on how to run the stored procedure without locking the table?
Chris' approach is correct. You'll just need to tailor this for your particular OS and SQL client.
I helped a colleague implement this successfully in production and actually did the UAT testing for it. My advice: target small chunks (think cutoff times that cover only a few hundred records), but wrap the stored procedure call in a sequential loop. Then schedule manageable runs during off-peak periods (overnight), and even consider maintenance windows where you shut down the IME. It's easy enough to write a VBScript or Perl script that calls the stored procedure with whatever parameters you want, schedule it via cron or Task Scheduler, and just make sure the DBA is in the loop.
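A rough T-SQL sketch of the looped, small-chunk approach described above (the procedure name is hypothetical, and the iteration cap, chunk size, and delay are assumptions to tune for your environment):

```sql
-- Run the cleanup in many small chunks, pausing between calls so that
-- other sessions can get at tasksession12_5 between iterations.
DECLARE @cutoff_time1 datetime = DATEADD(day, DATEDIFF(day, 14, GETDATE()), 0);
DECLARE @i int = 0;

WHILE @i < 50                               -- cap the chunks per scheduled run
BEGIN
    EXEC dbo.gc_cleanup_submitted_tasks     -- hypothetical name; use your GC procedure
         @ime_id             = 1,
         @cutoff_time        = @cutoff_time1,
         @audit_timeout_days = 1,
         @chunk_rec_count    = 200;         -- only a few hundred records per chunk

    WAITFOR DELAY '00:00:05';               -- give blocked readers a window
    SET @i = @i + 1;
END
```

Saved as a .sql file, this can be driven by sqlcmd from whatever scheduler you use (cron, Task Scheduler), which keeps each lock-holding transaction short instead of one long run that blocks everything.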