I got this exception: the cause seems to be a deadlock in SQL Server.

What is the correct way to handle deadlocks?

Is there an automatic transaction retry like in DO 3.8.x?

Message="Error 'Unknown'"
    at Xtensive.Storage.Providers.Sql.Driver.ReadRow(DbDataReader reader)
    at Xtensive.Storage.Providers.Sql.CommandProcessor.<RunTupleReader>d__0.MoveNext()
    at Xtensive.Storage.Providers.Sql.SessionHandler.<Execute>d__0.MoveNext()
    at Xtensive.Storage.Providers.Sql.SqlProvider.<OnEnumerate>d__0.MoveNext()
    at Xtensive.Storage.Rse.Providers.ExecutableProvider.<GetEnumerator>d__0.MoveNext()
    at Xtensive.Core.Collections.EnumerableExtensions.<Batch>d__27`1.MoveNext()
    at Xtensive.Core.Collections.EnumerableExtensions.<ApplyBeforeAndAfter>d__2f`1.MoveNext()
    at Xtensive.Storage.Rse.RecordSet.<GetEnumerator>d__4.MoveNext()
    at System.Linq.Enumerable.WhereSelectEnumerableIterator`2.MoveNext()
    at Xtensive.Core.Collections.EnumerableExtensions.<Batch>d__27`1.MoveNext()
    at Xtensive.Core.Collections.EnumerableExtensions.<ApplyBeforeAndAfter>d__2f`1.MoveNext()
    at Xtensive.Storage.Linq.Materialization.MaterializationHelper.<Materialize>d__8`1.MoveNext()
    at Xtensive.Core.Collections.EnumerableExtensions.<Batch>d__27`1.MoveNext()
    at Xtensive.Core.Collections.EnumerableExtensions.<ApplyBeforeAndAfter>d__2f`1.MoveNext()
    at Xtensive.Storage.Query.<ExecuteSequence>d__b`1.MoveNext()
    at Xtensive.Storage.EntitySet`1.LoadState()
    at Xtensive.Storage.TransactionalStateContainer`1.get_State()
    at Xtensive.Storage.EntitySetBase.Contains(Key key, Nullable`1 state)
    at Xtensive.Storage.EntitySetBase.Contains(Entity item)
    at Xtensive.Storage.EntitySetBase.Add(Entity item)
    at Xtensive.Storage.EntitySet`1.Add(TItem item)
    at Test.Pilot.Kernel.ProcessingWorkflowModel.Container.CreateDocument[T](String name)
InnerException: System.Data.SqlClient.SqlException
    Message="Transaction (Process ID 54) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction."
    Source=".Net SqlClient Data Provider"
         at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
         at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
         at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
         at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
         at System.Data.SqlClient.SqlDataReader.HasMoreRows()
         at System.Data.SqlClient.SqlDataReader.ReadInternal(Boolean setTimeout)
         at System.Data.SqlClient.SqlDataReader.Read()
         at Xtensive.Storage.Providers.Sql.Driver.ReadRow(DbDataReader reader)
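The inner SqlException is the reliable signal here: SQL Server reports a deadlock victim as error number 1205, which SqlException.Number exposes. A small helper to recognize it, using only ADO.NET (nothing DO4-specific; the class and method names are mine), might look like this:

```csharp
using System;
using System.Data.SqlClient;

static class DeadlockDetector
{
    // SQL Server's error number for "chosen as the deadlock victim" (Msg 1205).
    const int DeadlockErrorNumber = 1205;

    // Walks the InnerException chain looking for a deadlock SqlException,
    // since ORMs typically wrap it (as in the stack trace above).
    public static bool IsDeadlock(Exception error)
    {
        for (var e = error; e != null; e = e.InnerException) {
            var sqlError = e as SqlException;
            if (sqlError != null && sqlError.Number == DeadlockErrorNumber)
                return true;
        }
        return false;
    }
}
```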

Updated at 23.09.2009 15:08:52

We develop a server-side application (not ASP.Net).

In this particular case, we read a list of work to do from the database, do it, and write the result back to the database. A deadlock isn't really critical here: if one occurs, the work simply remains in the to-do list ;)

Will we be able to disable deadlock reprocessing, i.e. turn it off in some cases?

Updated at 25.09.2009 8:40:50

This is really good news concerning deadlock handling: it will allow us to greatly simplify some code interacting with the filesystem and, more generally, with non-SQL data stores.

Yes, we had our share of deadlocks with DO.Net: I was really glad to see the doDataObject table gone, and that is a good reason for us to migrate to DO4.

No, I don't think you need to worry much about my deadlocks.

I was testing the following configuration: 8 threads creating complex objects (similar ones, i.e. using the same tables) as fast as possible on an 8-core CPU. So I think there is a good chance of deadlocks, and that seems only natural in this situation.
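For reference, a contention test of that shape can be reproduced with plain threads (.NET 3.5-era, no TPL). This is an illustrative harness, not the poster's actual code; the `work` delegate stands in for creating one complex object in its own transaction:

```csharp
using System;
using System.Threading;

static class ContentionTest
{
    // Runs `work` concurrently on `threadCount` threads, `iterations` times each,
    // to maximize lock contention on the shared tables.
    public static void Run(int threadCount, int iterations, Action work)
    {
        var threads = new Thread[threadCount];
        for (int i = 0; i < threadCount; i++) {
            threads[i] = new Thread(delegate() {
                for (int j = 0; j < iterations; j++)
                    work(); // e.g. create one complex object per transaction
            });
            threads[i].Start();
        }
        foreach (var t in threads)
            t.Join(); // wait for all workers to finish
    }
}
```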

This thread was imported from our support forum. The original discussion may contain a more detailed answer.

asked Sep 23 '09 at 13:42

olorin


One Answer:

Technically we could reprocess deadlocks for transactional methods, but for now we don't (i.e. TransactionalAspect does not override the OnError method of its base class).

This isn't implemented mainly because we haven't finished our work on unified exceptions: http://code.google.com/p/dataobjectsdot ... tail?id=29 . On the other hand, this is what we're working on now. Once it is done, any locking-related exceptions will be easily recognizable, and we'll add the code needed to handle deadlocks.
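Until automatic reprocessing lands, a manual retry wrapper is the usual workaround. A minimal sketch in plain C# (the names and the delay/attempt parameters are illustrative, not a DataObjects.Net API; error 1205 is SQL Server's deadlock-victim code):

```csharp
using System;
using System.Data.SqlClient;
using System.Threading;

static class Retry
{
    // Re-runs `action` when SQL Server picks it as a deadlock victim (error 1205).
    public static void OnDeadlock(Action action, int attemptCount, TimeSpan delay)
    {
        for (int attempt = 1; ; attempt++) {
            try {
                action(); // open session + transaction, do the work, commit
                return;
            }
            catch (SqlException e) {
                if (e.Number != 1205 || attempt >= attemptCount)
                    throw; // not a deadlock, or out of attempts
                Thread.Sleep(delay); // back off before rerunning the transaction
            }
        }
    }
}
```

Each retry must start a fresh transaction inside `action`; rerunning work inside the already-aborted transaction will not succeed.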

What kind of application are you developing, WPF or ASP.NET? If ASP.NET, do you use our SessionManager? I'll base the answer on this.

Yes, it will be fully controllable via SessionConfiguration (roughly, reprocessing delay & attempt count).

So in your case you simply don't need reprocessing.

Btw, can you tell us what is causing the deadlocks in your case? Is it an intentionally simulated high-load scenario, or more like a random deadlock under low concurrency? I got a few e-mails from users worried there can be deadlocks. That's because DO v3.X running on SQL Server 2000 (or 2005 without snapshot isolation) really had a higher chance of lock escalation & deadlocks, because the doDataObject table was almost always joined there. But now there is nothing like this.

So must we investigate this case?

answered Sep 23 '09 at 14:36


Alex Yakunin

Yes, in this case they're almost inevitable.

Anyway, if there are any performance issues, don't hesitate to bother us ;)

(Sep 23 '09 at 14:36) Alex Yakunin
Seen: 4,116 times