How to insert duplicated records in Entity Framework

bulkinsert c# entity-framework-6 linq linq-to-entities

Question

I am facing a very strange situation regarding the insertion of duplicated records. I retrieve records against certain criteria and, after modifying a couple of properties, re-insert the entire collection of objects without even changing their primary key values.

But I am not getting the expected results: the collections nested inside the parent collection are never fully added to the database.

I have no idea what is going wrong. Do I need to detach all of these entities completely? I am already retrieving the entities with AsNoTracking() and even detaching each parent entity while modifying its attributes.

My parent entity is Consignment, which contains its child entities as a list of ConsignmentLine.

My entity hierarchy is:

public class Consignment 
{
    public int ConsignmentId { get; set; }
    public int ClientSubsidiaryId { get; set; }
    public int ForwarderId { get; set; }
    public int ClientId { get; set; } 

    public ICollection<ConsignmentLine> ConsignmentLines { get; set; }    

    public Consignment()
    {          
      ConsignmentLines = new List<ConsignmentLine>();            
    }
}

public class ConsignmentLine
{
    public int ConsignmentLineId { get; set; }
    public int PackagingId { get; set; }
    public double Amount { get; set; }
    public double Weight { get; set; }                

    public int ConsignmentId { get; set; }
    public virtual Consignment Consignment { get; set; }        
}

Steps involved in my code:

Retrieving data:

var consignments = _dbContext.Consignments
    .AsNoTracking()
    .Where(Pr => Pr.SourceParty == 0 && Pr.ParentId == null && Pr.ConnectState >= 4)
    .ToList();
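For what it's worth, entities materialized through AsNoTracking() are never attached to the context in the first place, so an explicit detach step should not be needed. A quick, purely illustrative check:

// Entities loaded with AsNoTracking() are not tracked by the context,
// so their state is already Detached before any explicit detach call:
var state = _dbContext.Entry(consignments.First()).State;
// state == System.Data.Entity.EntityState.Detached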

Modifying a couple of properties:

consignments.ForEach(consignment =>
{
    consignment.ClientId = clientId;
    _dbContext.Entry(consignment).State = System.Data.Entity.EntityState.Detached;
    consignment.ForwarderId = forwarderId;
    consignment.ClientSubsidiaryId = clientSubsidiaryId;
});
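One hedged aside: if ConsignmentId and ConsignmentLineId are database-generated identity columns (an assumption, not stated in the question), a common way to make the "insert as new" intent explicit is to zero out the keys before re-adding the graph. The loop below is a sketch under that assumption, not the asker's original code; the Detached call is dropped because AsNoTracking entities were never attached:

// Sketch, assuming the keys are identity columns generated by the database.
consignments.ForEach(consignment =>
{
    consignment.ConsignmentId = 0;   // let the database assign a new key
    consignment.ClientId = clientId;
    consignment.ForwarderId = forwarderId;
    consignment.ClientSubsidiaryId = clientSubsidiaryId;

    foreach (var line in consignment.ConsignmentLines)
    {
        line.ConsignmentLineId = 0;  // new child key as well
        line.ConsignmentId = 0;      // EF fixes this up from the parent on insert
    }
});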

Trying to save in chunks, because I know the collection holds more than 250,000 consignments:

const int BulkSize = 1000;
var SkipSize = 0;

try
{
    while (SkipSize < consignments.Count)
    {
        var ProcessableConsignments = consignments.Skip(SkipSize).Take(BulkSize).ToList();
        _dbContext.Configuration.AutoDetectChangesEnabled = false;
        _dbContext.Consignments.AddRange(ProcessableConsignments);

        var changedRecords = _dbContext.SaveChanges();
        SkipSize = SkipSize + BulkSize;
    }
}
catch (Exception ex)
{
    throw;
}

I don't know what I am missing here. All I need is to re-insert the whole bulk of records, including their child collections, a second time (in fact N times in a loop) as a new bulk.


Popular Answer

I believe that when you detach already-detached entities from the context, you effectively instruct the context to ignore those entities when they are added back. Also, I strongly advise against using EF for bulk inserts, for performance reasons. At the very least, create a new context for each transaction; otherwise, the processing time for each successive batch will keep growing.
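As a rough illustration of that last point, here is a minimal sketch of the batch loop using a fresh context per batch. MyDbContext is a placeholder for the asker's actual context type, and the batch size is an assumption:

// Sketch only: MyDbContext stands in for the real DbContext subclass.
const int BatchSize = 1000;

for (var skip = 0; skip < consignments.Count; skip += BatchSize)
{
    var batch = consignments.Skip(skip).Take(BatchSize).ToList();

    // A fresh, short-lived context keeps the change tracker small, so
    // SaveChanges does not slow down as earlier batches accumulate.
    using (var batchContext = new MyDbContext())
    {
        batchContext.Configuration.AutoDetectChangesEnabled = false;
        batchContext.Configuration.ValidateOnSaveEnabled = false;

        batchContext.Consignments.AddRange(batch);
        batchContext.SaveChanges();
    }
}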



Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow