How to insert duplicated records in Entity Framework

bulkinsert c# entity-framework-6 linq linq-to-entities


Regarding the insertion of duplicate data, I am experiencing a really peculiar issue. I am collecting records based on specific criteria and, after changing a few properties, trying to re-insert the complete collection of objects without changing their primary key values.

But I'm not receiving the outcomes I was hoping for. My parent collection and its nested collections were never fully added to the database.

I have no idea what's wrong. Do I have to completely detach all of these entities? I am already retrieving the entities with AsNoTracking(), and I even detach the parent entity while changing its properties.

Consignment is my parent entity, and the list "ConsignmentLines" contains the child entities.

My entity hierarchy is as follows:

public class Consignment 
{
    public int ConsignmentId { get; set; }
    public int ClientSubsidiaryId { get; set; }
    public int ForwarderId { get; set; }
    public int ClientId { get; set; } 

    public ICollection<ConsignmentLine> ConsignmentLines { get; set; }    

    public Consignment()
    {
        ConsignmentLines = new List<ConsignmentLine>();            
    }
}

public class ConsignmentLine
{
    public int ConsignmentLineId { get; set; }
    public int PackagingId { get; set; }
    public double Amount { get; set; }
    public double Weight { get; set; }                

    public int ConsignmentId { get; set; }
    public virtual Consignment Consignment { get; set; }        
}

The steps my code takes are:

Obtaining the data:

var consignments = _dbContext.Consignments.AsNoTracking().Where(pr => pr.SourceParty == 0 && pr.ParentId == null && pr.ConnectState >= 4).ToList();

Changing a few properties:

    consignments.ForEach(consignment =>
    {
        consignment.ClientId = clientId;                    
        _dbContext.Entry(consignment).State = System.Data.Entity.EntityState.Detached;
        consignment.ForwarderId = forwarderId;
        consignment.ClientSubsidiaryId = clientSubsidiaryId;                    
    });
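For comparison, a minimal sketch of one common pattern for re-inserting a detached graph, assuming EF6 with database-generated identity keys (the variable names match the snippet above): reset every primary and foreign key to 0 so that adding the graph later makes EF insert every row as new.

```csharp
// Sketch, not the original code: zero out the keys on the detached graph
// so EF generates fresh identities instead of colliding with existing rows.
foreach (var consignment in consignments)
{
    consignment.ConsignmentId = 0;      // let the database assign a new key
    consignment.ClientId = clientId;
    consignment.ForwarderId = forwarderId;
    consignment.ClientSubsidiaryId = clientSubsidiaryId;

    foreach (var line in consignment.ConsignmentLines)
    {
        line.ConsignmentLineId = 0;     // new key for the child as well
        line.ConsignmentId = 0;         // fixed up by EF when the parent is inserted
    }
}
```

Once the keys are reset, adding the parent to the context marks the whole graph as Added, so there is no need to detach anything.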

The consignments collection has more than 250,000 records, so I'm attempting to save in chunks.

const int BulkSize = 1000;
var SkipSize = 0;

try
{
    while (SkipSize < consignments.Count)
    {
        ProcessableConsignments = consignments.Skip(SkipSize).Take(BulkSize).ToList();
        _dbContext.Configuration.AutoDetectChangesEnabled = false;     

        var changedRecords = _dbContext.SaveChanges();
        SkipSize = SkipSize + BulkSize;                                            
    }
}
catch (Exception ex)
{
    // ...
}

I'm not sure what I'm missing here. All I need to do is re-insert the entire collection of records, including its various child entities, as a new batch a second time (in fact, N times in a loop).

5/31/2016 4:17:52 PM

Popular Answer

I think that by detaching entities from the context, you are basically telling the context to disregard those entities when they are added again. Additionally, for performance reasons, I strongly advise against using EF for bulk inserts; otherwise, the processing time for each subsequent batch will increase. At the very least, build a fresh context for each batch.
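A minimal sketch of that advice, assuming EF6 and a hypothetical MyDbContext class (the real context type isn't shown in the question): create a new context per batch so the change tracker never grows across batches.

```csharp
// Sketch: one short-lived context per batch keeps SaveChanges fast,
// because each context only ever tracks BulkSize entities.
const int BulkSize = 1000;
for (var skip = 0; skip < consignments.Count; skip += BulkSize)
{
    var batch = consignments.Skip(skip).Take(BulkSize).ToList();

    using (var context = new MyDbContext())   // MyDbContext is a placeholder name
    {
        context.Configuration.AutoDetectChangesEnabled = false;
        context.Consignments.AddRange(batch); // marks parents and child lines as Added
        context.SaveChanges();
    }
}
```

Disposing the context after each batch also releases the tracked entity graph, which matters with 250,000+ records.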

5/31/2016 6:00:33 PM

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow