.Net and Deep Copy

Some time ago I had to clone objects, and .Net's shallow copy proved insufficient – I needed a deep copy. .Net itself does not offer good tools for this: the usual approach is to make the class implement the ICloneable interface and write a Clone() method for it. My classes were not very numerous, but they were relatively bulky and complicated, so writing a Clone() method for each of them made no sense. I needed something else.

With the help of Google I found a very smart method for performing deep copy. Its performance may not be exceptional, but it does its work nicely. The idea is simple – first the object is serialised and then deserialised. Serialisation breaks the link between the specific object instance and its data; what is left is just the data. On deserialisation a new object identical to the old one is created from that data, but now we are dealing with a different instance.

public object DeepCopy(object obj)
{
    // Serialise the object into a memory stream...
    MemoryStream ms = new MemoryStream();
    BinaryFormatter bf = new BinaryFormatter();
    bf.Serialize(ms, obj);

    // ...then rewind the stream and deserialise it into a new instance
    object retval;
    ms.Seek(0, SeekOrigin.Begin);
    retval = bf.Deserialize(ms);
    ms.Close();
    return retval;
}

NB! This method can only deep copy classes marked as serialisable. If there are problems with the events of the objects you serialise, you can find a solution in Rockford Lhotka's blog post .NET 2.0 solution to serialization of objects that raise events. If the security level of the appropriate IIS application is not set to Full (internal), this method can generate errors on Windows Vista. If someone knows a solution to this problem, please don't hesitate to inform me.
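For illustration, here is a minimal sketch (the Person class and its members are made up) of an object that works with DeepCopy(): the class is marked [Serializable], and the event's backing field is kept out of serialisation so that event subscribers are not dragged into the copy – Lhotka's post describes a more complete approach.

[Serializable]
public class Person
{
    public string Name;
    public Person Manager;   // object references are followed, so the whole graph gets copied

    // Keep event subscribers out of the serialised data; without this,
    // serialisation fails if a non-serialisable object (e.g. a form) has subscribed.
    [field: NonSerialized]
    public event EventHandler NameChanged;

    public void Rename(string name)
    {
        Name = name;
        if (NameChanged != null)
            NameChanged(this, EventArgs.Empty);
    }
}

// Usage (assuming the DeepCopy() method above is in scope):
// Person copy = (Person)DeepCopy(original);   // copy and copy.Manager are new instances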

Also be careful when the objects you are copying have deep hierarchies – deep copying like this may hurt your application's performance badly.

Gunnar Peipman

Gunnar Peipman is an ASP.NET, Azure and SharePoint fan, Estonian Microsoft user group leader, blogger, conference speaker, teacher, and tech maniac. Since 2008 he has been a Microsoft MVP specializing in ASP.NET.

    8 thoughts on “.Net and Deep Copy”

    • October 8, 2007 at 2:15 pm

      Hey Gunnar, what about applying generics so you don't have to cast on the returned object?

      public T DeepCopy<T>(T obj)
      {
         MemoryStream ms = new MemoryStream();
         BinaryFormatter bf = new BinaryFormatter();
         bf.Serialize(ms, obj);
         ms.Seek(0, SeekOrigin.Begin);
         T retval = (T)bf.Deserialize(ms);
         ms.Close();
         return retval;
      }

    • October 8, 2007 at 3:25 pm

      I've been using a generic version of this for quite some time and I think the performance has been quite good.

      public static T DeepCopy<T>(T item)
      {
         BinaryFormatter formatter = new BinaryFormatter();
         MemoryStream stream = new MemoryStream();
         formatter.Serialize(stream, item);
         stream.Seek(0, SeekOrigin.Begin);
         T result = (T)formatter.Deserialize(stream);
         stream.Close();
         return result;
      }

    • October 8, 2007 at 5:25 pm

      Hi guys!

      Thank you for your comments and examples. Your examples are very useful on .Net versions starting from 2.0. The version I have given here can also be used on earlier versions of .Net.

    • October 9, 2007 at 8:34 pm

      Great, unless you need any decent performance. While interesting for very low volume deep copying, this seems like yet another example of lazy programming, akin to loading HUGE result sets into DataSets just because you can – stuff I have to teach inexperienced and junior programmers about every day.

      If you have so many classes that you can’t take time to write simple clone methods for each, I would suggest writing a template for a code generator such as CodeSmith or MyGeneration to do it for you (a minimal hand-written sketch follows below).

      Sorry for the flame. I’m very tired of the poor programming technique I see employed day after day.
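      For reference, a minimal sketch of the hand-written alternative Greg mentions – the Order and OrderLine classes are made up, and this is exactly the kind of per-class Clone() a CodeSmith or MyGeneration template could generate mechanically:

      // Hypothetical Order/OrderLine classes; needs System.Collections.Generic for List<T>.
      // Each class implements ICloneable and deep-copies its own fields.
      public class OrderLine : ICloneable
      {
         public string Product;
         public int Quantity;

         public object Clone()
         {
            OrderLine copy = new OrderLine();
            copy.Product = Product;
            copy.Quantity = Quantity;
            return copy;
         }
      }

      public class Order : ICloneable
      {
         public string Customer;
         public List<OrderLine> Lines = new List<OrderLine>();

         public object Clone()
         {
            Order copy = new Order();
            copy.Customer = Customer;
            foreach (OrderLine line in Lines)
               copy.Lines.Add((OrderLine)line.Clone());   // clone each child, not just the reference
            return copy;
         }
      }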

    • October 9, 2007 at 9:40 pm

      Greg, I cannot take your opinion as a flame. I agree with you completely. It’s not a good idea to use this technique on larger data volumes or on more complex object trees.

    • October 9, 2007 at 11:31 pm

      No one aware of the using statement?

      public static T DeepClone<T>(T item)
      {
         using (MemoryStream stream = new MemoryStream())
         {
            BinaryFormatter formatter = new BinaryFormatter();
            formatter.Serialize(stream, item);
            stream.Position = 0;   // rewind before deserialising (MemoryStream has no Location property)
            return (T)formatter.Deserialize(stream);
         }
      }

    • January 23, 2009 at 5:07 pm

      Hey Digimortal,

      I hear the sentiment about lazy programming but, since there is no other way for me, I wonder what is better. I have been asked by my boss to do a publisher/subscriber pattern that also passes data (kind of publisher/subscriber mixed with mediator), but because multiple subscribers read a single object I have decided to give each subscriber a deep copy to do what they like with. This allows me not to worry about tracking who is where on the data or whether it is still needed. So in this case being able to just make a deep copy of an object without knowing what it is turns out to be useful (a sketch of this follows below).

      PS: the boss does not want the subscriber or publisher taking any responsibility for notifying me whether or not the object is still needed, or what it is.
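      A minimal sketch of that idea (the Publisher and ISubscriber types are made up, and the message type must be serialisable): every subscriber receives its own deep copy of the message, so nobody has to coordinate access to a shared instance.

      // Hypothetical sketch; needs System.Collections.Generic, System.IO and
      // System.Runtime.Serialization.Formatters.Binary.
      public interface ISubscriber<T>
      {
         void Receive(T message);
      }

      public class Publisher<T>
      {
         private List<ISubscriber<T>> subscribers = new List<ISubscriber<T>>();

         public void Subscribe(ISubscriber<T> subscriber)
         {
            subscribers.Add(subscriber);
         }

         public void Publish(T message)
         {
            foreach (ISubscriber<T> subscriber in subscribers)
               subscriber.Receive(DeepCopy(message));   // independent copy per subscriber
         }

         private static T DeepCopy(T item)
         {
            using (MemoryStream stream = new MemoryStream())
            {
               BinaryFormatter formatter = new BinaryFormatter();
               formatter.Serialize(stream, item);
               stream.Position = 0;
               return (T)formatter.Deserialize(stream);
            }
         }
      }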

    • November 30, 2011 at 1:07 pm

      I was searching for deep copying a MemoryStream itself.

      How to do that?
