Has inheritance become bad?

Posted by mafutrct on Stack Overflow, 2010-05-20.

Personally, I think inheritance is a great tool that, when applied reasonably, can greatly simplify code.

However, it seems to me that many modern tools dislike inheritance. Let's take a simple example: serializing a class to XML. As soon as inheritance is involved, this can easily turn into a mess, especially if you try to serialize a derived class through a serializer built for the base class.
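Here's a minimal sketch of that failure in C# (the Animal/Dog types are made up for illustration): an XmlSerializer is constructed for one static type, and handing it a derived instance blows up at runtime.

    using System;
    using System.IO;
    using System.Xml.Serialization;

    public class Animal
    {
        public string Name { get; set; }
    }

    public class Dog : Animal
    {
        public bool Barks { get; set; }
    }

    public static class Demo
    {
        public static void Main()
        {
            // The serializer is constructed for the static base type...
            var serializer = new XmlSerializer(typeof(Animal));
            Animal pet = new Dog { Name = "Rex", Barks = true };

            // ...so serializing a Dog through it throws an
            // InvalidOperationException ("The type Dog was not expected.
            // Use the XmlInclude or SoapInclude attribute ...").
            using var writer = new StringWriter();
            serializer.Serialize(writer, pet);
        }
    }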

Sure, we can work around that, for instance with a KnownType attribute. But besides being an itch in your code that you have to remember to scratch every time you add a derived class, that fails too if you receive a class from outside your scope that was not known at compile time. (Okay, in some cases you can still work around that, for instance with the NetDataContractSerializer in .NET, which embeds the full CLR type names in the output. Certainly an improvement.)
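For reference, a sketch of that workaround with the DataContractSerializer (same made-up types as above; XmlSerializer has the analogous XmlInclude attribute):

    using System;
    using System.IO;
    using System.Runtime.Serialization;

    [DataContract]
    [KnownType(typeof(Dog))] // the itch: must be extended for every new subclass
    public class Animal
    {
        [DataMember] public string Name { get; set; }
    }

    [DataContract]
    public class Dog : Animal
    {
        [DataMember] public bool Barks { get; set; }
    }

    public static class Demo
    {
        public static void Main()
        {
            var serializer = new DataContractSerializer(typeof(Animal));
            Animal pet = new Dog { Name = "Rex", Barks = true };

            using var stream = new MemoryStream();
            serializer.WriteObject(stream, pet); // works: the XML carries an i:type="Dog" hint

            stream.Position = 0;
            Console.WriteLine(new StreamReader(stream).ReadToEnd());
        }
    }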

In any case, the basic problem remains: serialization and inheritance don't mix well. Considering the many programming techniques that have become possible and even common in the past decade, I feel tempted to say that inheritance should be avoided in areas that involve serialization (in particular remoting and databases).
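To make that concrete, one hedged sketch of "avoiding inheritance at the boundary": keep the hierarchy in domain code, but map it to a single flat, inheritance-free contract before it reaches the serializer (AnimalDto and the Kind discriminator are invented names here, not a library feature).

    public class Animal { public string Name { get; set; } }
    public class Dog : Animal { public bool Barks { get; set; } }

    // One flat type at the wire boundary: nothing polymorphic for the serializer to trip on.
    public class AnimalDto
    {
        public string Kind { get; set; }  // discriminator, e.g. "dog"
        public string Name { get; set; }
        public bool Barks { get; set; }   // only meaningful when Kind == "dog"
    }

    public static class Mapping
    {
        // The mapping code owns the hierarchy; the serializer never sees it.
        public static AnimalDto ToDto(Animal animal) => animal switch
        {
            Dog dog => new AnimalDto { Kind = "dog", Name = dog.Name, Barks = dog.Barks },
            _       => new AnimalDto { Kind = "animal", Name = animal.Name },
        };
    }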

Does that make sense? Or am I mixing things up? How do you handle inheritance and serialization?
