Monosoul's Dev Blog A blog to write down dev-related stuff I face
Spring Data MongoDB MappingInstantiationException

Polymorphic fields with MongoDB and Spring Data

Depending on your needs, polymorphism can benefit a project in different ways. For example, a limited class hierarchy can make your code cleaner and more expressive than a single class with nullable fields or a type label. And it seems only natural to use it with document-oriented databases like MongoDB. But if you want a document with polymorphic fields using MongoDB and Spring Data, you might face an exception similar to this: MappingInstantiationException: Failed to instantiate ...FieldType using constructor NO_CONSTRUCTOR with arguments. Let's see why this happens and how to fix the issue properly.

The issue

Let’s imagine we have an app that uses Spring Data and MongoDB. Within our app we have a simple document – DocumentWithInterfaceField:

@Document("document")
@TypeAlias("document")
data class DocumentWithInterfaceField(
    @Field("interface_field")
    val interfaceField: FieldType,
    @Id
    val id: ObjectId = ObjectId.get(),
)

sealed interface FieldType {
    @get:Field("some_int_field")
    val someIntField: Int

    @TypeAlias("field_type_impl")
    data class FieldTypeImpl(
        override val someIntField: Int = 123,
        @Field("some_string_field")
        val someStringField: String = "qwe",
    ) : FieldType

    @TypeAlias("other_field_type_impl")
    data class OtherFieldTypeImpl(
        override val someIntField: Int = 456,
        @Field("other_string_field")
        val otherStringField: String = "asd",
    ) : FieldType
}
Code language: Kotlin (kotlin)

The full example is available on GitHub.

Let’s walk through this class. The document itself only has 2 fields: interfaceField and id, where id is a BSON object id and interfaceField is of type FieldType.

FieldType is a sealed interface having 2 implementations: FieldTypeImpl and OtherFieldTypeImpl. Both implementations share someIntField and have 1 string field. In case of FieldTypeImpl the field is called someStringField, and in case of OtherFieldTypeImpl the field is called otherStringField.

So the 2 implementations are different and one cannot be deserialized from the other.
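To make that concrete, here is a small standalone sketch (annotations and Spring omitted, so it compiles on its own) mirroring the hierarchy above. It shows the main benefit of a sealed hierarchy over a type label: the compiler forces every implementation to be handled.

```kotlin
// Standalone mirror of the FieldType hierarchy above (Mongo annotations omitted).
// The two implementations carry different fields, so neither can stand in for the other.
sealed interface FieldType {
    val someIntField: Int

    data class FieldTypeImpl(
        override val someIntField: Int = 123,
        val someStringField: String = "qwe",
    ) : FieldType

    data class OtherFieldTypeImpl(
        override val someIntField: Int = 456,
        val otherStringField: String = "asd",
    ) : FieldType
}

// The `when` must be exhaustive: adding a third implementation
// breaks compilation here instead of failing at runtime.
fun describe(field: FieldType): String = when (field) {
    is FieldType.FieldTypeImpl -> "impl with ${field.someStringField}"
    is FieldType.OtherFieldTypeImpl -> "other impl with ${field.otherStringField}"
}

fun main() {
    println(describe(FieldType.FieldTypeImpl())) // prints: impl with qwe
}
```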

Both implementations are also annotated with @TypeAlias.

Looks good! Now let’s declare a repository for the document.

interface DocumentRepository : MongoRepository<DocumentWithInterfaceField, ObjectId>
Code language: Kotlin (kotlin)

Default CRUD repository will work just fine for our use case.

And we will also need to configure our app to work with MongoDB.


@Configuration(proxyBeanMethods = false)
@EnableMongoRepositories(
    basePackageClasses = [DocumentRepository::class]
)
@EnableConfigurationProperties(MongoProperties::class)
class MongoDbConfig : AbstractMongoClientConfiguration() {

    @Autowired
    private lateinit var mongoProperties: MongoProperties

    override fun getDatabaseName() = mongoProperties.dbName

    override fun mongoClient(): MongoClient {
        val connectionString = ConnectionString(mongoProperties.uri)
        val mongoClientSettings = MongoClientSettings.builder()
            .applyConnectionString(connectionString)
            .build()
        return MongoClients.create(mongoClientSettings)
    }
}

@ConstructorBinding
@ConfigurationProperties("mongo")
private data class MongoProperties(
    val dbName: String = "spring_data_mongo_issue",
    val uri: String,
)
Code language: Kotlin (kotlin)

Here we configure the Mongo client according to the Spring docs. If you're interested in reading more about Mongo configuration, make sure to also check this article.

Let’s quickly check what happens in this configuration:

  • L2: Enable Mongo repositories (so that Spring will create repository beans for us).
  • L5: Enable configuration properties (so that we can configure the DB name and connection url).
  • L11: Configure database name using the properties.
  • L13: Configure mongo client to use connection string from the properties.

Nothing fancy, you can check the full example on GitHub.

Now that we have everything set up, let’s write a test for the repository.

Repository test

We want an integration test that uses a real MongoDB instance, so that we can make sure our repository works fine in real life (aka prod). The easiest way to get a real DB instance is to run a MongoDB container with testcontainers. I will skip the setup details here, but you can check them on GitHub.
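For orientation, the base class looks roughly like this. This is an illustrative sketch, not the exact code from the repo: the container image tag and the `mongo.uri` property name are assumptions based on the `@ConfigurationProperties("mongo")` class shown earlier.

```kotlin
// Illustrative sketch of a testcontainers-backed base test class;
// the real setup in the linked repo may differ.
@SpringBootTest
abstract class AbstractRepositoryTest {
    companion object {
        // One shared MongoDB container for all tests in the class hierarchy.
        private val mongo = MongoDBContainer(DockerImageName.parse("mongo:5")).apply { start() }

        @JvmStatic
        @DynamicPropertySource
        fun mongoProperties(registry: DynamicPropertyRegistry) {
            // "mongo.uri" matches the @ConfigurationProperties("mongo") prefix used by MongoDbConfig.
            registry.add("mongo.uri") { mongo.replicaSetUrl }
        }
    }
}
```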

class SaveAndGetWithRepositoryTest @Autowired constructor(
    private val repository: DocumentRepository,
) : AbstractRepositoryTest() {

    @TestFactory
    fun `should save the document and then get it`() = sequenceOf(
        DocumentWithInterfaceField(FieldTypeImpl()),
        DocumentWithInterfaceField(OtherFieldTypeImpl()),
    ).map { document ->
        dynamicTest("should save $document and then get it") {
            // given

            // when & then
            expectCatching {
            }.isSuccess().isPresent() isEqualTo document
        }
    }.asStream()
}
Code language: Kotlin (kotlin)

This test is a dynamic test. Don’t worry if you’re not familiar with what dynamic tests are, basically what happens here is:

  • L7-8: We create 2 instances of DocumentWithInterfaceField, one with a FieldTypeImpl instance and one with an OtherFieldTypeImpl instance.
  • L10: For each of those 2 instances we create a dynamic test.
  • L12: Inside the test we save the document first using the repository.
  • L16: Then we try to get it by id from the DB.
  • L17: If we got it successfully we check that the document we got from the DB is equal to the original document.

You can check the full example here.

Let’s try to run this test.

SaveAndGetWithRepositoryTest result

Okay, looks good to me! Let’s deploy it!

A wild MappingInstantiationException appears

Imagine we have deployed the app to the development environment and are now checking that everything works there.

We have tried to create a new document and save it – that worked.

We have tried to get the document from the DB – that also worked.

Cool, seems like we did a good job here! Until at some point we suddenly start to see this: Failed to instantiate ...FieldType using constructor NO_CONSTRUCTOR with arguments.

The stack trace looks somewhat similar to this:

MappingInstantiationException stack trace (abbreviated)

org.springframework.data.mapping.model.MappingInstantiationException: Failed to instantiate ...FieldType using constructor NO_CONSTRUCTOR with arguments
	at org.springframework.data.mongodb.core.convert.MappingMongoConverter$ConversionContext.convert(...)
	at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(...)
	at org.springframework.data.mongodb.core.convert.MappingMongoConverter$AssociationAwareMongoDbPropertyValueProvider.getPropertyValue(...)
	at org.springframework.data.mapping.model.KotlinClassGeneratingEntityInstantiator$DefaultingKotlinClassInstantiatorAdapter.extractInvocationArguments(...)
	at org.springframework.data.mapping.model.KotlinClassGeneratingEntityInstantiator$DefaultingKotlinClassInstantiatorAdapter.createInstance(...)
	at org.springframework.data.mongodb.core.MongoTemplate$ReadDocumentCallback.doWith(...)
	...
	at java.base/java.lang.reflect.Method.invoke(...)
	at org.springframework.data.repository.core.support.RepositoryMethodInvoker$RepositoryFragmentMethodInvoker.lambda$new$0(...)
	at org.springframework.data.repository.core.support.RepositoryComposition$RepositoryFragments.invoke(...)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(...)
	at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(...)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(...)
	at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(...)
	at jdk.proxy2/jdk.proxy2.$Proxy77.findById(Unknown Source)
	...
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [...FieldType]: Specified class is an interface
	at org.springframework.beans.BeanUtils.instantiateClass(...)
	... 122 more
Code language: Properties (properties)

The root exception there is:
org.springframework.beans.BeanInstantiationException: Failed to instantiate [...FieldType]: Specified class is an interface.

That is quite puzzling, right? Why did it work locally in the test, and work for some time after deployment, but then suddenly start failing?

Well, I’m going to save you the troubleshooting time I wasted myself.

DefaultTypeMapper is to blame

Kind of. There are actually multiple classes involved. DefaultTypeMapper is the class responsible for type mapping when saving data to and reading data from the DB. If you put breakpoints at line 198 (method writeType) and line 128 (method getFromCacheOrCreate) in this class and run the test again, you'll notice that writeType is called when saving the entry. This method delegates to MappingContextTypeInformationMapper, which computes the mapping and caches it internally.

save call chain

Once you’re past the first breakpoint and hit the second one (when getting the document from the repository), you’ll notice that getFromCacheOrCreate is called when reading the entry. If DefaultTypeMapper’s cache is empty, this method delegates again to MappingContextTypeInformationMapper to resolve the alias, which in turn resolves the type from its internal map.

find call chain


A-ha! So this is what happens! When we save the document first, its alias-to-type mapping gets cached, and when we read it afterwards it deserializes fine. But if we try to read the document before saving it, we get the exception! The mapper can't deduce the type from the alias alone: it doesn't know about the type, so it falls back to the declared type of the field in the document. And in our case that declared type is FieldType, which is an interface!

So the app only worked until the next deployment happened! The newly deployed instance of the app doesn't have the mapping in its cache to resolve the type (until we save a document again).
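The failure mode can be sketched with a tiny cache simulation. This is plain illustrative Kotlin, not the actual Spring Data classes; the names here are invented to mirror the behaviour described above.

```kotlin
// Illustrative simulation of the alias caching behaviour described above.
// Not the real DefaultTypeMapper, just the failure mode it exhibits.
class AliasCache {
    private val aliasToType = mutableMapOf<String, String>()

    // Called on save: the alias-to-type mapping is learned and cached.
    fun onWrite(alias: String, type: String) {
        aliasToType[alias] = type
    }

    // Called on read: without a cached entry the mapper falls back to the
    // declared field type, which in our case is an interface.
    fun resolve(alias: String, fallbackFieldType: String): String =
        aliasToType[alias] ?: fallbackFieldType
}

fun main() {
    val cache = AliasCache()

    // Fresh instance (e.g. right after a deployment): a read before any write
    // falls back to the interface, i.e. MappingInstantiationException territory.
    println(cache.resolve("field_type_impl", "FieldType (interface!)"))

    // After a save the mapping is cached, and reads resolve correctly.
    cache.onWrite("field_type_impl", "FieldTypeImpl")
    println(cache.resolve("field_type_impl", "FieldType (interface!)")) // prints: FieldTypeImpl
}
```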

The root exception we got before makes perfect sense now. But how can we reproduce it locally? We need to cover it with a test before we start fixing it to prevent regressions in the future, right?

Luckily, we can do that.

Writing a better test

To properly test the scenario of getting the document before it got saved at least once we can use MongoTemplate. We will create an instance of DBObject with BasicDBObjectBuilder and save it using MongoTemplate thus avoiding any type mapping and caching magic of Spring Data.

@DirtiesContext(classMode = BEFORE_CLASS) // required to prevent spring from caching the type in the other test
class SaveWithTemplateAndGetWithRepositoryTest @Autowired constructor(
    private val repository: DocumentRepository,
    private val mongoTemplate: MongoTemplate,
) : AbstractRepositoryTest() {

    @TestFactory
    fun `should save the document and then get it`() = sequenceOf(
        ObjectId.get().let { id ->
            id to buildDbObject {
                this["_id"] = id
                this["_class"] = "document"
                this["interface_field"] = buildDbObject {
                    this["_class"] = "field_type_impl"
                    this["some_int_field"] = 123
                    this["some_string_field"] = "qwe"
                }
            }
        },
        ...
    ).map { (id, document) ->
        dynamicTest("should save $document and then get it") {
            // given
  , "document")

            // when & then
            expectCatching {
                repository.findById(id) // will throw MappingInstantiationException
            }.isSuccess().isPresent().get { id } isEqualTo id
        }
    }.asStream()

    private fun buildDbObject(block: BasicDBObjectBuilder.() -> Unit) = BasicDBObjectBuilder.start().apply(block).get()
    private operator fun BasicDBObjectBuilder.set(field: String, value: Any) = add(field, value)
}
Code language: Kotlin (kotlin)

There are a few important things here:

  • L1: We mark the test as DirtiesContext. This is crucial to make sure the test is always reproducible, otherwise other tests we have might save the document and create an entry in MappingContextTypeInformationMapper thus making this test always work regardless of how we save the document here.
  • L2: In addition to the repository bean we also inject MongoTemplate.
  • L9-20: We create a pair (or a tuple) of an object id and a DBObject instance. The objects we create here have exactly the same values as the ones used in SaveAndGetWithRepositoryTest.
  • L24: We save the document into the collection.
  • L33-34: A simple DSL to ease DBObject creation.
  • The rest is similar to SaveAndGetWithRepositoryTest.

You can check the full class on GitHub.

Let’s try to run the test.

SaveWithTemplateAndGetWithRepositoryTest result

Perfect! It has failed with the exact exception we expected.

Wait, but are you sure?

I know what you might be thinking. Are we sure of the issue? Maybe the documents that actually get saved to the DB are different?

To make it a bit more convincing for you, AbstractRepositoryTest has a tearDown method where it prints all documents from the collection after every test.

Here’s what we get after running both SaveAndGetWithRepositoryTest and SaveWithTemplateAndGetWithRepositoryTest:

Documents in the DB
d.m.s.d.m.i.SaveAndGetWithRepositoryTest - Document in DB: {"_id": {"$oid": "63249ce7adb06e3c0466cc1e"}, "interface_field": {"some_int_field": 123, "some_string_field": "qwe", "_class": "field_type_impl"}, "_class": "document"}
d.m.s.d.m.i.SaveAndGetWithRepositoryTest - Document in DB: {"_id": {"$oid": "63249ce7adb06e3c0466cc1f"}, "interface_field": {"some_int_field": 456, "other_string_field": "asd", "_class": "other_field_type_impl"}, "_class": "document"}
d.m.s.d.m.i.SaveWithTemplateAndGetWithRepositoryTest - Document in DB: {"_id": {"$oid": "63249ce7adb06e3c0466cc21"}, "_class": "document", "interface_field": {"_class": "field_type_impl", "some_int_field": 123, "some_string_field": "qwe"}}
d.m.s.d.m.i.SaveWithTemplateAndGetWithRepositoryTest - Document in DB: {"_id": {"$oid": "63249ce7adb06e3c0466cc22"}, "_class": "document", "interface_field": {"_class": "other_field_type_impl", "some_int_field": 456, "other_string_field": "asd"}}
Code language: JSON / JSON with Comments (json)

As you can see even though the field order is different, the values in the documents are the same.

So yes, I am sure. ๐Ÿ™‚

Now let’s get to…

The solution

If you try searching for the exception on Google, you might stumble across this question on StackOverflow or this issue on GitHub.

Both places propose implementing a converter as the solution. Let's try that out.


Basically, with this solution we do the deserialization ourselves, field by field. Here's how it would look:

@ReadingConverter
class FieldTypeConverter : Converter<Document, FieldType> {
    override fun convert(source: Document): FieldType? = when (source.getString("_class")) {
        "field_type_impl" -> {
            FieldTypeImpl(
                someIntField = source.getInteger("some_int_field"),
                someStringField = source.getString("some_string_field"),
            )
        }

        "other_field_type_impl" -> {
            OtherFieldTypeImpl(
                someIntField = source.getInteger("some_int_field"),
                otherStringField = source.getString("other_string_field"),
            )
        }

        else -> null
    }
}
Code language: Kotlin (kotlin)

The implementation is pretty straightforward:

  • L3: We read the type alias from the Document object.
  • L4: When the alias is field_type_impl we instantiate FieldTypeImpl.
  • L11: When the alias is other_field_type_impl we instantiate OtherFieldTypeImpl.
  • L18: When the alias value is unknown we return null (throwing an exception might be an option here as well).

We also need to change MongoDbConfig a little bit to use the new converter:

override fun customConversions() = MongoCustomConversions(
    listOf(
        FieldTypeConverter()
    )
)
Code language: Kotlin (kotlin)

The full example is available on GitHub.

Let’s try to run the tests.

test results with converter
test results with converter

Perfect! All green.

But I don’t really like that solution. It requires you to manually deserialize every polymorphic field. Not only is that error-prone on its own, it also requires anyone who changes the models to remember to update the converter. Quite unreliable. Luckily, that’s not the only solution available! Let’s try other approaches.

Explicit type mapping declaration

Another thing that we can do is to configure DefaultMongoTypeMapper with explicit alias to type mappings. To do that we will take advantage of ConfigurableTypeInformationMapper. We will need to change our MongoDbConfig a bit:

@Configuration
...
class MongoDbConfig : AbstractMongoClientConfiguration() {

    ...

    @Bean
    fun configurableTypeInformationMapper() = ConfigurableTypeInformationMapper(
        mapOf(
            FieldTypeImpl::class.java to "field_type_impl",
            OtherFieldTypeImpl::class.java to "other_field_type_impl",
        )
    )

    override fun mappingMongoConverter(
        databaseFactory: MongoDatabaseFactory,
        customConversions: MongoCustomConversions,
        mappingContext: MongoMappingContext
    ) = super.mappingMongoConverter(databaseFactory, customConversions, mappingContext).apply {
        setTypeMapper(
            DefaultMongoTypeMapper(
                DEFAULT_TYPE_KEY,
                listOf(
                    configurableTypeInformationMapper(),
                    SimpleTypeInformationMapper(),
                )
            )
        )
    }
Code language: Kotlin (kotlin)

A few important things to note here:

  • L1: We remove the proxyBeanMethods = false argument from the @Configuration annotation. This is needed for configurableTypeInformationMapper() to return a singleton bean instead of a new object when we call it within the config class.
  • L7-12: We declare ConfigurableTypeInformationMapper bean with 2 mappings for each of FieldType implementations.
  • L14-28: We configure DefaultMongoTypeMapper to use ConfigurableTypeInformationMapper along with SimpleTypeInformationMapper.

The full example is available on GitHub.

This solution is better than the converter one since it doesn’t force us to manually deserialize each polymorphic field. But it is still not perfect, as you might forget to add a mapping here when adding a new polymorphic field. I’d like to have something I can configure once and forget about it.

And I have a solution that will work this way. ๐Ÿ™‚

Automated type mapping with reflection

First, we will need to add a new dependency to the project. We're going to use the reflections library. Pick the latest version for the build tool of your choice here.
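For example, with Gradle (Kotlin DSL) the declaration would look something like this. The version below is illustrative; pick the latest one from the link above.

```kotlin
dependencies {
    // Version shown here is an example only — check the link above for the latest release.
    implementation("org.reflections:reflections:0.10.2")
}
```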

Now we’re going to create a small extension to DefaultMongoTypeMapper to make it easy to configure and instantiate. Here’s how it would look:

class ReflectiveMongoTypeMapper(
    private val reflections: Reflections = Reflections(""),
) : DefaultMongoTypeMapper(
    DEFAULT_TYPE_KEY,
    listOf(
        ConfigurableTypeInformationMapper(
                .associateWith { clazz -> getAnnotation(clazz,!!.value }
        ),
        SimpleTypeInformationMapper(),
    )
)
Code language: Kotlin (kotlin)

Pretty minimalistic, isn’t it?

Here’s what happens there:

  • L1: We create an instance of Reflections pointed at the package containing our models, so it will only scan classes in that package.
  • L7: We get all classes annotated with @TypeAlias.
  • L8: We map each of those classes to the value of @TypeAlias annotation. I.e. we create a map of type to alias.
  • L3-L13: We instantiate DefaultMongoTypeMapper similar to how we did that with explicit type declaration. The only difference is that we get the mappings automatically via reflection.

We will also have to change MongoDbConfig like this:

override fun mappingMongoConverter(
    databaseFactory: MongoDatabaseFactory,
    customConversions: MongoCustomConversions,
    mappingContext: MongoMappingContext,
) = super.mappingMongoConverter(databaseFactory, customConversions, mappingContext).apply {
    setTypeMapper(ReflectiveMongoTypeMapper())
}
Code language: Kotlin (kotlin)

Here we just make Spring use the new type mapper, similar to the explicit type declaration approach.

The full example is available on GitHub.

Let’s run the tests again.

test results with reflective type mapping

Awesome! Everything works, and if we add a new polymorphic field type we won't have to change any configuration. Moreover, reflection is only used during context startup, not at runtime, so it won't affect performance.


So, what can we take away from all that?

  1. When writing integration tests for repositories, it's better to avoid persisting objects with the same repository you're testing, even if you're testing a different method. E.g. if you test the save method, try not to use the same repository's get method to validate that the object was saved properly; and if you test the get method, it's better not to use the save method to persist the objects first.
  2. It might be a good idea not to share the Spring context between tests for different repository methods. Even if you follow the previous advice, it doesn't guarantee that other tests won't affect execution. I.e. make use of @DirtiesContext.
  3. There are multiple ways to solve the issue, pick the one that suits your needs best. In my opinion the last solution is the most error-proof one.

Happy hacking!
