MongoDB: Unique index on array element's property
As far as I know, unique indexes only enforce uniqueness across different documents, not among the elements of an array within a single document.
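For the duplicate key error below to occur, a unique index on the embedded id has to exist first; a minimal sketch (the exact field path is an assumption based on the documents in the question):

db.cats.createIndex( { "kittens.id": 1 }, { unique: true } )

With that index in place, the second of these two inserts throws a duplicate key error: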
db.cats.insert( { id: 123, kittens: [ { id: 456 } ] } )
db.cats.insert( { id: 123, kittens: [ { id: 456 } ] } )
But this is allowed:
db.cats.insert( { id: 123, kittens: [ { id: 456 }, { id: 456 } ] } )
I'm not sure if there's any way to enforce the constraint you need at the Mongo level; it might be something you could check in the application logic when you insert or update.
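One common pattern for that check is to make the duplicate test part of the update's query, since updates to a single document are atomic. A sketch using the names from the question:

db.cats.update(
    { id: 123, "kittens.id": { $ne: 456 } },  // only match if no kitten with this id exists yet
    { $push: { kittens: { id: 456 } } }       // safe to push, the filter ruled out duplicates
)

The { "kittens.id": { $ne: 456 } } condition makes the update match nothing when a kitten with that id is already present, so the $push is simply skipped.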
Ensuring uniqueness of the individual values in an array field
In addition to the example above, MongoDB has an update operator that ensures that when you add a new value/object to an array field, the update is only performed if the value/object doesn't already exist.
So if you have a document that looks like this:
{ _id: 123, kittens: [456] }
This would be allowed:
db.cats.update({_id:123}, {$push: {kittens:456}})
resulting in
{ _id: 123, kittens: [456, 456] }
However, using the $addToSet operator (as opposed to $push) checks whether the value already exists before adding it. So, starting with:
{ _id: 123, kittens: [456] }
then executing:
db.cats.update({_id:123}, {$addToSet: {kittens:456}})
would have no effect.
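One caveat that matters for the original question, where the array holds objects rather than plain values: $addToSet compares the entire element, so two subdocuments only count as duplicates when every field (and the field order) matches exactly. For example, starting from { _id: 123, kittens: [ { id: 456 } ] } (the name field below is made up purely for illustration):

db.cats.update({_id:123}, {$addToSet: {kittens: {id: 456}}})
// no effect: the subdocument matches an existing element exactly

db.cats.update({_id:123}, {$addToSet: {kittens: {id: 456, name: "Tom"}}})
// adds a second kitten with id 456, because the whole subdocument differs

So $addToSet alone can't enforce uniqueness on just the id property of the array elements.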
So, long story short: unique indexes don't validate uniqueness among the items of an array within a single document, only that two documents can't have identical values in the indexed fields.
There is also an equivalent of an insert that ensures uniqueness within the array attribute. The following command essentially does an insert while ensuring the uniqueness of kittens (the upsert option creates the document for you if one with id 123 doesn't already exist):
db.cats.update(
    { id: 123 },
    {
        $addToSet: { kittens: { $each: [ 456, 456 ] } },
        $set: { otherfields: "extraval", field2: "value2" }
    },
    { upsert: true }
)
The resulting document will be (note that 456 appears only once, even though it was passed twice in the $each array):
{
    "id": 123,
    "kittens": [ 456 ],
    "otherfields": "extraval",
    "field2": "value2"
}
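On newer shells and drivers where update() is deprecated, the same upsert can be written with updateOne, which takes the same filter/update/options shape; a sketch assuming the document layout above:

db.cats.updateOne(
    { id: 123 },
    {
        $addToSet: { kittens: { $each: [ 456, 456 ] } },  // dedupes against existing values
        $set: { otherfields: "extraval", field2: "value2" }
    },
    { upsert: true }  // create the document if no cat with id 123 exists
)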