Hibernate - @ElementCollection - Strange delete/insert behavior

I had the same issue but wanted to map a list of enums: List<MyEnum>.

I got it working like this:

@ElementCollection
@CollectionTable(
        name = "enum_table",
        joinColumns = @JoinColumn(name = "some_id")
)
@OrderColumn
@Enumerated(EnumType.STRING)
private List<MyEnum> enumList = new ArrayList<>();

// Mutate the managed collection in place instead of replacing it,
// so Hibernate can track what actually changed.
public void setEnumList(List<MyEnum> newEnumList) {
    this.enumList.clear();
    this.enumList.addAll(newEnumList);
}

The issue in my case was that the List instance was always replaced through the default setter, so Hibernate treated it as a completely "new" collection even though the enum values had not changed.
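
For comparison, a default setter along the lines of this hypothetical sketch, which swaps in a new list instance instead of mutating the managed one, is what triggers the full delete/insert:

// Hypothetical problematic setter: assigning a new list replaces the collection
// instance Hibernate is tracking, so the whole collection table is cleared and
// re-inserted on the next flush.
public void setEnumList(List<MyEnum> newEnumList) {
    this.enumList = newEnumList;
}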


The problem is explained, to some extent, on the ElementCollection page of the JPA wikibook:

Primary keys in CollectionTable

The JPA 2.0 specification does not provide a way to define the Id in the Embeddable. However, to delete or update an element of the ElementCollection mapping, some unique key is normally required. Otherwise, on every update the JPA provider would need to delete everything from the CollectionTable for the Entity, and then insert the values back. So, the JPA provider will most likely assume that the combination of all of the fields in the Embeddable is unique, in combination with the foreign key (JoinColumn(s)). This however could be inefficient, or just not feasible if the Embeddable is big, or complex.

And this is exactly what happens here: Hibernate doesn't generate a primary key for the collection table, so it has no way to detect which element of the collection changed; it therefore deletes the old content from the table and inserts the new content.

However, if you define an @OrderColumn (to specify a column used to maintain the persistent order of a list - which would make sense since you're using a List), Hibernate will create a primary key (made of the order column and the join column) and will be able to update the collection table without deleting the whole content.

Something like this (if you want to use the default column name):

@Entity
public class Person {
    ...
    @ElementCollection
    @CollectionTable(name = "PERSON_LOCATIONS", joinColumns = @JoinColumn(name = "PERSON_ID"))
    @OrderColumn
    private List<Location> locations;
    ...
}

References

  • JPA 2.0 Specification
    • Section 11.1.12 "ElementCollection Annotation"
    • Section 11.1.39 "OrderColumn Annotation"
  • JPA Wikibook
    • Java Persistence/ElementCollection

We discovered that the types we were using as our ElementCollection elements did not define equals or hashCode and had nullable fields. We provided both (via Lombok's @EqualsAndHashCode, for what it's worth) on that type, and this allowed Hibernate (v5.2.14) to determine whether or not the collection was actually dirty.

Additionally, this error manifested for us because we were inside a service method annotated with @Transactional(readOnly = true). Since Hibernate would attempt to clear the related element collection and insert it all over again, the transaction would fail at flush time, and things broke with this very hard-to-trace message:

HHH000346: Error during managed flush [Batch update returned unexpected row count from update [0]; actual row count: 0; expected: 1]
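
For what it's worth, the failing path looked roughly like this hypothetical sketch (the method name, the entityManager field, and the setRelatedEntity2s accessor are made up for illustration):

// Hypothetical sketch: the element collection is modified inside a read-only
// transaction, so the delete/re-insert Hibernate attempts at flush time fails.
@Transactional(readOnly = true)
public void refreshRelated(UUID id, Set<Entity2> newRelated) {
    Entity1 entity1 = entityManager.find(Entity1.class, id);
    entity1.setRelatedEntity2s(newRelated); // collection considered dirty without equals/hashCode
    // flush at the end of the transaction -> HHH000346 / unexpected row count
}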

Here is an example of our entity model that had the error:

@Entity
public class Entity1 {
    @ElementCollection
    @Default
    private Set<Entity2> relatedEntity2s = Sets.newHashSet();
}

@Embeddable
public class Entity2 {
    private UUID someUUID;
}

Changing it to this

@Entity
public class Entity1 {
    @ElementCollection
    @Default
    private Set<Entity2> relatedEntity2s = Sets.newHashSet();
}

@Embeddable
@EqualsAndHashCode
public class Entity2 {
    @Column(nullable = false)
    private UUID someUUID;
}

Fixed our issue. Good luck.


In addition to Pascal's answer, you also have to set at least one column as NOT NULL:

@Embeddable
public class Location {

    @Column(name = "path", nullable = false)
    private String path;

    @Column(name = "parent", nullable = false)
    private String parent;

    public Location() {
    }

    public Location(String path, String parent) {
        this.path = path;
        this.parent = parent;
    }

    public String getPath() {
        return path;
    }

    public String getParent() {
        return parent;
    }
}

This requirement is documented in AbstractPersistentCollection:

Workaround for situations like HHH-7072. If the collection element is a component that consists entirely of nullable properties, we currently have to forcefully recreate the entire collection. See the use of hasNotNullableColumns in the AbstractCollectionPersister constructor for more info. In order to delete row-by-row, that would require SQL like "WHERE ( COL = ? OR ( COL is null AND ? is null ) )", rather than the current "WHERE COL = ?" (fails for null for most DBs). Note that the param would have to be bound twice. Until we eventually add "parameter bind points" concepts to the AST in ORM 5+, handling this type of condition is either extremely difficult or impossible. Forcing recreation isn't ideal, but not really any other option in ORM 4.