Indexing Twitter data into Elasticsearch: Limit of total fields [1000] in index has been exceeded
This limit was introduced in the following GitHub issue.
The command grep type | wc -l
counts the number of lines containing the text "type", so there is a chance the count is inaccurate. In a small test I got a higher value than the actual number of fields. You could also get fewer than the actual number of fields, although I can't think of a scenario for that yet.
Here's the test I did.
curl -s -XGET http://localhost:9200/stackoverflow/_mapping?pretty
{
  "stackoverflow" : {
    "mappings" : {
      "os" : {
        "properties" : {
          "NAME" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "TITLE" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            },
            "fielddata" : true
          },
          "title" : {
            "type" : "text",
            "fielddata" : true
          }
        }
      }
    }
  }
}
Since the "type" is there in 5 lines I get the output as 5 even though I only have 3 fields.
Can you try increasing the limit and see if it works?
PUT my_index/_settings
{
  "index.mapping.total_fields.limit": 2000
}
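If you are calling the API with curl rather than a console, the same settings update would look something like this (my_index is a placeholder for your actual index name):
curl -XPUT http://localhost:9200/my_index/_settings -H 'Content-Type: application/json' -d '
{
  "index.mapping.total_fields.limit": 2000
}'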
You can also increase this limit during index creation.
PUT my_index
{
  "settings": {
    "index.mapping.total_fields.limit": 2000,
    "number_of_shards": 1,
    "number_of_replicas": 0
  },
  "mappings": {
    ...
  }
}
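The curl form of that create-index call, with the same settings (mappings omitted here), would be roughly:
curl -XPUT http://localhost:9200/my_index -H 'Content-Type: application/json' -d '
{
  "settings": {
    "index.mapping.total_fields.limit": 2000,
    "number_of_shards": 1,
    "number_of_replicas": 0
  }
}'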
Credits: https://discuss.elastic.co/t/total-fields-limit-setting/53004/2