Elasticsearch Study Notes: Testing an Analyzer's Tokenization Results
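
Elasticsearch's _analyze API shows how a given piece of text is broken into tokens, which makes it easy to verify tokenizer and filter behavior before wiring an analyzer into an index mapping. The first request runs the built-in standard tokenizer on its own:
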

GET _analyze
{
  "tokenizer" : "standard",
  "text" : "this is a test 13544478956"
}
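
The standard tokenizer splits on word boundaries, so the response contains the tokens this, is, a, test and 13544478956; the phone-number-like digit string stays in one piece as a single token. The next request chains an inline length token filter after the tokenizer, so that only tokens between 1 and 3 characters long survive:
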
GET _analyze
{
  "tokenizer" : "standard",
  "filter": [{"type": "length", "min":1, "max":3 }],  
  "text" : "this is a test"
}
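
Here this and test (4 characters each) are dropped, and only is and a come back. Filters can also be referenced by name instead of being defined inline. The last request from these notes combines an (empty) char_filter stage, the standard tokenizer, and the built-in stop token filter:
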
POST _analyze
{
  "char_filter": [], 
  "tokenizer": "standard",
  "filter": [
    "stop"
  ],
  "text": "Eating an apple a day keeps doctor away"
}
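
The stop filter removes common English stopwords, so an and a disappear while the other words pass through unchanged. Note that the standard tokenizer does not lowercase, so Eating keeps its capital E.

Going one step beyond the original notes, the empty char_filter slot can be filled in. The request below is an assumed extension, not part of the source post: it uses the built-in html_strip character filter to remove tags before tokenizing, and puts a lowercase filter in front of stop so that capitalized stopwords are caught as well.

# hypothetical extension of the example above, not from the original post
POST _analyze
{
  "char_filter": ["html_strip"],
  "tokenizer": "standard",
  "filter": ["lowercase", "stop"],
  "text": "<p>Eating an apple a day keeps <b>doctor</b> away</p>"
}

With this pipeline the HTML tags never reach the tokenizer, and the response should contain eating, apple, day, keeps, doctor and away.
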
Original post: https://www.cnblogs.com/wjx-blog/p/12204515.html