Installing and Using Analyzers

Introduction to Analyzers

An analyzer (tokenizer) breaks text into the tokens that Elasticsearch indexes and searches on. Elasticsearch ships with a built-in standard analyzer, which works reasonably well for English but has no concept of Chinese words.

Installing and Using the IK Analyzer

First, see how the built-in standard analyzer handles English and Chinese text via the _analyze API:

POST _analyze
{
  "analyzer": "standard",
  "text" : "hello imooc"
}

POST _analyze
{
  "analyzer": "standard",
  "text" : "我是中国人"
}
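
For the Chinese sentence, the standard analyzer falls back to splitting the text character by character, so the response looks roughly like this (abridged, offsets shown for illustration) — not useful for Chinese full-text search:

{
  "tokens": [
    { "token": "我", "start_offset": 0, "end_offset": 1, "type": "<IDEOGRAPHIC>", "position": 0 },
    { "token": "是", "start_offset": 1, "end_offset": 2, "type": "<IDEOGRAPHIC>", "position": 1 },
    { "token": "中", "start_offset": 2, "end_offset": 3, "type": "<IDEOGRAPHIC>", "position": 2 },
    { "token": "国", "start_offset": 3, "end_offset": 4, "type": "<IDEOGRAPHIC>", "position": 3 },
    { "token": "人", "start_offset": 4, "end_offset": 5, "type": "<IDEOGRAPHIC>", "position": 4 }
  ]
}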

 

To get proper Chinese word segmentation, install the IK analyzer plugin. Download page:

https://github.com/medcl/elasticsearch-analysis-ik/releases
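
A minimal install sketch, assuming a zip-based install into the Elasticsearch home directory (the version in the URL is a placeholder — the plugin version must exactly match your Elasticsearch version):

# Option 1: install with the plugin tool (release URL/version are examples)
bin/elasticsearch-plugin install https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v7.6.2/elasticsearch-analysis-ik-7.6.2.zip

# Option 2: download the release zip manually and unpack it into the plugins directory
mkdir -p plugins/ik
unzip elasticsearch-analysis-ik-7.6.2.zip -d plugins/ik

# Restart Elasticsearch afterwards so the plugin is loaded

After the restart, the plugin registers two analyzers, ik_smart and ik_max_word, which the following requests exercise: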

 

POST _analyze
{
  "analyzer": "ik_smart",
  "text" : "我是中国人"
}

POST _analyze
{
  "analyzer": "ik_max_word",
  "text" : "我是中国人"
}
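
With the default IK dictionary, the two analyzers typically produce the following tokens for this sentence (exact output can vary by plugin version):

ik_smart    : 我 / 是 / 中国人
ik_max_word : 我 / 是 / 中国人 / 中国 / 国人

ik_smart does the coarsest-grained split, while ik_max_word exhaustively emits every word it can find, which generally gives better recall when used as an index-time analyzer.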

POST _analyze
{
  "analyzer": "ik_max_word",
  "text" : "我是慕课网"
}

The default IK dictionary does not treat 慕课网 as a word, so the request above splits it into smaller pieces. After adding 慕课网 to a custom dictionary and restarting Elasticsearch, the same request returns 慕课网 as a single token; a sketch of the configuration follows.
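
One way to add the word, sketched under the assumption that the plugin was unpacked into plugins/ik (the config path differs slightly depending on how IK was installed): create a dictionary file, reference it from IKAnalyzer.cfg.xml, then restart Elasticsearch.

# plugins/ik/config/custom.dic — UTF-8, one word per line (file name is an example)
慕课网

<!-- plugins/ik/config/IKAnalyzer.cfg.xml -->
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
    <comment>IK Analyzer extension configuration</comment>
    <!-- extension dictionary, path relative to this config directory -->
    <entry key="ext_dict">custom.dic</entry>
    <!-- extension stop-word dictionary, left empty here -->
    <entry key="ext_stopwords"></entry>
</properties>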

Original article (Chinese): https://www.cnblogs.com/longronglang/p/12019393.html