Solr 7.1: Importing Data from a Database and Setting Up a Chinese Tokenizer

This post only shows how to get the import working; do not deploy your Solr service like this in production.

First, modify the solrconfig.xml file.

Back up the _default folder.

Edit solrconfig.xml.

Add the following content:

 

The official example:
<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
    <lst name="defaults">
        <str name="config">/path/to/my/DIHconfigfile.xml</str>
    </lst>
</requestHandler>

 

The result:

 

Create a file named db-data-config.xml in the conf directory:

 

<dataConfig>
    <dataSource driver="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost:3306/demo" user="root" password="123" />
    <document>
        <entity name="bless" query="select * from bless"
                deltaQuery="select bless_id from bless where bless_time > '${dataimporter.last_index_time}'">
            <field column="BLESS_ID" name="blessId" />
            <field column="BLESS_CONTENT" name="blessContent" />
            <field column="BLESS_TIME" name="blessTime" />
        </entity>
    </document>
</dataConfig>
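The config entry in the request handler added above has to point at this file. A minimal sketch of what that would look like here (the path is resolved relative to the core's conf directory; db-data-config.xml is the file just created):

<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
    <lst name="defaults">
        <!-- DIH configuration file created above, relative to the conf directory -->
        <str name="config">db-data-config.xml</str>
    </lst>
</requestHandler>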

 

 

My database:

 

 

Copy the required jars

Locate the solr-dataimporthandler jar (it ships in the dist directory of the Solr distribution):

Copy it, together with the MySQL JDBC driver jar, into the Solr webapp's lib directory.
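As an alternative to copying jars into the webapp, solrconfig.xml also supports lib directives that load jars when the core starts. A rough sketch, assuming the standard Solr 7.1 directory layout and a hypothetical location for the MySQL driver (adjust dir and regex to wherever your jars actually live):

<!-- DataImportHandler jar shipped in the dist directory -->
<lib dir="${solr.install.dir:../../../..}/dist/" regex="solr-dataimporthandler-.*\.jar" />
<!-- MySQL JDBC driver, here assumed to sit in a lib folder next to the core's conf directory -->
<lib dir="./lib" regex="mysql-connector-java-.*\.jar" />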

 

Next, locate the bundled Chinese analyzer jar (lucene-analyzers-smartcn).

Copy it into the webapp's lib directory.
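If you went the lib-directive route from the previous step instead of copying, the equivalent entry for this jar would look roughly like this (the contrib path is my assumption about where the smartcn jar sits in the 7.1 distribution; verify it on your install):

<lib dir="${solr.install.dir:../../../..}/contrib/analysis-extras/lucene-libs/" regex="lucene-analyzers-smartcn-.*\.jar" />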

 

Edit managed-schema

Add the following Chinese analyzer configuration at the end:

    <!-- ChineseAnalyzer -->
    <fieldType name="solr_cnAnalyzer" class="solr.TextField" positionIncrementGap="100">
      <analyzer type="index">
        <tokenizer class="org.apache.lucene.analysis.cn.smart.HMMChineseTokenizerFactory"/>
      </analyzer>
      <analyzer type="query">
        <tokenizer class="org.apache.lucene.analysis.cn.smart.HMMChineseTokenizerFactory"/>
      </analyzer>
    </fieldType>
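The fieldType above only defines how text is analyzed; a field still has to reference it. A minimal sketch, assuming the content column ends up in a field called bless_content as configured later in this post:

<!-- index the blessing text with the SmartCN analyzer defined above -->
<field name="bless_content" type="solr_cnAnalyzer" indexed="true" stored="true"/>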

 

 

Next, start Solr in cloud mode.

During the whole interactive setup, the only thing you need to type is the name of the collection; for everything else, just press Enter.

D:\>cd solr-7.1.0

D:\solr-7.1.0>bin\solr start -e cloud

Welcome to the SolrCloud example!

This interactive session will help you launch a SolrCloud cluster on your local
workstation.
To begin, how many Solr nodes would you like to run in your local cluster? (specify 1-4 nodes) [2]:
[Enter]
Ok, let's start up 2 Solr nodes for your example SolrCloud cluster.
Please enter the port for node1 [8983]:
[Enter]
Please enter the port for node2 [7574]:
[Enter]
Solr home directory D:\solr-7.1.0\example\cloud\node1\solr already exists.
D:\solr-7.1.0\example\cloud\node2 already exists.

Starting up Solr on port 8983 using command:
"D:\solr-7.1.0\bin\solr.cmd" start -cloud -p 8983 -s "D:\solr-7.1.0\example\clou
d\node1\solr"

Waiting up to 30 to see Solr running on port 8983

Starting up Solr on port 7574 using command:
"D:\solr-7.1.0\bin\solr.cmd" start -cloud -p 7574 -s "D:\solr-7.1.0\example\clou
d\node2\solr" -z localhost:9983

Started Solr server on port 8983. Happy searching!
Waiting up to 30 to see Solr running on port 7574
INFO  - 2017-11-04 12:35:02.823; org.apache.solr.client.solrj.impl.ZkClientClusterStateProvider; Cluster at localhost:9983 ready

Now let's create a new collection for indexing documents in your 2-node cluster.

Please provide a name for your new collection: [gettingstarted]
Started Solr server on port 7574. Happy searching!
bless  [type the collection name and press Enter]
How many shards would you like to split bless into? [2]
[Enter]
How many replicas per shard would you like to create? [2]
[Enter]
Please choose a configuration for the bless collection, available options are:
_default or sample_techproducts_configs [_default]
[Enter]
Created collection 'bless' with 2 shard(s), 2 replica(s) with config-set 'bless'


Enabling auto soft-commits with maxTime 3 secs using the Config API

POSTing request to Config API: http://localhost:8983/solr/bless/config
{"set-property":{"updateHandler.autoSoftCommit.maxTime":"3000"}}
Successfully set-property updateHandler.autoSoftCommit.maxTime to 3000


SolrCloud example running, please visit: http://localhost:8983/solr


D:\solr-7.1.0>

 

Now visit the admin console at http://localhost:8983/solr.

Select bless.

Then select Schema to configure the fields. [Note: the field names here must match the column names in the database exactly!]

bless_id

bless_content

 

bless_time

 

 

Click DataImport.

Remember to check Auto-Refresh Status.
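If you would rather not click through the UI, a full import can also be triggered directly against the handler registered earlier, for example (assuming the collection is named bless, as above):

http://localhost:8983/solr/bless/dataimport?command=full-import&clean=true&commit=true

Here clean=true wipes the index before importing and commit=true commits once the import finishes; drop clean=true if you only want to add to the existing index.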

 

Now click Query. You can see that all the rows from the database have been imported.

 

Next, let's look at the Chinese tokenization.

The analysis looks good. Let's try an actual query.

It returns 0 documents, but there should be at least one! However, once I specify the default search field, the results do come back.

Try searching for 心.
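The empty result for a bare query happens because the default search field (the df parameter), which in the _default configset is typically _text_, is not the Chinese-analyzed field. Either qualify the field in the query or set df explicitly, for example (assuming the field names used above):

q=bless_content:心
or
q=心&df=bless_content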

 

If you want the database file for your own testing, you can get it from GitHub.
