Import data from a database into Solr.

Data Import

  This tutorial imports data from a MySQL database into Solr.

Create table

  The node table:

CREATE TABLE `node` (
  `id` varchar(50) NOT NULL,
  `name` varchar(50) NOT NULL COMMENT 'node name',
  `node_size` int(11) NOT NULL COMMENT 'node size',
  `creator_id` varchar(50) NOT NULL COMMENT 'creator Id',
  `parent_id` varchar(50) NOT NULL COMMENT 'parent node Id',
  `path` varchar(200) NOT NULL COMMENT 'path',
  `create_time` datetime NOT NULL COMMENT 'create time',
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='node';

  The node table has tens of thousands of records; the test uses only 300 of them.
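
  For illustration, a row in the node table could look like the following. The values below are made up for this example and are not part of the original data set:

INSERT INTO `node` (`id`, `name`, `node_size`, `creator_id`, `parent_id`, `path`, `create_time`)
VALUES ('n-0001', 'docs', 1024, 'u-1001', 'root', '/root/docs', NOW());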

Full-Import

Create a data-driven Core

  Create the Core under Solr Home; the Solr Home path is D:\Environment\apache-tomcat-7.0.37\webapps\solr\solrhome.

  The base configuration files for a Solr Core are located in solr-X.X.X\server\solr\configsets:

  basic_configs is the configuration for a basic core, and data_driven_schema_configs is the data-driven (schemaless) configuration. Copy the appropriate Core configuration and adjust its parameters to use it. For convenience, sample_techproducts_configs is copied here: data_driven_schema_configs does not contain a schema.xml file, so sample_techproducts_configs is used instead.

Configuration Steps

  1. In Tomcat's Solr home, create a new folder named comment;

  2. In apache-tomcat-7.0.37\webapps\solr\solrhome\comment, create a new folder named data;

  3. Copy solr-5.X.X\server\solr\configsets\sample_techproducts_configs\conf to apache-tomcat-7.0.37\webapps\solr\solrhome\comment, as follows:

  4. Configure solrconfig.xml

Add the dataimport request handler:

<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
	<lst name="defaults">
	    <str name="config">db-config.xml</str>
	</lst>
</requestHandler>
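
  Note that the DataImportHandler classes are not bundled in the Solr webapp itself, so the solr-dataimporthandler jar and the MySQL JDBC driver must be on the classpath, for example by copying them into apache-tomcat-7.0.37\webapps\solr\WEB-INF\lib, or by declaring a <lib> directive in solrconfig.xml. The dist path below is an assumption and depends on where the Solr distribution was unpacked:

<!-- assumed location of the Solr distribution; adjust to the actual install path -->
<lib dir="D:/Environment/solr-5.X.X/dist/" regex="solr-dataimporthandler-.*\.jar" />
<!-- the MySQL driver jar (mysql-connector-java-X.X.X.jar) can also simply be copied into WEB-INF\lib -->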

  5. Configure db-config.xml

  Here db-config.xml is placed in the same directory as solrconfig.xml.

<dataConfig>
    <dataSource type="JdbcDataSource" driver="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost:3306/solr" 
	user="solr" password="solr" batchSize="-1"/>
    <document>
        <entity name="comment" pk="id" query="select id,name,create_time,node_size,creator_id,parent_id,path from node limit 300">
			<field column="id" name="id" />
            <field column="name" name="name" />
			<field column="create_time" name="createTime"/>
			<field column="node_size" name="nodeSize" />
			<field column="creator_id" name="creatorId" />
			<field column="parent_id" path="parentId" />
			<field column="path" path="path" />
        </entity>
    </document>
</dataConfig>

  6. Configure schema.xml

<field name="_version_" type="long" indexed="true" stored="true"/>
   
   <!-- points to the root document of a block of nested documents. Required for nested
      document support, may be removed otherwise
   -->
   <field name="_root_" type="string" indexed="true" stored="false"/>

   <!-- Only remove the "id" field if you have a very good reason to. While not strictly
     required, it is highly recommended. A <uniqueKey> is present in almost all Solr 
     installations. See the <uniqueKey> declaration below where <uniqueKey> is set to "id".
     Do NOT change the type and apply index-time analysis to the <uniqueKey> as it will likely 
     make routing in SolrCloud and document replacement in general fail. Limited _query_ time
     analysis is possible as long as the indexing process is guaranteed to index the term
     in a compatible way. Any analysis applied to the <uniqueKey> should _not_ produce multiple
     tokens
   -->   
   <field name="id" type="string" indexed="true" stored="true" required="true" multiValued="false" /> 
   <field name="name" type="text_ik" indexed="true" stored="true"/>  
   <field name="nodeSize" type="int" indexed="true" stored="true"/> 
   <field name="creatorId" type="text_general" indexed="false" stored="true"/>
   <field name="parentId" type="string" indexed="false" stored="true"/>
   <field name="path" type="text_ik" indexed="true" stored="true"/>
   <field name="createTime" type="date" indexed="true" stored="true"/>

   <!-- custom field type definition -->
   <fieldType name="text_ik" class="solr.TextField">
         <analyzer type="index" class="org.wltea.analyzer.lucene.IKAnalyzer" useSmart="false"/>
         <analyzer type="query" class="org.wltea.analyzer.lucene.IKAnalyzer" useSmart="true"/>
	</fieldType>
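
  Two points worth checking here. The text_ik type assumes the IK Analyzer jar (which provides org.wltea.analyzer.lucene.IKAnalyzer) has been placed on the webapp classpath. Also, the schema.xml copied from sample_techproducts_configs should already declare the unique key as id; if that declaration is missing, add it:

<uniqueKey>id</uniqueKey>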

  7. Add the Core through the Solr admin console. The prerequisite is that the Core folder has already been created with conf (configuration folder), data (index storage folder), schema.xml (schema configuration) and solrconfig.xml (index configuration); note the grey hint text in the console screenshot.
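
  Assuming the layout implied by steps 1-3 and 7, the comment core directory would look roughly like this (schema.xml, solrconfig.xml and db-config.xml sit inside conf after the copy):

solrhome\comment
    conf\    (solrconfig.xml, schema.xml, db-config.xml, ...)
    data\    (index files, created by Solr)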

  8. After the core is created successfully, select "comment" in the Core Selector and click Dataimport.

  There are two Commands: full-import (full import) and delta-import (incremental import).

  Note: when running delta-import, do not check "clean", otherwise the previously imported data will be wiped.

  Select "comment" as the Entity and click Execute to run the full import.

  Check "Auto-Refresh Status" and the page will refresh the status automatically; the import progress is shown on the right side of the page. Otherwise, "Refresh Status" has to be clicked manually.
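
  The import can also be triggered without the admin console by calling the dataimport handler directly. Assuming Tomcat listens on port 8080 and the core is named comment, the requests would look like this:

http://localhost:8080/solr/comment/dataimport?command=full-import&clean=true&commit=true
http://localhost:8080/solr/comment/dataimport?command=status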

  The Tomcat log is as follows:

A "DataImporter Starting Full Import" entry appears, and after execution finishes "DataImporter success" is shown.

  9. The data import completes successfully, as follows:

Delta-Import
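
  Delta import is driven by two extra queries on the entity in db-config.xml: deltaQuery returns the ids of rows changed since the last import (via ${dataimporter.last_index_time}), and deltaImportQuery fetches each changed row by ${dataimporter.delta.id}. A minimal sketch, assuming create_time can be used to detect changes in the node table:

<entity name="comment" pk="id"
        query="select id,name,create_time,node_size,creator_id,parent_id,path from node"
        deltaQuery="select id from node where create_time &gt; '${dataimporter.last_index_time}'"
        deltaImportQuery="select id,name,create_time,node_size,creator_id,parent_id,path from node where id='${dataimporter.delta.id}'">
    <!-- field mappings stay the same as in the full-import entity above -->
</entity>

  The delta import can then be run from the Dataimport page by choosing delta-import (with "clean" unchecked, as noted above), or via command=delta-import on the same handler URL.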
