
Building an HBase Secondary Index with Apache Solr

Date: 2022-01-12 06:01:45


I. Installing Apache Solr

(The original installation steps were presented as screenshots, which are not reproduced here.)

II. Configuring the HBase Secondary Index

1. Enable HBase Replication

In the HBase service configuration, turn on "Enable Indexing" and "Enable Replication".

Then adjust the HBase tables accordingly:

For a newly created table:

# REPLICATION_SCOPE => 1 enables replication; 0 (the default) disables it
create 'device_safe_center_notice', {NAME => 'alert_data', VERSIONS => 3, REPLICATION_SCOPE => 1}

For an existing table:

disable 'device_safe_center_notice'
alter 'device_safe_center_notice', {NAME => 'alert_data', REPLICATION_SCOPE => 1}
enable 'device_safe_center_notice'

2. Create the Corresponding SolrCloud Collection

[root@node2 bin]# solrctl instancedir --generate /opt/module/hbase-indexer/solr_test
[root@node2 conf]# pwd
/opt/module/hbase-indexer/solr_test/conf
[root@node2 conf]# vim managed-schema

Add the following fields; the name format is "tableName_columnFamilyName_columnName":

<field name="device_alert_notice_alert_data_src_ip" type="string" indexed="true" stored="true" multiValued="true"/>
<field name="device_alert_notice_alert_data_src_port" type="string" indexed="true" stored="true" multiValued="true"/>
<field name="device_alert_notice_alert_data_dest_ip" type="string" indexed="true" stored="true" multiValued="true"/>
<field name="device_alert_notice_alert_data_dest_port" type="string" indexed="true" stored="true" multiValued="true"/>
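The schema fields above all follow the naming convention "tableName_columnFamilyName_columnName". Purely as an illustration (this helper is not part of the original setup), the convention can be captured in a small utility so field names stay consistent between the schema, the indexer mapping, and the query code:

```java
public class SolrFieldNames {
    // Builds a Solr field name following the "table_columnFamily_column"
    // convention used by the managed-schema entries above.
    public static String solrFieldName(String table, String columnFamily, String column) {
        return table + "_" + columnFamily + "_" + column;
    }

    public static void main(String[] args) {
        System.out.println(solrFieldName("device_alert_notice", "alert_data", "src_ip"));
        // → device_alert_notice_alert_data_src_ip
    }
}
```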

Enable automatic commits:

[root@node2 conf]# vim solrconfig.xml

Modify:

<autoCommit>
    <maxTime>${solr.autoCommit.maxTime:10000}</maxTime>
    <openSearcher>false</openSearcher>
</autoCommit>
<autoSoftCommit>
    <maxTime>${solr.autoSoftCommit.maxTime:1000}</maxTime>
</autoSoftCommit>

Run:

[root@node2 conf]# solrctl instancedir --create solr_test /opt/module/hbase-indexer/solr_test/

# The collection name below matters; it is referenced in the query code later
[root@node2 conf]# solrctl collection --create solr_test

Create the morphline-hbase-mapper.xml file under /opt/module/hbase-indexer/solr_test:

[root@node2 solr_test]# pwd
/opt/module/hbase-indexer/solr_test
[root@node2 solr_test]# vim morphline-hbase-mapper.xml

<?xml version="1.0"?>
<indexer table="device_alert_notice">
    <field name="device_alert_notice_alert_data_src_ip" value="alert_data:src_ip" type="string"/>
    <field name="device_alert_notice_alert_data_src_port" value="alert_data:src_port" type="string"/>
    <field name="device_alert_notice_alert_data_dest_ip" value="alert_data:dest_ip" type="string"/>
    <field name="device_alert_notice_alert_data_dest_port" value="alert_data:dest_port" type="string"/>
</indexer>
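Each <field> entry in the mapper maps a Solr field name to an HBase column reference of the form "columnFamily:qualifier". As an illustration only (this helper is not part of the indexer itself), the reference format can be made explicit:

```java
public class ColumnRef {
    final String family;
    final String qualifier;

    ColumnRef(String family, String qualifier) {
        this.family = family;
        this.qualifier = qualifier;
    }

    // Parses an HBase column reference such as "alert_data:src_ip"
    // into its column family and qualifier parts.
    static ColumnRef parse(String value) {
        int sep = value.indexOf(':');
        if (sep < 0) {
            throw new IllegalArgumentException("expected columnFamily:qualifier, got: " + value);
        }
        return new ColumnRef(value.substring(0, sep), value.substring(sep + 1));
    }

    public static void main(String[] args) {
        ColumnRef ref = ColumnRef.parse("alert_data:src_ip");
        System.out.println(ref.family + " / " + ref.qualifier); // → alert_data / src_ip
    }
}
```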

Run:

[root@node2 solr_test]# hbase-indexer add-indexer \
  -n solr_test_indexer \
  -c /opt/module/hbase-indexer/solr_test/morphline-hbase-mapper.xml \
  -cp solr.zk=node1,node2,node3:2181/solr \
  -cp solr.collection=solr_test

At this point, new data written to HBase is automatically indexed into Solr. However, data that already existed in HBase before the indexer was added must be indexed manually in batch.

3. Batch-Index Existing Data

[root@node2 jars]# hadoop jar /opt/cloudera/parcels/CDH/jars/hbase-indexer-mr-1.5-cdh6.3.2-job.jar \
  --hbase-indexer-zk node1,node2,node3:2181 \
  --hbase-indexer-name solr_test_indexer \
  --reducers 0

III. Querying HBase Through the Solr Secondary Index

1. SolrQueryUtil.java

Add the pom dependencies:

<!-- /artifact/org.apache.solr/solr-solrj -->
<dependency>
    <groupId>org.apache.solr</groupId>
    <artifactId>solr-solrj</artifactId>
    <version>7.4.0</version>
</dependency>
<!-- /artifact/org.apache.solr/solr-core -->
<dependency>
    <groupId>org.apache.solr</groupId>
    <artifactId>solr-core</artifactId>
    <version>7.4.0</version>
</dependency>

Code:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;

import java.util.Collections;
import java.util.Optional;

/**
 * @Author: fyq
 * @Date: Create in 11:03 /6/30
 * @Desc: Utility class that matches conditions against the secondary index in Solr,
 *        then fetches the corresponding rows from HBase.
 */
public class SolrQueryUtil {
    public static void main(String[] args) {
        try {
            long startTime = System.currentTimeMillis();
            CloudSolrClient cloudSolrClient = new CloudSolrClient.Builder(
                    Collections.singletonList("192.168.18.211:2181,192.168.18.212:2181,192.168.18.213:2181"),
                    Optional.of("/solr")).build();
            long middleTime = System.currentTimeMillis();
            // Query string; the parameter format is "tableName_columnFamilyName_columnName:value"
            SolrQuery query = new SolrQuery("device_alert_notice_alert_data_dest_ip:163.402.777.809");
            // Number of rows to return
            query.setRows(20);
            // "solr_test" is the collection created for the secondary index
            QueryResponse response = cloudSolrClient.query("solr_test", query);
            long stopTime = System.currentTimeMillis();
            System.out.println("======== connect solr total use time : " + (stopTime - startTime));
            System.out.println("======== solr query use time : " + (stopTime - middleTime));
            for (SolrDocument result : response.getResults()) {
                // The document id is the HBase row key; use it to fetch the row via Phoenix
                System.out.println(result.get("id"));
                System.out.println(PhoenixGetDataUtil.getData((String) result.get("id")));
            }
            System.out.println(response.getResults().getNumFound());
            System.out.println(response.toString());
            cloudSolrClient.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
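The query string passed to SolrQuery above is built by hand. When the value comes from user input, Solr query syntax characters (`:`, `*`, `?`, etc.) should be escaped; SolrJ ships `ClientUtils.escapeQueryChars` for this, and a minimal standalone sketch of the same idea (this helper is an assumption, not part of the original code) looks like:

```java
public class SolrQueryBuilder {
    // Escapes Solr query syntax characters, mirroring the idea behind
    // org.apache.solr.client.solrj.util.ClientUtils.escapeQueryChars.
    public static String escape(String value) {
        StringBuilder sb = new StringBuilder();
        for (char c : value.toCharArray()) {
            if ("\\+-!():^[]\"{}~*?|&;/ ".indexOf(c) >= 0) {
                sb.append('\\');
            }
            sb.append(c);
        }
        return sb.toString();
    }

    // Builds a "table_columnFamily_column:value" query term as used above.
    public static String fieldQuery(String field, String value) {
        return field + ":" + escape(value);
    }

    public static void main(String[] args) {
        System.out.println(fieldQuery("device_alert_notice_alert_data_dest_ip", "163.402.777.809"));
        // → device_alert_notice_alert_data_dest_ip:163.402.777.809
    }
}
```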

2. PhoenixGetDataUtil.java

Add the pom dependencies:

<dependency>
    <groupId>org.apache.phoenix</groupId>
    <artifactId>phoenix-core</artifactId>
    <version>5.0.0-HBase-2.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.glassfish</groupId>
            <artifactId>javax.el</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.glassfish</groupId>
    <artifactId>javax.el</artifactId>
    <version>3.0.1-b06</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.0.0</version>
</dependency>

Code:

import java.sql.*;
import java.util.Properties;

/**
 * @Author: fyq
 * @Date: Create in 15:24 /6/24
 * @Desc: Utility class that queries HBase data through the Phoenix JDBC driver.
 */
public class PhoenixGetDataUtil {
    public static String getData(String id) {
        try {
            String driver = "org.apache.phoenix.jdbc.PhoenixDriver";
            Class.forName(driver);
            String url = "jdbc:phoenix:192.168.18.211,192.168.18.212,192.168.18.213:2181";
            // Client-side settings must match the server-side configuration
            Properties props = new Properties();
            props.put("phoenix.schema.isNamespaceMappingEnabled", "true");
            props.setProperty("phoenix.query.timeoutMs", "1200000");
            props.setProperty("hbase.rpc.timeout", "1000");
            props.setProperty("hbase.client.scanner.timeout.period", "1200000");
            long startTime = System.currentTimeMillis();
            Connection connection = DriverManager.getConnection(url, props);
            long middleTime = System.currentTimeMillis();
            // Query the row by its key
            PreparedStatement pstmt = connection.prepareStatement(
                    "select \"src_ip\",\"dest_ip\" from \"device_alert_notice\" where \"ID\" = ?");
            pstmt.setString(1, id);
            ResultSet resultSet = pstmt.executeQuery();
            long stopTime = System.currentTimeMillis();
            System.out.println("============== connect hbase total use time :" + (stopTime - startTime));
            System.out.println("============== hbase query use time :" + (stopTime - middleTime));
            String result = null;
            while (resultSet.next()) {
                result = "src_ip : " + resultSet.getString(1) + " dest_ip : " + resultSet.getString(2);
            }
            pstmt.close();
            connection.close();
            return result;
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }
}
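Note the double quotes around the table and column names in the SQL above: Phoenix folds unquoted identifiers to upper case, so lowercase names created directly in HBase must be quoted to match. A tiny hypothetical helper (not part of the original code) makes this convention explicit:

```java
public class PhoenixIdent {
    // Phoenix upper-cases unquoted identifiers, so lowercase HBase
    // table/column names must be wrapped in double quotes in SQL.
    public static String quote(String identifier) {
        // Escape embedded double quotes by doubling them, per SQL convention.
        return "\"" + identifier.replace("\"", "\"\"") + "\"";
    }

    public static void main(String[] args) {
        String sql = "select " + quote("src_ip") + "," + quote("dest_ip")
                + " from " + quote("device_alert_notice") + " where " + quote("ID") + " = ?";
        System.out.println(sql);
        // → select "src_ip","dest_ip" from "device_alert_notice" where "ID" = ?
    }
}
```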
