RHive

R + Hive = RHive

Original article: http://blog.fens.me/nosql-r-hive/

From the "R and NoSQL" series: Hive


  1. Hive introduction
  2. Hive installation
  3. RHive installation
  4. The RHive function library
  5. Basic RHive operations

1. Hive introduction


Hive is a data warehouse infrastructure built on top of Hadoop. It provides a set of tools for extract, transform, and load (ETL) work, and a mechanism for storing, querying, and analyzing large-scale data kept in Hadoop. Hive defines a simple SQL-like query language, called HQL, which lets users who already know SQL query the data. The language also lets developers who are familiar with MapReduce plug in custom mappers and reducers to handle complex analysis that the built-in mappers and reducers cannot accomplish.

Hive does not impose a particular data format. Hive works well on top of Thrift, controls delimiters, and also allows users to specify their own data formats.

The content above is adapted from Baidu Baike (http://baike.baidu.com/view/699292.htm).

Differences between Hive and a relational database:

  • Data storage: Hive stores data in Hadoop's HDFS, while a relational database uses the local file system.
  • Computation model: Hive computes with Hadoop MapReduce, while a relational database relies on an index-based, in-memory computation model.
  • Use cases: Hive is an OLAP data warehouse system for querying massive data sets, with poor real-time performance; a relational database is an OLTP transaction system serving real-time queries.
  • Scalability: built on Hadoop, Hive can easily grow its storage and compute capacity by adding nodes; a relational database is hard to scale horizontally and mostly has to scale up a single machine.

2. Hive installation


Hive is a data warehouse product built on top of Hadoop, so we first need a working Hadoop environment.


For the Hadoop installation, see the earlier posts on setting up a Hadoop environment and creating a Hadoop base virtual machine.

 

For the Hive installation, see: the Hive installation and usage guide.

Download address for Hadoop-1.0.3:
http://archive.apache.org/dist/hadoop/core/hadoop-1.0.3/

Download address for Hive-0.9.0:
http://archive.apache.org/dist/hive/hive-0.9.0/

 

After Hive is installed, start the hiveserver service (the Hive Thrift server that RHive will connect to later):

~ nohup hive --service hiveserver  &
Starting Hive Thrift Server

打开hive shell

~ hive shell
Logging initialized using configuration in file:/home/conan/hadoop/hive-0.9.0/conf/hive-log4j.properties
Hive history file=/tmp/conan/hive_job_log_conan_201306261459_153868095.txt

# list the Hive tables
hive> show tables;
hive_algo_t_account
o_account
r_t_account
Time taken: 2.12 seconds

# view the data in the o_account table
hive> select * from o_account;
1       abc@163.com     2013-04-22 12:21:39
2       dedac@163.com   2013-04-22 12:21:39
3       qq8fed@163.com  2013-04-22 12:21:39
4       qw1@163.com     2013-04-22 12:21:39
5       af3d@163.com    2013-04-22 12:21:39
6       ab34@163.com    2013-04-22 12:21:39
7       q8d1@gmail.com  2013-04-23 09:21:24
8       conan@gmail.com 2013-04-23 09:21:24
9       adeg@sohu.com   2013-04-23 09:21:24
10      ade121@sohu.com 2013-04-23 09:21:24
11      addde@sohu.com  2013-04-23 09:21:24
Time taken: 0.469 seconds

3. RHive installation

Make sure a Java environment is in place beforehand:

~ java -version
java version "1.6.0_29"
Java(TM) SE Runtime Environment (build 1.6.0_29-b11)
Java HotSpot(TM) 64-Bit Server VM (build 20.4-b02, mixed mode)

Install R: on Ubuntu 12.04, add the CRAN mirror to the APT sources, update, and install R 2.15.3:

~ sudo sh -c "echo deb http://mirror.bjtu.edu.cn/cran/bin/linux/ubuntu precise/ >>/etc/apt/sources.list"
~ sudo apt-get update
~ sudo apt-get install r-base-core=2.15.3-1precise0precise1

Install the R dependency: rJava

# configure rJava
~ sudo R CMD javareconf

# start R
~ sudo R
install.packages("rJava")



Install RHive:

install.packages("RHive")
library(RHive)
Loading required package: rJava
Loading required package: Rserve
This is RHive 0.0-7. For overview type ‘?RHive’.
HIVE_HOME=/home/conan/hadoop/hive-0.9.0
call rhive.init() because HIVE_HOME is set.

Note: RHive has since been removed from CRAN, so it has to be installed manually. Download the RHive_0.0-7.tar.gz package from https://cran.r-project.org/src/contrib/Archive/RHive/ and install it from the command line:

# install RHive
~ R CMD INSTALL RHive_0.0-7.tar.gz

4. The RHive function library

The functions provided by the RHive package:

rhive.aggregate        rhive.connect          rhive.hdfs.exists      rhive.mapapply
rhive.assign           rhive.desc.table       rhive.hdfs.get         rhive.mrapply
rhive.basic.by         rhive.drop.table       rhive.hdfs.info        rhive.napply
rhive.basic.cut        rhive.env              rhive.hdfs.ls          rhive.query
rhive.basic.cut2       rhive.exist.table      rhive.hdfs.mkdirs      rhive.reduceapply
rhive.basic.merge      rhive.export           rhive.hdfs.put         rhive.rm
rhive.basic.mode       rhive.exportAll        rhive.hdfs.rename      rhive.sapply
rhive.basic.range      rhive.hdfs.cat         rhive.hdfs.rm          rhive.save
rhive.basic.scale      rhive.hdfs.chgrp       rhive.hdfs.tail        rhive.script.export
rhive.basic.t.test     rhive.hdfs.chmod       rhive.init             rhive.script.unexport
rhive.basic.xtabs      rhive.hdfs.chown       rhive.list.tables      rhive.size.table
rhive.big.query        rhive.hdfs.close       rhive.load             rhive.write.table
rhive.block.sample     rhive.hdfs.connect     rhive.load.table
rhive.close            rhive.hdfs.du          rhive.load.table2

A side-by-side comparison of basic Hive and RHive operations:

# connect to Hive
Hive:  hive shell
RHive: rhive.connect("192.168.1.210")

# list all Hive tables
Hive:  show tables;
RHive: rhive.list.tables()

# describe a table
Hive:  desc o_account;
RHive: rhive.desc.table('o_account'), rhive.desc.table('o_account',TRUE)

# run an HQL query
Hive:  select * from o_account;
RHive: rhive.query('select * from o_account')

# list an HDFS directory
Hive:  dfs -ls /;
RHive: rhive.hdfs.ls()

# view the contents of an HDFS file
Hive:  dfs -cat /user/hive/warehouse/o_account/part-m-00000;
RHive: rhive.hdfs.cat('/user/hive/warehouse/o_account/part-m-00000')

# close the connection
Hive:  quit;
RHive: rhive.close()
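
To show how the RHive calls in this table compose in an ordinary R script, here is a minimal sketch of a small helper (the run_hql name is made up for illustration; the host 192.168.1.210 and the o_account table are the ones used throughout this post):

# a minimal sketch: connect, run one HQL query, and always close the connection
library(RHive)

run_hql <- function(sql, host = "192.168.1.210") {
  rhive.init()               # initialize RHive (reads HIVE_HOME / HADOOP_HOME)
  rhive.connect(host)        # connect to the hiveserver (Thrift) service
  on.exit(rhive.close())     # close the connection even if the query fails
  rhive.query(sql)           # rhive.query() returns the result as a data.frame
}

accounts <- run_hql("select * from o_account")
head(accounts)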

5. Basic RHive operations

# initialize RHive
rhive.init()

# connect to Hive
rhive.connect("192.168.1.210")

# list all tables
rhive.list.tables()
             tab_name
1 hive_algo_t_account
2           o_account
3         r_t_account

# describe the table
rhive.desc.table('o_account');
     col_name data_type comment
1          id       int
2       email    string
3 create_date    string

# run an HQL query
rhive.query("select * from o_account");
   id           email         create_date
1   1     abc@163.com 2013-04-22 12:21:39
2   2   dedac@163.com 2013-04-22 12:21:39
3   3  qq8fed@163.com 2013-04-22 12:21:39
4   4     qw1@163.com 2013-04-22 12:21:39
5   5    af3d@163.com 2013-04-22 12:21:39
6   6    ab34@163.com 2013-04-22 12:21:39
7   7  q8d1@gmail.com 2013-04-23 09:21:24
8   8 conan@gmail.com 2013-04-23 09:21:24
9   9   adeg@sohu.com 2013-04-23 09:21:24
10 10 ade121@sohu.com 2013-04-23 09:21:24
11 11  addde@sohu.com 2013-04-23 09:21:24

# close the connection
rhive.close()
[1] TRUE

Create a temporary table

rhive.block.sample('o_account', subset="id<5")
[1] "rhive_sblk_1372238856"

rhive.query("select * from rhive_sblk_1372238856");
  id          email         create_date
1  1    abc@163.com 2013-04-22 12:21:39
2  2  dedac@163.com 2013-04-22 12:21:39
3  3 qq8fed@163.com 2013-04-22 12:21:39
4  4    qw1@163.com 2013-04-22 12:21:39

# list the temporary table's files on HDFS
rhive.hdfs.ls('/user/hive/warehouse/rhive_sblk_1372238856/')
  permission owner      group length      modify-time
1  rw-r--r-- conan supergroup    141 2013-06-26 17:28
                                                 file
1 /user/hive/warehouse/rhive_sblk_1372238856/000000_0

rhive.hdfs.cat('/user/hive/warehouse/rhive_sblk_1372238856/000000_0')
1abc@163.com2013-04-22 12:21:39
2dedac@163.com2013-04-22 12:21:39
3qq8fed@163.com2013-04-22 12:21:39
4qw1@163.com2013-04-22 12:21:39
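
The fields appear glued together above because Hive's default text-format field delimiter is the non-printing control character \001 (Ctrl-A), which the terminal does not display. A minimal sketch of splitting such a row in R (the sample string below is made up to mirror the first row):

# split one Hive text-format row on the default \001 field delimiter
row <- "1\001abc@163.com\0012013-04-22 12:21:39"   # hypothetical sample row
strsplit(row, "\001", fixed = TRUE)[[1]]
# [1] "1"  "abc@163.com"  "2013-04-22 12:21:39"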

Bin a column's data into specified ranges

rhive.basic.cut('o_account','id',breaks='0:100:3')
[1] "rhive_result_20130626173626"
attr(,"result:size")
[1] 443

rhive.query("select * from rhive_result_20130626173626");
             email         create_date     id
1      abc@163.com 2013-04-22 12:21:39  (0,3]
2    dedac@163.com 2013-04-22 12:21:39  (0,3]
3   qq8fed@163.com 2013-04-22 12:21:39  (0,3]
4      qw1@163.com 2013-04-22 12:21:39  (3,6]
5     af3d@163.com 2013-04-22 12:21:39  (3,6]
6     ab34@163.com 2013-04-22 12:21:39  (3,6]
7   q8d1@gmail.com 2013-04-23 09:21:24  (6,9]
8  conan@gmail.com 2013-04-23 09:21:24  (6,9]
9    adeg@sohu.com 2013-04-23 09:21:24  (6,9]
10 ade121@sohu.com 2013-04-23 09:21:24 (9,12]
11  addde@sohu.com 2013-04-23 09:21:24 (9,12]
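
Judging from the output, rhive.basic.cut() does the binning of the id column inside Hive and leaves the result in a new table. For comparison, a minimal sketch of the same binning done client-side with base R's cut(), assuming the table has first been pulled down with rhive.query():

# client-side equivalent of the binning above, using base R cut()
acc <- rhive.query("select * from o_account")
acc$id <- cut(acc$id, breaks = seq(0, 12, by = 3))   # yields (0,3], (3,6], (6,9], (9,12]
head(acc)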

Operating on HDFS through RHive

# list the HDFS root directory
rhive.hdfs.ls()
  permission owner      group length      modify-time   file
1  rwxr-xr-x conan supergroup      0 2013-04-24 01:52 /hbase
2  rwxr-xr-x conan supergroup      0 2013-06-23 10:59  /home
3  rwxr-xr-x conan supergroup      0 2013-06-26 11:18 /rhive
4  rwxr-xr-x conan supergroup      0 2013-06-23 13:27   /tmp
5  rwxr-xr-x conan supergroup      0 2013-04-24 19:28  /user

# view the contents of an HDFS file
rhive.hdfs.cat('/user/hive/warehouse/o_account/part-m-00000')
1abc@163.com2013-04-22 12:21:39
2dedac@163.com2013-04-22 12:21:39
3qq8fed@163.com2013-04-22 12:21:39