
Common Hive Operations

Date: 2018-07-28

1. Loading Data

1.1 Create a Table and a Text File
create table fantj.t3(id int,name string,age int) row format delimited fields terminated by ',' stored as textfile;
hive> create table fantj.t3(id int,name string,age int) row format delimited fields terminated by ',' stored as textfile;
OK
Time taken: 4.467 seconds
hive> select * from fantj.t3;
OK
Time taken: 2.82 seconds

The row format clause states that fields in each row are delimited by commas.
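If you want to double-check the delimiter and storage format after creating the table, Hive's DESCRIBE FORMATTED and SHOW CREATE TABLE commands will show them; for example:

DESCRIBE FORMATTED fantj.t3;   -- the field delimiter appears under Storage Desc Params (field.delim)
SHOW CREATE TABLE fantj.t3;    -- prints the full DDL, including the ROW FORMAT and STORED AS clauses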

Create a text file test.txt.

I created it in the /home/fantj directory.

1,jiao,18
2,fantj,20
3,laowang,30
4,laotie,40
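One way to create the file from the shell (assuming you are working in /home/fantj as in the original) is a simple heredoc:

cat > /home/fantj/test.txt <<'EOF'
1,jiao,18
2,fantj,20
3,laowang,30
4,laotie,40
EOF
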
1.2 Load from the Local Filesystem into Hive

LOAD DATA LOCAL INPATH '/home/fantj/test.txt' OVERWRITE INTO TABLE t3;

hive> LOAD DATA LOCAL INPATH '/home/fantj/test.txt' OVERWRITE INTO TABLE fantj.t3;
Loading data to table fantj.t3
[Warning] could not update stats.
OK
Time taken: 26.334 seconds

select * from fantj.t3;

hive> select * from fantj.t3;
OK
1	jiao	18
2	fantj	20
3	laowang	30
4	laotie	40
Time taken: 2.303 seconds, Fetched: 4 row(s)

The import succeeded!
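Note that OVERWRITE replaces any rows already in the table. If you want to append instead, the same statement without the OVERWRITE keyword keeps the existing data; a minimal sketch:

-- Append the file's rows to t3 instead of replacing the existing ones
LOAD DATA LOCAL INPATH '/home/fantj/test.txt' INTO TABLE fantj.t3;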

1.3 Load from HDFS into Hive
First, upload the test file to HDFS:

[root@s166 fantj]# hadoop fs -put test.txt /hdfs2hive

-rw-r--r-- 3 root supergroup 46 /hdfs2hive/test.txt 
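If /hdfs2hive does not exist yet on your cluster, you can create it first and then verify the upload; this is a small sketch with standard hadoop fs commands, not part of the original session:

hadoop fs -mkdir -p /hdfs2hive     # create the target directory if it is missing
hadoop fs -put test.txt /hdfs2hive # upload the local file
hadoop fs -cat /hdfs2hive/test.txt # print the contents to verify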
Enter Hive and create table t5:
create table fantj.t5(id int,name string,age int) row format delimited fields terminated by ',' stored as textfile;
hive> create table fantj.t5(id int,name string,age int) row format delimited fields terminated by ',' stored as textfile;
OK
Time taken: 3.214 seconds
Run the load:

LOAD DATA INPATH '/hdfs2hive/test.txt' OVERWRITE INTO TABLE fantj.t5;

hive> LOAD DATA INPATH '/hdfs2hive/test.txt' OVERWRITE INTO TABLE fantj.t5;
Loading data to table fantj.t5
[Warning] could not update stats.
OK
Time taken: 25.498 seconds

Check whether it succeeded:

hive> select * from fantj.t5;
OK
1	jiao	18
2	fantj	20
3	laowang	30
4	laotie	40
Time taken: 5.046 seconds, Fetched: 4 row(s)
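Unlike the LOCAL variant, loading from an HDFS path moves the source file into the table's storage location rather than copying it. Assuming the default warehouse layout (the /user/hive/warehouse path below is the Hive default and was not shown in the original output), you can confirm this from the shell:

hadoop fs -ls /hdfs2hive                       # test.txt is no longer here after the load
hadoop fs -ls /user/hive/warehouse/fantj.db/t5 # the file now sits under the table's directory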
Original article: https://yq.aliyun.com/articles/650210