Importing HDFS data into Hive with NiFi
The overall flow (roughly ListHDFS → FetchHDFS → EvaluateJsonPath → RouteOnContent → ReplaceText → MergeContent → PutHiveQL):

1. ListHDFS & FetchHDFS:
ListHDFS: periodically lists the files under an HDFS directory and emits one FlowFile per file (attributes only, no content yet).
FetchHDFS: reads the content of each listed file into its FlowFile so the downstream processors can parse it.
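A minimal property sketch for the two processors. The directory and Hadoop config paths below are placeholders, not values from the original post:

ListHDFS
    Hadoop Configuration Resources : /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
    Directory                      : /data/dtu/json
    Recurse Subdirectories         : true

FetchHDFS
    Hadoop Configuration Resources : /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
    HDFS Filename                  : ${path}/${filename}

ListHDFS keeps state, so only files it has not seen before are emitted on each run.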
2. EvaluateJsonPath:
EvaluateJsonPath pulls fields out of the JSON content and stores them as FlowFile attributes. For example, a response can look like this:
{"status": {"code":500,"message":"FAILED","detail":"DTU ID not exists"}}
If the JSON contains an array, you need to split it into individual records first with SplitJson:
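A configuration sketch for these two processors. The attribute names and the SplitJson path ($.records[*]) are assumptions for illustration; the real field layout of the data files is not shown in the post:

SplitJson (only needed when the file holds a JSON array)
    JsonPath Expression : $.records[*]

EvaluateJsonPath
    Destination : flowfile-attribute
    Return Type : auto-detect
    code        : $.status.code
    message     : $.status.message
    detail      : $.status.detail

Each user-defined property of EvaluateJsonPath becomes an attribute on the FlowFile, which is what the later ReplaceText expressions such as ${register} refer to.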
3. RouteOnContent:
RouteOnContent routes FlowFiles by matching a regular expression against their content; here it can be used to send failed responses such as the one above down a separate path.
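One possible configuration, assuming we want to divert any file whose body contains FAILED; the route name failed is a dynamic property introduced here for illustration:

RouteOnContent
    Match Requirement : content must contain match
    failed            : FAILED

FlowFiles that match go to the failed relationship; everything else leaves through unmatched and continues toward Hive.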
4. ReplaceText:
First create a table in Hive (the INSERT statements below assume it lives in a database named yongli):

create table tb_test(
    register     string,
    register_url string
);

Then use ReplaceText to rewrite each FlowFile's content into a HiveQL INSERT built from the attributes extracted above:
insert into yongli.tb_test (register, register_url)
values ('${register}', '${register_url}')
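One way to wire ReplaceText for this (a sketch; the exact settings are not visible in the text of the post) is to replace the entire content with the statement above:

ReplaceText
    Search Value         : (?s)(^.*$)
    Replacement Value    : insert into yongli.tb_test (register, register_url) values ('${register}', '${register_url}')
    Replacement Strategy : Regex Replace
    Evaluation Mode      : Entire text

Each FlowFile now carries exactly one HiveQL statement that PutHiveQL can execute.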
Inserting one row per statement is slow when there are many records, so here is a more efficient way:
Again use ReplaceText, but this time have it emit only the values tuple for each record (see the sketch below):
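A sketch of this variant of the ReplaceText settings; the exact tuple format (all four fields quoted) is an assumption based on the MergeContent header shown below:

ReplaceText
    Search Value         : (?s)(^.*$)
    Replacement Value    : ('${dtuid}', '${addr}', '${value}', '${time}')
    Replacement Strategy : Regex Replace
    Evaluation Mode      : Entire text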
Then use MergeContent to bundle many of these tuples into one statement; its Header property carries the INSERT prefix:

insert into yongli.tb_dtu(dtuid, addr, value, time) values
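A sketch of the MergeContent settings. The bin sizes and Max Bin Age are illustrative numbers; the Header is the prefix above and the Demarcator is a comma, so the merged tuples form a single multi-row INSERT:

MergeContent
    Merge Strategy            : Bin-Packing Algorithm
    Merge Format              : Binary Concatenation
    Minimum Number of Entries : 100
    Maximum Number of Entries : 1000
    Max Bin Age               : 30 sec
    Delimiter Strategy        : Text
    Header                    : insert into yongli.tb_dtu(dtuid, addr, value, time) values
    Demarcator                : ,

The merged FlowFile then reads insert into yongli.tb_dtu(...) values (...), (...), (...) and is executed by PutHiveQL as one statement, which is far cheaper than one INSERT per row.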
5. PutHiveQL:
Create a HiveConnectionPool controller service and set its Database Connection URL, User, and Password:
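A sketch of the controller service and processor settings; the hostname, port, and Hive config path in the JDBC URL are placeholders, not values from the original post:

HiveConnectionPool (controller service)
    Database Connection URL      : jdbc:hive2://your-hive-host:10000/yongli
    Hive Configuration Resources : /etc/hive/conf/hive-site.xml
    Database User                : hive
    Password                     : ******

PutHiveQL
    Hive Database Connection Pooling Service : HiveConnectionPool
    Batch Size                               : 100
    Statement Delimiter                      : ;

Point PutHiveQL at the connection pool, route the merged (or per-row) INSERT FlowFiles into it, and the data lands in the Hive table.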
This article was reposted from the 疯吻IT blog on cnblogs. Original link: http://www.cnblogs.com/fengwenit/p/5823177.html. Please contact the original author before reprinting.