CDH Hadoop series index:
Sqoop syntax notes
Official Sqoop documentation:
Sqoop import is named from the perspective of HDFS: it imports data from a relational database into HDFS.
Place the MySQL JDBC driver jar under sqoop/lib.
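For example (the connector jar name and the Sqoop install path below are assumptions; adjust them to your environment):
# copy the MySQL JDBC driver into Sqoop's lib directory (jar name and path are examples)
cp mysql-connector-java-5.1.47-bin.jar /usr/lib/sqoop/lib/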
Case 1: import data into HDFS
# /root/project
mkdir sqoop_prj
cd sqoop_prj/
mkdir DBS
cd DBS/
touch DBS.opt
hadoop fs -mkdir /user/hive/warehouse/DBS
which sqoop
Run an options file with sqoop --options-file aa.opt; an options file cannot take runtime parameters.
-m sets the number of map tasks; if the source table is large, increase it. With -m set to 5, Sqoop runs 5 parallel map tasks and writes 5 files to HDFS.
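For reference, a minimal sketch of what DBS.opt could contain. The options-file format puts the tool name first and then one option or value per line; the connection details below simply mirror the shell script that follows and are assumptions, not the original file:
# DBS.opt (sketch): tool name first, then one option/value per line
import
--connect
jdbc:mysql://cdhmaster:3306/hive
--username
root
--password
123456
--table
DBS
-m
1
--target-dir
/user/hive/warehouse/DBS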
Wrapping the sqoop command in a shell script has the advantage that you can pass parameters to it.
#!/bin/sh
. /etc/profile
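# clear the target directory first: sqoop import fails if --target-dir already exists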
hadoop fs -rmr /user/hive/warehouse/DBS
sqoop import --connect "jdbc:mysql://cdhmaster:3306/hive" \
--username root \
--password 123456 \
-m 1 \
--table DBS \
--columns "DB_ID,DESC,DB_LOCATION_URI,NAME,OWNER_NAME,OWNER_TYPE" \
--target-dir "/user/hive/warehouse/DBS"
#--where "length(DESC)>0" \
#--null-string ''
Bug: driver issue.
ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@3c1a42fa is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@3c1a42fa is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
Fix: add the following parameter:
--driver com.mysql.jdbc.Driver
With this parameter added, Sqoop prints the following warning:
WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
Bug: SQL syntax issue.
Error: java.io.IOException: SQLException in nextKeyValue
Fix: remove the column DESC from the column list, since DESC is a reserved SQL keyword.
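With DESC removed, the column list becomes:
--columns "DB_ID,DB_LOCATION_URI,NAME,OWNER_NAME,OWNER_TYPE"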
Case 2: write data into a regular (non-partitioned) Hive table
# mysql
create table test (id int, pdate date);
insert into test(id, pdate) values (1, '2017-11-05');
insert into test(id, pdate) values (2, '2017-11-06');
insert into test(id, pdate) values (3, '2017-11-05');
insert into test(id, pdate) values (4, '2017-11-06');
# hive
drop table if exists test;
create table test(id int, pdate string);
--hive-import: write the data into a Hive table; this flag takes no value.
--hive-overwrite: overwrite any existing data in the target Hive table.
--hive-table: the target Hive table name, here test.
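Putting these options together, a sqoop command for case 2 might look like the sketch below; it assumes the MySQL test table lives in the test database on cdhmaster and reuses the credentials from case 1:
sqoop import --connect "jdbc:mysql://cdhmaster:3306/test" \
--username root \
--password 123456 \
--table test \
-m 1 \
--hive-import \
--hive-overwrite \
--hive-table test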
Case 3: write into a Hive partitioned table, so (sales order)
Notes:
1. Which field should be the partition key? The creation time, not last_modify_time.
Q: If we extract into Hive partitions by creation time, and an order's status can keep changing for up to 45 days, how do we keep the Hive data in sync after the status changes?
A: Hive does not support UPDATE, so every day we re-extract the orders of the last 15 days into their respective partitions. Hive is used for statistical analysis, and what matters most is usually yesterday's data.
# cdhmaster
cd ~
mysql -uroot -p123456 < so.sql
ERROR 1046 (3D000) at line 3: No database selected
vi so.sql
use test;
mysql -uroot -p123456 < so.sql
# hive
CREATE TABLE so (
order_id bigint,
user_id bigint,
order_amt double ,
last_modify_time string
) partitioned by (date string);
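A sketch of a daily load into one partition; the create_time column name and the dt variable are assumptions since so.sql is not shown, and the real job would loop dt over the last 15 days as described above:
# load one day of orders into its Hive partition (sketch)
dt=2017-11-05
sqoop import --connect "jdbc:mysql://cdhmaster:3306/test" \
--username root \
--password 123456 \
--table so \
-m 1 \
--where "date(create_time)='${dt}'" \
--hive-import \
--hive-overwrite \
--hive-table so \
--hive-partition-key date \
--hive-partition-value "${dt}"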
After Sqoop runs, note:
- A directory with the same name as the source table is created under the user's HDFS home directory, e.g. /user/root/so; if the import into Hive succeeds, that directory is deleted automatically.
- A .java file is generated in the working directory: the MapReduce job code that the opt options were translated into.
- sqoop import adapts automatically to whatever column delimiter the Hive table uses.
Wrapping Sqoop extraction into a framework:
- Create a MySQL configuration table describing the tables to extract and their metadata;
- A Java program reads the MySQL configuration table and generates the opt files dynamically;
- The Java program uses the Process class to run the local system command: sqoop --options-file <generated opt file>;
Example invocations of the wrapped command:
Sqoop-imp -task 1 "2015-04-21"
Sqoop-imp "2015-04-21"
Sqoop export
# mysql test
create table so1 as
select * from so where 1=0;
For export, the source must be HDFS/Hive and the target must be a relational database.
Change the date and last_modify_time columns of table so1 to varchar.
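A sqoop export sketch under these assumptions: the data to export sits under the Hive warehouse directory of so, and the files use Hive's default ^A field delimiter (both the directory and the delimiter are assumptions):
sqoop export --connect "jdbc:mysql://cdhmaster:3306/test" \
--username root \
--password 123456 \
--table so1 \
-m 1 \
--export-dir /user/hive/warehouse/so/date=2017-11-05 \
--input-fields-terminated-by '\001'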
Wrapping Sqoop into a tool
Flow etl runs the extraction for every configured table. It can also run a single configured task, optionally for a specific date:
Flow etl -task 1
Flow etl -task 1 2017-01-01
- Read the MySQL tables extract_to_hdfs and extract_db_info, and generate the .opt files from that configuration.
- Use Java's Process class to run the Linux command: sqoop --options-file <generated opt file>.
When packaging Flow.jar in IDEA, if you get the error 'D:/Java/idea/IdeaProjects/Hive_Prj/src/META-INF/MANIFEST.MF' already exists in VFS, delete the META-INF folder.
db.properties is the configuration for connecting to the MySQL database.
extract_db_info is the configuration of the source databases that the extracted tables come from.
Upload Flow.jar to /root/project/lib.
Create the Flow command under /root/project/bin.
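A minimal sketch of what the Flow wrapper script might look like; it assumes Flow.jar has a Main-Class set in its manifest (FLOW_HOME is configured in the next step):
#!/bin/sh
# /root/project/bin/Flow — thin wrapper around Flow.jar (sketch)
. /etc/profile
java -jar $FLOW_HOME/lib/Flow.jar "$@"
Remember to chmod +x /root/project/bin/Flow and put /root/project/bin on PATH so the Flow command can be invoked directly.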
Configure FLOW_HOME:
vi /etc/profile
export FLOW_HOME=/root/project
source /etc/profile
Configure db.properties:
# FLOW_HOME
mkdir conf
vi db.properties
db.driver=com.mysql.jdbc.Driver
db.url=jdbc:mysql://cdhmaster:3306/test
db.user=root
db.password=123456
Create the sqoop options directory sqoop/opts:
# FLOW_HOME
mkdir -p sqoop/opts
If you want logs at run time, configure log4j when building the jar.
ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@310d117d is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@310d117d is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
Fix: in HDFSExtract.java, add the --driver com.mysql.jdbc.Driver option, then rebuild and re-upload the jar.
The scheduled job can be adjusted accordingly, e.g. sh ./so.sh:
# /root/project/sqoop_prj/DBS
vi so.sh
Flow etl -task 1 $yestoday
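A sketch of what the full so.sh might contain; deriving $yestoday with the date command is an assumption:
#!/bin/sh
. /etc/profile
# yesterday's date, e.g. 2017-01-01 (assumption about the expected format)
yestoday=`date -d '1 day ago' +%Y-%m-%d`
Flow etl -task 1 $yestoday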