
Migrating Apache Flume Flows to Apache NiFi: Any Relational Database To/From Anywhere

Timothy Spann · Originally published at datainmotion.dev · 6 min read


Article 8 - https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_42.html

Article 7 - This article

Article 6 - https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_9.html

Article 5 - https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_35.html

Article 4 - https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_8.html

Article 3 - https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_7.html

Article 2 - https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache.html

Article 1 - https://www.datainmotion.dev/2019/08/migrating-apache-flume-flows-to-apache.html

Source Code: https://github.com/tspannhw/flume-to-nifi

This is a simple use case: NiFi acting as a gateway between relational databases and other sources and sinks. We can do a lot more than that in NiFi: we can SELECT, UPDATE, INSERT, DELETE, and run any DML, all with no code. We can also read metadata from an RDBMS and build dynamic ELT systems from it.
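To make the metadata-driven idea concrete outside NiFi, here is a small Python sketch using SQLite's catalog (the table and columns are hypothetical; a real ELT job would query the JDBC metadata of its own RDBMS the same way):

```python
import sqlite3

# A throwaway in-memory database with one table to introspect.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE iot (uuid TEXT PRIMARY KEY, cpu INTEGER, memory INTEGER)")

# Read column metadata from the catalog, as a metadata-driven ELT job would.
cols = conn.execute("PRAGMA table_info(iot)").fetchall()
column_names = [c[1] for c in cols]  # second field of each row is the column name

# Generate a SELECT statement dynamically from the discovered metadata.
query = f"SELECT {', '.join(column_names)} FROM iot"
print(query)  # SELECT uuid, cpu, memory FROM iot
```

The same pattern (discover columns, generate SQL) is what lets a flow adapt when tables change, without hand-editing each query.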

It is extremely easy to do this in NiFi.

Instead of using Flume, Let's Use Apache NiFi to Move Any Tabular Data To and From Databases

From A Relational Database (via JDBC Driver) to Anywhere. In our case, we will pull from an RDBMS and post to Kudu.

Step 1: QueryDatabaseTableRecord (Create Connection Pool, Pick DB Type, Table Name, Record Writer)

Step 2: PutKudu (Set Kudu Masters, Table Name, Record Reader)

Done!
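Under the hood, QueryDatabaseTableRecord does incremental fetching keyed on a maximum-value column. A rough Python sketch of that behavior, with SQLite standing in for the JDBC source and a hypothetical `iot` table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE iot (id INTEGER PRIMARY KEY, cpu INTEGER)")
conn.executemany("INSERT INTO iot VALUES (?, ?)", [(1, 10), (2, 20), (3, 30)])

last_seen = 0  # the state NiFi keeps for the max-value column

def fetch_new(conn, last_seen):
    # Only pull rows newer than the stored maximum, like QueryDatabaseTableRecord.
    rows = conn.execute(
        "SELECT id, cpu FROM iot WHERE id > ? ORDER BY id", (last_seen,)
    ).fetchall()
    new_max = rows[-1][0] if rows else last_seen
    return rows, new_max

rows, last_seen = fetch_new(conn, last_seen)  # all three rows the first time
rows, last_seen = fetch_new(conn, last_seen)  # nothing new the second time
```

NiFi stores that `last_seen` state for you, so each scheduled run only moves the new rows downstream to PutKudu.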

Query Database

Connect to Kudu

Let's Write JSON Records That Get Converted to Kudu Records or RDBMS/MySQL/JDBC Records

Schema For The Data
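A record schema matching the `iot` table below could look like the following Avro schema (a sketch: field names come from the DDL, and the string/long typing is inferred from VARCHAR and BIGINT):

```json
{
  "type": "record",
  "name": "iot",
  "fields": [
    {"name": "uuid", "type": "string"},
    {"name": "ipaddress", "type": "string"},
    {"name": "top1pct", "type": "long"},
    {"name": "top1", "type": "string"},
    {"name": "cputemp", "type": "string"},
    {"name": "gputemp", "type": "string"},
    {"name": "gputempf", "type": "string"},
    {"name": "cputempf", "type": "string"},
    {"name": "runtime", "type": "string"},
    {"name": "host", "type": "string"},
    {"name": "filename", "type": "string"},
    {"name": "imageinput", "type": "string"},
    {"name": "hostname", "type": "string"},
    {"name": "macaddress", "type": "string"},
    {"name": "end", "type": "string"},
    {"name": "te", "type": "string"},
    {"name": "systemtime", "type": "string"},
    {"name": "cpu", "type": "long"},
    {"name": "diskusage", "type": "string"},
    {"name": "memory", "type": "long"},
    {"name": "id", "type": "string"}
  ]
}
```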

Read All The Records From Our JDBC Database

Let's Create an Apache Kudu table to Put Database Records To

Let's Examine the MySQL Table We Want to Read/Write To and From

Let's Check the MariaDB Table

MySQL Table Information

From Anywhere (Say a Device) to A Relational Database (via JDBC Driver). In our case, we will insert into an RDBMS from Kafka.

Step 1: Acquire or modify data, say with ConsumeKafkaRecord_2

Step 2: PutDatabaseRecord (Set Record Reader, INSERT or UPDATE, Connection Pool, Table Name)

Done!
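What PutDatabaseRecord does can be sketched in a few lines of Python, with SQLite standing in for the JDBC connection pool (a hedged illustration; the messages and the three-column `iot` table are hypothetical):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE iot (uuid TEXT PRIMARY KEY, cpu INTEGER, memory INTEGER)")

# Records as they might arrive from a Kafka topic (one JSON object per message).
messages = [
    '{"uuid": "a1", "cpu": 12, "memory": 2048}',
    '{"uuid": "b2", "cpu": 50, "memory": 4096}',
]

# PutDatabaseRecord-style: read each record, bind its fields to an INSERT.
for msg in messages:
    rec = json.loads(msg)
    conn.execute(
        "INSERT INTO iot (uuid, cpu, memory) VALUES (?, ?, ?)",
        (rec["uuid"], rec["cpu"], rec["memory"]),
    )

count = conn.execute("SELECT COUNT(*) FROM iot").fetchone()[0]
print(count)  # 2
```

In NiFi the Record Reader supplies the parsing step and the Connection Pool supplies the connection, so the whole loop is configuration rather than code.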

Put Database Records in Any JDBC/RDBMS

Set Up Your Connection Pool to SELECT, UPDATE, INSERT, or DELETE
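The pool is NiFi's DBCPConnectionPool controller service. A typical configuration might look like the following (the URL, database name, driver path, and credentials are placeholder values):

```
Database Connection URL:      jdbc:mysql://localhost:3306/iotdb
Database Driver Class Name:   com.mysql.cj.jdbc.Driver
Database Driver Location(s):  /opt/drivers/mysql-connector-java.jar
Database User:                nifi
Password:                     ********
```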

SQL DDL

Create MariaDB/MySQL Table

```sql
CREATE TABLE iot (
  uuid VARCHAR(255) NOT NULL PRIMARY KEY,
  ipaddress VARCHAR(255),
  top1pct BIGINT,
  top1 VARCHAR(255),
  cputemp VARCHAR(255),
  gputemp VARCHAR(255),
  gputempf VARCHAR(255),
  cputempf VARCHAR(255),
  runtime VARCHAR(255),
  host VARCHAR(255),
  filename VARCHAR(255),
  imageinput VARCHAR(255),
  hostname VARCHAR(255),
  macaddress VARCHAR(255),
  `end` VARCHAR(255),  -- END is a reserved word in MySQL/MariaDB, so quote it
  te VARCHAR(255),
  systemtime VARCHAR(255),
  cpu BIGINT,
  diskusage VARCHAR(255),
  memory BIGINT,
  id VARCHAR(255)
);
```

Create Kudu Table

```sql
CREATE TABLE iot (
  uuid STRING,
  ipaddress STRING,
  top1pct BIGINT,
  top1 STRING,
  cputemp STRING,
  gputemp STRING,
  gputempf STRING,
  cputempf STRING,
  runtime STRING,
  host STRING,
  filename STRING,
  imageinput STRING,
  hostname STRING,
  macaddress STRING,
  `end` STRING,  -- END is a reserved word, so quote it
  te STRING,
  systemtime STRING,
  cpu BIGINT,
  diskusage STRING,
  memory BIGINT,
  id STRING,
  PRIMARY KEY (uuid)
)
PARTITION BY HASH PARTITIONS 16
STORED AS KUDU
TBLPROPERTIES ('kudu.num_tablet_replicas' = '1');
```

References

- https://community.cloudera.com/t5/Community-Articles/Apache-NiFi-Processor-Building-a-SQL-DDL-Schema-From-A-JSON/ta-p/247989
