Discussion on: Cloud Data Fusion, a game-changer for GCP

JS Gourdet

Hey Giuliano,
Thanks for this insightful article.
As I was saying, the price unfortunately makes it prohibitive for small and medium companies to use it just for daily workloads; they would rather use a 3rd-party solution like Segment or others (which of course has fewer features). So it's a pity that GCP doesn't offer a special package for such an audience and use case.
Using CDAP from the Marketplace is actually a possibility, but it's not serverless.
I was wondering if a trick like this could be done: export and save the pipeline, switch off the instance, then each day create a new instance, import the saved pipeline, execute it, and delete the instance afterwards.
So far, I unfortunately couldn't find a way to do it.
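For what it's worth, this is roughly the kind of automation I have in mind: an untested Python sketch against the Data Fusion v1 REST API and the instance's CDAP API. All the names (project, location, instance, pipeline) are placeholders, and a real script would need proper error handling plus polling of the pipeline run before deleting the instance.

```python
import time

import google.auth
import requests
from google.auth.transport.requests import Request as AuthRequest

PROJECT = "my-project"        # placeholder
LOCATION = "europe-west1"     # placeholder
INSTANCE = "daily-etl"        # placeholder
PIPELINE = "my-pipeline"      # placeholder; pipeline.json exported from Studio
API = "https://datafusion.googleapis.com/v1"
PARENT = f"projects/{PROJECT}/locations/{LOCATION}"

# Authenticate with Application Default Credentials.
creds, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
creds.refresh(AuthRequest())
headers = {"Authorization": f"Bearer {creds.token}"}

# 1. Create a fresh instance (a long-running operation).
requests.post(f"{API}/{PARENT}/instances", params={"instanceId": INSTANCE},
              json={"type": "BASIC"}, headers=headers).raise_for_status()

# 2. Poll until the instance is RUNNING, then grab its CDAP endpoint.
while True:
    inst = requests.get(f"{API}/{PARENT}/instances/{INSTANCE}", headers=headers).json()
    if inst.get("state") == "RUNNING":
        cdap = inst["apiEndpoint"]
        break
    time.sleep(60)

# 3. Import the previously exported pipeline JSON into the default namespace.
with open("pipeline.json") as f:
    requests.put(f"{cdap}/v3/namespaces/default/apps/{PIPELINE}",
                 data=f.read(),
                 headers={**headers, "Content-Type": "application/json"}).raise_for_status()

# 4. Start the batch pipeline (it runs as a CDAP DataPipelineWorkflow).
requests.post(f"{cdap}/v3/namespaces/default/apps/{PIPELINE}"
              "/workflows/DataPipelineWorkflow/start", headers=headers).raise_for_status()

# ... poll the workflow's runs endpoint until the run completes, then:

# 5. Delete the instance so it stops billing.
requests.delete(f"{API}/{PARENT}/instances/{INSTANCE}", headers=headers).raise_for_status()
```

The catch is that instance creation alone takes a good 10-15 minutes each day, so even if this works the daily run would be slow.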

Keep me informed if by any chance you do.

Beliche

Hi!
I'm currently searching for a serverless ETL solution and was considering GCP Dataflow, but the pricing is too restrictive for us.
Our basic requirement is to read a JSON file from an API that returns 4,000 objects, transform the objects, and call an API at the destination to import the data.

It's not possible to switch off a Dataflow instance as you asked, right?

Regards

JS Gourdet

Hi,
Dataflow is really not the tool for such a load; it's meant for much higher volumes.
Google Cloud Functions could probably be a cheap option, depending on your data transformation.
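For 4,000 objects per day, an HTTP-triggered function kicked off daily by Cloud Scheduler would likely be enough. A minimal Python sketch, with purely hypothetical source and destination endpoints and a placeholder transformation:

```python
import requests

SOURCE_URL = "https://source.example.com/objects"    # hypothetical source API
DEST_URL = "https://destination.example.com/import"  # hypothetical destination API


def transform(obj):
    # Placeholder: adapt to whatever reshaping the destination API expects.
    return {"id": obj["id"], "name": obj["name"].strip().lower()}


def etl(request):
    """HTTP entry point for the Cloud Function (triggered daily by Cloud Scheduler)."""
    objects = requests.get(SOURCE_URL, timeout=30).json()  # ~4,000 objects
    payload = [transform(o) for o in objects]
    resp = requests.post(DEST_URL, json=payload, timeout=60)
    resp.raise_for_status()
    return f"Imported {len(payload)} objects", 200
```

At that volume you'd stay well within the free tier, as long as the whole run fits in the function's timeout.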

PS: My question was about Google Cloud Data Fusion, which in any case is not appropriate for your use case.

Beliche

Hi!
Thanks for the reply. GCP Data Fusion is definitely not the right fit for my data integration requirements.
I meant to say Data Fusion instead of Dataflow; sorry for that, I'm reviewing so many tools that I mixed up the names.

Regards