Hi Will,
Thanks for the amazing write-up.
However, I'm facing an issue while executing this command:
tweets.write.format("delta").mode("append").saveAsTable("tweets")
The first time, the data is stored in the Delta table, but executing it again gives me the error:
"org.apache.spark.sql.AnalysisException: Cannot create table ('default.tweets'). The associated location ('dbfs:/user/hive/warehouse/tweets') is not empty.;"
How can I make sure the data keeps getting stored in table format as well?
Thanks in advance