I am trying to write to an Iceberg table (which does not exist before the write, so it is created during the write) and would like to set a few table properties. Is it possible to do this using the DataFrameWriter? I do not want to fire a SQL query via spark.sql().
These are some of the configs I am using (a sketch of wiring them up follows the list):
"spark.sql.catalog.spark_catalog": ".apache.iceberg.spark.SparkSessionCatalog"
"spark.sql.extensions": ".apache.iceberg.spark.extensions.IcebergSparkSessionExtensions"
"spark.sql.catalogImplementation": "hive"
asked Jan 29 at 12:06 by tru

1 Answer
Using DataFrameWriterV2, this is possible:

```python
from pyspark.sql.functions import lit

(
    spark.range(10)
    .withColumn("tmp", lit("hi"))
    .writeTo("test.sample")
    .using("iceberg")
    .tableProperty("write.spark.accept-any-schema", "true")
    .createOrReplace()
)
```
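The same builder also supports creating (rather than replacing) the table, setting several properties, and declaring a partition spec. A hedged sketch (`df`, the `ts` column, and the property values are assumptions; `create()`, `partitionedBy()`, and the `days` transform are part of the DataFrameWriterV2 API):

```python
from pyspark.sql.functions import days

# create() fails if the table already exists, unlike createOrReplace().
# Property keys shown are illustrative Iceberg table properties.
(
    df.writeTo("test.sample")
    .using("iceberg")
    .tableProperty("format-version", "2")
    .tableProperty("write.spark.accept-any-schema", "true")
    .partitionedBy(days(df.ts))  # assumes df has a timestamp column named ts
    .create()
)
```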
Write options that can be passed via .option() are listed in the Iceberg docs: iceberg.apache.org/docs/latest/spark-configuration/… – mazaneicha Commented Mar 21 at 21:49
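Following up on that comment, per-write options can also ride along on the same builder via `.option()`. A sketch, assuming the `write-format` key from the linked Spark configuration page and a placeholder table name:

```python
# "write-format" is one of the write options documented on the linked page;
# table properties and per-write options can be combined on one builder.
(
    spark.range(10)
    .writeTo("test.sample2")
    .using("iceberg")
    .option("write-format", "parquet")
    .tableProperty("write.spark.accept-any-schema", "true")
    .createOrReplace()
)
```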