trait CreateTableWriter[T] extends WriteConfigMethods[CreateTableWriter[T]]
Trait to restrict calls to create and replace operations.
- Since
3.0.0
Linear Supertypes
- WriteConfigMethods[CreateTableWriter[T]]
- AnyRef
- Any
Abstract Value Members
- abstract def clusterBy(colName: String, colNames: String*): CreateTableWriter[T]
Clusters the output by the given columns on the storage. The rows with matching values in the specified clustering columns will be consolidated within the same group.
For instance, if you cluster a dataset by date, the data sharing the same date will be stored together in a file. This arrangement improves query efficiency when you apply selective filters to these clustering columns, thanks to data skipping.
- Annotations
- @varargs()
- Since
4.0.0
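For example, a minimal sketch of clustering at creation time (the session setup, source path, and table name are illustrative; a catalog that supports creating v2 tables is assumed):
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()
// Illustrative source; any DataFrame with a `date` column would do.
val events = spark.read.parquet("/data/events")

// Rows sharing the same `date` value are consolidated on storage, so
// selective filters on `date` can benefit from data skipping.
events.writeTo("catalog.db.events")
  .clusterBy("date")
  .create()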
- abstract def create(): Unit
Create a new table from the contents of the data frame.
The new table's schema, partition layout, properties, and other configuration will be based on the configuration set on this writer.
If the output table exists, this operation will fail with org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException.
- Annotations
- @throws(classOf[TableAlreadyExistsException])
- Exceptions thrown
org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
If the table already exists
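A minimal sketch (the table name and data are illustrative; the catalog must support creating v2 tables):
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException

val spark = SparkSession.builder().getOrCreate()
val people = spark.range(10).withColumnRenamed("id", "user_id")

try {
  people.writeTo("catalog.db.people")
    .using("parquet")
    .create()
} catch {
  case _: TableAlreadyExistsException =>
    // The table already exists; use createOrReplace() or replace() instead.
    println("catalog.db.people already exists")
}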
- abstract def createOrReplace(): Unit
Create a new table or replace an existing table with the contents of the data frame.
The output table's schema, partition layout, properties, and other configuration will be based on the contents of the data frame and the configuration set on this writer. If the table exists, its configuration and data will be replaced.
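A minimal sketch (source and target table names are illustrative):
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.lit

val spark = SparkSession.builder().getOrCreate()
val updated = spark.table("catalog.db.people").withColumn("active", lit(true))

// Creates catalog.db.people_snapshot if it is missing; otherwise replaces its
// schema, configuration, and data with those of `updated`.
updated.writeTo("catalog.db.people_snapshot").createOrReplace()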
- abstract def option(key: String, value: String): CreateTableWriter[T]
Add a write option.
- Definition Classes
- WriteConfigMethods
- Since
3.0.0
- abstract def options(options: java.util.Map[String, String]): CreateTableWriter[T]
Add write options from a Java Map.
- Definition Classes
- WriteConfigMethods
- Since
3.0.0
- abstract def options(options: Map[String, String]): CreateTableWriter[T]
Add write options from a Scala Map.
- Definition Classes
- WriteConfigMethods
- Since
3.0.0
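The string, typed, and map-based variants can be chained on the same writer. A minimal sketch (apart from "compression", which is a standard Parquet write option, the option keys and table names are placeholders):
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()
val df = spark.read.json("/data/input.json")

df.writeTo("catalog.db.output")
  .using("parquet")
  .option("compression", "snappy")            // String option
  .option("someBooleanOption", true)          // Boolean overload (placeholder key)
  .options(Map("someOtherOption" -> "value")) // Scala Map of options
  .createOrReplace()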
- abstract def partitionedBy(column: Column, columns: Column*): CreateTableWriter[T]
Partition the output table created by create, createOrReplace, or replace using the given columns or transforms.
When specified, the table data will be stored by these values for efficient reads.
For example, when a table is partitioned by day, it may be stored in a directory layout like:
table/day=2019-06-01/
table/day=2019-06-02/
Partitioning is one of the most widely used techniques to optimize physical data layout. It provides a coarse-grained index for skipping unnecessary data reads when queries have predicates on the partitioned columns. In order for partitioning to work well, the number of distinct values in each column should typically be less than tens of thousands.
- Annotations
- @varargs()
- Since
3.0.0
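A minimal sketch using the partition transform helpers from org.apache.spark.sql.functions (table names and columns are illustrative):
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{bucket, col, days}

val spark = SparkSession.builder().getOrCreate()
val events = spark.table("catalog.db.raw_events")

// Depending on the catalog, data may be laid out by day (as in the directory
// example above) and further bucketed by user_id, keeping the number of
// distinct partition values manageable.
events.writeTo("catalog.db.events_by_day")
  .partitionedBy(days(col("ts")), bucket(16, col("user_id")))
  .create()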
- abstract def replace(): Unit
Replace an existing table with the contents of the data frame.
The existing table's schema, partition layout, properties, and other configuration will be replaced with the contents of the data frame and the configuration set on this writer.
If the output table does not exist, this operation will fail with org.apache.spark.sql.catalyst.analysis.CannotReplaceMissingTableException.
- Annotations
- @throws(classOf[CannotReplaceMissingTableException])
- Exceptions thrown
org.apache.spark.sql.catalyst.analysis.CannotReplaceMissingTableException
If the table does not exist
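A minimal sketch (table names are illustrative):
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.analysis.CannotReplaceMissingTableException

val spark = SparkSession.builder().getOrCreate()
val rebuilt = spark.table("catalog.db.staging_people")

try {
  rebuilt.writeTo("catalog.db.people").replace()
} catch {
  case _: CannotReplaceMissingTableException =>
    // The target table does not exist; use create() or createOrReplace() instead.
    println("catalog.db.people does not exist")
}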
- abstract def tableProperty(property: String, value: String): CreateTableWriter[T]
Add a table property.
- abstract def using(provider: String): CreateTableWriter[T]
Specifies a provider for the underlying output data source. Spark's default catalog supports "parquet", "json", etc.
- Since
3.0.0
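Both using and tableProperty are typically chained before one of the create or replace operations. A minimal sketch (the property key/value and table names are placeholders):
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()
val orders = spark.table("catalog.db.staging_orders")

orders.writeTo("catalog.db.orders")
  .using("parquet")                   // provider for the underlying data source
  .tableProperty("owner", "data-eng") // placeholder table property
  .create()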
Concrete Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- def option(key: String, value: Double): CreateTableWriter[T]
Add a double output option.
- Definition Classes
- WriteConfigMethods
- Since
3.0.0
- def option(key: String, value: Long): CreateTableWriter[T]
Add a long output option.
- Definition Classes
- WriteConfigMethods
- Since
3.0.0
- def option(key: String, value: Boolean): CreateTableWriter[T]
Add a boolean output option.
- Definition Classes
- WriteConfigMethods
- Since
3.0.0
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
Deprecated Value Members
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable]) @Deprecated
- Deprecated
(Since version 9)