object DeltaTable extends Serializable
Companion object to create DeltaTable instances.
DeltaTable.forPath(sparkSession, pathToTheDeltaTable)
- Since
0.3.0
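For illustration, a minimal sketch of typical usage, assuming an existing Delta table at a hypothetical path:
import io.delta.tables.DeltaTable
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().appName("delta-example").getOrCreate()
// Instantiate a DeltaTable for the table stored at the given path.
val deltaTable = DeltaTable.forPath(spark, "/tmp/delta/events")
// Read it back as a DataFrame.
deltaTable.toDF.show()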
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##(): Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
- def columnBuilder(spark: SparkSession, colName: String): DeltaColumnBuilder
:: Evolving ::
Return an instance of DeltaColumnBuilder to specify a column. Refer to DeltaTableBuilder for examples and to DeltaColumnBuilder for detailed APIs.
- spark
the SparkSession passed by the user
- colName
the column name
- Annotations
- @Evolving()
- Since
1.0.0
- def columnBuilder(colName: String): DeltaColumnBuilder
:: Evolving ::
Return an instance of DeltaColumnBuilder to specify a column. Refer to DeltaTableBuilder for examples and to DeltaColumnBuilder for detailed APIs.
Note: This uses the active SparkSession in the current thread to read the table data. Hence, this throws an error if the active SparkSession has not been set, that is, if SparkSession.getActiveSession() is empty.
- colName
the column name
- Annotations
- @Evolving()
- Since
1.0.0
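As a hedged sketch (assuming an active SparkSession named spark; the table and column names are hypothetical), a column built with DeltaColumnBuilder can be passed to DeltaTableBuilder:
import io.delta.tables.DeltaTable
// Build a non-nullable BIGINT column with a comment; build() returns a StructField.
val idColumn = DeltaTable.columnBuilder(spark, "id")
  .dataType("BIGINT")
  .nullable(false)
  .comment("primary key")
  .build()
// Use the column when creating a table with DeltaTableBuilder.
DeltaTable.createIfNotExists(spark)
  .tableName("events")
  .addColumn(idColumn)
  .execute()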
- def convertToDelta(spark: SparkSession, identifier: String): DeltaTable
Create a DeltaTable from the given parquet table. Takes an existing parquet table and constructs a delta transaction log in the base path of the table.
Note: Any changes to the table during the conversion process may not result in a consistent state at the end of the conversion. Users should stop any changes to the table before the conversion is started.
An example usage would be
io.delta.tables.DeltaTable.convertToDelta(spark, "parquet.`/path`")
- Since
0.4.0
- def convertToDelta(spark: SparkSession, identifier: String, partitionSchema: String): DeltaTable
Create a DeltaTable from the given parquet table and partition schema. Takes an existing parquet table and constructs a delta transaction log in the base path of that table.
Note: Any changes to the table during the conversion process may not result in a consistent state at the end of the conversion. Users should stop any changes to the table before the conversion is started.
An example usage would be
io.delta.tables.DeltaTable.convertToDelta(spark, "parquet.`/path`", "key1 long, key2 string")
- Since
0.4.0
- def convertToDelta(spark: SparkSession, identifier: String, partitionSchema: StructType): DeltaTable
Create a DeltaTable from the given parquet table and partition schema. Takes an existing parquet table and constructs a delta transaction log in the base path of that table.
Note: Any changes to the table during the conversion process may not result in a consistent state at the end of the conversion. Users should stop any changes to the table before the conversion is started.
An example usage would be
io.delta.tables.DeltaTable.convertToDelta(spark, "parquet.`/path`", new StructType().add(StructField("key1", LongType)).add(StructField("key2", StringType)))
- Since
0.4.0
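For illustration, a self-contained sketch of a conversion with an explicit partition schema (the Parquet path and partition columns are hypothetical):
import io.delta.tables.DeltaTable
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}
// Partition columns of the existing Parquet table.
val partitionSchema = new StructType()
  .add(StructField("key1", LongType))
  .add(StructField("key2", StringType))
// Constructs a Delta transaction log in the table's base path and returns a DeltaTable.
val converted = DeltaTable.convertToDelta(spark, "parquet.`/data/events`", partitionSchema)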
- def create(spark: SparkSession): DeltaTableBuilder
:: Evolving ::
Return an instance of DeltaTableBuilder to create a Delta table, erroring if the table already exists (the same as SQL CREATE TABLE). Refer to DeltaTableBuilder for more details.
- spark
the SparkSession passed by the user
- Annotations
- @Evolving()
- Since
1.0.0
- def create(): DeltaTableBuilder
:: Evolving ::
Return an instance of DeltaTableBuilder to create a Delta table, erroring if the table already exists (the same as SQL CREATE TABLE). Refer to DeltaTableBuilder for more details. Note: This uses the active SparkSession in the current thread to read the table data. Hence, this throws an error if the active SparkSession has not been set, that is, if SparkSession.getActiveSession() is empty.
- Annotations
- @Evolving()
- Since
1.0.0
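As a minimal sketch, assuming an active SparkSession named spark and hypothetical table, column, and path names; this fails if the table already exists:
import io.delta.tables.DeltaTable
DeltaTable.create(spark)
  .tableName("events")
  .addColumn("id", "BIGINT")
  .addColumn("eventType", "STRING")
  .partitionedBy("eventType")
  .location("/tmp/delta/events")
  .execute()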
- def createIfNotExists(spark: SparkSession): DeltaTableBuilder
:: Evolving ::
Return an instance of DeltaTableBuilder to create a Delta table if it does not exist (the same as SQL CREATE TABLE IF NOT EXISTS). Refer to DeltaTableBuilder for more details.
- spark
the SparkSession passed by the user
- Annotations
- @Evolving()
- Since
1.0.0
- def createIfNotExists(): DeltaTableBuilder
:: Evolving ::
Return an instance of DeltaTableBuilder to create a Delta table if it does not exist (the same as SQL CREATE TABLE IF NOT EXISTS). Refer to DeltaTableBuilder for more details. Note: This uses the active SparkSession in the current thread to read the table data. Hence, this throws an error if the active SparkSession has not been set, that is, if SparkSession.getActiveSession() is empty.
- Annotations
- @Evolving()
- Since
1.0.0
- def createOrReplace(spark: SparkSession): DeltaTableBuilder
:: Evolving ::
Return an instance of DeltaTableBuilder to replace a Delta table, or create it if it does not exist (the same as SQL CREATE OR REPLACE TABLE). Refer to DeltaTableBuilder for more details.
- spark
the SparkSession passed by the user.
- Annotations
- @Evolving()
- Since
1.0.0
- def createOrReplace(): DeltaTableBuilder
:: Evolving ::
Return an instance of DeltaTableBuilder to replace a Delta table, or create it if it does not exist (the same as SQL CREATE OR REPLACE TABLE). Refer to DeltaTableBuilder for more details. Note: This uses the active SparkSession in the current thread to read the table data. Hence, this throws an error if the active SparkSession has not been set, that is, if SparkSession.getActiveSession() is empty.
- Annotations
- @Evolving()
- Since
1.0.0
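A minimal sketch of createOrReplace, assuming an active SparkSession named spark; the table name and schema are hypothetical. The table is created if it does not exist and replaced otherwise:
import io.delta.tables.DeltaTable
import org.apache.spark.sql.types.{LongType, StringType, StructType}
val schema = new StructType()
  .add("id", LongType)
  .add("name", StringType)
DeltaTable.createOrReplace(spark)
  .tableName("people")
  .addColumns(schema)
  .comment("table of people")
  .property("delta.appendOnly", "false")
  .execute()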
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] )
- def forName(sparkSession: SparkSession, tableName: String): DeltaTable
Instantiate a DeltaTable object using the given table or view name and the given SparkSession. If the given tableOrViewName is invalid (i.e. either no table exists or an existing table is not a Delta table), it throws a "not a Delta table" error. The given tableOrViewName can also be the absolute path of a Delta datasource (i.e. delta.`path`). If so, it instantiates a DeltaTable object representing the data at the given path (consistent with forPath).
- def forName(tableOrViewName: String): DeltaTable
Instantiate a DeltaTable object using the given table or view name. If the given tableOrViewName is invalid (i.e. either no table exists or an existing table is not a Delta table), it throws a "not a Delta table" error. The given tableOrViewName can also be the absolute path of a Delta datasource (i.e. delta.`path`). If so, it instantiates a DeltaTable object representing the data at the given path (consistent with forPath). Note: This uses the active SparkSession in the current thread to read the table data. Hence, this throws an error if the active SparkSession has not been set, that is, if SparkSession.getActiveSession() is empty.
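For illustration, a hedged sketch of forName with a metastore table name and with a path-based identifier (both hypothetical):
import io.delta.tables.DeltaTable
// By table name registered in the catalog.
val byName = DeltaTable.forName(spark, "default.events")
// By absolute path, using the delta.`path` form (equivalent to forPath).
val byPath = DeltaTable.forName(spark, "delta.`/tmp/delta/events`")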
- def forPath(sparkSession: SparkSession, path: String, hadoopConf: java.util.Map[String, String]): DeltaTable
Java friendly API to instantiate a DeltaTable object representing the data at the given path. If the given path is invalid (i.e. either no table exists or an existing table is not a Delta table), it throws a "not a Delta table" error.
- hadoopConf
Hadoop configurations starting with "fs." or "dfs." will be picked up by DeltaTable to access the file system when executing queries. Other configurations will be ignored.
val hadoopConf = Map("fs.s3a.access.key" -> "<access-key>", "fs.s3a.secret.key" -> "<secret-key>")
DeltaTable.forPath(spark, "/path/to/table", hadoopConf)
- Since
2.2.0
- def forPath(sparkSession: SparkSession, path: String, hadoopConf: Map[String, String]): DeltaTable
Instantiate a DeltaTable object representing the data at the given path. If the given path is invalid (i.e. either no table exists or an existing table is not a Delta table), it throws a "not a Delta table" error.
- hadoopConf
Hadoop configurations starting with "fs." or "dfs." will be picked up by DeltaTable to access the file system when executing queries. Other configurations will not be allowed.
val hadoopConf = Map("fs.s3a.access.key" -> "<access-key>", "fs.s3a.secret.key" -> "<secret-key>")
DeltaTable.forPath(spark, "/path/to/table", hadoopConf)
- Since
2.2.0
- def forPath(sparkSession: SparkSession, path: String): DeltaTable
Instantiate a DeltaTable object representing the data at the given path. If the given path is invalid (i.e. either no table exists or an existing table is not a Delta table), it throws a "not a Delta table" error.
- Since
0.3.0
- def forPath(path: String): DeltaTable
Instantiate a DeltaTable object representing the data at the given path. If the given path is invalid (i.e. either no table exists or an existing table is not a Delta table), it throws a "not a Delta table" error. Note: This uses the active SparkSession in the current thread to read the table data. Hence, this throws an error if the active SparkSession has not been set, that is, if SparkSession.getActiveSession() is empty.
- Since
0.3.0
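A minimal sketch contrasting the two path-based variants, assuming an active SparkSession named spark and a hypothetical path:
import io.delta.tables.DeltaTable
// Explicit SparkSession.
val withSession = DeltaTable.forPath(spark, "/tmp/delta/events")
// Uses the active SparkSession in the current thread.
val withActiveSession = DeltaTable.forPath("/tmp/delta/events")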
- final def getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def isDeltaTable(identifier: String): Boolean
Check if the provided identifier string, in this case a file path, is the root of a Delta table. Note: This uses the active SparkSession in the current thread to search for the table. Hence, this throws an error if the active SparkSession has not been set, that is, if SparkSession.getActiveSession() is empty.
An example would be
DeltaTable.isDeltaTable("/path/to/table")
- Since
0.4.0
- def isDeltaTable(sparkSession: SparkSession, identifier: String): Boolean
Check if the provided identifier string, in this case a file path, is the root of a Delta table using the given SparkSession.
An example would be
DeltaTable.isDeltaTable(spark, "path/to/table")
- Since
0.4.0
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def replace(spark: SparkSession): DeltaTableBuilder
:: Evolving ::
Return an instance of DeltaTableBuilder to replace a Delta table, erroring if the table does not exist (the same as SQL REPLACE TABLE). Refer to DeltaTableBuilder for more details.
- spark
the SparkSession passed by the user
- Annotations
- @Evolving()
- Since
1.0.0
- def replace(): DeltaTableBuilder
:: Evolving ::
Return an instance of DeltaTableBuilder to replace a Delta table, erroring if the table does not exist (the same as SQL REPLACE TABLE). Refer to DeltaTableBuilder for more details. Note: This uses the active SparkSession in the current thread to read the table data. Hence, this throws an error if the active SparkSession has not been set, that is, if SparkSession.getActiveSession() is empty.
- Annotations
- @Evolving()
- Since
1.0.0
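For illustration, a hedged sketch of replace() using the active SparkSession (table and column names hypothetical); this errors if the table does not already exist:
import io.delta.tables.DeltaTable
DeltaTable.replace()
  .tableName("events")
  .addColumn("id", "BIGINT")
  .addColumn("payload", "STRING")
  .execute()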
- final def synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()