public class DeltaMergeBuilder
extends Object
implements org.apache.spark.internal.Logging
Builder to specify how to merge data from source DataFrame into the target Delta table.
You can specify 1, 2 or 3 when clauses, of which there can be at most 2 whenMatched clauses and at most 1 whenNotMatched clause. Here are the constraints on these clauses.
- whenMatched clauses:
  - There can be at most one update action and one delete action in whenMatched clauses.
  - Each whenMatched clause can have an optional condition. However, if there are two whenMatched clauses, then the first one must have a condition.
  - When there are two whenMatched clauses and there are conditions (or the lack of them) such that a row matches both clauses, then the first clause/action is executed. In other words, the order of the whenMatched clauses matters.
  - If none of the whenMatched clauses match a source-target row pair that satisfies the merge condition, then the target rows will not be updated or deleted.
  - If you want to update all the columns of the target Delta table with the corresponding columns of the source DataFrame, then you can use whenMatched(...).updateAll(). This is equivalent to whenMatched(...).updateExpr(Map( ("col1", "source.col1"), ("col2", "source.col2"), ...)).
- whenNotMatched clauses:
  - This clause can have only an insert action, which can have an optional condition.
  - If the whenNotMatched clause is not present, or if it is present but the non-matching source row does not satisfy the condition, then the source row is not inserted.
  - If you want to insert all the columns of the target Delta table with the corresponding columns of the source DataFrame, then you can use whenNotMatched(...).insertAll(). This is equivalent to whenNotMatched(...).insertExpr(Map( ("col1", "source.col1"), ("col2", "source.col2"), ...)), as in the sketch after this list.
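As a combined sketch of the rules above (assuming, purely for illustration, key/value columns and a source.deleted flag that are not part of this API), a merge with two whenMatched clauses and one whenNotMatched clause could look like this in Scala:

  deltaTable
    .as("target")
    .merge(source.as("source"), "target.key = source.key")
    // The first of two whenMatched clauses must carry a condition;
    // matched rows flagged as deleted in the source are removed.
    .whenMatched("source.deleted = true")
    .delete()
    // The second, unconditional clause handles the remaining matches; updateAll()
    // copies every source column, like updateExpr with one entry per column.
    .whenMatched
    .updateAll()
    // Non-matching source rows are inserted with all of their columns.
    .whenNotMatched
    .insertAll()
    .execute()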
Scala example to update a key-value Delta table with new key-values from a source DataFrame:
  deltaTable
    .as("target")
    .merge(
      source.as("source"),
      "target.key = source.key")
    .whenMatched
    .updateExpr(Map(
      "value" -> "source.value"))
    .whenNotMatched
    .insertExpr(Map(
      "key" -> "source.key",
      "value" -> "source.value"))
    .execute()
Java example to update a key-value Delta table with new key-values from a source DataFrame:
  deltaTable
    .as("target")
    .merge(
      source.as("source"),
      "target.key = source.key")
    .whenMatched()
    .updateExpr(
      new HashMap<String, String>() {{
        put("value", "source.value");
      }})
    .whenNotMatched()
    .insertExpr(
      new HashMap<String, String>() {{
        put("key", "source.key");
        put("value", "source.value");
      }})
    .execute();
| Constructor and Description |
| --- |
| DeltaMergeBuilder() |
| Modifier and Type | Method and Description |
| --- | --- |
| void | execute() :: Evolving :: |
| DeltaMergeMatchedActionBuilder | whenMatched() :: Evolving :: |
| DeltaMergeMatchedActionBuilder | whenMatched(org.apache.spark.sql.Column condition) :: Evolving :: |
| DeltaMergeMatchedActionBuilder | whenMatched(String condition) :: Evolving :: |
| DeltaMergeNotMatchedActionBuilder | whenNotMatched() :: Evolving :: |
| DeltaMergeNotMatchedActionBuilder | whenNotMatched(org.apache.spark.sql.Column condition) :: Evolving :: |
| DeltaMergeNotMatchedActionBuilder | whenNotMatched(String condition) :: Evolving :: |
Methods inherited from class java.lang.Object:
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.apache.spark.internal.Logging:
$init$, initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, initLock, isTraceEnabled, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning, org$apache$spark$internal$Logging$$log__$eq, org$apache$spark$internal$Logging$$log_, uninitialize
public DeltaMergeMatchedActionBuilder whenMatched()
Build the actions to perform when the merge condition was matched. This returns a DeltaMergeMatchedActionBuilder object which can be used to specify how to update or delete the matched target table row with the source row.
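For instance, a minimal Scala sketch of this unconditional variant (table and column names assumed for illustration):

  deltaTable
    .as("target")
    .merge(source.as("source"), "target.key = source.key")
    .whenMatched()   // no condition: applies to every matched source-target row pair
    .updateAll()     // copy all source columns into the matched target row
    .execute()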
public DeltaMergeMatchedActionBuilder whenMatched(String condition)
Build the actions to perform when the merge condition was matched and the given condition is true. This returns a DeltaMergeMatchedActionBuilder object which can be used to specify how to update or delete the matched target table row with the source row.
Parameters:
condition - boolean expression as a SQL formatted string
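A sketch of this SQL-string variant (the value column is an assumption for illustration), where only matched rows whose value actually changed are rewritten:

  deltaTable
    .as("target")
    .merge(source.as("source"), "target.key = source.key")
    // SQL-formatted condition narrowing which matched rows get updated
    .whenMatched("target.value <> source.value")
    .updateExpr(Map("value" -> "source.value"))
    .execute()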
public DeltaMergeMatchedActionBuilder whenMatched(org.apache.spark.sql.Column condition)
Build the actions to perform when the merge condition was matched and the given condition is true. This returns a DeltaMergeMatchedActionBuilder object which can be used to specify how to update or delete the matched target table row with the source row.
Parameters:
condition - boolean expression as a Column object
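A sketch of the Column variant, building the condition with org.apache.spark.sql.functions.expr (the source.deleted flag is an assumption for illustration):

  import org.apache.spark.sql.functions.expr

  deltaTable
    .as("target")
    .merge(source.as("source"), "target.key = source.key")
    // Column-typed condition instead of a SQL string
    .whenMatched(expr("source.deleted = true"))
    .delete()
    .execute()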
public DeltaMergeNotMatchedActionBuilder whenNotMatched()
Build the actions to perform when the merge condition was not matched. This returns a DeltaMergeNotMatchedActionBuilder object which can be used to specify how to insert the new source row into the target table.
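A minimal sketch of the unconditional insert (assuming the source and target schemas line up):

  deltaTable
    .as("target")
    .merge(source.as("source"), "target.key = source.key")
    .whenNotMatched()   // every source row with no matching target row
    .insertAll()        // insert it with all of its columns
    .execute()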
public DeltaMergeNotMatchedActionBuilder whenNotMatched(String condition)
Build the actions to perform when the merge condition was not matched and the given condition is true. This returns a DeltaMergeNotMatchedActionBuilder object which can be used to specify how to insert the new source row into the target table.
Parameters:
condition - boolean expression as a SQL formatted string
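A sketch of the conditional insert (the value column is an assumption for illustration), where only non-matching source rows with a non-null value are inserted:

  deltaTable
    .as("target")
    .merge(source.as("source"), "target.key = source.key")
    .whenNotMatched("source.value IS NOT NULL")
    .insertExpr(Map("key" -> "source.key", "value" -> "source.value"))
    .execute()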
public DeltaMergeNotMatchedActionBuilder whenNotMatched(org.apache.spark.sql.Column condition)
Build the actions to perform when the merge condition was not matched and the given condition is true. This returns a DeltaMergeNotMatchedActionBuilder object which can be used to specify how to insert the new source row into the target table.
Parameters:
condition - boolean expression as a Column object
public void execute()
Execute the merge operation based on the built matched and not matched actions.
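The merge runs only when execute() is called; the preceding whenMatched/whenNotMatched calls just record the actions to apply. A sketch (table and column names assumed for illustration):

  val merge = deltaTable
    .as("target")
    .merge(source.as("source"), "target.key = source.key")
    .whenMatched().updateAll()
    .whenNotMatched().insertAll()

  merge.execute()   // the table is modified only at this point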