public class DeltaMergeBuilder
extends Object
implements org.apache.spark.sql.delta.util.AnalysisHelper, org.apache.spark.internal.Logging
Builder to specify how to merge data from a source DataFrame into the target Delta table. You can specify 1, 2, or 3 when clauses, of which there can be at most 2 whenMatched clauses and at most 1 whenNotMatched clause. Here are the constraints on these clauses.
- whenMatched clauses:
  - There can be at most one update action and one delete action in whenMatched clauses.
  - Each whenMatched clause can have an optional condition. However, if there are two whenMatched clauses, then the first one must have a condition.
  - When there are two whenMatched clauses and their conditions (or the lack of them) are such that a row matches both clauses, then the first clause/action is executed. In other words, the order of the whenMatched clauses matters.
  - If none of the whenMatched clauses match a source-target row pair that satisfies the merge condition, then the target rows will not be updated or deleted.
  - If you want to update all the columns of the target Delta table with the corresponding columns of the source DataFrame, then you can use whenMatched(...).updateAll(). This is equivalent to whenMatched(...).updateExpr(Map(("col1", "source.col1"), ("col2", "source.col2"), ...))
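To make the ordering rule above concrete, here is a small self-contained Java sketch that simulates first-matching-clause-wins semantics. It is a toy model, not the Delta API: `Row`, `MatchedClause`, and `applyMatched` are hypothetical names invented for this illustration.

```java
// Toy simulation of whenMatched clause ordering -- NOT the Delta API.
import java.util.List;
import java.util.Optional;
import java.util.function.Function;
import java.util.function.Predicate;

public class MatchedClauseOrdering {
    record Row(int key, int value) {}

    // A clause pairs a condition with an action; the action returns
    // Optional.of(updated row) for an update, Optional.empty() for a delete.
    record MatchedClause(Predicate<Row> cond, Function<Row, Optional<Row>> action) {}

    // First clause: conditional update; second clause: unconditional delete.
    // Mirrors the rule that with two clauses, the first must carry a condition.
    static final List<MatchedClause> CLAUSES = List.of(
        new MatchedClause(r -> r.value() > 10, r -> Optional.of(new Row(r.key(), 0))),
        new MatchedClause(r -> true, r -> Optional.empty())
    );

    static Optional<Row> applyMatched(List<MatchedClause> clauses, Row row) {
        for (MatchedClause c : clauses) {
            if (c.cond().test(row)) {
                return c.action().apply(row); // first matching clause wins
            }
        }
        return Optional.of(row); // no clause matched: row left unchanged
    }
}
```

A matched row with value 42 satisfies both clauses but is updated, because the first clause wins; a matched row with value 5 falls through to the unconditional delete clause and is removed.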
- whenNotMatched clauses:
  - This clause can have only an insert action, which can have an optional condition.
  - If the whenNotMatched clause is not present, or if it is present but the non-matching source row does not satisfy the condition, then the source row is not inserted.
  - If you want to insert all the columns of the target Delta table with the corresponding columns of the source DataFrame, then you can use whenNotMatched(...).insertAll(). This is equivalent to whenNotMatched(...).insertExpr(Map(("col1", "source.col1"), ("col2", "source.col2"), ...))
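The insert-or-skip behavior above can be sketched with a self-contained toy model. Again, this is not the Delta API: `Row` and `applyNotMatched` are hypothetical names, and a `null` clause stands in for "no whenNotMatched clause present".

```java
// Toy simulation of whenNotMatched semantics -- NOT the Delta API.
import java.util.Optional;
import java.util.function.Predicate;

public class NotMatchedClause {
    record Row(int key, int value) {}

    // A non-matching source row is inserted only if a whenNotMatched clause
    // exists (clause != null) and its optional condition holds.
    static Optional<Row> applyNotMatched(Predicate<Row> clause, Row source) {
        if (clause != null && clause.test(source)) {
            return Optional.of(source); // condition holds: insert the source row
        }
        return Optional.empty();        // absent clause or false condition: skip
    }
}
```

With a clause such as `r -> r.value() > 0`, a non-matching source row with a positive value is inserted, while a row with a non-positive value, or any row when the clause is absent, is skipped.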
Scala example to update a key-value Delta table with new key-values from a source DataFrame:
deltaTable
.as("target")
.merge(
source.as("source"),
"target.key = source.key")
.whenMatched
.updateExpr(Map(
"value" -> "source.value"))
.whenNotMatched
.insertExpr(Map(
"key" -> "source.key",
"value" -> "source.value"))
.execute()
Java example to update a key-value Delta table with new key-values from a source DataFrame:
deltaTable
.as("target")
.merge(
source.as("source"),
"target.key = source.key")
  .whenMatched()
.updateExpr(
new HashMap<String, String>() {{
put("value", "source.value");
}})
  .whenNotMatched()
.insertExpr(
new HashMap<String, String>() {{
put("key", "source.key");
put("value", "source.value");
}})
.execute();
| Constructor and Description |
| --- |
| DeltaMergeBuilder() |
| Modifier and Type | Method and Description |
| --- | --- |
| void | execute() Execute the merge operation based on the built matched and not matched actions. |
| DeltaMergeMatchedActionBuilder | whenMatched() Build the actions to perform when the merge condition was matched. |
| DeltaMergeMatchedActionBuilder | whenMatched(org.apache.spark.sql.Column condition) Build the actions to perform when the merge condition was matched and the given condition is true. |
| DeltaMergeMatchedActionBuilder | whenMatched(String condition) Build the actions to perform when the merge condition was matched and the given condition is true. |
| DeltaMergeNotMatchedActionBuilder | whenNotMatched() Build the action to perform when the merge condition was not matched. |
| DeltaMergeNotMatchedActionBuilder | whenNotMatched(org.apache.spark.sql.Column condition) Build the actions to perform when the merge condition was not matched and the given condition is true. |
| DeltaMergeNotMatchedActionBuilder | whenNotMatched(String condition) Build the actions to perform when the merge condition was not matched and the given condition is true. |
Methods inherited from class java.lang.Object: equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.apache.spark.sql.delta.util.AnalysisHelper: $init$, improveUnsupportedOpError, resolveReferencesForExpressions, toDataset, tryResolveReferences
Methods inherited from interface org.apache.spark.internal.Logging: $init$, initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, initLock, isTraceEnabled, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning, org$apache$spark$internal$Logging$$log__$eq, org$apache$spark$internal$Logging$$log_, uninitialize
public DeltaMergeMatchedActionBuilder whenMatched()

Build the actions to perform when the merge condition was matched. This returns a DeltaMergeMatchedActionBuilder object which can be used to specify how to update or delete the matched target table row with the source row.

public DeltaMergeMatchedActionBuilder whenMatched(String condition)

Build the actions to perform when the merge condition was matched and the given condition is true. This returns a DeltaMergeMatchedActionBuilder object which can be used to specify how to update or delete the matched target table row with the source row.

Parameters: condition - boolean expression as a SQL formatted string

public DeltaMergeMatchedActionBuilder whenMatched(org.apache.spark.sql.Column condition)

Build the actions to perform when the merge condition was matched and the given condition is true. This returns a DeltaMergeMatchedActionBuilder object which can be used to specify how to update or delete the matched target table row with the source row.

Parameters: condition - boolean expression as a Column object

public DeltaMergeNotMatchedActionBuilder whenNotMatched()

Build the action to perform when the merge condition was not matched. This returns a DeltaMergeNotMatchedActionBuilder object which can be used to specify how to insert the new source row into the target table.

public DeltaMergeNotMatchedActionBuilder whenNotMatched(String condition)

Build the actions to perform when the merge condition was not matched and the given condition is true. This returns a DeltaMergeNotMatchedActionBuilder object which can be used to specify how to insert the new source row into the target table.

Parameters: condition - boolean expression as a SQL formatted string

public DeltaMergeNotMatchedActionBuilder whenNotMatched(org.apache.spark.sql.Column condition)

Build the actions to perform when the merge condition was not matched and the given condition is true. This returns a DeltaMergeNotMatchedActionBuilder object which can be used to specify how to insert the new source row into the target table.

Parameters: condition - boolean expression as a Column object

public void execute()

Execute the merge operation based on the built matched and not matched actions.