pyspark.sql.SparkSession.addTag

SparkSession.addTag(tag)
Add a tag to be assigned to all the operations started by this thread in this session.

Often, a unit of execution in an application consists of multiple Spark executions. Application programmers can use this method to group all those jobs together and give them a group tag. The application can then use SparkSession.interruptTag() to cancel all running executions with this tag.

There may be multiple tags present at the same time, so different parts of the application may use different tags to perform cancellation at different levels of granularity.

New in version 3.5.0.

Changed in version 4.0.0: Supports Spark Classic.

Parameters
tag : str
    The tag to be added. Cannot contain ‘,’ (comma) character or be an empty string.
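
Examples

A minimal sketch of tag-based cancellation, assuming an existing SparkSession bound to the name spark; the tag name, the example job, and the sleep interval are illustrative only. The tag is attached inside the worker thread (tags apply to operations started by the tagging thread), and the cancellation is issued from another thread via SparkSession.interruptTag().

>>> import time
>>> from pyspark import InheritableThread
>>> def run_tagged_job():
...     spark.addTag("long-running-etl")      # jobs started by this thread now carry the tag
...     spark.range(100_000_000).count()      # this execution is tagged "long-running-etl"
...     spark.removeTag("long-running-etl")   # stop tagging subsequent jobs in this thread
...
>>> t = InheritableThread(target=run_tagged_job)
>>> t.start()
>>> time.sleep(1)  # give the tagged job a moment to start
>>> interrupted = spark.interruptTag("long-running-etl")  # cancel all executions carrying the tag
>>> t.join()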