Table Functions¶
This page lists all table functions available in Spark SQL.
python_worker_logs¶
python_worker_logs() - Returns a table of logs collected from Python workers.
Examples:
> SET spark.sql.pyspark.worker.logging.enabled=true;
spark.sql.pyspark.worker.logging.enabled true
> SELECT * FROM python_worker_logs();
Since: 4.1.0
range¶
range(start[, end[, step[, numParts]]]) / range(end) - Returns a table of values within a specified range.
Arguments:
- start - An optional BIGINT literal, defaulting to 0, marking the first value generated.
- end - A BIGINT literal marking the endpoint (exclusive) of the number generation.
- step - An optional BIGINT literal, defaulting to 1, specifying the increment used when generating values.
- numParts - An optional INTEGER literal specifying how the production of rows is spread across partitions.
Examples:
> SELECT * FROM range(1);
+---+
| id|
+---+
|  0|
+---+
> SELECT * FROM range(0, 2);
+---+
| id|
+---+
|  0|
|  1|
+---+
> SELECT * FROM range(0, 4, 2);
+---+
| id|
+---+
|  0|
|  2|
+---+
Since: 2.0.0
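The argument handling above can be sketched in plain Python. This is a hypothetical illustration of the documented semantics only (start defaults to 0, step defaults to 1, end is exclusive, and the single-argument form is treated as range(end)); the name `sql_range` and the list-of-dicts row shape are assumptions for the sketch, not Spark APIs.

```python
def sql_range(start, end=None, step=1):
    """Sketch of the row set produced by the range table function."""
    # Single-argument form range(end): the lone argument is the
    # exclusive endpoint and start defaults to 0.
    if end is None:
        start, end = 0, start
    # Each generated value appears in a single column named `id`.
    return [{"id": i} for i in range(start, end, step)]

print(sql_range(1))        # rows of SELECT * FROM range(1)
print(sql_range(0, 2))     # rows of SELECT * FROM range(0, 2)
print(sql_range(0, 4, 2))  # rows of SELECT * FROM range(0, 4, 2)
```

Note that this sketch does not model numParts, which only controls how row production is partitioned across the cluster, not which rows appear.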