pyspark.sql.functions.datepart
pyspark.sql.functions.datepart(field, source)
Extracts a part of the date/timestamp or interval source.
New in version 3.5.0.
Parameters
field : Column or str
    selects which part of the source should be extracted; the supported string values are the same as the fields of the equivalent function extract.
source : Column or str
    a date/timestamp or interval column from which field should be extracted.
Returns
Column
a part of the date/timestamp or interval source.
See also
pyspark.sql.functions.date_part
pyspark.sql.functions.extract
Examples
>>> import datetime
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(datetime.datetime(2015, 4, 8, 13, 8, 15),)], ['ts'])
>>> df.select(
...     '*',
...     sf.datepart(sf.lit('YEAR'), 'ts').alias('year'),
...     sf.datepart(sf.lit('month'), 'ts').alias('month'),
...     sf.datepart(sf.lit('WEEK'), 'ts').alias('week'),
...     sf.datepart(sf.lit('D'), df.ts).alias('day'),
...     sf.datepart(sf.lit('M'), df.ts).alias('minute'),
...     sf.datepart(sf.lit('S'), df.ts).alias('second')
... ).show()
+-------------------+----+-----+----+---+------+---------+
|                 ts|year|month|week|day|minute|   second|
+-------------------+----+-----+----+---+------+---------+
|2015-04-08 13:08:15|2015|    4|  15|  8|     8|15.000000|
+-------------------+----+-----+----+---+------+---------+
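The example above covers a timestamp column. The following is a minimal sketch of the interval case, not taken from the official docstring: it assumes a day-time interval column created from a Python datetime.timedelta literal and uses the field names 'DAY' and 'HOUR', which follow the same rules as extract. Under those assumptions the selected values are expected to be 3 and 4; the rendered output is omitted here.

>>> import datetime
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame(
...     [(datetime.timedelta(days=3, hours=4, minutes=5),)], ['i'])
>>> df.select(
...     sf.datepart(sf.lit('DAY'), 'i').alias('day'),
...     sf.datepart(sf.lit('HOUR'), df.i).alias('hour')
... ).show()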