postgres
age
Usage
age(expr1, expr2) - Subtract arguments, producing a "symbolic" result that uses years and months
age(expr) - Subtract expr from current_date (at midnight)
Arguments
Examples
> SELECT age(timestamp '1957-06-13');
43 years 9 months 27 days
> SELECT age(timestamp '2001-04-10', timestamp '1957-06-13');
43 years 9 months 27 days
Class
org.apache.spark.sql.catalyst.expressions.postgresql.Age
Note
Since 0.1.0
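The "symbolic" subtraction above can be sketched in Python. This follows PostgreSQL's age() borrowing rule (days are borrowed from the earlier date's month); it is an illustrative sketch, not the extension's implementation:

```python
import calendar
from datetime import date

def age(later: date, earlier: date) -> str:
    """Symbolic date difference in years, months, and days,
    following PostgreSQL's age() borrowing rules (a sketch)."""
    years = later.year - earlier.year
    months = later.month - earlier.month
    days = later.day - earlier.day
    if days < 0:
        # Borrow the length of the earlier date's month, as PostgreSQL does.
        days += calendar.monthrange(earlier.year, earlier.month)[1]
        months -= 1
    if months < 0:
        months += 12
        years -= 1
    return f"{years} years {months} months {days} days"

print(age(date(2001, 4, 10), date(1957, 6, 13)))  # 43 years 9 months 27 days
```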
array_append
Usage
array_append(array, element) - Returns the array resulting from appending the element to the end of the array
Arguments
Examples
> SELECT array_append(array(1, 2, 3), 3);
[1,2,3,3]
> SELECT array_append(array(1, 2, 3), null);
[1,2,3,null]
> SELECT array_append(a, e) FROM VALUES (array(1,2), 3), (array(3, 4), null), (null, 5) tbl(a, e);
[1,2,3]
[3,4,null]
[5]
Class
org.apache.spark.sql.catalyst.expressions.postgresql.ArrayAppend
Note
Since 0.1.0
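The NULL handling shown in the examples can be sketched in Python, with None standing in for NULL (illustrative names, not the extension's code): a NULL array is treated as empty, and a NULL element is appended like any other value.

```python
from typing import Any, List, Optional

def array_append(arr: Optional[List[Any]], elem: Any) -> List[Any]:
    # A NULL (None) array yields a one-element result; a NULL
    # element is appended as-is.
    return ([] if arr is None else list(arr)) + [elem]

print(array_append([1, 2, 3], None))  # [1, 2, 3, None]
print(array_append(None, 5))          # [5]
```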
array_cat
Usage
array_cat(col1, col2, ..., colN) - Returns the concatenation of col1, col2, ..., colN.
Arguments
Examples
> SELECT array_cat('Spark', 'SQL');
SparkSQL
> SELECT array_cat(array(1, 2, 3), array(4, 5), array(6));
[1,2,3,4,5,6]
Class
org.apache.spark.sql.catalyst.expressions.Concat
Note
Concat logic for arrays is available since 2.4.0.
Since 1.5.0
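Because the function maps to Spark's Concat, it concatenates whatever its inputs are: strings yield a string, arrays yield one combined array. A minimal Python sketch of that dispatch (illustrative only):

```python
def array_cat(*cols):
    # String inputs concatenate into one string...
    if all(isinstance(c, str) for c in cols):
        return "".join(cols)
    # ...while array inputs concatenate into one array.
    out = []
    for c in cols:
        out.extend(c)
    return out

print(array_cat("Spark", "SQL"))          # SparkSQL
print(array_cat([1, 2, 3], [4, 5], [6]))  # [1, 2, 3, 4, 5, 6]
```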
array_length
Usage
N/A.
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.postgresql.ArrayLength
Note
Since
justifyDays
Usage
justifyDays(expr) - Adjust interval so 30-day time periods are represented as months
Arguments
Examples
> SELECT justifyDays(interval '1 month -59 day 25 hour');
-29 days 25 hours
Class
org.apache.spark.sql.catalyst.expressions.postgresql.JustifyDays
Note
Since 0.1.0
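The adjustment can be sketched in Python, modeling the interval as a (months, days, hours) triple and truncating toward zero as C integer division does — a sketch of PostgreSQL's justify_days rule, not the extension's code:

```python
def justify_days(months: int, days: int, hours: int):
    # Fold whole 30-day periods into the month component,
    # truncating toward zero as C integer division does.
    whole_months = int(days / 30)
    return months + whole_months, days - whole_months * 30, hours

# interval '1 month -59 day 25 hour' -> (0, -29, 25), i.e. -29 days 25 hours
print(justify_days(1, -59, 25))
```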
justifyHours
Usage
justifyHours(expr) - Adjust interval so 24-hour time periods are represented as days
Arguments
Examples
> SELECT justifyHours(interval '1 month -59 day 25 hour');
-29 days 25 hours
Class
org.apache.spark.sql.catalyst.expressions.postgresql.JustifyHours
Note
Since 0.1.0
justifyInterval
Usage
justifyInterval(expr) - Adjust interval using justifyHours and justifyDays, with additional sign adjustments
Arguments
Examples
> SELECT justifyInterval(interval '1 month -59 day 25 hour');
-29 days 25 hours
Class
org.apache.spark.sql.catalyst.expressions.postgresql.JustifyInterval
Note
Since 0.1.0
regr_count
Usage
regr_count(expr1, expr2) - Returns the count of all rows in an expression pair. The function eliminates expression pairs where either expression in the pair is NULL. If no rows remain, the function returns 0.
Arguments
expr1 The dependent DOUBLE PRECISION expression
expr2 The independent DOUBLE PRECISION expression
Examples
> SELECT regr_count(1, 2);
1
> SELECT regr_count(1, null);
0
Class
org.apache.spark.sql.catalyst.expressions.ansi.RegrCount
Note
Since 0.2.0
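Over a set of rows, the semantics reduce to counting the pairs with no NULL on either side. A Python sketch, with None standing in for NULL:

```python
def regr_count(pairs):
    # Count (y, x) pairs where neither side is NULL; 0 if none remain.
    return sum(1 for y, x in pairs if y is not None and x is not None)

print(regr_count([(1, 2)]))     # 1
print(regr_count([(1, None)]))  # 0
```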
scale
Usage
N/A.
Arguments
Examples
Class
Scale
Note
Since
split_part
Usage
split_part(text, delimiter, field) - Split string on delimiter and return the given field (counting from one).
Arguments
Examples
> SELECT split_part('abc~@~def~@~ghi', '~@~', 2);
def
Class
org.apache.spark.sql.catalyst.expressions.postgresql.SplitPart
Note
Since 0.1.0
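A Python sketch of the one-based field lookup. PostgreSQL returns an empty string when the field index is out of range; whether the extension matches that edge case is an assumption here:

```python
def split_part(text: str, delimiter: str, field: int) -> str:
    parts = text.split(delimiter)
    # Fields count from one; out-of-range fields yield '' in PostgreSQL.
    return parts[field - 1] if 1 <= field <= len(parts) else ""

print(split_part("abc~@~def~@~ghi", "~@~", 2))  # def
```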
stage_attempt_num
Usage
stage_attempt_num() - Get the stage attempt number, i.e. how many times the stage this task belongs to has been attempted
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.StageAttemptNumber
Note
Since 0.3.0
stage_id
Usage
stage_id() - Get the id of the stage which the current task belongs to
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.StageId
Note
Since 0.3.0
stage_id_with_retry
Usage
stage_id_with_retry(stageId) - Get the task attempt number; throws FetchFailedException in the stage with the given `stageId`, forcing that stage to retry
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.StageIdWithRetry
Note
Since 3.3.0
string_to_array
Usage
string_to_array(text, delimiter [, replaced]) - Splits the string into array elements using the supplied delimiter; elements equal to the optional null string become NULL
Arguments
Examples
> SELECT string_to_array('xx~^~yy~^~zz~^~', '~^~', 'yy');
["xx",null,"zz",""]
Class
org.apache.spark.sql.catalyst.expressions.postgresql.StringToArray
Note
Since 0.1.0
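The null-string replacement can be sketched in Python, with None standing in for NULL (illustrative, not the extension's code):

```python
from typing import List, Optional

def string_to_array(text: str, delimiter: str,
                    null_string: Optional[str] = None) -> List[Optional[str]]:
    # Split on the delimiter, then turn elements equal to the
    # optional null string into NULL (None).
    return [None if null_string is not None and e == null_string else e
            for e in text.split(delimiter)]

print(string_to_array("xx~^~yy~^~zz~^~", "~^~", "yy"))
# ['xx', None, 'zz', '']
```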
task_attempt_id
Usage
task_attempt_id() - Get an ID that is unique to this task attempt within SparkContext
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.TaskAttemptId
Note
Since 0.3.0
task_attempt_num
Usage
task_attempt_num() - Get the task attempt number, i.e. how many times this task has been attempted
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.TaskAttemptNumber
Note
Since 0.3.0
task_metrics_result_size
Usage
task_metrics_result_size() - Meaningless
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.TaskMetricsResultSize
Note
Since 0.3.0
unnest
Usage
unnest(expr) - Separates the elements of array `expr` into multiple rows recursively.
Arguments
Examples
> SELECT unnest(array(10, 20));
10
20
> SELECT unnest(a) FROM VALUES (array(1,2)), (array(3,4)) AS v1(a);
1
2
3
4
> SELECT unnest(a) FROM VALUES (array(array(1,2), array(3,4))) AS v1(a);
1
2
3
4
Class
org.apache.spark.sql.catalyst.expressions.postgresql.UnNest
Note
Since 0.1.0
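The recursive row expansion can be sketched as a Python generator that flattens nested arrays depth-first, yielding one row per leaf element (illustrative only):

```python
def unnest(arr):
    # Yield one row per leaf element, recursing into nested arrays.
    for e in arr:
        if isinstance(e, list):
            yield from unnest(e)
        else:
            yield e

print(list(unnest([10, 20])))          # [10, 20]
print(list(unnest([[1, 2], [3, 4]])))  # [1, 2, 3, 4]
```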