presto¶
char2hexint¶
Usage
char2hexint(expr) - Returns the hexadecimal representation of the UTF-16BE encoding of the string.
Arguments
Examples
> SELECT char2hexint('Spark SQL');
0053007000610072006B002000530051004C
Class
org.apache.spark.sql.catalyst.expressions.teradata.Char2HexInt
Note
Since 0.1.0
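The semantics can be sketched in Python: encode the string as UTF-16BE and print the bytes as uppercase hex (the function name here is illustrative, not part of the library):

```python
def char2hexint(s: str) -> str:
    # Hexadecimal representation of the UTF-16BE encoding of the string.
    return s.encode("utf-16-be").hex().upper()

print(char2hexint("Spark SQL"))  # 0053007000610072006B002000530051004C
```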
cosine_similarity¶
Usage
N/A.
Arguments
Examples
Class
CosineSimilarity
Note
Since
EDITDISTANCE¶
Usage
EDITDISTANCE(str1, str2) - Returns the Levenshtein distance between the two given strings.
Arguments
Examples
> SELECT EDITDISTANCE('kitten', 'sitting');
3
Class
org.apache.spark.sql.catalyst.expressions.Levenshtein
Note
Since 1.5.0
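The Levenshtein distance counts the minimum number of single-character insertions, deletions, and substitutions needed to turn one string into the other. A minimal dynamic-programming sketch in Python (illustrative only, not the library's implementation):

```python
def editdistance(s1: str, s2: str) -> int:
    # prev[j] = distance between s1[:i-1] and s2[:j]
    prev = list(range(len(s2) + 1))
    for i, c1 in enumerate(s1, 1):
        cur = [i]
        for j, c2 in enumerate(s2, 1):
            cur.append(min(prev[j] + 1,          # delete c1
                           cur[j - 1] + 1,       # insert c2
                           prev[j - 1] + (c1 != c2)))  # substitute
        prev = cur
    return prev[-1]

print(editdistance("kitten", "sitting"))  # 3
```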
from_base¶
Usage
from_base(num, from_base, to_base) - Convert `num` from `from_base` to `to_base`.
Arguments
Examples
> SELECT from_base('100', 2, 10);
4
> SELECT from_base(-10, 16, -10);
-16
Class
org.apache.spark.sql.catalyst.expressions.Conv
Note
Since 1.5.0
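Since the backing class is Spark's `Conv` expression, the behavior matches `conv`: a negative `to_base` produces signed output, while a positive `to_base` treats negative inputs as unsigned 64-bit values. A simplified Python sketch of these semantics (illustrative, covering the cases in the examples above):

```python
DIGITS = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def conv(num, from_base: int, to_base: int) -> str:
    # Simplified model of Spark's Conv expression.
    n = int(str(num), from_base)
    if to_base > 0 and n < 0:
        n += 1 << 64  # positive to_base: reinterpret as unsigned 64-bit
    base = abs(to_base)
    sign = "-" if n < 0 else ""
    n = abs(n)
    out = ""
    while True:
        out = DIGITS[n % base] + out
        n //= base
        if n == 0:
            return sign + out

print(conv("100", 2, 10))   # 4
print(conv("-10", 16, -10)) # -16
```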
index¶
Usage
index(substr, str[, pos]) - Returns the position of the first occurrence of `substr` in `str` after position `pos`.
The given `pos` and return value are 1-based.
Arguments
Examples
> SELECT index('bar', 'foobarbar');
4
> SELECT index('bar', 'foobarbar', 5);
7
> SELECT POSITION('bar' IN 'foobarbar');
4
Class
org.apache.spark.sql.catalyst.expressions.StringLocate
Note
Since 1.5.0
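The 1-based search semantics can be sketched in Python: start scanning at position `pos`, return the 1-based match position, and return 0 when there is no match (illustrative only):

```python
def index(substr: str, s: str, pos: int = 1) -> int:
    # str.find is 0-based and returns -1 on no match,
    # so shifting by one gives 1-based positions and 0 for "not found".
    return s.find(substr, pos - 1) + 1

print(index("bar", "foobarbar"))     # 4
print(index("bar", "foobarbar", 5))  # 7
```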
infinity¶
Usage
N/A.
Arguments
Examples
Class
Infinity
Note
Since
is_finite¶
Usage
N/A.
Arguments
Examples
Class
IsFinite
Note
Since
is_infinite¶
Usage
N/A.
Arguments
Examples
Class
IsInfinite
Note
Since
nan¶
Usage
N/A.
Arguments
Examples
Class
NaN
Note
Since
regr_count¶
Usage
regr_count(expr1, expr2) - Returns the count of all rows in an expression pair. The function eliminates expression pairs where either expression in the pair is NULL. If no rows remain, the function returns 0.
Arguments
expr1 The dependent DOUBLE PRECISION expression
expr2 The independent DOUBLE PRECISION expression
Examples
> SELECT regr_count(1, 2);
1
> SELECT regr_count(1, null);
0
Class
org.apache.spark.sql.catalyst.expressions.ansi.RegrCount
Note
Since 0.2.0
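The counting rule can be sketched in Python over a list of `(expr1, expr2)` pairs: pairs with a NULL on either side are skipped, and an empty result counts as 0 (illustrative only):

```python
def regr_count(pairs):
    # Count only the pairs where neither value is NULL (None).
    return sum(1 for x, y in pairs if x is not None and y is not None)

print(regr_count([(1, 2)]))     # 1
print(regr_count([(1, None)]))  # 0
```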
stage_attempt_num¶
Usage
stage_attempt_num() - Returns the stage attempt number, i.e. how many times the stage this task belongs to has been attempted.
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.StageAttemptNumber
Note
Since 0.3.0
stage_id¶
Usage
stage_id() - Returns the ID of the stage the current task belongs to.
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.StageId
Note
Since 0.3.0
stage_id_with_retry¶
Usage
stage_id_with_retry(stageId) - Returns the task attempt number; in the stage with ID `stageId` it throws a FetchFailedException to force that stage to retry.
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.StageIdWithRetry
Note
Since 3.3.0
task_attempt_id¶
Usage
task_attempt_id() - Returns an ID that is unique to this task attempt within its SparkContext.
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.TaskAttemptId
Note
Since 0.3.0
task_attempt_num¶
Usage
task_attempt_num() - Returns the task attempt number, i.e. how many times this task has been attempted.
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.TaskAttemptNumber
Note
Since 0.3.0
task_metrics_result_size¶
Usage
task_metrics_result_size() - Meaningless
Arguments
Examples
Class
org.apache.spark.sql.catalyst.expressions.debug.TaskMetricsResultSize
Note
Since 0.3.0
to_base¶
Usage
to_base(num, from_base, to_base) - Convert `num` from `from_base` to `to_base`.
Arguments
Examples
> SELECT to_base('100', 2, 10);
4
> SELECT to_base(-10, 16, -10);
-16
Class
org.apache.spark.sql.catalyst.expressions.Conv
Note
Since 1.5.0
try¶
Usage
try(expr) - Evaluate an expression and handle certain types of runtime exceptions by returning NULL.
In cases where queries should produce NULL instead of failing when corrupt or invalid data is encountered, the TRY function can be useful, especially when ANSI mode is on and null tolerance is needed for certain columns or outputs.
AnalysisExceptions are not handled; the runtime exceptions typically handled by the try function are:
* ArithmeticException - e.g. division by zero, numeric value out of range
* NumberFormatException - e.g. invalid casting
* IllegalArgumentException - e.g. invalid datetime pattern, missing format argument for string formatting
* DateTimeException - e.g. invalid datetime values
* UnsupportedEncodingException - e.g. encoding or decoding a string with an invalid charset
Arguments
Examples
> SELECT try(1 / 0);
NULL
> SELECT try(date_format(timestamp '2019-10-06', 'yyyy-MM-dd uucc'));
NULL
> SELECT try((5e36BD + 0.1) + 5e36BD);
NULL
> SELECT try(regexp_extract('1a 2b 14m', '\\d+', 1));
NULL
> SELECT try(encode('abc', 'utf-88'));
NULL
Class
org.apache.spark.sql.catalyst.expressions.teradata.TryExpression
Note
Since 0.1.0
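The catch-and-return-NULL behavior can be sketched in Python: evaluate a thunk and map a whitelist of runtime errors to None. The exception classes here are Python's rough analogues of the JVM exceptions listed above (an illustrative sketch, not the library's implementation):

```python
def try_(thunk):
    # ArithmeticError ~ ArithmeticException, ValueError ~ NumberFormatException /
    # IllegalArgumentException, UnicodeError ~ UnsupportedEncodingException.
    try:
        return thunk()
    except (ArithmeticError, ValueError, UnicodeError):
        return None

print(try_(lambda: 1 / 0))        # None
print(try_(lambda: int("abc")))   # None
print(try_(lambda: 2 + 2))        # 4
```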