# Release 420 (22 Jun 2023)

## General
- Add support for the `any_value()` aggregation function. (#17777)
- Add support for underscores in numeric literals. (#17776)
- Add support for hexadecimal, binary, and octal numeric literals. (#17776)
- Deprecate the `dynamic-filtering.small-broadcast.*` and
  `dynamic-filtering.large-broadcast.*` configuration properties in favor of
  `dynamic-filtering.small.*` and `dynamic-filtering.large.*`. (#17831)
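The new literals and aggregation function can be sketched in a short query. The table, column names, and the exact `0x`/`0b`/`0o` prefix syntax shown here are illustrative assumptions based on the descriptions in #17776 and #17777:

```sql
-- any_value() returns an arbitrary non-null value from each group
SELECT region, any_value(comment)  -- hypothetical table and columns
FROM nation
GROUP BY region;

-- Underscores as digit separators, plus hexadecimal, binary,
-- and octal numeric literals
SELECT
  1_000_000   AS decimal_value,
  0xFF_FF     AS hex_value,
  0b1010_1010 AS binary_value,
  0o755       AS octal_value;
```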
## Security

## BigQuery connector
- Fix direct download of access tokens, and correctly use the proxy when it is
  enabled with the `bigquery.rpc-proxy.enabled` configuration property. (#17783)
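For reference, the proxy is turned on in the BigQuery catalog properties file; this one-line fragment shows only the property named above:

```text
# route BigQuery RPC traffic through the configured proxy
bigquery.rpc-proxy.enabled=true
```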
## Delta Lake connector
- Add support for recalculating all statistics with an `ANALYZE` statement. (#15968)
- Disallow using the root directory of a bucket (`scheme://authority`) as a
  table location without a trailing slash in the location name. (#17921)
- Fix Parquet writer incompatibility with Apache Spark and Databricks
  Runtime. (#17978)
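Statistics recalculation is triggered with a plain `ANALYZE` statement; the catalog, schema, and table names below are hypothetical:

```sql
-- recompute statistics for a Delta Lake table
ANALYZE delta.example_schema.example_table;
```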
## Druid connector

- Add support for tables with uppercase characters in their names. (#7197)
## Hive connector
- Add a native Avro file format reader. This can be disabled with the
  `avro.native-reader.enabled` configuration property or the
  `avro_native_reader_enabled` session property. (#17221)
- Require admin role privileges to perform `ALTER ... SET AUTHORIZATION ...`
  statements when the `hive.security` configuration property is set to
  `sql-standard`. (#16691)
- Improve query performance on partitioned Hive tables when table statistics
  are not available. (#17677)
- Disallow using the root directory of a bucket (`scheme://authority`) as a
  table location without a trailing slash in the location name. (#17921)
- Fix Parquet writer incompatibility with Apache Spark and Databricks
  Runtime. (#17978)
- Fix reading from a Hive table when its location is the root directory of an
  S3 bucket. (#17848)
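The Avro reader toggle mentioned above is a catalog session property; assuming the catalog is named `hive`, disabling it for a single session looks like this:

```sql
-- fall back to the non-native Avro reader for the current session only
SET SESSION hive.avro_native_reader_enabled = false;
```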
## Hudi connector

## Iceberg connector

## Kafka connector

- Fix server startup failure when a Kafka catalog is present. (#17299)
## MongoDB connector
- Add support for `ALTER TABLE ... RENAME COLUMN`. (#17874)
- Fix incorrect results when the order of the dbref type fields is different
  from `databaseName`, `collectionName`, and `id`. (#17883)
## SPI

- Move table function infrastructure to the `io.trino.spi.function.table`
  package. (#17774)