Merging upstream version 26.12.0.
Signed-off-by: Daniel Baumann <daniel@debian.org>
This commit is contained in:
parent d24d19e9ea
commit 69b6dd9501
70 changed files with 1134 additions and 340 deletions
@@ -1,6 +1,12 @@
 Changelog
 =========
 
+## [v26.11.1] - 2025-03-18
+
+### :bug: Bug Fixes
+
+- [`d7b3b3e`](https://github.com/tobymao/sqlglot/commit/d7b3b3e89720d1783d092a2c60a9c2209d9984a2) - **optimizer**: handle TableFromRows properly in annotate_types *(PR [#4889](https://github.com/tobymao/sqlglot/pull/4889) by [@georgesittas](https://github.com/georgesittas))*
+  - :arrow_lower_right: *fixes issue [#4004](https://github.com/TobikoData/sqlmesh/issues/4004) opened by [@hmeng-taproot](https://github.com/hmeng-taproot)*
+
 ## [v26.11.0] - 2025-03-17
 
 ### :boom: BREAKING CHANGES
 
 - due to [`ac3d311`](https://github.com/tobymao/sqlglot/commit/ac3d311c4184ca2ced603a100588e3e7435ce352) - do not expand having expressions if they conflict with a projection *(PR [#4881](https://github.com/tobymao/sqlglot/pull/4881) by [@tobymao](https://github.com/tobymao))*:
@@ -6178,3 +6184,4 @@ Changelog
 [v26.10.0]: https://github.com/tobymao/sqlglot/compare/v26.9.0...v26.10.0
 [v26.10.1]: https://github.com/tobymao/sqlglot/compare/v26.10.0...v26.10.1
 [v26.11.0]: https://github.com/tobymao/sqlglot/compare/v26.10.1...v26.11.0
+[v26.11.1]: https://github.com/tobymao/sqlglot/compare/v26.11.0...v26.11.1
@@ -6,7 +6,7 @@ It is a very comprehensive generic SQL parser with a robust [test suite](https:/
 
 You can easily [customize](#custom-dialects) the parser, [analyze](#metadata) queries, traverse expression trees, and programmatically [build](#build-and-modify-sql) SQL.
 
-Syntax [errors](#parser-errors) are highlighted and dialect incompatibilities can warn or raise depending on configurations. However, SQLGlot does not aim to be a SQL validator, so it may fail to detect certain syntax errors.
+SQLGlot can detect a variety of [syntax errors](#parser-errors), such as unbalanced parentheses, incorrect usage of reserved keywords, and so on. These errors are highlighted and dialect incompatibilities can warn or raise depending on configurations.
 
 Learn more about SQLGlot in the API [documentation](https://sqlglot.com/) and the expression tree [primer](https://github.com/tobymao/sqlglot/blob/main/posts/ast_primer.md).
@@ -80,10 +80,6 @@ I tried to output SQL but it's not in the correct dialect!
 
 * Like parsing, generating SQL also requires the target dialect to be specified, otherwise the SQLGlot dialect will be used by default. For example, to transpile a query from Spark SQL to DuckDB, do `parse_one(sql, dialect="spark").sql(dialect="duckdb")` (alternatively: `transpile(sql, read="spark", write="duckdb")`).
 
-I tried to parse invalid SQL and it worked, even though it should raise an error! Why didn't it validate my SQL?
-
-* SQLGlot does not aim to be a SQL validator - it is designed to be very forgiving. This makes the codebase more comprehensive and also gives more flexibility to its users, e.g. by allowing them to include trailing commas in their projection lists.
-
 What happened to sqlglot.dataframe?
 
 * The PySpark dataframe api was moved to a standalone library called [SQLFrame](https://github.com/eakmanrq/sqlframe) in v24. It now allows you to run queries as opposed to just generate SQL.
File diff suppressed because one or more lines are too long
@@ -84,8 +84,8 @@
 </span><span id="L-17"><a href="#L-17"><span class="linenos">17</span></a><span class="n">__version_tuple__</span><span class="p">:</span> <span class="n">VERSION_TUPLE</span>
 </span><span id="L-18"><a href="#L-18"><span class="linenos">18</span></a><span class="n">version_tuple</span><span class="p">:</span> <span class="n">VERSION_TUPLE</span>
 </span><span id="L-19"><a href="#L-19"><span class="linenos">19</span></a>
-</span><span id="L-20"><a href="#L-20"><span class="linenos">20</span></a><span class="n">__version__</span> <span class="o">=</span> <span class="n">version</span> <span class="o">=</span> <span class="s1">'26.11.0'</span>
-</span><span id="L-21"><a href="#L-21"><span class="linenos">21</span></a><span class="n">__version_tuple__</span> <span class="o">=</span> <span class="n">version_tuple</span> <span class="o">=</span> <span class="p">(</span><span class="mi">26</span><span class="p">,</span> <span class="mi">11</span><span class="p">,</span> <span class="mi">0</span><span class="p">)</span>
+</span><span id="L-20"><a href="#L-20"><span class="linenos">20</span></a><span class="n">__version__</span> <span class="o">=</span> <span class="n">version</span> <span class="o">=</span> <span class="s1">'26.11.1'</span>
+</span><span id="L-21"><a href="#L-21"><span class="linenos">21</span></a><span class="n">__version_tuple__</span> <span class="o">=</span> <span class="n">version_tuple</span> <span class="o">=</span> <span class="p">(</span><span class="mi">26</span><span class="p">,</span> <span class="mi">11</span><span class="p">,</span> <span class="mi">1</span><span class="p">)</span>
 </span></pre></div>

@@ -93,7 +93,7 @@
 <section id="__version__">
 <div class="attr variable">
 <span class="name">__version__</span><span class="annotation">: str</span> =
-<span class="default_value">'26.11.0'</span>
+<span class="default_value">'26.11.1'</span>
 </div>

@@ -105,7 +105,7 @@
 <section id="__version_tuple__">
 <div class="attr variable">
 <span class="name">__version_tuple__</span><span class="annotation">: object</span> =
-<span class="default_value">(26, 11, 0)</span>
+<span class="default_value">(26, 11, 1)</span>
 </div>

@@ -117,7 +117,7 @@
 <section id="version">
 <div class="attr variable">
 <span class="name">version</span><span class="annotation">: str</span> =
-<span class="default_value">'26.11.0'</span>
+<span class="default_value">'26.11.1'</span>
 </div>

@@ -129,7 +129,7 @@
 <section id="version_tuple">
 <div class="attr variable">
 <span class="name">version_tuple</span><span class="annotation">: object</span> =
-<span class="default_value">(26, 11, 0)</span>
+<span class="default_value">(26, 11, 1)</span>
 </div>
@@ -355,7 +355,7 @@ dialect implementations in order to understand how their various components can
 <section id="Athena">
 <div class="attr variable">
 <span class="name">Athena</span> =
-<span class="default_value"><MagicMock id='139764207005664'></span>
+<span class="default_value"><MagicMock id='140418209935328'></span>
 </div>

@@ -367,7 +367,7 @@ dialect implementations in order to understand how their various components can
 <section id="BigQuery">
 <div class="attr variable">
 <span class="name">BigQuery</span> =
-<span class="default_value"><MagicMock id='139764203079664'></span>
+<span class="default_value"><MagicMock id='140418205943792'></span>
 </div>

@@ -379,7 +379,7 @@ dialect implementations in order to understand how their various components can
 <section id="ClickHouse">
 <div class="attr variable">
 <span class="name">ClickHouse</span> =
-<span class="default_value"><MagicMock id='139764203085856'></span>
+<span class="default_value"><MagicMock id='140418205949984'></span>
 </div>

@@ -391,7 +391,7 @@ dialect implementations in order to understand how their various components can
 <section id="Databricks">
 <div class="attr variable">
 <span class="name">Databricks</span> =
-<span class="default_value"><MagicMock id='139764212168112'></span>
+<span class="default_value"><MagicMock id='140418215278000'></span>
 </div>

@@ -403,7 +403,7 @@ dialect implementations in order to understand how their various components can
 <section id="Doris">
 <div class="attr variable">
 <span class="name">Doris</span> =
-<span class="default_value"><MagicMock id='139764212159760'></span>
+<span class="default_value"><MagicMock id='140418215269648'></span>
 </div>

@@ -415,7 +415,7 @@ dialect implementations in order to understand how their various components can
 <section id="Drill">
 <div class="attr variable">
 <span class="name">Drill</span> =
-<span class="default_value"><MagicMock id='139764197326704'></span>
+<span class="default_value"><MagicMock id='140418200239984'></span>
 </div>

@@ -427,7 +427,7 @@ dialect implementations in order to understand how their various components can
 <section id="Druid">
 <div class="attr variable">
 <span class="name">Druid</span> =
-<span class="default_value"><MagicMock id='139764196429584'></span>
+<span class="default_value"><MagicMock id='140418197294864'></span>
 </div>

@@ -439,7 +439,7 @@ dialect implementations in order to understand how their various components can
 <section id="DuckDB">
 <div class="attr variable">
 <span class="name">DuckDB</span> =
-<span class="default_value"><MagicMock id='139764196433520'></span>
+<span class="default_value"><MagicMock id='140418197298800'></span>
 </div>

@@ -451,7 +451,7 @@ dialect implementations in order to understand how their various components can
 <section id="Dune">
 <div class="attr variable">
 <span class="name">Dune</span> =
-<span class="default_value"><MagicMock id='139764210494832'></span>
+<span class="default_value"><MagicMock id='140418213604720'></span>
 </div>

@@ -463,7 +463,7 @@ dialect implementations in order to understand how their various components can
 <section id="Hive">
 <div class="attr variable">
 <span class="name">Hive</span> =
-<span class="default_value"><MagicMock id='139764202200064'></span>
+<span class="default_value"><MagicMock id='140418209176576'></span>
 </div>

@@ -475,7 +475,7 @@ dialect implementations in order to understand how their various components can
 <section id="Materialize">
 <div class="attr variable">
 <span class="name">Materialize</span> =
-<span class="default_value"><MagicMock id='139764203072496'></span>
+<span class="default_value"><MagicMock id='140418208748144'></span>
 </div>

@@ -487,7 +487,7 @@ dialect implementations in order to understand how their various components can
 <section id="MySQL">
 <div class="attr variable">
 <span class="name">MySQL</span> =
-<span class="default_value"><MagicMock id='139764206147552'></span>
+<span class="default_value"><MagicMock id='140418214865088'></span>
 </div>

@@ -499,7 +499,7 @@ dialect implementations in order to understand how their various components can
 <section id="Oracle">
 <div class="attr variable">
 <span class="name">Oracle</span> =
-<span class="default_value"><MagicMock id='139764211952192'></span>
+<span class="default_value"><MagicMock id='140418201758272'></span>
 </div>

@@ -511,7 +511,7 @@ dialect implementations in order to understand how their various components can
 <section id="Postgres">
 <div class="attr variable">
 <span class="name">Postgres</span> =
-<span class="default_value"><MagicMock id='139764211236048'></span>
+<span class="default_value"><MagicMock id='140418214345936'></span>
 </div>

@@ -523,7 +523,7 @@ dialect implementations in order to understand how their various components can
 <section id="Presto">
 <div class="attr variable">
 <span class="name">Presto</span> =
-<span class="default_value"><MagicMock id='139764211236576'></span>
+<span class="default_value"><MagicMock id='140418214346464'></span>
 </div>

@@ -535,7 +535,7 @@ dialect implementations in order to understand how their various components can
 <section id="PRQL">
 <div class="attr variable">
 <span class="name">PRQL</span> =
-<span class="default_value"><MagicMock id='139764206347136'></span>
+<span class="default_value"><MagicMock id='140418209194880'></span>
 </div>

@@ -547,7 +547,7 @@ dialect implementations in order to understand how their various components can
 <section id="Redshift">
 <div class="attr variable">
 <span class="name">Redshift</span> =
-<span class="default_value"><MagicMock id='139764211600640'></span>
+<span class="default_value"><MagicMock id='140418214694144'></span>
 </div>

@@ -559,7 +559,7 @@ dialect implementations in order to understand how their various components can
 <section id="RisingWave">
 <div class="attr variable">
 <span class="name">RisingWave</span> =
-<span class="default_value"><MagicMock id='139764211253360'></span>
+<span class="default_value"><MagicMock id='140418214346864'></span>
 </div>

@@ -571,7 +571,7 @@ dialect implementations in order to understand how their various components can
 <section id="Snowflake">
 <div class="attr variable">
 <span class="name">Snowflake</span> =
-<span class="default_value"><MagicMock id='139764197062160'></span>
+<span class="default_value"><MagicMock id='140418199975440'></span>
 </div>

@@ -583,7 +583,7 @@ dialect implementations in order to understand how their various components can
 <section id="Spark">
 <div class="attr variable">
 <span class="name">Spark</span> =
-<span class="default_value"><MagicMock id='139764197064368'></span>
+<span class="default_value"><MagicMock id='140418199977648'></span>
 </div>

@@ -595,7 +595,7 @@ dialect implementations in order to understand how their various components can
 <section id="Spark2">
 <div class="attr variable">
 <span class="name">Spark2</span> =
-<span class="default_value"><MagicMock id='139764207440400'></span>
+<span class="default_value"><MagicMock id='140418210304528'></span>
 </div>

@@ -607,7 +607,7 @@ dialect implementations in order to understand how their various components can
 <section id="SQLite">
 <div class="attr variable">
 <span class="name">SQLite</span> =
-<span class="default_value"><MagicMock id='139764207446016'></span>
+<span class="default_value"><MagicMock id='140418210310144'></span>
 </div>

@@ -619,7 +619,7 @@ dialect implementations in order to understand how their various components can
 <section id="StarRocks">
 <div class="attr variable">
 <span class="name">StarRocks</span> =
-<span class="default_value"><MagicMock id='139764198852688'></span>
+<span class="default_value"><MagicMock id='140418205976656'></span>
 </div>

@@ -631,7 +631,7 @@ dialect implementations in order to understand how their various components can
 <section id="Tableau">
 <div class="attr variable">
 <span class="name">Tableau</span> =
-<span class="default_value"><MagicMock id='139764198853984'></span>
+<span class="default_value"><MagicMock id='140418205977952'></span>
 </div>

@@ -643,7 +643,7 @@ dialect implementations in order to understand how their various components can
 <section id="Teradata">
 <div class="attr variable">
 <span class="name">Teradata</span> =
-<span class="default_value"><MagicMock id='139764194709360'></span>
+<span class="default_value"><MagicMock id='140418197655408'></span>
 </div>

@@ -655,7 +655,7 @@ dialect implementations in order to understand how their various components can
 <section id="Trino">
 <div class="attr variable">
 <span class="name">Trino</span> =
-<span class="default_value"><MagicMock id='139764194717232'></span>
+<span class="default_value"><MagicMock id='140418197663280'></span>
 </div>

@@ -667,7 +667,7 @@ dialect implementations in order to understand how their various components can
 <section id="TSQL">
 <div class="attr variable">
 <span class="name">TSQL</span> =
-<span class="default_value"><MagicMock id='139764194741600'></span>
+<span class="default_value"><MagicMock id='140418197671264'></span>
 </div>

@@ -679,7 +679,7 @@ dialect implementations in order to understand how their various components can
 <section id="Dialect">
 <div class="attr variable">
 <span class="name">Dialect</span> =
-<span class="default_value"><MagicMock id='139764194749520'></span>
+<span class="default_value"><MagicMock id='140418197679184'></span>
 </div>

@@ -691,7 +691,7 @@ dialect implementations in order to understand how their various components can
 <section id="Dialects">
 <div class="attr variable">
 <span class="name">Dialects</span> =
-<span class="default_value"><MagicMock id='139764194757504'></span>
+<span class="default_value"><MagicMock id='140418197703552'></span>
 </div>
File diff suppressed because one or more lines are too long
@@ -61852,7 +61852,7 @@ Otherwise, this resets the expressions.</li>
 <div id="DataType.STRUCT_TYPES" class="classattr">
 <div class="attr variable">
 <span class="name">STRUCT_TYPES</span> =
-<span class="default_value">{<Type.NESTED: 'NESTED'>, <Type.UNION: 'UNION'>, <Type.STRUCT: 'STRUCT'>, <Type.OBJECT: 'OBJECT'>}</span>
+<span class="default_value">{<Type.STRUCT: 'STRUCT'>, <Type.OBJECT: 'OBJECT'>, <Type.NESTED: 'NESTED'>, <Type.UNION: 'UNION'>}</span>
 </div>

@@ -61864,7 +61864,7 @@ Otherwise, this resets the expressions.</li>
 <div id="DataType.ARRAY_TYPES" class="classattr">
 <div class="attr variable">
 <span class="name">ARRAY_TYPES</span> =
-<span class="default_value">{<Type.LIST: 'LIST'>, <Type.ARRAY: 'ARRAY'>}</span>
+<span class="default_value">{<Type.ARRAY: 'ARRAY'>, <Type.LIST: 'LIST'>}</span>
 </div>

@@ -61877,7 +61877,7 @@ Otherwise, this resets the expressions.</li>
 <div class="attr variable">
 <span class="name">NESTED_TYPES</span> =
 <input id="DataType.NESTED_TYPES-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="DataType.NESTED_TYPES-view-value"></label><span class="default_value">{<Type.OBJECT: 'OBJECT'>, <Type.NESTED: 'NESTED'>, <Type.LIST: 'LIST'>, <Type.ARRAY: 'ARRAY'>, <Type.MAP: 'MAP'>, <Type.STRUCT: 'STRUCT'>, <Type.UNION: 'UNION'>}</span>
+<label class="view-value-button pdoc-button" for="DataType.NESTED_TYPES-view-value"></label><span class="default_value">{<Type.STRUCT: 'STRUCT'>, <Type.ARRAY: 'ARRAY'>, <Type.OBJECT: 'OBJECT'>, <Type.LIST: 'LIST'>, <Type.UNION: 'UNION'>, <Type.NESTED: 'NESTED'>, <Type.MAP: 'MAP'>}</span>
 </div>

@@ -61890,7 +61890,7 @@ Otherwise, this resets the expressions.</li>
 <div class="attr variable">
 <span class="name">TEXT_TYPES</span> =
 <input id="DataType.TEXT_TYPES-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="DataType.TEXT_TYPES-view-value"></label><span class="default_value">{<Type.NVARCHAR: 'NVARCHAR'>, <Type.VARCHAR: 'VARCHAR'>, <Type.TEXT: 'TEXT'>, <Type.NCHAR: 'NCHAR'>, <Type.CHAR: 'CHAR'>, <Type.NAME: 'NAME'>}</span>
+<label class="view-value-button pdoc-button" for="DataType.TEXT_TYPES-view-value"></label><span class="default_value">{<Type.VARCHAR: 'VARCHAR'>, <Type.NVARCHAR: 'NVARCHAR'>, <Type.TEXT: 'TEXT'>, <Type.NCHAR: 'NCHAR'>, <Type.CHAR: 'CHAR'>, <Type.NAME: 'NAME'>}</span>
 </div>

@@ -61903,7 +61903,7 @@ Otherwise, this resets the expressions.</li>
 <div class="attr variable">
 <span class="name">SIGNED_INTEGER_TYPES</span> =
 <input id="DataType.SIGNED_INTEGER_TYPES-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="DataType.SIGNED_INTEGER_TYPES-view-value"></label><span class="default_value">{<Type.SMALLINT: 'SMALLINT'>, <Type.BIGINT: 'BIGINT'>, <Type.INT128: 'INT128'>, <Type.MEDIUMINT: 'MEDIUMINT'>, <Type.TINYINT: 'TINYINT'>, <Type.INT256: 'INT256'>, <Type.INT: 'INT'>}</span>
+<label class="view-value-button pdoc-button" for="DataType.SIGNED_INTEGER_TYPES-view-value"></label><span class="default_value">{<Type.BIGINT: 'BIGINT'>, <Type.TINYINT: 'TINYINT'>, <Type.SMALLINT: 'SMALLINT'>, <Type.INT128: 'INT128'>, <Type.MEDIUMINT: 'MEDIUMINT'>, <Type.INT256: 'INT256'>, <Type.INT: 'INT'>}</span>
 </div>

@@ -61916,7 +61916,7 @@ Otherwise, this resets the expressions.</li>
 <div class="attr variable">
 <span class="name">UNSIGNED_INTEGER_TYPES</span> =
 <input id="DataType.UNSIGNED_INTEGER_TYPES-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="DataType.UNSIGNED_INTEGER_TYPES-view-value"></label><span class="default_value">{<Type.UINT: 'UINT'>, <Type.USMALLINT: 'USMALLINT'>, <Type.UTINYINT: 'UTINYINT'>, <Type.UMEDIUMINT: 'UMEDIUMINT'>, <Type.UBIGINT: 'UBIGINT'>, <Type.UINT256: 'UINT256'>, <Type.UINT128: 'UINT128'>}</span>
+<label class="view-value-button pdoc-button" for="DataType.UNSIGNED_INTEGER_TYPES-view-value"></label><span class="default_value">{<Type.UINT256: 'UINT256'>, <Type.UMEDIUMINT: 'UMEDIUMINT'>, <Type.UINT: 'UINT'>, <Type.UINT128: 'UINT128'>, <Type.USMALLINT: 'USMALLINT'>, <Type.UTINYINT: 'UTINYINT'>, <Type.UBIGINT: 'UBIGINT'>}</span>
 </div>

@@ -61929,7 +61929,7 @@ Otherwise, this resets the expressions.</li>
 <div class="attr variable">
 <span class="name">INTEGER_TYPES</span> =
 <input id="DataType.INTEGER_TYPES-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="DataType.INTEGER_TYPES-view-value"></label><span class="default_value">{<Type.UINT: 'UINT'>, <Type.SMALLINT: 'SMALLINT'>, <Type.BIGINT: 'BIGINT'>, <Type.INT128: 'INT128'>, <Type.USMALLINT: 'USMALLINT'>, <Type.UTINYINT: 'UTINYINT'>, <Type.MEDIUMINT: 'MEDIUMINT'>, <Type.TINYINT: 'TINYINT'>, <Type.INT256: 'INT256'>, <Type.BIT: 'BIT'>, <Type.UMEDIUMINT: 'UMEDIUMINT'>, <Type.INT: 'INT'>, <Type.UBIGINT: 'UBIGINT'>, <Type.UINT256: 'UINT256'>, <Type.UINT128: 'UINT128'>}</span>
+<label class="view-value-button pdoc-button" for="DataType.INTEGER_TYPES-view-value"></label><span class="default_value">{<Type.BIGINT: 'BIGINT'>, <Type.TINYINT: 'TINYINT'>, <Type.SMALLINT: 'SMALLINT'>, <Type.INT128: 'INT128'>, <Type.UINT256: 'UINT256'>, <Type.UMEDIUMINT: 'UMEDIUMINT'>, <Type.MEDIUMINT: 'MEDIUMINT'>, <Type.UINT: 'UINT'>, <Type.UINT128: 'UINT128'>, <Type.INT256: 'INT256'>, <Type.BIT: 'BIT'>, <Type.INT: 'INT'>, <Type.USMALLINT: 'USMALLINT'>, <Type.UTINYINT: 'UTINYINT'>, <Type.UBIGINT: 'UBIGINT'>}</span>
 </div>

@@ -61954,7 +61954,7 @@ Otherwise, this resets the expressions.</li>
 <div class="attr variable">
 <span class="name">REAL_TYPES</span> =
 <input id="DataType.REAL_TYPES-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="DataType.REAL_TYPES-view-value"></label><span class="default_value">{<Type.FLOAT: 'FLOAT'>, <Type.DECIMAL64: 'DECIMAL64'>, <Type.DECIMAL32: 'DECIMAL32'>, <Type.DECIMAL128: 'DECIMAL128'>, <Type.UDECIMAL: 'UDECIMAL'>, <Type.DECIMAL256: 'DECIMAL256'>, <Type.BIGDECIMAL: 'BIGDECIMAL'>, <Type.UDOUBLE: 'UDOUBLE'>, <Type.SMALLMONEY: 'SMALLMONEY'>, <Type.DOUBLE: 'DOUBLE'>, <Type.MONEY: 'MONEY'>, <Type.DECIMAL: 'DECIMAL'>}</span>
|
<label class="view-value-button pdoc-button" for="DataType.REAL_TYPES-view-value"></label><span class="default_value">{<Type.FLOAT: 'FLOAT'>, <Type.BIGDECIMAL: 'BIGDECIMAL'>, <Type.DECIMAL32: 'DECIMAL32'>, <Type.MONEY: 'MONEY'>, <Type.DECIMAL: 'DECIMAL'>, <Type.DECIMAL128: 'DECIMAL128'>, <Type.UDOUBLE: 'UDOUBLE'>, <Type.DECIMAL64: 'DECIMAL64'>, <Type.UDECIMAL: 'UDECIMAL'>, <Type.DOUBLE: 'DOUBLE'>, <Type.SMALLMONEY: 'SMALLMONEY'>, <Type.DECIMAL256: 'DECIMAL256'>}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
@ -61967,7 +61967,7 @@ Otherwise, this resets the expressions.</li>
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">NUMERIC_TYPES</span> =
|
<span class="name">NUMERIC_TYPES</span> =
|
||||||
<input id="DataType.NUMERIC_TYPES-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
<input id="DataType.NUMERIC_TYPES-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
||||||
<label class="view-value-button pdoc-button" for="DataType.NUMERIC_TYPES-view-value"></label><span class="default_value">{<Type.FLOAT: 'FLOAT'>, <Type.DECIMAL64: 'DECIMAL64'>, <Type.INT128: 'INT128'>, <Type.DECIMAL256: 'DECIMAL256'>, <Type.INT256: 'INT256'>, <Type.BIGDECIMAL: 'BIGDECIMAL'>, <Type.DECIMAL128: 'DECIMAL128'>, <Type.DOUBLE: 'DOUBLE'>, <Type.UINT: 'UINT'>, <Type.SMALLINT: 'SMALLINT'>, <Type.DECIMAL32: 'DECIMAL32'>, <Type.BIGINT: 'BIGINT'>, <Type.SMALLMONEY: 'SMALLMONEY'>, <Type.USMALLINT: 'USMALLINT'>, <Type.UTINYINT: 'UTINYINT'>, <Type.UDECIMAL: 'UDECIMAL'>, <Type.MEDIUMINT: 'MEDIUMINT'>, <Type.TINYINT: 'TINYINT'>, <Type.BIT: 'BIT'>, <Type.UDOUBLE: 'UDOUBLE'>, <Type.UMEDIUMINT: 'UMEDIUMINT'>, <Type.INT: 'INT'>, <Type.UBIGINT: 'UBIGINT'>, <Type.UINT256: 'UINT256'>, <Type.MONEY: 'MONEY'>, <Type.DECIMAL: 'DECIMAL'>, <Type.UINT128: 'UINT128'>}</span>
|
<label class="view-value-button pdoc-button" for="DataType.NUMERIC_TYPES-view-value"></label><span class="default_value">{<Type.UMEDIUMINT: 'UMEDIUMINT'>, <Type.SMALLINT: 'SMALLINT'>, <Type.DECIMAL: 'DECIMAL'>, <Type.DECIMAL128: 'DECIMAL128'>, <Type.UINT: 'UINT'>, <Type.UDOUBLE: 'UDOUBLE'>, <Type.UDECIMAL: 'UDECIMAL'>, <Type.TINYINT: 'TINYINT'>, <Type.USMALLINT: 'USMALLINT'>, <Type.UTINYINT: 'UTINYINT'>, <Type.DECIMAL64: 'DECIMAL64'>, <Type.FLOAT: 'FLOAT'>, <Type.BIGINT: 'BIGINT'>, <Type.UINT256: 'UINT256'>, <Type.BIGDECIMAL: 'BIGDECIMAL'>, <Type.INT128: 'INT128'>, <Type.DECIMAL32: 'DECIMAL32'>, <Type.MONEY: 'MONEY'>, <Type.MEDIUMINT: 'MEDIUMINT'>, <Type.UINT128: 'UINT128'>, <Type.INT256: 'INT256'>, <Type.BIT: 'BIT'>, <Type.INT: 'INT'>, <Type.DOUBLE: 'DOUBLE'>, <Type.SMALLMONEY: 'SMALLMONEY'>, <Type.DECIMAL256: 'DECIMAL256'>, <Type.UBIGINT: 'UBIGINT'>}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
@ -61980,7 +61980,7 @@ Otherwise, this resets the expressions.</li>
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">TEMPORAL_TYPES</span> =
|
<span class="name">TEMPORAL_TYPES</span> =
|
||||||
<input id="DataType.TEMPORAL_TYPES-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
<input id="DataType.TEMPORAL_TYPES-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
||||||
<label class="view-value-button pdoc-button" for="DataType.TEMPORAL_TYPES-view-value"></label><span class="default_value">{<Type.SMALLDATETIME: 'SMALLDATETIME'>, <Type.TIMESTAMPLTZ: 'TIMESTAMPLTZ'>, <Type.DATETIME2: 'DATETIME2'>, <Type.TIMETZ: 'TIMETZ'>, <Type.TIMESTAMPNTZ: 'TIMESTAMPNTZ'>, <Type.DATETIME: 'DATETIME'>, <Type.TIMESTAMP_NS: 'TIMESTAMP_NS'>, <Type.TIME: 'TIME'>, <Type.TIMESTAMP_S: 'TIMESTAMP_S'>, <Type.DATE32: 'DATE32'>, <Type.DATETIME64: 'DATETIME64'>, <Type.TIMESTAMPTZ: 'TIMESTAMPTZ'>, <Type.TIMESTAMP_MS: 'TIMESTAMP_MS'>, <Type.DATE: 'DATE'>, <Type.TIMESTAMP: 'TIMESTAMP'>}</span>
|
<label class="view-value-button pdoc-button" for="DataType.TEMPORAL_TYPES-view-value"></label><span class="default_value">{<Type.DATE32: 'DATE32'>, <Type.DATETIME64: 'DATETIME64'>, <Type.TIMETZ: 'TIMETZ'>, <Type.TIMESTAMPNTZ: 'TIMESTAMPNTZ'>, <Type.TIMESTAMPLTZ: 'TIMESTAMPLTZ'>, <Type.TIMESTAMP_NS: 'TIMESTAMP_NS'>, <Type.DATE: 'DATE'>, <Type.TIME: 'TIME'>, <Type.TIMESTAMPTZ: 'TIMESTAMPTZ'>, <Type.TIMESTAMP: 'TIMESTAMP'>, <Type.TIMESTAMP_MS: 'TIMESTAMP_MS'>, <Type.DATETIME2: 'DATETIME2'>, <Type.TIMESTAMP_S: 'TIMESTAMP_S'>, <Type.SMALLDATETIME: 'SMALLDATETIME'>, <Type.DATETIME: 'DATETIME'>}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
|
|
@ -12157,7 +12157,7 @@ Default: True</li>
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">SUPPORTED_JSON_PATH_PARTS</span> =
|
<span class="name">SUPPORTED_JSON_PATH_PARTS</span> =
|
||||||
<input id="Generator.SUPPORTED_JSON_PATH_PARTS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
<input id="Generator.SUPPORTED_JSON_PATH_PARTS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
||||||
<label class="view-value-button pdoc-button" for="Generator.SUPPORTED_JSON_PATH_PARTS-view-value"></label><span class="default_value">{<class '<a href="expressions.html#JSONPathSubscript">sqlglot.expressions.JSONPathSubscript</a>'>, <class '<a href="expressions.html#JSONPathSelector">sqlglot.expressions.JSONPathSelector</a>'>, <class '<a href="expressions.html#JSONPathSlice">sqlglot.expressions.JSONPathSlice</a>'>, <class '<a href="expressions.html#JSONPathScript">sqlglot.expressions.JSONPathScript</a>'>, <class '<a href="expressions.html#JSONPathUnion">sqlglot.expressions.JSONPathUnion</a>'>, <class '<a href="expressions.html#JSONPathRoot">sqlglot.expressions.JSONPathRoot</a>'>, <class '<a href="expressions.html#JSONPathRecursive">sqlglot.expressions.JSONPathRecursive</a>'>, <class '<a href="expressions.html#JSONPathKey">sqlglot.expressions.JSONPathKey</a>'>, <class '<a href="expressions.html#JSONPathWildcard">sqlglot.expressions.JSONPathWildcard</a>'>, <class '<a href="expressions.html#JSONPathFilter">sqlglot.expressions.JSONPathFilter</a>'>}</span>
|
<label class="view-value-button pdoc-button" for="Generator.SUPPORTED_JSON_PATH_PARTS-view-value"></label><span class="default_value">{<class '<a href="expressions.html#JSONPathFilter">sqlglot.expressions.JSONPathFilter</a>'>, <class '<a href="expressions.html#JSONPathWildcard">sqlglot.expressions.JSONPathWildcard</a>'>, <class '<a href="expressions.html#JSONPathSlice">sqlglot.expressions.JSONPathSlice</a>'>, <class '<a href="expressions.html#JSONPathUnion">sqlglot.expressions.JSONPathUnion</a>'>, <class '<a href="expressions.html#JSONPathScript">sqlglot.expressions.JSONPathScript</a>'>, <class '<a href="expressions.html#JSONPathSubscript">sqlglot.expressions.JSONPathSubscript</a>'>, <class '<a href="expressions.html#JSONPathRoot">sqlglot.expressions.JSONPathRoot</a>'>, <class '<a href="expressions.html#JSONPathSelector">sqlglot.expressions.JSONPathSelector</a>'>, <class '<a href="expressions.html#JSONPathRecursive">sqlglot.expressions.JSONPathRecursive</a>'>, <class '<a href="expressions.html#JSONPathKey">sqlglot.expressions.JSONPathKey</a>'>}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
@ -12595,7 +12595,7 @@ Default: True</li>
|
||||||
<div id="Generator.PARAMETERIZABLE_TEXT_TYPES" class="classattr">
|
<div id="Generator.PARAMETERIZABLE_TEXT_TYPES" class="classattr">
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">PARAMETERIZABLE_TEXT_TYPES</span> =
|
<span class="name">PARAMETERIZABLE_TEXT_TYPES</span> =
|
||||||
<span class="default_value">{<Type.NCHAR: 'NCHAR'>, <Type.NVARCHAR: 'NVARCHAR'>, <Type.VARCHAR: 'VARCHAR'>, <Type.CHAR: 'CHAR'>}</span>
|
<span class="default_value">{<Type.CHAR: 'CHAR'>, <Type.VARCHAR: 'VARCHAR'>, <Type.NCHAR: 'NCHAR'>, <Type.NVARCHAR: 'NVARCHAR'>}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
|
|
@ -1920,7 +1920,7 @@ belong to some totally-ordered set.</p>
|
||||||
<section id="DATE_UNITS">
|
<section id="DATE_UNITS">
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">DATE_UNITS</span> =
|
<span class="name">DATE_UNITS</span> =
|
||||||
<span class="default_value">{'year', 'week', 'year_month', 'month', 'quarter', 'day'}</span>
|
<span class="default_value">{'month', 'quarter', 'day', 'week', 'year_month', 'year'}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
|
|
@ -641,7 +641,7 @@
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">ALL_JSON_PATH_PARTS</span> =
|
<span class="name">ALL_JSON_PATH_PARTS</span> =
|
||||||
<input id="ALL_JSON_PATH_PARTS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
<input id="ALL_JSON_PATH_PARTS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
||||||
<label class="view-value-button pdoc-button" for="ALL_JSON_PATH_PARTS-view-value"></label><span class="default_value">{<class '<a href="expressions.html#JSONPathSubscript">sqlglot.expressions.JSONPathSubscript</a>'>, <class '<a href="expressions.html#JSONPathSelector">sqlglot.expressions.JSONPathSelector</a>'>, <class '<a href="expressions.html#JSONPathSlice">sqlglot.expressions.JSONPathSlice</a>'>, <class '<a href="expressions.html#JSONPathScript">sqlglot.expressions.JSONPathScript</a>'>, <class '<a href="expressions.html#JSONPathUnion">sqlglot.expressions.JSONPathUnion</a>'>, <class '<a href="expressions.html#JSONPathRoot">sqlglot.expressions.JSONPathRoot</a>'>, <class '<a href="expressions.html#JSONPathRecursive">sqlglot.expressions.JSONPathRecursive</a>'>, <class '<a href="expressions.html#JSONPathKey">sqlglot.expressions.JSONPathKey</a>'>, <class '<a href="expressions.html#JSONPathWildcard">sqlglot.expressions.JSONPathWildcard</a>'>, <class '<a href="expressions.html#JSONPathFilter">sqlglot.expressions.JSONPathFilter</a>'>}</span>
|
<label class="view-value-button pdoc-button" for="ALL_JSON_PATH_PARTS-view-value"></label><span class="default_value">{<class '<a href="expressions.html#JSONPathFilter">sqlglot.expressions.JSONPathFilter</a>'>, <class '<a href="expressions.html#JSONPathWildcard">sqlglot.expressions.JSONPathWildcard</a>'>, <class '<a href="expressions.html#JSONPathSlice">sqlglot.expressions.JSONPathSlice</a>'>, <class '<a href="expressions.html#JSONPathUnion">sqlglot.expressions.JSONPathUnion</a>'>, <class '<a href="expressions.html#JSONPathScript">sqlglot.expressions.JSONPathScript</a>'>, <class '<a href="expressions.html#JSONPathSubscript">sqlglot.expressions.JSONPathSubscript</a>'>, <class '<a href="expressions.html#JSONPathRoot">sqlglot.expressions.JSONPathRoot</a>'>, <class '<a href="expressions.html#JSONPathSelector">sqlglot.expressions.JSONPathSelector</a>'>, <class '<a href="expressions.html#JSONPathRecursive">sqlglot.expressions.JSONPathRecursive</a>'>, <class '<a href="expressions.html#JSONPathKey">sqlglot.expressions.JSONPathKey</a>'>}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
|
|
@ -355,7 +355,7 @@
|
||||||
</span><span id="L-237"><a href="#L-237"><span class="linenos">237</span></a> <span class="c1"># Find all columns that went into creating this one to list their lineage nodes.</span>
|
</span><span id="L-237"><a href="#L-237"><span class="linenos">237</span></a> <span class="c1"># Find all columns that went into creating this one to list their lineage nodes.</span>
|
||||||
</span><span id="L-238"><a href="#L-238"><span class="linenos">238</span></a> <span class="n">source_columns</span> <span class="o">=</span> <span class="nb">set</span><span class="p">(</span><span class="n">find_all_in_scope</span><span class="p">(</span><span class="n">select</span><span class="p">,</span> <span class="n">exp</span><span class="o">.</span><span class="n">Column</span><span class="p">))</span>
|
</span><span id="L-238"><a href="#L-238"><span class="linenos">238</span></a> <span class="n">source_columns</span> <span class="o">=</span> <span class="nb">set</span><span class="p">(</span><span class="n">find_all_in_scope</span><span class="p">(</span><span class="n">select</span><span class="p">,</span> <span class="n">exp</span><span class="o">.</span><span class="n">Column</span><span class="p">))</span>
|
||||||
</span><span id="L-239"><a href="#L-239"><span class="linenos">239</span></a>
|
</span><span id="L-239"><a href="#L-239"><span class="linenos">239</span></a>
|
||||||
</span><span id="L-240"><a href="#L-240"><span class="linenos">240</span></a> <span class="c1"># If the source is a UDTF find columns used in the UTDF to generate the table</span>
|
</span><span id="L-240"><a href="#L-240"><span class="linenos">240</span></a> <span class="c1"># If the source is a UDTF find columns used in the UDTF to generate the table</span>
|
||||||
</span><span id="L-241"><a href="#L-241"><span class="linenos">241</span></a> <span class="k">if</span> <span class="nb">isinstance</span><span class="p">(</span><span class="n">source</span><span class="p">,</span> <span class="n">exp</span><span class="o">.</span><span class="n">UDTF</span><span class="p">):</span>
|
</span><span id="L-241"><a href="#L-241"><span class="linenos">241</span></a> <span class="k">if</span> <span class="nb">isinstance</span><span class="p">(</span><span class="n">source</span><span class="p">,</span> <span class="n">exp</span><span class="o">.</span><span class="n">UDTF</span><span class="p">):</span>
|
||||||
</span><span id="L-242"><a href="#L-242"><span class="linenos">242</span></a> <span class="n">source_columns</span> <span class="o">|=</span> <span class="nb">set</span><span class="p">(</span><span class="n">source</span><span class="o">.</span><span class="n">find_all</span><span class="p">(</span><span class="n">exp</span><span class="o">.</span><span class="n">Column</span><span class="p">))</span>
|
</span><span id="L-242"><a href="#L-242"><span class="linenos">242</span></a> <span class="n">source_columns</span> <span class="o">|=</span> <span class="nb">set</span><span class="p">(</span><span class="n">source</span><span class="o">.</span><span class="n">find_all</span><span class="p">(</span><span class="n">exp</span><span class="o">.</span><span class="n">Column</span><span class="p">))</span>
|
||||||
</span><span id="L-243"><a href="#L-243"><span class="linenos">243</span></a> <span class="n">derived_tables</span> <span class="o">=</span> <span class="p">[</span>
|
</span><span id="L-243"><a href="#L-243"><span class="linenos">243</span></a> <span class="n">derived_tables</span> <span class="o">=</span> <span class="p">[</span>
|
||||||
|
@ -990,7 +990,7 @@
|
||||||
</span><span id="to_node-238"><a href="#to_node-238"><span class="linenos">238</span></a> <span class="c1"># Find all columns that went into creating this one to list their lineage nodes.</span>
|
</span><span id="to_node-238"><a href="#to_node-238"><span class="linenos">238</span></a> <span class="c1"># Find all columns that went into creating this one to list their lineage nodes.</span>
|
||||||
</span><span id="to_node-239"><a href="#to_node-239"><span class="linenos">239</span></a> <span class="n">source_columns</span> <span class="o">=</span> <span class="nb">set</span><span class="p">(</span><span class="n">find_all_in_scope</span><span class="p">(</span><span class="n">select</span><span class="p">,</span> <span class="n">exp</span><span class="o">.</span><span class="n">Column</span><span class="p">))</span>
|
</span><span id="to_node-239"><a href="#to_node-239"><span class="linenos">239</span></a> <span class="n">source_columns</span> <span class="o">=</span> <span class="nb">set</span><span class="p">(</span><span class="n">find_all_in_scope</span><span class="p">(</span><span class="n">select</span><span class="p">,</span> <span class="n">exp</span><span class="o">.</span><span class="n">Column</span><span class="p">))</span>
|
||||||
</span><span id="to_node-240"><a href="#to_node-240"><span class="linenos">240</span></a>
|
</span><span id="to_node-240"><a href="#to_node-240"><span class="linenos">240</span></a>
|
||||||
</span><span id="to_node-241"><a href="#to_node-241"><span class="linenos">241</span></a> <span class="c1"># If the source is a UDTF find columns used in the UTDF to generate the table</span>
|
</span><span id="to_node-241"><a href="#to_node-241"><span class="linenos">241</span></a> <span class="c1"># If the source is a UDTF find columns used in the UDTF to generate the table</span>
|
||||||
</span><span id="to_node-242"><a href="#to_node-242"><span class="linenos">242</span></a> <span class="k">if</span> <span class="nb">isinstance</span><span class="p">(</span><span class="n">source</span><span class="p">,</span> <span class="n">exp</span><span class="o">.</span><span class="n">UDTF</span><span class="p">):</span>
|
</span><span id="to_node-242"><a href="#to_node-242"><span class="linenos">242</span></a> <span class="k">if</span> <span class="nb">isinstance</span><span class="p">(</span><span class="n">source</span><span class="p">,</span> <span class="n">exp</span><span class="o">.</span><span class="n">UDTF</span><span class="p">):</span>
|
||||||
</span><span id="to_node-243"><a href="#to_node-243"><span class="linenos">243</span></a> <span class="n">source_columns</span> <span class="o">|=</span> <span class="nb">set</span><span class="p">(</span><span class="n">source</span><span class="o">.</span><span class="n">find_all</span><span class="p">(</span><span class="n">exp</span><span class="o">.</span><span class="n">Column</span><span class="p">))</span>
|
</span><span id="to_node-243"><a href="#to_node-243"><span class="linenos">243</span></a> <span class="n">source_columns</span> <span class="o">|=</span> <span class="nb">set</span><span class="p">(</span><span class="n">source</span><span class="o">.</span><span class="n">find_all</span><span class="p">(</span><span class="n">exp</span><span class="o">.</span><span class="n">Column</span><span class="p">))</span>
|
||||||
</span><span id="to_node-244"><a href="#to_node-244"><span class="linenos">244</span></a> <span class="n">derived_tables</span> <span class="o">=</span> <span class="p">[</span>
|
</span><span id="to_node-244"><a href="#to_node-244"><span class="linenos">244</span></a> <span class="n">derived_tables</span> <span class="o">=</span> <span class="p">[</span>
|
||||||
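The lineage step rendered above collects every column that fed a projection, and widens the set when the source is a UDTF. A toy illustration of that collection logic (simplified stand-ins, not sqlglot's actual `Scope`/expression machinery):

```python
# Gather all columns referenced in a select tree; when the source is a
# UDTF, also include the columns used to generate the table.
def find_columns(node: dict) -> set:
    cols = set(node.get("columns", ()))
    for child in node.get("children", ()):
        cols |= find_columns(child)  # recurse into nested scopes
    return cols

select = {"columns": {"a", "b"}}
source = {"kind": "UDTF", "columns": {"c"}}

source_columns = find_columns(select)
if source["kind"] == "UDTF":
    source_columns |= find_columns(source)

print(sorted(source_columns))  # → ['a', 'b', 'c']
```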
|
|
File diff suppressed because one or more lines are too long
|
@ -581,7 +581,7 @@ queries if it would result in multiple table selects in a single query:</p>
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">UNMERGABLE_ARGS</span> =
|
<span class="name">UNMERGABLE_ARGS</span> =
|
||||||
<input id="UNMERGABLE_ARGS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
<input id="UNMERGABLE_ARGS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
||||||
<label class="view-value-button pdoc-button" for="UNMERGABLE_ARGS-view-value"></label><span class="default_value">{'offset', 'sample', 'kind', 'qualify', 'cluster', 'options', 'locks', 'operation_modifiers', 'match', 'laterals', 'connect', 'distinct', 'group', 'distribute', 'into', 'prewhere', 'with', 'settings', 'format', 'pivots', 'sort', 'limit', 'windows', 'having'}</span>
|
<label class="view-value-button pdoc-button" for="UNMERGABLE_ARGS-view-value"></label><span class="default_value">{'offset', 'cluster', 'distinct', 'qualify', 'connect', 'into', 'distribute', 'windows', 'having', 'prewhere', 'limit', 'format', 'with', 'sort', 'settings', 'kind', 'locks', 'sample', 'match', 'options', 'group', 'laterals', 'pivots', 'operation_modifiers'}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
|
|
@ -3250,7 +3250,7 @@ prefix are statically known.</p>
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">DATETRUNC_COMPARISONS</span> =
|
<span class="name">DATETRUNC_COMPARISONS</span> =
|
||||||
<input id="DATETRUNC_COMPARISONS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
<input id="DATETRUNC_COMPARISONS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
||||||
<label class="view-value-button pdoc-button" for="DATETRUNC_COMPARISONS-view-value"></label><span class="default_value">{<class '<a href="../expressions.html#LTE">sqlglot.expressions.LTE</a>'>, <class '<a href="../expressions.html#GT">sqlglot.expressions.GT</a>'>, <class '<a href="../expressions.html#In">sqlglot.expressions.In</a>'>, <class '<a href="../expressions.html#LT">sqlglot.expressions.LT</a>'>, <class '<a href="../expressions.html#EQ">sqlglot.expressions.EQ</a>'>, <class '<a href="../expressions.html#NEQ">sqlglot.expressions.NEQ</a>'>, <class '<a href="../expressions.html#GTE">sqlglot.expressions.GTE</a>'>}</span>
|
<label class="view-value-button pdoc-button" for="DATETRUNC_COMPARISONS-view-value"></label><span class="default_value">{<class '<a href="../expressions.html#GTE">sqlglot.expressions.GTE</a>'>, <class '<a href="../expressions.html#LTE">sqlglot.expressions.LTE</a>'>, <class '<a href="../expressions.html#GT">sqlglot.expressions.GT</a>'>, <class '<a href="../expressions.html#In">sqlglot.expressions.In</a>'>, <class '<a href="../expressions.html#LT">sqlglot.expressions.LT</a>'>, <class '<a href="../expressions.html#NEQ">sqlglot.expressions.NEQ</a>'>, <class '<a href="../expressions.html#EQ">sqlglot.expressions.EQ</a>'>}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
@ -3334,7 +3334,7 @@ prefix are statically known.</p>
|
||||||
<section id="JOINS">
|
<section id="JOINS">
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">JOINS</span> =
|
<span class="name">JOINS</span> =
|
||||||
<span class="default_value">{('', ''), ('RIGHT', ''), ('', 'INNER'), ('RIGHT', 'OUTER')}</span>
|
<span class="default_value">{('', 'INNER'), ('RIGHT', 'OUTER'), ('RIGHT', ''), ('', '')}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
|
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
|
@ -9400,7 +9400,7 @@
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">TOKENS_PRECEDING_HINT</span> =
|
<span class="name">TOKENS_PRECEDING_HINT</span> =
|
||||||
<input id="Tokenizer.TOKENS_PRECEDING_HINT-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
<input id="Tokenizer.TOKENS_PRECEDING_HINT-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
||||||
<label class="view-value-button pdoc-button" for="Tokenizer.TOKENS_PRECEDING_HINT-view-value"></label><span class="default_value">{<<a href="#TokenType.DELETE">TokenType.DELETE</a>: 'DELETE'>, <<a href="#TokenType.INSERT">TokenType.INSERT</a>: 'INSERT'>, <<a href="#TokenType.UPDATE">TokenType.UPDATE</a>: 'UPDATE'>, <<a href="#TokenType.SELECT">TokenType.SELECT</a>: 'SELECT'>}</span>
|
<label class="view-value-button pdoc-button" for="Tokenizer.TOKENS_PRECEDING_HINT-view-value"></label><span class="default_value">{<<a href="#TokenType.INSERT">TokenType.INSERT</a>: 'INSERT'>, <<a href="#TokenType.SELECT">TokenType.SELECT</a>: 'SELECT'>, <<a href="#TokenType.DELETE">TokenType.DELETE</a>: 'DELETE'>, <<a href="#TokenType.UPDATE">TokenType.UPDATE</a>: 'UPDATE'>}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
@ -9439,7 +9439,7 @@
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">COMMANDS</span> =
|
<span class="name">COMMANDS</span> =
|
||||||
<input id="Tokenizer.COMMANDS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
<input id="Tokenizer.COMMANDS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
|
||||||
<label class="view-value-button pdoc-button" for="Tokenizer.COMMANDS-view-value"></label><span class="default_value">{<<a href="#TokenType.COMMAND">TokenType.COMMAND</a>: 'COMMAND'>, <<a href="#TokenType.EXECUTE">TokenType.EXECUTE</a>: 'EXECUTE'>, <<a href="#TokenType.RENAME">TokenType.RENAME</a>: 'RENAME'>, <<a href="#TokenType.FETCH">TokenType.FETCH</a>: 'FETCH'>, <<a href="#TokenType.SHOW">TokenType.SHOW</a>: 'SHOW'>}</span>
|
<label class="view-value-button pdoc-button" for="Tokenizer.COMMANDS-view-value"></label><span class="default_value">{<<a href="#TokenType.COMMAND">TokenType.COMMAND</a>: 'COMMAND'>, <<a href="#TokenType.RENAME">TokenType.RENAME</a>: 'RENAME'>, <<a href="#TokenType.EXECUTE">TokenType.EXECUTE</a>: 'EXECUTE'>, <<a href="#TokenType.SHOW">TokenType.SHOW</a>: 'SHOW'>, <<a href="#TokenType.FETCH">TokenType.FETCH</a>: 'FETCH'>}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
@ -9451,7 +9451,7 @@
|
||||||
<div id="Tokenizer.COMMAND_PREFIX_TOKENS" class="classattr">
|
<div id="Tokenizer.COMMAND_PREFIX_TOKENS" class="classattr">
|
||||||
<div class="attr variable">
|
<div class="attr variable">
|
||||||
<span class="name">COMMAND_PREFIX_TOKENS</span> =
|
<span class="name">COMMAND_PREFIX_TOKENS</span> =
|
||||||
<span class="default_value">{<<a href="#TokenType.SEMICOLON">TokenType.SEMICOLON</a>: 'SEMICOLON'>, <<a href="#TokenType.BEGIN">TokenType.BEGIN</a>: 'BEGIN'>}</span>
|
<span class="default_value">{<<a href="#TokenType.BEGIN">TokenType.BEGIN</a>: 'BEGIN'>, <<a href="#TokenType.SEMICOLON">TokenType.SEMICOLON</a>: 'SEMICOLON'>}</span>
|
||||||
|
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
|
|
@ -164,9 +164,6 @@ This allows SQLGlot to return the code unmodified even though it cannot parse it
|
||||||
### Dialect-specific parsing
|
### Dialect-specific parsing
|
||||||
The base parser’s goal is to represent as many common constructs from different SQL dialects as possible. This makes the parser more lenient, and it keeps the dialect code less repetitive and more concise.
|
The base parser’s goal is to represent as many common constructs from different SQL dialects as possible. This makes the parser more lenient, and it keeps the dialect code less repetitive and more concise.
|
||||||
|
|
||||||
> [!WARNING]
|
|
||||||
> SQLGlot does not aim to be a SQL validator, so it may not detect certain syntax errors.
|
|
||||||
|
|
||||||
Dialect-specific parser behavior is implemented in two ways: feature flags and parser overrides.
|
Dialect-specific parser behavior is implemented in two ways: feature flags and parser overrides.
|
||||||
|
|
||||||
If two different parsing behaviors are common across dialects, the base parser may implement both and use feature flags to determine which should be used for a specific dialect. In contrast, parser overrides directly replace specific base parser methods.
|
If two different parsing behaviors are common across dialects, the base parser may implement both and use feature flags to determine which should be used for a specific dialect. In contrast, parser overrides directly replace specific base parser methods.
|
||||||
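The flag-versus-override distinction described above can be sketched with a toy parser. The flag name is loosely modeled on sqlglot's `LOG_DEFAULTS_TO_LN`, but everything else here is illustrative, not sqlglot's actual code:

```python
class BaseParser:
    # Feature flag: subclasses flip this to choose between two behaviors
    # that the base parser implements for everyone.
    LOG_DEFAULTS_TO_LN = False

    def parse_log(self, args):
        if len(args) == 1 and self.LOG_DEFAULTS_TO_LN:
            return ("LN", args[0])
        return ("LOG", *args)


class DialectAParser(BaseParser):
    LOG_DEFAULTS_TO_LN = True  # feature flag: single-arg LOG means LN


class DialectBParser(BaseParser):
    def parse_log(self, args):  # parser override: replace the base method
        return ("LOG10", args[0]) if len(args) == 1 else ("LOG", *args)


print(DialectAParser().parse_log(["x"]))  # → ('LN', 'x')
print(DialectBParser().parse_log(["x"]))  # → ('LOG10', 'x')
```

Flags keep shared logic in one place; overrides are the escape hatch when a dialect's behavior diverges too far to express as a boolean.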
|
|
|
@ -8,6 +8,7 @@ from sqlglot import exp, generator, parser, tokens, transforms
|
||||||
from sqlglot.dialects.dialect import (
|
from sqlglot.dialects.dialect import (
|
||||||
Dialect,
|
Dialect,
|
||||||
NormalizationStrategy,
|
NormalizationStrategy,
|
||||||
|
annotate_with_type_lambda,
|
||||||
arg_max_or_min_no_count,
|
arg_max_or_min_no_count,
|
||||||
binary_from_function,
|
binary_from_function,
|
||||||
date_add_interval_sql,
|
date_add_interval_sql,
|
||||||
|
@ -313,7 +314,11 @@ def _build_levenshtein(args: t.List) -> exp.Levenshtein:
|
||||||
|
|
||||||
def _build_format_time(expr_type: t.Type[exp.Expression]) -> t.Callable[[t.List], exp.TimeToStr]:
|
def _build_format_time(expr_type: t.Type[exp.Expression]) -> t.Callable[[t.List], exp.TimeToStr]:
|
||||||
def _builder(args: t.List) -> exp.TimeToStr:
|
def _builder(args: t.List) -> exp.TimeToStr:
|
||||||
return exp.TimeToStr(this=expr_type(this=seq_get(args, 1)), format=seq_get(args, 0))
|
return exp.TimeToStr(
|
||||||
|
this=expr_type(this=seq_get(args, 1)),
|
||||||
|
format=seq_get(args, 0),
|
||||||
|
zone=seq_get(args, 2),
|
||||||
|
)
|
||||||
|
|
||||||
return _builder
|
return _builder
|
||||||
|
|
||||||
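The builder change in the hunk above threads an optional third argument (`zone`) through to the expression. Because sqlglot's `seq_get` returns `None` for out-of-range positions, the zone simply comes through as `None` when absent. A minimal sketch with dicts standing in for expression nodes:

```python
from typing import List, Optional


def seq_get(seq: List, index: int) -> Optional[str]:
    # Like sqlglot's helper: element at index, or None if missing.
    return seq[index] if 0 <= index < len(seq) else None


def build_time_to_str(args: List) -> dict:
    return {
        "this": seq_get(args, 1),
        "format": seq_get(args, 0),
        "zone": seq_get(args, 2),  # new: optional timezone argument
    }


print(build_time_to_str(["%Y", "col"]))
# → {'this': 'col', 'format': '%Y', 'zone': None}
print(build_time_to_str(["%Y", "col", "UTC"])["zone"])  # → UTC
```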
|
@@ -394,8 +399,20 @@ class BigQuery(Dialect):
     # All set operations require either a DISTINCT or ALL specifier
     SET_OP_DISTINCT_BY_DEFAULT = dict.fromkeys((exp.Except, exp.Intersect, exp.Union), None)

+    # BigQuery maps Type.TIMESTAMP to DATETIME, so we need to amend the inferred types
+    TYPE_TO_EXPRESSIONS = {
+        **Dialect.TYPE_TO_EXPRESSIONS,
+        exp.DataType.Type.TIMESTAMPTZ: Dialect.TYPE_TO_EXPRESSIONS[exp.DataType.Type.TIMESTAMP],
+    }
+    TYPE_TO_EXPRESSIONS.pop(exp.DataType.Type.TIMESTAMP)
+
     ANNOTATORS = {
         **Dialect.ANNOTATORS,
+        **{
+            expr_type: annotate_with_type_lambda(data_type)
+            for data_type, expressions in TYPE_TO_EXPRESSIONS.items()
+            for expr_type in expressions
+        },
         **{
             expr_type: lambda self, e: _annotate_math_functions(self, e)
             for expr_type in (exp.Floor, exp.Ceil, exp.Log, exp.Ln, exp.Sqrt, exp.Exp, exp.Round)
@@ -937,7 +954,7 @@ class BigQuery(Dialect):
            exp.Rollback: lambda *_: "ROLLBACK TRANSACTION",
            exp.Select: transforms.preprocess(
                [
-                    transforms.explode_to_unnest(),
+                    transforms.explode_projection_to_unnest(),
                    transforms.unqualify_unnest,
                    transforms.eliminate_distinct_on,
                    _alias_ordered_group,
@@ -1174,7 +1191,9 @@ class BigQuery(Dialect):
                if isinstance(this, (exp.TsOrDsToDatetime, exp.TsOrDsToTimestamp, exp.TsOrDsToDate))
                else expression
            )
-            return self.func(func_name, self.format_time(expression), time_expr.this)
+            return self.func(
+                func_name, self.format_time(expression), time_expr.this, expression.args.get("zone")
+            )

        def eq_sql(self, expression: exp.EQ) -> str:
            # Operands of = cannot be NULL in BigQuery
@@ -61,6 +61,7 @@ class Databricks(Spark):
        COPY_PARAMS_EQ_REQUIRED = True
        JSON_PATH_SINGLE_QUOTE_ESCAPE = False
        QUOTE_JSON_PATH = False
+        PARSE_JSON_NAME = "PARSE_JSON"

        TRANSFORMS = {
            **Spark.Generator.TRANSFORMS,
@@ -51,7 +51,7 @@ UNESCAPED_SEQUENCES = {
 }


-def _annotate_with_type_lambda(data_type: exp.DataType.Type) -> t.Callable[[TypeAnnotator, E], E]:
+def annotate_with_type_lambda(data_type: exp.DataType.Type) -> t.Callable[[TypeAnnotator, E], E]:
     return lambda self, e: self._annotate_with_type(e, data_type)
@@ -683,15 +683,15 @@ class Dialect(metaclass=_Dialect):
            exp.ParseJSON,
        },
        exp.DataType.Type.TIME: {
+            exp.CurrentTime,
            exp.Time,
+            exp.TimeAdd,
+            exp.TimeSub,
        },
        exp.DataType.Type.TIMESTAMP: {
-            exp.CurrentTime,
            exp.CurrentTimestamp,
            exp.StrToTime,
-            exp.TimeAdd,
            exp.TimeStrToTime,
-            exp.TimeSub,
            exp.TimestampAdd,
            exp.TimestampSub,
            exp.UnixToTime,
@@ -733,7 +733,7 @@ class Dialect(metaclass=_Dialect):
            for expr_type in subclasses(exp.__name__, exp.Binary)
        },
        **{
-            expr_type: _annotate_with_type_lambda(data_type)
+            expr_type: annotate_with_type_lambda(data_type)
            for data_type, expressions in TYPE_TO_EXPRESSIONS.items()
            for expr_type in expressions
        },
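The renamed `annotate_with_type_lambda` is a closure factory called once per dict entry. One practical reason to prefer a factory over a bare lambda inside the comprehension is Python's late binding: a lambda that references the loop variable directly would see the loop's final value. A small self-contained demonstration (names are illustrative, not sqlglot's):

```python
def make_annotator(data_type):
    # Factory: each returned callable closes over its own data_type binding.
    return lambda: data_type


# Late binding: every lambda shares the single comprehension variable `dt`,
# so after the loop they all observe its last value.
late_bound = [lambda: dt for dt in ("DATE", "TIME")]

# Factory binding: each call to make_annotator freezes the current value.
factory_bound = [make_annotator(dt) for dt in ("DATE", "TIME")]

print([f() for f in late_bound])     # ['TIME', 'TIME']
print([f() for f in factory_bound])  # ['DATE', 'TIME']
```

The factory form is what makes per-type annotators like the ones above safe to build in a comprehension.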
@@ -439,6 +439,7 @@ class DuckDB(Dialect):
        NO_PAREN_FUNCTION_PARSERS = {
            **parser.Parser.NO_PAREN_FUNCTION_PARSERS,
            "MAP": lambda self: self._parse_map(),
+            "@": lambda self: exp.Abs(this=self._parse_bitwise()),
        }

        TABLE_ALIAS_TOKENS = parser.Parser.TABLE_ALIAS_TOKENS - {
@@ -73,16 +73,14 @@ def _add_date_sql(self: Hive.Generator, expression: DATE_ADD_OR_SUB) -> str:
     if isinstance(expression, exp.DateSub):
         multiplier *= -1

-    if expression.expression.is_number:
-        modified_increment = exp.Literal.number(expression.expression.to_py() * multiplier)
-    else:
-        modified_increment = expression.expression
-        if multiplier != 1:
-            modified_increment = exp.Mul(  # type: ignore
-                this=modified_increment, expression=exp.Literal.number(multiplier)
-            )
+    increment = expression.expression
+    if isinstance(increment, exp.Literal):
+        value = increment.to_py() if increment.is_number else int(increment.name)
+        increment = exp.Literal.number(value * multiplier)
+    elif multiplier != 1:
+        increment *= exp.Literal.number(multiplier)

-    return self.func(func, expression.this, modified_increment)
+    return self.func(func, expression.this, increment)


 def _date_diff_sql(self: Hive.Generator, expression: exp.DateDiff | exp.TsOrDsDiff) -> str:
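The reworked `_add_date_sql` folds the sign into literal increments but keeps symbolic increments as a multiplication. A stripped-down sketch of that folding decision, using plain values and strings instead of sqlglot expression nodes (illustrative only):

```python
def fold_increment(value, multiplier: int):
    # Literal increments: fold the DATE_SUB sign directly into the number.
    if isinstance(value, (int, float)):
        return value * multiplier
    # Symbolic increments (e.g. a column): keep them as an expression and
    # only wrap in a multiplication when the multiplier actually matters.
    return value if multiplier == 1 else f"({value} * {multiplier})"


print(fold_increment(5, -1))      # DATE_SUB with a literal folds to -5
print(fold_increment("col", -1))  # symbolic increments stay an expression
print(fold_increment("col", 1))   # no-op multiplier adds no wrapping
```

The same three branches appear in the diff: `exp.Literal` folds eagerly, anything else multiplies lazily via `increment *= exp.Literal.number(multiplier)`.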
@@ -469,7 +467,7 @@ class Hive(Dialect):
        JSON_PATH_SINGLE_QUOTE_ESCAPE = True
        SUPPORTS_TO_NUMBER = False
        WITH_PROPERTIES_PREFIX = "TBLPROPERTIES"
-        PARSE_JSON_NAME = None
+        PARSE_JSON_NAME: t.Optional[str] = None
        PAD_FILL_PATTERN_IS_REQUIRED = True
        SUPPORTS_MEDIAN = False
        ARRAY_SIZE_NAME = "SIZE"
@@ -34,6 +34,7 @@ from sqlglot.dialects.dialect import (
 from sqlglot.dialects.hive import Hive
 from sqlglot.dialects.mysql import MySQL
 from sqlglot.helper import apply_index_offset, seq_get
+from sqlglot.optimizer.scope import find_all_in_scope
 from sqlglot.tokens import TokenType
 from sqlglot.transforms import unqualify_columns
 from sqlglot.generator import unsupported_args
@@ -55,7 +56,7 @@ def _no_sort_array(self: Presto.Generator, expression: exp.SortArray) -> str:


 def _schema_sql(self: Presto.Generator, expression: exp.Schema) -> str:
-    if isinstance(expression.parent, exp.Property):
+    if isinstance(expression.parent, exp.PartitionedByProperty):
         columns = ", ".join(f"'{c.name}'" for c in expression.expressions)
         return f"ARRAY[{columns}]"
@@ -188,6 +189,56 @@ def _date_delta_sql(
     return _delta_sql


+def _explode_to_unnest_sql(self: Presto.Generator, expression: exp.Lateral) -> str:
+    explode = expression.this
+    if isinstance(explode, exp.Explode):
+        exploded_type = explode.this.type
+        alias = expression.args.get("alias")
+
+        # This attempts a best-effort transpilation of LATERAL VIEW EXPLODE on a struct array
+        if (
+            isinstance(alias, exp.TableAlias)
+            and isinstance(exploded_type, exp.DataType)
+            and exploded_type.is_type(exp.DataType.Type.ARRAY)
+            and exploded_type.expressions
+            and exploded_type.expressions[0].is_type(exp.DataType.Type.STRUCT)
+        ):
+            # When unnesting a ROW in Presto, it produces N columns, so we need to fix the alias
+            alias.set("columns", [c.this.copy() for c in exploded_type.expressions[0].expressions])
+    elif isinstance(explode, exp.Inline):
+        explode.replace(exp.Explode(this=explode.this.copy()))
+
+    return explode_to_unnest_sql(self, expression)
+
+
+def _amend_exploded_column_table(expression: exp.Expression) -> exp.Expression:
+    # We check for expression.type because the columns can be amended only if types were inferred
+    if isinstance(expression, exp.Select) and expression.type:
+        for lateral in expression.args.get("laterals") or []:
+            alias = lateral.args.get("alias")
+            if (
+                not isinstance(lateral.this, exp.Explode)
+                or not isinstance(alias, exp.TableAlias)
+                or len(alias.columns) != 1
+            ):
+                continue
+
+            new_table = alias.this
+            old_table = alias.columns[0].name.lower()
+
+            # When transpiling a LATERAL VIEW EXPLODE Spark query, the exploded fields may be qualified
+            # with the struct column, resulting in invalid Presto references that need to be amended
+            for column in find_all_in_scope(expression, exp.Column):
+                if column.db.lower() == old_table:
+                    column.set("table", column.args["db"].pop())
+                elif column.table.lower() == old_table:
+                    column.set("table", new_table.copy())
+                elif column.name.lower() == old_table and isinstance(column.parent, exp.Dot):
+                    column.parent.replace(exp.column(column.parent.expression, table=new_table))
+
+    return expression
+
+
 class Presto(Dialect):
     INDEX_OFFSET = 1
     NULL_ORDERING = "nulls_are_last"
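The core of `_amend_exploded_column_table` is re-pointing column qualifiers: references qualified with the old exploded-column alias must be rewritten to the table alias introduced by `UNNEST`. A toy sketch of that case-insensitive re-qualification, detached from sqlglot's AST (names here are illustrative):

```python
def amend_qualifier(qualifier: str, old_table: str, new_table: str) -> str:
    # Columns qualified with the exploded struct column are re-pointed
    # at the alias introduced by UNNEST; other qualifiers are untouched.
    return new_table if qualifier.lower() == old_table.lower() else qualifier


print(amend_qualifier("S", "s", "u"))  # exploded alias, any case -> u
print(amend_qualifier("t", "s", "u"))  # unrelated qualifier -> t
```

The real transform handles two more shapes (a db-qualified reference and a bare struct-column reference inside a `Dot`), but all three branches reduce to this same rename.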
@@ -403,7 +454,7 @@ class Presto(Dialect):
            exp.Initcap: _initcap_sql,
            exp.Last: _first_last_sql,
            exp.LastDay: lambda self, e: self.func("LAST_DAY_OF_MONTH", e.this),
-            exp.Lateral: explode_to_unnest_sql,
+            exp.Lateral: _explode_to_unnest_sql,
            exp.Left: left_to_substring_sql,
            exp.Levenshtein: unsupported_args("ins_cost", "del_cost", "sub_cost", "max_dist")(
                rename_func("LEVENSHTEIN_DISTANCE")
@@ -421,8 +472,9 @@ class Presto(Dialect):
                [
                    transforms.eliminate_qualify,
                    transforms.eliminate_distinct_on,
-                    transforms.explode_to_unnest(1),
+                    transforms.explode_projection_to_unnest(1),
                    transforms.eliminate_semi_and_anti_joins,
+                    _amend_exploded_column_table,
                ]
            ),
            exp.SortArray: _no_sort_array,
@@ -518,6 +518,8 @@ class Snowflake(Dialect):
            **parser.Parser.PROPERTY_PARSERS,
            "LOCATION": lambda self: self._parse_location_property(),
            "TAG": lambda self: self._parse_tag(),
+            "USING": lambda self: self._match_text_seq("TEMPLATE")
+            and self.expression(exp.UsingTemplateProperty, this=self._parse_statement()),
        }

        TYPE_CONVERTERS = {
@@ -862,7 +864,8 @@ class Snowflake(Dialect):
            )

        def _parse_location_path(self) -> exp.Var:
-            parts = [self._advance_any(ignore_reserved=True)]
+            start = self._curr
+            self._advance_any(ignore_reserved=True)

            # We avoid consuming a comma token because external tables like @foo and @bar
            # can be joined in a query with a comma separator, as well as closing paren
|
@ -870,9 +873,9 @@ class Snowflake(Dialect):
|
||||||
while self._is_connected() and not self._match_set(
|
while self._is_connected() and not self._match_set(
|
||||||
(TokenType.COMMA, TokenType.L_PAREN, TokenType.R_PAREN), advance=False
|
(TokenType.COMMA, TokenType.L_PAREN, TokenType.R_PAREN), advance=False
|
||||||
):
|
):
|
||||||
parts.append(self._advance_any(ignore_reserved=True))
|
self._advance_any(ignore_reserved=True)
|
||||||
|
|
||||||
return exp.var("".join(part.text for part in parts if part))
|
return exp.var(self._find_sql(start, self._prev))
|
||||||
|
|
||||||
def _parse_lambda_arg(self) -> t.Optional[exp.Expression]:
|
def _parse_lambda_arg(self) -> t.Optional[exp.Expression]:
|
||||||
this = super()._parse_lambda_arg()
|
this = super()._parse_lambda_arg()
|
||||||
|
@@ -1028,7 +1031,7 @@ class Snowflake(Dialect):
            exp.Select: transforms.preprocess(
                [
                    transforms.eliminate_distinct_on,
-                    transforms.explode_to_unnest(),
+                    transforms.explode_projection_to_unnest(),
                    transforms.eliminate_semi_and_anti_joins,
                    _transform_generate_date_array,
                ]
@@ -21,20 +21,6 @@ from sqlglot.generator import unsupported_args
 from sqlglot.tokens import TokenType


-def _date_add_sql(self: SQLite.Generator, expression: exp.DateAdd) -> str:
-    modifier = expression.expression
-    modifier = modifier.name if modifier.is_string else self.sql(modifier)
-    unit = expression.args.get("unit")
-    modifier = f"'{modifier} {unit.name}'" if unit else f"'{modifier}'"
-    return self.func("DATE", expression.this, modifier)
-
-
-def _json_extract_sql(self: SQLite.Generator, expression: exp.JSONExtract) -> str:
-    if expression.expressions:
-        return self.function_fallback_sql(expression)
-    return arrow_json_extract_sql(self, expression)
-
-
 def _build_strftime(args: t.List) -> exp.Anonymous | exp.TimeToStr:
     if len(args) == 1:
         args.append(exp.CurrentTimestamp())
@@ -182,11 +168,9 @@ class SQLite(Dialect):
            exp.CurrentTime: lambda *_: "CURRENT_TIME",
            exp.CurrentTimestamp: lambda *_: "CURRENT_TIMESTAMP",
            exp.ColumnDef: transforms.preprocess([_generated_to_auto_increment]),
-            exp.DateAdd: _date_add_sql,
            exp.DateStrToDate: lambda self, e: self.sql(e, "this"),
            exp.If: rename_func("IIF"),
            exp.ILike: no_ilike_sql,
-            exp.JSONExtract: _json_extract_sql,
            exp.JSONExtractScalar: arrow_json_extract_sql,
            exp.Levenshtein: unsupported_args("ins_cost", "del_cost", "sub_cost", "max_dist")(
                rename_func("EDITDIST3")
@@ -224,6 +208,18 @@ class SQLite(Dialect):

        LIMIT_FETCH = "LIMIT"

+        def jsonextract_sql(self, expression: exp.JSONExtract) -> str:
+            if expression.expressions:
+                return self.function_fallback_sql(expression)
+            return arrow_json_extract_sql(self, expression)
+
+        def dateadd_sql(self, expression: exp.DateAdd) -> str:
+            modifier = expression.expression
+            modifier = modifier.name if modifier.is_string else self.sql(modifier)
+            unit = expression.args.get("unit")
+            modifier = f"'{modifier} {unit.name}'" if unit else f"'{modifier}'"
+            return self.func("DATE", expression.this, modifier)
+
        def cast_sql(self, expression: exp.Cast, safe_prefix: t.Optional[str] = None) -> str:
            if expression.is_type("date"):
                return self.func("DATE", expression.this)
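The relocated `dateadd_sql` builds SQLite's single-quoted `DATE()` modifier string, folding the unit into the quoted text when one is present. A self-contained sketch of just that string-building step (the function name mirrors the method; the plain-string signature is a simplification of the AST-based original):

```python
from typing import Optional


def sqlite_date_modifier(amount: str, unit: Optional[str]) -> str:
    # SQLite's DATE() accepts modifiers like '+5 DAY'; when a unit is
    # present it is folded into the quoted modifier string.
    body = f"{amount} {unit}" if unit else f"{amount}"
    return f"'{body}'"


print(sqlite_date_modifier("+5", "DAY"))  # -> '+5 DAY'
print(sqlite_date_modifier("-1", None))   # -> '-1'
```

So a `DATE_ADD(x, 5, DAY)` expression ends up generated roughly as `DATE(x, '+5 DAY')`.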
@@ -633,6 +633,15 @@ class TSQL(Dialect):
            else self.expression(exp.ScopeResolution, this=this, expression=to),
        }

+        def _parse_wrapped_select(self, table: bool = False) -> t.Optional[exp.Expression]:
+            if self._match(TokenType.MERGE):
+                comments = self._prev_comments
+                merge = self._parse_merge()
+                merge.add_comments(comments, prepend=True)
+                return merge
+
+            return super()._parse_wrapped_select(table=table)
+
        def _parse_dcolon(self) -> t.Optional[exp.Expression]:
            # We want to use _parse_types() if the first token after :: is a known type,
            # otherwise we could parse something like x::varchar(max) into a function
@@ -770,6 +779,18 @@ class TSQL(Dialect):

            return self.expression(exp.UserDefinedFunction, this=this, expressions=expressions)

+        def _parse_into(self) -> t.Optional[exp.Into]:
+            into = super()._parse_into()
+
+            table = isinstance(into, exp.Into) and into.find(exp.Table)
+            if isinstance(table, exp.Table):
+                table_identifier = table.this
+                if table_identifier.args.get("temporary"):
+                    # Promote the temporary property from the Identifier to the Into expression
+                    t.cast(exp.Into, into).set("temporary", True)
+
+            return into
+
        def _parse_id_var(
            self,
            any_token: bool = True,
@@ -1150,8 +1171,11 @@ class TSQL(Dialect):
            if isinstance(ctas_expression, exp.UNWRAPPED_QUERIES):
                ctas_expression = ctas_expression.subquery()

+            properties = expression.args.get("properties") or exp.Properties()
+            is_temp = any(isinstance(p, exp.TemporaryProperty) for p in properties.expressions)
+
            select_into = exp.select("*").from_(exp.alias_(ctas_expression, "temp", table=True))
-            select_into.set("into", exp.Into(this=table))
+            select_into.set("into", exp.Into(this=table, temporary=is_temp))

            if like_property:
                select_into.limit(0, copy=False)
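The `temporary` flag threaded through `exp.Into` here matters because T-SQL has no `TEMPORARY` keyword for `SELECT ... INTO`: a table is temporary purely when its identifier starts with `#` (or `##` for global temp tables). A trivial sketch of that naming convention (helper name is illustrative):

```python
def is_tsql_temp_table(name: str) -> bool:
    # T-SQL marks temporary tables by a leading '#' on the identifier,
    # which is why the temporary flag must end up on the table name
    # rather than as a keyword in the generated SELECT ... INTO.
    return name.startswith("#")


print(is_tsql_temp_table("#tmp"))   # local temp table
print(is_tsql_temp_table("tmp"))    # permanent table
```

This is the counterpart of the `into_sql` change below in the diff, which pushes the flag down onto the `Identifier` at generation time.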
@@ -1180,6 +1204,16 @@ class TSQL(Dialect):

            return self.prepend_ctes(expression, sql)

+        @generator.unsupported_args("unlogged", "expressions")
+        def into_sql(self, expression: exp.Into) -> str:
+            if expression.args.get("temporary"):
+                # If the Into expression has a temporary property, push this down to the Identifier
+                table = expression.find(exp.Table)
+                if table and isinstance(table.this, exp.Identifier):
+                    table.this.set("temporary", True)
+
+            return f"{self.seg('INTO')} {self.sql(expression, 'this')}"
+
        def count_sql(self, expression: exp.Count) -> str:
            func_name = "COUNT_BIG" if expression.args.get("big_int") else "COUNT"
            return rename_func(func_name)(self, expression)
@@ -3067,6 +3067,11 @@ class UnloggedProperty(Property):
     arg_types = {}


+# https://docs.snowflake.com/en/sql-reference/sql/create-table#create-table-using-template
+class UsingTemplateProperty(Property):
+    arg_types = {"this": True}
+
+
 # https://learn.microsoft.com/en-us/sql/t-sql/statements/create-view-transact-sql?view=sql-server-ver16
 class ViewAttributeProperty(Property):
     arg_types = {"this": True}
@@ -7012,7 +7017,7 @@ def maybe_copy(instance, copy=True):
     return instance.copy() if copy and instance else instance


-def _to_s(node: t.Any, verbose: bool = False, level: int = 0) -> str:
+def _to_s(node: t.Any, verbose: bool = False, level: int = 0, repr_str: bool = False) -> str:
     """Generate a textual representation of an Expression tree"""
     indent = "\n" + (" " * (level + 1))
     delim = f",{indent}"
@@ -7033,7 +7038,10 @@ def _to_s(node: t.Any, verbose: bool = False, level: int = 0) -> str:
            indent = ""
            delim = ", "

-        items = delim.join([f"{k}={_to_s(v, verbose, level + 1)}" for k, v in args.items()])
+        repr_str = node.is_string or (isinstance(node, Identifier) and node.quoted)
+        items = delim.join(
+            [f"{k}={_to_s(v, verbose, level + 1, repr_str=repr_str)}" for k, v in args.items()]
+        )
        return f"{node.__class__.__name__}({indent}{items})"

    if isinstance(node, list):
@@ -7041,6 +7049,10 @@ def _to_s(node: t.Any, verbose: bool = False, level: int = 0) -> str:
        items = f"{indent}{items}" if items else ""
        return f"[{items}]"

+    # We use the representation of the string to avoid stripping out important whitespace
+    if repr_str and isinstance(node, str):
+        node = repr(node)
+
    # Indent multiline strings to match the current level
    return indent.join(textwrap.dedent(str(node).strip("\n")).splitlines())
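The point of the new `repr_str` flag is that `_to_s` later runs `str(node).strip("\n")` plus `textwrap.dedent`, which silently eats whitespace that is significant inside string literals; `repr()` quotes the value and keeps it visible. A minimal sketch of that distinction (a simplified stand-in, not sqlglot's full `_to_s`):

```python
import textwrap


def to_s(node, repr_str: bool = False) -> str:
    # repr() keeps significant leading/trailing whitespace visible in
    # quoted string literals; str() leaves it exposed to later stripping.
    if repr_str and isinstance(node, str):
        return repr(node)
    return textwrap.dedent(str(node).strip("\n"))


print(to_s("  padded  ", repr_str=True))  # quoted, whitespace preserved
print(to_s("  padded  "))                 # bare, easy to mangle downstream
```

In the diff, `repr_str` is switched on exactly for string literals (`node.is_string`) and quoted identifiers, where such whitespace is part of the value.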
@@ -200,6 +200,7 @@ class Generator(metaclass=_Generator):
        exp.TransientProperty: lambda *_: "TRANSIENT",
        exp.Union: lambda self, e: self.set_operations(e),
        exp.UnloggedProperty: lambda *_: "UNLOGGED",
+        exp.UsingTemplateProperty: lambda self, e: f"USING TEMPLATE {self.sql(e, 'this')}",
        exp.UsingData: lambda self, e: f"USING DATA {self.sql(e, 'this')}",
        exp.Uuid: lambda *_: "UUID()",
        exp.UppercaseColumnConstraint: lambda *_: "UPPERCASE",
@@ -596,6 +597,7 @@ class Generator(metaclass=_Generator):
        exp.TransformModelProperty: exp.Properties.Location.POST_SCHEMA,
        exp.MergeTreeTTL: exp.Properties.Location.POST_SCHEMA,
        exp.UnloggedProperty: exp.Properties.Location.POST_CREATE,
+        exp.UsingTemplateProperty: exp.Properties.Location.POST_SCHEMA,
        exp.ViewAttributeProperty: exp.Properties.Location.POST_SCHEMA,
        exp.VolatileProperty: exp.Properties.Location.POST_CREATE,
        exp.WithDataProperty: exp.Properties.Location.POST_EXPRESSION,
@@ -3056,7 +3058,7 @@ class Generator(metaclass=_Generator):
        elif field:
            in_sql = self.sql(field)
        else:
-            in_sql = f"({self.expressions(expression, flat=True)})"
+            in_sql = f"({self.expressions(expression, dynamic=True, new_line=True, skip_first=True, skip_last=True)})"

        return f"{self.sql(expression, 'this')}{is_global} IN {in_sql}"
@@ -129,8 +129,7 @@ def _mergeable(
    inner_select = inner_scope.expression.unnest()

    def _is_a_window_expression_in_unmergable_operation():
-        window_expressions = inner_select.find_all(exp.Window)
-        window_alias_names = {window.parent.alias_or_name for window in window_expressions}
+        window_aliases = {s.alias_or_name for s in inner_select.selects if s.find(exp.Window)}
        inner_select_name = from_or_join.alias_or_name
        unmergable_window_columns = [
            column
@@ -142,7 +141,7 @@ def _mergeable(
        window_expressions_in_unmergable = [
            column
            for column in unmergable_window_columns
-            if column.table == inner_select_name and column.name in window_alias_names
+            if column.table == inner_select_name and column.name in window_aliases
        ]
        return any(window_expressions_in_unmergable)
@@ -4,6 +4,7 @@ from sqlglot import alias, exp
 from sqlglot.optimizer.qualify_columns import Resolver
 from sqlglot.optimizer.scope import Scope, traverse_scope
 from sqlglot.schema import ensure_schema
+from sqlglot.errors import OptimizeError

 # Sentinel value that means an outer query selecting ALL columns
 SELECT_ALL = object()
@@ -49,6 +50,10 @@ def pushdown_projections(expression, schema=None, remove_unused_selections=True)

        if isinstance(scope.expression, exp.SetOperation):
            left, right = scope.union_scopes
+            if len(left.expression.selects) != len(right.expression.selects):
+                scope_sql = scope.expression.sql()
+                raise OptimizeError(f"Invalid set operation due to column mismatch: {scope_sql}.")
+
            referenced_columns[left] = parent_selections

            if any(select.is_star for select in right.expression.selects):
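The new guard raises before projections are pushed into a set operation whose two sides disagree on arity, since mapping outer selections onto mismatched select lists would be ill-defined. The validation itself is just a length comparison; a self-contained sketch (using `ValueError` in place of sqlglot's `OptimizeError`):

```python
def check_set_operation(left_selects, right_selects):
    # Both sides of a UNION/EXCEPT/INTERSECT must project the same number
    # of columns before column-level pushdown makes sense.
    if len(left_selects) != len(right_selects):
        raise ValueError(
            f"Invalid set operation due to column mismatch: "
            f"{len(left_selects)} vs {len(right_selects)}."
        )


check_set_operation(["a", "b"], ["x", "y"])  # same arity: no error
try:
    check_set_operation(["a"], ["x", "y"])
except ValueError as e:
    print(e)
```

Failing early here turns a confusing downstream optimizer error into a direct diagnostic that names the offending query.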
@@ -272,7 +272,7 @@ def _expand_alias_refs(
    """
    expression = scope.expression

-    if not isinstance(expression, exp.Select):
+    if not isinstance(expression, exp.Select) or dialect == "oracle":
        return

    alias_to_expression: t.Dict[str, t.Tuple[exp.Expression, int]] = {}
@@ -3057,6 +3057,37 @@ class Parser(metaclass=_Parser):
    def _parse_projections(self) -> t.List[exp.Expression]:
        return self._parse_expressions()

+    def _parse_wrapped_select(self, table: bool = False) -> t.Optional[exp.Expression]:
+        if self._match_set((TokenType.PIVOT, TokenType.UNPIVOT)):
+            this: t.Optional[exp.Expression] = self._parse_simplified_pivot(
+                is_unpivot=self._prev.token_type == TokenType.UNPIVOT
+            )
+        elif self._match(TokenType.FROM):
+            from_ = self._parse_from(skip_from_token=True)
+            # Support parentheses for duckdb FROM-first syntax
+            select = self._parse_select()
+            if select:
+                select.set("from", from_)
+                this = select
+            else:
+                this = exp.select("*").from_(t.cast(exp.From, from_))
+        else:
+            this = (
+                self._parse_table()
+                if table
+                else self._parse_select(nested=True, parse_set_operation=False)
+            )
+
+            # Transform exp.Values into a exp.Table to pass through parse_query_modifiers
+            # in case a modifier (e.g. join) is following
+            if table and isinstance(this, exp.Values) and this.alias:
+                alias = this.args["alias"].pop()
+                this = exp.Table(this=this, alias=alias)
+
+        this = self._parse_query_modifiers(self._parse_set_operations(this))
+
+        return this
+
    def _parse_select(
        self,
        nested: bool = False,
@@ -3140,38 +3171,11 @@ class Parser(metaclass=_Parser):

            this = self._parse_query_modifiers(this)
        elif (table or nested) and self._match(TokenType.L_PAREN):
-            if self._match_set((TokenType.PIVOT, TokenType.UNPIVOT)):
-                this = self._parse_simplified_pivot(
-                    is_unpivot=self._prev.token_type == TokenType.UNPIVOT
-                )
-            elif self._match(TokenType.FROM):
-                from_ = self._parse_from(skip_from_token=True)
-                # Support parentheses for duckdb FROM-first syntax
-                select = self._parse_select()
-                if select:
-                    select.set("from", from_)
-                    this = select
-                else:
-                    this = exp.select("*").from_(t.cast(exp.From, from_))
-            else:
-                this = (
-                    self._parse_table()
-                    if table
-                    else self._parse_select(nested=True, parse_set_operation=False)
-                )
-
-                # Transform exp.Values into a exp.Table to pass through parse_query_modifiers
-                # in case a modifier (e.g. join) is following
-                if table and isinstance(this, exp.Values) and this.alias:
-                    alias = this.args["alias"].pop()
-                    this = exp.Table(this=this, alias=alias)
-
-            this = self._parse_query_modifiers(self._parse_set_operations(this))
-
-            self._match_r_paren()
+            this = self._parse_wrapped_select(table=table)

            # We return early here so that the UNION isn't attached to the subquery by the
            # following call to _parse_set_operations, but instead becomes the parent node
+            self._match_r_paren()
            return self._parse_subquery(this, parse_alias=parse_subquery_alias)
        elif self._match(TokenType.VALUES, advance=False):
            this = self._parse_derived_table_values()
@@ -410,10 +410,12 @@ def unnest_to_explode(
     return expression


-def explode_to_unnest(index_offset: int = 0) -> t.Callable[[exp.Expression], exp.Expression]:
-    """Convert explode/posexplode into unnest."""
+def explode_projection_to_unnest(
+    index_offset: int = 0,
+) -> t.Callable[[exp.Expression], exp.Expression]:
+    """Convert explode/posexplode projections into unnests."""

-    def _explode_to_unnest(expression: exp.Expression) -> exp.Expression:
+    def _explode_projection_to_unnest(expression: exp.Expression) -> exp.Expression:
         if isinstance(expression, exp.Select):
             from sqlglot.optimizer.scope import Scope

@@ -558,7 +560,7 @@ def explode_to_unnest(index_offset: int = 0) -> t.Callable[[exp.Expression], exp
         return expression

-    return _explode_to_unnest
+    return _explode_projection_to_unnest


 def add_within_group_for_percentiles(expression: exp.Expression) -> exp.Expression:
@@ -15,6 +15,7 @@ from sqlglot import (
 from sqlglot.helper import logger as helper_logger
 from sqlglot.parser import logger as parser_logger
 from tests.dialects.test_dialect import Validator
+from sqlglot.optimizer.annotate_types import annotate_types


 class TestBigQuery(Validator):
@@ -196,6 +197,9 @@ LANGUAGE js AS
         self.validate_identity("CAST(x AS TIMESTAMPTZ)", "CAST(x AS TIMESTAMP)")
         self.validate_identity("CAST(x AS RECORD)", "CAST(x AS STRUCT)")
         self.validate_identity("SELECT * FROM x WHERE x.y >= (SELECT MAX(a) FROM b-c) - 20")
+        self.validate_identity(
+            "SELECT FORMAT_TIMESTAMP('%Y-%m-%d %H:%M:%S', CURRENT_TIMESTAMP(), 'Europe/Berlin') AS ts"
+        )
         self.validate_identity(
             "SELECT cars, apples FROM some_table PIVOT(SUM(total_counts) FOR products IN ('general.cars' AS cars, 'food.apples' AS apples))"
         )
@@ -317,6 +321,13 @@ LANGUAGE js AS
             "SELECT CAST(1 AS INT64)",
         )

+        self.validate_all(
+            "SELECT DATE_SUB(CURRENT_DATE(), INTERVAL 2 DAY)",
+            write={
+                "bigquery": "SELECT DATE_SUB(CURRENT_DATE, INTERVAL '2' DAY)",
+                "databricks": "SELECT DATE_ADD(CURRENT_DATE, -2)",
+            },
+        )
         self.validate_all(
             "SELECT DATE_SUB(DATE '2008-12-25', INTERVAL 5 DAY)",
             write={
@@ -1309,8 +1320,8 @@ LANGUAGE js AS
                 "mysql": "DATE_ADD(CURRENT_DATE, INTERVAL '-1' DAY)",
                 "postgres": "CURRENT_DATE + INTERVAL '-1 DAY'",
                 "presto": "DATE_ADD('DAY', CAST('-1' AS BIGINT), CURRENT_DATE)",
-                "hive": "DATE_ADD(CURRENT_DATE, '-1')",
-                "spark": "DATE_ADD(CURRENT_DATE, '-1')",
+                "hive": "DATE_ADD(CURRENT_DATE, -1)",
+                "spark": "DATE_ADD(CURRENT_DATE, -1)",
             },
         )
         self.validate_all(
@@ -2356,3 +2367,18 @@ OPTIONS (
             "STRING_AGG(DISTINCT a ORDER BY b DESC, c DESC LIMIT 10)",
             "STRING_AGG(DISTINCT a, ',' ORDER BY b DESC, c DESC LIMIT 10)",
         )
+
+    def test_annotate_timestamps(self):
+        sql = """
+        SELECT
+          CURRENT_TIMESTAMP() AS curr_ts,
+          TIMESTAMP_SECONDS(2) AS ts_seconds,
+          PARSE_TIMESTAMP('%c', 'Thu Dec 25 07:30:00 2008', 'UTC') AS parsed_ts,
+          TIMESTAMP_ADD(TIMESTAMP "2008-12-25 15:30:00+00", INTERVAL 10 MINUTE) AS ts_add,
+          TIMESTAMP_SUB(TIMESTAMP "2008-12-25 15:30:00+00", INTERVAL 10 MINUTE) AS ts_sub,
+        """
+
+        annotated = annotate_types(self.parse_one(sql), dialect="bigquery")
+
+        for select in annotated.selects:
+            self.assertEqual(select.type.sql("bigquery"), "TIMESTAMP")
@@ -50,6 +50,7 @@ class TestDatabricks(Validator):
         self.validate_identity(
             "COPY INTO target FROM `s3://link` FILEFORMAT = AVRO VALIDATE = ALL FILES = ('file1', 'file2') FORMAT_OPTIONS ('opt1'='true', 'opt2'='test') COPY_OPTIONS ('mergeSchema'='true')"
         )
+        self.validate_identity("SELECT PARSE_JSON('{}')")
         self.validate_identity(
             "SELECT DATE_FORMAT(CAST(FROM_UTC_TIMESTAMP(foo, 'America/Los_Angeles') AS TIMESTAMP), 'yyyy-MM-dd HH:mm:ss') AS foo FROM t",
             "SELECT DATE_FORMAT(CAST(FROM_UTC_TIMESTAMP(CAST(foo AS TIMESTAMP), 'America/Los_Angeles') AS TIMESTAMP), 'yyyy-MM-dd HH:mm:ss') AS foo FROM t",
@@ -1569,3 +1569,29 @@ class TestDuckDB(Validator):
             """,
             "SELECT l_returnflag, l_linestatus, SUM(l_quantity) AS sum_qty, SUM(l_extendedprice) AS sum_base_price, SUM(l_extendedprice * (1 - l_discount)) AS sum_disc_price, SUM(l_extendedprice * (1 - l_discount) * (1 + l_tax)) AS sum_charge, AVG(l_quantity) AS avg_qty, AVG(l_extendedprice) AS avg_price, AVG(l_discount) AS avg_disc, COUNT(*) AS count_order",
         )
+
+    def test_at_sign_to_abs(self):
+        self.validate_identity(
+            "SELECT @col FROM t",
+            "SELECT ABS(col) FROM t",
+        )
+        self.validate_identity(
+            "SELECT @col + 1 FROM t",
+            "SELECT ABS(col + 1) FROM t",
+        )
+        self.validate_identity(
+            "SELECT (@col) + 1 FROM t",
+            "SELECT (ABS(col)) + 1 FROM t",
+        )
+        self.validate_identity(
+            "SELECT @(-1)",
+            "SELECT ABS((-1))",
+        )
+        self.validate_identity(
+            "SELECT @(-1) + 1",
+            "SELECT ABS((-1) + 1)",
+        )
+        self.validate_identity(
+            "SELECT (@-1) + 1",
+            "SELECT (ABS(-1)) + 1",
+        )
@@ -394,7 +394,7 @@ class TestSnowflake(Validator):
             """SELECT PARSE_JSON('{"fruit":"banana"}'):fruit""",
             write={
                 "bigquery": """SELECT JSON_EXTRACT(PARSE_JSON('{"fruit":"banana"}'), '$.fruit')""",
-                "databricks": """SELECT '{"fruit":"banana"}':fruit""",
+                "databricks": """SELECT PARSE_JSON('{"fruit":"banana"}'):fruit""",
                 "duckdb": """SELECT JSON('{"fruit":"banana"}') -> '$.fruit'""",
                 "mysql": """SELECT JSON_EXTRACT('{"fruit":"banana"}', '$.fruit')""",
                 "presto": """SELECT JSON_EXTRACT(JSON_PARSE('{"fruit":"banana"}'), '$.fruit')""",
@@ -1057,6 +1057,9 @@ class TestSnowflake(Validator):
             staged_file.sql(dialect="snowflake"),
         )

+        self.validate_identity('SELECT * FROM @"mystage"')
+        self.validate_identity('SELECT * FROM @"myschema"."mystage"/file.gz')
+        self.validate_identity('SELECT * FROM @"my_DB"."schEMA1".mystage/file.gz')
         self.validate_identity("SELECT metadata$filename FROM @s1/")
         self.validate_identity("SELECT * FROM @~")
         self.validate_identity("SELECT * FROM @~/some/path/to/file.csv")
@@ -1463,6 +1466,7 @@ class TestSnowflake(Validator):
             "CREATE TABLE t (id INT TAG (key1='value_1', key2='value_2'))",
         )

+        self.validate_identity("CREATE OR REPLACE TABLE foo COPY GRANTS USING TEMPLATE (SELECT 1)")
         self.validate_identity("USE SECONDARY ROLES ALL")
         self.validate_identity("USE SECONDARY ROLES NONE")
         self.validate_identity("USE SECONDARY ROLES a, b, c")
@@ -2386,11 +2390,14 @@ SINGLE = TRUE""",
         )

     def test_put_to_stage(self):
+        self.validate_identity('PUT \'file:///dir/tmp.csv\' @"my_DB"."schEMA1"."MYstage"')
+
         # PUT with file path and stage ref containing spaces (wrapped in single quotes)
         ast = parse_one("PUT 'file://my file.txt' '@s1/my folder'", read="snowflake")
         self.assertIsInstance(ast, exp.Put)
         self.assertEqual(ast.this, exp.Literal(this="file://my file.txt", is_string=True))
-        self.assertEqual(ast.args["target"], exp.Var(this="@s1/my folder"))
+        self.assertEqual(ast.args["target"], exp.Var(this="'@s1/my folder'"))
+        self.assertEqual(ast.sql("snowflake"), "PUT 'file://my file.txt' '@s1/my folder'")

         # expression with additional properties
         ast = parse_one(
@@ -322,6 +322,13 @@ TBLPROPERTIES (
             },
         )

+        self.validate_all(
+            "SELECT id_column, name, age FROM test_table LATERAL VIEW INLINE(struc_column) explode_view AS name, age",
+            write={
+                "presto": "SELECT id_column, name, age FROM test_table CROSS JOIN UNNEST(struc_column) AS explode_view(name, age)",
+                "spark": "SELECT id_column, name, age FROM test_table LATERAL VIEW INLINE(struc_column) explode_view AS name, age",
+            },
+        )
         self.validate_all(
             "SELECT ARRAY_AGG(x) FILTER (WHERE x = 5) FROM (SELECT 1 UNION ALL SELECT NULL) AS t(x)",
             write={
@@ -843,7 +850,7 @@ TBLPROPERTIES (
             },
         )

-    def test_explode_to_unnest(self):
+    def test_explode_projection_to_unnest(self):
         self.validate_all(
             "SELECT EXPLODE(x) FROM tbl",
             write={
@@ -951,3 +958,42 @@ TBLPROPERTIES (
         self.validate_identity(
             "ANALYZE TABLE ctlg.db.tbl PARTITION(foo = 'foo', bar = 'bar') COMPUTE STATISTICS NOSCAN"
         )
+
+    def test_transpile_annotated_exploded_column(self):
+        from sqlglot.optimizer.annotate_types import annotate_types
+        from sqlglot.optimizer.qualify import qualify
+
+        for db_prefix in ("", "explode_view."):
+            with self.subTest(f"Annotated exploded column with prefix: {db_prefix}."):
+                sql = f"""
+                WITH test_table AS (
+                    SELECT
+                        12345 AS id_column,
+                        ARRAY(
+                            STRUCT('John' AS name, 30 AS age),
+                            STRUCT('Mary' AS name, 20 AS age),
+                            STRUCT('Mike' AS name, 80 AS age),
+                            STRUCT('Dan' AS name, 50 AS age)
+                        ) AS struct_column
+                )
+
+                SELECT
+                    id_column,
+                    {db_prefix}new_column.name,
+                    {db_prefix}new_column.age
+                FROM test_table
+                LATERAL VIEW EXPLODE(struct_column) explode_view AS new_column
+                """
+
+                expr = self.parse_one(sql)
+                qualified = qualify(expr, dialect="spark")
+                annotated = annotate_types(qualified, dialect="spark")
+
+                self.assertEqual(
+                    annotated.sql("spark"),
+                    "WITH `test_table` AS (SELECT 12345 AS `id_column`, ARRAY(STRUCT('John' AS `name`, 30 AS `age`), STRUCT('Mary' AS `name`, 20 AS `age`), STRUCT('Mike' AS `name`, 80 AS `age`), STRUCT('Dan' AS `name`, 50 AS `age`)) AS `struct_column`) SELECT `test_table`.`id_column` AS `id_column`, `explode_view`.`new_column`.`name` AS `name`, `explode_view`.`new_column`.`age` AS `age` FROM `test_table` AS `test_table` LATERAL VIEW EXPLODE(`test_table`.`struct_column`) explode_view AS `new_column`",
+                )
+                self.assertEqual(
+                    annotated.sql("presto"),
+                    """WITH "test_table" AS (SELECT 12345 AS "id_column", ARRAY[CAST(ROW('John', 30) AS ROW("name" VARCHAR, "age" INTEGER)), CAST(ROW('Mary', 20) AS ROW("name" VARCHAR, "age" INTEGER)), CAST(ROW('Mike', 80) AS ROW("name" VARCHAR, "age" INTEGER)), CAST(ROW('Dan', 50) AS ROW("name" VARCHAR, "age" INTEGER))] AS "struct_column") SELECT "test_table"."id_column" AS "id_column", "explode_view"."name" AS "name", "explode_view"."age" AS "age" FROM "test_table" AS "test_table" CROSS JOIN UNNEST("test_table"."struct_column") AS "explode_view"("name", "age")""",
+                )
@@ -133,12 +133,25 @@ class TestTSQL(Validator):
             },
         )
         self.validate_all(
-            "WITH t(c) AS (SELECT 1) SELECT * INTO TEMP UNLOGGED foo FROM (SELECT c AS c FROM t) AS temp",
+            "WITH t(c) AS (SELECT 1) SELECT * INTO UNLOGGED #foo FROM (SELECT c AS c FROM t) AS temp",
             write={
                 "duckdb": "CREATE TEMPORARY TABLE foo AS WITH t(c) AS (SELECT 1) SELECT * FROM (SELECT c AS c FROM t) AS temp",
                 "postgres": "WITH t(c) AS (SELECT 1) SELECT * INTO TEMPORARY foo FROM (SELECT c AS c FROM t) AS temp",
             },
         )
+        self.validate_all(
+            "WITH t(c) AS (SELECT 1) SELECT c INTO #foo FROM t",
+            read={
+                "tsql": "WITH t(c) AS (SELECT 1) SELECT c INTO #foo FROM t",
+                "postgres": "WITH t(c) AS (SELECT 1) SELECT c INTO TEMPORARY foo FROM t",
+            },
+            write={
+                "tsql": "WITH t(c) AS (SELECT 1) SELECT c INTO #foo FROM t",
+                "postgres": "WITH t(c) AS (SELECT 1) SELECT c INTO TEMPORARY foo FROM t",
+                "duckdb": "CREATE TEMPORARY TABLE foo AS WITH t(c) AS (SELECT 1) SELECT c FROM t",
+                "snowflake": "CREATE TEMPORARY TABLE foo AS WITH t(c) AS (SELECT 1) SELECT c FROM t",
+            },
+        )
         self.validate_all(
             "WITH t(c) AS (SELECT 1) SELECT * INTO UNLOGGED foo FROM (SELECT c AS c FROM t) AS temp",
             write={
@@ -151,6 +164,13 @@ class TestTSQL(Validator):
                 "duckdb": "CREATE TABLE foo AS WITH t(c) AS (SELECT 1) SELECT * FROM (SELECT c AS c FROM t) AS temp",
             },
         )
+        self.validate_all(
+            "WITH y AS (SELECT 2 AS c) INSERT INTO #t SELECT * FROM y",
+            write={
+                "duckdb": "WITH y AS (SELECT 2 AS c) INSERT INTO t SELECT * FROM y",
+                "postgres": "WITH y AS (SELECT 2 AS c) INSERT INTO t SELECT * FROM y",
+            },
+        )
         self.validate_all(
             "WITH y AS (SELECT 2 AS c) INSERT INTO t SELECT * FROM y",
             read={
@@ -850,6 +870,9 @@ class TestTSQL(Validator):
         )

     def test_ddl(self):
+        for colstore in ("NONCLUSTERED COLUMNSTORE", "CLUSTERED COLUMNSTORE"):
+            self.validate_identity(f"CREATE {colstore} INDEX index_name ON foo.bar")
+
         for view_attr in ("ENCRYPTION", "SCHEMABINDING", "VIEW_METADATA"):
             self.validate_identity(f"CREATE VIEW a.b WITH {view_attr} AS SELECT * FROM x")

@@ -871,19 +894,19 @@ class TestTSQL(Validator):

         self.validate_identity("CREATE SCHEMA testSchema")
         self.validate_identity("CREATE VIEW t AS WITH cte AS (SELECT 1 AS c) SELECT c FROM cte")
+        self.validate_identity("ALTER TABLE tbl SET SYSTEM_VERSIONING=OFF")
+        self.validate_identity("ALTER TABLE tbl SET FILESTREAM_ON = 'test'")
+        self.validate_identity("ALTER TABLE tbl SET DATA_DELETION=ON")
+        self.validate_identity("ALTER TABLE tbl SET DATA_DELETION=OFF")
         self.validate_identity(
             "ALTER TABLE tbl SET SYSTEM_VERSIONING=ON(HISTORY_TABLE=db.tbl, DATA_CONSISTENCY_CHECK=OFF, HISTORY_RETENTION_PERIOD=5 DAYS)"
         )
         self.validate_identity(
             "ALTER TABLE tbl SET SYSTEM_VERSIONING=ON(HISTORY_TABLE=db.tbl, HISTORY_RETENTION_PERIOD=INFINITE)"
         )
-        self.validate_identity("ALTER TABLE tbl SET SYSTEM_VERSIONING=OFF")
-        self.validate_identity("ALTER TABLE tbl SET FILESTREAM_ON = 'test'")
         self.validate_identity(
             "ALTER TABLE tbl SET DATA_DELETION=ON(FILTER_COLUMN=col, RETENTION_PERIOD=5 MONTHS)"
         )
-        self.validate_identity("ALTER TABLE tbl SET DATA_DELETION=ON")
-        self.validate_identity("ALTER TABLE tbl SET DATA_DELETION=OFF")

         self.validate_identity("ALTER VIEW v AS SELECT a, b, c, d FROM foo")
         self.validate_identity("ALTER VIEW v AS SELECT * FROM foo WHERE c > 100")
@@ -899,10 +922,44 @@ class TestTSQL(Validator):
             "ALTER VIEW v WITH VIEW_METADATA AS SELECT * FROM foo WHERE c > 100",
             check_command_warning=True,
         )
+        self.validate_identity(
+            "CREATE COLUMNSTORE INDEX index_name ON foo.bar",
+            "CREATE NONCLUSTERED COLUMNSTORE INDEX index_name ON foo.bar",
+        )
         self.validate_identity(
             "CREATE PROCEDURE foo AS BEGIN DELETE FROM bla WHERE foo < CURRENT_TIMESTAMP - 7 END",
             "CREATE PROCEDURE foo AS BEGIN DELETE FROM bla WHERE foo < GETDATE() - 7 END",
         )
+        self.validate_identity(
+            "INSERT INTO Production.UpdatedInventory SELECT ProductID, LocationID, NewQty, PreviousQty FROM (MERGE INTO Production.ProductInventory AS pi USING (SELECT ProductID, SUM(OrderQty) FROM Sales.SalesOrderDetail AS sod INNER JOIN Sales.SalesOrderHeader AS soh ON sod.SalesOrderID = soh.SalesOrderID AND soh.OrderDate BETWEEN '20030701' AND '20030731' GROUP BY ProductID) AS src(ProductID, OrderQty) ON pi.ProductID = src.ProductID WHEN MATCHED AND pi.Quantity - src.OrderQty >= 0 THEN UPDATE SET pi.Quantity = pi.Quantity - src.OrderQty WHEN MATCHED AND pi.Quantity - src.OrderQty <= 0 THEN DELETE OUTPUT $action, Inserted.ProductID, Inserted.LocationID, Inserted.Quantity AS NewQty, Deleted.Quantity AS PreviousQty) AS Changes(Action, ProductID, LocationID, NewQty, PreviousQty) WHERE Action = 'UPDATE'",
+            """INSERT INTO Production.UpdatedInventory
+SELECT
+  ProductID,
+  LocationID,
+  NewQty,
+  PreviousQty
+FROM (
+  MERGE INTO Production.ProductInventory AS pi
+  USING (
+    SELECT
+      ProductID,
+      SUM(OrderQty)
+    FROM Sales.SalesOrderDetail AS sod
+    INNER JOIN Sales.SalesOrderHeader AS soh
+      ON sod.SalesOrderID = soh.SalesOrderID
+      AND soh.OrderDate BETWEEN '20030701' AND '20030731'
+    GROUP BY
+      ProductID
+  ) AS src(ProductID, OrderQty)
+  ON pi.ProductID = src.ProductID
+  WHEN MATCHED AND pi.Quantity - src.OrderQty >= 0 THEN UPDATE SET pi.Quantity = pi.Quantity - src.OrderQty
+  WHEN MATCHED AND pi.Quantity - src.OrderQty <= 0 THEN DELETE
+  OUTPUT $action, Inserted.ProductID, Inserted.LocationID, Inserted.Quantity AS NewQty, Deleted.Quantity AS PreviousQty
+) AS Changes(Action, ProductID, LocationID, NewQty, PreviousQty)
+WHERE
+  Action = 'UPDATE'""",
+            pretty=True,
+        )
         self.validate_all(
             "CREATE TABLE [#temptest] (name INTEGER)",
@@ -1003,14 +1060,6 @@ class TestTSQL(Validator):
             },
         )

-        for colstore in ("NONCLUSTERED COLUMNSTORE", "CLUSTERED COLUMNSTORE"):
-            self.validate_identity(f"CREATE {colstore} INDEX index_name ON foo.bar")
-
-        self.validate_identity(
-            "CREATE COLUMNSTORE INDEX index_name ON foo.bar",
-            "CREATE NONCLUSTERED COLUMNSTORE INDEX index_name ON foo.bar",
-        )
-
     def test_insert_cte(self):
         self.validate_all(
             "INSERT INTO foo.bar WITH cte AS (SELECT 1 AS one) SELECT * FROM cte",
@@ -19,6 +19,15 @@ INT;
 LEAST(1, 2.5, 3);
 DOUBLE;

+CURRENT_TIME();
+TIME;
+
+TIME_ADD(CAST('09:05:03' AS TIME), INTERVAL 2 HOUR);
+TIME;
+
+TIME_SUB(CAST('09:05:03' AS TIME), INTERVAL 2 HOUR);
+TIME;
+
 --------------------------------------
 -- Spark2 / Spark3 / Databricks
 --------------------------------------
tests/fixtures/optimizer/merge_subqueries.sql | 21 (vendored)

@@ -259,7 +259,7 @@ FROM
   t1;
 WITH t1 AS (SELECT x.a AS a, x.b AS b, ROW_NUMBER() OVER (PARTITION BY x.a ORDER BY x.a) AS row_num FROM x AS x) SELECT SUM(t1.row_num) AS total_rows FROM t1 AS t1;

-# title: Test prevent merging of window if in group by func
+# title: Test prevent merging of window if in group by
 with t1 as (
   SELECT
     x.a,
@@ -277,7 +277,7 @@ GROUP BY t1.row_num
 ORDER BY t1.row_num;
 WITH t1 AS (SELECT x.a AS a, x.b AS b, ROW_NUMBER() OVER (PARTITION BY x.a ORDER BY x.a) AS row_num FROM x AS x) SELECT t1.row_num AS row_num, SUM(t1.a) AS total FROM t1 AS t1 GROUP BY t1.row_num ORDER BY row_num;

-# title: Test prevent merging of window if in order by func
+# title: Test prevent merging of window if in order by
 with t1 as (
   SELECT
     x.a,
@@ -294,6 +294,23 @@ FROM
 ORDER BY t1.row_num, t1.a;
 WITH t1 AS (SELECT x.a AS a, x.b AS b, ROW_NUMBER() OVER (PARTITION BY x.a ORDER BY x.a) AS row_num FROM x AS x) SELECT t1.row_num AS row_num, t1.a AS a FROM t1 AS t1 ORDER BY t1.row_num, t1.a;

+# title: Test preventing merging of window nested under complex projection if in order by
+WITH t1 AS (
+  SELECT
+    x.a,
+    x.b,
+    ROW_NUMBER() OVER (PARTITION BY x.a ORDER BY x.a) - 1 AS row_num
+  FROM
+    x
+)
+SELECT
+  t1.row_num AS row_num,
+  t1.a AS a
+FROM
+  t1
+ORDER BY t1.row_num, t1.a;
+WITH t1 AS (SELECT x.a AS a, x.b AS b, ROW_NUMBER() OVER (PARTITION BY x.a ORDER BY x.a) - 1 AS row_num FROM x AS x) SELECT t1.row_num AS row_num, t1.a AS a FROM t1 AS t1 ORDER BY t1.row_num, t1.a;
+
 # title: Test allow merging of window function
 with t1 as (
   SELECT
tests/fixtures/optimizer/optimizer.sql | 13 (vendored)

@@ -760,7 +760,10 @@ SELECT
   `_q_0`.`first_half_sales` AS `first_half_sales`,
   `_q_0`.`second_half_sales` AS `second_half_sales`
 FROM `produce` AS `produce`
-UNPIVOT((`first_half_sales`, `second_half_sales`) FOR `semesters` IN ((`produce`.`q1`, `produce`.`q2`) AS 'semester_1', (`produce`.`q3`, `produce`.`q4`) AS 'semester_2')) AS `_q_0`;
+UNPIVOT((`first_half_sales`, `second_half_sales`) FOR `semesters` IN (
+  (`produce`.`q1`, `produce`.`q2`) AS 'semester_1',
+  (`produce`.`q3`, `produce`.`q4`) AS 'semester_2'
+)) AS `_q_0`;

 # title: quoting is preserved
 # dialect: snowflake
@@ -1382,7 +1385,13 @@ LEFT JOIN `_u_3` AS `_u_3`
   ON `_u_3`.`_u_4` = `cs1`.`cs_order_number`
 JOIN `call_center` AS `call_center`
   ON `call_center`.`cc_call_center_sk` = `cs1`.`cs_call_center_sk`
-  AND `call_center`.`cc_county` IN ('Williamson County', 'Williamson County', 'Williamson County', 'Williamson County', 'Williamson County')
+  AND `call_center`.`cc_county` IN (
+    'Williamson County',
+    'Williamson County',
+    'Williamson County',
+    'Williamson County',
+    'Williamson County'
+  )
 JOIN `customer_address` AS `customer_address`
   ON `cs1`.`cs_ship_addr_sk` = `customer_address`.`ca_address_sk`
   AND `customer_address`.`ca_state` = 'GA'
6
tests/fixtures/optimizer/qualify_columns.sql
vendored
6
tests/fixtures/optimizer/qualify_columns.sql
vendored
|
@@ -770,6 +770,12 @@ WITH RECURSIVE t AS (SELECT 1 AS c UNION ALL SELECT t.c + 1 AS c FROM t AS t WHE
 SELECT DISTINCT ON (new_col, b + 1, 1) t1.a AS new_col FROM x AS t1 ORDER BY new_col;
 SELECT DISTINCT ON (new_col, t1.b + 1, new_col) t1.a AS new_col FROM x AS t1 ORDER BY new_col;
 
+# title: Oracle does not support lateral alias expansion
+# dialect: oracle
+# execute: false
+SELECT a AS b, b AS a FROM c;
+SELECT C.A AS B, C.B AS A FROM C C;
+
 --------------------------------------
 -- Wrapped tables / join constructs
 --------------------------------------
479 tests/fixtures/optimizer/tpc-ds/tpc-ds.sql vendored
@@ -1227,7 +1227,408 @@ WITH "a1" AS (
 SUBSTRING("customer_address"."ca_zip", 1, 5) AS "ca_zip"
 FROM "customer_address" AS "customer_address"
 WHERE
-SUBSTRING("customer_address"."ca_zip", 1, 5) IN ('67436', '26121', '38443', '63157', '68856', '19485', '86425', '26741', '70991', '60899', '63573', '47556', '56193', '93314', '87827', '62017', '85067', '95390', '48091', '10261', '81845', '41790', '42853', '24675', '12840', '60065', '84430', '57451', '24021', '91735', '75335', '71935', '34482', '56943', '70695', '52147', '56251', '28411', '86653', '23005', '22478', '29031', '34398', '15365', '42460', '33337', '59433', '73943', '72477', '74081', '74430', '64605', '39006', '11226', '49057', '97308', '42663', '18187', '19768', '43454', '32147', '76637', '51975', '11181', '45630', '33129', '45995', '64386', '55522', '26697', '20963', '35154', '64587', '49752', '66386', '30586', '59286', '13177', '66646', '84195', '74316', '36853', '32927', '12469', '11904', '36269', '17724', '55346', '12595', '53988', '65439', '28015', '63268', '73590', '29216', '82575', '69267', '13805', '91678', '79460', '94152', '14961', '15419', '48277', '62588', '55493', '28360', '14152', '55225', '18007', '53705', '56573', '80245', '71769', '57348', '36845', '13039', '17270', '22363', '83474', '25294', '43269', '77666', '15488', '99146', '64441', '43338', '38736', '62754', '48556', '86057', '23090', '38114', '66061', '18910', '84385', '23600', '19975', '27883', '65719', '19933', '32085', '49731', '40473', '27190', '46192', '23949', '44738', '12436', '64794', '68741', '15333', '24282', '49085', '31844', '71156', '48441', '17100', '98207', '44982', '20277', '71496', '96299', '37583', '22206', '89174', '30589', '61924', '53079', '10976', '13104', '42794', '54772', '15809', '56434', '39975', '13874', '30753', '77598', '78229', '59478', '12345', '55547', '57422', '42600', '79444', '29074', '29752', '21676', '32096', '43044', '39383', '37296', '36295', '63077', '16572', '31275', '18701', '40197', '48242', '27219', '49865', '84175', '30446', '25165', '13807', '72142', '70499', '70464', '71429', '18111', '70857', '29545', '36425', '52706', '36194', '42963', '75068', '47921', '74763', '90990', '89456', '62073', '88397', '73963', '75885', '62657', '12530', '81146', '57434', '25099', '41429', '98441', '48713', '52552', '31667', '14072', '13903', '44709', '85429', '58017', '38295', '44875', '73541', '30091', '12707', '23762', '62258', '33247', '78722', '77431', '14510', '35656', '72428', '92082', '35267', '43759', '24354', '90952', '11512', '21242', '22579', '56114', '32339', '52282', '41791', '24484', '95020', '28408', '99710', '11899', '43344', '72915', '27644', '62708', '74479', '17177', '32619', '12351', '91339', '31169', '57081', '53522', '16712', '34419', '71779', '44187', '46206', '96099', '61910', '53664', '12295', '31837', '33096', '10813', '63048', '31732', '79118', '73084', '72783', '84952', '46965', '77956', '39815', '32311', '75329', '48156', '30826', '49661', '13736', '92076', '74865', '88149', '92397', '52777', '68453', '32012', '21222', '52721', '24626', '18210', '42177', '91791', '75251', '82075', '44372', '45542', '20609', '60115', '17362', '22750', '90434', '31852', '54071', '33762', '14705', '40718', '56433', '30996', '40657', '49056', '23585', '66455', '41021', '74736', '72151', '37007', '21729', '60177', '84558', '59027', '93855', '60022', '86443', '19541', '86886', '30532', '39062', '48532', '34713', '52077', '22564', '64638', '15273', '31677', '36138', '62367', '60261', '80213', '42818', '25113', '72378', '69802', '69096', '55443', '28820', '13848', '78258', '37490', '30556', '77380', '28447', '44550', '26791', '70609', '82182', '33306', '43224', '22322', '86959', '68519', '14308', '46501', '81131', '34056', '61991', '19896', '87804', '65774', '92564')
+SUBSTRING("customer_address"."ca_zip", 1, 5) IN (
+  '67436',
+  '26121',
+  '38443',
+  '63157',
+  '68856',
+  '19485',
+  '86425',
+  '26741',
+  '70991',
+  '60899',
+  '63573',
+  '47556',
+  '56193',
+  '93314',
+  '87827',
+  '62017',
+  '85067',
+  '95390',
+  '48091',
+  '10261',
+  '81845',
+  '41790',
+  '42853',
+  '24675',
+  '12840',
+  '60065',
+  '84430',
+  '57451',
+  '24021',
+  '91735',
+  '75335',
+  '71935',
+  '34482',
+  '56943',
+  '70695',
+  '52147',
+  '56251',
+  '28411',
+  '86653',
+  '23005',
+  '22478',
+  '29031',
+  '34398',
+  '15365',
+  '42460',
+  '33337',
+  '59433',
+  '73943',
+  '72477',
+  '74081',
+  '74430',
+  '64605',
+  '39006',
+  '11226',
+  '49057',
+  '97308',
+  '42663',
+  '18187',
+  '19768',
+  '43454',
+  '32147',
+  '76637',
+  '51975',
+  '11181',
+  '45630',
+  '33129',
+  '45995',
+  '64386',
+  '55522',
+  '26697',
+  '20963',
+  '35154',
+  '64587',
+  '49752',
+  '66386',
+  '30586',
+  '59286',
+  '13177',
+  '66646',
+  '84195',
+  '74316',
+  '36853',
+  '32927',
+  '12469',
+  '11904',
+  '36269',
+  '17724',
+  '55346',
+  '12595',
+  '53988',
+  '65439',
+  '28015',
+  '63268',
+  '73590',
+  '29216',
+  '82575',
+  '69267',
+  '13805',
+  '91678',
+  '79460',
+  '94152',
+  '14961',
+  '15419',
+  '48277',
+  '62588',
+  '55493',
+  '28360',
+  '14152',
+  '55225',
+  '18007',
+  '53705',
+  '56573',
+  '80245',
+  '71769',
+  '57348',
+  '36845',
+  '13039',
+  '17270',
+  '22363',
+  '83474',
+  '25294',
+  '43269',
+  '77666',
+  '15488',
+  '99146',
+  '64441',
+  '43338',
+  '38736',
+  '62754',
+  '48556',
+  '86057',
+  '23090',
+  '38114',
+  '66061',
+  '18910',
+  '84385',
+  '23600',
+  '19975',
+  '27883',
+  '65719',
+  '19933',
+  '32085',
+  '49731',
+  '40473',
+  '27190',
+  '46192',
+  '23949',
+  '44738',
+  '12436',
+  '64794',
+  '68741',
+  '15333',
+  '24282',
+  '49085',
+  '31844',
+  '71156',
+  '48441',
+  '17100',
+  '98207',
+  '44982',
+  '20277',
+  '71496',
+  '96299',
+  '37583',
+  '22206',
+  '89174',
+  '30589',
+  '61924',
+  '53079',
+  '10976',
+  '13104',
+  '42794',
+  '54772',
+  '15809',
+  '56434',
+  '39975',
+  '13874',
+  '30753',
+  '77598',
+  '78229',
+  '59478',
+  '12345',
+  '55547',
+  '57422',
+  '42600',
+  '79444',
+  '29074',
+  '29752',
+  '21676',
+  '32096',
+  '43044',
+  '39383',
+  '37296',
+  '36295',
+  '63077',
+  '16572',
+  '31275',
+  '18701',
+  '40197',
+  '48242',
+  '27219',
+  '49865',
+  '84175',
+  '30446',
+  '25165',
+  '13807',
+  '72142',
+  '70499',
+  '70464',
+  '71429',
+  '18111',
+  '70857',
+  '29545',
+  '36425',
+  '52706',
+  '36194',
+  '42963',
+  '75068',
+  '47921',
+  '74763',
+  '90990',
+  '89456',
+  '62073',
+  '88397',
+  '73963',
+  '75885',
+  '62657',
+  '12530',
+  '81146',
+  '57434',
+  '25099',
+  '41429',
+  '98441',
+  '48713',
+  '52552',
+  '31667',
+  '14072',
+  '13903',
+  '44709',
+  '85429',
+  '58017',
+  '38295',
+  '44875',
+  '73541',
+  '30091',
+  '12707',
+  '23762',
+  '62258',
+  '33247',
+  '78722',
+  '77431',
+  '14510',
+  '35656',
+  '72428',
+  '92082',
+  '35267',
+  '43759',
+  '24354',
+  '90952',
+  '11512',
+  '21242',
+  '22579',
+  '56114',
+  '32339',
+  '52282',
+  '41791',
+  '24484',
+  '95020',
+  '28408',
+  '99710',
+  '11899',
+  '43344',
+  '72915',
+  '27644',
+  '62708',
+  '74479',
+  '17177',
+  '32619',
+  '12351',
+  '91339',
+  '31169',
+  '57081',
+  '53522',
+  '16712',
+  '34419',
+  '71779',
+  '44187',
+  '46206',
+  '96099',
+  '61910',
+  '53664',
+  '12295',
+  '31837',
+  '33096',
+  '10813',
+  '63048',
+  '31732',
+  '79118',
+  '73084',
+  '72783',
+  '84952',
+  '46965',
+  '77956',
+  '39815',
+  '32311',
+  '75329',
+  '48156',
+  '30826',
+  '49661',
+  '13736',
+  '92076',
+  '74865',
+  '88149',
+  '92397',
+  '52777',
+  '68453',
+  '32012',
+  '21222',
+  '52721',
+  '24626',
+  '18210',
+  '42177',
+  '91791',
+  '75251',
+  '82075',
+  '44372',
+  '45542',
+  '20609',
+  '60115',
+  '17362',
+  '22750',
+  '90434',
+  '31852',
+  '54071',
+  '33762',
+  '14705',
+  '40718',
+  '56433',
+  '30996',
+  '40657',
+  '49056',
+  '23585',
+  '66455',
+  '41021',
+  '74736',
+  '72151',
+  '37007',
+  '21729',
+  '60177',
+  '84558',
+  '59027',
+  '93855',
+  '60022',
+  '86443',
+  '19541',
+  '86886',
+  '30532',
+  '39062',
+  '48532',
+  '34713',
+  '52077',
+  '22564',
+  '64638',
+  '15273',
+  '31677',
+  '36138',
+  '62367',
+  '60261',
+  '80213',
+  '42818',
+  '25113',
+  '72378',
+  '69802',
+  '69096',
+  '55443',
+  '28820',
+  '13848',
+  '78258',
+  '37490',
+  '30556',
+  '77380',
+  '28447',
+  '44550',
+  '26791',
+  '70609',
+  '82182',
+  '33306',
+  '43224',
+  '22322',
+  '86959',
+  '68519',
+  '14308',
+  '46501',
+  '81131',
+  '34056',
+  '61991',
+  '19896',
+  '87804',
+  '65774',
+  '92564'
+)
 INTERSECT
 SELECT
 "a1"."ca_zip" AS "ca_zip"
@@ -1580,7 +1981,13 @@ LEFT JOIN "_u_4" AS "_u_4"
 ON "_u_4"."_u_5" = "c"."c_customer_sk"
 JOIN "customer_address" AS "ca"
 ON "c"."c_current_addr_sk" = "ca"."ca_address_sk"
-AND "ca"."ca_county" IN ('Lycoming County', 'Sheridan County', 'Kandiyohi County', 'Pike County', 'Greene County')
+AND "ca"."ca_county" IN (
+  'Lycoming County',
+  'Sheridan County',
+  'Kandiyohi County',
+  'Pike County',
+  'Greene County'
+)
 JOIN "customer_demographics" AS "customer_demographics"
 ON "c"."c_current_cdemo_sk" = "customer_demographics"."cd_demo_sk"
 WHERE
@@ -2413,7 +2820,13 @@ LEFT JOIN "_u_3" AS "_u_3"
 ON "_u_3"."_u_4" = "cs1"."cs_order_number"
 JOIN "call_center" AS "call_center"
 ON "call_center"."cc_call_center_sk" = "cs1"."cs_call_center_sk"
-AND "call_center"."cc_county" IN ('Williamson County', 'Williamson County', 'Williamson County', 'Williamson County', 'Williamson County')
+AND "call_center"."cc_county" IN (
+  'Williamson County',
+  'Williamson County',
+  'Williamson County',
+  'Williamson County',
+  'Williamson County'
+)
 JOIN "customer_address" AS "customer_address"
 ON "cs1"."cs_ship_addr_sk" = "customer_address"."ca_address_sk"
 AND "customer_address"."ca_state" = 'IA'
@@ -4221,7 +4634,16 @@ WITH "dn" AS (
 ELSE NULL
 END > 1.2
 JOIN "store" AS "store"
-ON "store"."s_county" IN ('Williamson County', 'Williamson County', 'Williamson County', 'Williamson County', 'Williamson County', 'Williamson County', 'Williamson County', 'Williamson County')
+ON "store"."s_county" IN (
+  'Williamson County',
+  'Williamson County',
+  'Williamson County',
+  'Williamson County',
+  'Williamson County',
+  'Williamson County',
+  'Williamson County',
+  'Williamson County'
+)
 AND "store"."s_store_sk" = "store_sales"."ss_store_sk"
 GROUP BY
 "store_sales"."ss_ticket_number",
@@ -6339,7 +6761,12 @@ WITH "tmp1" AS (
 WHERE
 (
 "item"."i_brand" IN ('amalgimporto #1', 'edu packscholar #1', 'exportiimporto #1', 'importoamalg #1')
-OR "item"."i_brand" IN ('scholaramalgamalg #14', 'scholaramalgamalg #7', 'exportiunivamalg #9', 'scholaramalgamalg #9')
+OR "item"."i_brand" IN (
+  'scholaramalgamalg #14',
+  'scholaramalgamalg #7',
+  'exportiunivamalg #9',
+  'scholaramalgamalg #9'
+)
 )
 AND (
 "item"."i_brand" IN ('amalgimporto #1', 'edu packscholar #1', 'exportiimporto #1', 'importoamalg #1')
@@ -6350,11 +6777,21 @@ WITH "tmp1" AS (
 OR "item"."i_class" IN ('personal', 'portable', 'reference', 'self-help')
 )
 AND (
-"item"."i_brand" IN ('scholaramalgamalg #14', 'scholaramalgamalg #7', 'exportiunivamalg #9', 'scholaramalgamalg #9')
+"item"."i_brand" IN (
+  'scholaramalgamalg #14',
+  'scholaramalgamalg #7',
+  'exportiunivamalg #9',
+  'scholaramalgamalg #9'
+)
 OR "item"."i_category" IN ('Women', 'Music', 'Men')
 )
 AND (
-"item"."i_brand" IN ('scholaramalgamalg #14', 'scholaramalgamalg #7', 'exportiunivamalg #9', 'scholaramalgamalg #9')
+"item"."i_brand" IN (
+  'scholaramalgamalg #14',
+  'scholaramalgamalg #7',
+  'exportiunivamalg #9',
+  'scholaramalgamalg #9'
+)
 OR "item"."i_class" IN ('accessories', 'classical', 'fragrances', 'pants')
 )
 AND (
@@ -7755,7 +8192,12 @@ WITH "tmp1" AS (
 WHERE
 (
 "item"."i_brand" IN ('amalgimporto #1', 'edu packscholar #1', 'exportiimporto #1', 'importoamalg #1')
-OR "item"."i_brand" IN ('scholaramalgamalg #14', 'scholaramalgamalg #7', 'exportiunivamalg #9', 'scholaramalgamalg #9')
+OR "item"."i_brand" IN (
+  'scholaramalgamalg #14',
+  'scholaramalgamalg #7',
+  'exportiunivamalg #9',
+  'scholaramalgamalg #9'
+)
 )
 AND (
 "item"."i_brand" IN ('amalgimporto #1', 'edu packscholar #1', 'exportiimporto #1', 'importoamalg #1')
@@ -7766,11 +8208,21 @@ WITH "tmp1" AS (
 OR "item"."i_class" IN ('personal', 'portable', 'reference', 'self-help')
 )
 AND (
-"item"."i_brand" IN ('scholaramalgamalg #14', 'scholaramalgamalg #7', 'exportiunivamalg #9', 'scholaramalgamalg #9')
+"item"."i_brand" IN (
+  'scholaramalgamalg #14',
+  'scholaramalgamalg #7',
+  'exportiunivamalg #9',
+  'scholaramalgamalg #9'
+)
 OR "item"."i_category" IN ('Women', 'Music', 'Men')
 )
 AND (
-"item"."i_brand" IN ('scholaramalgamalg #14', 'scholaramalgamalg #7', 'exportiunivamalg #9', 'scholaramalgamalg #9')
+"item"."i_brand" IN (
+  'scholaramalgamalg #14',
+  'scholaramalgamalg #7',
+  'exportiunivamalg #9',
+  'scholaramalgamalg #9'
+)
 OR "item"."i_class" IN ('accessories', 'classical', 'fragrances', 'pants')
 )
 AND (
@@ -9677,7 +10129,12 @@ WITH "dj" AS (
 ELSE NULL
 END > 1
 JOIN "store" AS "store"
-ON "store"."s_county" IN ('Williamson County', 'Williamson County', 'Williamson County', 'Williamson County')
+ON "store"."s_county" IN (
+  'Williamson County',
+  'Williamson County',
+  'Williamson County',
+  'Williamson County'
+)
 AND "store"."s_store_sk" = "store_sales"."ss_store_sk"
 GROUP BY
 "store_sales"."ss_ticket_number",
@@ -9,6 +9,19 @@ from sqlglot import ParseError, alias, exp, parse_one
 class TestExpressions(unittest.TestCase):
     maxDiff = None
 
+    def test_to_s(self):
+        self.assertEqual(repr(parse_one("5")), "Literal(this=5, is_string=False)")
+        self.assertEqual(repr(parse_one("5.3")), "Literal(this=5.3, is_string=False)")
+        self.assertEqual(repr(parse_one("True")), "Boolean(this=True)")
+        self.assertEqual(repr(parse_one("' x'")), "Literal(this=' x', is_string=True)")
+        self.assertEqual(repr(parse_one("' \n x'")), "Literal(this=' \\n x', is_string=True)")
+        self.assertEqual(
+            repr(parse_one(" x ")), "Column(\n this=Identifier(this=x, quoted=False))"
+        )
+        self.assertEqual(
+            repr(parse_one('" x "')), "Column(\n this=Identifier(this=' x ', quoted=True))"
+        )
+
     def test_arg_key(self):
         self.assertEqual(parse_one("sum(1)").find(exp.Literal).arg_key, "this")
 