Merging upstream version 25.8.1.
Signed-off-by: Daniel Baumann <daniel@debian.org>
parent 1d73cb497c
commit 50df8dea29
61 changed files with 50550 additions and 50354 deletions
CHANGELOG.md (30 changed lines)
@@ -1,6 +1,34 @@
 Changelog
 =========
 
+## [v25.8.0] - 2024-07-29
+
+### :sparkles: New Features
+- [`e37d63a`](https://github.com/tobymao/sqlglot/commit/e37d63a17d4709135c1de7876b2898cf7bd2e641) - **bigquery**: add support for BYTEINT closes [#3838](https://github.com/tobymao/sqlglot/pull/3838) *(commit by [@georgesittas](https://github.com/georgesittas))*
+- [`4c912cd`](https://github.com/tobymao/sqlglot/commit/4c912cd2302874b8abeed3cafa93ff3771b8dcba) - **clickhouse**: improve parsing/transpilation of StrToDate *(PR [#3839](https://github.com/tobymao/sqlglot/pull/3839) by [@georgesittas](https://github.com/georgesittas))*
+  - :arrow_lower_right: *addresses issue [#3837](https://github.com/tobymao/sqlglot/issues/3837) opened by [@ace-xc](https://github.com/ace-xc)*
+- [`45f45ea`](https://github.com/tobymao/sqlglot/commit/45f45eaaac5a9130168dddaef4713542886a83cb) - **duckdb**: add support for SUMMARIZE *(PR [#3840](https://github.com/tobymao/sqlglot/pull/3840) by [@georgesittas](https://github.com/georgesittas))*
+  - :arrow_lower_right: *addresses issue [#3823](https://github.com/tobymao/sqlglot/issues/3823) opened by [@cpcloud](https://github.com/cpcloud)*
+
+### :bug: Bug Fixes
+- [`57ecc84`](https://github.com/tobymao/sqlglot/commit/57ecc8465a3c4d1e0ab1db71dc185c80efc5d0aa) - **duckdb**: wrap left IN clause json extract arrow operand fixes [#3836](https://github.com/tobymao/sqlglot/pull/3836) *(commit by [@georgesittas](https://github.com/georgesittas))*
+- [`2ffb070`](https://github.com/tobymao/sqlglot/commit/2ffb07070952cde7ac9a1883cbf9b4c477c55abb) - **duckdb**: allow fixed length array casts closes [#3841](https://github.com/tobymao/sqlglot/pull/3841) *(PR [#3842](https://github.com/tobymao/sqlglot/pull/3842) by [@tobymao](https://github.com/tobymao))*
+- [`d71eb4e`](https://github.com/tobymao/sqlglot/commit/d71eb4ebc2a0f82c567b32de51298f0d82f400a1) - pretty gen for tuples *(commit by [@tobymao](https://github.com/tobymao))*
+- [`12ae9cd`](https://github.com/tobymao/sqlglot/commit/12ae9cdc1c1f52735f8c60488b5d98a4872bf764) - **tsql**: handle JSON_QUERY with a single argument *(PR [#3847](https://github.com/tobymao/sqlglot/pull/3847) by [@georgesittas](https://github.com/georgesittas))*
+  - :arrow_lower_right: *fixes issue [#3843](https://github.com/tobymao/sqlglot/issues/3843) opened by [@zachary62](https://github.com/zachary62)*
+- [`f8ca6b4`](https://github.com/tobymao/sqlglot/commit/f8ca6b4048ee22585cd7635f83b25fe2df9bd748) - **tsql**: bubble up exp.Create CTEs to improve transpilability *(PR [#3848](https://github.com/tobymao/sqlglot/pull/3848) by [@georgesittas](https://github.com/georgesittas))*
+  - :arrow_lower_right: *fixes issue [#3844](https://github.com/tobymao/sqlglot/issues/3844) opened by [@zachary62](https://github.com/zachary62)*
+- [`89976c1`](https://github.com/tobymao/sqlglot/commit/89976c1dbb61bdfe3bbb98702b18365e90a69acb) - **parser**: allow 'cube' to be used for identifiers *(PR [#3850](https://github.com/tobymao/sqlglot/pull/3850) by [@georgesittas](https://github.com/georgesittas))*
+
+### :recycle: Refactors
+- [`d00ea9c`](https://github.com/tobymao/sqlglot/commit/d00ea9c4d39f686fabbe864e88cfe5c071fd4f66) - exclude boolean args in Generator.format_args *(PR [#3849](https://github.com/tobymao/sqlglot/pull/3849) by [@georgesittas](https://github.com/georgesittas))*
+
+## [v25.7.1] - 2024-07-25
+
+### :bug: Bug Fixes
+- [`ae95c18`](https://github.com/tobymao/sqlglot/commit/ae95c18f636d34c7f92b48cd5970f4fa6ad81b08) - alter table add columns closes [#3835](https://github.com/tobymao/sqlglot/pull/3835) *(commit by [@tobymao](https://github.com/tobymao))*
+- [`9b5839d`](https://github.com/tobymao/sqlglot/commit/9b5839d7fb04f78c9ef50b112cd9d4d24558c912) - make ast consistent *(commit by [@tobymao](https://github.com/tobymao))*
+
 ## [v25.7.0] - 2024-07-25
 
 ### :sparkles: New Features
 - [`ba0aa50`](https://github.com/tobymao/sqlglot/commit/ba0aa50072f623c299eb4d2dbb69993541fff27b) - **duckdb**: Transpile BQ's exp.DatetimeAdd, exp.DatetimeSub *(PR [#3777](https://github.com/tobymao/sqlglot/pull/3777) by [@VaggelisD](https://github.com/VaggelisD))*

@@ -4208,3 +4236,5 @@ Changelog
 [v25.6.0]: https://github.com/tobymao/sqlglot/compare/v25.5.1...v25.6.0
 [v25.6.1]: https://github.com/tobymao/sqlglot/compare/v25.6.0...v25.6.1
 [v25.7.0]: https://github.com/tobymao/sqlglot/compare/v25.6.1...v25.7.0
+[v25.7.1]: https://github.com/tobymao/sqlglot/compare/v25.7.0...v25.7.1
+[v25.8.0]: https://github.com/tobymao/sqlglot/compare/v25.7.1...v25.8.0
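Most of the entries above can be exercised straight from Python with sqlglot's public `transpile` helper. A minimal sketch for the fixed-length array cast fix (#3841/#3842); the round-tripped string is an expectation based on the DuckDB `_datatype_sql` change shown later in this diff, not an asserted output:

```python
import sqlglot

# DuckDB fixed-length array casts should now keep their size instead of
# collapsing to a plain `INT[]`.
sql = "SELECT CAST([1, 2, 3] AS INT[3])"
print(sqlglot.transpile(sql, read="duckdb", write="duckdb")[0])
# expected: SELECT CAST([1, 2, 3] AS INT[3])
```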
File diff suppressed because one or more lines are too long
@@ -76,8 +76,8 @@
 </span><span id="L-12"><a href="#L-12"><span class="linenos">12</span></a><span class="n">__version_tuple__</span><span class="p">:</span> <span class="n">VERSION_TUPLE</span>
 </span><span id="L-13"><a href="#L-13"><span class="linenos">13</span></a><span class="n">version_tuple</span><span class="p">:</span> <span class="n">VERSION_TUPLE</span>
 </span><span id="L-14"><a href="#L-14"><span class="linenos">14</span></a>
-</span><span id="L-15"><a href="#L-15"><span class="linenos">15</span></a><span class="n">__version__</span> <span class="o">=</span> <span class="n">version</span> <span class="o">=</span> <span class="s1">'25.7.0'</span>
+</span><span id="L-15"><a href="#L-15"><span class="linenos">15</span></a><span class="n">__version__</span> <span class="o">=</span> <span class="n">version</span> <span class="o">=</span> <span class="s1">'25.8.0'</span>
-</span><span id="L-16"><a href="#L-16"><span class="linenos">16</span></a><span class="n">__version_tuple__</span> <span class="o">=</span> <span class="n">version_tuple</span> <span class="o">=</span> <span class="p">(</span><span class="mi">25</span><span class="p">,</span> <span class="mi">7</span><span class="p">,</span> <span class="mi">0</span><span class="p">)</span>
+</span><span id="L-16"><a href="#L-16"><span class="linenos">16</span></a><span class="n">__version_tuple__</span> <span class="o">=</span> <span class="n">version_tuple</span> <span class="o">=</span> <span class="p">(</span><span class="mi">25</span><span class="p">,</span> <span class="mi">8</span><span class="p">,</span> <span class="mi">0</span><span class="p">)</span>
 </span></pre></div>

@@ -97,7 +97,7 @@
 <section id="version">
 <div class="attr variable">
 <span class="name">version</span><span class="annotation">: str</span> =
-<span class="default_value">'25.7.0'</span>
+<span class="default_value">'25.8.0'</span>
 </div>

@@ -109,7 +109,7 @@
 <section id="version_tuple">
 <div class="attr variable">
 <span class="name">version_tuple</span><span class="annotation">: object</span> =
-<span class="default_value">(25, 7, 0)</span>
+<span class="default_value">(25, 8, 0)</span>
 </div>
16 file diffs suppressed because one or more lines are too long
@@ -950,7 +950,6 @@ Default: True</li>
 <dd id="RisingWave.Generator.bitwiserightshift_sql" class="function"><a href="../generator.html#Generator.bitwiserightshift_sql">bitwiserightshift_sql</a></dd>
 <dd id="RisingWave.Generator.bitwisexor_sql" class="function"><a href="../generator.html#Generator.bitwisexor_sql">bitwisexor_sql</a></dd>
 <dd id="RisingWave.Generator.currentdate_sql" class="function"><a href="../generator.html#Generator.currentdate_sql">currentdate_sql</a></dd>
-<dd id="RisingWave.Generator.currenttimestamp_sql" class="function"><a href="../generator.html#Generator.currenttimestamp_sql">currenttimestamp_sql</a></dd>
 <dd id="RisingWave.Generator.collate_sql" class="function"><a href="../generator.html#Generator.collate_sql">collate_sql</a></dd>
 <dd id="RisingWave.Generator.command_sql" class="function"><a href="../generator.html#Generator.command_sql">command_sql</a></dd>
 <dd id="RisingWave.Generator.comment_sql" class="function"><a href="../generator.html#Generator.comment_sql">comment_sql</a></dd>

@@ -1047,7 +1046,6 @@ Default: True</li>
 <dd id="RisingWave.Generator.lastday_sql" class="function"><a href="../generator.html#Generator.lastday_sql">lastday_sql</a></dd>
 <dd id="RisingWave.Generator.dateadd_sql" class="function"><a href="../generator.html#Generator.dateadd_sql">dateadd_sql</a></dd>
 <dd id="RisingWave.Generator.arrayany_sql" class="function"><a href="../generator.html#Generator.arrayany_sql">arrayany_sql</a></dd>
-<dd id="RisingWave.Generator.generateseries_sql" class="function"><a href="../generator.html#Generator.generateseries_sql">generateseries_sql</a></dd>
 <dd id="RisingWave.Generator.struct_sql" class="function"><a href="../generator.html#Generator.struct_sql">struct_sql</a></dd>
 <dd id="RisingWave.Generator.partitionrange_sql" class="function"><a href="../generator.html#Generator.partitionrange_sql">partitionrange_sql</a></dd>
 <dd id="RisingWave.Generator.truncatetable_sql" class="function"><a href="../generator.html#Generator.truncatetable_sql">truncatetable_sql</a></dd>

@@ -1062,12 +1060,10 @@ Default: True</li>
 <dd id="RisingWave.Generator.scope_resolution" class="function"><a href="../generator.html#Generator.scope_resolution">scope_resolution</a></dd>
 <dd id="RisingWave.Generator.scoperesolution_sql" class="function"><a href="../generator.html#Generator.scoperesolution_sql">scoperesolution_sql</a></dd>
 <dd id="RisingWave.Generator.parsejson_sql" class="function"><a href="../generator.html#Generator.parsejson_sql">parsejson_sql</a></dd>
-<dd id="RisingWave.Generator.length_sql" class="function"><a href="../generator.html#Generator.length_sql">length_sql</a></dd>
 <dd id="RisingWave.Generator.rand_sql" class="function"><a href="../generator.html#Generator.rand_sql">rand_sql</a></dd>
-<dd id="RisingWave.Generator.strtodate_sql" class="function"><a href="../generator.html#Generator.strtodate_sql">strtodate_sql</a></dd>
-<dd id="RisingWave.Generator.strtotime_sql" class="function"><a href="../generator.html#Generator.strtotime_sql">strtotime_sql</a></dd>
 <dd id="RisingWave.Generator.changes_sql" class="function"><a href="../generator.html#Generator.changes_sql">changes_sql</a></dd>
 <dd id="RisingWave.Generator.pad_sql" class="function"><a href="../generator.html#Generator.pad_sql">pad_sql</a></dd>
+<dd id="RisingWave.Generator.summarize_sql" class="function"><a href="../generator.html#Generator.summarize_sql">summarize_sql</a></dd>
 </div>
 <div><dt><a href="postgres.html#Postgres.Generator">sqlglot.dialects.postgres.Postgres.Generator</a></dt>
9 file diffs suppressed because one or more lines are too long
@@ -2618,7 +2618,6 @@ Default: True</li>
 <dd id="Python.Generator.bitwisexor_sql" class="function"><a href="../generator.html#Generator.bitwisexor_sql">bitwisexor_sql</a></dd>
 <dd id="Python.Generator.cast_sql" class="function"><a href="../generator.html#Generator.cast_sql">cast_sql</a></dd>
 <dd id="Python.Generator.currentdate_sql" class="function"><a href="../generator.html#Generator.currentdate_sql">currentdate_sql</a></dd>
-<dd id="Python.Generator.currenttimestamp_sql" class="function"><a href="../generator.html#Generator.currenttimestamp_sql">currenttimestamp_sql</a></dd>
 <dd id="Python.Generator.collate_sql" class="function"><a href="../generator.html#Generator.collate_sql">collate_sql</a></dd>
 <dd id="Python.Generator.command_sql" class="function"><a href="../generator.html#Generator.command_sql">command_sql</a></dd>
 <dd id="Python.Generator.comment_sql" class="function"><a href="../generator.html#Generator.comment_sql">comment_sql</a></dd>

@@ -2716,7 +2715,6 @@ Default: True</li>
 <dd id="Python.Generator.lastday_sql" class="function"><a href="../generator.html#Generator.lastday_sql">lastday_sql</a></dd>
 <dd id="Python.Generator.dateadd_sql" class="function"><a href="../generator.html#Generator.dateadd_sql">dateadd_sql</a></dd>
 <dd id="Python.Generator.arrayany_sql" class="function"><a href="../generator.html#Generator.arrayany_sql">arrayany_sql</a></dd>
-<dd id="Python.Generator.generateseries_sql" class="function"><a href="../generator.html#Generator.generateseries_sql">generateseries_sql</a></dd>
 <dd id="Python.Generator.struct_sql" class="function"><a href="../generator.html#Generator.struct_sql">struct_sql</a></dd>
 <dd id="Python.Generator.partitionrange_sql" class="function"><a href="../generator.html#Generator.partitionrange_sql">partitionrange_sql</a></dd>
 <dd id="Python.Generator.truncatetable_sql" class="function"><a href="../generator.html#Generator.truncatetable_sql">truncatetable_sql</a></dd>

@@ -2731,12 +2729,10 @@ Default: True</li>
 <dd id="Python.Generator.scope_resolution" class="function"><a href="../generator.html#Generator.scope_resolution">scope_resolution</a></dd>
 <dd id="Python.Generator.scoperesolution_sql" class="function"><a href="../generator.html#Generator.scoperesolution_sql">scoperesolution_sql</a></dd>
 <dd id="Python.Generator.parsejson_sql" class="function"><a href="../generator.html#Generator.parsejson_sql">parsejson_sql</a></dd>
-<dd id="Python.Generator.length_sql" class="function"><a href="../generator.html#Generator.length_sql">length_sql</a></dd>
 <dd id="Python.Generator.rand_sql" class="function"><a href="../generator.html#Generator.rand_sql">rand_sql</a></dd>
-<dd id="Python.Generator.strtodate_sql" class="function"><a href="../generator.html#Generator.strtodate_sql">strtodate_sql</a></dd>
-<dd id="Python.Generator.strtotime_sql" class="function"><a href="../generator.html#Generator.strtotime_sql">strtotime_sql</a></dd>
 <dd id="Python.Generator.changes_sql" class="function"><a href="../generator.html#Generator.changes_sql">changes_sql</a></dd>
 <dd id="Python.Generator.pad_sql" class="function"><a href="../generator.html#Generator.pad_sql">pad_sql</a></dd>
+<dd id="Python.Generator.summarize_sql" class="function"><a href="../generator.html#Generator.summarize_sql">summarize_sql</a></dd>
 </div>
 </dl>
File diff suppressed because it is too large
File diff suppressed because one or more lines are too long
File diff suppressed because it is too large
@@ -585,7 +585,7 @@
 <div class="attr variable">
 <span class="name">ALL_JSON_PATH_PARTS</span> =
 <input id="ALL_JSON_PATH_PARTS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="ALL_JSON_PATH_PARTS-view-value"></label><span class="default_value">{<class '<a href="expressions.html#JSONPathScript">sqlglot.expressions.JSONPathScript</a>'>, <class '<a href="expressions.html#JSONPathRoot">sqlglot.expressions.JSONPathRoot</a>'>, <class '<a href="expressions.html#JSONPathRecursive">sqlglot.expressions.JSONPathRecursive</a>'>, <class '<a href="expressions.html#JSONPathKey">sqlglot.expressions.JSONPathKey</a>'>, <class '<a href="expressions.html#JSONPathWildcard">sqlglot.expressions.JSONPathWildcard</a>'>, <class '<a href="expressions.html#JSONPathFilter">sqlglot.expressions.JSONPathFilter</a>'>, <class '<a href="expressions.html#JSONPathUnion">sqlglot.expressions.JSONPathUnion</a>'>, <class '<a href="expressions.html#JSONPathSubscript">sqlglot.expressions.JSONPathSubscript</a>'>, <class '<a href="expressions.html#JSONPathSelector">sqlglot.expressions.JSONPathSelector</a>'>, <class '<a href="expressions.html#JSONPathSlice">sqlglot.expressions.JSONPathSlice</a>'>}</span>
+<label class="view-value-button pdoc-button" for="ALL_JSON_PATH_PARTS-view-value"></label><span class="default_value">{<class '<a href="expressions.html#JSONPathUnion">sqlglot.expressions.JSONPathUnion</a>'>, <class '<a href="expressions.html#JSONPathSubscript">sqlglot.expressions.JSONPathSubscript</a>'>, <class '<a href="expressions.html#JSONPathSelector">sqlglot.expressions.JSONPathSelector</a>'>, <class '<a href="expressions.html#JSONPathSlice">sqlglot.expressions.JSONPathSlice</a>'>, <class '<a href="expressions.html#JSONPathScript">sqlglot.expressions.JSONPathScript</a>'>, <class '<a href="expressions.html#JSONPathWildcard">sqlglot.expressions.JSONPathWildcard</a>'>, <class '<a href="expressions.html#JSONPathRoot">sqlglot.expressions.JSONPathRoot</a>'>, <class '<a href="expressions.html#JSONPathRecursive">sqlglot.expressions.JSONPathRecursive</a>'>, <class '<a href="expressions.html#JSONPathKey">sqlglot.expressions.JSONPathKey</a>'>, <class '<a href="expressions.html#JSONPathFilter">sqlglot.expressions.JSONPathFilter</a>'>}</span>
 </div>
File diff suppressed because one or more lines are too long
@@ -586,7 +586,7 @@ queries if it would result in multiple table selects in a single query:</p>
 <div class="attr variable">
 <span class="name">UNMERGABLE_ARGS</span> =
 <input id="UNMERGABLE_ARGS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="UNMERGABLE_ARGS-view-value"></label><span class="default_value">{'options', 'qualify', 'sort', 'format', 'sample', 'with', 'laterals', 'distinct', 'kind', 'locks', 'cluster', 'distribute', 'limit', 'prewhere', 'into', 'match', 'pivots', 'group', 'having', 'offset', 'windows', 'connect', 'settings'}</span>
+<label class="view-value-button pdoc-button" for="UNMERGABLE_ARGS-view-value"></label><span class="default_value">{'limit', 'with', 'into', 'match', 'pivots', 'group', 'locks', 'prewhere', 'kind', 'connect', 'having', 'settings', 'windows', 'options', 'sort', 'laterals', 'qualify', 'format', 'cluster', 'distinct', 'offset', 'sample', 'distribute'}</span>
 </div>

@@ -3209,7 +3209,7 @@ prefix are statically known.</p>
 <div class="attr variable">
 <span class="name">DATETRUNC_COMPARISONS</span> =
 <input id="DATETRUNC_COMPARISONS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="DATETRUNC_COMPARISONS-view-value"></label><span class="default_value">{<class '<a href="../expressions.html#LT">sqlglot.expressions.LT</a>'>, <class '<a href="../expressions.html#NEQ">sqlglot.expressions.NEQ</a>'>, <class '<a href="../expressions.html#EQ">sqlglot.expressions.EQ</a>'>, <class '<a href="../expressions.html#In">sqlglot.expressions.In</a>'>, <class '<a href="../expressions.html#LTE">sqlglot.expressions.LTE</a>'>, <class '<a href="../expressions.html#GTE">sqlglot.expressions.GTE</a>'>, <class '<a href="../expressions.html#GT">sqlglot.expressions.GT</a>'>}</span>
+<label class="view-value-button pdoc-button" for="DATETRUNC_COMPARISONS-view-value"></label><span class="default_value">{<class '<a href="../expressions.html#GT">sqlglot.expressions.GT</a>'>, <class '<a href="../expressions.html#EQ">sqlglot.expressions.EQ</a>'>, <class '<a href="../expressions.html#LT">sqlglot.expressions.LT</a>'>, <class '<a href="../expressions.html#NEQ">sqlglot.expressions.NEQ</a>'>, <class '<a href="../expressions.html#In">sqlglot.expressions.In</a>'>, <class '<a href="../expressions.html#GTE">sqlglot.expressions.GTE</a>'>, <class '<a href="../expressions.html#LTE">sqlglot.expressions.LTE</a>'>}</span>
 </div>

@@ -3289,7 +3289,7 @@ prefix are statically known.</p>
 <section id="JOINS">
 <div class="attr variable">
 <span class="name">JOINS</span> =
-<span class="default_value">{('', 'INNER'), ('RIGHT', ''), ('RIGHT', 'OUTER'), ('', '')}</span>
+<span class="default_value">{('RIGHT', ''), ('', 'INNER'), ('RIGHT', 'OUTER'), ('', '')}</span>
 </div>
docs/sqlglot/parser.html (26084 changed lines)
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because it is too large
@@ -322,6 +322,7 @@ class BigQuery(Dialect):
             "ANY TYPE": TokenType.VARIANT,
             "BEGIN": TokenType.COMMAND,
             "BEGIN TRANSACTION": TokenType.BEGIN,
+            "BYTEINT": TokenType.INT,
             "BYTES": TokenType.BINARY,
             "CURRENT_DATETIME": TokenType.CURRENT_DATETIME,
             "DATETIME": TokenType.TIMESTAMP,
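The new keyword simply maps BYTEINT onto `TokenType.INT`, so the type is normalized to BigQuery's canonical INT64 on the way back out; this mirrors the `test_bigquery.py` identity added at the end of this diff:

```python
import sqlglot

# BYTEINT is tokenized as an integer type and rendered back as INT64 for BigQuery.
print(sqlglot.transpile("SELECT CAST(1 AS BYTEINT)", read="bigquery", write="bigquery")[0])
# SELECT CAST(1 AS INT64)
```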
@@ -81,6 +81,14 @@ def _build_count_if(args: t.List) -> exp.CountIf | exp.CombinedAggFunc:
     return exp.CombinedAggFunc(this="countIf", expressions=args, parts=("count", "If"))
 
 
+def _build_str_to_date(args: t.List) -> exp.Cast | exp.Anonymous:
+    if len(args) == 3:
+        return exp.Anonymous(this="STR_TO_DATE", expressions=args)
+
+    strtodate = exp.StrToDate.from_arg_list(args)
+    return exp.cast(strtodate, exp.DataType.build(exp.DataType.Type.DATETIME))
+
+
 def _datetime_delta_sql(name: str) -> t.Callable[[Generator, DATEΤΙΜΕ_DELTA], str]:
     def _delta_sql(self: Generator, expression: DATEΤΙΜΕ_DELTA) -> str:
         if not expression.unit:

@@ -181,6 +189,7 @@ class ClickHouse(Dialect):
             "MAP": parser.build_var_map,
             "MATCH": exp.RegexpLike.from_arg_list,
             "RANDCANONICAL": exp.Rand.from_arg_list,
+            "STR_TO_DATE": _build_str_to_date,
             "TUPLE": exp.Struct.from_arg_list,
             "TIMESTAMP_SUB": build_date_delta(exp.TimestampSub, default_unit=None),
             "TIMESTAMPSUB": build_date_delta(exp.TimestampSub, default_unit=None),

@@ -836,6 +845,24 @@ class ClickHouse(Dialect):
             "NAMED COLLECTION",
         }
 
+        def strtodate_sql(self, expression: exp.StrToDate) -> str:
+            strtodate_sql = self.function_fallback_sql(expression)
+
+            if not isinstance(expression.parent, exp.Cast):
+                # StrToDate returns DATEs in other dialects (eg. postgres), so
+                # this branch aims to improve the transpilation to clickhouse
+                return f"CAST({strtodate_sql} AS DATE)"
+
+            return strtodate_sql
+
+        def cast_sql(self, expression: exp.Cast, safe_prefix: t.Optional[str] = None) -> str:
+            this = expression.this
+
+            if isinstance(this, exp.StrToDate) and expression.to == exp.DataType.build("datetime"):
+                return self.sql(this)
+
+            return super().cast_sql(expression, safe_prefix=safe_prefix)
+
         def _jsonpathsubscript_sql(self, expression: exp.JSONPathSubscript) -> str:
             this = self.json_path_part(expression.this)
             return str(int(this) + 1) if is_int(this) else this
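Taken together, `_build_str_to_date`, `strtodate_sql` and `cast_sql` make two-argument STR_TO_DATE calls round-trip through an internal `CAST(StrToDate AS DATETIME)` node, while three-argument calls stay opaque. The two identities below are the ones asserted by the new `test_clickhouse.py` lines at the bottom of this diff:

```python
import sqlglot

for sql in (
    "SELECT STR_TO_DATE(str, fmt, tz)",              # 3 args: kept as an anonymous function
    "SELECT STR_TO_DATE('05 12 2000', '%d %m %Y')",  # 2 args: parsed as CAST(StrToDate AS DATETIME)
):
    assert sqlglot.transpile(sql, read="clickhouse", write="clickhouse")[0] == sql
```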
@@ -158,7 +158,7 @@ def _struct_sql(self: DuckDB.Generator, expression: exp.Struct) -> str:
 
 def _datatype_sql(self: DuckDB.Generator, expression: exp.DataType) -> str:
     if expression.is_type("array"):
-        return f"{self.expressions(expression, flat=True)}[]"
+        return f"{self.expressions(expression, flat=True)}[{self.expressions(expression, key='values', flat=True)}]"
 
     # Type TIMESTAMP / TIME WITH TIME ZONE does not support any modifiers
     if expression.is_type("timestamptz", "timetz"):

@@ -186,9 +186,14 @@ def _unix_to_time_sql(self: DuckDB.Generator, expression: exp.UnixToTime) -> str
     return self.func("TO_TIMESTAMP", exp.Div(this=timestamp, expression=exp.func("POW", 10, scale)))
 
 
+WRAPPED_JSON_EXTRACT_EXPRESSIONS = (exp.Binary, exp.Bracket, exp.In)
+
+
 def _arrow_json_extract_sql(self: DuckDB.Generator, expression: JSON_EXTRACT_TYPE) -> str:
     arrow_sql = arrow_json_extract_sql(self, expression)
-    if not expression.same_parent and isinstance(expression.parent, (exp.Binary, exp.Bracket)):
+    if not expression.same_parent and isinstance(
+        expression.parent, WRAPPED_JSON_EXTRACT_EXPRESSIONS
+    ):
         arrow_sql = self.wrap(arrow_sql)
     return arrow_sql

@@ -238,14 +243,15 @@ class DuckDB(Dialect):
             "POSITIONAL": TokenType.POSITIONAL,
             "SIGNED": TokenType.INT,
             "STRING": TokenType.TEXT,
-            "UBIGINT": TokenType.UBIGINT,
-            "UINTEGER": TokenType.UINT,
-            "USMALLINT": TokenType.USMALLINT,
-            "UTINYINT": TokenType.UTINYINT,
+            "SUMMARIZE": TokenType.SUMMARIZE,
             "TIMESTAMP_S": TokenType.TIMESTAMP_S,
             "TIMESTAMP_MS": TokenType.TIMESTAMP_MS,
             "TIMESTAMP_NS": TokenType.TIMESTAMP_NS,
             "TIMESTAMP_US": TokenType.TIMESTAMP,
+            "UBIGINT": TokenType.UBIGINT,
+            "UINTEGER": TokenType.UINT,
+            "USMALLINT": TokenType.USMALLINT,
+            "UTINYINT": TokenType.UTINYINT,
             "VARCHAR": TokenType.TEXT,
         }
         KEYWORDS.pop("/*+")

@@ -744,10 +750,9 @@ class DuckDB(Dialect):
         def generateseries_sql(self, expression: exp.GenerateSeries) -> str:
             # GENERATE_SERIES(a, b) -> [a, b], RANGE(a, b) -> [a, b)
             if expression.args.get("is_end_exclusive"):
-                expression.set("is_end_exclusive", None)
                 return rename_func("RANGE")(self, expression)
 
-            return super().generateseries_sql(expression)
+            return self.function_fallback_sql(expression)
 
         def bracket_sql(self, expression: exp.Bracket) -> str:
             this = expression.this
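With SUMMARIZE now tokenized, DuckDB's profiling statement parses into the new `exp.Summarize` node (added further down in `expressions.py`). A small sketch; the identity round-trip is an assumption based on the parser and generator changes in this commit rather than a test shown here:

```python
import sqlglot

node = sqlglot.parse_one("SUMMARIZE tbl", read="duckdb")
print(type(node).__name__)         # Summarize
print(node.sql(dialect="duckdb"))  # SUMMARIZE tbl
```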
@@ -168,12 +168,9 @@ def _serial_to_generated(expression: exp.Expression) -> exp.Expression:
 
 
 def _build_generate_series(args: t.List) -> exp.GenerateSeries:
     # The goal is to convert step values like '1 day' or INTERVAL '1 day' into INTERVAL '1' day
+    # Note: postgres allows calls with just two arguments -- the "step" argument defaults to 1
     step = seq_get(args, 2)
-    if step is None:
-        # Postgres allows calls with just two arguments -- the "step" argument defaults to 1
-        return exp.GenerateSeries.from_arg_list(args)
-
+    if step is not None:
     if step.is_string:
         args[2] = exp.to_interval(step.this)
     elif isinstance(step, exp.Interval) and not step.args.get("unit"):
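The reshuffled helper keeps the documented Postgres behaviour that the step argument is optional and defaults to 1. A quick, non-asserted way to inspect what the two-argument form parses into:

```python
import sqlglot
from sqlglot import exp

node = sqlglot.parse_one("SELECT * FROM GENERATE_SERIES(1, 5)", read="postgres")
print(repr(node.find(exp.GenerateSeries)))  # two-argument form, no explicit step
```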
@@ -393,9 +393,6 @@ class Presto(Dialect):
         TRANSFORMS = {
             **generator.Generator.TRANSFORMS,
             exp.AnyValue: rename_func("ARBITRARY"),
-            exp.ApproxDistinct: lambda self, e: self.func(
-                "APPROX_DISTINCT", e.this, e.args.get("accuracy")
-            ),
             exp.ApproxQuantile: rename_func("APPROX_PERCENTILE"),
             exp.ArgMax: rename_func("MAX_BY"),
             exp.ArgMin: rename_func("MIN_BY"),
@@ -223,7 +223,7 @@ class SQLite(Dialect):
                 exp.select(exp.alias_("value", column_alias)).from_(expression).subquery()
             )
         else:
-            sql = super().generateseries_sql(expression)
+            sql = self.function_fallback_sql(expression)
 
         return sql
@@ -322,6 +322,15 @@ def _build_with_arg_as_text(
     return _parse
 
 
+def _build_json_query(args: t.List, dialect: Dialect) -> exp.JSONExtract:
+    if len(args) == 1:
+        # The default value for path is '$'. As a result, if you don't provide a
+        # value for path, JSON_QUERY returns the input expression.
+        args.append(exp.Literal.string("$"))
+
+    return parser.build_extract_json_with_path(exp.JSONExtract)(args, dialect)
+
+
 def _json_extract_sql(
     self: TSQL.Generator, expression: exp.JSONExtract | exp.JSONExtractScalar
 ) -> str:

@@ -510,7 +519,7 @@ class TSQL(Dialect):
             "GETDATE": exp.CurrentTimestamp.from_arg_list,
             "HASHBYTES": _build_hashbytes,
             "ISNULL": exp.Coalesce.from_arg_list,
-            "JSON_QUERY": parser.build_extract_json_with_path(exp.JSONExtract),
+            "JSON_QUERY": _build_json_query,
             "JSON_VALUE": parser.build_extract_json_with_path(exp.JSONExtractScalar),
             "LEN": _build_with_arg_as_text(exp.Length),
             "LEFT": _build_with_arg_as_text(exp.Left),

@@ -790,6 +799,7 @@ class TSQL(Dialect):
         PARSE_JSON_NAME = None
 
         EXPRESSIONS_WITHOUT_NESTED_CTES = {
+            exp.Create,
             exp.Delete,
             exp.Insert,
             exp.Intersect,

@@ -989,31 +999,32 @@ class TSQL(Dialect):
             kind = expression.kind
             exists = expression.args.pop("exists", None)
 
-            if kind == "VIEW":
-                expression.this.set("catalog", None)
-
-            sql = super().create_sql(expression)
-
             like_property = expression.find(exp.LikeProperty)
             if like_property:
                 ctas_expression = like_property.this
             else:
                 ctas_expression = expression.expression
 
+            if kind == "VIEW":
+                expression.this.set("catalog", None)
+                with_ = expression.args.get("with")
+                if ctas_expression and with_:
+                    # We've already preprocessed the Create expression to bubble up any nested CTEs,
+                    # but CREATE VIEW actually requires the WITH clause to come after it so we need
+                    # to amend the AST by moving the CTEs to the CREATE VIEW statement's query.
+                    ctas_expression.set("with", with_.pop())
+
+            sql = super().create_sql(expression)
+
             table = expression.find(exp.Table)
 
             # Convert CTAS statement to SELECT .. INTO ..
             if kind == "TABLE" and ctas_expression:
-                ctas_with = ctas_expression.args.get("with")
-                if ctas_with:
-                    ctas_with = ctas_with.pop()
-
                 if isinstance(ctas_expression, exp.UNWRAPPED_QUERIES):
                     ctas_expression = ctas_expression.subquery()
 
                 select_into = exp.select("*").from_(exp.alias_(ctas_expression, "temp", table=True))
                 select_into.set("into", exp.Into(this=table))
-                select_into.set("with", ctas_with)
 
                 if like_property:
                     select_into.limit(0, copy=False)
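A sketch of the single-argument JSON_QUERY handling added above: `_build_json_query` fills in the documented default path `'$'` before delegating to the generic JSON-extract builder, so the call still returns the input expression. The rendered output below is an assumption, not a test taken from this diff:

```python
import sqlglot

parsed = sqlglot.parse_one("SELECT JSON_QUERY(col)", read="tsql")
print(parsed.sql(dialect="tsql"))
# assumption: the default path is made explicit, e.g. SELECT JSON_QUERY(col, '$')
```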
@@ -1439,6 +1439,11 @@ class Describe(Expression):
     arg_types = {"this": True, "style": False, "kind": False, "expressions": False}
 
 
+# https://duckdb.org/docs/guides/meta/summarize.html
+class Summarize(Expression):
+    arg_types = {"this": True, "table": False}
+
+
 class Kill(Expression):
     arg_types = {"this": True, "kind": False}
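The node can also be built programmatically; rendering relies on the `summarize_sql` method added to the base Generator later in this diff:

```python
from sqlglot import exp

summarize = exp.Summarize(this=exp.to_table("tbl"), table=True)
print(summarize.sql(dialect="duckdb"))  # SUMMARIZE TABLE tbl
```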
@@ -105,12 +105,6 @@ class Generator(metaclass=_Generator):
         exp.InlineLengthColumnConstraint: lambda self, e: f"INLINE LENGTH {self.sql(e, 'this')}",
         exp.InputModelProperty: lambda self, e: f"INPUT{self.sql(e, 'this')}",
         exp.IntervalSpan: lambda self, e: f"{self.sql(e, 'this')} TO {self.sql(e, 'expression')}",
-        exp.JSONExtract: lambda self, e: self.func(
-            "JSON_EXTRACT", e.this, e.expression, *e.expressions
-        ),
-        exp.JSONExtractScalar: lambda self, e: self.func(
-            "JSON_EXTRACT_SCALAR", e.this, e.expression, *e.expressions
-        ),
         exp.LanguageProperty: lambda self, e: self.naked_property(e),
         exp.LocationProperty: lambda self, e: self.naked_property(e),
         exp.LogProperty: lambda _, e: f"{'NO ' if e.args.get('no') else ''}LOG",

@@ -146,7 +140,6 @@ class Generator(metaclass=_Generator):
         exp.TemporaryProperty: lambda *_: "TEMPORARY",
         exp.TagColumnConstraint: lambda self, e: f"TAG ({self.expressions(e, flat=True)})",
         exp.TitleColumnConstraint: lambda self, e: f"TITLE {self.sql(e, 'this')}",
-        exp.Timestamp: lambda self, e: self.func("TIMESTAMP", e.this, e.args.get("zone")),
         exp.ToMap: lambda self, e: f"MAP {self.sql(e, 'this')}",
         exp.ToTableProperty: lambda self, e: f"TO {self.sql(e.this)}",
         exp.TransformModelProperty: lambda self, e: self.func("TRANSFORM", *e.expressions),

@@ -1846,7 +1839,7 @@ class Generator(metaclass=_Generator):
         return f"{this} {kind} {expr}"
 
     def tuple_sql(self, expression: exp.Tuple) -> str:
-        return f"({self.expressions(expression, flat=True)})"
+        return f"({self.expressions(expression, dynamic=True, new_line=True, skip_first=True, skip_last=True)})"
 
     def update_sql(self, expression: exp.Update) -> str:
         this = self.sql(expression, "this")

@@ -2994,9 +2987,6 @@ class Generator(metaclass=_Generator):
         zone = self.sql(expression, "this")
         return f"CURRENT_DATE({zone})" if zone else "CURRENT_DATE"
 
-    def currenttimestamp_sql(self, expression: exp.CurrentTimestamp) -> str:
-        return self.func("CURRENT_TIMESTAMP", expression.this)
-
     def collate_sql(self, expression: exp.Collate) -> str:
         if self.COLLATE_IS_FUNC:
             return self.function_fallback_sql(expression)

@@ -3354,7 +3344,9 @@ class Generator(metaclass=_Generator):
         return f"{self.normalize_func(name)}{prefix}{self.format_args(*args)}{suffix}"
 
     def format_args(self, *args: t.Optional[str | exp.Expression]) -> str:
-        arg_sqls = tuple(self.sql(arg) for arg in args if arg is not None)
+        arg_sqls = tuple(
+            self.sql(arg) for arg in args if arg is not None and not isinstance(arg, bool)
+        )
         if self.pretty and self.too_wide(arg_sqls):
             return self.indent("\n" + ",\n".join(arg_sqls) + "\n", skip_first=True, skip_last=True)
         return ", ".join(arg_sqls)

@@ -3397,12 +3389,8 @@ class Generator(metaclass=_Generator):
             return sep.join(sql for sql in (self.sql(e) for e in expressions) if sql)
 
         num_sqls = len(expressions)
 
-        # These are calculated once in case we have the leading_comma / pretty option set, correspondingly
-        if self.pretty and not self.leading_comma:
-            stripped_sep = sep.strip()
-
         result_sqls = []
         for i, e in enumerate(expressions):
             sql = self.sql(e, comment=False)
             if not sql:

@@ -3415,7 +3403,7 @@ class Generator(metaclass=_Generator):
                     result_sqls.append(f"{sep if i > 0 else ''}{prefix}{sql}{comments}")
                 else:
                     result_sqls.append(
-                        f"{prefix}{sql}{stripped_sep if i + 1 < num_sqls else ''}{comments}"
+                        f"{prefix}{sql}{(sep.rstrip() if comments else sep) if i + 1 < num_sqls else ''}{comments}"
                     )
             else:
                 result_sqls.append(f"{prefix}{sql}{comments}{sep if i + 1 < num_sqls else ''}")

@@ -3424,7 +3412,7 @@ class Generator(metaclass=_Generator):
         if new_line:
             result_sqls.insert(0, "")
             result_sqls.append("")
-            result_sql = "\n".join(result_sqls)
+            result_sql = "\n".join(s.rstrip() for s in result_sqls)
         else:
             result_sql = "".join(result_sqls)
         return (

@@ -3761,10 +3749,6 @@ class Generator(metaclass=_Generator):
 
         return self.function_fallback_sql(expression)
 
-    def generateseries_sql(self, expression: exp.GenerateSeries) -> str:
-        expression.set("is_end_exclusive", None)
-        return self.function_fallback_sql(expression)
-
     def struct_sql(self, expression: exp.Struct) -> str:
         expression.set(
             "expressions",

@@ -4027,9 +4011,6 @@ class Generator(metaclass=_Generator):
 
         return self.func(self.PARSE_JSON_NAME, expression.this, expression.expression)
 
-    def length_sql(self, expression: exp.Length) -> str:
-        return self.func("LENGTH", expression.this)
-
     def rand_sql(self, expression: exp.Rand) -> str:
         lower = self.sql(expression, "lower")
         upper = self.sql(expression, "upper")

@@ -4038,17 +4019,6 @@ class Generator(metaclass=_Generator):
             return f"({upper} - {lower}) * {self.func('RAND', expression.this)} + {lower}"
         return self.func("RAND", expression.this)
 
-    def strtodate_sql(self, expression: exp.StrToDate) -> str:
-        return self.func("STR_TO_DATE", expression.this, expression.args.get("format"))
-
-    def strtotime_sql(self, expression: exp.StrToTime) -> str:
-        return self.func(
-            "STR_TO_TIME",
-            expression.this,
-            expression.args.get("format"),
-            expression.args.get("zone"),
-        )
-
     def changes_sql(self, expression: exp.Changes) -> str:
         information = self.sql(expression, "information")
         information = f"INFORMATION => {information}"

@@ -4067,3 +4037,7 @@ class Generator(metaclass=_Generator):
             fill_pattern = "' '"
 
         return self.func(f"{prefix}PAD", expression.this, expression.expression, fill_pattern)
+
+    def summarize_sql(self, expression: exp.Summarize) -> str:
+        table = " TABLE" if expression.args.get("table") else ""
+        return f"SUMMARIZE{table} {self.sql(expression.this)}"
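Two of the changes above are visible from the outside: `format_args` now drops boolean arguments that builders occasionally pass through, and `tuple_sql` routes tuples through the dynamic `expressions()` path so oversized tuples wrap under `pretty=True`. A sketch of the pretty behaviour; the exact wrapping threshold and layout are internal details, so the output shape is an assumption:

```python
import sqlglot

# A tuple wide enough to exceed the pretty-printer's line width.
values = ", ".join(str(i) for i in range(60))
sql = f"SELECT * FROM t WHERE (a, b) IN (({values}), ({values}))"
print(sqlglot.transpile(sql, pretty=True)[0])
# assumption: each long tuple is now broken across indented lines instead of
# being emitted as one very long row.
```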
@@ -179,8 +179,9 @@ def apply_index_offset(
 
         if not expression.type:
             annotate_types(expression)
 
         if t.cast(exp.DataType, expression.type).this in exp.DataType.INTEGER_TYPES:
-            logger.warning("Applying array index offset (%s)", offset)
+            logger.info("Applying array index offset (%s)", offset)
             expression = simplify(expression + offset)
             return [expression]
@ -393,6 +393,7 @@ class Parser(metaclass=_Parser):
|
||||||
TokenType.COMMIT,
|
TokenType.COMMIT,
|
||||||
TokenType.CONSTRAINT,
|
TokenType.CONSTRAINT,
|
||||||
TokenType.COPY,
|
TokenType.COPY,
|
||||||
|
TokenType.CUBE,
|
||||||
TokenType.DEFAULT,
|
TokenType.DEFAULT,
|
||||||
TokenType.DELETE,
|
TokenType.DELETE,
|
||||||
TokenType.DESC,
|
TokenType.DESC,
|
||||||
|
@ -673,7 +674,7 @@ class Parser(metaclass=_Parser):
|
||||||
exp.Cluster: lambda self: self._parse_sort(exp.Cluster, TokenType.CLUSTER_BY),
|
exp.Cluster: lambda self: self._parse_sort(exp.Cluster, TokenType.CLUSTER_BY),
|
||||||
exp.Column: lambda self: self._parse_column(),
|
exp.Column: lambda self: self._parse_column(),
|
||||||
exp.Condition: lambda self: self._parse_assignment(),
|
exp.Condition: lambda self: self._parse_assignment(),
|
||||||
exp.DataType: lambda self: self._parse_types(allow_identifiers=False),
|
exp.DataType: lambda self: self._parse_types(allow_identifiers=False, schema=True),
|
||||||
exp.Expression: lambda self: self._parse_expression(),
|
exp.Expression: lambda self: self._parse_expression(),
|
||||||
exp.From: lambda self: self._parse_from(joins=True),
|
exp.From: lambda self: self._parse_from(joins=True),
|
||||||
exp.Group: lambda self: self._parse_group(),
|
exp.Group: lambda self: self._parse_group(),
|
||||||
|
@ -2825,12 +2826,14 @@ class Parser(metaclass=_Parser):
            this = self._parse_derived_table_values()
        elif from_:
            this = exp.select("*").from_(from_.this, copy=False)
+        elif self._match(TokenType.SUMMARIZE):
+            table = self._match(TokenType.TABLE)
+            this = self._parse_select() or self._parse_string() or self._parse_table()
+            return self.expression(exp.Summarize, this=this, table=table)
        else:
            this = None

-        if parse_set_operation:
-            return self._parse_set_operations(this)
-
-        return this
+        return self._parse_set_operations(this) if parse_set_operation else this

    def _parse_with(self, skip_with_token: bool = False) -> t.Optional[exp.With]:
        if not skip_with_token and not self._match(TokenType.WITH):
@ -3825,7 +3828,7 @@ class Parser(metaclass=_Parser):
        while True:
            expressions = self._parse_csv(
                lambda: None
-                if self._match(TokenType.ROLLUP, advance=False)
+                if self._match_set((TokenType.CUBE, TokenType.ROLLUP), advance=False)
                else self._parse_assignment()
            )
            if expressions:
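Matching both CUBE and ROLLUP here keeps the grouping constructs parsing as before even though the tokens are now also valid identifiers. A hedged sketch (the GROUP BY statement below is illustrative, not taken from the new tests):

```python
import sqlglot
from sqlglot import exp

# The CUBE grouping construct still parses into its dedicated node...
q = sqlglot.parse_one("SELECT a, b, SUM(c) FROM t GROUP BY CUBE (a, b)")
assert q.find(exp.Cube) is not None
# ...while a bare "cube" elsewhere is treated as a regular name (see the fixture change below).
```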
@ -4613,7 +4616,11 @@ class Parser(metaclass=_Parser):
        matched_array = False
        values = self._parse_csv(self._parse_assignment) or None
-        if values and not schema:
+        if (
+            values
+            and not schema
+            and this.is_type(exp.DataType.Type.ARRAY, exp.DataType.Type.MAP)
+        ):
            self._retreat(index)
            break
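This relaxation is what allows casts to fixed-length array (and map) types to parse, per the new DuckDB test below:

```python
import sqlglot

# DuckDB's shorthand cast to a fixed-length array type now transpiles to a CAST.
print(sqlglot.transpile("x::int[3]", read="duckdb", write="duckdb")[0])
# CAST(x AS INT[3])
```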
@ -364,6 +364,7 @@ class TokenType(AutoName):
    STORAGE_INTEGRATION = auto()
    STRAIGHT_JOIN = auto()
    STRUCT = auto()
+    SUMMARIZE = auto()
    TABLE_SAMPLE = auto()
    TAG = auto()
    TEMPORARY = auto()
@ -289,6 +289,10 @@ LANGUAGE js AS
            r"REGEXP_EXTRACT(svc_plugin_output, r'\\\((.*)')",
            r"REGEXP_EXTRACT(svc_plugin_output, '\\\\\\((.*)')",
        )
+        self.validate_identity(
+            "SELECT CAST(1 AS BYTEINT)",
+            "SELECT CAST(1 AS INT64)",
+        )
        self.validate_all(
            "SAFE_CAST(some_date AS DATE FORMAT 'DD MONTH YYYY')",
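The new test covers BYTEINT support in BigQuery; roughly, the alias normalizes to INT64:

```python
import sqlglot

# BYTEINT is now recognized by the BigQuery dialect and rendered as INT64.
print(sqlglot.transpile("SELECT CAST(1 AS BYTEINT)", read="bigquery", write="bigquery")[0])
# SELECT CAST(1 AS INT64)
```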
@ -28,6 +28,8 @@ class TestClickhouse(Validator):
        self.assertEqual(expr.sql(dialect="clickhouse"), "COUNT(x)")
        self.assertIsNone(expr._meta)

+        self.validate_identity("SELECT STR_TO_DATE(str, fmt, tz)")
+        self.validate_identity("SELECT STR_TO_DATE('05 12 2000', '%d %m %Y')")
        self.validate_identity("SELECT EXTRACT(YEAR FROM toDateTime('2023-02-01'))")
        self.validate_identity("extract(haystack, pattern)")
        self.validate_identity("SELECT * FROM x LIMIT 1 UNION ALL SELECT * FROM y")
@ -153,6 +155,17 @@ class TestClickhouse(Validator):
            "SELECT SUM(1) AS impressions FROM (SELECT ['Istanbul', 'Berlin', 'Bobruisk'] AS cities) WHERE arrayJoin(cities) IN ('Istanbul', 'Berlin')",
        )

+        self.validate_all(
+            "SELECT CAST(STR_TO_DATE('05 12 2000', '%d %m %Y') AS DATE)",
+            read={
+                "clickhouse": "SELECT CAST(STR_TO_DATE('05 12 2000', '%d %m %Y') AS DATE)",
+                "postgres": "SELECT TO_DATE('05 12 2000', 'DD MM YYYY')",
+            },
+            write={
+                "clickhouse": "SELECT CAST(STR_TO_DATE('05 12 2000', '%d %m %Y') AS DATE)",
+                "postgres": "SELECT CAST(CAST(TO_DATE('05 12 2000', 'DD MM YYYY') AS TIMESTAMP) AS DATE)",
+            },
+        )
        self.validate_all(
            "SELECT * FROM x PREWHERE y = 1 WHERE z = 2",
            write={
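The read/write mappings above translate directly into a transpile call; a minimal sketch based on the same pair:

```python
import sqlglot

# Postgres TO_DATE becomes ClickHouse's STR_TO_DATE wrapped in a DATE cast.
print(
    sqlglot.transpile(
        "SELECT TO_DATE('05 12 2000', 'DD MM YYYY')",
        read="postgres",
        write="clickhouse",
    )[0]
)
# SELECT CAST(STR_TO_DATE('05 12 2000', '%d %m %Y') AS DATE)
```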
@ -8,6 +8,8 @@ class TestDuckDB(Validator):
    dialect = "duckdb"

    def test_duckdb(self):
+        self.validate_identity("x::int[3]", "CAST(x AS INT[3])")
+
        with self.assertRaises(ParseError):
            parse_one("1 //", read="duckdb")
@ -293,9 +295,19 @@ class TestDuckDB(Validator):
        self.validate_identity("x -> '$.family'")
        self.validate_identity("CREATE TABLE color (name ENUM('RED', 'GREEN', 'BLUE'))")
        self.validate_identity("SELECT * FROM foo WHERE bar > $baz AND bla = $bob")
+        self.validate_identity("SUMMARIZE tbl").assert_is(exp.Summarize)
+        self.validate_identity("SUMMARIZE SELECT * FROM tbl").assert_is(exp.Summarize)
+        self.validate_identity("CREATE TABLE tbl_summary AS SELECT * FROM (SUMMARIZE tbl)")
+        self.validate_identity(
+            "SUMMARIZE TABLE 'https://blobs.duckdb.org/data/Star_Trek-Season_1.csv'"
+        ).assert_is(exp.Summarize)
        self.validate_identity(
            "SELECT * FROM x LEFT JOIN UNNEST(y)", "SELECT * FROM x LEFT JOIN UNNEST(y) ON TRUE"
        )
+        self.validate_identity(
+            "SELECT col FROM t WHERE JSON_EXTRACT_STRING(col, '$.id') NOT IN ('b')",
+            "SELECT col FROM t WHERE NOT (col ->> '$.id') IN ('b')",
+        )
        self.validate_identity(
            "SELECT a, LOGICAL_OR(b) FROM foo GROUP BY a",
            "SELECT a, BOOL_OR(b) FROM foo GROUP BY a",
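The new identity above pins down the fix for JSON arrow extraction inside IN clauses; the equivalent one-liner:

```python
import sqlglot

# The ->> operand is parenthesized so NOT ... IN keeps its intended precedence.
print(
    sqlglot.transpile(
        "SELECT col FROM t WHERE JSON_EXTRACT_STRING(col, '$.id') NOT IN ('b')",
        read="duckdb",
        write="duckdb",
    )[0]
)
# SELECT col FROM t WHERE NOT (col ->> '$.id') IN ('b')
```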
@ -839,10 +851,10 @@ class TestDuckDB(Validator):
        self.assertEqual(
            cm.output,
            [
-                "WARNING:sqlglot:Applying array index offset (-1)",
-                "WARNING:sqlglot:Applying array index offset (1)",
-                "WARNING:sqlglot:Applying array index offset (1)",
-                "WARNING:sqlglot:Applying array index offset (1)",
+                "INFO:sqlglot:Applying array index offset (-1)",
+                "INFO:sqlglot:Applying array index offset (1)",
+                "INFO:sqlglot:Applying array index offset (1)",
+                "INFO:sqlglot:Applying array index offset (1)",
            ],
        )
@ -1071,10 +1071,10 @@ class TestPostgres(Validator):
        self.assertEqual(
            cm.output,
            [
-                "WARNING:sqlglot:Applying array index offset (-1)",
-                "WARNING:sqlglot:Applying array index offset (1)",
-                "WARNING:sqlglot:Applying array index offset (1)",
-                "WARNING:sqlglot:Applying array index offset (1)",
+                "INFO:sqlglot:Applying array index offset (-1)",
+                "INFO:sqlglot:Applying array index offset (1)",
+                "INFO:sqlglot:Applying array index offset (1)",
+                "INFO:sqlglot:Applying array index offset (1)",
            ],
        )
@ -48,6 +48,16 @@ class TestTSQL(Validator):
            "SELECT 1 WHERE EXISTS(SELECT 1)",
        )

+        self.validate_all(
+            "WITH A AS (SELECT 2 AS value), C AS (SELECT * FROM A) SELECT * INTO TEMP_NESTED_WITH FROM (SELECT * FROM C) AS temp",
+            read={
+                "snowflake": "CREATE TABLE TEMP_NESTED_WITH AS WITH C AS (WITH A AS (SELECT 2 AS value) SELECT * FROM A) SELECT * FROM C",
+                "tsql": "WITH A AS (SELECT 2 AS value), C AS (SELECT * FROM A) SELECT * INTO TEMP_NESTED_WITH FROM (SELECT * FROM C) AS temp",
+            },
+            write={
+                "snowflake": "CREATE TABLE TEMP_NESTED_WITH AS WITH A AS (SELECT 2 AS value), C AS (SELECT * FROM A) SELECT * FROM (SELECT * FROM C) AS temp",
+            },
+        )
        self.validate_all(
            "SELECT IIF(cond <> 0, 'True', 'False')",
            read={
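The same mapping as a transpile call, showing how a Snowflake CREATE TABLE ... AS with nested CTEs now lands in T-SQL as SELECT ... INTO with the CTEs hoisted to the top:

```python
import sqlglot

print(
    sqlglot.transpile(
        "CREATE TABLE TEMP_NESTED_WITH AS WITH C AS (WITH A AS (SELECT 2 AS value) SELECT * FROM A) SELECT * FROM C",
        read="snowflake",
        write="tsql",
    )[0]
)
# WITH A AS (SELECT 2 AS value), C AS (SELECT * FROM A) SELECT * INTO TEMP_NESTED_WITH FROM (SELECT * FROM C) AS temp
```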
@ -797,6 +807,7 @@ class TestTSQL(Validator):
            f"UNIQUE {clustered_keyword} ([internal_id] ASC))",
        )

+        self.validate_identity("CREATE VIEW t AS WITH cte AS (SELECT 1 AS c) SELECT c FROM cte")
        self.validate_identity(
            "ALTER TABLE tbl SET SYSTEM_VERSIONING=ON(HISTORY_TABLE=db.tbl, DATA_CONSISTENCY_CHECK=OFF, HISTORY_RETENTION_PERIOD=5 DAYS)"
        )
@ -1135,6 +1146,11 @@ WHERE
        self.validate_all("ISNULL(x, y)", write={"spark": "COALESCE(x, y)"})

    def test_json(self):
+        self.validate_identity(
+            """JSON_QUERY(REPLACE(REPLACE(x , '''', '"'), '""', '"'))""",
+            """ISNULL(JSON_QUERY(REPLACE(REPLACE(x, '''', '"'), '""', '"'), '$'), JSON_VALUE(REPLACE(REPLACE(x, '''', '"'), '""', '"'), '$'))""",
+        )
+
        self.validate_all(
            "JSON_QUERY(r.JSON, '$.Attr_INT')",
            write={
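Per the identity above, a single-argument JSON_QUERY is now rewritten using the default '$' path as ISNULL(JSON_QUERY(expr, '$'), JSON_VALUE(expr, '$')). A hedged sketch with a simpler input than the test uses (the exact rendering for it is inferred from that identity):

```python
import sqlglot

# JSON_QUERY with one argument no longer trips up the T-SQL parser.
print(sqlglot.transpile("JSON_QUERY(x)", read="tsql", write="tsql")[0])
```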
1
tests/fixtures/identity.sql vendored
@ -868,6 +868,7 @@ SELECT name
SELECT copy
SELECT rollup
SELECT unnest
+SELECT cube, cube.x FROM cube
SELECT * FROM a STRAIGHT_JOIN b
SELECT COUNT(DISTINCT "foo bar") FROM (SELECT 1 AS "foo bar") AS t
SELECT vector
10
tests/fixtures/pretty.sql vendored
@ -395,3 +395,13 @@ JOIN b
JOIN d
USING (f)
USING (g);
+
+('aaaaaaaaaaa', 'bbbbbbbbbbbbbbbb', 'ccccccccccccc', 'ddddddddddd', 'eeeeeeeeeeeeeeeeeeeee');
+(
+  'aaaaaaaaaaa',
+  'bbbbbbbbbbbbbbbb',
+  'ccccccccccccc',
+  'ddddddddddd',
+  'eeeeeeeeeeeeeeeeeeeee'
+);
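The new fixture pair corresponds to pretty-printing a bare tuple; roughly:

```python
import sqlglot

# Long tuples now break across lines under pretty=True instead of staying on one line.
print(
    sqlglot.transpile(
        "('aaaaaaaaaaa', 'bbbbbbbbbbbbbbbb', 'ccccccccccccc', 'ddddddddddd', 'eeeeeeeeeeeeeeeeeeeee')",
        pretty=True,
    )[0]
)
```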
@ -1001,6 +1001,8 @@ FROM foo""",
        self.assertEqual(exp.DataType.build("ARRAY<UNKNOWN>").sql(), "ARRAY<UNKNOWN>")
        self.assertEqual(exp.DataType.build("ARRAY<NULL>").sql(), "ARRAY<NULL>")
        self.assertEqual(exp.DataType.build("varchar(100) collate 'en-ci'").sql(), "VARCHAR(100)")
+        self.assertEqual(exp.DataType.build("int[3]").sql(dialect="duckdb"), "INT[3]")
+        self.assertEqual(exp.DataType.build("int[3][3]").sql(dialect="duckdb"), "INT[3][3]")

        with self.assertRaises(ParseError):
            exp.DataType.build("varchar(")
@ -815,10 +815,10 @@ FROM x""",
        self.assertEqual(
            cm.output,
            [
-                "WARNING:sqlglot:Applying array index offset (1)",
-                "WARNING:sqlglot:Applying array index offset (-1)",
-                "WARNING:sqlglot:Applying array index offset (1)",
-                "WARNING:sqlglot:Applying array index offset (1)",
+                "INFO:sqlglot:Applying array index offset (1)",
+                "INFO:sqlglot:Applying array index offset (-1)",
+                "INFO:sqlglot:Applying array index offset (1)",
+                "INFO:sqlglot:Applying array index offset (1)",
            ],
        )