Merging upstream version 25.31.4.
Signed-off-by: Daniel Baumann <daniel@debian.org>
This commit is contained in: parent 94fd84e2b9, commit 4c76f76a29
68 changed files with 58911 additions and 55752 deletions
CHANGELOG.md (99 changes)

@@ -1,6 +1,100 @@
Changelog
=========

+## [v25.31.3] - 2024-11-17
+
+### :sparkles: New Features
+
+- [`835e717`](https://github.com/tobymao/sqlglot/commit/835e71795f994599dbc19f1a5969b464154926e1) - **clickhouse**: transform function support *(PR [#4408](https://github.com/tobymao/sqlglot/pull/4408) by [@GaliFFun](https://github.com/GaliFFun))*
+
+### :bug: Bug Fixes
+
+- [`0479743`](https://github.com/tobymao/sqlglot/commit/047974393cebbddbbfb878071d159a3e538b0e4d) - **snowflake**: cast TimeToStr arg to TIMESTAMP more conservatively *(commit by [@georgesittas](https://github.com/georgesittas))*
+
+## [v25.31.2] - 2024-11-17
+
+### :bug: Bug Fixes
+
+- [`d851269`](https://github.com/tobymao/sqlglot/commit/d851269780c7f0a0c756289c3dea9b1aa58d2a69) - use existing aliases in DISTINCT ON elimination, if any *(commit by [@georgesittas](https://github.com/georgesittas))*
+
+## [v25.31.1] - 2024-11-17
+
+### :sparkles: New Features
+
+- [`b00d857`](https://github.com/tobymao/sqlglot/commit/b00d857cd8a6d2452c2170077cbfa82352f708dd) - add support for specifying column in row_number function *(PR [#4406](https://github.com/tobymao/sqlglot/pull/4406) by [@GaliFFun](https://github.com/GaliFFun))*
+
+### :bug: Bug Fixes
+
+- [`0e46cc7`](https://github.com/tobymao/sqlglot/commit/0e46cc7fa2d80ba4e92182b3fa5f1075a63f4754) - refactor DISTINCT ON elimination transformation *(PR [#4407](https://github.com/tobymao/sqlglot/pull/4407) by [@georgesittas](https://github.com/georgesittas))*
+
+## [v25.31.0] - 2024-11-16
+
+### :boom: BREAKING CHANGES
+
+- due to [`f4abfd5`](https://github.com/tobymao/sqlglot/commit/f4abfd59b8255cf8c39bf51028ee5f6ed704927f) - Support FORMAT_TIMESTAMP *(PR [#4383](https://github.com/tobymao/sqlglot/pull/4383) by [@VaggelisD](https://github.com/VaggelisD))*:
+
+  Support FORMAT_TIMESTAMP (#4383)
+
+- due to [`45eef60`](https://github.com/tobymao/sqlglot/commit/45eef600064ad024b34e32e7acc3aca409fbd9c4) - use select star when eliminating distinct on *(PR [#4401](https://github.com/tobymao/sqlglot/pull/4401) by [@agrigoroi-palantir](https://github.com/agrigoroi-palantir))*:
+
+  use select star when eliminating distinct on (#4401)
+
+### :sparkles: New Features
+
+- [`72ffdcb`](https://github.com/tobymao/sqlglot/commit/72ffdcb631bf7afdeda2ce96911442a94b7f11eb) - **bigquery**: Add parsing support for STRPOS(...) *(PR [#4378](https://github.com/tobymao/sqlglot/pull/4378) by [@VaggelisD](https://github.com/VaggelisD))*
+- [`e7b67e0`](https://github.com/tobymao/sqlglot/commit/e7b67e0c280179188ce1bca650735978b758dca1) - **bigquery**: Support MAKE_INTERVAL *(PR [#4384](https://github.com/tobymao/sqlglot/pull/4384) by [@VaggelisD](https://github.com/VaggelisD))*
+- [`37c4809`](https://github.com/tobymao/sqlglot/commit/37c4809dfda48224fd982ea8a48d3dbc5c17f9ae) - **bigquery**: Support INT64(...) *(PR [#4391](https://github.com/tobymao/sqlglot/pull/4391) by [@VaggelisD](https://github.com/VaggelisD))*
+- [`9694999`](https://github.com/tobymao/sqlglot/commit/96949999d394e27df8b0287a14e9ac82d52bc0f9) - Add support for CONTAINS(...) *(PR [#4399](https://github.com/tobymao/sqlglot/pull/4399) by [@VaggelisD](https://github.com/VaggelisD))*
+
+### :bug: Bug Fixes
+
+- [`f4abfd5`](https://github.com/tobymao/sqlglot/commit/f4abfd59b8255cf8c39bf51028ee5f6ed704927f) - **bigquery**: Support FORMAT_TIMESTAMP *(PR [#4383](https://github.com/tobymao/sqlglot/pull/4383) by [@VaggelisD](https://github.com/VaggelisD))*
+- [`bb46ee3`](https://github.com/tobymao/sqlglot/commit/bb46ee33d481a888882cbbb26a9240dd2dbb10ee) - **parser**: Parse exp.Column for DROP COLUMN *(PR [#4390](https://github.com/tobymao/sqlglot/pull/4390) by [@VaggelisD](https://github.com/VaggelisD))*
+  - :arrow_lower_right: *fixes issue [#4388](https://github.com/tobymao/sqlglot/issues/4388) opened by [@AhlamHani](https://github.com/AhlamHani)*
+- [`79f6783`](https://github.com/tobymao/sqlglot/commit/79f67830d7d3ba92bff91eeb95b4dc8bdfa6c44e) - **snowflake**: Wrap DIV0 operands if they're binary expressions *(PR [#4393](https://github.com/tobymao/sqlglot/pull/4393) by [@VaggelisD](https://github.com/VaggelisD))*
+  - :arrow_lower_right: *fixes issue [#4392](https://github.com/tobymao/sqlglot/issues/4392) opened by [@diogo-fernan](https://github.com/diogo-fernan)*
+- [`647b98d`](https://github.com/tobymao/sqlglot/commit/647b98d84643b88a41218fb67f6a2bd83ca4c702) - **starrocks**: Add RESERVED_KEYWORDS specific to starrocks *(PR [#4402](https://github.com/tobymao/sqlglot/pull/4402) by [@notexistence](https://github.com/notexistence))*
+- [`45eef60`](https://github.com/tobymao/sqlglot/commit/45eef600064ad024b34e32e7acc3aca409fbd9c4) - use select star when eliminating distinct on *(PR [#4401](https://github.com/tobymao/sqlglot/pull/4401) by [@agrigoroi-palantir](https://github.com/agrigoroi-palantir))*
+
+### :recycle: Refactors
+
+- [`a3af2af`](https://github.com/tobymao/sqlglot/commit/a3af2af3a893dfd6c6946b732aa086d1f1d91570) - attach statement comments consistently *(PR [#4377](https://github.com/tobymao/sqlglot/pull/4377) by [@georgesittas](https://github.com/georgesittas))*
+  - :arrow_lower_right: *addresses issue [#4376](https://github.com/tobymao/sqlglot/issues/4376) opened by [@YieldRay](https://github.com/YieldRay)*
+
+### :wrench: Chores
+
+- [`858c5b1`](https://github.com/tobymao/sqlglot/commit/858c5b1a43f74e11b8c357986c78b5068792b3af) - improve contribution guide *(PR [#4379](https://github.com/tobymao/sqlglot/pull/4379) by [@georgesittas](https://github.com/georgesittas))*
+- [`160e688`](https://github.com/tobymao/sqlglot/commit/160e6883225cd6ad41a218213f73aa9f91b5fc5e) - fix relative benchmark import, comment out sqltree *(PR [#4403](https://github.com/tobymao/sqlglot/pull/4403) by [@georgesittas](https://github.com/georgesittas))*
+- [`8d78add`](https://github.com/tobymao/sqlglot/commit/8d78addccaaffa4ea2dcfe1de002f8a653f137b7) - bump PYO3 to v"0.22.6" *(PR [#4400](https://github.com/tobymao/sqlglot/pull/4400) by [@MartinSahlen](https://github.com/MartinSahlen))*
+- [`f78e755`](https://github.com/tobymao/sqlglot/commit/f78e755adaf52823642d2b0e1cae54da835ec653) - bump sqlglotrs to v0.2.14 *(commit by [@georgesittas](https://github.com/georgesittas))*
+
+## [v25.30.0] - 2024-11-11
+
+### :boom: BREAKING CHANGES
+
+- due to [`60625ea`](https://github.com/tobymao/sqlglot/commit/60625eae34deb6a6fc36c0f3996f1281eae0ef6f) - Fix STRUCT cast generation *(PR [#4366](https://github.com/tobymao/sqlglot/pull/4366) by [@VaggelisD](https://github.com/VaggelisD))*:
+
+  Fix STRUCT cast generation (#4366)
+
+### :sparkles: New Features
+
+- [`87ab8fe`](https://github.com/tobymao/sqlglot/commit/87ab8fe9cc4d6d060d8fe8a9c3faf8c47c2c9ed6) - **spark, bigquery**: Add support for UNIX_SECONDS(...) *(PR [#4350](https://github.com/tobymao/sqlglot/pull/4350) by [@VaggelisD](https://github.com/VaggelisD))*
+- [`42da638`](https://github.com/tobymao/sqlglot/commit/42da63812ed489d1d8bbef0fc14c7dfa5ce57b7a) - **bigquery**: Support JSON_VALUE_ARRAY(...) *(PR [#4356](https://github.com/tobymao/sqlglot/pull/4356) by [@VaggelisD](https://github.com/VaggelisD))*
+- [`e337a42`](https://github.com/tobymao/sqlglot/commit/e337a42dd56f5358e617750e7a70a0d4b7eab3f9) - **bigquery**: Parse REGEXP_SUBSTR as exp.RegexpExtract *(PR [#4358](https://github.com/tobymao/sqlglot/pull/4358) by [@VaggelisD](https://github.com/VaggelisD))*
+- [`602dbf8`](https://github.com/tobymao/sqlglot/commit/602dbf84ce23f41fba6a87db70ecec6113044bac) - Support REGEXP_EXTRACT_ALL *(PR [#4359](https://github.com/tobymao/sqlglot/pull/4359) by [@VaggelisD](https://github.com/VaggelisD))*
+- [`27a44a2`](https://github.com/tobymao/sqlglot/commit/27a44a22ff78cc35e8ab7c91b94311ef93d86c5a) - improve Levenshtein expression transpilation *(PR [#4360](https://github.com/tobymao/sqlglot/pull/4360) by [@krzysztof-kwitt](https://github.com/krzysztof-kwitt))*
+- [`79c675a`](https://github.com/tobymao/sqlglot/commit/79c675a49fb44a6a7a97ea0de79822d8571724be) - **bigquery**: Support JSON_QUERY_ARRAY & JSON_EXTRACT_ARRAY *(PR [#4361](https://github.com/tobymao/sqlglot/pull/4361) by [@VaggelisD](https://github.com/VaggelisD))*
+- [`57722db`](https://github.com/tobymao/sqlglot/commit/57722db90394d9a102c0e76a3e4d32a9f72f9ff9) - optionally wrap connectors when using builders *(PR [#4369](https://github.com/tobymao/sqlglot/pull/4369) by [@georgesittas](https://github.com/georgesittas))*
+  - :arrow_lower_right: *addresses issue [#4362](https://github.com/tobymao/sqlglot/issues/4362) opened by [@gabrielteotonio](https://github.com/gabrielteotonio)*
+  - :arrow_lower_right: *addresses issue [#4367](https://github.com/tobymao/sqlglot/issues/4367) opened by [@gabrielteotonio](https://github.com/gabrielteotonio)*
+
+### :bug: Bug Fixes
+
+- [`eb8e2fe`](https://github.com/tobymao/sqlglot/commit/eb8e2fe3ab3fb4b88f72843a5bd21f4a3c1d895c) - bubble up comments in qualified column refs, fixes [#4353](https://github.com/tobymao/sqlglot/pull/4353) *(commit by [@georgesittas](https://github.com/georgesittas))*
+- [`efcbfdb`](https://github.com/tobymao/sqlglot/commit/efcbfdb67b12853581fbfc0d4c4a450c0281849b) - **clickhouse**: Generate exp.Median as lowercase *(PR [#4355](https://github.com/tobymao/sqlglot/pull/4355) by [@VaggelisD](https://github.com/VaggelisD))*
+  - :arrow_lower_right: *fixes issue [#4354](https://github.com/tobymao/sqlglot/issues/4354) opened by [@cpcloud](https://github.com/cpcloud)*
+- [`60625ea`](https://github.com/tobymao/sqlglot/commit/60625eae34deb6a6fc36c0f3996f1281eae0ef6f) - **duckdb**: Fix STRUCT cast generation *(PR [#4366](https://github.com/tobymao/sqlglot/pull/4366) by [@VaggelisD](https://github.com/VaggelisD))*
  - :arrow_lower_right: *fixes issue [#4365](https://github.com/tobymao/sqlglot/issues/4365) opened by [@NickCrews](https://github.com/NickCrews)*
+- [`a665030`](https://github.com/tobymao/sqlglot/commit/a665030323b200f3bed241bb928993b9807c4100) - safe removal while iterating expression list for multiple UNNEST expressions *(PR [#4364](https://github.com/tobymao/sqlglot/pull/4364) by [@gauravsagar483](https://github.com/gauravsagar483))*
+- [`a71cee4`](https://github.com/tobymao/sqlglot/commit/a71cee4b4eafad9988b945c69dc75583ae105ec7) - Transpilation of exp.ArraySize from Postgres (read) *(PR [#4370](https://github.com/tobymao/sqlglot/pull/4370) by [@VaggelisD](https://github.com/VaggelisD))*
+  - :arrow_lower_right: *fixes issue [#4368](https://github.com/tobymao/sqlglot/issues/4368) opened by [@dor-bernstein](https://github.com/dor-bernstein)*
+- [`702fe31`](https://github.com/tobymao/sqlglot/commit/702fe318dadbe6cb83676e2a23ee830774697bb0) - Remove flaky timing test *(PR [#4371](https://github.com/tobymao/sqlglot/pull/4371) by [@VaggelisD](https://github.com/VaggelisD))*
+- [`4d3904e`](https://github.com/tobymao/sqlglot/commit/4d3904e8906f0573f3352ad82282ea09c571daa8) - **spark**: Support DB's TIMESTAMP_DIFF *(PR [#4373](https://github.com/tobymao/sqlglot/pull/4373) by [@VaggelisD](https://github.com/VaggelisD))*
+  - :arrow_lower_right: *fixes issue [#4372](https://github.com/tobymao/sqlglot/issues/4372) opened by [@nikmalviya](https://github.com/nikmalviya)*
+- [`060ecfc`](https://github.com/tobymao/sqlglot/commit/060ecfc75fd8a07ffbc19f34959155a0fce317b6) - don't generate comments in table_name *(PR [#4375](https://github.com/tobymao/sqlglot/pull/4375) by [@georgesittas](https://github.com/georgesittas))*
+
+### :wrench: Chores
+
+- [`e19fb62`](https://github.com/tobymao/sqlglot/commit/e19fb620dbe6e405518aee381183e4640b638aa4) - improve error handling for unnest_to_explode *(PR [#4339](https://github.com/tobymao/sqlglot/pull/4339) by [@gauravsagar483](https://github.com/gauravsagar483))*
+
## [v25.29.0] - 2024-11-05

### :boom: BREAKING CHANGES

- due to [`e92904e`](https://github.com/tobymao/sqlglot/commit/e92904e61ab3b14fe18d472df19311f9b014f6cc) - Transpile ANY to EXISTS *(PR [#4305](https://github.com/tobymao/sqlglot/pull/4305) by [@VaggelisD](https://github.com/VaggelisD))*:

@@ -5214,3 +5308,8 @@ Changelog
[v25.27.0]: https://github.com/tobymao/sqlglot/compare/v25.26.0...v25.27.0
[v25.28.0]: https://github.com/tobymao/sqlglot/compare/v25.27.0...v25.28.0
[v25.29.0]: https://github.com/tobymao/sqlglot/compare/v25.28.0...v25.29.0
+[v25.30.0]: https://github.com/tobymao/sqlglot/compare/v25.29.0...v25.30.0
+[v25.31.0]: https://github.com/tobymao/sqlglot/compare/v25.30.0...v25.31.0
+[v25.31.1]: https://github.com/tobymao/sqlglot/compare/v25.31.0...v25.31.1
+[v25.31.2]: https://github.com/tobymao/sqlglot/compare/v25.31.1...v25.31.2
+[v25.31.3]: https://github.com/tobymao/sqlglot/compare/v25.31.2...v25.31.3
@@ -9,30 +9,45 @@ easy and transparent as possible, whether it's:
- Proposing new features

## We develop with Github

We use github to host code, to track issues and feature requests, as well as accept pull requests.

+## Finding tasks to work on
+
+When the core SQLGlot team does not plan to work on an issue, it is usually closed as "not planned". This may happen
+when a request is exceptionally difficult to address, or because the team deems that it shouldn't be prioritized.
+
+These issues can be a good starting point when looking for tasks to work on. Simply filter the issue list to fetch
+the closed issues and then search for those marked as "not planned". If the scope of an issue is not clear or you
+need guidance, feel free to ask for clarifications.
+
+Before taking on a task, consider studying the [AST primer](https://github.com/tobymao/sqlglot/blob/main/posts/ast_primer.md) and the [onboarding document](https://github.com/tobymao/sqlglot/blob/main/posts/onboarding.md).
+
## Submitting code changes

-Pull requests are the best way to propose changes to the codebase. We actively welcome your pull requests:
-
-Please keep PR's small and do your best to follow the conventions of the project. If you have a feature that requires a lot of code changes,
-please reach out to us on [Slack](https://tobikodata.com/slack) before making a PR. This will increase the chances of your PR getting in.
+Pull requests are the best way to propose changes to the codebase, and we actively welcome them.
+
+Pull requests should be small and they need to follow the conventions of the project. For features that require
+many changes, please reach out to us on [Slack](https://tobikodata.com/slack) before making a request, in order
+to share any relevant context and increase its chances of getting merged.

-1. Fork the repo and create your branch from `main`.
-2. If you've added code that should be tested, add tests.
-3. If you've changed APIs, update the documentation.
-4. Ensure the test suite & linter [checks](https://github.com/tobymao/sqlglot/blob/main/README.md#run-tests-and-lint) pass.
-5. Issue that pull request and wait for it to be reviewed by a maintainer or contributor!
+1. Fork the repo and create your branch from `main`
+2. If you've added code with non-trivial changes, add tests
+3. If you've changed APIs, update the documentation (docstrings)
+4. Ensure the test suite & linter [checks](https://github.com/tobymao/sqlglot/blob/main/README.md#run-tests-and-lint) pass
+5. Issue that pull request and wait for it to be reviewed by a maintainer or contributor

Note: make sure to follow the [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/) guidelines when creating a PR.
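The Conventional Commits rule above can be illustrated with a small check. This is a hedged sketch, not the project's actual CI validation: the set of allowed types and the regex below are assumptions chosen to mirror the commit titles seen in this changelog.

```python
import re

# Hypothetical matcher for the Conventional Commits "type(scope): description"
# shape; the allowed types and scope charset are illustrative assumptions.
PATTERN = re.compile(r"^(feat|fix|chore|docs|refactor|test)(\([\w.-]+\))?!?: .+")

for msg in [
    "feat(bigquery): Support MAKE_INTERVAL",
    "chore: bump sqlglotrs to v0.2.14",
    "update stuff",
]:
    # The first two match the shape; the last one does not.
    print(msg, "->", bool(PATTERN.match(msg)))
```

Running a title through such a pattern before opening a PR is a cheap way to catch a malformed commit message early.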
## Report bugs using Github's [issues](https://github.com/tobymao/sqlglot/issues)

We use GitHub issues to track public bugs. Report a bug by opening a new issue.

**Great Bug Reports** tend to have:

- A quick summary and/or background
- Steps to reproduce
-- Be specific!
+- Be specific
- Give sample code if you can
- What you expected would happen
- What actually happens

@@ -40,12 +55,15 @@ We use GitHub issues to track public bugs. Report a bug by opening a new issue.
- References (e.g. documentation pages related to the issue)

## Start a discussion using Github's [discussions](https://github.com/tobymao/sqlglot/discussions)

[We use GitHub discussions](https://github.com/tobymao/sqlglot/discussions/190) to discuss the current state
-of the code. If you want to propose a new feature, this is the right place to do it! Just start a discussion, and
+of the code. If you want to propose a new feature, this is the right place to do it. Just start a discussion, and
let us know why you think this feature would be a good addition to SQLGlot (by possibly including some usage examples).

## [License](https://github.com/tobymao/sqlglot/blob/main/LICENSE)

By contributing, you agree that your contributions will be licensed under its MIT License.

## References

This document was adapted from [briandk's template](https://gist.github.com/briandk/3d2e8b3ec8daf5a27a62).
@@ -1,6 +1,6 @@
import collections.abc

-from benchmarks.helpers import ascii_table
+from helpers import ascii_table

# moz_sql_parser 3.10 compatibility
collections.Iterable = collections.abc.Iterable
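The import change in this hunk is path-related: a bare `helpers` import only resolves when the module's own directory is on `sys.path`, which is what happens when a script in that directory is run directly. A self-contained sketch of that mechanism, using a throwaway module name (`helpers_demo` and its `ascii_table` stub are hypothetical, for illustration only):

```python
import importlib
import os
import sys
import tempfile

# Create a throwaway directory containing a helpers-like module.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "helpers_demo.py"), "w") as f:
    f.write("def ascii_table(rows):\n    return '\\n'.join(map(str, rows))\n")

# A bare-name import works only once the directory itself is on sys.path,
# analogous to running a benchmark script from inside its own directory.
sys.path.insert(0, tmp)
helpers_demo = importlib.import_module("helpers_demo")
print(helpers_demo.ascii_table(["a", "b"]))  # prints "a" then "b"
```

Importing the package-qualified name instead would require the *parent* of that directory on `sys.path`, which is why the two spellings are not interchangeable.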
@@ -12,7 +12,7 @@ import numpy as np
# import moz_sql_parser
# import sqloxide
# import sqlparse
-import sqltree
+# import sqltree

import sqlglot

@@ -203,7 +203,7 @@ libs = [
    "sqlglot",
    "sqlglotrs",
    # "sqlfluff",
-    "sqltree",
+    # "sqltree",
    # "sqlparse",
    # "moz_sql_parser",
    # "sqloxide",
@@ -1,7 +1,7 @@
import typing as t
from argparse import ArgumentParser

-from benchmarks.helpers import ascii_table
+from helpers import ascii_table
from sqlglot.optimizer import optimize
from sqlglot import parse_one
from tests.helpers import load_sql_fixture_pairs, TPCH_SCHEMA, TPCDS_SCHEMA

File diff suppressed because one or more lines are too long
@@ -76,8 +76,8 @@
</span><span id="L-12"><a href="#L-12"><span class="linenos">12</span></a><span class="n">__version_tuple__</span><span class="p">:</span> <span class="n">VERSION_TUPLE</span>
</span><span id="L-13"><a href="#L-13"><span class="linenos">13</span></a><span class="n">version_tuple</span><span class="p">:</span> <span class="n">VERSION_TUPLE</span>
</span><span id="L-14"><a href="#L-14"><span class="linenos">14</span></a>
-</span><span id="L-15"><a href="#L-15"><span class="linenos">15</span></a><span class="n">__version__</span> <span class="o">=</span> <span class="n">version</span> <span class="o">=</span> <span class="s1">'25.29.0'</span>
-</span><span id="L-16"><a href="#L-16"><span class="linenos">16</span></a><span class="n">__version_tuple__</span> <span class="o">=</span> <span class="n">version_tuple</span> <span class="o">=</span> <span class="p">(</span><span class="mi">25</span><span class="p">,</span> <span class="mi">29</span><span class="p">,</span> <span class="mi">0</span><span class="p">)</span>
+</span><span id="L-15"><a href="#L-15"><span class="linenos">15</span></a><span class="n">__version__</span> <span class="o">=</span> <span class="n">version</span> <span class="o">=</span> <span class="s1">'25.31.3'</span>
+</span><span id="L-16"><a href="#L-16"><span class="linenos">16</span></a><span class="n">__version_tuple__</span> <span class="o">=</span> <span class="n">version_tuple</span> <span class="o">=</span> <span class="p">(</span><span class="mi">25</span><span class="p">,</span> <span class="mi">31</span><span class="p">,</span> <span class="mi">3</span><span class="p">)</span>
</span></pre></div>
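The `_version.py` hunk above bumps both the string form (`'25.29.0'` to `'25.31.3'`) and the tuple form. A short sketch of why keeping a version *tuple* alongside the string is useful: tuples compare element-wise and numerically, while strings compare lexicographically, which misorders multi-digit components.

```python
# Tuples compare element-wise, so version ordering is numeric.
assert (25, 29, 0) < (25, 31, 3)

# Strings compare character by character: '9' > '3', so "25.9.0"
# incorrectly sorts after "25.31.0" under string comparison.
assert "25.9.0" > "25.31.0"
assert (25, 9, 0) < (25, 31, 0)
```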
@@ -97,7 +97,7 @@
<section id="version">
<div class="attr variable">
<span class="name">version</span><span class="annotation">: str</span> =
-<span class="default_value">'25.29.0'</span>
+<span class="default_value">'25.31.3'</span>

</div>

@@ -109,7 +109,7 @@
<section id="version_tuple">
<div class="attr variable">
<span class="name">version_tuple</span><span class="annotation">: object</span> =
-<span class="default_value">(25, 29, 0)</span>
+<span class="default_value">(25, 31, 3)</span>

</div>
File diff suppressed because one or more lines are too long (16 files)
@@ -737,7 +737,9 @@ Default: True</li>
<dd id="RisingWave.Generator.PAD_FILL_PATTERN_IS_REQUIRED" class="variable"><a href="../generator.html#Generator.PAD_FILL_PATTERN_IS_REQUIRED">PAD_FILL_PATTERN_IS_REQUIRED</a></dd>
<dd id="RisingWave.Generator.SUPPORTS_EXPLODING_PROJECTIONS" class="variable"><a href="../generator.html#Generator.SUPPORTS_EXPLODING_PROJECTIONS">SUPPORTS_EXPLODING_PROJECTIONS</a></dd>
<dd id="RisingWave.Generator.SUPPORTS_CONVERT_TIMEZONE" class="variable"><a href="../generator.html#Generator.SUPPORTS_CONVERT_TIMEZONE">SUPPORTS_CONVERT_TIMEZONE</a></dd>
+<dd id="RisingWave.Generator.SUPPORTS_UNIX_SECONDS" class="variable"><a href="../generator.html#Generator.SUPPORTS_UNIX_SECONDS">SUPPORTS_UNIX_SECONDS</a></dd>
<dd id="RisingWave.Generator.PARSE_JSON_NAME" class="variable"><a href="../generator.html#Generator.PARSE_JSON_NAME">PARSE_JSON_NAME</a></dd>
+<dd id="RisingWave.Generator.ARRAY_SIZE_NAME" class="variable"><a href="../generator.html#Generator.ARRAY_SIZE_NAME">ARRAY_SIZE_NAME</a></dd>
<dd id="RisingWave.Generator.TIME_PART_SINGULARS" class="variable"><a href="../generator.html#Generator.TIME_PART_SINGULARS">TIME_PART_SINGULARS</a></dd>
<dd id="RisingWave.Generator.TOKEN_MAPPING" class="variable"><a href="../generator.html#Generator.TOKEN_MAPPING">TOKEN_MAPPING</a></dd>
<dd id="RisingWave.Generator.STRUCT_DELIMITER" class="variable"><a href="../generator.html#Generator.STRUCT_DELIMITER">STRUCT_DELIMITER</a></dd>

@@ -1065,6 +1067,7 @@ Default: True</li>
<dd id="RisingWave.Generator.toarray_sql" class="function"><a href="../generator.html#Generator.toarray_sql">toarray_sql</a></dd>
<dd id="RisingWave.Generator.tsordstotime_sql" class="function"><a href="../generator.html#Generator.tsordstotime_sql">tsordstotime_sql</a></dd>
<dd id="RisingWave.Generator.tsordstotimestamp_sql" class="function"><a href="../generator.html#Generator.tsordstotimestamp_sql">tsordstotimestamp_sql</a></dd>
+<dd id="RisingWave.Generator.tsordstodatetime_sql" class="function"><a href="../generator.html#Generator.tsordstodatetime_sql">tsordstodatetime_sql</a></dd>
<dd id="RisingWave.Generator.tsordstodate_sql" class="function"><a href="../generator.html#Generator.tsordstodate_sql">tsordstodate_sql</a></dd>
<dd id="RisingWave.Generator.unixdate_sql" class="function"><a href="../generator.html#Generator.unixdate_sql">unixdate_sql</a></dd>
<dd id="RisingWave.Generator.lastday_sql" class="function"><a href="../generator.html#Generator.lastday_sql">lastday_sql</a></dd>

@@ -1108,6 +1111,8 @@ Default: True</li>
<dd id="RisingWave.Generator.string_sql" class="function"><a href="../generator.html#Generator.string_sql">string_sql</a></dd>
<dd id="RisingWave.Generator.median_sql" class="function"><a href="../generator.html#Generator.median_sql">median_sql</a></dd>
<dd id="RisingWave.Generator.overflowtruncatebehavior_sql" class="function"><a href="../generator.html#Generator.overflowtruncatebehavior_sql">overflowtruncatebehavior_sql</a></dd>
+<dd id="RisingWave.Generator.unixseconds_sql" class="function"><a href="../generator.html#Generator.unixseconds_sql">unixseconds_sql</a></dd>
+<dd id="RisingWave.Generator.arraysize_sql" class="function"><a href="../generator.html#Generator.arraysize_sql">arraysize_sql</a></dd>

</div>
<div><dt><a href="postgres.html#Postgres.Generator">sqlglot.dialects.postgres.Postgres.Generator</a></dt>

@@ -1129,6 +1134,7 @@ Default: True</li>
<dd id="RisingWave.Generator.COPY_HAS_INTO_KEYWORD" class="variable"><a href="postgres.html#Postgres.Generator.COPY_HAS_INTO_KEYWORD">COPY_HAS_INTO_KEYWORD</a></dd>
<dd id="RisingWave.Generator.ARRAY_CONCAT_IS_VAR_LEN" class="variable"><a href="postgres.html#Postgres.Generator.ARRAY_CONCAT_IS_VAR_LEN">ARRAY_CONCAT_IS_VAR_LEN</a></dd>
<dd id="RisingWave.Generator.SUPPORTS_MEDIAN" class="variable"><a href="postgres.html#Postgres.Generator.SUPPORTS_MEDIAN">SUPPORTS_MEDIAN</a></dd>
+<dd id="RisingWave.Generator.ARRAY_SIZE_DIM_REQUIRED" class="variable"><a href="postgres.html#Postgres.Generator.ARRAY_SIZE_DIM_REQUIRED">ARRAY_SIZE_DIM_REQUIRED</a></dd>
<dd id="RisingWave.Generator.SUPPORTED_JSON_PATH_PARTS" class="variable"><a href="postgres.html#Postgres.Generator.SUPPORTED_JSON_PATH_PARTS">SUPPORTED_JSON_PATH_PARTS</a></dd>
|
||||||
<dd id="RisingWave.Generator.TYPE_MAPPING" class="variable"><a href="postgres.html#Postgres.Generator.TYPE_MAPPING">TYPE_MAPPING</a></dd>
|
<dd id="RisingWave.Generator.TYPE_MAPPING" class="variable"><a href="postgres.html#Postgres.Generator.TYPE_MAPPING">TYPE_MAPPING</a></dd>
|
||||||
<dd id="RisingWave.Generator.TRANSFORMS" class="variable"><a href="postgres.html#Postgres.Generator.TRANSFORMS">TRANSFORMS</a></dd>
|
<dd id="RisingWave.Generator.TRANSFORMS" class="variable"><a href="postgres.html#Postgres.Generator.TRANSFORMS">TRANSFORMS</a></dd>
|
||||||
|
|
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@@ -1874,7 +1874,7 @@ belong to some totally-ordered set.</p>
 <section id="DATE_UNITS">
 <div class="attr variable">
 <span class="name">DATE_UNITS</span> =
-<span class="default_value">{'year_month', 'week', 'month', 'quarter', 'day', 'year'}</span>
+<span class="default_value">{'week', 'month', 'day', 'quarter', 'year_month', 'year'}</span>


 </div>
@@ -586,7 +586,7 @@
 <div class="attr variable">
 <span class="name">ALL_JSON_PATH_PARTS</span> =
 <input id="ALL_JSON_PATH_PARTS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="ALL_JSON_PATH_PARTS-view-value"></label><span class="default_value">{<class '<a href="expressions.html#JSONPathRecursive">sqlglot.expressions.JSONPathRecursive</a>'>, <class '<a href="expressions.html#JSONPathKey">sqlglot.expressions.JSONPathKey</a>'>, <class '<a href="expressions.html#JSONPathWildcard">sqlglot.expressions.JSONPathWildcard</a>'>, <class '<a href="expressions.html#JSONPathFilter">sqlglot.expressions.JSONPathFilter</a>'>, <class '<a href="expressions.html#JSONPathUnion">sqlglot.expressions.JSONPathUnion</a>'>, <class '<a href="expressions.html#JSONPathSubscript">sqlglot.expressions.JSONPathSubscript</a>'>, <class '<a href="expressions.html#JSONPathSelector">sqlglot.expressions.JSONPathSelector</a>'>, <class '<a href="expressions.html#JSONPathSlice">sqlglot.expressions.JSONPathSlice</a>'>, <class '<a href="expressions.html#JSONPathScript">sqlglot.expressions.JSONPathScript</a>'>, <class '<a href="expressions.html#JSONPathRoot">sqlglot.expressions.JSONPathRoot</a>'>}</span>
+<label class="view-value-button pdoc-button" for="ALL_JSON_PATH_PARTS-view-value"></label><span class="default_value">{<class '<a href="expressions.html#JSONPathKey">sqlglot.expressions.JSONPathKey</a>'>, <class '<a href="expressions.html#JSONPathWildcard">sqlglot.expressions.JSONPathWildcard</a>'>, <class '<a href="expressions.html#JSONPathFilter">sqlglot.expressions.JSONPathFilter</a>'>, <class '<a href="expressions.html#JSONPathUnion">sqlglot.expressions.JSONPathUnion</a>'>, <class '<a href="expressions.html#JSONPathSubscript">sqlglot.expressions.JSONPathSubscript</a>'>, <class '<a href="expressions.html#JSONPathSelector">sqlglot.expressions.JSONPathSelector</a>'>, <class '<a href="expressions.html#JSONPathSlice">sqlglot.expressions.JSONPathSlice</a>'>, <class '<a href="expressions.html#JSONPathScript">sqlglot.expressions.JSONPathScript</a>'>, <class '<a href="expressions.html#JSONPathRoot">sqlglot.expressions.JSONPathRoot</a>'>, <class '<a href="expressions.html#JSONPathRecursive">sqlglot.expressions.JSONPathRecursive</a>'>}</span>


 </div>
File diff suppressed because one or more lines are too long
@@ -581,7 +581,7 @@ queries if it would result in multiple table selects in a single query:</p>
 <div class="attr variable">
 <span class="name">UNMERGABLE_ARGS</span> =
 <input id="UNMERGABLE_ARGS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="UNMERGABLE_ARGS-view-value"></label><span class="default_value">{'having', 'into', 'offset', 'operation_modifiers', 'limit', 'distribute', 'kind', 'qualify', 'settings', 'prewhere', 'sample', 'pivots', 'locks', 'options', 'sort', 'laterals', 'connect', 'with', 'format', 'group', 'cluster', 'windows', 'distinct', 'match'}</span>
+<label class="view-value-button pdoc-button" for="UNMERGABLE_ARGS-view-value"></label><span class="default_value">{'with', 'into', 'settings', 'prewhere', 'sort', 'options', 'pivots', 'having', 'laterals', 'distribute', 'offset', 'format', 'match', 'cluster', 'limit', 'distinct', 'locks', 'kind', 'sample', 'group', 'operation_modifiers', 'windows', 'qualify', 'connect'}</span>


 </div>
@@ -3231,7 +3231,7 @@ prefix are statically known.</p>
 <div class="attr variable">
 <span class="name">DATETRUNC_COMPARISONS</span> =
 <input id="DATETRUNC_COMPARISONS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="DATETRUNC_COMPARISONS-view-value"></label><span class="default_value">{<class '<a href="../expressions.html#LT">sqlglot.expressions.LT</a>'>, <class '<a href="../expressions.html#In">sqlglot.expressions.In</a>'>, <class '<a href="../expressions.html#NEQ">sqlglot.expressions.NEQ</a>'>, <class '<a href="../expressions.html#EQ">sqlglot.expressions.EQ</a>'>, <class '<a href="../expressions.html#GTE">sqlglot.expressions.GTE</a>'>, <class '<a href="../expressions.html#GT">sqlglot.expressions.GT</a>'>, <class '<a href="../expressions.html#LTE">sqlglot.expressions.LTE</a>'>}</span>
+<label class="view-value-button pdoc-button" for="DATETRUNC_COMPARISONS-view-value"></label><span class="default_value">{<class '<a href="../expressions.html#GT">sqlglot.expressions.GT</a>'>, <class '<a href="../expressions.html#EQ">sqlglot.expressions.EQ</a>'>, <class '<a href="../expressions.html#LT">sqlglot.expressions.LT</a>'>, <class '<a href="../expressions.html#LTE">sqlglot.expressions.LTE</a>'>, <class '<a href="../expressions.html#NEQ">sqlglot.expressions.NEQ</a>'>, <class '<a href="../expressions.html#GTE">sqlglot.expressions.GTE</a>'>, <class '<a href="../expressions.html#In">sqlglot.expressions.In</a>'>}</span>


 </div>
@@ -3315,7 +3315,7 @@ prefix are statically known.</p>
 <section id="JOINS">
 <div class="attr variable">
 <span class="name">JOINS</span> =
-<span class="default_value">{('RIGHT', ''), ('RIGHT', 'OUTER'), ('', 'INNER'), ('', '')}</span>
+<span class="default_value">{('RIGHT', 'OUTER'), ('', 'INNER'), ('RIGHT', ''), ('', '')}</span>


 </div>
13866
docs/sqlglot/parser.html
File diff suppressed because one or more lines are too long
@@ -9075,7 +9075,7 @@
 <div class="attr variable">
 <span class="name">COMMANDS</span> =
 <input id="Tokenizer.COMMANDS-view-value" class="view-value-toggle-state" type="checkbox" aria-hidden="true" tabindex="-1">
-<label class="view-value-button pdoc-button" for="Tokenizer.COMMANDS-view-value"></label><span class="default_value">{<<a href="#TokenType.SHOW">TokenType.SHOW</a>: 'SHOW'>, <<a href="#TokenType.FETCH">TokenType.FETCH</a>: 'FETCH'>, <<a href="#TokenType.EXECUTE">TokenType.EXECUTE</a>: 'EXECUTE'>, <<a href="#TokenType.COMMAND">TokenType.COMMAND</a>: 'COMMAND'>, <<a href="#TokenType.RENAME">TokenType.RENAME</a>: 'RENAME'>}</span>
+<label class="view-value-button pdoc-button" for="Tokenizer.COMMANDS-view-value"></label><span class="default_value">{<<a href="#TokenType.SHOW">TokenType.SHOW</a>: 'SHOW'>, <<a href="#TokenType.EXECUTE">TokenType.EXECUTE</a>: 'EXECUTE'>, <<a href="#TokenType.FETCH">TokenType.FETCH</a>: 'FETCH'>, <<a href="#TokenType.COMMAND">TokenType.COMMAND</a>: 'COMMAND'>, <<a href="#TokenType.RENAME">TokenType.RENAME</a>: 'RENAME'>}</span>


 </div>
File diff suppressed because it is too large
Load diff
@@ -26,6 +26,7 @@ from sqlglot.dialects.dialect import (
     timestrtotime_sql,
     ts_or_ds_add_cast,
     unit_to_var,
+    str_position_sql,
 )
 from sqlglot.helper import seq_get, split_num_words
 from sqlglot.tokens import TokenType
@@ -211,7 +212,7 @@ def _build_time(args: t.List) -> exp.Func:


 def _build_datetime(args: t.List) -> exp.Func:
     if len(args) == 1:
-        return exp.TsOrDsToTimestamp.from_arg_list(args)
+        return exp.TsOrDsToDatetime.from_arg_list(args)
     if len(args) == 2:
         return exp.Datetime.from_arg_list(args)
     return exp.TimestampFromParts.from_arg_list(args)
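The hunk above changes only the one-argument branch of the `DATETIME` builder: a single argument is now parsed as a datetime conversion rather than a timestamp conversion. A minimal sketch of this arity-based dispatch, with illustrative expression names standing in for the sqlglot classes:

```python
def build_datetime(args):
    # Arity-based dispatch mirroring the parser change: one argument is a
    # conversion, two arguments are DATETIME(ts, tz), three or more build a
    # datetime from parts. The string tags are illustrative only.
    if len(args) == 1:
        return ("TsOrDsToDatetime", args)
    if len(args) == 2:
        return ("Datetime", args)
    return ("TimestampFromParts", args)

print(build_datetime(["'2024-11-17'"]))  # single-arg conversion branch
```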
@@ -304,6 +305,25 @@ def _build_levenshtein(args: t.List) -> exp.Levenshtein:
     )


+def _build_format_time(expr_type: t.Type[exp.Expression]) -> t.Callable[[t.List], exp.TimeToStr]:
+    def _builder(args: t.List) -> exp.TimeToStr:
+        return exp.TimeToStr(this=expr_type(this=seq_get(args, 1)), format=seq_get(args, 0))
+
+    return _builder
+
+
+def _build_contains_substring(args: t.List) -> exp.Contains | exp.Anonymous:
+    if len(args) == 3:
+        return exp.Anonymous(this="CONTAINS_SUBSTRING", expressions=args)
+
+    # Lowercase the operands in case of transpilation, as exp.Contains
+    # is case-sensitive on other dialects
+    this = exp.Lower(this=seq_get(args, 0))
+    expr = exp.Lower(this=seq_get(args, 1))
+
+    return exp.Contains(this=this, expression=expr)
+
+
 class BigQuery(Dialect):
     WEEK_OFFSET = -1
     UNNEST_COLUMN_ONLY = True
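`_build_format_time` above is a builder factory: it is called once per function name with the wrapper expression type, and returns a closure that swaps BigQuery's format-first argument order (`FORMAT_DATE('%Y', x)`) into sqlglot's operand/format layout. A small sketch of the same pattern, with dicts and string tags as hypothetical stand-ins for the expression classes:

```python
def build_format_time(wrapper):
    # Factory returning a builder. BigQuery passes the format string first
    # and the operand second, so the builder wraps args[1] and keeps args[0]
    # as the format. "wrapper" is an illustrative tag, not a real class.
    def builder(args):
        return {"this": (wrapper, args[1]), "format": args[0]}

    return builder

fmt_date = build_format_time("TsOrDsToDate")  # one closure per SQL function
```

Reusing one factory for `FORMAT_DATE`, `FORMAT_DATETIME`, and `FORMAT_TIMESTAMP` is what lets the later hunks collapse three near-identical lambdas into single lines.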
@@ -449,6 +469,7 @@ class BigQuery(Dialect):

         FUNCTIONS = {
             **parser.Parser.FUNCTIONS,
+            "CONTAINS_SUBSTRING": _build_contains_substring,
             "DATE": _build_date,
             "DATE_ADD": build_date_delta_with_interval(exp.DateAdd),
             "DATE_SUB": build_date_delta_with_interval(exp.DateSub),
@@ -462,9 +483,7 @@ class BigQuery(Dialect):
             "DATETIME_SUB": build_date_delta_with_interval(exp.DatetimeSub),
             "DIV": binary_from_function(exp.IntDiv),
             "EDIT_DISTANCE": _build_levenshtein,
-            "FORMAT_DATE": lambda args: exp.TimeToStr(
-                this=exp.TsOrDsToDate(this=seq_get(args, 1)), format=seq_get(args, 0)
-            ),
+            "FORMAT_DATE": _build_format_time(exp.TsOrDsToDate),
             "GENERATE_ARRAY": exp.GenerateSeries.from_arg_list,
             "JSON_EXTRACT_SCALAR": _build_extract_json_with_default_path(exp.JSONExtractScalar),
             "JSON_EXTRACT_ARRAY": _build_extract_json_with_default_path(exp.JSONExtractArray),
@@ -492,6 +511,7 @@ class BigQuery(Dialect):
                 this=seq_get(args, 0),
                 expression=seq_get(args, 1) or exp.Literal.string(","),
             ),
+            "STRPOS": exp.StrPosition.from_arg_list,
             "TIME": _build_time,
             "TIME_ADD": build_date_delta_with_interval(exp.TimeAdd),
             "TIME_SUB": build_date_delta_with_interval(exp.TimeSub),
@@ -506,14 +526,14 @@ class BigQuery(Dialect):
             ),
             "TIMESTAMP_SECONDS": lambda args: exp.UnixToTime(this=seq_get(args, 0)),
             "TO_JSON_STRING": exp.JSONFormat.from_arg_list,
-            "FORMAT_DATETIME": lambda args: exp.TimeToStr(
-                this=exp.TsOrDsToTimestamp(this=seq_get(args, 1)), format=seq_get(args, 0)
-            ),
+            "FORMAT_DATETIME": _build_format_time(exp.TsOrDsToDatetime),
+            "FORMAT_TIMESTAMP": _build_format_time(exp.TsOrDsToTimestamp),
         }

         FUNCTION_PARSERS = {
             **parser.Parser.FUNCTION_PARSERS,
             "ARRAY": lambda self: self.expression(exp.Array, expressions=[self._parse_statement()]),
+            "MAKE_INTERVAL": lambda self: self._parse_make_interval(),
         }
         FUNCTION_PARSERS.pop("TRIM")
@@ -744,6 +764,26 @@ class BigQuery(Dialect):

             return unnest

+        def _parse_make_interval(self):
+            expr = exp.MakeInterval()
+
+            for arg_key in expr.arg_types:
+                value = self._parse_lambda()
+
+                if not value:
+                    break
+
+                # Non-named arguments are filled sequentially, (optionally) followed by named arguments
+                # that can appear in any order e.g MAKE_INTERVAL(1, minute => 5, day => 2)
+                if isinstance(value, exp.Kwarg):
+                    arg_key = value.this.name
+
+                expr.set(arg_key, value)
+
+                self._match(TokenType.COMMA)
+
+            return expr
+
     class Generator(generator.Generator):
         INTERVAL_ALLOWS_PLURAL_FORM = False
         JOIN_HINTS = False
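The comment inside `_parse_make_interval` describes the key rule: positional arguments consume the interval's argument names in declaration order, while a named argument (`minute => 5`) targets its name directly and may appear in any order after the positional ones. A minimal sketch of that filling rule, with plain tuples standing in for `exp.Kwarg` nodes:

```python
def fill_interval_args(arg_names, values):
    # Sketch of the MAKE_INTERVAL parsing rule: positional values consume
    # arg names in order; a (name, value) tuple (our stand-in for a named
    # argument) targets that name directly, as in
    # MAKE_INTERVAL(1, minute => 5, day => 2).
    filled, names = {}, iter(arg_names)
    for value in values:
        if isinstance(value, tuple):  # named argument
            name, value = value
        else:  # positional argument: take the next declared name
            name = next(names)
        filled[name] = value
    return filled
```

With `MAKE_INTERVAL(1, minute => 5, day => 2)`, the leading `1` fills the first declared unit and the named pairs land on their own keys regardless of position.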
@@ -809,6 +849,7 @@ class BigQuery(Dialect):
             exp.If: if_sql(false_value="NULL"),
             exp.ILike: no_ilike_sql,
             exp.IntDiv: rename_func("DIV"),
+            exp.Int64: rename_func("INT64"),
             exp.JSONFormat: rename_func("TO_JSON_STRING"),
             exp.Levenshtein: _levenshtein_sql,
             exp.Max: max_or_greatest,
@@ -845,6 +886,7 @@ class BigQuery(Dialect):
                 "DETERMINISTIC" if e.name == "IMMUTABLE" else "NOT DETERMINISTIC"
             ),
             exp.String: rename_func("STRING"),
+            exp.StrPosition: str_position_sql,
             exp.StrToDate: _str_to_datetime_sql,
             exp.StrToTime: _str_to_datetime_sql,
             exp.TimeAdd: date_add_interval_sql("TIME", "ADD"),
@@ -859,7 +901,8 @@ class BigQuery(Dialect):
             exp.TsOrDsAdd: _ts_or_ds_add_sql,
             exp.TsOrDsDiff: _ts_or_ds_diff_sql,
             exp.TsOrDsToTime: rename_func("TIME"),
-            exp.TsOrDsToTimestamp: rename_func("DATETIME"),
+            exp.TsOrDsToDatetime: rename_func("DATETIME"),
+            exp.TsOrDsToTimestamp: rename_func("TIMESTAMP"),
             exp.Unhex: rename_func("FROM_HEX"),
             exp.UnixDate: rename_func("UNIX_DATE"),
             exp.UnixToTime: _unix_to_time_sql,
@@ -1048,16 +1091,20 @@ class BigQuery(Dialect):
             return super().table_parts(expression)

         def timetostr_sql(self, expression: exp.TimeToStr) -> str:
-            if isinstance(expression.this, exp.TsOrDsToTimestamp):
+            this = expression.this
+            if isinstance(this, exp.TsOrDsToDatetime):
                 func_name = "FORMAT_DATETIME"
+            elif isinstance(this, exp.TsOrDsToTimestamp):
+                func_name = "FORMAT_TIMESTAMP"
             else:
                 func_name = "FORMAT_DATE"
-            this = (
-                expression.this
-                if isinstance(expression.this, (exp.TsOrDsToTimestamp, exp.TsOrDsToDate))
+
+            time_expr = (
+                this
+                if isinstance(this, (exp.TsOrDsToDatetime, exp.TsOrDsToTimestamp, exp.TsOrDsToDate))
                 else expression
             )
-            return self.func(func_name, self.format_time(expression), this.this)
+            return self.func(func_name, self.format_time(expression), time_expr.this)

         def eq_sql(self, expression: exp.EQ) -> str:
             # Operands of = cannot be NULL in BigQuery
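The refactored `timetostr_sql` now distinguishes three cases instead of two: a datetime conversion emits `FORMAT_DATETIME`, a timestamp conversion emits the newly supported `FORMAT_TIMESTAMP`, and everything else falls back to `FORMAT_DATE`. The dispatch alone can be sketched with string tags in place of the expression classes:

```python
def pick_formatter(node_kind):
    # Mirrors the generator's choice above: the wrapper node produced at
    # parse time decides which BigQuery formatter to emit. The string tags
    # are illustrative stand-ins for isinstance checks.
    if node_kind == "TsOrDsToDatetime":
        return "FORMAT_DATETIME"
    if node_kind == "TsOrDsToTimestamp":
        return "FORMAT_TIMESTAMP"
    return "FORMAT_DATE"
```

Because `FORMAT_DATETIME` and `FORMAT_TIMESTAMP` now round-trip through distinct wrapper nodes, BigQuery SQL parsed and regenerated by sqlglot keeps the original function name instead of collapsing both to one formatter.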
@@ -1119,3 +1166,13 @@ class BigQuery(Dialect):
             if expression.name == "TIMESTAMP":
                 expression.set("this", "SYSTEM_TIME")
             return super().version_sql(expression)
+
+        def contains_sql(self, expression: exp.Contains) -> str:
+            this = expression.this
+            expr = expression.expression
+
+            if isinstance(this, exp.Lower) and isinstance(expr, exp.Lower):
+                this = this.this
+                expr = expr.this
+
+            return self.func("CONTAINS_SUBSTRING", this, expr)
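`contains_sql` is the generator half of the `CONTAINS_SUBSTRING` round trip: the parser wrapped both operands in `LOWER()` so other dialects get case-insensitive semantics, and the generator unwraps a matched pair so BigQuery gets its original call back. A string-level sketch of that unwrapping, assuming simple `LOWER(...)` wrappers:

```python
def contains_substring_sql(this, expr):
    # Unwrap a matched LOWER/LOWER pair, as the generator above does with
    # exp.Lower nodes; otherwise leave the operands untouched. Operating on
    # SQL strings here is a simplification for illustration.
    if this.startswith("LOWER(") and expr.startswith("LOWER("):
        this, expr = this[6:-1], expr[6:-1]
    return f"CONTAINS_SUBSTRING({this}, {expr})"
```

Only a pair of lowers is unwrapped; a single `LOWER()` on one side was presumably written by the user and is preserved.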
@@ -275,6 +275,7 @@ class ClickHouse(Dialect):
             "EDITDISTANCE": exp.Levenshtein.from_arg_list,
             "LEVENSHTEINDISTANCE": exp.Levenshtein.from_arg_list,
         }
+        FUNCTIONS.pop("TRANSFORM")

         AGG_FUNCTIONS = {
             "count",
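Dropping a key from an inherited builder table is a common sqlglot pattern: by popping `"TRANSFORM"` from the dict it copied from the base parser, ClickHouse stops using the generic builder for `transform()`, which this release handles with dedicated support. A tiny sketch with hypothetical builder tables:

```python
# Hypothetical builder tables illustrating the FUNCTIONS.pop("TRANSFORM") line.
BASE_FUNCTIONS = {"TRANSFORM": "inherited-builder", "COUNT": "count-builder"}

FUNCTIONS = {**BASE_FUNCTIONS}          # copy everything from the base parser
FUNCTIONS.pop("TRANSFORM")              # then opt out of the inherited builder
```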
@@ -1724,3 +1724,14 @@ def explode_to_unnest_sql(self: Generator, expression: exp.Lateral) -> str:

 def timestampdiff_sql(self: Generator, expression: exp.DatetimeDiff | exp.TimestampDiff) -> str:
     return self.func("TIMESTAMPDIFF", expression.unit, expression.expression, expression.this)


+def no_make_interval_sql(self: Generator, expression: exp.MakeInterval, sep: str = ", ") -> str:
+    args = []
+    for unit, value in expression.args.items():
+        if isinstance(value, exp.Kwarg):
+            value = value.expression
+
+        args.append(f"{value} {unit}")
+
+    return f"INTERVAL '{self.format_args(*args, sep=sep)}'"
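`no_make_interval_sql` is the fallback for dialects without a `MAKE_INTERVAL` function: each unit/value pair becomes `"<value> <unit>"` and the pairs are joined into one `INTERVAL` literal, with the separator configurable per dialect. A self-contained sketch of the string assembly:

```python
def interval_literal(parts, sep=", "):
    # Sketch of the fallback above: (unit, value) pairs become
    # "<value> <unit>" fragments joined inside a single INTERVAL literal.
    return "INTERVAL '" + sep.join(f"{value} {unit}" for unit, value in parts) + "'"
```

The `sep` parameter is why DuckDB (seen later in this diff) can reuse the same helper with `sep=" "`, since DuckDB spells compound intervals as `INTERVAL '2 day 5 minute'` rather than with commas.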
@@ -35,6 +35,7 @@ from sqlglot.dialects.dialect import (
     sha256_sql,
     build_regexp_extract,
     explode_to_unnest_sql,
+    no_make_interval_sql,
 )
 from sqlglot.generator import unsupported_args
 from sqlglot.helper import seq_get
@@ -315,6 +316,7 @@ class DuckDB(Dialect):
             "BPCHAR": TokenType.TEXT,
             "CHAR": TokenType.TEXT,
             "CHARACTER VARYING": TokenType.TEXT,
+            "DETACH": TokenType.COMMAND,
             "EXCLUDE": TokenType.EXCEPT,
             "LOGICAL": TokenType.BOOLEAN,
             "ONLY": TokenType.ONLY,
@@ -558,6 +560,7 @@ class DuckDB(Dialect):
             exp.Lateral: explode_to_unnest_sql,
             exp.LogicalOr: rename_func("BOOL_OR"),
             exp.LogicalAnd: rename_func("BOOL_AND"),
+            exp.MakeInterval: lambda self, e: no_make_interval_sql(self, e, sep=" "),
             exp.MD5Digest: lambda self, e: self.func("UNHEX", self.func("MD5", e.this)),
             exp.MonthsBetween: lambda self, e: self.func(
                 "DATEDIFF",
@@ -645,6 +648,7 @@ class DuckDB(Dialect):
             exp.DataType.Type.BINARY: "BLOB",
             exp.DataType.Type.BPCHAR: "TEXT",
             exp.DataType.Type.CHAR: "TEXT",
+            exp.DataType.Type.DATETIME: "TIMESTAMP",
             exp.DataType.Type.FLOAT: "REAL",
             exp.DataType.Type.NCHAR: "TEXT",
             exp.DataType.Type.NVARCHAR: "TEXT",
@@ -25,6 +25,7 @@ from sqlglot.dialects.dialect import (
     no_safe_divide_sql,
     no_timestamp_sql,
     timestampdiff_sql,
+    no_make_interval_sql,
 )
 from sqlglot.generator import unsupported_args
 from sqlglot.helper import flatten, is_float, is_int, seq_get
@@ -105,11 +106,14 @@ def _build_date_time_add(expr_type: t.Type[E]) -> t.Callable[[t.List], E]:


 # https://docs.snowflake.com/en/sql-reference/functions/div0
 def _build_if_from_div0(args: t.List) -> exp.If:
-    cond = exp.EQ(this=seq_get(args, 1), expression=exp.Literal.number(0)).and_(
-        exp.Is(this=seq_get(args, 0), expression=exp.null()).not_()
+    lhs = exp._wrap(seq_get(args, 0), exp.Binary)
+    rhs = exp._wrap(seq_get(args, 1), exp.Binary)
+
+    cond = exp.EQ(this=rhs, expression=exp.Literal.number(0)).and_(
+        exp.Is(this=lhs, expression=exp.null()).not_()
     )
     true = exp.Literal.number(0)
-    false = exp.Div(this=seq_get(args, 0), expression=seq_get(args, 1))
+    false = exp.Div(this=lhs, expression=rhs)
     return exp.If(this=cond, true=true, false=false)
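The hunk above rewrites Snowflake's `DIV0` into an `IF`: return `0` when the denominator is zero and the numerator is not NULL, otherwise divide (the new `exp._wrap` calls just parenthesize compound operands so they nest safely). The value-level semantics being encoded can be sketched directly in Python, with `None` standing in for SQL NULL:

```python
def div0(numerator, denominator):
    # Python sketch of the semantics encoded above:
    # IF(denominator = 0 AND numerator IS NOT NULL, 0, numerator / denominator).
    # None plays the role of SQL NULL, which propagates through division.
    if denominator == 0 and numerator is not None:
        return 0
    if numerator is None or denominator == 0:
        return None
    return numerator / denominator
```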
@@ -866,6 +870,7 @@ class Snowflake(Dialect):
             exp.LogicalAnd: rename_func("BOOLAND_AGG"),
             exp.LogicalOr: rename_func("BOOLOR_AGG"),
             exp.Map: lambda self, e: var_map_sql(self, e, "OBJECT_CONSTRUCT"),
+            exp.MakeInterval: no_make_interval_sql,
             exp.Max: max_or_greatest,
             exp.Min: min_or_least,
             exp.ParseJSON: lambda self, e: self.func(
@@ -908,9 +913,6 @@ class Snowflake(Dialect):
             ),
             exp.TimestampTrunc: timestamptrunc_sql(),
             exp.TimeStrToTime: timestrtotime_sql,
-            exp.TimeToStr: lambda self, e: self.func(
-                "TO_CHAR", exp.cast(e.this, exp.DataType.Type.TIMESTAMP), self.format_time(e)
-            ),
             exp.TimeToUnix: lambda self, e: f"EXTRACT(epoch_second FROM {self.sql(e, 'this')})",
             exp.ToArray: rename_func("TO_ARRAY"),
             exp.ToChar: lambda self, e: self.function_fallback_sql(e),
@@ -1147,3 +1149,10 @@ class Snowflake(Dialect):
                 exp.ParseJSON(this=this) if this.is_string else this,
                 expression.expression,
             )
+
+        def timetostr_sql(self, expression: exp.TimeToStr) -> str:
+            this = expression.this
+            if not isinstance(this, exp.TsOrDsToTimestamp):
+                this = exp.cast(this, exp.DataType.Type.TIMESTAMP)
+
+            return self.func("TO_CHAR", this, self.format_time(expression))
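This method replaces the `exp.TimeToStr` lambda removed earlier in the Snowflake diff, and it is the "cast to TIMESTAMP more conservatively" fix from the changelog: the old lambda always wrapped the operand in a `CAST(... AS TIMESTAMP)`, while the new one skips the cast when the operand is already a timestamp conversion. A string-level sketch of the decision:

```python
def to_char_arg(node_kind, sql):
    # Sketch of the conservative cast above: only wrap the operand in a
    # TIMESTAMP cast when it is not already a timestamp conversion. The
    # string tag is an illustrative stand-in for an isinstance check.
    if node_kind == "TsOrDsToTimestamp":
        return sql
    return f"CAST({sql} AS TIMESTAMP)"
```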
@@ -112,6 +112,161 @@ class StarRocks(MySQL):

         TRANSFORMS.pop(exp.DateTrunc)

+        # https://docs.starrocks.io/docs/sql-reference/sql-statements/keywords/#reserved-keywords
+        RESERVED_KEYWORDS = {
+            "add",
+            "all",
+            "alter",
+            "analyze",
+            "and",
+            "array",
+            "as",
+            "asc",
+            "between",
+            "bigint",
+            "bitmap",
+            "both",
+            "by",
+            "case",
+            "char",
+            "character",
+            "check",
+            "collate",
+            "column",
+            "compaction",
+            "convert",
+            "create",
+            "cross",
+            "cube",
+            "current_date",
+            "current_role",
+            "current_time",
+            "current_timestamp",
+            "current_user",
+            "database",
+            "databases",
+            "decimal",
+            "decimalv2",
+            "decimal32",
+            "decimal64",
+            "decimal128",
+            "default",
+            "deferred",
+            "delete",
+            "dense_rank",
+            "desc",
+            "describe",
+            "distinct",
+            "double",
+            "drop",
+            "dual",
+            "else",
+            "except",
+            "exists",
+            "explain",
+            "false",
+            "first_value",
+            "float",
+            "for",
+            "force",
+            "from",
+            "full",
+            "function",
+            "grant",
+            "group",
+            "grouping",
+            "grouping_id",
+            "groups",
+            "having",
+            "hll",
+            "host",
+            "if",
+            "ignore",
+            "immediate",
+            "in",
+            "index",
+            "infile",
+            "inner",
+            "insert",
+            "int",
+            "integer",
+            "intersect",
+            "into",
+            "is",
+            "join",
+            "json",
+            "key",
+            "keys",
+            "kill",
+            "lag",
+            "largeint",
+            "last_value",
+            "lateral",
+            "lead",
+            "left",
+            "like",
+            "limit",
+            "load",
+            "localtime",
+            "localtimestamp",
+            "maxvalue",
+            "minus",
+            "mod",
+            "not",
+            "ntile",
+            "null",
+            "on",
+            "or",
+            "order",
+            "outer",
+            "outfile",
+            "over",
+            "partition",
+            "percentile",
+            "primary",
+            "procedure",
+            "qualify",
+            "range",
+            "rank",
+            "read",
+            "regexp",
+            "release",
+            "rename",
+            "replace",
+            "revoke",
+            "right",
+            "rlike",
+            "row",
+            "row_number",
+            "rows",
+            "schema",
+            "schemas",
+            "select",
+            "set",
+            "set_var",
+            "show",
+            "smallint",
+            "system",
+            "table",
+            "terminated",
+            "text",
+            "then",
+            "tinyint",
+            "to",
+            "true",
+            "union",
+            "unique",
+            "unsigned",
+            "update",
+            "use",
+            "using",
+            "values",
+            "varchar",
+            "when",
+            "where",
+            "with",
+        }
+
         def create_sql(self, expression: exp.Create) -> str:
             # Starrocks' primary key is defined outside of the schema, so we need to move it there
             schema = expression.this
@@ -301,7 +301,7 @@ class Expression(metaclass=_Expression):
         """
         return deepcopy(self)
 
-    def add_comments(self, comments: t.Optional[t.List[str]] = None) -> None:
+    def add_comments(self, comments: t.Optional[t.List[str]] = None, prepend: bool = False) -> None:
         if self.comments is None:
             self.comments = []
 
@@ -313,8 +313,13 @@ class Expression(metaclass=_Expression):
                     k, *v = kv.split("=")
                     value = v[0].strip() if v else True
                     self.meta[k.strip()] = value
 
+            if not prepend:
                 self.comments.append(comment)
 
+        if prepend:
+            self.comments = comments + self.comments
+
     def pop_comments(self) -> t.List[str]:
         comments = self.comments or []
         self.comments = None
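The new `prepend` flag lets callers attach leading comments before any comments already on a node. A minimal sketch of the semantics, using a hypothetical `Node` class rather than sqlglot's actual `Expression` (which also parses `SQLGLOT_META` annotations):

```python
class Node:
    def __init__(self):
        self.comments = None

    def add_comments(self, comments=None, prepend=False):
        if self.comments is None:
            self.comments = []
        if prepend:
            # Leading comments go in front of whatever is already attached
            self.comments = (comments or []) + self.comments
        else:
            self.comments.extend(comments or [])

n = Node()
n.add_comments(["trailing"])
n.add_comments(["leading"], prepend=True)
print(n.comments)
```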
@@ -5455,6 +5460,10 @@ class ConcatWs(Concat):
     _sql_names = ["CONCAT_WS"]
 
 
+class Contains(Func):
+    arg_types = {"this": True, "expression": True}
+
+
 # https://docs.oracle.com/cd/B13789_01/server.101/b10759/operators004.htm#i1035022
 class ConnectByRoot(Func):
     pass
 
@@ -5584,6 +5593,17 @@ class MonthsBetween(Func):
     arg_types = {"this": True, "expression": True, "roundoff": False}
 
 
+class MakeInterval(Func):
+    arg_types = {
+        "year": False,
+        "month": False,
+        "day": False,
+        "hour": False,
+        "minute": False,
+        "second": False,
+    }
+
+
 class LastDay(Func, TimeUnit):
     _sql_names = ["LAST_DAY", "LAST_DAY_OF_MONTH"]
     arg_types = {"this": True, "unit": False}
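Every component of the new `MakeInterval` expression is optional (`False` in `arg_types` means "not required"). An illustrative sketch of rendering such a node in PostgreSQL's named-argument style; `make_interval_sql` is a hypothetical helper, not sqlglot's generator method:

```python
def make_interval_sql(**parts):
    # Emit only the components that were supplied, in canonical order
    order = ["year", "month", "day", "hour", "minute", "second"]
    args = ", ".join(f"{k} => {parts[k]}" for k in order if k in parts)
    return f"MAKE_INTERVAL({args})"

print(make_interval_sql(day=3, hour=4))
```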
@@ -5812,6 +5832,11 @@ class IsNan(Func):
     _sql_names = ["IS_NAN", "ISNAN"]
 
 
+# https://cloud.google.com/bigquery/docs/reference/standard-sql/json_functions#int64_for_json
+class Int64(Func):
+    pass
+
+
 class IsInf(Func):
     _sql_names = ["IS_INF", "ISINF"]
@@ -6304,7 +6329,7 @@ class Round(Func):
 
 
 class RowNumber(Func):
-    arg_types: t.Dict[str, t.Any] = {}
+    arg_types = {"this": False}
 
 
 class SafeDivide(Func):
@@ -6490,6 +6515,10 @@ class TsOrDsToDate(Func):
     arg_types = {"this": True, "format": False, "safe": False}
 
 
+class TsOrDsToDatetime(Func):
+    pass
+
+
 class TsOrDsToTime(Func):
     pass
@@ -6947,7 +6976,15 @@ def _combine(
     return this
 
 
-def _wrap(expression: E, kind: t.Type[Expression]) -> E | Paren:
+@t.overload
+def _wrap(expression: None, kind: t.Type[Expression]) -> None: ...
+
+
+@t.overload
+def _wrap(expression: E, kind: t.Type[Expression]) -> E | Paren: ...
+
+
+def _wrap(expression: t.Optional[E], kind: t.Type[Expression]) -> t.Optional[E] | Paren:
     return Paren(this=expression) if isinstance(expression, kind) else expression
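The `_wrap` change uses `typing.overload` so type checkers know that `None` in means `None` out, while a real expression yields a possibly-parenthesized expression. A self-contained sketch of the same pattern (`wrap` is a stand-in name, not sqlglot's `_wrap`):

```python
import typing as t

@t.overload
def wrap(x: None) -> None: ...
@t.overload
def wrap(x: int) -> str: ...
def wrap(x: t.Optional[int]) -> t.Optional[str]:
    # The overloads above only refine static types; this single runtime
    # implementation handles both cases.
    return None if x is None else f"({x})"

print(wrap(3))
```

At runtime only the final definition exists; the `...` bodies are never called.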
@@ -7793,8 +7830,8 @@ def cast(
         existing_cast_type: DataType.Type = expr.to.this
         new_cast_type: DataType.Type = data_type.this
         types_are_equivalent = type_mapping.get(
-            existing_cast_type, existing_cast_type
-        ) == type_mapping.get(new_cast_type, new_cast_type)
+            existing_cast_type, existing_cast_type.value
+        ) == type_mapping.get(new_cast_type, new_cast_type.value)
         if expr.is_type(data_type) or types_are_equivalent:
             return expr
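The `.value` fallback matters because a mapped type resolves to a string while an unmapped enum member would not, so the two sides of the comparison could never match. A minimal sketch with a toy `Enum` (the type names and mapping here are illustrative, not sqlglot's real `DataType.Type` table):

```python
from enum import Enum

class Type(Enum):
    TEXT = "TEXT"
    VARCHAR = "VARCHAR"
    CHAR = "CHAR"

# Hypothetical normalization table: VARCHAR is treated as TEXT
type_mapping = {Type.VARCHAR: "TEXT"}

def types_are_equivalent(a, b):
    # Falling back to .value keeps both sides of the comparison in string
    # space; falling back to the enum member itself would never compare
    # equal to a mapped string value.
    return type_mapping.get(a, a.value) == type_mapping.get(b, b.value)

print(types_are_equivalent(Type.VARCHAR, Type.TEXT))
```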
@@ -148,6 +148,7 @@ class Generator(metaclass=_Generator):
         exp.InputModelProperty: lambda self, e: f"INPUT{self.sql(e, 'this')}",
         exp.Intersect: lambda self, e: self.set_operations(e),
         exp.IntervalSpan: lambda self, e: f"{self.sql(e, 'this')} TO {self.sql(e, 'expression')}",
+        exp.Int64: lambda self, e: self.sql(exp.cast(e.this, exp.DataType.Type.BIGINT)),
         exp.LanguageProperty: lambda self, e: self.naked_property(e),
         exp.LocationProperty: lambda self, e: self.naked_property(e),
         exp.LogProperty: lambda _, e: f"{'NO ' if e.args.get('no') else ''}LOG",
@@ -593,6 +594,7 @@ class Generator(metaclass=_Generator):
     WITH_SEPARATED_COMMENTS: t.Tuple[t.Type[exp.Expression], ...] = (
         exp.Command,
         exp.Create,
+        exp.Describe,
         exp.Delete,
         exp.Drop,
         exp.From,
@@ -3896,7 +3898,14 @@ class Generator(metaclass=_Generator):
         if isinstance(this, exp.TsOrDsToTimestamp) or this.is_type(exp.DataType.Type.TIMESTAMP):
             return self.sql(this)
 
-        return self.sql(exp.cast(this, exp.DataType.Type.TIMESTAMP))
+        return self.sql(exp.cast(this, exp.DataType.Type.TIMESTAMP, dialect=self.dialect))
+
+    def tsordstodatetime_sql(self, expression: exp.TsOrDsToDatetime) -> str:
+        this = expression.this
+        if isinstance(this, exp.TsOrDsToDatetime) or this.is_type(exp.DataType.Type.DATETIME):
+            return self.sql(this)
+
+        return self.sql(exp.cast(this, exp.DataType.Type.DATETIME, dialect=self.dialect))
 
     def tsordstodate_sql(self, expression: exp.TsOrDsToDate) -> str:
         this = expression.this
@@ -806,7 +806,7 @@ class Parser(metaclass=_Parser):
             kind=self._parse_var_from_options(self.USABLES, raise_unmatched=False),
             this=self._parse_table(schema=False),
         ),
-        TokenType.SEMICOLON: lambda self: self.expression(exp.Semicolon),
+        TokenType.SEMICOLON: lambda self: exp.Semicolon(),
     }
 
     UNARY_PARSERS = {
@@ -1715,7 +1715,10 @@ class Parser(metaclass=_Parser):
             return None
 
         if self._match_set(self.STATEMENT_PARSERS):
-            return self.STATEMENT_PARSERS[self._prev.token_type](self)
+            comments = self._prev_comments
+            stmt = self.STATEMENT_PARSERS[self._prev.token_type](self)
+            stmt.add_comments(comments, prepend=True)
+            return stmt
 
         if self._match_set(self.dialect.tokenizer.COMMANDS):
             return self._parse_command()
@@ -1735,7 +1738,11 @@ class Parser(metaclass=_Parser):
 
         concurrently = self._match_text_seq("CONCURRENTLY")
         if_exists = exists or self._parse_exists()
-        table = self._parse_table_parts(
-            schema=True, is_db_reference=self._prev.token_type == TokenType.SCHEMA
-        )
+
+        if kind == "COLUMN":
+            this = self._parse_column()
+        else:
+            this = self._parse_table_parts(
+                schema=True, is_db_reference=self._prev.token_type == TokenType.SCHEMA
+            )
 
@@ -1748,9 +1755,8 @@ class Parser(metaclass=_Parser):
 
         return self.expression(
             exp.Drop,
-            comments=start.comments,
             exists=if_exists,
-            this=table,
+            this=this,
             expressions=expressions,
             kind=self.dialect.CREATABLE_KIND_MAPPING.get(kind) or kind,
             temporary=temporary,
|
||||||
def _parse_create(self) -> exp.Create | exp.Command:
|
def _parse_create(self) -> exp.Create | exp.Command:
|
||||||
# Note: this can't be None because we've matched a statement parser
|
# Note: this can't be None because we've matched a statement parser
|
||||||
start = self._prev
|
start = self._prev
|
||||||
comments = self._prev_comments
|
|
||||||
|
|
||||||
replace = (
|
replace = (
|
||||||
start.token_type == TokenType.REPLACE
|
start.token_type == TokenType.REPLACE
|
||||||
|
@ -1919,7 +1924,6 @@ class Parser(metaclass=_Parser):
|
||||||
create_kind_text = create_token.text.upper()
|
create_kind_text = create_token.text.upper()
|
||||||
return self.expression(
|
return self.expression(
|
||||||
exp.Create,
|
exp.Create,
|
||||||
comments=comments,
|
|
||||||
this=this,
|
this=this,
|
||||||
kind=self.dialect.CREATABLE_KIND_MAPPING.get(create_kind_text) or create_kind_text,
|
kind=self.dialect.CREATABLE_KIND_MAPPING.get(create_kind_text) or create_kind_text,
|
||||||
replace=replace,
|
replace=replace,
|
||||||
|
@@ -2659,7 +2663,7 @@ class Parser(metaclass=_Parser):
         )
 
     def _parse_insert(self) -> t.Union[exp.Insert, exp.MultitableInserts]:
-        comments = ensure_list(self._prev_comments)
+        comments = []
         hint = self._parse_hint()
         overwrite = self._match(TokenType.OVERWRITE)
         ignore = self._match(TokenType.IGNORE)
@@ -2845,7 +2849,6 @@ class Parser(metaclass=_Parser):
         # This handles MySQL's "Multiple-Table Syntax"
         # https://dev.mysql.com/doc/refman/8.0/en/delete.html
         tables = None
-        comments = self._prev_comments
         if not self._match(TokenType.FROM, advance=False):
             tables = self._parse_csv(self._parse_table) or None
 
@@ -2853,7 +2856,6 @@ class Parser(metaclass=_Parser):
 
         return self.expression(
             exp.Delete,
-            comments=comments,
             tables=tables,
             this=self._match(TokenType.FROM) and self._parse_table(joins=True),
             using=self._match(TokenType.USING) and self._parse_table(joins=True),
 
@@ -2864,13 +2866,11 @@ class Parser(metaclass=_Parser):
         )
 
     def _parse_update(self) -> exp.Update:
-        comments = self._prev_comments
         this = self._parse_table(joins=True, alias_tokens=self.UPDATE_ALIAS_TOKENS)
         expressions = self._match(TokenType.SET) and self._parse_csv(self._parse_equality)
         returning = self._parse_returning()
         return self.expression(
             exp.Update,
-            comments=comments,
             **{  # type: ignore
                 "this": this,
                 "expressions": expressions,
@@ -179,27 +179,42 @@ def eliminate_distinct_on(expression: exp.Expression) -> exp.Expression:
     if (
         isinstance(expression, exp.Select)
         and expression.args.get("distinct")
-        and expression.args["distinct"].args.get("on")
-        and isinstance(expression.args["distinct"].args["on"], exp.Tuple)
+        and isinstance(expression.args["distinct"].args.get("on"), exp.Tuple)
     ):
-        distinct_cols = expression.args["distinct"].pop().args["on"].expressions
-        outer_selects = expression.selects
-        row_number = find_new_name(expression.named_selects, "_row_number")
-        window = exp.Window(this=exp.RowNumber(), partition_by=distinct_cols)
-        order = expression.args.get("order")
+        row_number_window_alias = find_new_name(expression.named_selects, "_row_number")
+
+        distinct_cols = expression.args["distinct"].pop().args["on"].expressions
+        window = exp.Window(this=exp.RowNumber(), partition_by=distinct_cols)
+
+        order = expression.args.get("order")
         if order:
             window.set("order", order.pop())
         else:
             window.set("order", exp.Order(expressions=[c.copy() for c in distinct_cols]))
 
-        window = exp.alias_(window, row_number)
+        window = exp.alias_(window, row_number_window_alias)
         expression.select(window, copy=False)
 
+        # We add aliases to the projections so that we can safely reference them in the outer query
+        new_selects = []
+        taken_names = {row_number_window_alias}
+        for select in expression.selects[:-1]:
+            if select.is_star:
+                new_selects = [exp.Star()]
+                break
+
+            if not isinstance(select, exp.Alias):
+                alias = find_new_name(taken_names, select.output_name or "_col")
+                quoted = select.this.args.get("quoted") if isinstance(select, exp.Column) else None
+                select = select.replace(exp.alias_(select, alias, quoted=quoted))
+
+            taken_names.add(select.output_name)
+            new_selects.append(select.args["alias"])
+
         return (
-            exp.select(*outer_selects, copy=False)
+            exp.select(*new_selects, copy=False)
             .from_(expression.subquery("_t", copy=False), copy=False)
-            .where(exp.column(row_number).eq(1), copy=False)
+            .where(exp.column(row_number_window_alias).eq(1), copy=False)
         )
 
     return expression
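Conceptually, this transform rewrites a PostgreSQL-style `SELECT DISTINCT ON (a) a, b FROM t ORDER BY a, b` into a `ROW_NUMBER()` subquery filtered on `_row_number = 1`, picking a window alias that cannot collide with existing projections. A self-contained sketch of a collision-avoiding name helper in the spirit of `find_new_name` (the exact suffix scheme here is an assumption, not sqlglot's implementation):

```python
def find_new_name(taken, base):
    # Return ``base`` if it's free, otherwise append _2, _3, ... until unique
    if base not in taken:
        return base
    i = 2
    candidate = f"{base}_{i}"
    while candidate in taken:
        i += 1
        candidate = f"{base}_{i}"
    return candidate

print(find_new_name({"a", "b"}, "_row_number"))
```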
163 sqlglotrs/Cargo.lock generated
@@ -8,12 +8,6 @@ version = "1.1.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "d468802bab17cbc0cc575e9b053f41e72aa36bfa6b7f55e3529ffa43161b97fa"
 
-[[package]]
-name = "bitflags"
-version = "1.3.2"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a"
-
 [[package]]
 name = "cfg-if"
 version = "1.0.0"
 
@@ -22,9 +16,9 @@ checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd"
 
 [[package]]
 name = "heck"
-version = "0.4.1"
+version = "0.5.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "95505c38b4572b2d910cecb0281560f54b440a19336cbbcb27bf6ce6adc6f5a8"
+checksum = "2304e00983f87ffb38b55b444b5e3b60a884b5d30c0fca7d82fe33449bbe55ea"
 
 [[package]]
 name = "indoc"
 
@@ -38,16 +32,6 @@ version = "0.2.150"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "89d92a4743f9a61002fae18374ed11e7973f530cb3a3255fb354818118b2203c"
 
-[[package]]
-name = "lock_api"
-version = "0.4.11"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "3c168f8615b12bc01f9c17e2eb0cc07dcae1940121185446edc3744920e8ef45"
-dependencies = [
- "autocfg",
- "scopeguard",
-]
-
 [[package]]
 name = "memoffset"
 version = "0.9.0"
 
@@ -64,48 +48,32 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "3fdb12b2476b595f9358c5161aa467c2438859caa136dec86c26fdd2efe17b92"
 
 [[package]]
-name = "parking_lot"
-version = "0.12.1"
+name = "portable-atomic"
+version = "1.9.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "3742b2c103b9f06bc9fff0a37ff4912935851bee6d36f3c02bcc755bcfec228f"
-dependencies = [
- "lock_api",
- "parking_lot_core",
-]
-
-[[package]]
-name = "parking_lot_core"
-version = "0.9.9"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "4c42a9226546d68acdd9c0a280d17ce19bfe27a46bf68784e4066115788d008e"
-dependencies = [
- "cfg-if",
- "libc",
- "redox_syscall",
- "smallvec",
- "windows-targets",
-]
+checksum = "cc9c68a3f6da06753e9335d63e27f6b9754dd1920d941135b7ea8224f141adb2"
 
 [[package]]
 name = "proc-macro2"
-version = "1.0.70"
+version = "1.0.89"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "39278fbbf5fb4f646ce651690877f89d1c5811a3d4acb27700c1cb3cdb78fd3b"
+checksum = "f139b0662de085916d1fb67d2b4169d1addddda1919e696f3252b740b629986e"
 dependencies = [
  "unicode-ident",
 ]
 
 [[package]]
 name = "pyo3"
-version = "0.20.0"
+version = "0.22.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "04e8453b658fe480c3e70c8ed4e3d3ec33eb74988bd186561b0cc66b85c3bc4b"
+checksum = "f402062616ab18202ae8319da13fa4279883a2b8a9d9f83f20dbade813ce1884"
 dependencies = [
  "cfg-if",
  "indoc",
  "libc",
  "memoffset",
- "parking_lot",
+ "once_cell",
+ "portable-atomic",
  "pyo3-build-config",
  "pyo3-ffi",
  "pyo3-macros",
 
@@ -114,9 +82,9 @@ dependencies = [
 
 [[package]]
 name = "pyo3-build-config"
-version = "0.20.0"
+version = "0.22.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a96fe70b176a89cff78f2fa7b3c930081e163d5379b4dcdf993e3ae29ca662e5"
+checksum = "b14b5775b5ff446dd1056212d778012cbe8a0fbffd368029fd9e25b514479c38"
 dependencies = [
  "once_cell",
  "target-lexicon",
 
@@ -124,9 +92,9 @@ dependencies = [
 
 [[package]]
 name = "pyo3-ffi"
-version = "0.20.0"
+version = "0.22.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "214929900fd25e6604661ed9cf349727c8920d47deff196c4e28165a6ef2a96b"
+checksum = "9ab5bcf04a2cdcbb50c7d6105de943f543f9ed92af55818fd17b660390fc8636"
 dependencies = [
  "libc",
  "pyo3-build-config",
 
@@ -134,9 +102,9 @@ dependencies = [
 
 [[package]]
 name = "pyo3-macros"
-version = "0.20.0"
+version = "0.22.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "dac53072f717aa1bfa4db832b39de8c875b7c7af4f4a6fe93cdbf9264cf8383b"
+checksum = "0fd24d897903a9e6d80b968368a34e1525aeb719d568dba8b3d4bfa5dc67d453"
 dependencies = [
  "proc-macro2",
  "pyo3-macros-backend",
 
@@ -146,58 +114,38 @@ dependencies = [
 
 [[package]]
 name = "pyo3-macros-backend"
-version = "0.20.0"
+version = "0.22.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7774b5a8282bd4f25f803b1f0d945120be959a36c72e08e7cd031c792fdfd424"
+checksum = "36c011a03ba1e50152b4b394b479826cad97e7a21eb52df179cd91ac411cbfbe"
 dependencies = [
  "heck",
  "proc-macro2",
+ "pyo3-build-config",
  "quote",
  "syn",
 ]
 
 [[package]]
 name = "quote"
-version = "1.0.33"
+version = "1.0.37"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "5267fca4496028628a95160fc423a33e8b2e6af8a5302579e322e4b520293cae"
+checksum = "b5b9d34b8991d19d98081b46eacdd8eb58c6f2b201139f7c5f643cc155a633af"
 dependencies = [
  "proc-macro2",
 ]
 
-[[package]]
-name = "redox_syscall"
-version = "0.4.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "4722d768eff46b75989dd134e5c353f0d6296e5aaa3132e776cbdb56be7731aa"
-dependencies = [
- "bitflags",
-]
-
-[[package]]
-name = "scopeguard"
-version = "1.2.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
-
-[[package]]
-name = "smallvec"
-version = "1.11.2"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "4dccd0940a2dcdf68d092b8cbab7dc0ad8fa938bf95787e1b916b0e3d0e8e970"
-
 [[package]]
 name = "sqlglotrs"
-version = "0.2.13"
+version = "0.2.14"
 dependencies = [
  "pyo3",
 ]
 
 [[package]]
 name = "syn"
-version = "2.0.41"
+version = "2.0.87"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "44c8b28c477cc3bf0e7966561e3460130e1255f7a1cf71931075f1c5e7a7e269"
+checksum = "25aa4ce346d03a6dcd68dd8b4010bcb74e54e62c90c573f394c46eae99aba32d"
 dependencies = [
  "proc-macro2",
  "quote",
 
@@ -206,9 +154,9 @@ dependencies = [
 
 [[package]]
 name = "target-lexicon"
-version = "0.12.12"
+version = "0.12.16"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "14c39fd04924ca3a864207c66fc2cd7d22d7c016007f9ce846cbb9326331930a"
+checksum = "61c41af27dd6d1e27b1b16b489db798443478cef1f06a660c96db617ba5de3b1"
 
 [[package]]
 name = "unicode-ident"
 
@@ -221,60 +169,3 @@ name = "unindent"
 version = "0.2.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "c7de7d73e1754487cb58364ee906a499937a0dfabd86bcb980fa99ec8c8fa2ce"
-
-[[package]]
-name = "windows-targets"
-version = "0.48.5"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "9a2fa6e2155d7247be68c096456083145c183cbbbc2764150dda45a87197940c"
-dependencies = [
- "windows_aarch64_gnullvm",
- "windows_aarch64_msvc",
- "windows_i686_gnu",
- "windows_i686_msvc",
- "windows_x86_64_gnu",
- "windows_x86_64_gnullvm",
- "windows_x86_64_msvc",
-]
-
-[[package]]
-name = "windows_aarch64_gnullvm"
-version = "0.48.5"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "2b38e32f0abccf9987a4e3079dfb67dcd799fb61361e53e2882c3cbaf0d905d8"
-
-[[package]]
-name = "windows_aarch64_msvc"
-version = "0.48.5"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "dc35310971f3b2dbbf3f0690a219f40e2d9afcf64f9ab7cc1be722937c26b4bc"
-
-[[package]]
-name = "windows_i686_gnu"
-version = "0.48.5"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a75915e7def60c94dcef72200b9a8e58e5091744960da64ec734a6c6e9b3743e"
-
-[[package]]
-name = "windows_i686_msvc"
-version = "0.48.5"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "8f55c233f70c4b27f66c523580f78f1004e8b5a8b659e05a4eb49d4166cca406"
-
-[[package]]
-name = "windows_x86_64_gnu"
-version = "0.48.5"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "53d40abd2583d23e4718fddf1ebec84dbff8381c07cae67ff7768bbf19c6718e"
-
-[[package]]
-name = "windows_x86_64_gnullvm"
-version = "0.48.5"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "0b7b52767868a23d5bab768e390dc5f5c55825b6d30b86c844ff2dc7414044cc"
-
-[[package]]
-name = "windows_x86_64_msvc"
-version = "0.48.5"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538"
@@ -1,6 +1,6 @@
 [package]
 name = "sqlglotrs"
-version = "0.2.13"
+version = "0.2.14"
 edition = "2021"
 license = "MIT"
 
@@ -9,4 +9,4 @@ name = "sqlglotrs"
 crate-type = ["cdylib"]
 
 [dependencies]
-pyo3 = "0.20.0"
+pyo3 = "0.22.6"
@@ -43,19 +43,19 @@ impl Token {
     ) -> Token {
         Python::with_gil(|py| Token {
             token_type,
-            token_type_py: PyNone::get(py).into(),
+            token_type_py: PyNone::get_bound(py).into_py(py),
-            text: PyString::new(py, &text).into(),
+            text: PyString::new_bound(py, &text).into_py(py),
             line,
             col,
             start,
             end,
-            comments: PyList::new(py, &comments).into(),
+            comments: PyList::new_bound(py, &comments).into(),
         })
     }
 
     pub fn append_comments(&self, comments: &mut Vec<String>) {
         Python::with_gil(|py| {
-            let pylist = self.comments.as_ref(py);
+            let pylist = self.comments.bind(py);
             for comment in comments.iter() {
                 if let Err(_) = pylist.append(comment) {
                     panic!("Failed to append comments to the Python list");
 
@@ -74,20 +74,20 @@ impl Token {
         Python::with_gil(|py| {
             Ok(format!(
                 "<Token token_type: {}, text: {}, line: {}, col: {}, start: {}, end: {}, comments: {}>",
-                self.token_type_py.as_ref(py).repr()?,
+                self.token_type_py.bind(py).repr()?,
-                self.text.as_ref(py).repr()?,
+                self.text.bind(py).repr()?,
                 self.line,
                 self.col,
                 self.start,
                 self.end,
-                self.comments.as_ref(py).repr()?,
+                self.comments.bind(py).repr()?,
             ))
         })
     }
 }
 
 #[pymodule]
-fn sqlglotrs(_py: Python<'_>, m: &PyModule) -> PyResult<()> {
+fn sqlglotrs(m: &Bound<'_, PyModule>) -> PyResult<()> {
     m.add_class::<Token>()?;
     m.add_class::<TokenTypeSettings>()?;
     m.add_class::<TokenizerSettings>()?;
@@ -557,27 +557,6 @@ LANGUAGE js AS
                 "tsql": "SELECT CAST('2008-12-25 15:30:00' AS TIME)",
             },
         )
-        self.validate_all(
-            "SELECT FORMAT_DATE('%Y%m%d', '2023-12-25')",
-            write={
-                "bigquery": "SELECT FORMAT_DATE('%Y%m%d', '2023-12-25')",
-                "duckdb": "SELECT STRFTIME(CAST('2023-12-25' AS DATE), '%Y%m%d')",
-            },
-        )
-        self.validate_all(
-            "SELECT FORMAT_DATETIME('%Y%m%d %H:%M:%S', DATETIME '2023-12-25 15:30:00')",
-            write={
-                "bigquery": "SELECT FORMAT_DATETIME('%Y%m%d %H:%M:%S', CAST('2023-12-25 15:30:00' AS DATETIME))",
-                "duckdb": "SELECT STRFTIME(CAST('2023-12-25 15:30:00' AS TIMESTAMP), '%Y%m%d %H:%M:%S')",
-            },
-        )
-        self.validate_all(
-            "SELECT FORMAT_DATETIME('%x', '2023-12-25 15:30:00')",
-            write={
-                "bigquery": "SELECT FORMAT_DATETIME('%x', '2023-12-25 15:30:00')",
-                "duckdb": "SELECT STRFTIME(CAST('2023-12-25 15:30:00' AS TIMESTAMP), '%x')",
-            },
-        )
         self.validate_all(
             "SELECT COUNTIF(x)",
             read={
@@ -685,7 +664,7 @@ LANGUAGE js AS
             write={
                 "bigquery": "SELECT DATETIME_ADD('2023-01-01T00:00:00', INTERVAL '1' MILLISECOND)",
                 "databricks": "SELECT TIMESTAMPADD(MILLISECOND, '1', '2023-01-01T00:00:00')",
-                "duckdb": "SELECT CAST('2023-01-01T00:00:00' AS DATETIME) + INTERVAL '1' MILLISECOND",
+                "duckdb": "SELECT CAST('2023-01-01T00:00:00' AS TIMESTAMP) + INTERVAL '1' MILLISECOND",
                 "snowflake": "SELECT TIMESTAMPADD(MILLISECOND, '1', '2023-01-01T00:00:00')",
             },
         ),
@@ -696,7 +675,7 @@ LANGUAGE js AS
             write={
                 "bigquery": "SELECT DATETIME_SUB('2023-01-01T00:00:00', INTERVAL '1' MILLISECOND)",
                 "databricks": "SELECT TIMESTAMPADD(MILLISECOND, '1' * -1, '2023-01-01T00:00:00')",
-                "duckdb": "SELECT CAST('2023-01-01T00:00:00' AS DATETIME) - INTERVAL '1' MILLISECOND",
+                "duckdb": "SELECT CAST('2023-01-01T00:00:00' AS TIMESTAMP) - INTERVAL '1' MILLISECOND",
             },
         ),
     )
@@ -706,7 +685,7 @@ LANGUAGE js AS
             write={
                 "bigquery": "SELECT DATETIME_TRUNC('2023-01-01T01:01:01', HOUR)",
                 "databricks": "SELECT DATE_TRUNC('HOUR', '2023-01-01T01:01:01')",
-                "duckdb": "SELECT DATE_TRUNC('HOUR', CAST('2023-01-01T01:01:01' AS DATETIME))",
+                "duckdb": "SELECT DATE_TRUNC('HOUR', CAST('2023-01-01T01:01:01' AS TIMESTAMP))",
             },
         ),
     )
@@ -1611,6 +1590,55 @@ WHERE
                 "snowflake": """SELECT TRANSFORM(GET_PATH(PARSE_JSON('{"arr": [1, "a"]}'), 'arr'), x -> CAST(x AS VARCHAR))""",
             },
         )
+        self.validate_all(
+            "SELECT STRPOS('foo@example.com', '@')",
+            write={
+                "bigquery": "SELECT STRPOS('foo@example.com', '@')",
+                "duckdb": "SELECT STRPOS('foo@example.com', '@')",
+                "snowflake": "SELECT POSITION('@', 'foo@example.com')",
+            },
+        )
+        self.validate_all(
+            "SELECT ts + MAKE_INTERVAL(1, 2, minute => 5, day => 3)",
+            write={
+                "bigquery": "SELECT ts + MAKE_INTERVAL(1, 2, day => 3, minute => 5)",
+                "duckdb": "SELECT ts + INTERVAL '1 year 2 month 5 minute 3 day'",
+                "snowflake": "SELECT ts + INTERVAL '1 year, 2 month, 5 minute, 3 day'",
+            },
+        )
+        self.validate_all(
+            """SELECT INT64(JSON_QUERY(JSON '{"key": 2000}', '$.key'))""",
+            write={
+                "bigquery": """SELECT INT64(JSON_QUERY(PARSE_JSON('{"key": 2000}'), '$.key'))""",
+                "duckdb": """SELECT CAST(JSON('{"key": 2000}') -> '$.key' AS BIGINT)""",
+                "snowflake": """SELECT CAST(GET_PATH(PARSE_JSON('{"key": 2000}'), 'key') AS BIGINT)""",
+            },
+        )
+
+        self.validate_identity(
+            "CONTAINS_SUBSTRING(a, b, json_scope => 'JSON_KEYS_AND_VALUES')"
+        ).assert_is(exp.Anonymous)
+
+        self.validate_all(
+            """CONTAINS_SUBSTRING(a, b)""",
+            read={
+                "": "CONTAINS(a, b)",
+                "spark": "CONTAINS(a, b)",
+                "databricks": "CONTAINS(a, b)",
+                "snowflake": "CONTAINS(a, b)",
+                "duckdb": "CONTAINS(a, b)",
+                "oracle": "CONTAINS(a, b)",
+            },
+            write={
+                "": "CONTAINS(LOWER(a), LOWER(b))",
+                "spark": "CONTAINS(LOWER(a), LOWER(b))",
+                "databricks": "CONTAINS(LOWER(a), LOWER(b))",
+                "snowflake": "CONTAINS(LOWER(a), LOWER(b))",
+                "duckdb": "CONTAINS(LOWER(a), LOWER(b))",
+                "oracle": "CONTAINS(LOWER(a), LOWER(b))",
+                "bigquery": "CONTAINS_SUBSTRING(a, b)",
+            },
+        )
 
     def test_errors(self):
         with self.assertRaises(TokenError):
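The new STRPOS test above rests on the 1-based substring-position contract shared by BigQuery and DuckDB (Snowflake's POSITION takes the needle first, hence the swapped arguments). A minimal pure-Python sketch of that contract, using the same strings as the test case:

```python
def strpos(haystack: str, needle: str) -> int:
    """1-based position of needle in haystack; 0 when absent,
    matching BigQuery/DuckDB STRPOS semantics."""
    return haystack.find(needle) + 1

print(strpos("foo@example.com", "@"))  # 4  ('@' is the 4th character)
print(strpos("foo@example.com", "#"))  # 0  (not found)
```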
@@ -2213,3 +2241,34 @@ OPTIONS (
                 "databricks": "REGEXP_EXTRACT_ALL('a1_a2a3_a4A5a6', '(a)[0-9]')",
             },
         )
+
+    def test_format_temporal(self):
+        self.validate_all(
+            "SELECT FORMAT_DATE('%Y%m%d', '2023-12-25')",
+            write={
+                "bigquery": "SELECT FORMAT_DATE('%Y%m%d', '2023-12-25')",
+                "duckdb": "SELECT STRFTIME(CAST('2023-12-25' AS DATE), '%Y%m%d')",
+            },
+        )
+        self.validate_all(
+            "SELECT FORMAT_DATETIME('%Y%m%d %H:%M:%S', DATETIME '2023-12-25 15:30:00')",
+            write={
+                "bigquery": "SELECT FORMAT_DATETIME('%Y%m%d %H:%M:%S', CAST('2023-12-25 15:30:00' AS DATETIME))",
+                "duckdb": "SELECT STRFTIME(CAST('2023-12-25 15:30:00' AS TIMESTAMP), '%Y%m%d %H:%M:%S')",
+            },
+        )
+        self.validate_all(
+            "SELECT FORMAT_DATETIME('%x', '2023-12-25 15:30:00')",
+            write={
+                "bigquery": "SELECT FORMAT_DATETIME('%x', '2023-12-25 15:30:00')",
+                "duckdb": "SELECT STRFTIME(CAST('2023-12-25 15:30:00' AS TIMESTAMP), '%x')",
+            },
+        )
+        self.validate_all(
+            """SELECT FORMAT_TIMESTAMP("%b-%d-%Y", TIMESTAMP "2050-12-25 15:30:55+00")""",
+            write={
+                "bigquery": "SELECT FORMAT_TIMESTAMP('%b-%d-%Y', CAST('2050-12-25 15:30:55+00' AS TIMESTAMP))",
+                "duckdb": "SELECT STRFTIME(CAST(CAST('2050-12-25 15:30:55+00' AS TIMESTAMPTZ) AS TIMESTAMP), '%b-%d-%Y')",
+                "snowflake": "SELECT TO_CHAR(CAST(CAST('2050-12-25 15:30:55+00' AS TIMESTAMPTZ) AS TIMESTAMP), 'mon-DD-yyyy')",
+            },
+        )
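The DuckDB STRFTIME outputs in the hunk above use C-style strftime directives; Python's `datetime.strftime` accepts the same codes, so the expected strings can be sanity-checked locally (a convenience check, not part of the test suite):

```python
from datetime import datetime

# Same timestamp as the FORMAT_DATETIME test cases above.
dt = datetime(2023, 12, 25, 15, 30, 0)
print(dt.strftime("%Y%m%d"))           # 20231225
print(dt.strftime("%Y%m%d %H:%M:%S"))  # 20231225 15:30:00
```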
@@ -251,7 +251,7 @@ class TestClickhouse(Validator):
            },
            write={
                "clickhouse": "SELECT CAST('2020-01-01' AS Nullable(DateTime)) + INTERVAL '500' MICROSECOND",
-               "duckdb": "SELECT CAST('2020-01-01' AS DATETIME) + INTERVAL '500' MICROSECOND",
+               "duckdb": "SELECT CAST('2020-01-01' AS TIMESTAMP) + INTERVAL '500' MICROSECOND",
                "postgres": "SELECT CAST('2020-01-01' AS TIMESTAMP) + INTERVAL '500 MICROSECOND'",
            },
        )
@@ -1245,3 +1245,17 @@ LIFETIME(MIN 0 MAX 0)""",
         scopes = traverse_scope(parse_one(sql, dialect=self.dialect))
         self.assertEqual(len(scopes), 1)
         self.assertEqual(set(scopes[0].sources), {"t"})
+
+    def test_window_functions(self):
+        self.validate_identity(
+            "SELECT row_number(column1) OVER (PARTITION BY column2 ORDER BY column3) FROM table"
+        )
+        self.validate_identity(
+            "SELECT row_number() OVER (PARTITION BY column2 ORDER BY column3) FROM table"
+        )
+
+    def test_functions(self):
+        self.validate_identity("SELECT TRANSFORM(foo, [1, 2], ['first', 'second']) FROM table")
+        self.validate_identity(
+            "SELECT TRANSFORM(foo, [1, 2], ['first', 'second'], 'default') FROM table"
+        )
@@ -1688,6 +1688,7 @@ class TestDialect(Validator):
                 "duckdb": "STRPOS(haystack, needle)",
                 "postgres": "STRPOS(haystack, needle)",
                 "presto": "STRPOS(haystack, needle)",
+                "bigquery": "STRPOS(haystack, needle)",
                 "spark": "LOCATE(needle, haystack)",
                 "clickhouse": "position(haystack, needle)",
                 "snowflake": "POSITION(needle, haystack)",
@@ -382,6 +382,7 @@ class TestDuckDB(Validator):
         self.validate_identity(
             "ATTACH DATABASE ':memory:' AS new_database", check_command_warning=True
         )
+        self.validate_identity("DETACH DATABASE new_database", check_command_warning=True)
         self.validate_identity(
             "SELECT {'yes': 'duck', 'maybe': 'goose', 'huh': NULL, 'no': 'heron'}"
         )
@@ -81,6 +81,10 @@ class TestMySQL(Validator):
         self.validate_identity(
             "CREATE OR REPLACE VIEW my_view AS SELECT column1 AS `boo`, column2 AS `foo` FROM my_table WHERE column3 = 'some_value' UNION SELECT q.* FROM fruits_table, JSON_TABLE(Fruits, '$[*]' COLUMNS(id VARCHAR(255) PATH '$.$id', value VARCHAR(255) PATH '$.value')) AS q",
         )
+        self.validate_identity(
+            "/*left*/ EXPLAIN SELECT /*hint*/ col FROM t1 /*right*/",
+            "/* left */ DESCRIBE /* hint */ SELECT col FROM t1 /* right */",
+        )
         self.validate_identity(
             "CREATE TABLE t (name VARCHAR)",
             "CREATE TABLE t (name TEXT)",
@@ -228,21 +228,21 @@ class TestRedshift(Validator):
         self.validate_all(
             "SELECT DISTINCT ON (a) a, b FROM x ORDER BY c DESC",
             write={
-                "bigquery": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "databricks": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "drill": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "hive": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "mysql": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY CASE WHEN c IS NULL THEN 1 ELSE 0 END DESC, c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "oracle": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) _t WHERE _row_number = 1",
-                "presto": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "redshift": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "snowflake": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "spark": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "sqlite": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "starrocks": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY CASE WHEN c IS NULL THEN 1 ELSE 0 END DESC, c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "tableau": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "teradata": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
-                "trino": "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "bigquery": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "databricks": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "drill": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "hive": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "mysql": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY CASE WHEN c IS NULL THEN 1 ELSE 0 END DESC, c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "oracle": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) _t WHERE _row_number = 1",
+                "presto": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "redshift": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "snowflake": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "spark": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "sqlite": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "starrocks": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY CASE WHEN c IS NULL THEN 1 ELSE 0 END DESC, c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "tableau": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "teradata": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+                "trino": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC NULLS FIRST) AS _row_number FROM x) AS _t WHERE _row_number = 1",
                 "tsql": "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY CASE WHEN c IS NULL THEN 1 ELSE 0 END DESC, c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
             },
         )
@@ -605,6 +605,17 @@ WHERE
                 "duckdb": "CASE WHEN bar = 0 AND NOT foo IS NULL THEN 0 ELSE foo / bar END",
             },
         )
+        self.validate_all(
+            "DIV0(a - b, c - d)",
+            write={
+                "snowflake": "IFF((c - d) = 0 AND NOT (a - b) IS NULL, 0, (a - b) / (c - d))",
+                "sqlite": "IIF((c - d) = 0 AND NOT (a - b) IS NULL, 0, CAST((a - b) AS REAL) / (c - d))",
+                "presto": "IF((c - d) = 0 AND NOT (a - b) IS NULL, 0, CAST((a - b) AS DOUBLE) / (c - d))",
+                "spark": "IF((c - d) = 0 AND NOT (a - b) IS NULL, 0, (a - b) / (c - d))",
+                "hive": "IF((c - d) = 0 AND NOT (a - b) IS NULL, 0, (a - b) / (c - d))",
+                "duckdb": "CASE WHEN (c - d) = 0 AND NOT (a - b) IS NULL THEN 0 ELSE (a - b) / (c - d) END",
+            },
+        )
         self.validate_all(
             "ZEROIFNULL(foo)",
             write={
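The DIV0 transpilations above all encode the same rule: return 0 when the divisor is 0 and the dividend is not NULL, otherwise divide (propagating NULLs). A pure-Python sketch of that rule, with `None` standing in for SQL NULL; the NULL-divisor branch follows from the transpiled `IFF(...)` expression evaluating its condition to NULL:

```python
def div0(dividend, divisor):
    # Mirrors IFF(divisor = 0 AND NOT dividend IS NULL, 0, dividend / divisor):
    # NULL operands propagate; division by zero with a non-NULL dividend yields 0.
    if dividend is None or divisor is None:
        return None
    if divisor == 0:
        return 0
    return dividend / divisor

print(div0(10, 0))    # 0
print(div0(10, 4))    # 2.5
print(div0(None, 0))  # None
```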
@@ -39,6 +39,9 @@ class TestStarrocks(Validator):
         self.validate_identity(
             """SELECT CAST(PARSE_JSON(fieldvalue) -> '00000000-0000-0000-0000-00000000' AS VARCHAR) AS `code` FROM (SELECT '{"00000000-0000-0000-0000-00000000":"code01"}') AS t(fieldvalue)"""
         )
+        self.validate_identity(
+            "SELECT text FROM example_table", write_sql="SELECT `text` FROM example_table"
+        )
 
     def test_time(self):
         self.validate_identity("TIMESTAMP('2022-01-01')")
@@ -879,3 +879,8 @@ class TestParser(unittest.TestCase):
         expr = parse_one(sql)
         self.assertIsInstance(expr, exp.Insert)
         self.assertIsInstance(expr.expression.expressions[0].expressions[0], cls)
+
+    def test_drop_column(self):
+        ast = parse_one("ALTER TABLE tbl DROP COLUMN col")
+        self.assertEqual(len(list(ast.find_all(exp.Table))), 1)
+        self.assertEqual(len(list(ast.find_all(exp.Column))), 1)
@@ -55,17 +55,17 @@ class TestTransforms(unittest.TestCase):
         self.validate(
             eliminate_distinct_on,
             "SELECT DISTINCT ON (a) a, b FROM x ORDER BY c DESC",
-            "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+            "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
         )
         self.validate(
             eliminate_distinct_on,
             "SELECT DISTINCT ON (a) a, b FROM x",
-            "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY a) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+            "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY a) AS _row_number FROM x) AS _t WHERE _row_number = 1",
         )
         self.validate(
             eliminate_distinct_on,
             "SELECT DISTINCT ON (a, b) a, b FROM x ORDER BY c DESC",
-            "SELECT a, b FROM (SELECT a, b, ROW_NUMBER() OVER (PARTITION BY a, b ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+            "SELECT a, b FROM (SELECT a AS a, b AS b, ROW_NUMBER() OVER (PARTITION BY a, b ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
         )
         self.validate(
             eliminate_distinct_on,
@@ -75,7 +75,37 @@ class TestTransforms(unittest.TestCase):
         self.validate(
             eliminate_distinct_on,
             "SELECT DISTINCT ON (_row_number) _row_number FROM x ORDER BY c DESC",
-            "SELECT _row_number FROM (SELECT _row_number, ROW_NUMBER() OVER (PARTITION BY _row_number ORDER BY c DESC) AS _row_number_2 FROM x) AS _t WHERE _row_number_2 = 1",
+            "SELECT _row_number FROM (SELECT _row_number AS _row_number, ROW_NUMBER() OVER (PARTITION BY _row_number ORDER BY c DESC) AS _row_number_2 FROM x) AS _t WHERE _row_number_2 = 1",
+        )
+        self.validate(
+            eliminate_distinct_on,
+            "SELECT DISTINCT ON (x.a, x.b) x.a, x.b FROM x ORDER BY c DESC",
+            "SELECT a, b FROM (SELECT x.a AS a, x.b AS b, ROW_NUMBER() OVER (PARTITION BY x.a, x.b ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+        )
+        self.validate(
+            eliminate_distinct_on,
+            "SELECT DISTINCT ON (a) x.a, y.a FROM x CROSS JOIN y ORDER BY c DESC",
+            "SELECT a, a_2 FROM (SELECT x.a AS a, y.a AS a_2, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x CROSS JOIN y) AS _t WHERE _row_number = 1",
+        )
+        self.validate(
+            eliminate_distinct_on,
+            "SELECT DISTINCT ON (a) a, a + b FROM x ORDER BY c DESC",
+            "SELECT a, _col FROM (SELECT a AS a, a + b AS _col, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+        )
+        self.validate(
+            eliminate_distinct_on,
+            "SELECT DISTINCT ON (a) * FROM x ORDER BY c DESC",
+            "SELECT * FROM (SELECT *, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1",
+        )
+        self.validate(
+            eliminate_distinct_on,
+            'SELECT DISTINCT ON (a) a AS "A", b FROM x ORDER BY c DESC',
+            'SELECT "A", b FROM (SELECT a AS "A", b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1',
+        )
+        self.validate(
+            eliminate_distinct_on,
+            'SELECT DISTINCT ON (a) "A", b FROM x ORDER BY c DESC',
+            'SELECT "A", b FROM (SELECT "A" AS "A", b AS b, ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number FROM x) AS _t WHERE _row_number = 1',
         )
 
     def test_eliminate_qualify(self):
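The rewrite these tests pin down, DISTINCT ON becoming a ROW_NUMBER window in a subquery, can be exercised end-to-end on SQLite, which has window functions (since SQLite 3.25) but no DISTINCT ON. The table and data here are illustrative, not from the test suite:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE x (a INTEGER, b TEXT, c INTEGER)")
con.executemany(
    "INSERT INTO x VALUES (?, ?, ?)",
    [(1, "p", 3), (1, "q", 1), (2, "r", 2)],
)

# The eliminate_distinct_on output for
# "SELECT DISTINCT ON (a) a, b FROM x ORDER BY c DESC":
rows = con.execute(
    "SELECT a, b FROM ("
    "  SELECT a AS a, b AS b,"
    "         ROW_NUMBER() OVER (PARTITION BY a ORDER BY c DESC) AS _row_number"
    "  FROM x"
    ") AS _t WHERE _row_number = 1"
).fetchall()

# One row per distinct a, keeping the row with the highest c.
print(sorted(rows))  # [(1, 'p'), (2, 'r')]
```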