spark-sql
Code of Conduct
- I agree to follow this project's Code of Conduct
Search before asking
- I have searched in the issues and found no similar issues.
What would you like to be improved?
Ubuntu 22.04 is available and is the latest LTS release; we can migrate to it.
Is your feature request related to a problem? Please describe.
Today the user needs to deploy UDF jars and reference-data CSVs manually to the blob location.
Describe the solution you'd like
Enable the user to choose a file on local disk, which the web portal then uploads to the right location, as sketched below.
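A minimal sketch of what that upload step could do behind the scenes, assuming the blob location is exposed through a Hadoop-compatible filesystem (e.g. a wasbs:// or abfss:// URI); uploadToBlob and both of its parameters are hypothetical names, not an existing API:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Hypothetical helper: copy a file the user picked on local disk to the
// blob directory the job expects, so nothing has to be deployed by hand.
def uploadToBlob(localFile: String, blobDir: String): Unit = {
  val dst = new Path(blobDir)
  // Resolve the concrete filesystem implementation from the URI scheme.
  val fs = FileSystem.get(dst.toUri, new Configuration())
  fs.copyFromLocalFile(new Path(localFile), dst)
}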
-
Updated
Jan 24, 2022 - HTML
-
Updated
Feb 1, 2019 - TypeScript
-
Updated
Mar 18, 2022
-
Updated
Mar 25, 2022 - JavaScript
-
Updated
Oct 1, 2018 - Scala
-
Updated
Aug 12, 2020 - Java
When writing data with the qbeast format, the user has to specify columnsToIndex or cubeSize every time. That is fine when you want to change them, but it shouldn't always have to be explicit.
For example, a user who wants to append data to an existing table while keeping the same configuration should be able to write:
df.write.format("qbeast").save("existing-path")
instead of restating the options on every write.
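For contrast, a minimal sketch of the explicit call that is currently required. The option names columnsToIndex and cubeSize come from the issue itself; the SparkSession setup, input path, column names, and cube size below are illustrative assumptions only:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("qbeast-append").getOrCreate()
val df = spark.read.parquet("new-batch")  // hypothetical incoming data

// Today every append must restate the indexing options, even though
// the table at "existing-path" was already created with them.
df.write
  .format("qbeast")
  .option("columnsToIndex", "user_id,event_time")
  .option("cubeSize", "10000")
  .mode("append")
  .save("existing-path")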
tsc compilation error
Version info
- dt-sql-parser version: 4.0.0-beta.2.2
- tsc version: 4.5.4
$ tsc -v
Version 4.5.4
Error output
Running npx tsc fails with:
node_modules/dt-sql-parser/dist/lib/flinksql/FlinkSqlParserListener.d.ts:4:16 - error TS1005: '(' expected.
4 constructor: typeof FlinkSqlParserListener;
~
node_modules/dt-sql-parser/dist/lib/flinksql/FlinkSqlParserVisitor.d.ts:4:16 - e
Additional notes on import caveats when bundling
Describe the bug
When the finalizer runs for CLR JvmObjectId objects, it calls the rm DotnetBackend method, and this call goes through the JvmBridge class. Because the rm call goes through [JvmBridge.CallJavaMethod](https://github.com/do