feat(databricks-driver): Enable Azure AD authentication via service principal #6763

Open — wants to merge 53 commits into `master` from `add-azure-credential` (showing changes from 48 commits)
83d2179
Generate credential by Azure service principal
MaggieZhang-02 Jun 21, 2023
3f5a6fb
Fix lint issues
MaggieZhang-02 Jun 21, 2023
7b9305f
Add unit test for new env variables
MaggieZhang-02 Jun 21, 2023
fa454f7
Add unit test for DatabricksDriver
MaggieZhang-02 Jun 26, 2023
ad396ba
Add test script for databricks driver
MaggieZhang-02 Jun 26, 2023
b72bff6
Merge branch 'cube-js:master' into add-azure-credential
MaggieZhang-01 Jun 26, 2023
50a59a6
Update azure identity to 3.2.3
MaggieZhang-02 Jun 26, 2023
c450ef2
Add yarn lock
MaggieZhang-02 Jun 28, 2023
2c5fad7
Merge branch 'cube-js:master' into add-azure-credential
MaggieZhang-01 Jun 29, 2023
86a608e
Merge branch 'cube-js:master' into add-azure-credential
MaggieZhang-01 Jul 3, 2023
c3744c9
Merge branch 'cube-js:master' into add-azure-credential
MaggieZhang-01 Jul 5, 2023
7b285b4
Merge branch 'cube-js:master' into add-azure-credential
MaggieZhang-01 Jul 10, 2023
34beaa6
Fix jest async issue
MaggieZhang-02 Jul 12, 2023
767a941
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Jul 14, 2023
43876f3
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Jul 24, 2023
c0a7f1d
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Jul 26, 2023
f81bfcc
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Jul 31, 2023
8b5c0b8
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Aug 7, 2023
3439d85
Add azure prefix for related env variables
MaggieZhang-02 Sep 1, 2023
e48c6b1
Keep shared key credential as default
MaggieZhang-02 Sep 1, 2023
13b6cac
Merge branch 'cube-js:master' into add-azure-credential
MaggieZhang-01 Sep 1, 2023
9ac16ea
Make sas codes more readable
MaggieZhang-02 Sep 6, 2023
94e4d8b
Fix typo error in comments
MaggieZhang-02 Sep 6, 2023
0952466
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Sep 18, 2023
0d10b79
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Sep 19, 2023
ac2c8c7
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Oct 9, 2023
699a35e
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Oct 16, 2023
4568bdf
doc(@cubejs-backend/databricks-jdbc-driver):Add new env variables to doc
MaggieZhang-02 Oct 23, 2023
842b561
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Oct 30, 2023
3251d22
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Oct 31, 2023
6222edb
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Nov 7, 2023
b119780
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Nov 17, 2023
c43b1eb
Remove new Databricks variables doc changes
MaggieZhang-02 Nov 21, 2023
e755c1a
Merge remote-tracking branch 'origin' into add-azure-credential
MaggieZhang-02 Nov 21, 2023
22d06a4
doc(@cubejs-backend/databricks-jdbc-driver):Add new env variables to …
MaggieZhang-02 Nov 21, 2023
d08515e
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Nov 27, 2023
e23c6f0
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Nov 29, 2023
b3fa7cc
Keep dependencies version consistent with master
MaggieZhang-01 Nov 29, 2023
e389d50
Merge branch 'cube-js:master' into add-azure-credential
MaggieZhang-01 Dec 1, 2023
944ef37
Upgrade azure identity version
MaggieZhang-02 Dec 1, 2023
26db2a0
Add the changes of yarn lock for upgrade
MaggieZhang-02 Dec 8, 2023
7c49e75
Merge remote-tracking branch 'origin' into add-azure-credential
MaggieZhang-02 Dec 8, 2023
7c18152
Add yarn lock for azure identity
MaggieZhang-02 Dec 8, 2023
8507162
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Dec 12, 2023
458bc70
Fix undefined azure key error with principal provided
MaggieZhang-02 Dec 12, 2023
a25d8d7
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Jan 3, 2024
0babf3f
Merge branch 'cube-js:master' into add-azure-credential
MaggieZhang-01 Jan 8, 2024
b9123f8
Add azure export bucket env variables
MaggieZhang-02 Jan 8, 2024
162dd3f
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Jan 16, 2024
611f6f2
Merge branch 'master' into add-azure-credential
MaggieZhang-01 Jan 23, 2024
794b68b
Merge branch 'cube-js:master' into add-azure-credential
MaggieZhang-01 Jan 26, 2024
bfbd8a6
Merge remote-tracking branch 'origin' into add-azure-credential
MaggieZhang-02 Jun 21, 2024
63a3856
Merge branch 'master' into add-azure-credential
paveltiunov Sep 13, 2024
15 changes: 15 additions & 0 deletions docs/pages/product/configuration/data-sources/databricks-jdbc.mdx
@@ -129,6 +129,17 @@ CUBEJS_DB_EXPORT_BUCKET=wasbs://[email protected]
CUBEJS_DB_EXPORT_BUCKET_AZURE_KEY=<AZURE_STORAGE_ACCOUNT_ACCESS_KEY>
```

The access key grants full access to the storage account's configuration and data.
For fine-grained control over access to storage resources, follow [the Databricks guide on authorizing with Azure Active Directory][authorize-with-azure-active-directory].

[Create a service principal][azure-authentication-with-service-principal] and replace the access key with the following variables:

```dotenv
CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID=<AZURE_TENANT_ID>
CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_ID=<AZURE_CLIENT_ID>
CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET=<AZURE_CLIENT_SECRET>
```
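Under the hood, a driver can choose between the two authentication modes based on which of these variables are set, preferring the shared key when both are present (the PR keeps the shared key as the default). The following sketch illustrates that selection logic; the function name `pickAzureAuth` and the returned union type are illustrative, not part of the Cube codebase:

```typescript
// Illustrative sketch: choosing an auth mode for the Azure export bucket.
// `pickAzureAuth` and the `AzureAuth` union are assumed names for this example.
type AzureAuth =
  | { kind: "sharedKey"; accountKey: string }
  | { kind: "servicePrincipal"; tenantId: string; clientId: string; clientSecret: string };

function pickAzureAuth(env: Record<string, string | undefined>): AzureAuth | undefined {
  // Shared key stays the default when provided.
  const accountKey = env.CUBEJS_DB_EXPORT_BUCKET_AZURE_KEY;
  if (accountKey) {
    return { kind: "sharedKey", accountKey };
  }
  // Otherwise fall back to service-principal credentials, which a driver
  // could pass to @azure/identity's ClientSecretCredential.
  const tenantId = env.CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID;
  const clientId = env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_ID;
  const clientSecret = env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET;
  if (tenantId && clientId && clientSecret) {
    return { kind: "servicePrincipal", tenantId, clientId, clientSecret };
  }
  return undefined;
}
```

With a service principal configured, the resulting `tenantId`/`clientId`/`clientSecret` triple maps directly onto the `ClientSecretCredential` constructor from `@azure/identity`, which this PR adds as a dependency.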

## SSL/TLS

Cube does not require any additional configuration to enable SSL/TLS for
@@ -145,6 +156,10 @@ bucket][self-preaggs-export-bucket] **must be** configured.
[azure-bs]: https://azure.microsoft.com/en-gb/services/storage/blobs/
[azure-bs-docs-get-key]:
https://docs.microsoft.com/en-us/azure/storage/common/storage-account-keys-manage?toc=%2Fazure%2Fstorage%2Fblobs%2Ftoc.json&tabs=azure-portal#view-account-access-keys
[authorize-with-azure-active-directory]:
https://learn.microsoft.com/en-us/rest/api/storageservices/authorize-with-azure-active-directory
[azure-authentication-with-service-principal]:
https://learn.microsoft.com/en-us/azure/developer/java/sdk/identity-service-principal-auth
[databricks]: https://databricks.com/
[databricks-docs-dbfs]:
https://docs.databricks.com/data/databricks-file-system.html#mount-object-storage-to-dbfs
60 changes: 60 additions & 0 deletions docs/pages/reference/configuration/environment-variables.mdx
@@ -398,6 +398,66 @@ with a data source][ref-config-multiple-ds-decorating-env].
| -------------------------------------- | ---------------------- | --------------------- |
| [A valid AWS region][aws-docs-regions] | N/A | N/A |

## `CUBEJS_DB_EXPORT_BUCKET_AZURE_KEY`

The Azure Access Key to use for the export bucket.

<InfoBox>

When using multiple data sources, this environment variable can be [decorated
with a data source][ref-config-multiple-ds-decorating-env].

</InfoBox>

| Possible Values | Default in Development | Default in Production |
| ------------------------ | ---------------------- | --------------------- |
| A valid Azure Access Key | N/A | N/A |

## `CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID`

The Azure tenant ID to use for the export bucket.

<InfoBox>

When using multiple data sources, this environment variable can be [decorated
with a data source][ref-config-multiple-ds-decorating-env].

</InfoBox>

| Possible Values | Default in Development | Default in Production |
| ----------------------- | ---------------------- | --------------------- |
| A valid Azure Tenant ID | N/A | N/A |

## `CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_ID`

The Azure client ID to use for the export bucket.

<InfoBox>

When using multiple data sources, this environment variable can be [decorated
with a data source][ref-config-multiple-ds-decorating-env].

</InfoBox>

| Possible Values | Default in Development | Default in Production |
| ----------------------- | ---------------------- | --------------------- |
| A valid Azure Client ID | N/A | N/A |

## `CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET`

The Azure client secret to use for the export bucket.

<InfoBox>

When using multiple data sources, this environment variable can be [decorated
with a data source][ref-config-multiple-ds-decorating-env].

</InfoBox>

| Possible Values | Default in Development | Default in Production |
| --------------------------- | ---------------------- | --------------------- |
| A valid Azure Client Secret | N/A | N/A |
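As the multi-datasource tests in this PR show, each of these variables can be decorated with a `CUBEJS_DS_<DATASOURCE>_` prefix. For example (illustrative values; the data source names must be declared in `CUBEJS_DATASOURCES`):

```dotenv
CUBEJS_DATASOURCES=default,postgres
# Applies to the `postgres` data source only:
CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_TENANT_ID=<AZURE_TENANT_ID>
CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_CLIENT_ID=<AZURE_CLIENT_ID>
CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET=<AZURE_CLIENT_SECRET>
```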

## `CUBEJS_DB_EXPORT_BUCKET_MOUNT_DIR`

The mount path to use for a [Databricks DBFS mount][databricks-docs-dbfs].
39 changes: 39 additions & 0 deletions packages/cubejs-backend-shared/src/env.ts
@@ -772,6 +772,45 @@ const variables: Record<string, (...args: any) => any> = {
]
),

/**
* Tenant ID for the Azure-based export bucket storage.
*/
dbExportBucketAzureTenantId: ({
dataSource,
}: {
dataSource: string,
}) => (
process.env[
keyByDataSource('CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID', dataSource)
]
),

/**
* Client ID for the Azure-based export bucket storage.
*/
dbExportBucketAzureClientId: ({
dataSource,
}: {
dataSource: string,
}) => (
process.env[
keyByDataSource('CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_ID', dataSource)
]
),

/**
* Client Secret for the Azure-based export bucket storage.
*/
dbExportBucketAzureClientSecret: ({
dataSource,
}: {
dataSource: string,
}) => (
process.env[
keyByDataSource('CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET', dataSource)
]
),

/**
* Export bucket options for Integration based.
*/
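Each accessor above delegates to `keyByDataSource` from `@cubejs-backend/shared` to derive the per-datasource variable name. This sketch reproduces only the naming convention that the tests below rely on; the function name `keyByDataSourceSketch` is illustrative, and the real implementation additionally validates the data source against `CUBEJS_DATASOURCES` and throws for undeclared ones:

```typescript
// Illustrative sketch of deriving a data-source-decorated env var name.
// The real keyByDataSource also checks CUBEJS_DATASOURCES and throws
// "The <name> data source is missing in the declared CUBEJS_DATASOURCES."
// for unknown data sources.
function keyByDataSourceSketch(origin: string, dataSource?: string): string {
  if (!dataSource || dataSource === "default") {
    // The default data source reads the undecorated variable.
    return origin;
  }
  // CUBEJS_DB_... -> CUBEJS_DS_<DATASOURCE>_DB_...
  return origin.replace(/^CUBEJS_/, `CUBEJS_DS_${dataSource.toUpperCase()}_`);
}
```

For instance, `keyByDataSourceSketch('CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID', 'postgres')` yields `CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_TENANT_ID`, matching the variables exercised in the multi-datasource tests.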
87 changes: 87 additions & 0 deletions packages/cubejs-backend-shared/test/db_env_multi.test.ts
@@ -956,6 +956,93 @@ describe('Multiple datasources', () => {
);
});

test('getEnv("dbExportBucketAzureTenantId")', () => {
process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID = 'default1';
process.env.CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_TENANT_ID = 'postgres1';
process.env.CUBEJS_DS_WRONG_DB_EXPORT_BUCKET_AZURE_TENANT_ID = 'wrong1';
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'default' })).toEqual('default1');
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'postgres' })).toEqual('postgres1');
expect(() => getEnv('dbExportBucketAzureTenantId', { dataSource: 'wrong' })).toThrow(
'The wrong data source is missing in the declared CUBEJS_DATASOURCES.'
);

process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID = 'default2';
process.env.CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_TENANT_ID = 'postgres2';
process.env.CUBEJS_DS_WRONG_DB_EXPORT_BUCKET_AZURE_TENANT_ID = 'wrong2';
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'default' })).toEqual('default2');
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'postgres' })).toEqual('postgres2');
expect(() => getEnv('dbExportBucketAzureTenantId', { dataSource: 'wrong' })).toThrow(
'The wrong data source is missing in the declared CUBEJS_DATASOURCES.'
);

delete process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID;
delete process.env.CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_TENANT_ID;
delete process.env.CUBEJS_DS_WRONG_DB_EXPORT_BUCKET_AZURE_TENANT_ID;
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'default' })).toBeUndefined();
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'postgres' })).toBeUndefined();
expect(() => getEnv('dbExportBucketAzureTenantId', { dataSource: 'wrong' })).toThrow(
'The wrong data source is missing in the declared CUBEJS_DATASOURCES.'
);
});

test('getEnv("dbExportBucketAzureClientId")', () => {
process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_ID = 'default1';
process.env.CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_CLIENT_ID = 'postgres1';
process.env.CUBEJS_DS_WRONG_DB_EXPORT_BUCKET_AZURE_CLIENT_ID = 'wrong1';
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'default' })).toEqual('default1');
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'postgres' })).toEqual('postgres1');
expect(() => getEnv('dbExportBucketAzureClientId', { dataSource: 'wrong' })).toThrow(
'The wrong data source is missing in the declared CUBEJS_DATASOURCES.'
);

process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_ID = 'default2';
process.env.CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_CLIENT_ID = 'postgres2';
process.env.CUBEJS_DS_WRONG_DB_EXPORT_BUCKET_AZURE_CLIENT_ID = 'wrong2';
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'default' })).toEqual('default2');
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'postgres' })).toEqual('postgres2');
expect(() => getEnv('dbExportBucketAzureClientId', { dataSource: 'wrong' })).toThrow(
'The wrong data source is missing in the declared CUBEJS_DATASOURCES.'
);

delete process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_ID;
delete process.env.CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_CLIENT_ID;
delete process.env.CUBEJS_DS_WRONG_DB_EXPORT_BUCKET_AZURE_CLIENT_ID;
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'default' })).toBeUndefined();
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'postgres' })).toBeUndefined();
expect(() => getEnv('dbExportBucketAzureClientId', { dataSource: 'wrong' })).toThrow(
'The wrong data source is missing in the declared CUBEJS_DATASOURCES.'
);
});

test('getEnv("dbExportBucketAzureClientSecret")', () => {
process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET = 'default1';
process.env.CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET = 'postgres1';
process.env.CUBEJS_DS_WRONG_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET = 'wrong1';
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'default' })).toEqual('default1');
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'postgres' })).toEqual('postgres1');
expect(() => getEnv('dbExportBucketAzureClientSecret', { dataSource: 'wrong' })).toThrow(
'The wrong data source is missing in the declared CUBEJS_DATASOURCES.'
);

process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET = 'default2';
process.env.CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET = 'postgres2';
process.env.CUBEJS_DS_WRONG_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET = 'wrong2';
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'default' })).toEqual('default2');
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'postgres' })).toEqual('postgres2');
expect(() => getEnv('dbExportBucketAzureClientSecret', { dataSource: 'wrong' })).toThrow(
'The wrong data source is missing in the declared CUBEJS_DATASOURCES.'
);

delete process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET;
delete process.env.CUBEJS_DS_POSTGRES_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET;
delete process.env.CUBEJS_DS_WRONG_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET;
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'default' })).toBeUndefined();
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'postgres' })).toBeUndefined();
expect(() => getEnv('dbExportBucketAzureClientSecret', { dataSource: 'wrong' })).toThrow(
'The wrong data source is missing in the declared CUBEJS_DATASOURCES.'
);
});

test('getEnv("dbExportIntegration")', () => {
process.env.CUBEJS_DB_EXPORT_INTEGRATION = 'default1';
process.env.CUBEJS_DS_POSTGRES_DB_EXPORT_INTEGRATION = 'postgres1';
51 changes: 51 additions & 0 deletions packages/cubejs-backend-shared/test/db_env_single.test.ts
@@ -618,6 +618,57 @@ describe('Single datasources', () => {
expect(getEnv('dbExportBucketAzureKey', { dataSource: 'wrong' })).toBeUndefined();
});

test('getEnv("dbExportBucketAzureTenantId")', () => {
process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID = 'default1';
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'default' })).toEqual('default1');
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'postgres' })).toEqual('default1');
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'wrong' })).toEqual('default1');

process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID = 'default2';
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'default' })).toEqual('default2');
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'postgres' })).toEqual('default2');
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'wrong' })).toEqual('default2');

delete process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_TENANT_ID;
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'default' })).toBeUndefined();
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'postgres' })).toBeUndefined();
expect(getEnv('dbExportBucketAzureTenantId', { dataSource: 'wrong' })).toBeUndefined();
});

test('getEnv("dbExportBucketAzureClientId")', () => {
process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_ID = 'default1';
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'default' })).toEqual('default1');
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'postgres' })).toEqual('default1');
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'wrong' })).toEqual('default1');

process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_ID = 'default2';
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'default' })).toEqual('default2');
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'postgres' })).toEqual('default2');
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'wrong' })).toEqual('default2');

delete process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_ID;
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'default' })).toBeUndefined();
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'postgres' })).toBeUndefined();
expect(getEnv('dbExportBucketAzureClientId', { dataSource: 'wrong' })).toBeUndefined();
});

test('getEnv("dbExportBucketAzureClientSecret")', () => {
process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET = 'default1';
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'default' })).toEqual('default1');
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'postgres' })).toEqual('default1');
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'wrong' })).toEqual('default1');

process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET = 'default2';
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'default' })).toEqual('default2');
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'postgres' })).toEqual('default2');
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'wrong' })).toEqual('default2');

delete process.env.CUBEJS_DB_EXPORT_BUCKET_AZURE_CLIENT_SECRET;
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'default' })).toBeUndefined();
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'postgres' })).toBeUndefined();
expect(getEnv('dbExportBucketAzureClientSecret', { dataSource: 'wrong' })).toBeUndefined();
});

test('getEnv("dbExportIntegration")', () => {
process.env.CUBEJS_DB_EXPORT_INTEGRATION = 'default1';
expect(getEnv('dbExportIntegration', { dataSource: 'default' })).toEqual('default1');
3 changes: 3 additions & 0 deletions packages/cubejs-databricks-jdbc-driver/package.json
@@ -18,6 +18,8 @@
"build": "rm -rf dist && npm run tsc",
"tsc": "tsc",
"watch": "tsc -w",
"test": "npm run unit",
"unit": "jest dist/test --forceExit",
"lint": "eslint src/* --ext .ts",
"lint:fix": "eslint --fix src/* --ext .ts",
"postinstall": "node bin/post-install"
@@ -31,6 +33,7 @@
"@aws-sdk/client-s3": "^3.49.0",
"@aws-sdk/s3-request-presigner": "^3.49.0",
"@azure/storage-blob": "^12.9.0",
"@azure/identity": "^3.3.1",
"@cubejs-backend/base-driver": "^0.34.41",
"@cubejs-backend/jdbc-driver": "^0.34.41",
"@cubejs-backend/schema-compiler": "^0.34.42",