Commit 2390b54: Fix more doc issues (#6072)

1 parent aca8c50

File tree: 3 files changed (+6, −6 lines)

airbyte-integrations/connectors/destination-databricks/src/main/resources/spec.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -39,7 +39,7 @@
       "title": "Databricks Personal Access Token",
       "type": "string",
       "description": "",
-      "examples": [""],
+      "examples": ["dapi0123456789abcdefghij0123456789AB"],
       "airbyte_secret": true
     },
     "database_schema": {
```

airbyte-integrations/connectors/source-lever-hiring/README.md

Lines changed: 0 additions & 1 deletion
```diff
@@ -1,7 +1,6 @@
 # Lever Hiring Source

 This is the repository for the Lever Hiring source connector, written in Python.
-For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/lever-hiring).

 ## Local development

```

docs/integrations/destinations/databricks.md

Lines changed: 5 additions & 4 deletions
```diff
@@ -17,9 +17,10 @@ Due to legal reasons, this is currently a private connector that is only availab
 | Incremental - Dedupe Sync || |
 | Namespaces || |

-## Configuration
+## Data Source
+Databricks supports various cloud storage as the [data source](https://docs.databricks.com/data/data-sources/index.html). Currently, only Amazon S3 is supported.

-Databricks parameters
+## Configuration

 | Category | Parameter | Type | Notes |
 | :--- | :--- | :---: | :--- |
@@ -28,8 +29,8 @@ Databricks parameters
 | | Port | string | Optional. Default to "443". See [documentation](https://docs.databricks.com/integrations/bi/jdbc-odbc-bi.html#get-server-hostname-port-http-path-and-jdbc-url). |
 | | Personal Access Token | string | Required. See [documentation](https://docs.databricks.com/sql/user/security/personal-access-tokens.html). |
 | General | Database schema | string | Optional. Default to "public". Each data stream will be written to a table under this database schema. |
-| | Purge Staging Files and Tables | The connector creates staging files and tables on S3. By default they will be purged when the data sync is complete. Set it to `false` for debugging purpose. |
-| S3 | Bucket Name | string | Name of the bucket to sync data into. |
+| | Purge Staging Data | boolean | The connector creates staging files and tables on S3. By default they will be purged when the data sync is complete. Set it to `false` for debugging purpose. |
+| Data Source - S3 | Bucket Name | string | Name of the bucket to sync data into. |
 | | Bucket Path | string | Subdirectory under the above bucket to sync the data into. |
 | | Region | string | See [documentation](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html#concepts-available-regions) for all region codes. |
 | | Access Key ID | string | AWS/Minio credential. |
```
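The configuration table edited above maps to fields in the connector's `spec.json`. As a rough illustration of how those parameters might come together in a destination config, consider the following sketch. The key names here are hypothetical (only `database_schema` and the access-token field are visible in this diff), and every value is a placeholder in the style of the examples shown:

```json
{
  "databricks_personal_access_token": "dapi0123456789abcdefghij0123456789AB",
  "database_schema": "public",
  "purge_staging_data": true,
  "s3_bucket_name": "my-staging-bucket",
  "s3_bucket_path": "airbyte/databricks",
  "s3_bucket_region": "us-east-1"
}
```

This is not the connector's actual schema; consult the connector's `spec.json` for the authoritative field names and types.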
