
Commit aadbee2

title fix
1 parent b7d640c commit aadbee2

3 files changed: +8 -8 lines changed

docs/core_concepts/11_persistent_storage/index.mdx

Lines changed: 1 addition & 1 deletion
@@ -252,7 +252,7 @@ Then from Windmill, just [fill the S3 resource type](../../integrations/s3.md).
 
 4. [Integrate it to Windmill](../../integrations/microsoft-azure-blob.md) by filling the [resource type details](https://hub.windmill.dev/resource_types/137) for Azure Blob APIs.
 
-### Connect your Windmill workspace to your S3 bucket or you Azure Blob storage
+### Connect your Windmill workspace to your S3 bucket or your Azure Blob storage
 
 Once you've created an [S3 or Azure Blob resource](../../integrations/s3.md) in Windmill, go to the workspace settings > S3 Storage. Select the resource and click Save.

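Note on the section this hunk renames: the resource selected under workspace settings > S3 Storage is an ordinary S3 (or Azure Blob) resource. As a rough illustration, a hedged sketch of what such an S3 resource payload could hold is shown below; the field names are assumptions based on the Hub s3 resource type and should be verified against your workspace.

```python
# Hedged sketch only: approximate shape of the S3 resource selected under
# workspace settings > S3 Storage. Field names are assumptions taken from the
# Hub "s3" resource type; verify them against your instance before relying on them.
s3_resource = {
    "bucket": "my-windmill-bucket",            # placeholder bucket name
    "region": "us-east-1",                     # placeholder region
    "endPoint": "s3.us-east-1.amazonaws.com",  # placeholder endpoint
    "useSSL": True,
    "accessKey": "<ACCESS_KEY>",               # placeholder credentials
    "secretKey": "<SECRET_KEY>",
    "pathStyle": False,
}
```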
docs/core_concepts/18_files_binary_data/index.mdx

Lines changed: 5 additions & 5 deletions
@@ -12,15 +12,15 @@ Binary data, such as files, are not easy to handle. Windmill provides two option
 
 ## Windmill integration with S3 or Azure Blob Storage
 
-The recommended way to store binary data is to upload it to S3 leveraging [Windmill's native S3 integrations](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage).
+The recommended way to store binary data is to upload it to S3 leveraging [Windmill's native S3 integrations](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage).
 
 :::info
 
 Windmill's integration with S3 and Azure Blob Storage works exactly the same and the features described below works in both cases. The only difference is that you need to select an `azure_blob` resource when setting up the S3 storage in the Workspace settings.
 
 :::
 
-By [setting a S3 resource for the workspace](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage), you can have an easy access to your bucket from the script editor. It becomes easy to consume S3 files as input, and write back to S3 anywhere in a script.
+By [setting a S3 resource for the workspace](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage), you can have an easy access to your bucket from the script editor. It becomes easy to consume S3 files as input, and write back to S3 anywhere in a script.
 
 S3 files in Windmill are just pointers to the S3 object using its key. As such, they are represented by a simple JSON:

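The context lines above state that S3 files in Windmill are plain pointers to an object key, represented by a simple JSON. A minimal Python sketch of that shape follows; the `S3Object` import mirrors the `def main(input_file: S3Object)` signature shown in a later hunk and is an assumption to check against the installed wmill SDK.

```python
# Minimal sketch: an S3 file value in Windmill is just a pointer to the object key.
from wmill import S3Object  # assumption: S3Object is exposed by the wmill Python SDK

def main(input_file: S3Object):
    # Windmill passes a value shaped like {"s3": "path/inside/the/bucket.csv"}
    key = input_file["s3"]
    return f"received a pointer to the S3 object at key: {key}"
```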
@@ -45,8 +45,8 @@ Windmill provides helpers in its SDKs to consume and produce S3 file seamlessly.
 <div className="grid grid-cols-2 gap-6 mb-4">
 	<DocCard
 		title="Persistent Storage - S3 Integration"
-		description="Connect your Windmill workspace to your S3 bucket or you Azure Blob storage."
-		href="/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage"
+		description="Connect your Windmill workspace to your S3 bucket or your Azure Blob storage."
+		href="/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage"
 	/>
 </div>

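The hunk header above mentions SDK helpers for consuming and producing S3 files. The sketch below shows how that could look in Python, but the helper names `wmill.load_s3_file` and `wmill.write_s3_file` are assumptions about the SDK surface; check the SDK reference before using them.

```python
# Hedged sketch of the SDK helpers referenced in the hunk header. The names
# load_s3_file / write_s3_file are assumptions; confirm them in the wmill SDK docs.
import wmill
from wmill import S3Object  # assumption: S3Object is exposed by the wmill Python SDK

def main(input_file: S3Object) -> S3Object:
    raw = wmill.load_s3_file(input_file)       # read the object's content as bytes
    transformed = raw.upper()                  # trivial placeholder transformation
    # write the result back to the workspace bucket under a new (placeholder) key
    return wmill.write_s3_file(S3Object(s3="output/result.txt"), transformed)
```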
@@ -303,7 +303,7 @@ def main(input_file: S3Object):
 
 :::info
 
-Polars and DuckDB needs to be configured to access S3 within the Windmill script. The job will need to accessed the S3 resources, which either needs to be accessible to the user running the job, or the S3 resource needs to be [set as public in the workspace settings](/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage).
+Polars and DuckDB needs to be configured to access S3 within the Windmill script. The job will need to accessed the S3 resources, which either needs to be accessible to the user running the job, or the S3 resource needs to be [set as public in the workspace settings](/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage).
 
 :::

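Since the changed line notes that Polars and DuckDB must be configured for S3 access inside the script, here is a hedged sketch of doing that wiring by hand with DuckDB's httpfs extension; the region, credentials and object key are placeholders that would normally come from the workspace's S3 resource.

```python
# Hedged sketch: configuring DuckDB for S3 access inside a Windmill script.
# Region, credentials and the object key below are placeholders; in practice
# they would be read from the S3 resource attached to the workspace.
import duckdb

def main():
    con = duckdb.connect()
    con.execute("INSTALL httpfs; LOAD httpfs;")
    con.execute("SET s3_region='us-east-1';")               # placeholder region
    con.execute("SET s3_access_key_id='<ACCESS_KEY>';")     # placeholder credentials
    con.execute("SET s3_secret_access_key='<SECRET_KEY>';")
    # query a parquet file directly from the bucket (placeholder key)
    count = con.execute(
        "SELECT count(*) FROM read_parquet('s3://my-bucket/data.parquet')"
    ).fetchone()[0]
    return count
```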
docs/core_concepts/27_data_pipelines/index.mdx

Lines changed: 2 additions & 2 deletions
@@ -72,8 +72,8 @@ More details on S3 integration:
 <div className="grid grid-cols-2 gap-6 mb-4">
 	<DocCard
 		title="Persistent Storage - S3 Integration"
-		description="Connect your Windmill workspace to your S3 bucket or you Azure Blob storage."
-		href="/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage"
+		description="Connect your Windmill workspace to your S3 bucket or your Azure Blob storage."
+		href="/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage"
 	/>
 </div>

0 commit comments
