docs/core_concepts/11_persistent_storage/index.mdx (1 addition, 1 deletion)
@@ -252,7 +252,7 @@ Then from Windmill, just [fill the S3 resource type](../../integrations/s3.md).
4. [Integrate it to Windmill](../../integrations/microsoft-azure-blob.md) by filling the [resource type details](https://hub.windmill.dev/resource_types/137) for Azure Blob APIs.

-### Connect your Windmill workspace to your S3 bucket or you Azure Blob storage
+### Connect your Windmill workspace to your S3 bucket or your Azure Blob storage

Once you've created an [S3 or Azure Blob resource](../../integrations/s3.md) in Windmill, go to the workspace settings > S3 Storage. Select the resource and click Save.
docs/core_concepts/18_files_binary_data/index.mdx (5 additions, 5 deletions)
@@ -12,15 +12,15 @@ Binary data, such as files, are not easy to handle. Windmill provides two option
## Windmill integration with S3 or Azure Blob Storage

-The recommended way to store binary data is to upload it to S3 leveraging [Windmill's native S3 integrations](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage).
+The recommended way to store binary data is to upload it to S3 leveraging [Windmill's native S3 integrations](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage).

:::info

Windmill's integration with S3 and Azure Blob Storage works exactly the same and the features described below works in both cases. The only difference is that you need to select an `azure_blob` resource when setting up the S3 storage in the Workspace settings.

:::

-By [setting a S3 resource for the workspace](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage), you can have an easy access to your bucket from the script editor. It becomes easy to consume S3 files as input, and write back to S3 anywhere in a script.
+By [setting a S3 resource for the workspace](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage), you can have an easy access to your bucket from the script editor. It becomes easy to consume S3 files as input, and write back to S3 anywhere in a script.

S3 files in Windmill are just pointers to the S3 object using its key. As such, they are represented by a simple JSON:
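For context on the pointer format this paragraph introduces: a Windmill S3 file is represented as a small JSON object whose `s3` field holds the object key. Below is a minimal Python sketch of such a pointer as a script might receive it; the key `path/to/file.csv` is an illustrative placeholder, not part of the diff.

```python
# A Windmill S3 file argument is a plain JSON pointer of the form {"s3": "<object key>"}.
# The key below is an illustrative placeholder, not a real object.
example_pointer = {"s3": "path/to/file.csv"}

def main(input_file: dict = example_pointer):
    # The script only receives the pointer; the object itself stays in the bucket
    # and is resolved through the workspace's S3 storage when the file is read.
    return input_file["s3"]
```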
@@ -45,8 +45,8 @@ Windmill provides helpers in its SDKs to consume and produce S3 file seamlessly.
<div className="grid grid-cols-2 gap-6 mb-4">
	<DocCard
		title="Persistent Storage - S3 Integration"
-		description="Connect your Windmill workspace to your S3 bucket or you Azure Blob storage."
+		description="Connect your Windmill workspace to your S3 bucket or your Azure Blob storage."
@@ -306 +306 @@
-Polars and DuckDB needs to be configured to access S3 within the Windmill script. The job will need to accessed the S3 resources, which either needs to be accessible to the user running the job, or the S3 resource needs to be [set as public in the workspace settings](/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage).
+Polars and DuckDB needs to be configured to access S3 within the Windmill script. The job will need to accessed the S3 resources, which either needs to be accessible to the user running the job, or the S3 resource needs to be [set as public in the workspace settings](/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage).
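As a concrete illustration of the configuration the changed line refers to, here is a minimal Polars sketch for a Windmill Python script. It is an assumption-laden example, not the documented Windmill helper: the resource path `u/alice/my_s3` and the field names `accessKey`, `secretKey`, `endPoint` and `bucket` are placeholders that must match your own S3 resource schema.

```python
import polars as pl
import s3fs
import wmill

def main(file_key: str = "data/input.parquet"):
    # Fetch the S3 resource; the job must be permitted to read it, or the
    # resource must be set as public in the workspace settings (see above).
    s3res = wmill.get_resource("u/alice/my_s3")  # assumed resource path

    fs = s3fs.S3FileSystem(
        key=s3res["accessKey"],      # assumed field name
        secret=s3res["secretKey"],   # assumed field name
        client_kwargs={"endpoint_url": f"https://{s3res['endPoint']}"},  # assumed field name
    )

    # Stream the parquet object straight from the bucket into Polars.
    with fs.open(f"{s3res['bucket']}/{file_key}", "rb") as f:  # assumed field name
        df = pl.read_parquet(f)

    return {"rows": df.height, "columns": df.width}
```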