docs/core_concepts/10_error_handling/index.mdx (2 additions & 0 deletions)

@@ -92,6 +92,8 @@ You can pick the Slack pre-set schedule error handler or define your own.
 
 ## Workspace Error Handler
 
+Define a script or flow to be executed automatically in case of error in the workspace.
+
 ### Workspace Error Handler on Slack
 
 On [Cloud plans and Self-Hosted & Enterprise Edition](/pricing), you can [connect workspace to Slack](../../integrations/slack.mdx) and enable an automated error handler on a given channel.
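For context on what such a handler can look like, below is a minimal sketch of a workspace error handler written as a Python script. The argument names and the alerting endpoint are placeholders, not Windmill's documented handler signature; check the error handling docs for the exact payload passed to workspace error handlers.

```python
# Minimal sketch of a workspace error handler script.
# Argument names are assumptions; Windmill's docs define the actual payload.
import requests

def main(path: str, error: dict, workspace_id: str = ""):
    # Forward the failing job's path and error details to an alerting
    # endpoint (placeholder URL).
    requests.post(
        "https://alerts.example.com/windmill",
        json={"workspace": workspace_id, "job_path": path, "error": error},
        timeout=10,
    )
    return {"notified": True}
```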
docs/core_concepts/11_persistent_storage/index.mdx (1 addition & 1 deletion)

@@ -252,7 +252,7 @@ Then from Windmill, just [fill the S3 resource type](../../integrations/s3.md).
 
 4. [Integrate it to Windmill](../../integrations/microsoft-azure-blob.md) by filling the [resource type details](https://hub.windmill.dev/resource_types/137) for Azure Blob APIs.
 
-### Connect your Windmill workspace to your S3 bucket or you Azure Blob storage
+### Connect your Windmill workspace to your S3 bucket or your Azure Blob storage
 
 Once you've created an [S3 or Azure Blob resource](../../integrations/s3.md) in Windmill, go to the workspace settings > S3 Storage. Select the resource and click Save.
docs/core_concepts/18_files_binary_data/index.mdx (5 additions & 5 deletions)

@@ -12,15 +12,15 @@ Binary data, such as files, are not easy to handle. Windmill provides two options
 
 ## Windmill integration with S3 or Azure Blob Storage
 
-The recommended way to store binary data is to upload it to S3 leveraging [Windmill's native S3 integrations](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage).
+The recommended way to store binary data is to upload it to S3 leveraging [Windmill's native S3 integrations](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage).
 
 :::info
 
 Windmill's integration with S3 and Azure Blob Storage works exactly the same and the features described below works in both cases. The only difference is that you need to select an `azure_blob` resource when setting up the S3 storage in the Workspace settings.
 
 :::
 
-By [setting a S3 resource for the workspace](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage), you can have an easy access to your bucket from the script editor. It becomes easy to consume S3 files as input, and write back to S3 anywhere in a script.
+By [setting a S3 resource for the workspace](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage), you can have an easy access to your bucket from the script editor. It becomes easy to consume S3 files as input, and write back to S3 anywhere in a script.
 
 S3 files in Windmill are just pointers to the S3 object using its key. As such, they are represented by a simple JSON:
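The JSON itself falls just outside this hunk. For reference, such an S3 file pointer is an object holding the object's key, along the lines of this sketch (the key is a placeholder):

```json
{
    "s3": "path/to/file.csv"
}
```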
@@ -45,8 +45,8 @@ Windmill provides helpers in its SDKs to consume and produce S3 file seamlessly.
 <div className="grid grid-cols-2 gap-6 mb-4">
     <DocCard
         title="Persistent Storage - S3 Integration"
-        description="Connect your Windmill workspace to your S3 bucket or you Azure Blob storage."
+        description="Connect your Windmill workspace to your S3 bucket or your Azure Blob storage."

@@ … @@
-Polars and DuckDB needs to be configured to access S3 within the Windmill script. The job will need to accessed the S3 resources, which either needs to be accessible to the user running the job, or the S3 resource needs to be [set as public in the workspace settings](/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage).
+Polars and DuckDB needs to be configured to access S3 within the Windmill script. The job will need to accessed the S3 resources, which either needs to be accessible to the user running the job, or the S3 resource needs to be [set as public in the workspace settings](/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-your-azure-blob-storage).
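As an illustration of the configuration that paragraph refers to, here is a hedged Polars sketch. The bucket, key, region, and credential values are placeholders that would in practice come from the workspace's S3 resource; this is not copied from the Windmill docs.

```python
# Sketch: reading an S3 object with Polars from inside a Windmill script.
# All credential values are placeholders; real values come from the
# workspace's S3 resource.
import polars as pl

def main():
    storage_options = {
        "aws_access_key_id": "<access key>",
        "aws_secret_access_key": "<secret key>",
        "aws_region": "us-east-1",
    }
    df = pl.read_parquet(
        "s3://my-bucket/path/to/file.parquet",
        storage_options=storage_options,
    )
    return {"rows": df.height}
```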
(file path lost in the capture; the hunks below are from the webhooks documentation page)

@@ -33,13 +35,13 @@ version of the Script/Flow and the other one with just a hash, i.e. `/h/<hash>`,
 hiding potentially sensitive information and always corresponding to that
 version of the script, even with overwrites.
 
-### Asynchronous
+#### Asynchronous
 
 Jobs can be triggered in asynchronous mode, meaning that the webhook is triggered, and the returning value is the uuid of the job assigned to execute the underlying code
 
 These links are available in the "UUID/Async" tab.
 
-### Synchronous
+#### Synchronous
 
 The second type of autogenerated endpoint is the **synchronous** webhook. This
 webhook triggers the execution, automatically extracts the underlying code's
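To make the asynchronous mode concrete, a hedged sketch of a client call follows. The workspace, script path, and token are placeholders, and the `/jobs/run/p/...` route is inferred from the `/jobs/run_wait_result/p/...` sync routes visible later in this diff.

```python
# Sketch: triggering a script via its asynchronous webhook.
# The async endpoint returns the uuid of the queued job, not its result.
import requests

resp = requests.post(
    "https://app.windmill.dev/api/w/<workspace>/jobs/run/p/u/user/<script>",  # assumed route
    json={},
    headers={"Authorization": "Bearer <token>"},
)
job_uuid = resp.text
print(job_uuid)
```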
@@ -72,15 +74,15 @@ Be cautious with potentially long-running jobs in **synchronous** mode.
 
 :::
 
-### Asynchronous vs. Synchronous
+#### Asynchronous vs. Synchronous
 
 It's always better to use asynchronous mode as it allows your client not to wait for the response and it avoids Windmill to have to maintain a connection to your client while the job is running. However, for short-running jobs where it's easier in your code to block until you get a response, then use the synchronous mode.
 
 When using the **synchronous mode**, the webhook returns the result of the script directly. If the script returns an error, the endpoint still returns the `200` status code with the error as a JSON object.
 
 When using the **asynchronous mode**, the webhook returns a `uuid` and you can poll the [get job](https://app.windmill.dev/openapi.html#/operations/getJob) API call to fetch the status and results once it is completed.
 
-## User token
+### User token
 
 To interact with Windmill you always need to use `Bearer` token authentication.
 
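Pairing with the asynchronous sketch above, polling for the result could look like the following. The route and the response fields are assumptions to be checked against the `getJob` operation in the linked OpenAPI spec.

```python
# Sketch: poll the get job API until the job triggered by an async
# webhook completes. Route and fields are assumptions based on getJob.
import time
import requests

def wait_for_result(base: str, workspace: str, uuid: str, token: str) -> object:
    headers = {"Authorization": f"Bearer {token}"}
    while True:
        job = requests.get(
            f"{base}/api/w/{workspace}/jobs_u/get/{uuid}", headers=headers
        ).json()
        if job.get("type") == "CompletedJob":
            return job.get("result")
        time.sleep(1)
```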
@@ -99,15 +101,15 @@ securely!
 
 
 
-## Webhook specific tokens
+### Webhook specific tokens
 
 Webhook specific tokens allow sharing tokens publicly without fear since the token will only be able to trigger a specific script/flow and not impersonate you for any other operations.
 
 It also avoids the hassle of having to create an anonymous user and check their permissions. If you can run the script yourself, then the webhook specific token will still inherit your own permissions.
 
 
 
-## Triggering
+### Triggering
 
 Once you have a webhook URL and a user token, issue a request to the
 endpoint and you will get the appropriate return as response.
@@ -127,7 +129,7 @@ as a secret (for more context please check [OWASP ref.1] and [OWASP ref.2]).
 Examples using cURL for `POST` requests:
 
 ```bash
-# Request with Header
+## Request with Header
 curl -X POST \
 --data '{}' \
 -H "Content-Type: application/json" \
@@ -136,7 +138,7 @@ curl -X POST \
 ```
 
 ```bash
-# Query parameter
+## Query parameter
 curl -X POST \
 --data '{}' \
 -H "Content-Type: application/json" \
@@ -146,7 +148,7 @@ curl -X POST \
 Examples using cURL for synchronous GET requests:
 
 ```bash
-# Request with Header".
+## Request with Header".
 curl -X GET \
 -H "Content-Type: application/json" \
 -H "Authorization: Bearer supersecret" \
@@ -161,13 +163,13 @@ encountered issues), by checking the [Runs menu][runs] on the app.
 
 
 
-## Request headers
+### Request headers
 
 It is possible for jobs to take request headers as arguments. To do so, either specify in the query args the headers to process at `include_header`, separated with `,`. e.g: `/api/w/admins/jobs/run_wait_result/p/u/user/undisputed_script?include_header=X-Sign,foo`
 
 or use the env variable: `INCLUDE_HEADERS` with the same format so that all requests to any job will include the headers.
 
-## Non Object payload / body
+### Non Object payload / body
 
 If the payload is not an object, it will be wrapped in an object with the key `body` and the value will be the payload/body itself. e.g:
 
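The receiving side (a `main(body: List[int])` script) is partially visible in the next hunk; here is a hedged sketch of the sending side, with placeholder workspace, script path, and token.

```python
# Sketch: posting a non-object JSON body. Windmill wraps it as
# {"body": [1, 2, 3]}, so the script's `body` parameter receives the list.
import requests

resp = requests.post(
    "https://app.windmill.dev/api/w/<workspace>/jobs/run_wait_result/p/u/user/<script>",
    json=[1, 2, 3],
    headers={"Authorization": "Bearer <token>"},
)
print(resp.json())
```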
@@ -182,11 +184,11 @@ def main(body: List[int]):
     print(body)
 ```
 
-## Raw payload / body
+### Raw payload / body
 
 Similarly to request headers, if the query args contain `raw=true`, then an additional argument will be added: `raw_string` which contains the entire json payload as a string (without any parsing). This is useful to verify the signature of the payload for example (discord require the endpoints to verify the signature for instance).
 
-## Custom Response Code
+### Custom Response Code
 
 For all sync run jobs endpoints, if the response contains a key `windmill_status_code` with a number value, that value will be used as the status code. For example, if a script or flow returns:
 
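The example return value sits outside the hunk; the following sketch is consistent with the described behavior and with the `201` referenced in the next hunk header (the `result` key is an assumption).

```python
# Sketch: a script whose synchronous webhook responds with HTTP 201.
def main():
    return {"windmill_status_code": 201, "result": "created"}
```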
@@ -211,7 +213,7 @@ with a status code `201`.
 
 Note that if the status code is invalid (w.r.t [RFC9110](https://httpwg.org/specs/rfc9110.html#overview.of.status.codes)), the endpoint will return an error.
 
-## Custom Content Type
+### Custom Content Type
 
 Similarly to the above, for all sync run jobs endpoints, if the response contains a key `windmill_content_type`, the associated value will be used as the content type header of the response. For example, if a script or flow returns:
 
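Again the example payload falls outside the hunk; here is a sketch matching the `text/csv` header mentioned in the following hunk (the `result` key is an assumption).

```python
# Sketch: a script whose synchronous webhook responds with
# "Content-Type: text/csv".
def main():
    return {"windmill_content_type": "text/csv", "result": "a,b\n1,2"}
```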
@@ -230,7 +232,7 @@ the synchronous endpoint will return:
 
 with the response header: "Content-Type: text/csv".
 
-## Return early for flows
+### Return early for flows
 
 It is possible to define a node at which the flow will return at for sync endpoints. The rest of the flow will continue asynchronously.
 
@@ -244,20 +246,20 @@ Useful when some webhooks need to return extremely fast but not just the uuid (d
     />
 </div>
 
-## Exposing a webhook URL
+### Exposing a webhook URL
 
 Single port proxy can be leveraged to expose a webhook with a custom URL. In its docker-compose, Windmill uses Caddy but the logic can be adapted for others.
 
-In the Caddyfile, the (`handle_path`)[https://caddyserver.com/docs/caddyfile/directives/handle_path#handle-path] and (`rewrite`)[https://caddyserver.com/docs/caddyfile/directives/rewrite#rewrite] directive can be used:
+In the Caddyfile, the [`handle_path`](https://caddyserver.com/docs/caddyfile/directives/handle_path#handle-path) and [`rewrite`](https://caddyserver.com/docs/caddyfile/directives/rewrite#rewrite) directive can be used:
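The snippet that follows this line is cut off in the capture. A hedged sketch of such a Caddyfile block follows; the external path, workspace, script path, and the `windmill_server:8000` upstream are placeholders to adapt to the actual docker-compose setup.

```
# Sketch: expose /hooks/my-script as a custom webhook URL.
handle_path /hooks/my-script {
    # Rewrite to the sync run endpoint of a specific script (placeholder path).
    rewrite * /api/w/my-workspace/jobs/run_wait_result/p/u/user/my_script
    reverse_proxy windmill_server:8000
}
```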