Commit 7f8ebeb

Browse files

hcourdent and hugocasa authored

Windmill AI changelog, docs, and dedicated page, other docs stuff (windmill-labs#525)

* Windmill AI changelog, docs, and dedicated page, other docs stuff
* feat: add video to changelog
* Database studio youtube
* Changelog feedback Hugo, s3 integration doc

Co-authored-by: HugoCasa <hugo@casademont.ch>
1 parent 41c8f76 commit 7f8ebeb

File tree: 24 files changed, +240 -38 lines changed


changelog/2024-01-23-database-studio/index.md

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ version: v1.251.1
 title: Database Studio
 tags: ['App Editor', 'Postgres']
 image: ./database-studio.png
-description: Introducing the Database Studio, a web-based database management tool that leverages Ag Grid for table display and interaction. This component enhances the user's ability to interact with database content in a dynamic and intuitive way.
+description: Introducing the Database Studio, a web-based database management tool that leverages Ag Grid for table display and interaction. In apps, interaction with database content is made easy: from a PostgreSQL resource, display, edit, add rows, delete rows ... and connect to other components.
 features:
   [
     'Display the content of a table.',
changelog/2024-01-23-rich-render/index.md

Lines changed: 1 addition & 1 deletion

@@ -2,7 +2,7 @@
 slug: rich-render
 version: v1.251.1
 title: Rich results render
-tags: ['Scripts']
+tags: ['Scripts', 'Flow Editor']
 image: ./rich-render.png
 description: Added rich results render for arrays of objects and markdown.
 features:
Lines changed: 17 additions & 0 deletions (new file)

@@ -0,0 +1,17 @@
+---
+slug: ai-copilot
+version: v1.270.0
+title: Flow & Metadata Copilot
+tags: ['Windmill AI', 'Flow Editor', 'Scripts']
+video: /videos/ai_fill_inputs.mp4
+description: The Flow & Metadata Copilot is an assistant powered by an OpenAI resource that simplifies your script & flow building experience by populating fields (summaries, descriptions, step input expressions) automatically based on context and prompts.
+features:
+  [
+    'Fills summary of script & flow steps.',
+    'Links flow step inputs to previous steps results.',
+    'Fills flow loops iterator expressions from context.',
+    'Completes branches predicate expressions from prompts.',
+    'Defines CRON schedules from prompts.',
+  ]
+docs: /docs/core_concepts/ai_generation
+---

docs/advanced/6_imports/index.md

Lines changed: 1 addition & 1 deletion

@@ -86,7 +86,7 @@ of having to maintain a separate requirements file.
 We use a simple heuristic to infer the package name: the import root name must be the package name. We also maintain a list of exceptions.
 You can make a PR to add your dependency to the list of exceptions [here](https://github.com/windmill-labs/windmill/blob/baac93f40140ee37548a273885c028a8e6500b6d/backend/parsers/windmill-parser-py-imports/src/lib.rs#L48)

-## Pinning dependencies
+## Pinning dependencies and Requirements

 If the imports are not properly analyzed, there exists an escape hatch to
 override the inferred imports. One needs to start the Script with the following comment:
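
The diff truncates before the comment itself, so here is a minimal sketch of what such a header comment can look like in a Python script. The syntax and the package pin below are illustrative assumptions; the exact format is defined by the Windmill docs.

```python
# Illustrative sketch only: the idea of the escape hatch is a header
# comment listing the exact dependencies to install, bypassing Windmill's
# import inference. The syntax and the pinned package are hypothetical.
#requirements:
#pandas==2.1.4


def main(numbers: list) -> int:
    # The body then relies on the pinned requirements above being
    # installed, rather than on inferred imports.
    return sum(numbers)
```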

docs/apps/4_app_configuration_settings/database_studio.mdx

Lines changed: 9 additions & 5 deletions

@@ -4,11 +4,15 @@ import DocCard from '@site/src/components/DocCard';

 This component allows you to create a database studio. The database studio is a web-based database management tool. It allows you to display and edit the content of a database. It uses Ag Grid to display the table.

-<video
-  className="border-2 rounded-xl object-cover w-full h-full dark:border-gray-800"
-  controls
-  src="/videos/database_studio.mp4"
-/>
+<iframe
+  style={{ aspectRatio: '16/9' }}
+  src="https://www.youtube.com/embed/Fd_0EffVDtw?vq=hd1080"
+  title="Database Studio"
+  frameBorder="0"
+  allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share"
+  allowFullScreen
+  className="border-2 rounded-xl object-cover w-full dark:border-gray-800"
+></iframe>

 ## Features

docs/core_concepts/11_persistent_storage/index.mdx

Lines changed: 6 additions & 4 deletions

@@ -254,16 +254,18 @@ Then from Windmill, just [fill the S3 resource type](../../integrations/s3.md).

 ### Connect your Windmill workspace to your S3 bucket or you Azure Blob storage

-Once you've created an S3 or Azure Blob resource in Windmill, go to the workspace settings > S3 Storage. Select the resource and click Save.
+Once you've created an [S3 or Azure Blob resource](../../integrations/s3.md) in Windmill, go to the workspace settings > S3 Storage. Select the resource and click Save.

 ![S3 storage workspace settings](./workspace_settings.png)

-The resource can be set to be public. In this case, the permissions set on the resource will be ignored when users interact with the S3 bucket via Windmill. Note that when the resource is public, the users might be able to access all of its details (including access keys and secrets) via some Windmill endpoints.
+The resource can be set to be public with the toggle "S3 resource details can be accessed by all users of this workspace".
+
+In this case, the [permissions](../16_roles_and_permissions/index.mdx#path) set on the resource will be ignored when users interact with the S3 bucket via Windmill. Note that when the resource is public, the users might be able to access all of its details (including access keys and secrets) via some Windmill endpoints.
 When the resource is not set to be public, Windmill guarantees that users who don't have access to the resource won't be able to retrieve any of its details. That being said, access to a specific file inside the bucket will still be possible, and downloading and uploading objects will also be accessible to any workspace user. In short, as long as the user knows the path of the file they want to access, they will be able to read its content. The main difference is that users won't be able to browse the content of the bucket.

 Once the workspace is configured, access to the bucket is made easy in Windmill.

-When a script accepts a S3 file as input, it can be directly uploaded or chosen from from the bucket explorer.
+When a script accepts a S3 file as input, it can be directly uploaded or chosen from the bucket explorer.

 ![S3 file upload](./file_upload.png)

@@ -273,7 +275,7 @@ When a script outputs a S3 file, it can be downloaded or previewed directly in W

 ![S3 file download](./file_download.png)

-For more info on how to use S3 files in Windmill, see [Handling files and binary data](/docs/core_concepts/files_binary_data)
+For more info on how to use files and S3 files in Windmill, see [Handling files and binary data](/docs/core_concepts/files_binary_data)

 ## Structured Databases: Postgres (Supabase, Neon.tech)
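
To make the "S3 file as input" idea above concrete: in Windmill, a S3 file argument is not the file bytes but a small JSON pointer to the object key. A minimal sketch (the key below is a hypothetical example):

```python
import json

# A Windmill S3 file input is a pointer to an object key inside the
# workspace's configured bucket, not the file content itself.
# The path below is hypothetical.
s3_file = {"s3": "reports/2024/sales.parquet"}

# Because it is plain JSON, it can be passed as a script argument or
# between flow steps and survives serialization unchanged.
roundtripped = json.loads(json.dumps(s3_file))
assert roundtripped == s3_file
```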

docs/core_concepts/18_files_binary_data/index.mdx

Lines changed: 26 additions & 8 deletions

@@ -5,22 +5,22 @@ import TabItem from '@theme/TabItem';
 # Handling Files and Binary Data

 In Windmill, JSON is the primary data format used for representing information.
-Binary data, such as files, are not easily to handle. Windmill provides two options.
+Binary data, such as files, are not easy to handle. Windmill provides two options.

 1. Have a dedicated storage for binary data: S3 or Azure Blob. Windmill has a first class integration with S3 buckets or Azure Blob containers.
 2. If the above is not an option, there's always the possibility to store the binary as base64 encoded string.

-### Windmill integration with S3 or Azure Blob Storage
+## Windmill integration with S3 or Azure Blob Storage

-The recommended way to store binary data is to upload it to S3 leveraging Windmill's native integrations.
+The recommended way to store binary data is to upload it to S3 leveraging [Windmill's native S3 integrations](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage).

 :::info

 Windmill's integration with S3 and Azure Blob Storage works exactly the same and the features described below work in both cases. The only difference is that you need to select an `azure_blob` resource when setting up the S3 storage in the Workspace settings.

 :::

-By [setting a S3 resource for the workspace](/docs/core_concepts/data_pipelines#windmill-integration-with-an-external-s3-storage), you can have an easy access to your bucket from the script editor. It becomes easy to consume S3 files as input, and write back to S3 anywhere in a script.
+By [setting a S3 resource for the workspace](../11_persistent_storage/index.mdx#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage), you can have easy access to your bucket from the script editor. It becomes easy to consume S3 files as input, and write back to S3 anywhere in a script.

 S3 files in Windmill are just pointers to the S3 object using its key. As such, they are represented by a simple JSON:

@@ -30,9 +30,27 @@ S3 files in Windmill are just pointers to the S3 object using its key. As such,
 }
 ```

+When a script accepts a S3 file as input, it can be directly uploaded or chosen from the bucket explorer.
+
+![S3 file upload](../11_persistent_storage/file_upload.png)
+
+![S3 bucket browsing](../11_persistent_storage/bucket_browsing.png)
+
+When a script outputs a S3 file, it can be downloaded or previewed directly in Windmill's UI (for displayable files like text files, CSVs or parquet files).
+
+![S3 file download](../11_persistent_storage/file_download.png)
+
 Windmill provides helpers in its SDKs to consume and produce S3 files seamlessly.

-#### Read a file from S3 within a script
+<div className="grid grid-cols-2 gap-6 mb-4">
+	<DocCard
+		title="Persistent Storage - S3 Integration"
+		description="Connect your Windmill workspace to your S3 bucket or your Azure Blob storage."
+		href="/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage"
+	/>
+</div>
+
+### Read a file from S3 within a script

 <Tabs className="unique-tabs">

@@ -103,7 +121,7 @@ def main(input_file: S3Object):

 ![Read S3 file](./s3_file_input.png)

-#### Create a file in S3 within a script
+### Create a file in S3 within a script

 <Tabs className="unique-tabs">

@@ -173,7 +191,7 @@ def main(s3_file_path: str):
 Certain file types, typically parquet files, can be directly rendered by Windmill
 :::

-#### Windmill embedded integration with Polars and DuckDB for data pipelines
+### Windmill embedded integration with Polars and DuckDB for data pipelines

 ETL can be easily implemented in Windmill using its integration with Polars and DuckDB to facilitate working with tabular data. In this case, you don't need to manually interact with the S3 bucket, Polars/DuckDB does it natively and in an efficient way. Reading and writing datasets to S3 can be done seamlessly.

@@ -299,7 +317,7 @@ For more info, see our page dedicated to Data Pipelines in Windmill:
 />
 </div>

-### Base64 encoded strings
+## Base64 encoded strings

 Base64 strings can also be used, but the main difficulty is that those Base64 strings cannot be distinguished from normal strings.
 Hence, the interpretation of those Base64 encoded strings is either done depending on the context,
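
To illustrate the base64 option with a self-contained sketch (plain Python standard library; the payload bytes are made up):

```python
import base64

# Encode arbitrary binary data (hypothetical bytes here) into a base64
# string that can travel inside Windmill's JSON arguments and results.
payload = b"\x89PNG\r\n\x1a\n fake image bytes"
encoded = base64.b64encode(payload).decode("ascii")

# Nothing in the string itself marks it as base64; the consumer must know
# from context to decode it, which is the limitation described above.
decoded = base64.b64decode(encoded)
assert decoded == payload
```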

docs/core_concepts/22_ai_generation/index.mdx

Lines changed: 58 additions & 0 deletions

@@ -64,6 +64,16 @@ Upon error when executing code, you will be offered to "AI Fix" it. The assistan
 	src="/videos/ai_fix.mp4"
 />

+### Summary Copilot
+
+From your code, the AI assistant can generate a script summary.
+
+<video
+	className="border-2 rounded-xl object-cover w-full h-full dark:border-gray-800"
+	controls
+	src="/videos/summary_compilot.mp4"
+/>
+
 ## Windmill AI for Flows

 Generate Workflows from prompts.

@@ -128,6 +138,46 @@ This allows you to avoid relying on webhooks sent by external APIs, which can be
 />
 </div>

+### Summary Copilot for Steps
+
+From your code, the AI assistant can generate a summary for flow steps.
+
+<video
+	className="border-2 rounded-xl object-cover w-full h-full dark:border-gray-800"
+	controls
+	src="/videos/summary_copilot_steps.mp4"
+/>
+
+### Step Input Copilot
+
+When adding a new step to a flow, the AI assistant will suggest inputs based on the previous steps' results and flow inputs.
+
+<video
+	className="border-2 rounded-xl object-cover w-full h-full dark:border-gray-800"
+	controls
+	src="/videos/step_input_copilot.mp4"
+/>
+
+### Flow Loops Iterator Expressions from Context
+
+When adding a [for loop](../../flows/12_flow_loops.md), the AI assistant will suggest iterator expressions based on the previous steps' results.
+
+<video
+	className="border-2 rounded-xl object-cover w-full h-full dark:border-gray-800"
+	controls
+	src="/videos/iterator_prefill.mp4"
+/>
+
+### Flow Branches Predicate Expressions from Prompts
+
+When adding a [branch](../../flows/13_flow_branches.md), the AI assistant will suggest predicate expressions from a prompt.
+
+<video
+	className="border-2 rounded-xl object-cover w-full h-full dark:border-gray-800"
+	controls
+	src="/videos/branch_predicate_copilot.mp4"
+/>
+
 ## Windmill AI Code Completion

 When Code Completion is enabled, Windmill AI suggests completions as you type in all code editors (Script, Flow, Apps).

@@ -137,3 +187,11 @@ When Code Completion is enabled, Windmill AI suggests completions as you type in
 	controls
 	src="/videos/code_autopilot.mp4"
 />
+
+## CRON Schedules from Prompt
+
+<video
+	className="border-2 rounded-xl object-cover w-full h-full dark:border-gray-800"
+	controls
+	src="/videos/cron_from_prompt.mp4"
+/>
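
As a concrete illustration of the kind of output "CRON Schedules from Prompt" produces, a natural-language prompt maps to a CRON expression. The pairs below are hypothetical and use standard five-field cron syntax; Windmill's own schedule format may differ (for example by including a seconds field).

```python
# Hypothetical prompt-to-schedule pairs in standard five-field cron syntax
# (minute hour day-of-month month day-of-week). These are illustrative,
# not outputs captured from Windmill AI.
examples = {
    "every Monday at 9:00": "0 9 * * 1",
    "every day at midnight": "0 0 * * *",
    "every 15 minutes": "*/15 * * * *",
}

# Each expression has exactly five whitespace-separated fields.
for prompt, schedule in examples.items():
    assert len(schedule.split()) == 5
```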

docs/core_concepts/27_data_pipelines/index.mdx

Lines changed: 10 additions & 0 deletions

@@ -67,6 +67,16 @@ Clicking on one of those buttons, a drawer will open displaying the content of t
 From there you always have the possibility to use the S3 client library of your choice to read and write to S3.
 That being said, Polars and DuckDB can read/write directly from/to files stored in S3. Windmill now ships with helpers to make the entire data processing mechanics very cohesive.

+More details on S3 integration:
+
+<div className="grid grid-cols-2 gap-6 mb-4">
+	<DocCard
+		title="Persistent Storage - S3 Integration"
+		description="Connect your Windmill workspace to your S3 bucket or your Azure Blob storage."
+		href="/docs/core_concepts/persistent_storage#connect-your-windmill-workspace-to-your-s3-bucket-or-you-azure-blob-storage"
+	/>
+</div>
+
 ### Canonical data pipeline in Windmill w/ Polars and DuckDB

 With S3 as the external store, a transformation script in a flow will typically perform:

docs/flows/12_flow_loops.md

Lines changed: 9 additions & 1 deletion

@@ -20,7 +20,15 @@ There are 4 configuration options:

 ### Iterator expression

-The [JS expression](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Expressions_and_Operators) that will be evaluated to get the list of items to iterate over. You can also [connect with a previous result](./16_architecture.mdx) that contain several items, it will iterate over all of them.
+The [JavaScript expression](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Expressions_and_Operators) that will be evaluated to get the list of items to iterate over. You can also [connect with a previous result](./16_architecture.mdx) that contains several items; it will iterate over all of them.
+
+It can be pre-filled automatically by [Windmill AI](../core_concepts/22_ai_generation/index.mdx) from the flow context:
+
+<video
+	className="border-2 rounded-xl object-cover w-full h-full dark:border-gray-800"
+	controls
+	src="/videos/iterator_prefill.mp4"
+/>

 ### Skip failure

0 commit comments
