1 change: 1 addition & 0 deletions .env.example
@@ -60,6 +60,7 @@ DATABASE_NAME=baserow
# BASEROW_OPENAI_UPLOADED_FILE_SIZE_LIMIT_MB=
# BASEROW_MAX_IMPORT_FILE_SIZE_MB=
# BASEROW_UNIQUE_ROW_VALUES_SIZE_LIMIT=
# BASEROW_AI_FIELD_MAX_CONCURRENT_GENERATIONS=

# BASEROW_AUTOMATION_HISTORY_PAGE_SIZE_LIMIT=
# BASEROW_AUTOMATION_WORKFLOW_RATE_LIMIT_MAX_RUNS=
@@ -158,7 +158,7 @@ def _build_connection(self):
         return psycopg.connect(
             host=self.host,
             port=self.port,
-            database=self.database,
+            dbname=self.database,
             user=self.username,
         )

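A note on the one-word change above: psycopg 3 passes connection keyword arguments straight through to libpq, which recognizes `dbname` but not the `database` alias that psycopg2 used to accept, so the backup_baserow command's connection helper has to use `dbname`. The sketch below shows the corrected call shape under that assumption; the wrapper class and its attributes are illustrative, not Baserow's actual backup code.

```python
# Illustrative wrapper around the corrected psycopg 3 call; not Baserow's actual code.
import psycopg


class BackupConnection:
    def __init__(self, host, port, database, username):
        self.host = host
        self.port = port
        self.database = database
        self.username = username

    def _build_connection(self):
        # psycopg 3 forwards these keywords to libpq, so the database name must be
        # passed as "dbname"; the psycopg2-era "database" keyword is not accepted here.
        return psycopg.connect(
            host=self.host,
            port=self.port,
            dbname=self.database,
            user=self.username,
        )
```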
@@ -0,0 +1,9 @@
{
"type": "bug",
"message": "Fix backup_baserow management command by using correct pg3 dbname.",
"issue_origin": "github",
"issue_number": 4308,
"domain": "core",
"bullet_points": [],
"created_at": "2025-11-28"
}
@@ -0,0 +1,9 @@
{
"type": "feature",
"message": "Run AI field generation in parallel",
"issue_origin": "github",
"issue_number": 4227,
"domain": "database",
"bullet_points": [],
"created_at": "2025-11-20"
}
1 change: 1 addition & 0 deletions docker-compose.yml
@@ -211,6 +211,7 @@ x-backend-variables:
BASEROW_MISTRAL_MODELS:
BASEROW_OLLAMA_HOST:
BASEROW_OLLAMA_MODELS:
BASEROW_AI_FIELD_MAX_CONCURRENT_GENERATIONS:
BASEROW_SERVE_FILES_THROUGH_BACKEND:
BASEROW_SERVE_FILES_THROUGH_BACKEND_PERMISSION:
BASEROW_SERVE_FILES_THROUGH_BACKEND_EXPIRE_SECONDS:
1 change: 1 addition & 0 deletions docs/installation/configuration.md
@@ -148,6 +148,7 @@ The installation methods referred to in the variable descriptions are:
| BASEROW\_MISTRAL\_MODELS | Provide a comma separated list of Mistral models (https://docs.mistral.ai/getting-started/models/models_overview/) that you would like to enable in the instance (e.g. `mistral-large-latest,mistral-small-latest`). Note that this only works if a Mistral API key is set. If this variable is not provided, the user won't be able to choose a model. | |
| BASEROW\_OLLAMA\_HOST | Provide an OLLAMA host to allow using OLLAMA for generative AI features like the AI field. | |
| BASEROW\_OLLAMA\_MODELS | Provide a comma separated list of Ollama models (https://ollama.com/library) that you would like to enable in the instance (e.g. `llama2`). Note that this only works if an Ollama host is set. If this variable is not provided, the user won't be able to choose a model. | |
| BASEROW\_AI\_FIELD\_MAX\_CONCURRENT\_GENERATIONS | When AI field values are regenerated in bulk (e.g. for a whole table, for all empty rows, or for a selection of rows), this controls how many concurrent requests are issued to the AI model to generate the values. | 5 |

### Backend Misc Configuration
| Name | Description | Defaults |
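The variable documented above caps concurrency when AI field values are generated for many rows at once. This diff only shows the setting (default 5) and a settings comment saying it limits a thread pool size, not the generation code itself, so the following is a minimal sketch of how such a cap typically works with a thread pool; the function names and the hard-coded limit are assumptions.

```python
# Minimal sketch of bounding parallel AI generations with a thread pool.
# In Baserow the limit would come from settings.BASEROW_AI_FIELD_MAX_CONCURRENT_GENERATIONS;
# the constant and functions below are illustrative only.
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT_GENERATIONS = 5


def generate_ai_value(row_id: int) -> str:
    # Placeholder for one request to the configured generative AI model.
    return f"value for row {row_id}"


def generate_ai_values(row_ids: list[int]) -> list[str]:
    # At most MAX_CONCURRENT_GENERATIONS requests are in flight at any time;
    # map() preserves the input order of the rows.
    with ThreadPoolExecutor(max_workers=MAX_CONCURRENT_GENERATIONS) as pool:
        return list(pool.map(generate_ai_value, row_ids))


if __name__ == "__main__":
    print(generate_ai_values(list(range(12))))
```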
1 change: 0 additions & 1 deletion premium/backend/src/baserow_premium/api/fields/views.py
@@ -129,7 +129,6 @@ def post(self, request: Request, field_id: int, data) -> Response:
             context=ai_field.table,
         )

-        GenerateAIValuesJobType().get_valid_generative_ai_model_type_or_raise(ai_field)
         job = JobHandler().create_and_start_job(
             request.user,
             GenerateAIValuesJobType.type,
@@ -2,6 +2,8 @@

 from django.core.exceptions import ImproperlyConfigured

+from baserow.config.settings.utils import try_int
+

 def setup(settings):
     """
@@ -33,3 +35,8 @@ def setup(settings):
     settings.BASEROW_PREMIUM_GROUPED_AGGREGATE_SERVICE_MAX_AGG_BUCKETS = (
         BASEROW_PREMIUM_GROUPED_AGGREGATE_SERVICE_MAX_AGG_BUCKETS
     )
+
+    # Used to limit thread pool size for running AI field generation in parallel
+    settings.BASEROW_AI_FIELD_MAX_CONCURRENT_GENERATIONS = try_int(
+        os.getenv("BASEROW_AI_FIELD_MAX_CONCURRENT_GENERATIONS"), 5
+    )
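The `try_int` helper itself is not part of this diff; based on its name and the call shape above, it presumably parses the environment value as an integer and falls back to the given default when the variable is unset or malformed. A stand-in with that assumed behavior is sketched below.

```python
# Assumed behavior of a try_int-style helper; Baserow's real
# baserow.config.settings.utils.try_int is not shown in this diff.
import os


def try_int(value, default):
    """Return value parsed as an int, or default if it is missing or not a valid integer."""
    try:
        return int(value)
    except (TypeError, ValueError):
        return default


# Mirrors the settings line above: unset or malformed values fall back to 5.
BASEROW_AI_FIELD_MAX_CONCURRENT_GENERATIONS = try_int(
    os.getenv("BASEROW_AI_FIELD_MAX_CONCURRENT_GENERATIONS"), 5
)
print(BASEROW_AI_FIELD_MAX_CONCURRENT_GENERATIONS)
```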