feat: add project pages and more project settings config #24
Conversation
…ions and file creation.
Also bump package version to 0.1.0 and update pnpm-lock.yaml to include tiptap, hocuspocus, yjs, and related dependency changes.
📝 Walkthrough

This pull request introduces repository configuration management with schema and naming convention support, implements lazy-loaded email transporter initialization, adds file creation with frontmatter generation, redesigns the sidebar with collapsible layout, refactors authentication blacklist error handling, and creates a new projects dashboard page.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client as Client (Browser)
    participant UI as FileExplorer Component
    participant Handler as createFileHandler
    participant Service as RepoService
    participant GitHub as GitHub API
    Client->>UI: Create file with path & name
    UI->>UI: Fetch applicable config<br/>by matching path
    UI->>UI: Validate file name<br/>against naming_convention
    alt Validation fails
        UI->>Client: Display error message
    else Validation passes
        UI->>UI: Generate frontmatter<br/>from schema (YAML)
        UI->>Handler: POST /create-file<br/>(path, content with frontmatter)
        Handler->>Service: validateAllowedDirectory(path)
        Service->>Service: Match path against<br/>configured paths
        alt Path not allowed
            Service-->>Handler: { allowed: false }
            Handler-->>Client: Error (403)
        else Path allowed
            Service-->>Handler: { allowed: true, config }
            Handler->>Service: bulkUpdateFiles(owner, repo,<br/>[{ path, content }])
            Service->>GitHub: Get latest commit tree
            GitHub-->>Service: base_tree_sha
            Service->>GitHub: Create new tree<br/>with base_tree_sha
            GitHub-->>Service: new_tree_sha
            Service->>GitHub: Create new commit<br/>with new_tree_sha
            GitHub-->>Service: new_commit_sha
            Service->>GitHub: Update ref<br/>to new_commit_sha
            GitHub-->>Service: Success
            Service-->>Handler: commitSha
            Handler-->>Client: 201 Created
            UI->>UI: Reload directory<br/>contents
        end
    end
```
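The "generate frontmatter from schema" step in the diagram is not shown in this excerpt. A minimal sketch of what it might look like, assuming a simple field list as the schema shape (the `SchemaField` type and `generateFrontmatter` name are hypothetical, not code from this PR):

```ts
// Hypothetical schema shape - the PR's actual config format is not shown here.
type SchemaField = { name: string; default?: string | number | boolean };

// Build a YAML frontmatter block from a list of schema fields,
// falling back to an empty string when no default is configured.
function generateFrontmatter(schema: SchemaField[]): string {
  const lines = schema.map(
    (field) => `${field.name}: ${JSON.stringify(field.default ?? "")}`,
  );
  return ["---", ...lines, "---", ""].join("\n");
}

// Example:
// generateFrontmatter([{ name: "title", default: "Untitled" }, { name: "draft", default: true }])
// => '---\ntitle: "Untitled"\ndraft: true\n---\n'
```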
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 4
🤖 Fix all issues with AI agents
In `@package.json`:
- Around lines 33-41: Update dependencies to use the Tiptap v3 collaboration
caret and remove the old v2 and unused v1 packages: replace
"@tiptap/extension-collaboration-cursor" with
"@tiptap/extension-collaboration-caret@^3.x.x" so it matches the other
"@tiptap/*" v3 packages (e.g., "@tiptap/react", "@tiptap/starter-kit"), and
remove any "tiptap" v1 entry from dependencies to avoid unused/ incompatible
packages; ensure package.json only lists the v3 collaboration caret and then run
install to update lockfile.
In `@src/components/dashboard/FileExplorer.tsx`:
- Around lines 399-411: The code uses btoa(initialContent) which throws on
non-ASCII chars; replace it with a Unicode-safe base64 helper (e.g., a
toBase64(str) function) and call that instead of btoa when building the POST
payload for the create-file request in FileExplorer.tsx; update the use-site
where (api.projects as
any)[projectId].repo[repoOwner][repoName]["create-file"].$post is called to use
toBase64(initialContent) so frontmatter with non-ASCII characters is encoded
safely.
In `@src/components/dashboard/ProjectSettingsModal.tsx`:
- Around lines 50-62: The save currently builds payload using settings ? {
...fields... } : { public_directories } which causes a partial PUT when settings
is null; update the save flow in ProjectSettingsModal (the code that builds
payload / handleSave or saveSettings) to avoid overwriting project settings:
either block/disable save until settings is loaded (use a settingsLoaded or
isFetching flag) or always include explicit default values for keys
(allow_file_creation, allow_file_editing, allow_file_deletion, require_approval,
auto_merge, max_file_size, allowed_extensions, collaborator_message, etc.) when
settings is null, or switch the network call from PUT to a PATCH/merge endpoint
so partial updates are safe. Ensure the change references the existing payload
and settings variables so the API won’t receive a replace with only
public_directories.
In `@src/lib/server/repo/handlers.ts`:
- Around lines 132-168: createFileHandler is receiving base64-encoded content
from the client but passes it straight to repoService.bulkUpdateFiles, causing
the base64 string to be written to git; decode the incoming content before
calling bulkUpdateFiles (e.g. detect base64 and run a decode step on the content
variable), so modify createFileHandler to transform content from base64 to the
original text prior to invoking repoService.bulkUpdateFiles; keep all existing
validation (projectService.checkProjectAccess,
repoService.validateAllowedDirectory, repoService.validateFileName) and only
change the assignment/usage of the content passed into bulkUpdateFiles.
🧹 Nitpick comments (5)
src/lib/server/email/service.ts (2)
28-45: Redundant null check after `isEmailEnabled()`.

Since `isEmailEnabled()` internally calls `getEmailTransporter()` and returns `getEmailTransporter() !== null`, when line 29 passes, `getEmailTransporter()` on line 39 will return the same cached transporter (not null). The null check on lines 40-45 is unreachable in normal execution. While this defensive coding is harmless, it does result in `getEmailTransporter()` being called twice. Consider simplifying:

♻️ Suggested simplification
```diff
-  // Check if email is enabled using the function call
-  if (!isEmailEnabled()) {
+  const transporter = getEmailTransporter();
+  if (!transporter) {
     console.warn(
       "[Email] Email service is not configured. Skipping email send.",
     );
     return {
       success: false,
       error: "Email service not configured",
     };
   }
-
-  const transporter = getEmailTransporter();
-  if (!transporter) {
-    return {
-      success: false,
-      error: "Email transporter initialization failed",
-    };
-  }
```
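For reference, the cached lazy-initialization pattern this comment relies on looks roughly like the sketch below. Only `getEmailTransporter`, `isEmailEnabled`, and `GMAIL_USER` appear in the PR; the cache variables, the `GMAIL_APP_PASSWORD` variable name, and the transport options are assumptions for illustration:

```ts
import nodemailer, { type Transporter } from "nodemailer";

// Module-level cache: the transporter is created on the first call, and the
// same instance (or null) is returned on every later call.
let transporter: Transporter | null = null;
let initialized = false;

export function getEmailTransporter(): Transporter | null {
  if (!initialized) {
    initialized = true;
    const user = process.env.GMAIL_USER;
    const pass = process.env.GMAIL_APP_PASSWORD; // assumed variable name
    if (user && pass) {
      transporter = nodemailer.createTransport({
        service: "gmail",
        auth: { user, pass },
      });
    }
  }
  return transporter;
}

// This caching is why a passing isEmailEnabled() check implies the next
// getEmailTransporter() call cannot return null.
export const isEmailEnabled = (): boolean => getEmailTransporter() !== null;
```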
48-49: Consider adding a fallback for `gmailUser`.

While `GMAIL_USER` should exist if the transporter was created successfully, explicitly handling a potential `undefined` value would make the `from` field on line 70 more robust. A fallback or assertion would prevent sending emails with an invalid `from` address like `"Mini CMS" <undefined>`.

♻️ Suggested defensive check
```diff
   const gmailUser = process.env.GMAIL_USER;
+  if (!gmailUser) {
+    return {
+      success: false,
+      error: "GMAIL_USER environment variable not set",
+    };
+  }
   const fromName = process.env.EMAIL_FROM_NAME || "Mini CMS";
```

src/pages/dashboard/projects.astro (1)
48-61: Consider adding user-facing error feedback for failed project fetches.

When project fetching fails, the error is logged but the user sees an empty projects list with no indication that something went wrong. Consider setting an error state to display a user-friendly message.
Suggested improvement
```diff
+let projectsError: string | null = null;
 let projects: any[] = [];
 if (currentOrg && userId) {
   try {
     const projectsResponse = await userProjects.fetch(userId);
     const projectsData = projectsResponse?.data;
     if (projectsData) {
       projects = projectsData.filter(
         (project: { org_id: any; }) => project.org_id === currentOrg.id,
       );
     }
   } catch (error) {
     console.error("Error fetching projects:", error);
+    projectsError = "Failed to load projects. Please try again.";
   }
 }
```

Then render the error in the template when `projectsError` is set.

src/components/dashboard/FileExplorer.tsx (2)
344-355: Duplicated validation logic with server-side code.

This `validateFileName` function duplicates the implementation in `src/lib/server/repo/service.ts` (lines 132-145). While client-side validation improves UX, consider extracting this to a shared utility to ensure consistency and reduce maintenance burden.
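A shared utility along these lines could live somewhere both sides import from. This is a sketch only: the module path and the set of supported conventions are assumptions, and the PR's actual `validateFileName` implementation is not shown in this excerpt.

```ts
// e.g. src/lib/shared/naming.ts - importable from both client and server.
export type NamingConvention = "kebab-case" | "snake_case" | "camelCase";

// One regex per supported convention (the real project may support others).
const patterns: Record<NamingConvention, RegExp> = {
  "kebab-case": /^[a-z0-9]+(-[a-z0-9]+)*$/,
  snake_case: /^[a-z0-9]+(_[a-z0-9]+)*$/,
  camelCase: /^[a-z][a-zA-Z0-9]*$/,
};

// Validate a file name (without extension) against a naming convention.
export function validateFileName(
  name: string,
  convention: NamingConvention,
): boolean {
  return patterns[convention].test(name);
}
```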
363-376: Config matching logic may have edge cases.

The condition `fullPath.startsWith(c.path + "/") || fullPath === c.path || createPath === c.path` could match unintended configs. For example, if you have configs for `src` and `src-lib`, a file at `src-lib/foo.md` might incorrectly match the `src` config due to the `createPath === c.path` fallback.

Consider a more precise matching approach
```diff
-const config = repoConfig.find(c => fullPath.startsWith(c.path + "/") || fullPath === c.path || createPath === c.path);
+const config = repoConfig.find(c =>
+  fullPath.startsWith(c.path + "/") ||
+  fullPath === c.path ||
+  (createPath && (createPath === c.path || createPath.startsWith(c.path + "/")))
+);
```
| "@tiptap/extension-collaboration": "^3.15.3", | ||
| "@tiptap/extension-collaboration-cursor": "^2.26.2", | ||
| "@tiptap/extension-table": "^3.15.3", | ||
| "@tiptap/extension-table-cell": "^3.15.3", | ||
| "@tiptap/extension-table-header": "^3.15.3", | ||
| "@tiptap/extension-table-row": "^3.15.3", | ||
| "@tiptap/react": "^3.15.3", | ||
| "@tiptap/starter-kit": "^3.15.3", | ||
| "@toast-ui/editor": "^3.2.2", |
🧩 Analysis chain
🏁 Script executed:
```shell
# First, let's locate and read the package.json file to verify versions
find . -maxdepth 2 -name "package.json" -type f
```

Repository: Lftobs/mini-cms
Length of output: 73
🏁 Script executed:
```shell
# Search for any imports of the legacy 'tiptap' v1 package
rg -n --type=js --type=ts --type=tsx --type=jsx --type=astro "from ['\"]tiptap['\"]|require\\(['\"]tiptap['\"]\\)" -A2 -B2
```

Repository: Lftobs/mini-cms
Length of output: 86
🌐 Web query:
Tiptap `@tiptap/extension-collaboration-cursor` v2 compatibility with `@tiptap/*` v3 packages
💡 Result:
@tiptap/extension-collaboration-cursor v2 is not compatible with @tiptap/* v3 packages—you generally can’t mix major versions in Tiptap because of breaking API changes and peer dependency expectations.
In Tiptap v3, the cursor extension was renamed from CollaborationCursor to CollaborationCaret. So for a v3 editor you should use:
- `@tiptap/extension-collaboration` (v3)
- `@tiptap/extension-collaboration-caret` (v3) (tiptap.dev)
Practical rule: keep your Tiptap packages on the same major version (all v2 or all v3). (tiptap.dev)
Citations:
- 1: https://tiptap.dev/docs/resources/whats-new?utm_source=openai
- 2: https://tiptap.dev/docs/guides/upgrade-tiptap-v2?utm_source=openai
🌐 Web query:
Are `@tiptap/extension-collaboration-cursor` v2 and `@tiptap/*` v3 packages compatible?
💡 Result:
No—don’t mix Tiptap major versions.
- `@tiptap/extension-collaboration-cursor@v2` is a Tiptap v2 extension and is not intended to work with `@tiptap/*@v3` (v3 is a breaking major release, and packages are expected to be on the same major version). [1]
- In Tiptap v3, the old `CollaborationCursor` extension was renamed; you should use `@tiptap/extension-collaboration-caret@v3` instead (along with `@tiptap/extension-collaboration@v3`, `@tiptap/core@v3`, etc.). [2]
What to do: upgrade to the v3 caret package:
- install `@tiptap/extension-collaboration-caret` (v3) rather than `@tiptap/extension-collaboration-cursor` (v2). [2]
Sources: Tiptap v3 upgrade/changes docs. [1], [2]
🏁 Script executed:
```shell
# Search for legacy tiptap v1 imports - fix file type issue
rg -n "from ['\"]tiptap['\"]|require\\(['\"]tiptap['\"]\\)" -A2 -B2
```

Repository: Lftobs/mini-cms
Length of output: 41
🏁 Script executed:
```shell
# Read the package.json file to verify all Tiptap versions
cat package.json | head -100
```

Repository: Lftobs/mini-cms
Length of output: 2303
🌐 Web query:
Astro v5 TypeScript version requirement
💡 Result:
Astro v5 itself does not declare a required TypeScript version (it doesn’t list typescript as a dependency/peer dependency), so TypeScript is optional unless you’re using TS tooling. [1]
If you use Astro’s type-checking tooling (astro check / @astrojs/check), it requires TypeScript ^5.0.0. [2]
Sources: [1] [2]
Replace `@tiptap/extension-collaboration-cursor` v2 with the v3 equivalent `@tiptap/extension-collaboration-caret`.

The v2 and v3 Tiptap packages are incompatible due to breaking API changes. In Tiptap v3, the `CollaborationCursor` extension was renamed to `CollaborationCaret`. Replace the v2 package with `@tiptap/extension-collaboration-caret@^3.x.x` to match your other Tiptap v3 dependencies.

Additionally, the `tiptap` v1 package in dependencies is unused and can be removed to clean up the dependency tree.
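Applied, the Tiptap block in package.json would look something like the snippet below. Pinning the caret extension to the same `^3.15.3` range as its siblings is an assumption; verify the actual published version before committing.

```json
    "@tiptap/extension-collaboration": "^3.15.3",
    "@tiptap/extension-collaboration-caret": "^3.15.3",
    "@tiptap/extension-table": "^3.15.3",
    "@tiptap/extension-table-cell": "^3.15.3",
    "@tiptap/extension-table-header": "^3.15.3",
    "@tiptap/extension-table-row": "^3.15.3",
    "@tiptap/react": "^3.15.3",
    "@tiptap/starter-kit": "^3.15.3",
```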
🤖 Prompt for AI Agents
In `@package.json` around lines 33-41, update dependencies to use the Tiptap v3
collaboration caret and remove the old v2 and unused v1 packages: replace
"@tiptap/extension-collaboration-cursor" with
"@tiptap/extension-collaboration-caret@^3.x.x" so it matches the other
"@tiptap/*" v3 packages (e.g., "@tiptap/react", "@tiptap/starter-kit"), and
remove any "tiptap" v1 entry from dependencies to avoid unused/ incompatible
packages; ensure package.json only lists the v3 collaboration caret and then run
install to update lockfile.
```diff
 const response = await (api.projects as any)[projectId].repo[
   repoOwner
 ][repoName]["create-file"].$post({
   json: {
     path: fullPath,
-    content: btoa(""),
+    content: btoa(initialContent),
     message: `Create ${fullPath}`,
   },
 });
-if (!response.ok) throw new Error(response.statusText);
+if (!response.ok) {
+  const errData = await response.json();
+  throw new Error(errData.message || response.statusText);
+}
```
`btoa()` may fail with non-ASCII characters in frontmatter.

If the generated frontmatter contains non-ASCII characters (e.g., from schema field names or default values), `btoa()` will throw an error. Consider using a Unicode-safe encoding approach.
Suggested fix for Unicode safety
```diff
-content: btoa(initialContent),
+content: btoa(unescape(encodeURIComponent(initialContent))),
```

Or use a helper function:
```ts
const toBase64 = (str: string) =>
  btoa(
    new TextEncoder()
      .encode(str)
      .reduce((data, byte) => data + String.fromCharCode(byte), ''),
  );
```
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
const response = await (api.projects as any)[projectId].repo[
  repoOwner
][repoName]["create-file"].$post({
  json: {
    path: fullPath,
    content: btoa(unescape(encodeURIComponent(initialContent))),
    message: `Create ${fullPath}`,
  },
});
if (!response.ok) {
  const errData = await response.json();
  throw new Error(errData.message || response.statusText);
}
```
🤖 Prompt for AI Agents
In `@src/components/dashboard/FileExplorer.tsx` around lines 399-411, the code
uses btoa(initialContent) which throws on non-ASCII chars; replace it with a
Unicode-safe base64 helper (e.g., a toBase64(str) function) and call that
instead of btoa when building the POST payload for the create-file request in
FileExplorer.tsx; update the use-site where (api.projects as
any)[projectId].repo[repoOwner][repoName]["create-file"].$post is called to use
toBase64(initialContent) so frontmatter with non-ASCII characters is encoded
safely.
```diff
+const payload = settings ? {
+  public_directories: JSON.stringify(dirs),
+  allow_file_creation: settings.allow_file_creation,
+  allow_file_editing: settings.allow_file_editing,
+  allow_file_deletion: settings.allow_file_deletion,
+  require_approval: settings.require_approval,
+  auto_merge: settings.auto_merge,
+  max_file_size: settings.max_file_size,
+  allowed_extensions: settings.allowed_extensions,
+  collaborator_message: settings.collaborator_message,
+} : {
   public_directories: JSON.stringify(dirs),
-  allow_file_creation: settings?.allow_file_creation ?? false,
-  allow_file_editing: settings?.allow_file_editing ?? true,
-  allow_file_deletion: settings?.allow_file_deletion ?? false,
-  require_approval: settings?.require_approval ?? true,
-  auto_merge: settings?.auto_merge ?? false,
-  max_file_size: settings?.max_file_size ?? 1048576,
-  allowed_extensions:
-    settings?.allowed_extensions ??
-    JSON.stringify([".md"]),
-  collaborator_message: settings?.collaborator_message ?? "",
 };
```
Avoid PUT with a partial payload when `settings` is null.

With `settings` missing, you now send only `public_directories` via a PUT. If the API treats PUT as replace, this can drop required flags (or fail validation) on fresh projects or when fetch fails. Consider blocking save until settings are loaded, restoring explicit defaults, or switching to a PATCH/merge endpoint if partial updates are intended.
🛠️ Possible guard (prevents partial overwrite)
```diff
 const handleSave = async () => {
setSaving(true);
try {
+ if (!settings) {
+ alert("Settings not loaded yet. Please try again.");
+ return;
+ }
const dirs = allowedDirs
.split("\n")
.map((d) => d.trim())
.filter((d) => d);
- const payload = settings ? {
- public_directories: JSON.stringify(dirs),
- allow_file_creation: settings.allow_file_creation,
- allow_file_editing: settings.allow_file_editing,
- allow_file_deletion: settings.allow_file_deletion,
- require_approval: settings.require_approval,
- auto_merge: settings.auto_merge,
- max_file_size: settings.max_file_size,
- allowed_extensions: settings.allowed_extensions,
- collaborator_message: settings.collaborator_message,
- } : {
- public_directories: JSON.stringify(dirs),
- };
+ const payload = {
+ public_directories: JSON.stringify(dirs),
+ allow_file_creation: settings.allow_file_creation,
+ allow_file_editing: settings.allow_file_editing,
+ allow_file_deletion: settings.allow_file_deletion,
+ require_approval: settings.require_approval,
+ auto_merge: settings.auto_merge,
+ max_file_size: settings.max_file_size,
+ allowed_extensions: settings.allowed_extensions,
+ collaborator_message: settings.collaborator_message,
+  };
```

🤖 Prompt for AI Agents
In `@src/components/dashboard/ProjectSettingsModal.tsx` around lines 50-62, the
save currently builds payload using settings ? { ...fields... } : {
public_directories } which causes a partial PUT when settings is null; update
the save flow in ProjectSettingsModal (the code that builds payload / handleSave
or saveSettings) to avoid overwriting project settings: either block/disable
save until settings is loaded (use a settingsLoaded or isFetching flag) or
always include explicit default values for keys (allow_file_creation,
allow_file_editing, allow_file_deletion, require_approval, auto_merge,
max_file_size, allowed_extensions, collaborator_message, etc.) when settings is
null, or switch the network call from PUT to a PATCH/merge endpoint so partial
updates are safe. Ensure the change references the existing payload and settings
variables so the API won’t receive a replace with only public_directories.
```ts
export const createFileHandler = async (c: Context) => {
  try {
    const { owner, repo } = c.req.param();
    const projectId = c.req.param("projectId");
    const { path, content, message } = await c.req.json();
    const user = c.get("user");

    if (!projectId) {
      return c.json(error("Project ID is required"), 400);
    }

    await projectService.checkProjectAccess(projectId, user.id);

    const { allowed, config } = await repoService.validateAllowedDirectory(projectId, owner, repo, path);
    if (!allowed) {
      return c.json(error("Access to this directory is not allowed"), 403);
    }

    if (config?.naming_convention) {
      const filename = path.split("/").pop() || "";
      // Remove extension
      const nameWithoutExt = filename.includes(".")
        ? filename.split(".").slice(0, -1).join(".")
        : filename;

      if (!repoService.validateFileName(nameWithoutExt, config.naming_convention)) {
        return c.json(error(`File name must follow ${config.naming_convention} convention`), 400);
      }
    }

    // We use bulkUpdateFiles for single file creation too as it handles git flow
    const result = await repoService.bulkUpdateFiles(projectId, owner, repo, [{ path, content }], message);
    return c.json(success(result), 201);
  } catch (err: any) {
    return c.json(error(err.message), err.statusCode || 500);
  }
};
```
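The "git flow" that `bulkUpdateFiles` encapsulates is the tree/commit/ref sequence from the diagram earlier. Below is a sketch of that flow using @octokit/rest; the project's actual RepoService implementation is not shown in this excerpt, so the function signature and parameter names here are illustrative only.

```ts
import { Octokit } from "@octokit/rest";

// One commit containing all file changes: read the branch head, layer a new
// tree on top of it, commit that tree, then fast-forward the branch ref.
async function bulkUpdateFiles(
  octokit: Octokit,
  owner: string,
  repo: string,
  branch: string,
  files: { path: string; content: string }[],
  message: string,
): Promise<string> {
  // Resolve the branch ref to the latest commit SHA
  const { data: ref } = await octokit.git.getRef({
    owner,
    repo,
    ref: `heads/${branch}`,
  });
  const latestCommitSha = ref.object.sha;

  // Fetch the commit to get its tree (the base_tree for the new tree)
  const { data: latestCommit } = await octokit.git.getCommit({
    owner,
    repo,
    commit_sha: latestCommitSha,
  });

  // base_tree keeps every file not listed in `files` unchanged
  const { data: newTree } = await octokit.git.createTree({
    owner,
    repo,
    base_tree: latestCommit.tree.sha,
    tree: files.map((f) => ({
      path: f.path,
      mode: "100644" as const,
      type: "blob" as const,
      content: f.content,
    })),
  });

  // Create a commit pointing at the new tree
  const { data: newCommit } = await octokit.git.createCommit({
    owner,
    repo,
    message,
    tree: newTree.sha,
    parents: [latestCommitSha],
  });

  // Move the branch ref to the new commit
  await octokit.git.updateRef({
    owner,
    repo,
    ref: `heads/${branch}`,
    sha: newCommit.sha,
  });

  return newCommit.sha;
}
```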
🧩 Analysis chain
🏁 Script executed:
```shell
# First, find and check FileExplorer.tsx around line 404
find . -name "FileExplorer.tsx" -type f
```

Repository: Lftobs/mini-cms
Length of output: 102
🏁 Script executed:
```shell
# Find service.ts and check bulkUpdateFiles implementation
find . -name "service.ts" -type f -path "*repo*"
```

Repository: Lftobs/mini-cms
Length of output: 91
🏁 Script executed:
```shell
# Search for createBlob usage in the repo
rg -n "createBlob" -A 5 --type ts --type tsx
```

Repository: Lftobs/mini-cms
Length of output: 86
🏁 Script executed:
```shell
# Check FileExplorer.tsx around line 404
cat -n ./src/components/dashboard/FileExplorer.tsx | sed -n '395,415p'
```

Repository: Lftobs/mini-cms
Length of output: 785
🏁 Script executed:
```shell
# Search for createBlob in all files
rg -n "createBlob" -A 5
```

Repository: Lftobs/mini-cms
Length of output: 492
🏁 Script executed:
```shell
# Check service.ts for bulkUpdateFiles around line 222
wc -l ./src/lib/server/repo/service.ts
```

Repository: Lftobs/mini-cms
Length of output: 95
🏁 Script executed:
```shell
# Read bulkUpdateFiles function context - around line 200-230
cat -n ./src/lib/server/repo/service.ts | sed -n '195,235p'
```

Repository: Lftobs/mini-cms
Length of output: 1589
🏁 Script executed:
```shell
# Check if there's any content decoding in the handler or service
rg -n "atob|Buffer.from.*base64|base64" ./src/lib/server/repo/
```

Repository: Lftobs/mini-cms
Length of output: 305
🏁 Script executed:
```shell
# Check the full createFileHandler to see if it processes content
cat -n ./src/lib/server/repo/handlers.ts | sed -n '132,175p'
```

Repository: Lftobs/mini-cms
Length of output: 1827
Decode base64 content before passing to `bulkUpdateFiles`.

The client sends base64-encoded content (FileExplorer.tsx line 404: `btoa(initialContent)`), but the handler passes it directly to `bulkUpdateFiles` without decoding. The service then sends this base64 string to GitHub's API with `encoding: "utf-8"`, causing the base64 string itself to be written as file content instead of the decoded original content.

Decode the content before passing to `bulkUpdateFiles`:
Fix: Decode base64 on server
```diff
+  // Decode base64 content from client
+  const decodedContent = Buffer.from(content, 'base64').toString('utf-8');
+
   // We use bulkUpdateFiles for single file creation too as it handles git flow
-  const result = await repoService.bulkUpdateFiles(projectId, owner, repo, [{ path, content }], message);
+  const result = await repoService.bulkUpdateFiles(projectId, owner, repo, [{ path, content: decodedContent }], message);
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
export const createFileHandler = async (c: Context) => {
  try {
    const { owner, repo } = c.req.param();
    const projectId = c.req.param("projectId");
    const { path, content, message } = await c.req.json();
    const user = c.get("user");

    if (!projectId) {
      return c.json(error("Project ID is required"), 400);
    }

    await projectService.checkProjectAccess(projectId, user.id);

    const { allowed, config } = await repoService.validateAllowedDirectory(projectId, owner, repo, path);
    if (!allowed) {
      return c.json(error("Access to this directory is not allowed"), 403);
    }

    if (config?.naming_convention) {
      const filename = path.split("/").pop() || "";
      // Remove extension
      const nameWithoutExt = filename.includes(".")
        ? filename.split(".").slice(0, -1).join(".")
        : filename;

      if (!repoService.validateFileName(nameWithoutExt, config.naming_convention)) {
        return c.json(error(`File name must follow ${config.naming_convention} convention`), 400);
      }
    }

    // Decode base64 content from client
    const decodedContent = Buffer.from(content, 'base64').toString('utf-8');

    // We use bulkUpdateFiles for single file creation too as it handles git flow
    const result = await repoService.bulkUpdateFiles(projectId, owner, repo, [{ path, content: decodedContent }], message);
    return c.json(success(result), 201);
  } catch (err: any) {
    return c.json(error(err.message), err.statusCode || 500);
  }
};
```
🤖 Prompt for AI Agents
In `@src/lib/server/repo/handlers.ts` around lines 132-168, createFileHandler is
receiving base64-encoded content from the client but passes it straight to
repoService.bulkUpdateFiles, causing the base64 string to be written to git;
decode the incoming content before calling bulkUpdateFiles (e.g. detect base64
and run a decode step on the content variable), so modify createFileHandler to
transform content from base64 to the original text prior to invoking
repoService.bulkUpdateFiles; keep all existing validation
(projectService.checkProjectAccess, repoService.validateAllowedDirectory,
repoService.validateFileName) and only change the assignment/usage of the
content passed into bulkUpdateFiles.
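Putting the two halves of the fix together, the intended round trip is: encode UTF-8 bytes to base64 in the browser, decode them back on the Node side. A self-contained sketch (illustration only, not code from this PR):

```ts
// Client side (browser): Unicode-safe base64, avoiding btoa's Latin-1 limit.
const toBase64 = (str: string): string =>
  btoa(
    Array.from(new TextEncoder().encode(str), (byte) =>
      String.fromCharCode(byte),
    ).join(""),
  );

// Server side (Node): base64 back to the original UTF-8 text.
const fromBase64 = (b64: string): string =>
  Buffer.from(b64, "base64").toString("utf-8");

// Frontmatter with non-ASCII characters survives the trip intact.
const frontmatter = '---\ntitle: "Café notes"\n---\n';
console.assert(fromBase64(toBase64(frontmatter)) === frontmatter);
```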
Summary by CodeRabbit
New Features
Chores
✏️ Tip: You can customize this high-level summary in your review settings.