
feat: add project pages and more project settings config#24

Open
Lftobs wants to merge 5 commits into main from feat/sync-editor

Conversation

@Lftobs
Owner

@Lftobs Lftobs commented Jan 23, 2026

Summary by CodeRabbit

  • New Features

    • Collapsible sidebar for improved responsive mobile experience
    • Configuration-driven file creation with automatic naming convention enforcement and schema-based templates
    • New projects dashboard for managing organization projects
  • Chores

    • Updated version to 0.1.0
    • Added real-time collaboration and editing libraries to support future enhancements


@vercel

vercel bot commented Jan 23, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project: mini-cms | Deployment: Ready | Review: Preview, Comment | Updated (UTC): Jan 23, 2026 0:49am

@coderabbitai
Contributor

coderabbitai bot commented Jan 23, 2026

📝 Walkthrough


This pull request introduces repository configuration management with schema and naming convention support, implements lazy-loaded email transporter initialization, adds file creation with frontmatter generation, redesigns the sidebar with collapsible layout, refactors authentication blacklist error handling, and creates a new projects dashboard page.

Changes

Cohort / File(s) Summary
Dependencies & Versioning
package.json
Version bumped from 0.0.1 to 0.1.0; expanded dependencies with collaboration libraries (@hocuspocus/provider, @hocuspocus/server, yjs, tiptap extensions) and UI packages.
Repository Configuration & File Creation
src/lib/server/repo/service.ts,
src/lib/server/repo/handlers.ts,
src/lib/server/repo/index.ts
New getRepoConfig() method returns array of config objects (path, schema, naming_convention); validateAllowedDirectory() now returns matched config; new validateFileName() enforces naming conventions (kebab-case, snake_case, camelCase). New GET /config and POST /create-file routes added; createFileHandler validates naming conventions and enforces access control.
File Explorer Configuration
src/components/dashboard/FileExplorer.tsx
Replaced allowedDirectories with config-driven repoConfig array; added validateFileName() helper; file creation now validates names against naming_convention and generates optional frontmatter based on schema using YAML. Enhanced error handling and view refresh logic.
Email Service Refactor
src/lib/server/email/client.ts,
src/lib/server/email/service.ts
Transporter initialization moved to lazy-loaded getEmailTransporter() with caching and async verification; isEmailEnabled() changed from boolean export to function; sendInviteEmail() updated to use new getEmailTransporter() with null-checks. Environment variable access migrated to process.env.
Authentication & Settings
src/lib/server/auth/blacklist.ts,
src/components/dashboard/ProjectSettingsModal.tsx
Token blacklist now throws "Token validation unavailable" error on Redis failure (fail-fast behavior) instead of returning false; settings payload construction now conditionally includes full settings object fields when available.
UI Layout & Navigation
src/components/dashboard/Sidebar.astro
Sidebar redesigned with collapsible state: header exposes collapse toggle; navigation items adapt to centered icons when collapsed; headings and labels visibility controlled by isCollapsed state; dropdown interactions gated behind collapsed state.
Projects Dashboard
src/pages/dashboard/projects.astro
New page that fetches user organizations and projects, handles org context from URL params, displays projects grid with View Repo and Edit actions, renders empty states, and includes GitHub installation status alerts.
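The naming-convention enforcement described in the Repository Configuration cohort can be sketched as a small validator. This is a hypothetical illustration: the regexes and the `NamingConvention` type are assumptions, not the PR's actual `RepoService.validateFileName` code.

```typescript
// Hypothetical sketch of a naming-convention validator; the exact
// regexes used in this PR may differ.
type NamingConvention = "kebab-case" | "snake_case" | "camelCase";

const patterns: Record<NamingConvention, RegExp> = {
  "kebab-case": /^[a-z0-9]+(-[a-z0-9]+)*$/, // e.g. my-new-post
  "snake_case": /^[a-z0-9]+(_[a-z0-9]+)*$/, // e.g. my_new_post
  "camelCase": /^[a-z][a-zA-Z0-9]*$/,       // e.g. myNewPost
};

export function validateFileName(
  name: string, // file name with the extension already stripped
  convention: NamingConvention,
): boolean {
  return patterns[convention].test(name);
}
```

Running the same check on both client (for fast feedback) and server (for enforcement) is why the review below suggests extracting it into a shared utility.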

Sequence Diagram(s)

sequenceDiagram
    participant Client as Client (Browser)
    participant UI as FileExplorer Component
    participant Handler as createFileHandler
    participant Service as RepoService
    participant GitHub as GitHub API

    Client->>UI: Create file with path & name
    UI->>UI: Fetch applicable config<br/>by matching path
    UI->>UI: Validate file name<br/>against naming_convention
    alt Validation fails
        UI->>Client: Display error message
    else Validation passes
        UI->>UI: Generate frontmatter<br/>from schema (YAML)
        UI->>Handler: POST /create-file<br/>(path, content with frontmatter)
        Handler->>Service: validateAllowedDirectory(path)
        Service->>Service: Match path against<br/>configured paths
        alt Path not allowed
            Service-->>Handler: { allowed: false }
            Handler-->>Client: Error (403)
        else Path allowed
            Service-->>Handler: { allowed: true, config }
            Handler->>Service: bulkUpdateFiles(owner, repo,<br/>[{ path, content }])
            Service->>GitHub: Get latest commit tree
            GitHub-->>Service: base_tree_sha
            Service->>GitHub: Create new tree<br/>with base_tree_sha
            GitHub-->>Service: new_tree_sha
            Service->>GitHub: Create new commit<br/>with new_tree_sha
            GitHub-->>Service: new_commit_sha
            Service->>GitHub: Update ref<br/>to new_commit_sha
            GitHub-->>Service: Success
            Service-->>Handler: commitSha
            Handler-->>Client: 201 Created
            UI->>UI: Reload directory<br/>contents
        end
    end
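The "Generate frontmatter from schema (YAML)" step in the diagram can be sketched as a small pure function. The flat schema shape and per-type defaults below are assumptions; the PR's actual `DirectoryConfig` schema may be richer and may use a YAML library rather than manual serialization.

```typescript
// Hypothetical flat schema: field name -> field type.
type FieldType = "string" | "number" | "boolean" | "date";

function defaultFor(type: FieldType): string {
  switch (type) {
    case "string":
      return '""';
    case "number":
      return "0";
    case "boolean":
      return "false";
    case "date":
      return new Date().toISOString().slice(0, 10); // YYYY-MM-DD
    default:
      return '""';
  }
}

// Emits a YAML frontmatter block to prepend to the new file's content.
export function generateFrontmatter(schema: Record<string, FieldType>): string {
  const lines = Object.entries(schema).map(
    ([key, type]) => `${key}: ${defaultFor(type)}`,
  );
  return ["---", ...lines, "---", ""].join("\n");
}
```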

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~50 minutes

Possibly related PRs

Poem

🐰 A config takes flight, with schemas so bright,
Files now born with frontmatter delight,
Lazy emails load, the sidebar can fold,
Projects dashboard shines, bold and gold!

🚥 Pre-merge checks | ✅ 2 | ❌ 1
❌ Failed checks (1 warning)
Docstring Coverage: ⚠️ Warning. Docstring coverage is 60.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (2 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped because CodeRabbit’s high-level summary is enabled.
Title Check ✅ Passed The pull request title "feat: add project pages and more project settings config" accurately summarizes the main changes: new project pages (src/pages/dashboard/projects.astro) and enhanced project settings configuration (DirectoryConfig, naming conventions, file creation handlers).



@Lftobs Lftobs changed the title from "Feat/sync editor" to "feat: add project pages and more prooject settings config" Jan 23, 2026

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 4

🤖 Fix all issues with AI agents
In `@package.json`:
- Around line 33-41: Update dependencies to use the Tiptap v3 collaboration
caret and remove the old v2 and unused v1 packages: replace
"@tiptap/extension-collaboration-cursor" with
"@tiptap/extension-collaboration-caret@^3.x.x" so it matches the other
"@tiptap/*" v3 packages (e.g., "@tiptap/react", "@tiptap/starter-kit"), and
remove any "tiptap" v1 entry from dependencies to avoid unused/incompatible
packages; ensure package.json only lists the v3 collaboration caret and then run
install to update lockfile.

In `@src/components/dashboard/FileExplorer.tsx`:
- Around line 399-411: The code uses btoa(initialContent) which throws on
non-ASCII chars; replace it with a Unicode-safe base64 helper (e.g., a
toBase64(str) function) and call that instead of btoa when building the POST
payload for the create-file request in FileExplorer.tsx; update the use-site
where (api.projects as
any)[projectId].repo[repoOwner][repoName]["create-file"].$post is called to use
toBase64(initialContent) so frontmatter with non-ASCII characters is encoded
safely.

In `@src/components/dashboard/ProjectSettingsModal.tsx`:
- Around line 50-62: The save currently builds payload using settings ? {
...fields... } : { public_directories } which causes a partial PUT when settings
is null; update the save flow in ProjectSettingsModal (the code that builds
payload / handleSave or saveSettings) to avoid overwriting project settings:
either block/disable save until settings is loaded (use a settingsLoaded or
isFetching flag) or always include explicit default values for keys
(allow_file_creation, allow_file_editing, allow_file_deletion, require_approval,
auto_merge, max_file_size, allowed_extensions, collaborator_message, etc.) when
settings is null, or switch the network call from PUT to a PATCH/merge endpoint
so partial updates are safe. Ensure the change references the existing payload
and settings variables so the API won’t receive a replace with only
public_directories.

In `@src/lib/server/repo/handlers.ts`:
- Around line 132-168: createFileHandler is receiving base64-encoded content
from the client but passes it straight to repoService.bulkUpdateFiles, causing
the base64 string to be written to git; decode the incoming content before
calling bulkUpdateFiles (e.g. detect base64 and run a decode step on the content
variable), so modify createFileHandler to transform content from base64 to the
original text prior to invoking repoService.bulkUpdateFiles; keep all existing
validation (projectService.checkProjectAccess,
repoService.validateAllowedDirectory, repoService.validateFileName) and only
change the assignment/usage of the content passed into bulkUpdateFiles.
🧹 Nitpick comments (5)
src/lib/server/email/service.ts (2)

28-45: Redundant null check after isEmailEnabled().

Since isEmailEnabled() internally calls getEmailTransporter() and returns getEmailTransporter() !== null, when line 29 passes, getEmailTransporter() on line 39 will return the same cached transporter (not null). The null check on lines 40-45 is unreachable in normal execution.

While this defensive coding is harmless, it does result in getEmailTransporter() being called twice. Consider simplifying:

♻️ Suggested simplification
-    // Check if email is enabled using the function call
-    if (!isEmailEnabled()) {
+    const transporter = getEmailTransporter();
+    if (!transporter) {
         console.warn(
             "[Email] Email service is not configured. Skipping email send.",
         );
         return {
             success: false,
             error: "Email service not configured",
         };
     }
-
-    const transporter = getEmailTransporter();
-    if (!transporter) {
-        return {
-            success: false,
-            error: "Email transporter initialization failed",
-        };
-    }

48-49: Consider adding a fallback for gmailUser.

While GMAIL_USER should exist if the transporter was created successfully, explicitly handling a potential undefined value would make the from field on line 70 more robust. A fallback or assertion would prevent sending emails with an invalid from address like "Mini CMS" <undefined>.

♻️ Suggested defensive check
         const gmailUser = process.env.GMAIL_USER;
+        if (!gmailUser) {
+            return {
+                success: false,
+                error: "GMAIL_USER environment variable not set",
+            };
+        }
         const fromName = process.env.EMAIL_FROM_NAME || "Mini CMS";
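Stepping back, the lazy, cached transporter pattern these nitpicks revolve around can be shown in isolation. This is a sketch under stated assumptions: the `Transporter` interface is a stand-in for nodemailer's transporter, and `GMAIL_APP_PASSWORD` is an assumed variable name; only `GMAIL_USER` and `EMAIL_FROM_NAME` appear in the review itself.

```typescript
// Stand-in for nodemailer's transporter shape.
interface Transporter {
  sendMail(opts: { to: string; subject: string; text: string }): Promise<void>;
}

let cached: Transporter | null = null;
let initialized = false;

export function getEmailTransporter(): Transporter | null {
  if (initialized) return cached; // memoized, even when config is missing
  initialized = true;
  const user = process.env.GMAIL_USER;
  const pass = process.env.GMAIL_APP_PASSWORD; // assumed variable name
  if (!user || !pass) return null; // email disabled: incomplete config
  cached = {
    async sendMail() {
      // real code would delegate to nodemailer here
    },
  };
  return cached;
}

// A function instead of an eagerly computed boolean export,
// so the check also happens lazily.
export const isEmailEnabled = (): boolean => getEmailTransporter() !== null;
```

Because `isEmailEnabled()` is just `getEmailTransporter() !== null`, a caller that checks it and then fetches the transporter gets the same cached instance, which is exactly why the second null check flagged above is unreachable.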
src/pages/dashboard/projects.astro (1)

48-61: Consider adding user-facing error feedback for failed project fetches.

When project fetching fails, the error is logged but the user sees an empty projects list with no indication that something went wrong. Consider setting an error state to display a user-friendly message.

Suggested improvement
+let projectsError: string | null = null;
 let projects: any[] = [];
 if (currentOrg && userId) {
 	try {
 		const projectsResponse = await userProjects.fetch(userId);
 		const projectsData = projectsResponse?.data;
 		if (projectsData) {
 			projects = projectsData.filter(
 				(project: { org_id: any; }) => project.org_id === currentOrg.id,
 			);
 		}
 	} catch (error) {
 		console.error("Error fetching projects:", error);
+		projectsError = "Failed to load projects. Please try again.";
 	}
 }

Then render the error in the template when projectsError is set.

src/components/dashboard/FileExplorer.tsx (2)

344-355: Duplicated validation logic with server-side code.

This validateFileName function duplicates the implementation in src/lib/server/repo/service.ts (lines 132-145). While client-side validation improves UX, consider extracting this to a shared utility to ensure consistency and reduce maintenance burden.


363-376: Config matching logic may have edge cases.

The condition fullPath.startsWith(c.path + "/") || fullPath === c.path || createPath === c.path could match unintended configs. For example, if you have configs for src and src-lib, a file at src-lib/foo.md might incorrectly match the src config due to the createPath === c.path fallback.

Consider a more precise matching approach
-const config = repoConfig.find(c => fullPath.startsWith(c.path + "/") || fullPath === c.path || createPath === c.path);
+const config = repoConfig.find(c => 
+  fullPath.startsWith(c.path + "/") || 
+  fullPath === c.path || 
+  (createPath && (createPath === c.path || createPath.startsWith(c.path + "/")))
+);
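The hazard is easy to demonstrate with a minimal directory-matching helper (the paths here are illustrative, not from the PR):

```typescript
// Why bare startsWith(dir) is unsafe for directory checks:
// "src-lib/foo.md" begins with "src" but is not inside "src".
// Requiring an exact match or a "/" right after the prefix
// avoids the false positive.
export function pathInDirectory(filePath: string, dir: string): boolean {
  return filePath === dir || filePath.startsWith(dir + "/");
}
```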

Comment on lines +33 to +41
"@tiptap/extension-collaboration": "^3.15.3",
"@tiptap/extension-collaboration-cursor": "^2.26.2",
"@tiptap/extension-table": "^3.15.3",
"@tiptap/extension-table-cell": "^3.15.3",
"@tiptap/extension-table-header": "^3.15.3",
"@tiptap/extension-table-row": "^3.15.3",
"@tiptap/react": "^3.15.3",
"@tiptap/starter-kit": "^3.15.3",
"@toast-ui/editor": "^3.2.2",

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# First, let's locate and read the package.json file to verify versions
find . -maxdepth 2 -name "package.json" -type f



🏁 Script executed:

# Search for any imports of the legacy 'tiptap' v1 package
rg -n --type=js --type=ts --type=tsx --type=jsx --type=astro "from ['\"]tiptap['\"]|require\\(['\"]tiptap['\"]\\)" -A2 -B2



🌐 Web query:

Tiptap @tiptap/extension-collaboration-cursor v2 compatibility with @tiptap v3 packages

💡 Result:

@tiptap/extension-collaboration-cursor v2 is not compatible with @tiptap/* v3 packages—you generally can’t mix major versions in Tiptap because of breaking API changes and peer dependency expectations.

In Tiptap v3, the cursor extension was renamed from CollaborationCursor to CollaborationCaret. So for a v3 editor you should use:

  • @tiptap/extension-collaboration (v3)
  • @tiptap/extension-collaboration-caret (v3) (tiptap.dev)

Practical rule: keep your Tiptap packages on the same major version (all v2 or all v3). (tiptap.dev)



🌐 Web query:

Are @tiptap/extension-collaboration-cursor v2 and @tiptap v3 packages compatible

💡 Result:

No—don’t mix Tiptap major versions.

  • @tiptap/extension-collaboration-cursor@v2 is a Tiptap v2 extension and is not intended to work with @tiptap/*@v3 (v3 is a breaking major release, and packages are expected to be on the same major version). [1]
  • In Tiptap v3, the old CollaborationCursor extension was renamed; you should use @tiptap/extension-collaboration-caret@v3 instead (along with @tiptap/extension-collaboration@v3, @tiptap/core@v3, etc.). [2]

What to do: upgrade to the v3 caret package:

  • install @tiptap/extension-collaboration-caret (v3) rather than @tiptap/extension-collaboration-cursor (v2). [2]

Sources: Tiptap v3 upgrade/changes docs. [1], [2]


🏁 Script executed:

# Search for legacy tiptap v1 imports - fix file type issue
rg -n "from ['\"]tiptap['\"]|require\\(['\"]tiptap['\"]\\)" -A2 -B2



🏁 Script executed:

# Read the package.json file to verify all Tiptap versions
cat package.json | head -100



🌐 Web query:

Astro v5 TypeScript version requirement

💡 Result:

Astro v5 itself does not declare a required TypeScript version (it doesn’t list typescript as a dependency/peer dependency), so TypeScript is optional unless you’re using TS tooling. [1]

If you use Astro’s type-checking tooling (astro check / @astrojs/check), it requires TypeScript ^5.0.0. [2]

Sources: [1] [2]


Replace @tiptap/extension-collaboration-cursor v2 with the v3 equivalent @tiptap/extension-collaboration-caret.

The v2 and v3 Tiptap packages are incompatible due to breaking API changes. In Tiptap v3, the CollaborationCursor extension was renamed to CollaborationCaret. Replace the v2 package with @tiptap/extension-collaboration-caret@^3.x.x to match your other Tiptap v3 dependencies.

Additionally, the tiptap v1 package in dependencies is unused and can be removed to clean up the dependency tree.


Comment on lines 399 to +411
  const response = await (api.projects as any)[projectId].repo[
      repoOwner
  ][repoName]["create-file"].$post({
      json: {
          path: fullPath,
-         content: btoa(""),
+         content: btoa(initialContent),
          message: `Create ${fullPath}`,
      },
  });
- if (!response.ok) throw new Error(response.statusText);
+ if (!response.ok) {
+     const errData = await response.json();
+     throw new Error(errData.message || response.statusText);
+ }

⚠️ Potential issue | 🟡 Minor

btoa() may fail with non-ASCII characters in frontmatter.

If the generated frontmatter contains non-ASCII characters (e.g., from schema field names or default values), btoa() will throw an error. Consider using a Unicode-safe encoding approach.

Suggested fix for Unicode safety
-content: btoa(initialContent),
+content: btoa(unescape(encodeURIComponent(initialContent))),

Or use a helper function:

const toBase64 = (str: string) => btoa(new TextEncoder().encode(str).reduce((data, byte) => data + String.fromCharCode(byte), ''));
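A round-trip sketch of that helper idea (Node 16+ exposes btoa/atob as globals; `fromBase64` mirrors the server-side `Buffer.from(content, "base64").toString("utf-8")` decode). This variant uses `Array.from` instead of `reduce`, but is otherwise the same technique:

```typescript
// Unicode-safe base64 pair: encode the string as UTF-8 bytes first,
// then map each byte to a single-char binary string for btoa.
export const toBase64 = (str: string): string =>
  btoa(
    Array.from(new TextEncoder().encode(str), (b) => String.fromCharCode(b)).join(""),
  );

// Inverse: atob yields a binary string; rebuild the byte array and
// decode it as UTF-8.
export const fromBase64 = (b64: string): string =>
  new TextDecoder().decode(Uint8Array.from(atob(b64), (c) => c.charCodeAt(0)));
```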
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
const response = await (api.projects as any)[projectId].repo[
    repoOwner
][repoName]["create-file"].$post({
    json: {
        path: fullPath,
        content: btoa(unescape(encodeURIComponent(initialContent))),
        message: `Create ${fullPath}`,
    },
});
if (!response.ok) {
    const errData = await response.json();
    throw new Error(errData.message || response.statusText);
}

Comment on lines +50 to 62
-const payload = {
-    public_directories: JSON.stringify(dirs),
-    allow_file_creation: settings?.allow_file_creation ?? false,
-    allow_file_editing: settings?.allow_file_editing ?? true,
-    allow_file_deletion: settings?.allow_file_deletion ?? false,
-    require_approval: settings?.require_approval ?? true,
-    auto_merge: settings?.auto_merge ?? false,
-    max_file_size: settings?.max_file_size ?? 1048576,
-    allowed_extensions:
-        settings?.allowed_extensions ??
-        JSON.stringify([".md"]),
-    collaborator_message: settings?.collaborator_message ?? "",
-};
+const payload = settings ? {
+    public_directories: JSON.stringify(dirs),
+    allow_file_creation: settings.allow_file_creation,
+    allow_file_editing: settings.allow_file_editing,
+    allow_file_deletion: settings.allow_file_deletion,
+    require_approval: settings.require_approval,
+    auto_merge: settings.auto_merge,
+    max_file_size: settings.max_file_size,
+    allowed_extensions: settings.allowed_extensions,
+    collaborator_message: settings.collaborator_message,
+} : {
+    public_directories: JSON.stringify(dirs),
+};

⚠️ Potential issue | 🟠 Major

Avoid PUT with a partial payload when settings is null.

With settings missing, you now send only public_directories via a PUT. If the API treats PUT as replace, this can drop required flags (or fail validation) on fresh projects or when fetch fails. Consider blocking save until settings are loaded, restoring explicit defaults, or switching to a PATCH/merge endpoint if partial updates are intended.

🛠️ Possible guard (prevents partial overwrite)
 const handleSave = async () => {
   setSaving(true);
   try {
+    if (!settings) {
+      alert("Settings not loaded yet. Please try again.");
+      return;
+    }
     const dirs = allowedDirs
       .split("\n")
       .map((d) => d.trim())
       .filter((d) => d);

-    const payload = settings ? {
-      public_directories: JSON.stringify(dirs),
-      allow_file_creation: settings.allow_file_creation,
-      allow_file_editing: settings.allow_file_editing,
-      allow_file_deletion: settings.allow_file_deletion,
-      require_approval: settings.require_approval,
-      auto_merge: settings.auto_merge,
-      max_file_size: settings.max_file_size,
-      allowed_extensions: settings.allowed_extensions,
-      collaborator_message: settings.collaborator_message,
-    } : {
-      public_directories: JSON.stringify(dirs),
-    };
+    const payload = {
+      public_directories: JSON.stringify(dirs),
+      allow_file_creation: settings.allow_file_creation,
+      allow_file_editing: settings.allow_file_editing,
+      allow_file_deletion: settings.allow_file_deletion,
+      require_approval: settings.require_approval,
+      auto_merge: settings.auto_merge,
+      max_file_size: settings.max_file_size,
+      allowed_extensions: settings.allowed_extensions,
+      collaborator_message: settings.collaborator_message,
+    };

Comment on lines +132 to +168
export const createFileHandler = async (c: Context) => {
    try {
        const { owner, repo } = c.req.param();
        const projectId = c.req.param("projectId");
        const { path, content, message } = await c.req.json();
        const user = c.get("user");

        if (!projectId) {
            return c.json(error("Project ID is required"), 400);
        }

        await projectService.checkProjectAccess(projectId, user.id);

        const { allowed, config } = await repoService.validateAllowedDirectory(projectId, owner, repo, path);
        if (!allowed) {
            return c.json(error("Access to this directory is not allowed"), 403);
        }

        if (config?.naming_convention) {
            const filename = path.split("/").pop() || "";
            // Remove extension
            const nameWithoutExt = filename.includes(".")
                ? filename.split(".").slice(0, -1).join(".")
                : filename;

            if (!repoService.validateFileName(nameWithoutExt, config.naming_convention)) {
                return c.json(error(`File name must follow ${config.naming_convention} convention`), 400);
            }
        }

        // We use bulkUpdateFiles for single file creation too as it handles git flow
        const result = await repoService.bulkUpdateFiles(projectId, owner, repo, [{ path, content }], message);
        return c.json(success(result), 201);
    } catch (err: any) {
        return c.json(error(err.message), err.statusCode || 500);
    }
};

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# First, find and check FileExplorer.tsx around line 404
find . -name "FileExplorer.tsx" -type f



🏁 Script executed:

# Find service.ts and check bulkUpdateFiles implementation
find . -name "service.ts" -type f -path "*repo*"



🏁 Script executed:

# Search for createBlob usage in the repo
rg -n "createBlob" -A 5 --type ts --type tsx



🏁 Script executed:

# Check FileExplorer.tsx around line 404
cat -n ./src/components/dashboard/FileExplorer.tsx | sed -n '395,415p'



🏁 Script executed:

# Search for createBlob in all files
rg -n "createBlob" -A 5



🏁 Script executed:

# Check service.ts for bulkUpdateFiles around line 222
wc -l ./src/lib/server/repo/service.ts



🏁 Script executed:

# Read bulkUpdateFiles function context - around line 200-230
cat -n ./src/lib/server/repo/service.ts | sed -n '195,235p'



🏁 Script executed:

# Check if there's any content decoding in the handler or service
rg -n "atob|Buffer.from.*base64|base64" ./src/lib/server/repo/



🏁 Script executed:

# Check the full createFileHandler to see if it processes content
cat -n ./src/lib/server/repo/handlers.ts | sed -n '132,175p'



Decode base64 content before passing to bulkUpdateFiles.

The client sends base64-encoded content (FileExplorer.tsx line 404: btoa(initialContent)), but the handler passes it directly to bulkUpdateFiles without decoding. The service then sends this base64 string to GitHub's API with encoding: "utf-8", causing the base64 string itself to be written as file content instead of the decoded original content.

Decode the content before passing to bulkUpdateFiles:

Fix: Decode base64 on server
+        // Decode base64 content from client
+        const decodedContent = Buffer.from(content, 'base64').toString('utf-8');
+
         // We use bulkUpdateFiles for single file creation too as it handles git flow
-        const result = await repoService.bulkUpdateFiles(projectId, owner, repo, [{ path, content }], message);
+        const result = await repoService.bulkUpdateFiles(projectId, owner, repo, [{ path, content: decodedContent }], message);
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
export const createFileHandler = async (c: Context) => {
    try {
        const { owner, repo } = c.req.param();
        const projectId = c.req.param("projectId");
        const { path, content, message } = await c.req.json();
        const user = c.get("user");

        if (!projectId) {
            return c.json(error("Project ID is required"), 400);
        }

        await projectService.checkProjectAccess(projectId, user.id);

        const { allowed, config } = await repoService.validateAllowedDirectory(projectId, owner, repo, path);
        if (!allowed) {
            return c.json(error("Access to this directory is not allowed"), 403);
        }

        if (config?.naming_convention) {
            const filename = path.split("/").pop() || "";
            // Remove extension
            const nameWithoutExt = filename.includes(".")
                ? filename.split(".").slice(0, -1).join(".")
                : filename;

            if (!repoService.validateFileName(nameWithoutExt, config.naming_convention)) {
                return c.json(error(`File name must follow ${config.naming_convention} convention`), 400);
            }
        }

        // Decode base64 content from client
        const decodedContent = Buffer.from(content, 'base64').toString('utf-8');

        // We use bulkUpdateFiles for single file creation too as it handles git flow
        const result = await repoService.bulkUpdateFiles(projectId, owner, repo, [{ path, content: decodedContent }], message);
        return c.json(success(result), 201);
    } catch (err: any) {
        return c.json(error(err.message), err.statusCode || 500);
    }
};

@Lftobs Lftobs changed the title from "feat: add project pages and more prooject settings config" to "feat: add project pages and more project settings config" Jan 23, 2026