Merged
49 commits
40af9c6
test case for all function
sasi2312 Jul 24, 2024
69362ce
Merge branch 'main' of https://github.com/Code4GovTech/DMP-CMS-Backen…
sasi2312 Jul 25, 2024
156b66d
supabase query changes
sasi2312 Aug 2, 2024
fb75d0a
Merge pull request #26 from Code4GovTech/test_case
jaanbaaz Aug 2, 2024
3deea61
Merge branch 'dev' of https://github.com/Code4GovTech/DMP-CMS-Backend…
sasi2312 Aug 9, 2024
a695966
ORM convertion of all query
sasi2312 Aug 9, 2024
cfab860
code remove - supabase
sasi2312 Aug 9, 2024
76c1847
testcase changes
sasi2312 Aug 9, 2024
313f99c
package removed
sasi2312 Aug 9, 2024
ce1707f
changes
sasi2312 Aug 9, 2024
1f18949
query changes
sasi2312 Aug 9, 2024
b6f0b64
workflow changes - addded postgres vars
sasi2312 Aug 9, 2024
d9e4cb0
Merge pull request #29 from Code4GovTech/migration_orm
sasi2312 Aug 9, 2024
44b2cde
req added
sasi2312 Aug 9, 2024
2b0ede2
Merge pull request #30 from Code4GovTech/migration_orm
sasi2312 Aug 9, 2024
f74e6a2
added logs
sasi2312 Aug 21, 2024
7829f59
Merge pull request #31 from Code4GovTech/migration_orm
sasi2312 Aug 21, 2024
1c2d102
Create build-and-push.yaml
jaanbaaz Sep 12, 2024
91f2afe
Merge branch 'main' of github.com:Code4GovTech/DMP-CMS-Backend-CRON i…
jaanbaaz Nov 29, 2024
4501aff
first commit
jaanbaaz Dec 4, 2024
cb8c194
Initial setup commit
jaanbaaz Dec 4, 2024
f6cb9cc
Complete setup of alembic to manage migrations
jaanbaaz Dec 4, 2024
4ade940
Added models, migrations, and database interactions
jaanbaaz Dec 6, 2024
eb21097
added migration files
jaanbaaz Dec 9, 2024
89d70bf
Model and migration changes
jaanbaaz Dec 22, 2024
d393d52
Merge pull request #1 from Code4GovTech/dev
jaanbaaz Dec 22, 2024
bc63a9a
Model and migration changes
jaanbaaz Dec 22, 2024
77cb56d
Merge pull request #2 from Code4GovTech/dev
jaanbaaz Dec 22, 2024
4cbeec2
dmp api changes and model changes
jaanbaaz Dec 23, 2024
85b93ba
Merge pull request #3 from Code4GovTech/dev
jaanbaaz Dec 23, 2024
2d77a75
dmp api type casting
jaanbaaz Dec 23, 2024
8ca6de8
Merge pull request #4 from Code4GovTech/dev
jaanbaaz Dec 23, 2024
a738043
Relative paths for models
jaanbaaz Dec 24, 2024
8bef9ed
Merge pull request #5 from Code4GovTech/dev
jaanbaaz Dec 24, 2024
c51f330
Shared models integrated
jaanbaaz Dec 24, 2024
59342df
Merge pull request #35 from Code4GovTech/feature/migrations-submodule
jaanbaaz Dec 24, 2024
741896b
discord bot renamed
jaanbaaz Dec 24, 2024
16aa221
Merge pull request #6 from Code4GovTech/dev
jaanbaaz Dec 24, 2024
b43331b
Cleaned Submodule
shreyash-work-fl Dec 27, 2024
33d9257
Merged Repo as folder
shreyash-work-fl Dec 27, 2024
0539974
Trying git push
shreyash-work-fl Dec 27, 2024
510823f
Removed Comments
shreyash-work-fl Dec 27, 2024
fc7ec44
Merge pull request #37 from Code4GovTech/feature/limit-submodule
Shreyash-work-em Dec 27, 2024
d982ba2
updated Dockerfile and .dockerignore to build submodules
Srijan-SS02 Dec 27, 2024
141cc38
Merge pull request #38 from Code4GovTech/dockerfile
Shreyash-work-em Dec 27, 2024
463b3d2
Updated requirements.txt for flask
Srijan-SS02 Dec 27, 2024
3f372fc
Removed Submodule
shreyash-work-fl Jan 3, 2025
253f171
Readded Submodule
shreyash-work-fl Jan 3, 2025
66f5833
Cleaned Comments
shreyash-work-fl Jan 10, 2025
1 change: 1 addition & 0 deletions .dockerignore
@@ -1 +1,2 @@
.env
!.git
57 changes: 57 additions & 0 deletions .github/workflows/build-and-push.yaml
@@ -0,0 +1,57 @@
name: Build and Push Docker Image

on:
push:
branches:
- main
- dev
release:
types: [published]
env:
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}

jobs:
build-and-push:
runs-on: ubuntu-latest
# Sets the permissions granted to the `GITHUB_TOKEN` for the actions in this job.
permissions:
contents: read
packages: write
steps:

- name: Checkout code
uses: actions/checkout@v2

- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2

- name: Log in to the Container registry
uses: docker/login-action@v3
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}

- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
tags: |
# minimal
type=pep440,pattern={{version}},value=${{ github.ref_name }},enable=${{ github.event_name == 'release' }}
# branch event
type=ref,event=branch
type=raw,value=latest,enable=${{ github.event_name == 'release' }}

- name: Build and Push Docker image
uses: docker/build-push-action@v4
with:
# build-args:
context: .
push: true
cache-from: type=gha
cache-to: type=gha,mode=max
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
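The three tag rules above resolve per event: a branch push gets a branch-name tag, while a release gets a PEP 440 version tag plus `latest`. A rough Python sketch of that selection logic, for illustration only (the image name is hypothetical and this is not docker/metadata-action's actual code):

# Rough sketch of the tag rules in the workflow above; not the action's real code.
def docker_tags(event_name: str, ref_name: str) -> list[str]:
    image = "ghcr.io/code4govtech/dmp-cms-backend-cron"  # hypothetical image name
    tags = []
    if event_name == "release":
        tags.append(f"{image}:{ref_name.lstrip('v')}")  # type=pep440 version tag
        tags.append(f"{image}:latest")                  # type=raw `latest` tag
    else:
        tags.append(f"{image}:{ref_name}")              # type=ref branch tag
    return tags

print(docker_tags("push", "dev"))        # ['ghcr.io/...:dev']
print(docker_tags("release", "v1.0.0"))  # ['ghcr.io/...:1.0.0', 'ghcr.io/...:latest']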
10 changes: 10 additions & 0 deletions .github/workflows/ci.yml
@@ -66,10 +66,16 @@ jobs:
SUPABASE_KEY: ${{ secrets[format('APP_{0}_SUPABASE_KEY', needs.set_vars.outputs.APP_ENV)] }}
SUPABASE_URL: ${{ vars[format('APP_{0}_SUPABASE_URL', needs.set_vars.outputs.APP_ENV)] }}
SCHEDULER_DELAY_IN_MINS: ${{ vars[format('APP_{0}_SCHEDULER_DELAY_IN_MINS', needs.set_vars.outputs.APP_ENV)] }}
POSTGRES_DB_HOST: ${{ secrets[format('APP_{0}_POSTGRES_DB_HOST', needs.set_vars.outputs.APP_ENV)] }}
POSTGRES_DB_NAME: ${{ secrets[format('APP_{0}_POSTGRES_DB_NAME', needs.set_vars.outputs.APP_ENV)] }}
POSTGRES_DB_USER: ${{ secrets[format('APP_{0}_POSTGRES_DB_USER', needs.set_vars.outputs.APP_ENV)] }}
POSTGRES_DB_PASS: ${{ secrets[format('APP_{0}_POSTGRES_DB_PASS', needs.set_vars.outputs.APP_ENV)] }}

steps:
- name: Checkout code
uses: actions/checkout@v2


- name: Log in to the Container registry
uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1
with:
@@ -87,6 +93,10 @@ jobs:
echo "SUPABASE_URL=${SUPABASE_URL}" >> .env
echo "SUPABASE_KEY=${SUPABASE_KEY}" >> .env
echo "SCHEDULER_DELAY_IN_MINS=${SCHEDULER_DELAY_IN_MINS}" >> .env
echo "POSTGRES_DB_HOST=${POSTGRES_DB_HOST}" >> .env
echo "POSTGRES_DB_NAME=${POSTGRES_DB_NAME}" >> .env
echo "POSTGRES_DB_USER=${POSTGRES_DB_USER}" >> .env
echo "POSTGRES_DB_PASS=${POSTGRES_DB_PASS}" >> .env

mv .env ${{ env.DOT_ENV_FILE_NAME }}

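The four POSTGRES_DB_* values written into .env above feed the Postgres connection that replaces Supabase in this PR. The actual URI builder is get_postgres_uri() in the shared_migrations submodule; the sketch below only assumes its likely shape (the asyncpg driver matches the async engine created in app.py, and the default port is an assumption):

# Minimal sketch of what get_postgres_uri() likely assembles from the
# variables above; the real helper lives in shared_migrations.db.
import os

def get_postgres_uri_sketch() -> str:
    host = os.environ["POSTGRES_DB_HOST"]
    name = os.environ["POSTGRES_DB_NAME"]
    user = os.environ["POSTGRES_DB_USER"]
    password = os.environ["POSTGRES_DB_PASS"]
    return f"postgresql+asyncpg://{user}:{password}@{host}:5432/{name}"  # port assumed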
3 changes: 3 additions & 0 deletions .gitmodules
@@ -0,0 +1,3 @@
[submodule "shared_migrations"]
path = shared_migrations
url = https://github.com/Code4GovTech/shared-models-migrations.git
11 changes: 10 additions & 1 deletion Dockerfile
@@ -4,12 +4,21 @@ FROM python:3.12-slim
# Set the working directory in the container
WORKDIR /app

# Copy the current directory contents into the container at /app
# Install necessary tools
RUN apt-get update && \
apt-get install -y --no-install-recommends git openssh-client && \
rm -rf /var/lib/apt/lists/*

# Copy the current directory contents, including the .git directory
COPY . /app

# Set up the SSH agent forwarding
RUN --mount=type=ssh git submodule update --init --recursive

# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

# Expose the application port
EXPOSE 5000

# Run app.py when the container launches
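The `RUN --mount=type=ssh` instruction above only works when BuildKit is enabled and an SSH agent is forwarded into the build with `--ssh default`; without both, the submodule fetch fails. A minimal sketch of a matching build invocation (the image tag is illustrative):

# Sketch of a build command that satisfies the --mount=type=ssh line above.
import os
import subprocess

env = dict(os.environ, DOCKER_BUILDKIT="1")  # BuildKit is required for SSH mounts
subprocess.run(
    ["docker", "build", "--ssh", "default", "-t", "dmp-cms-backend-cron:dev", "."],
    env=env,
    check=True,
)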
107 changes: 74 additions & 33 deletions app.py
@@ -1,20 +1,32 @@
# app.py
from quart import Quart
import httpx
import os,markdown2
from db import SupabaseInterface
import os, markdown2, httpx
from apscheduler.schedulers.asyncio import AsyncIOScheduler
from dotenv import load_dotenv
from datetime import datetime

from datetime import datetime, timezone
# from query import PostgresORM
from shared_migrations.db import PostgresORM, get_postgres_uri
from shared_migrations.db.dmp_cron import DmpCronQueries
from utils import handle_week_data, parse_issue_description
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession
from sqlalchemy.orm import sessionmaker
from datetime import datetime
from shared_migrations.db.models import *
from sqlalchemy.pool import NullPool

# Load environment variables from .env file
load_dotenv()
delay_mins: str = os.getenv("SCHEDULER_DELAY_IN_MINS")

app = Quart(__name__)

# Initialize Quart app
app.config['SQLALCHEMY_DATABASE_URI'] = get_postgres_uri()

# Initialize Async SQLAlchemy
engine = create_async_engine(app.config['SQLALCHEMY_DATABASE_URI'], echo=False, poolclass=NullPool)
async_session = sessionmaker(autocommit=False, autoflush=False, bind=engine, class_=AsyncSession)

scheduler = AsyncIOScheduler()


@@ -88,10 +100,9 @@ async def dmp_updates():
GITHUB_TOKEN = os.getenv('GITHUB_TOKEN')
try:
TARGET_DATE = os.getenv('TARGET_DATE')
db = SupabaseInterface().get_instance()

# Loop through all dmp issues
dmp_tickets = db.get_dmp_issues()
dmp_tickets = await DmpCronQueries.get_all_dmp_issues(async_session)

for dmp in dmp_tickets:
dmp_id = dmp['id']
@@ -100,7 +111,7 @@ async def dmp_updates():
repo = dmp['repo']
owner = dmp['repo_owner']

app.logger.info("DMP_ID: "+str(dmp_id))
app.logger.info("DMP_ID: " + str(dmp_id))

# # Make the HTTP request to GitHub API
headers = {
@@ -120,18 +131,22 @@ async def dmp_updates():
# Parse issue description
print('processing description ')
issue_update = define_issue_description_update(issue_response.json())
issue_update['mentor_username'] = dmp['mentor_username'] #get from db
issue_update['contributor_username'] = dmp['contributor_username'] #get from db

issue_update['mentor_username'] = dmp['mentor_username'] # get from db
issue_update['contributor_username'] = dmp['contributor_username'] # get from db

app.logger.info('Description from remote: %s', issue_update)
update_data = db.update_data(
issue_update, 'dmp_issues', 'id', dmp_id)

update_data = await DmpCronQueries.update_dmp_issue(async_session, issue_id=dmp_id,
update_data=issue_update)

print(f"dmp_issue update works - dmp_id {dmp_id}") if update_data else print(
f"dmp_issue update failed - dmp_id {dmp_id}")
app.logger.info(update_data)
else:
print('issue response ', issue_response)
app.logger.error("Description API failed: " +
str(issue_response.status_code) + " for dmp_id: "+str(dmp_id))
str(issue_response.status_code) + " for dmp_id: " + str(dmp_id))

# 2. Read & Update comments of the ticket
page = 1
@@ -148,29 +163,43 @@ async def dmp_updates():
week_learning_status = False
# Loop through comments
comments_array = comments_response.json()
if comments_array == [] or len(comments_array)==0:
if comments_array == [] or len(comments_array) == 0:
break
for val in comments_response.json():
# Handle if any of the comments are week data
# Handle if any of the comments are week data
plain_text_body = markdown2.markdown(val['body'])
if "Weekly Goals" in plain_text_body and not week_update_status:
week_update_status = handle_week_data(val, dmp['issue_url'], dmp_id, issue_update['mentor_username'])

week_update_status = await handle_week_data(val, dmp['issue_url'], dmp_id,
issue_update['mentor_username'],
async_session)

if "Weekly Learnings" in plain_text_body and not week_learning_status:
week_learning_status = handle_week_data(val, dmp['issue_url'], dmp_id, issue_update['mentor_username'])

week_learning_status = await handle_week_data(val, dmp['issue_url'], dmp_id,
issue_update['mentor_username'],
async_session)

# Parse comments
comment_update = define_issue_update(
val, dmp_id=dmp_id)
app.logger.info(
'Comment from remote: ', comment_update)
upsert_comments = db.upsert_data(
comment_update, 'dmp_issue_updates')
comment_update = define_issue_update(val, dmp_id=dmp_id)
app.logger.info('Comment from remote: %s', comment_update)

# get created_at
created_timestamp = await DmpCronQueries.get_timestamp(async_session, DmpIssueUpdates,
'created_at', 'comment_id',
comment_update['comment_id'])
comment_update[
'created_at'] = datetime.utcnow() if not created_timestamp else created_timestamp
comment_update['comment_updated_at'] = datetime.utcnow().replace(tzinfo=None)
comment_update['created_at'] = comment_update['created_at'].replace(tzinfo=None)

upsert_comments = await DmpCronQueries.upsert_data_orm(async_session, comment_update)

print(f"dmp_issue_updates works dmp_id - {dmp_id}") if upsert_comments else print(
f"comment failed dmp_id - {dmp_id}")
app.logger.info(upsert_comments)
else:
print('issue response ', issue_response)
app.logger.error("Comments API failed: " +
str(issue_response.status_code) + " for dmp_id: "+str(dmp_id))
str(issue_response.status_code) + " for dmp_id: " + str(dmp_id))
break
page = page + 1

@@ -188,25 +217,37 @@ async def dmp_updates():
pr_created_at = pr_val['created_at']
if (pr_created_at >= TARGET_DATE):
pr_data = define_pr_update(pr_val, dmp_id)
upsert_pr = db.upsert_data(
pr_data, 'dmp_pr_updates')

created_timestamp = await DmpCronQueries.get_timestamp(async_session, DmpPrUpdates,
'created_at', 'pr_id',
pr_data['pr_id'])
pr_data['created_at'] = datetime.utcnow() if not created_timestamp else created_timestamp
pr_data['created_at'] = pr_data['created_at'].replace(tzinfo=None)

upsert_pr = await DmpCronQueries.upsert_pr_update(async_session, pr_data)

print(f"dmp_pr_updates works - dmp_id is {dmp_id}") if upsert_pr else print(
f"dmp_pr_updates failed - dmp_id is {dmp_id}")
app.logger.info(upsert_pr)
else:
print('issue response ', issue_response)
app.logger.error("PR API failed: " +
str(issue_response.status_code) + " for dmp_id: "+str(dmp_id))
str(issue_response.status_code) + " for dmp_id: " + str(dmp_id))
print(f"last run at - {datetime.utcnow()}")
return "success"
except Exception as e:
print(e)
print(f"last run with error - {datetime.utcnow()}")
return "Server Error"


@app.before_serving
async def start_scheduler():
app.logger.info(
"Scheduling dmp_updates_job to run every "+delay_mins+" mins")
"Scheduling dmp_updates_job to run every " + delay_mins + " mins")
scheduler.add_job(dmp_updates, 'interval', minutes=int(delay_mins))
scheduler.start()


if __name__ == '__main__':
app.run(host='0.0.0.0')
app.run(host='0.0.0.0')
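The comment and PR branches above repeat one pattern: read the row's existing created_at, fall back to datetime.utcnow() for new rows, strip tzinfo (the Postgres columns are evidently timezone-naive), then upsert. A condensed sketch of that pattern, with the DmpCronQueries signatures inferred from the call sites above rather than from its source:

# Condensed sketch of the timestamp-preserving upsert used for comments and PRs.
# DmpCronQueries comes from the shared_migrations submodule; its signatures
# here are inferred from this diff.
from datetime import datetime
from shared_migrations.db.dmp_cron import DmpCronQueries

async def upsert_preserving_created_at(async_session, model, key_column, record):
    existing = await DmpCronQueries.get_timestamp(
        async_session, model, 'created_at', key_column, record[key_column])
    # Keep the original created_at for existing rows; stamp new rows with now.
    record['created_at'] = (existing or datetime.utcnow()).replace(tzinfo=None)
    return await DmpCronQueries.upsert_data_orm(async_session, record)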
77 changes: 0 additions & 77 deletions db.py
@@ -1,77 +0,0 @@
import os, sys
from typing import Any
from supabase import create_client, Client
from supabase.lib.client_options import ClientOptions
from abc import ABC, abstractmethod
from dotenv import load_dotenv

load_dotenv()

client_options = ClientOptions(postgrest_client_timeout=None)

url: str = os.getenv("SUPABASE_URL")
key: str = os.getenv("SUPABASE_KEY")

class SupabaseInterface():

_instance = None

def __init__(self):
# Initialize Supabase client upon first instantiation
if not SupabaseInterface._instance:
self.supabase_url =url
self.supabase_key =key
self.client: Client = create_client(self.supabase_url, self.supabase_key, options=client_options)
SupabaseInterface._instance = self
else:
SupabaseInterface._instance = self._instance



@staticmethod
def get_instance():
# Static method to retrieve the singleton instance
if not SupabaseInterface._instance:
# If no instance exists, create a new one
SupabaseInterface._instance = SupabaseInterface()
return SupabaseInterface._instance


def readAll(self, table):
data = self.client.table(f"{table}").select("*").execute()
return data.data

def add_data(self, data,table_name):
data = self.client.table(table_name).insert(data).execute()
return data.data

def update_data(self,data,table_name, match_column, match_value):
response = self.client.table(table_name).update(data).eq(match_column, match_value).execute()
return response.data

def multiple_update_data(self,data,table_name, match_column, match_value):
response = self.client.table(table_name).update(data).eq(match_column[0], match_value[0]).eq(match_column[1], match_value[1]).execute()
return response.data

def upsert_data(self,data,table_name):
response = self.client.table(table_name).upsert(data).execute()
return response.data

def add_data_filter(self, data, table_name):
# Construct the filter based on the provided column names and values
filter_data = {column: data[column] for column in ['dmp_id','issue_number','owner']}

# Check if the data already exists in the table based on the filter
existing_data = self.client.table(table_name).select("*").eq('dmp_id',data['dmp_id']).execute()

# If the data already exists, return without creating a new record
if existing_data.data:
return "Data already exists"

# If the data doesn't exist, insert it into the table
new_data = self.client.table(table_name).insert(data).execute()
return new_data.data

def get_dmp_issues(self):
response = self.client.table('dmp_issues').select('*, dmp_orgs(*)').execute()
return response.data
Empty file added models.py
Empty file.