Compare commits

...

35 Commits

Author SHA1 Message Date
NarayanBavisetti
7f7a85d235 Merge branch 'develop' of github.com:makeplane/plane into chore/page_structuring 2023-11-16 18:28:23 +05:30
NarayanBavisetti
f113375a15 fix: lock and archive fixes 2023-11-16 17:33:06 +05:30
Manish Gupta
dd60dec887
Dev/mg selfhosting fix (#2782)
* fixes to self hosting

* self hosting fixes

* removed .temp

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-16 14:38:55 +05:30
Bavisetti Narayan
0c1097592e
fix: pages revamping (#2760)
* fix: page transaction model

* fix: page transaction model

* fix: migration and optimisation

* fix: back migration of page blocks

* fix: added issue embed

* fix: migration fixes

* fix: resolved changes
2023-11-16 14:38:12 +05:30
NarayanBavisetti
71de47496a fix: resolved changes 2023-11-16 14:19:06 +05:30
Anmol Singh Bhatia
bed66235f2
style: workspace sidebar dropdown improvement (#2783) 2023-11-16 14:11:33 +05:30
NarayanBavisetti
cd8f1eb952 fix: migration fixes 2023-11-16 12:45:39 +05:30
NarayanBavisetti
2028f3ede2 Merge branch 'develop' of github.com:makeplane/plane into fix/page_structuring 2023-11-16 12:39:49 +05:30
NarayanBavisetti
572be0fe60 fix: added issue embed 2023-11-16 12:36:30 +05:30
NarayanBavisetti
176e184220 fix: back migration of page blocks 2023-11-15 17:18:21 +05:30
Nikhil
26b1e9d5f1
dev: squashed migrations (#2779)
* dev: migration squash

* dev: migrations squashed for apis and webhooks

* dev: packages updated and  move dj-database-url for local settings

* dev: update package changes
2023-11-15 17:15:02 +05:30
Bavisetti Narayan
79347ec62b
feat: api webhooks (#2543)
* dev: initiate external apis

* dev: external api

* dev: external public api implementation

* dev: add prefix to all api tokens

* dev: flag to enable disable api token api access

* dev: webhook model create and apis

* dev: webhook settings

* fix: webhook logs

* chore: removed drf spectacular

* dev: remove retry_count and fix api logging for get requests

* dev: refactor webhook logic

* fix: celery retry mechanism

* chore: event and action change

* chore: migrations changes

* dev: proxy setup for apis

* chore: changed retry time and cleanup

* chore: added issue comment and inbox issue api endpoints

* fix: migration files

* fix: added env variables

* fix: removed issue attachment from proxy

* fix: added new migration file

* fix: restricted webhook access

* chore: changed urls

* chore: fixed project serializer

* fix: set expire for api token

* fix: retrieve endpoint for api token

* feat: Api Token screens & api integration

* dev: webhook endpoint changes

* dev: add fields for webhook updates

* feat: Download Api secret key

* chore: removed BASE API URL

* feat: revoke token access

* dev: migration fixes

* feat: workspace webhooks (#2748)

* feat: workspace webhook store, services integration and rendered webhook list and create

* chore: handled webhook update and regenerate token in workspace webhooks

* feat: regenerate key and delete functionality

---------

Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>

* fix: url validation added

* fix: separated env for webhook and api

* Web hooks refactoring

* add show option for generated hook key

* Api token restructure

* webhook minor fixes

* fix build errors

* chore: improvements in file structuring

* dev: rate limiting the open apis

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: LAKHAN BAHETI <lakhanbaheti9@gmail.com>
Co-authored-by: rahulramesha <71900764+rahulramesha@users.noreply.github.com>
Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>
Co-authored-by: Nikhil <118773738+pablohashescobar@users.noreply.github.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-15 15:56:57 +05:30
Nikhil
7b965179d8
dev: update bucket script to make the bucket public (#2767)
* dev: update bucket script to make the bucket public

* dev: remove auto bucket script from docker compose
2023-11-15 15:56:08 +05:30
Nikhil
fc51ffc589
chore: user workflow (#2762)
* dev: workspace member deactivation and leave endpoints and filters

* dev: deactivated for project members

* dev: project members leave

* dev: project member check on workspace deactivation

* dev: project member queryset update and remove leave project endpoint

* dev: rename is_deactivated to is_active and user deactivation apis

* dev: check if the user is already part of workspace then make them active

* dev: workspace and project save

* dev: update project members to make them active

* dev: project invitation

* dev: automatic user workspace and project member create when user sign in/up

* dev: fix member invites

* dev: rename deactivation variable

* dev: update project member invitation

* dev: additional permission layer for workspace

* dev: update the url for  workspace invitations

* dev: remove invitation urls from users

* dev: cleanup workspace invitation workflow

* dev: workspace and project invitation
2023-11-15 15:53:16 +05:30
sabith-tu
96f6e37cc5
fix: Delete estimate popup is not closing automatically (#2777) 2023-11-15 14:08:52 +05:30
Nikhil
29774ce84a
dev: API settings (#2594)
* dev: update settings file structure and added extra settings for CORS

* dev: remove WEB_URL variable and add celery integration for sentry

* dev: aws and minio settings

* dev: add cors origins to env

* dev: update settings
2023-11-15 12:31:52 +05:30
Nikhil
8cbe9c26fc
enhancement: label sort order (#2763)
* chore: label sort ordering

* dev: ordering

* fix: sort order

* fix: save of labels

* dev: remove ordering by name

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-15 12:25:44 +05:30
Prateek Shourya
7f42566207
Fix: Custom menu item not automatically closing, affecting delete popup behavior. (#2771) 2023-11-14 23:05:30 +05:30
Ankush Deshmukh
b60237b676
Standardizing priority icons across the platform (#2776) 2023-11-14 20:52:43 +05:30
Prateek Shourya
1fe09d369f
style: text overflow fix and border color update (#2769)
* style: fix text overflow in:
* Issue activity
* Cycle and Module Select in Create Issue form
* Delete Module modal
* Join Project modal

* style: update assignee select border as per design.
2023-11-14 18:34:51 +05:30
Dakshesh Jain
b7757c6b1a
fix: bugs (#2761)
* fix: semicolon on estimate settings page

* refactor: project settings automations store implementation

* fix: active cycle stuck on infinite loading

* fix: removed delete project option from sidebar

* fix: discloser not opening when navigating to project

* fix: clear filter not working & filter appearing even if nothing is selected

* refactor: select label store implementation

* refactor: select state store implementation
2023-11-14 18:33:01 +05:30
Anmol Singh Bhatia
1a25bacce1
style: create update view modal consistency (#2775) 2023-11-14 18:30:10 +05:30
Anmol Singh Bhatia
6797df239d
chore: no lead option added in lead select dropdown (#2774) 2023-11-14 18:29:39 +05:30
Anmol Singh Bhatia
43e7c10eb7
chore: spreadsheet layout column responsiveness (#2768) 2023-11-14 18:28:49 +05:30
Anmol Singh Bhatia
bdc9c9c2a8
chore: create update issue modal improvement (#2765) 2023-11-14 18:28:15 +05:30
Anmol Singh Bhatia
f0c72bf249
fix: breadcrumb project icon improvement (#2764) 2023-11-14 18:27:47 +05:30
sabith-tu
a8904bfc48
style: ui fixes for pages and views (#2770) 2023-11-14 18:26:50 +05:30
NarayanBavisetti
73c2416055 fix: migration and optimisation 2023-11-13 19:12:55 +05:30
NarayanBavisetti
598c22f91d Merge branch 'develop' of github.com:makeplane/plane into fix/page_structuring 2023-11-13 17:23:00 +05:30
Nikhil
b31041726b
dev: create bucket through application (#2720) 2023-11-13 15:57:19 +05:30
Prateek Shourya
e6f947ad90
style: ui improvements and bug fixes (#2758)
* style: add transition to favorite projects dropdown.

* style: update project integration settings borders.

* style: fix text overflow issue in project views.

* fix: issue with non-functional cancel button in leave project modal.
2023-11-13 14:42:45 +05:30
Dakshesh Jain
7963993171
fix: workspace settings bugs (#2743)
* fix: double layout in exports

* fix: typo in jira email address section

* fix: workspace members not mutating

* fix: removed un-used variable

* fix: workspace members can't be filtered using email

* fix: autocomplete in workspace delete

* fix: autocomplete in project delete modal

* fix: update member function in store

* fix: sidebar link not active when in github/jira

* style: margin top & icon inconsistency

* fix: typo in create workspace

* fix: workspace leave flow

* fix: redirection to delete issue

* fix: autocomplete off in jira api token

* refactor: reduced api call, added optional chaining & removed variable with low scope
2023-11-13 13:34:05 +05:30
NarayanBavisetti
03026449ea fix: page transaction model 2023-10-11 15:13:05 +05:30
NarayanBavisetti
1039337c45 Merge branch 'develop' of github.com:makeplane/plane into fix/page_structuring 2023-10-09 11:16:33 +05:30
NarayanBavisetti
5216f184c1 fix: page transaction model 2023-10-06 12:00:18 +05:30
185 changed files with 7397 additions and 2386 deletions

View File

@ -33,3 +33,8 @@ USE_MINIO=1
 # Nginx Configuration
 NGINX_PORT=80
+# Set it to 0, to disable it
+ENABLE_WEBHOOK=1
+# Set it to 0, to disable it
+ENABLE_API=1

View File

@ -1,7 +1,7 @@
 # Backend
 # Debug value for api server use it as 0 for production use
 DEBUG=0
-DJANGO_SETTINGS_MODULE="plane.settings.production"
+CORS_ALLOWED_ORIGINS="http://localhost"
 # Error logs
 SENTRY_DSN=""
@ -70,6 +70,12 @@ ENABLE_MAGIC_LINK_LOGIN="0"
 # Email redirections and minio domain settings
 WEB_URL="http://localhost"
+# Set it to 0, to disable it
+ENABLE_WEBHOOK=1
+# Set it to 0, to disable it
+ENABLE_API=1
 # Gunicorn Workers
 GUNICORN_WORKERS=2

View File

@ -0,0 +1,83 @@
import os, sys
import boto3
import json
from botocore.exceptions import ClientError

sys.path.append("/code")
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "plane.settings.production")

import django

django.setup()


def set_bucket_public_policy(s3_client, bucket_name):
    public_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject"],
            "Resource": [f"arn:aws:s3:::{bucket_name}/*"]
        }]
    }
    try:
        s3_client.put_bucket_policy(
            Bucket=bucket_name,
            Policy=json.dumps(public_policy)
        )
        print(f"Public read access policy set for bucket '{bucket_name}'.")
    except ClientError as e:
        print(f"Error setting public read access policy: {e}")


def create_bucket():
    try:
        from django.conf import settings

        # Create a session using the credentials from Django settings
        session = boto3.session.Session(
            aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
            aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
        )
        # Create an S3 client using the session
        s3_client = session.client('s3', endpoint_url=settings.AWS_S3_ENDPOINT_URL)
        bucket_name = settings.AWS_STORAGE_BUCKET_NAME

        print("Checking bucket...")

        # Check if the bucket exists
        s3_client.head_bucket(Bucket=bucket_name)

        # If head_bucket does not raise an exception, the bucket exists
        print(f"Bucket '{bucket_name}' already exists.")
        set_bucket_public_policy(s3_client, bucket_name)
    except ClientError as e:
        error_code = int(e.response['Error']['Code'])
        bucket_name = settings.AWS_STORAGE_BUCKET_NAME
        if error_code == 404:
            # Bucket does not exist, create it
            print(f"Bucket '{bucket_name}' does not exist. Creating bucket...")
            try:
                s3_client.create_bucket(Bucket=bucket_name)
                print(f"Bucket '{bucket_name}' created successfully.")
                set_bucket_public_policy(s3_client, bucket_name)
            except ClientError as create_error:
                print(f"Failed to create bucket: {create_error}")
        elif error_code == 403:
            # Access to the bucket is forbidden
            print(f"Access to the bucket '{bucket_name}' is forbidden. Check permissions.")
        else:
            # Another ClientError occurred
            print(f"Failed to check bucket: {e}")
    except Exception as ex:
        # Handle any other exception
        print(f"An error occurred: {ex}")


if __name__ == "__main__":
    create_bucket()
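To double-check the result after the entrypoint runs this script, a short verification pass can read the policy back. This is a sketch only; it assumes the same Django settings module and container paths used by bin/bucket_script.py above.

import os, sys
import json
import boto3

sys.path.append("/code")
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "plane.settings.production")
import django
django.setup()
from django.conf import settings

# Sketch: print the bucket policy that set_bucket_public_policy() should have applied
session = boto3.session.Session(
    aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
)
s3_client = session.client("s3", endpoint_url=settings.AWS_S3_ENDPOINT_URL)
policy = s3_client.get_bucket_policy(Bucket=settings.AWS_STORAGE_BUCKET_NAME)
print(json.dumps(json.loads(policy["Policy"]), indent=2))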

View File

@ -5,5 +5,7 @@ python manage.py migrate
 # Create a Default User
 python bin/user_script.py
+# Create the default bucket
+python bin/bucket_script.py
 exec gunicorn -w $GUNICORN_WORKERS -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:8000 --max-requests 1200 --max-requests-jitter 1000 --access-logfile -

View File

@ -1,2 +1,17 @@
-from .workspace import WorkSpaceBasePermission, WorkSpaceAdminPermission, WorkspaceEntityPermission, WorkspaceViewerPermission
-from .project import ProjectBasePermission, ProjectEntityPermission, ProjectMemberPermission, ProjectLitePermission
+from .workspace import (
+    WorkSpaceBasePermission,
+    WorkspaceOwnerPermission,
+    WorkSpaceAdminPermission,
+    WorkspaceEntityPermission,
+    WorkspaceViewerPermission,
+    WorkspaceUserPermission,
+)
+from .project import (
+    ProjectBasePermission,
+    ProjectEntityPermission,
+    ProjectMemberPermission,
+    ProjectLitePermission,
+)

View File

@ -13,14 +13,15 @@ Guest = 5
 class ProjectBasePermission(BasePermission):
     def has_permission(self, request, view):
         if request.user.is_anonymous:
             return False
         ## Safe Methods -> Handle the filtering logic in queryset
         if request.method in SAFE_METHODS:
             return WorkspaceMember.objects.filter(
-                workspace__slug=view.workspace_slug, member=request.user
+                workspace__slug=view.workspace_slug,
+                member=request.user,
+                is_active=True,
             ).exists()
         ## Only workspace owners or admins can create the projects
@ -29,6 +30,7 @@ class ProjectBasePermission(BasePermission):
                 workspace__slug=view.workspace_slug,
                 member=request.user,
                 role__in=[Admin, Member],
+                is_active=True,
             ).exists()
         ## Only Project Admins can update project attributes
@ -37,19 +39,21 @@
             member=request.user,
             role=Admin,
             project_id=view.project_id,
+            is_active=True,
         ).exists()
 class ProjectMemberPermission(BasePermission):
     def has_permission(self, request, view):
         if request.user.is_anonymous:
             return False
         ## Safe Methods -> Handle the filtering logic in queryset
         if request.method in SAFE_METHODS:
             return ProjectMember.objects.filter(
-                workspace__slug=view.workspace_slug, member=request.user
+                workspace__slug=view.workspace_slug,
+                member=request.user,
+                is_active=True,
             ).exists()
         ## Only workspace owners or admins can create the projects
         if request.method == "POST":
@ -57,6 +61,7 @@ class ProjectMemberPermission(BasePermission):
                 workspace__slug=view.workspace_slug,
                 member=request.user,
                 role__in=[Admin, Member],
+                is_active=True,
             ).exists()
         ## Only Project Admins can update project attributes
@ -65,12 +70,12 @@ class ProjectMemberPermission(BasePermission):
             member=request.user,
             role__in=[Admin, Member],
             project_id=view.project_id,
+            is_active=True,
         ).exists()
 class ProjectEntityPermission(BasePermission):
     def has_permission(self, request, view):
         if request.user.is_anonymous:
             return False
@ -80,6 +85,7 @@ class ProjectEntityPermission(BasePermission):
                 workspace__slug=view.workspace_slug,
                 member=request.user,
                 project_id=view.project_id,
+                is_active=True,
             ).exists()
         ## Only project members or admins can create and edit the project attributes
@ -88,11 +94,11 @@ class ProjectEntityPermission(BasePermission):
             member=request.user,
             role__in=[Admin, Member],
             project_id=view.project_id,
+            is_active=True,
         ).exists()
 class ProjectLitePermission(BasePermission):
     def has_permission(self, request, view):
         if request.user.is_anonymous:
             return False
@ -101,4 +107,5 @@ class ProjectLitePermission(BasePermission):
             workspace__slug=view.workspace_slug,
             member=request.user,
             project_id=view.project_id,
+            is_active=True,
         ).exists()

View File

@ -32,15 +32,31 @@ class WorkSpaceBasePermission(BasePermission):
                 member=request.user,
                 workspace__slug=view.workspace_slug,
                 role__in=[Owner, Admin],
+                is_active=True,
             ).exists()
         # allow only owner to delete the workspace
         if request.method == "DELETE":
             return WorkspaceMember.objects.filter(
-                member=request.user, workspace__slug=view.workspace_slug, role=Owner
+                member=request.user,
+                workspace__slug=view.workspace_slug,
+                role=Owner,
+                is_active=True,
             ).exists()
+class WorkspaceOwnerPermission(BasePermission):
+    def has_permission(self, request, view):
+        if request.user.is_anonymous:
+            return False
+        return WorkspaceMember.objects.filter(
+            workspace__slug=view.workspace_slug,
+            member=request.user,
+            role=Owner,
+        ).exists()
 class WorkSpaceAdminPermission(BasePermission):
     def has_permission(self, request, view):
         if request.user.is_anonymous:
@ -50,6 +66,7 @@ class WorkSpaceAdminPermission(BasePermission):
             member=request.user,
             workspace__slug=view.workspace_slug,
             role__in=[Owner, Admin],
+            is_active=True,
         ).exists()
@ -63,12 +80,14 @@ class WorkspaceEntityPermission(BasePermission):
             return WorkspaceMember.objects.filter(
                 workspace__slug=view.workspace_slug,
                 member=request.user,
+                is_active=True,
             ).exists()
         return WorkspaceMember.objects.filter(
             member=request.user,
             workspace__slug=view.workspace_slug,
             role__in=[Owner, Admin],
+            is_active=True,
         ).exists()
@ -78,5 +97,20 @@ class WorkspaceViewerPermission(BasePermission):
             return False
         return WorkspaceMember.objects.filter(
-            member=request.user, workspace__slug=view.workspace_slug, role__gte=10
+            member=request.user,
+            workspace__slug=view.workspace_slug,
+            role__gte=10,
+            is_active=True,
         ).exists()
+class WorkspaceUserPermission(BasePermission):
+    def has_permission(self, request, view):
+        if request.user.is_anonymous:
+            return False
+        return WorkspaceMember.objects.filter(
+            member=request.user,
+            workspace__slug=view.workspace_slug,
+            is_active=True,
+        ).exists()
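The new WorkspaceOwnerPermission and WorkspaceUserPermission classes follow the standard DRF permission pattern, so wiring one into a view is a one-liner. A hypothetical sketch for illustration only (the view name is made up; the permission reads view.workspace_slug, so the URL must supply a slug kwarg):

from rest_framework.views import APIView
from plane.api.permissions import WorkspaceUserPermission

class ExampleWorkspaceEndpoint(APIView):  # hypothetical view, not part of this changeset
    permission_classes = [
        WorkspaceUserPermission,  # any active member of the workspace may call this
    ]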

View File

@ -71,7 +71,7 @@ from .module import (
     ModuleFavoriteSerializer,
 )
-from .api_token import APITokenSerializer
+from .api import APITokenSerializer, APITokenReadSerializer
 from .integration import (
     IntegrationSerializer,
@ -85,7 +85,7 @@ from .integration import (
 from .importer import ImporterSerializer
-from .page import PageSerializer, PageBlockSerializer, PageFavoriteSerializer
+from .page import PageSerializer, PageLogSerializer, SubPageSerializer, PageFavoriteSerializer
 from .estimate import (
     EstimateSerializer,
@ -100,3 +100,5 @@ from .analytic import AnalyticViewSerializer
 from .notification import NotificationSerializer
 from .exporter import ExporterHistorySerializer
+from .webhook import WebhookSerializer, WebhookLogSerializer

View File

@ -0,0 +1,31 @@
from .base import BaseSerializer
from plane.db.models import APIToken, APIActivityLog
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = "__all__"
read_only_fields = [
"token",
"expired_at",
"created_at",
"updated_at",
"workspace",
"user",
]
class APITokenReadSerializer(BaseSerializer):
class Meta:
model = APIToken
exclude = ('token',)
class APIActivityLogSerializer(BaseSerializer):
class Meta:
model = APIActivityLog
fields = "__all__"

View File

@ -1,14 +0,0 @@
from .base import BaseSerializer
from plane.db.models import APIToken
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = [
"label",
"user",
"user_type",
"workspace",
"created_at",
]

View File

@ -6,28 +6,7 @@ from .base import BaseSerializer
 from .issue import IssueFlatSerializer, LabelLiteSerializer
 from .workspace import WorkspaceLiteSerializer
 from .project import ProjectLiteSerializer
-from plane.db.models import Page, PageBlock, PageFavorite, PageLabel, Label
+from plane.db.models import Page, PageLog, PageFavorite, PageLabel, Label, Issue, Module
-class PageBlockSerializer(BaseSerializer):
-    issue_detail = IssueFlatSerializer(source="issue", read_only=True)
-    project_detail = ProjectLiteSerializer(source="project", read_only=True)
-    workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
-    class Meta:
-        model = PageBlock
-        fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "project",
-            "page",
-        ]
-class PageBlockLiteSerializer(BaseSerializer):
-    class Meta:
-        model = PageBlock
-        fields = "__all__"
 class PageSerializer(BaseSerializer):
@ -38,7 +17,6 @@ class PageSerializer(BaseSerializer):
         write_only=True,
         required=False,
     )
-    blocks = PageBlockLiteSerializer(read_only=True, many=True)
     project_detail = ProjectLiteSerializer(source="project", read_only=True)
     workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
@ -102,6 +80,41 @@ class PageSerializer(BaseSerializer):
         return super().update(instance, validated_data)
+class SubPageSerializer(BaseSerializer):
+    entity_details = serializers.SerializerMethodField()
+    class Meta:
+        model = PageLog
+        fields = "__all__"
+        read_only_fields = [
+            "workspace",
+            "project",
+            "page",
+        ]
+    def get_entity_details(self, obj):
+        entity_name = obj.entity_name
+        if entity_name == 'forward_link' or entity_name == 'back_link':
+            try:
+                page = Page.objects.get(pk=obj.entity_identifier)
+                return PageSerializer(page).data
+            except Page.DoesNotExist:
+                return None
+        return None
+class PageLogSerializer(BaseSerializer):
+    class Meta:
+        model = PageLog
+        fields = "__all__"
+        read_only_fields = [
+            "workspace",
+            "project",
+            "page",
+        ]
 class PageFavoriteSerializer(BaseSerializer):
     page_detail = PageSerializer(source="page", read_only=True)

View File

@ -103,13 +103,16 @@ class ProjectListSerializer(DynamicBaseSerializer):
     members = serializers.SerializerMethodField()
     def get_members(self, obj):
-        project_members = ProjectMember.objects.filter(project_id=obj.id).values(
+        project_members = ProjectMember.objects.filter(
+            project_id=obj.id,
+            is_active=True,
+        ).values(
             "id",
             "member_id",
             "member__display_name",
             "member__avatar",
         )
-        return project_members
+        return list(project_members)
     class Meta:
         model = Project

View File

@ -0,0 +1,30 @@
# Third party imports
from rest_framework import serializers
# Module imports
from .base import DynamicBaseSerializer
from plane.db.models import Webhook, WebhookLog
from plane.db.models.webhook import validate_domain, validate_schema
class WebhookSerializer(DynamicBaseSerializer):
url = serializers.URLField(validators=[validate_schema, validate_domain])
class Meta:
model = Webhook
fields = "__all__"
read_only_fields = [
"workspace",
"secret_key",
]
class WebhookLogSerializer(DynamicBaseSerializer):
class Meta:
model = WebhookLog
fields = "__all__"
read_only_fields = [
"workspace",
"webhook"
]

View File

@ -19,6 +19,12 @@ from .state import urlpatterns as state_urls
 from .user import urlpatterns as user_urls
 from .views import urlpatterns as view_urls
 from .workspace import urlpatterns as workspace_urls
+from .api import urlpatterns as api_urls
+from .webhook import urlpatterns as webhook_urls
+# Django imports
+from django.conf import settings
 urlpatterns = [
@ -44,3 +50,9 @@ urlpatterns = [
     *view_urls,
     *workspace_urls,
 ]
+if settings.ENABLE_WEBHOOK:
+    urlpatterns += webhook_urls
+if settings.ENABLE_API:
+    urlpatterns += api_urls
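For context, the two toggles read here map to the ENABLE_WEBHOOK and ENABLE_API variables added to the .env samples earlier in this diff. A minimal sketch of how they would typically be exposed from the settings module; the exact settings file is not part of this diff, so treat the snippet as an assumption:

import os

# Sketch: read the toggles as integers so "0" disables the routes (assumed to live in a plane.settings module)
ENABLE_WEBHOOK = int(os.environ.get("ENABLE_WEBHOOK", 1))
ENABLE_API = int(os.environ.get("ENABLE_API", 1))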

View File

@ -0,0 +1,17 @@
from django.urls import path
from plane.api.views import ApiTokenEndpoint
urlpatterns = [
# API Tokens
path(
"workspaces/<str:slug>/api-tokens/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
path(
"workspaces/<str:slug>/api-tokens/<uuid:pk>/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
## End API Tokens
]

View File

@ -3,9 +3,9 @@ from django.urls import path
from plane.api.views import ( from plane.api.views import (
PageViewSet, PageViewSet,
PageBlockViewSet,
PageFavoriteViewSet, PageFavoriteViewSet,
CreateIssueFromPageBlockEndpoint, PageLogEndpoint,
SubPagesEndpoint,
) )
@ -31,27 +31,6 @@ urlpatterns = [
), ),
name="project-pages", name="project-pages",
), ),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/",
PageBlockViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-page-blocks",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:pk>/",
PageBlockViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-page-blocks",
),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/", "workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/",
PageFavoriteViewSet.as_view( PageFavoriteViewSet.as_view(
@ -72,8 +51,83 @@ urlpatterns = [
name="user-favorite-pages", name="user-favorite-pages",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:page_block_id>/issues/", "workspaces/<str:slug>/projects/<uuid:project_id>/pages/",
CreateIssueFromPageBlockEndpoint.as_view(), PageViewSet.as_view(
name="page-block-issues", {
"get": "list",
"post": "create",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:pk>/",
PageViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/archive/",
PageViewSet.as_view(
{
"post": "archive",
}
),
name="project-page-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unarchive/",
PageViewSet.as_view(
{
"post": "unarchive",
}
),
name="project-page-unarchive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-pages/",
PageViewSet.as_view(
{
"get": "archive_list",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/lock/",
PageViewSet.as_view(
{
"post": "lock",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unlock/",
PageViewSet.as_view(
{
"post": "unlock",
}
),
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/",
PageLogEndpoint.as_view(),
name="page-transactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/<uuid:transaction>/",
PageLogEndpoint.as_view(),
name="page-transactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/sub-pages/",
SubPagesEndpoint.as_view(),
name="sub-page",
), ),
] ]

View File

@ -2,17 +2,16 @@ from django.urls import path
from plane.api.views import ( from plane.api.views import (
ProjectViewSet, ProjectViewSet,
InviteProjectEndpoint, ProjectInvitationsViewset,
ProjectMemberViewSet, ProjectMemberViewSet,
ProjectMemberInvitationsViewset,
ProjectMemberUserEndpoint, ProjectMemberUserEndpoint,
ProjectJoinEndpoint, ProjectJoinEndpoint,
AddTeamToProjectEndpoint, AddTeamToProjectEndpoint,
ProjectUserViewsEndpoint, ProjectUserViewsEndpoint,
ProjectIdentifierEndpoint, ProjectIdentifierEndpoint,
ProjectFavoritesViewSet, ProjectFavoritesViewSet,
LeaveProjectEndpoint,
ProjectPublicCoverImagesEndpoint, ProjectPublicCoverImagesEndpoint,
UserProjectInvitationsViewset,
) )
@ -45,13 +44,48 @@ urlpatterns = [
name="project-identifiers", name="project-identifiers",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invite/", "workspaces/<str:slug>/projects/<uuid:project_id>/invitations/",
InviteProjectEndpoint.as_view(), ProjectInvitationsViewset.as_view(
name="invite-project", {
"get": "list",
"post": "create",
},
),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/<uuid:pk>/",
ProjectInvitationsViewset.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-member-invite",
),
path(
"users/me/invitations/projects/",
UserProjectInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="user-project-invitations",
),
path(
"workspaces/<str:slug>/projects/join/",
ProjectJoinEndpoint.as_view(),
name="project-join",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/", "workspaces/<str:slug>/projects/<uuid:project_id>/members/",
ProjectMemberViewSet.as_view({"get": "list", "post": "create"}), ProjectMemberViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-member", name="project-member",
), ),
path( path(
@ -66,30 +100,19 @@ urlpatterns = [
name="project-member", name="project-member",
), ),
path( path(
"workspaces/<str:slug>/projects/join/", "workspaces/<str:slug>/projects/<uuid:project_id>/members/leave/",
ProjectJoinEndpoint.as_view(), ProjectMemberViewSet.as_view(
name="project-join", {
"post": "leave",
}
),
name="project-member",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/team-invite/", "workspaces/<str:slug>/projects/<uuid:project_id>/team-invite/",
AddTeamToProjectEndpoint.as_view(), AddTeamToProjectEndpoint.as_view(),
name="projects", name="projects",
), ),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/",
ProjectMemberInvitationsViewset.as_view({"get": "list"}),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/<uuid:pk>/",
ProjectMemberInvitationsViewset.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-member-invite",
),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-views/", "workspaces/<str:slug>/projects/<uuid:project_id>/project-views/",
ProjectUserViewsEndpoint.as_view(), ProjectUserViewsEndpoint.as_view(),
@ -119,11 +142,6 @@ urlpatterns = [
), ),
name="project-favorite", name="project-favorite",
), ),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/leave/",
LeaveProjectEndpoint.as_view(),
name="leave-project",
),
path( path(
"project-covers/", "project-covers/",
ProjectPublicCoverImagesEndpoint.as_view(), ProjectPublicCoverImagesEndpoint.as_view(),

View File

@ -9,15 +9,10 @@ from plane.api.views import (
ChangePasswordEndpoint, ChangePasswordEndpoint,
## End User ## End User
## Workspaces ## Workspaces
UserWorkspaceInvitationsEndpoint,
UserWorkSpacesEndpoint, UserWorkSpacesEndpoint,
JoinWorkspaceEndpoint,
UserWorkspaceInvitationsEndpoint,
UserWorkspaceInvitationEndpoint,
UserActivityGraphEndpoint, UserActivityGraphEndpoint,
UserIssueCompletedGraphEndpoint, UserIssueCompletedGraphEndpoint,
UserWorkspaceDashboardEndpoint, UserWorkspaceDashboardEndpoint,
UserProjectInvitationsViewset,
## End Workspaces ## End Workspaces
) )
@ -26,7 +21,11 @@ urlpatterns = [
path( path(
"users/me/", "users/me/",
UserEndpoint.as_view( UserEndpoint.as_view(
{"get": "retrieve", "patch": "partial_update", "delete": "destroy"} {
"get": "retrieve",
"patch": "partial_update",
"delete": "deactivate",
}
), ),
name="users", name="users",
), ),
@ -65,23 +64,6 @@ urlpatterns = [
UserWorkSpacesEndpoint.as_view(), UserWorkSpacesEndpoint.as_view(),
name="user-workspace", name="user-workspace",
), ),
# user workspace invitations
path(
"users/me/invitations/workspaces/",
UserWorkspaceInvitationsEndpoint.as_view({"get": "list", "post": "create"}),
name="user-workspace-invitations",
),
# user workspace invitation
path(
"users/me/invitations/<uuid:pk>/",
UserWorkspaceInvitationEndpoint.as_view(
{
"get": "retrieve",
}
),
name="user-workspace-invitation",
),
# user join workspace
# User Graphs # User Graphs
path( path(
"users/me/workspaces/<str:slug>/activity-graph/", "users/me/workspaces/<str:slug>/activity-graph/",
@ -99,15 +81,4 @@ urlpatterns = [
name="user-workspace-dashboard", name="user-workspace-dashboard",
), ),
## End User Graph ## End User Graph
path(
"users/me/invitations/workspaces/<str:slug>/<uuid:pk>/join/",
JoinWorkspaceEndpoint.as_view(),
name="user-join-workspace",
),
# user project invitations
path(
"users/me/invitations/projects/",
UserProjectInvitationsViewset.as_view({"get": "list", "post": "create"}),
name="user-project-invitations",
),
] ]

View File

@ -0,0 +1,31 @@
from django.urls import path
from plane.api.views import (
WebhookEndpoint,
WebhookLogsEndpoint,
WebhookSecretRegenerateEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/webhooks/",
WebhookEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhooks/<uuid:pk>/",
WebhookEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhooks/<uuid:pk>/regenerate/",
WebhookSecretRegenerateEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhook-logs/<uuid:webhook_id>/",
WebhookLogsEndpoint.as_view(),
name="webhooks",
),
]

View File

@ -2,8 +2,9 @@ from django.urls import path
from plane.api.views import ( from plane.api.views import (
UserWorkspaceInvitationsViewSet,
WorkSpaceViewSet, WorkSpaceViewSet,
InviteWorkspaceEndpoint, WorkspaceJoinEndpoint,
WorkSpaceMemberViewSet, WorkSpaceMemberViewSet,
WorkspaceInvitationsViewset, WorkspaceInvitationsViewset,
WorkspaceMemberUserEndpoint, WorkspaceMemberUserEndpoint,
@ -17,7 +18,6 @@ from plane.api.views import (
WorkspaceUserProfileEndpoint, WorkspaceUserProfileEndpoint,
WorkspaceUserProfileIssuesEndpoint, WorkspaceUserProfileIssuesEndpoint,
WorkspaceLabelsEndpoint, WorkspaceLabelsEndpoint,
LeaveWorkspaceEndpoint,
) )
@ -49,14 +49,14 @@ urlpatterns = [
), ),
name="workspace", name="workspace",
), ),
path(
"workspaces/<str:slug>/invite/",
InviteWorkspaceEndpoint.as_view(),
name="invite-workspace",
),
path( path(
"workspaces/<str:slug>/invitations/", "workspaces/<str:slug>/invitations/",
WorkspaceInvitationsViewset.as_view({"get": "list"}), WorkspaceInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="workspace-invitations", name="workspace-invitations",
), ),
path( path(
@ -69,6 +69,23 @@ urlpatterns = [
), ),
name="workspace-invitations", name="workspace-invitations",
), ),
# user workspace invitations
path(
"users/me/workspaces/invitations/",
UserWorkspaceInvitationsViewSet.as_view(
{
"get": "list",
"post": "create",
},
),
name="user-workspace-invitations",
),
path(
"workspaces/<str:slug>/invitations/<uuid:pk>/join/",
WorkspaceJoinEndpoint.as_view(),
name="workspace-join",
),
# user join workspace
path( path(
"workspaces/<str:slug>/members/", "workspaces/<str:slug>/members/",
WorkSpaceMemberViewSet.as_view({"get": "list"}), WorkSpaceMemberViewSet.as_view({"get": "list"}),
@ -85,6 +102,15 @@ urlpatterns = [
), ),
name="workspace-member", name="workspace-member",
), ),
path(
"workspaces/<str:slug>/members/leave/",
WorkSpaceMemberViewSet.as_view(
{
"post": "leave",
},
),
name="leave-workspace-members",
),
path( path(
"workspaces/<str:slug>/teams/", "workspaces/<str:slug>/teams/",
TeamMemberViewSet.as_view( TeamMemberViewSet.as_view(
@ -168,9 +194,4 @@ urlpatterns = [
WorkspaceLabelsEndpoint.as_view(), WorkspaceLabelsEndpoint.as_view(),
name="workspace-labels", name="workspace-labels",
), ),
path(
"workspaces/<str:slug>/members/leave/",
LeaveWorkspaceEndpoint.as_view(),
name="leave-workspace-members",
),
] ]

View File

@ -124,9 +124,10 @@ from plane.api.views import (
## End Modules ## End Modules
# Pages # Pages
PageViewSet, PageViewSet,
PageBlockViewSet, PageLogEndpoint,
SubPagesEndpoint,
PageFavoriteViewSet, PageFavoriteViewSet,
CreateIssueFromPageBlockEndpoint, CreateIssueFromBlockEndpoint,
## End Pages ## End Pages
# Api Tokens # Api Tokens
ApiTokenEndpoint, ApiTokenEndpoint,
@ -1222,25 +1223,81 @@ urlpatterns = [
name="project-pages", name="project-pages",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/", "workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/archive/",
PageBlockViewSet.as_view( PageViewSet.as_view(
{
"post": "archive",
}
),
name="project-page-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unarchive/",
PageViewSet.as_view(
{
"post": "unarchive",
}
),
name="project-page-unarchive"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-pages/",
PageViewSet.as_view(
{
"get": "archive_list",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/lock/",
PageViewSet.as_view(
{
"post": "lock",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unlock/",
PageViewSet.as_view(
{
"post": "unlock",
}
)
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/",
PageLogEndpoint.as_view(), name="page-transactions"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/<uuid:transaction>/",
PageLogEndpoint.as_view(), name="page-transactions"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/sub-pages/",
SubPagesEndpoint.as_view(), name="sub-page"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/estimates/",
BulkEstimatePointEndpoint.as_view(
{ {
"get": "list", "get": "list",
"post": "create", "post": "create",
} }
), ),
name="project-page-blocks", name="bulk-create-estimate-points",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/estimates/<uuid:estimate_id>/",
PageBlockViewSet.as_view( BulkEstimatePointEndpoint.as_view(
{ {
"get": "retrieve", "get": "retrieve",
"patch": "partial_update", "patch": "partial_update",
"delete": "destroy", "delete": "destroy",
} }
), ),
name="project-page-blocks", name="bulk-create-estimate-points",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/", "workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/",
@ -1263,7 +1320,7 @@ urlpatterns = [
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:page_block_id>/issues/", "workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:page_block_id>/issues/",
CreateIssueFromPageBlockEndpoint.as_view(), CreateIssueFromBlockEndpoint.as_view(),
name="page-block-issues", name="page-block-issues",
), ),
## End Pages ## End Pages

View File

@ -2,10 +2,8 @@ from .project import (
     ProjectViewSet,
     ProjectMemberViewSet,
     UserProjectInvitationsViewset,
-    InviteProjectEndpoint,
+    ProjectInvitationsViewset,
     AddTeamToProjectEndpoint,
-    ProjectMemberInvitationsViewset,
-    ProjectMemberInviteDetailViewSet,
     ProjectIdentifierEndpoint,
     ProjectJoinEndpoint,
     ProjectUserViewsEndpoint,
@ -14,7 +12,6 @@ from .project import (
     ProjectDeployBoardViewSet,
     ProjectDeployBoardPublicSettingsEndpoint,
     WorkspaceProjectDeployBoardEndpoint,
-    LeaveProjectEndpoint,
     ProjectPublicCoverImagesEndpoint,
 )
 from .user import (
@ -26,19 +23,17 @@ from .user import (
 from .oauth import OauthEndpoint
-from .base import BaseAPIView, BaseViewSet
+from .base import BaseAPIView, BaseViewSet, WebhookMixin
 from .workspace import (
     WorkSpaceViewSet,
     UserWorkSpacesEndpoint,
     WorkSpaceAvailabilityCheckEndpoint,
-    InviteWorkspaceEndpoint,
-    JoinWorkspaceEndpoint,
+    WorkspaceJoinEndpoint,
     WorkSpaceMemberViewSet,
     TeamMemberViewSet,
     WorkspaceInvitationsViewset,
-    UserWorkspaceInvitationsEndpoint,
-    UserWorkspaceInvitationEndpoint,
+    UserWorkspaceInvitationsViewSet,
     UserLastProjectWithWorkspaceEndpoint,
     WorkspaceMemberUserEndpoint,
     WorkspaceMemberUserViewsEndpoint,
@ -51,7 +46,6 @@ from .workspace import (
     WorkspaceUserProfileEndpoint,
     WorkspaceUserProfileIssuesEndpoint,
     WorkspaceLabelsEndpoint,
-    LeaveWorkspaceEndpoint,
 )
 from .state import StateViewSet
 from .view import (
@ -121,7 +115,7 @@ from .module import (
     ModuleFavoriteViewSet,
 )
-from .api_token import ApiTokenEndpoint
+from .api import ApiTokenEndpoint
 from .integration import (
     WorkspaceIntegrationViewSet,
@ -144,9 +138,10 @@ from .importer (
 from .page import (
     PageViewSet,
-    PageBlockViewSet,
     PageFavoriteViewSet,
-    CreateIssueFromPageBlockEndpoint,
+    PageLogEndpoint,
+    SubPagesEndpoint,
+    CreateIssueFromBlockEndpoint,
 )
 from .search import GlobalSearchEndpoint, IssueSearchEndpoint
@ -178,3 +173,5 @@ from .notification import (
 from .exporter import ExportIssuesEndpoint
 from .config import ConfigurationEndpoint
+from .webhook import WebhookEndpoint, WebhookLogsEndpoint, WebhookSecretRegenerateEndpoint

View File

@ -0,0 +1,78 @@
# Python import
from uuid import uuid4
# Third party
from rest_framework.response import Response
from rest_framework import status
# Module import
from .base import BaseAPIView
from plane.db.models import APIToken, Workspace
from plane.api.serializers import APITokenSerializer, APITokenReadSerializer
from plane.api.permissions import WorkspaceOwnerPermission
class ApiTokenEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug):
label = request.data.get("label", str(uuid4().hex))
description = request.data.get("description", "")
workspace = Workspace.objects.get(slug=slug)
expired_at = request.data.get("expired_at", None)
# Check the user type
user_type = 1 if request.user.is_bot else 0
api_token = APIToken.objects.create(
label=label,
description=description,
user=request.user,
workspace=workspace,
user_type=user_type,
expired_at=expired_at,
)
serializer = APITokenSerializer(api_token)
# Token will be only visible while creating
return Response(
serializer.data,
status=status.HTTP_201_CREATED,
)
def get(self, request, slug, pk=None):
if pk == None:
api_tokens = APIToken.objects.filter(
user=request.user, workspace__slug=slug
)
serializer = APITokenReadSerializer(api_tokens, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
else:
api_tokens = APIToken.objects.get(
user=request.user, workspace__slug=slug, pk=pk
)
serializer = APITokenReadSerializer(api_tokens)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, slug, pk):
api_token = APIToken.objects.get(
workspace__slug=slug,
user=request.user,
pk=pk,
)
api_token.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, pk):
api_token = APIToken.objects.get(
workspace__slug=slug,
user=request.user,
pk=pk,
)
serializer = APITokenSerializer(api_token, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

View File

@ -1,47 +0,0 @@
# Python import
from uuid import uuid4
# Third party
from rest_framework.response import Response
from rest_framework import status
from sentry_sdk import capture_exception
# Module import
from .base import BaseAPIView
from plane.db.models import APIToken
from plane.api.serializers import APITokenSerializer
class ApiTokenEndpoint(BaseAPIView):
def post(self, request):
label = request.data.get("label", str(uuid4().hex))
workspace = request.data.get("workspace", False)
if not workspace:
return Response(
{"error": "Workspace is required"}, status=status.HTTP_200_OK
)
api_token = APIToken.objects.create(
label=label, user=request.user, workspace_id=workspace
)
serializer = APITokenSerializer(api_token)
# Token will be only vissible while creating
return Response(
{"api_token": serializer.data, "token": api_token.token},
status=status.HTTP_201_CREATED,
)
def get(self, request):
api_tokens = APIToken.objects.filter(user=request.user)
serializer = APITokenSerializer(api_tokens, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, pk):
api_token = APIToken.objects.get(pk=pk)
api_token.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@ -33,7 +33,7 @@ from plane.bgtasks.forgot_password_task import forgot_password
 class RequestEmailVerificationEndpoint(BaseAPIView):
     def get(self, request):
         token = RefreshToken.for_user(request.user).access_token
-        current_site = settings.WEB_URL
+        current_site = request.META.get('HTTP_ORIGIN')
         email_verification.delay(
             request.user.first_name, request.user.email, token, current_site
         )
@ -76,7 +76,7 @@ class ForgotPasswordEndpoint(BaseAPIView):
         uidb64 = urlsafe_base64_encode(smart_bytes(user.id))
         token = PasswordResetTokenGenerator().make_token(user)
-        current_site = settings.WEB_URL
+        current_site = request.META.get('HTTP_ORIGIN')
         forgot_password.delay(
             user.first_name, user.email, uidb64, token, current_site

View File

@ -4,7 +4,7 @@ import random
 import string
 import json
 import requests
+from requests.exceptions import RequestException
 # Django imports
 from django.utils import timezone
 from django.core.exceptions import ValidationError
@ -22,8 +22,13 @@ from sentry_sdk import capture_exception, capture_message
 # Module imports
 from . import BaseAPIView
-from plane.db.models import User
-from plane.api.serializers import UserSerializer
+from plane.db.models import (
+    User,
+    WorkspaceMemberInvite,
+    WorkspaceMember,
+    ProjectMemberInvite,
+    ProjectMember,
+)
 from plane.settings.redis import redis_instance
 from plane.bgtasks.magic_link_code_task import magic_link
@ -86,35 +91,93 @@ class SignUpEndpoint(BaseAPIView):
user.token_updated_at = timezone.now() user.token_updated_at = timezone.now()
user.save() user.save()
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
access_token, refresh_token = get_tokens_for_user(user) access_token, refresh_token = get_tokens_for_user(user)
data = { data = {
"access_token": access_token, "access_token": access_token,
"refresh_token": refresh_token, "refresh_token": refresh_token,
} }
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
return Response(data, status=status.HTTP_200_OK) return Response(data, status=status.HTTP_200_OK)
@ -176,33 +239,92 @@ class SignInEndpoint(BaseAPIView):
user.token_updated_at = timezone.now() user.token_updated_at = timezone.now()
user.save() user.save()
access_token, refresh_token = get_tokens_for_user(user) # Check if user has any accepted invites for workspace and add them to workspace
# Send Analytics workspace_member_invites = WorkspaceMemberInvite.objects.filter(
if settings.ANALYTICS_BASE_API: email=user.email, accepted=True
_ = requests.post( )
settings.ANALYTICS_BASE_API,
headers={ WorkspaceMember.objects.bulk_create(
"Content-Type": "application/json", [
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY, WorkspaceMember(
}, workspace_id=workspace_member_invite.workspace_id,
json={ member=user,
"event_id": uuid.uuid4().hex, role=workspace_member_invite.role,
"event_data": { )
"medium": "email", for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
}, },
"user": {"email": email, "id": str(user.id)}, json={
"device_ctx": { "event_id": uuid.uuid4().hex,
"ip": request.META.get("REMOTE_ADDR"), "event_data": {
"user_agent": request.META.get("HTTP_USER_AGENT"), "medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
}, },
"event_type": "SIGN_IN", )
}, except RequestException as e:
) capture_exception(e)
data = { data = {
"access_token": access_token, "access_token": access_token,
"refresh_token": refresh_token, "refresh_token": refresh_token,
} }
access_token, refresh_token = get_tokens_for_user(user)
return Response(data, status=status.HTTP_200_OK) return Response(data, status=status.HTTP_200_OK)
@ -287,7 +409,8 @@ class MagicSignInGenerateEndpoint(BaseAPIView):
ri.set(key, json.dumps(value), ex=expiry) ri.set(key, json.dumps(value), ex=expiry)
current_site = settings.WEB_URL
current_site = request.META.get('HTTP_ORIGIN')
magic_link.delay(email, key, token, current_site) magic_link.delay(email, key, token, current_site)
return Response({"key": key}, status=status.HTTP_200_OK) return Response({"key": key}, status=status.HTTP_200_OK)
@@ -319,27 +442,37 @@ class MagicSignInEndpoint(BaseAPIView):
        if str(token) == str(user_token):
            if User.objects.filter(email=email).exists():
                user = User.objects.get(email=email)
                if not user.is_active:
                    return Response(
                        {
                            "error": "Your account has been deactivated. Please contact your site administrator."
                        },
                        status=status.HTTP_403_FORBIDDEN,
                    )
                try:
                    # Send event to Jitsu for tracking
                    if settings.ANALYTICS_BASE_API:
                        _ = requests.post(
                            settings.ANALYTICS_BASE_API,
                            headers={
                                "Content-Type": "application/json",
                                "X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
                            },
                            json={
                                "event_id": uuid.uuid4().hex,
                                "event_data": {
                                    "medium": "code",
                                },
                                "user": {"email": email, "id": str(user.id)},
                                "device_ctx": {
                                    "ip": request.META.get("REMOTE_ADDR"),
                                    "user_agent": request.META.get("HTTP_USER_AGENT"),
                                },
                                "event_type": "SIGN_IN",
                            },
                        )
                except RequestException as e:
                    capture_exception(e)
            else:
                user = User.objects.create(
                    email=email,
@@ -347,27 +480,30 @@ class MagicSignInEndpoint(BaseAPIView):
                    password=make_password(uuid.uuid4().hex),
                    is_password_autoset=True,
                )
                try:
                    # Send event to Jitsu for tracking
                    if settings.ANALYTICS_BASE_API:
                        _ = requests.post(
                            settings.ANALYTICS_BASE_API,
                            headers={
                                "Content-Type": "application/json",
                                "X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
                            },
                            json={
                                "event_id": uuid.uuid4().hex,
                                "event_data": {
                                    "medium": "code",
                                },
                                "user": {"email": email, "id": str(user.id)},
                                "device_ctx": {
                                    "ip": request.META.get("REMOTE_ADDR"),
                                    "user_agent": request.META.get("HTTP_USER_AGENT"),
                                },
                                "event_type": "SIGN_UP",
                            },
                        )
                except RequestException as e:
                    capture_exception(e)

                user.last_active = timezone.now()
                user.last_login_time = timezone.now()
@@ -376,6 +512,63 @@ class MagicSignInEndpoint(BaseAPIView):
            user.token_updated_at = timezone.now()
            user.save()

            # Check if user has any accepted invites for workspace and add them to workspace
            workspace_member_invites = WorkspaceMemberInvite.objects.filter(
                email=user.email, accepted=True
            )

            WorkspaceMember.objects.bulk_create(
                [
                    WorkspaceMember(
                        workspace_id=workspace_member_invite.workspace_id,
                        member=user,
                        role=workspace_member_invite.role,
                    )
                    for workspace_member_invite in workspace_member_invites
                ],
                ignore_conflicts=True,
            )

            # Check if user has any project invites
            project_member_invites = ProjectMemberInvite.objects.filter(
                email=user.email, accepted=True
            )

            # Add user to workspace
            WorkspaceMember.objects.bulk_create(
                [
                    WorkspaceMember(
                        workspace_id=project_member_invite.workspace_id,
                        role=project_member_invite.role
                        if project_member_invite.role in [5, 10, 15]
                        else 15,
                        member=user,
                        created_by_id=project_member_invite.created_by_id,
                    )
                    for project_member_invite in project_member_invites
                ],
                ignore_conflicts=True,
            )

            # Now add the users to project
            ProjectMember.objects.bulk_create(
                [
                    ProjectMember(
                        workspace_id=project_member_invite.workspace_id,
                        role=project_member_invite.role
                        if project_member_invite.role in [5, 10, 15]
                        else 15,
                        member=user,
                        created_by_id=project_member_invite.created_by_id,
                    )
                    for project_member_invite in project_member_invites
                ],
                ignore_conflicts=True,
            )

            # Delete all the invites
            workspace_member_invites.delete()
            project_member_invites.delete()

            access_token, refresh_token = get_tokens_for_user(user)
            data = {
                "access_token": access_token,

View File

@@ -1,5 +1,6 @@
# Python imports
import zoneinfo
import json

# Django imports
from django.urls import resolve
@@ -7,6 +8,7 @@ from django.conf import settings
from django.utils import timezone
from django.db import IntegrityError
from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.core.serializers.json import DjangoJSONEncoder

# Third part imports
from rest_framework import status
@@ -22,6 +24,7 @@ from django_filters.rest_framework import DjangoFilterBackend

# Module imports
from plane.utils.paginator import BasePaginator
from plane.bgtasks.webhook_task import send_webhook

class TimezoneMixin:
@@ -29,6 +32,7 @@ class TimezoneMixin:
    """
    This enables timezone conversion according
    to the user set timezone
    """

    def initial(self, request, *args, **kwargs):
        super().initial(request, *args, **kwargs)
        if request.user.is_authenticated:
@@ -37,8 +41,29 @@ class TimezoneMixin:
            timezone.deactivate()


class WebhookMixin:
    webhook_event = None

    def finalize_response(self, request, response, *args, **kwargs):
        response = super().finalize_response(request, response, *args, **kwargs)
        if (
            self.webhook_event
            and self.request.method in ["POST", "PATCH", "DELETE"]
            and response.status_code in [200, 201, 204]
            and settings.ENABLE_WEBHOOK
        ):
            send_webhook.delay(
                event=self.webhook_event,
                event_data=json.dumps(response.data, cls=DjangoJSONEncoder),
                action=self.request.method,
                slug=self.workspace_slug,
            )
        return response


class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):

    model = None

    permission_classes = [
@@ -71,18 +96,30 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
            return response
        except Exception as e:
            if isinstance(e, IntegrityError):
                return Response(
                    {"error": "The payload is not valid"},
                    status=status.HTTP_400_BAD_REQUEST,
                )

            if isinstance(e, ValidationError):
                return Response(
                    {"error": "Please provide valid detail"},
                    status=status.HTTP_400_BAD_REQUEST,
                )

            if isinstance(e, ObjectDoesNotExist):
                model_name = str(exc).split(" matching query does not exist.")[0]
                return Response(
                    {"error": f"{model_name} does not exist."},
                    status=status.HTTP_404_NOT_FOUND,
                )

            if isinstance(e, KeyError):
                capture_exception(e)
                return Response(
                    {"error": f"key {e} does not exist"},
                    status=status.HTTP_400_BAD_REQUEST,
                )

            print(e) if settings.DEBUG else print("Server Error")
            capture_exception(e)
@@ -99,8 +136,8 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
                print(
                    f"{request.method} - {request.get_full_path()} of Queries: {len(connection.queries)}"
                )
            return response
        except Exception as exc:
            response = self.handle_exception(exc)
            return exc
@@ -120,7 +157,6 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):

class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
    permission_classes = [
        IsAuthenticated,
    ]
@@ -139,7 +175,6 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
            queryset = backend().filter_queryset(self.request, queryset, self)
        return queryset

    def handle_exception(self, exc):
        """
        Handle any exception that occurs, by returning an appropriate response,
@@ -150,19 +185,29 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
            return response
        except Exception as e:
            if isinstance(e, IntegrityError):
                return Response(
                    {"error": "The payload is not valid"},
                    status=status.HTTP_400_BAD_REQUEST,
                )

            if isinstance(e, ValidationError):
                return Response(
                    {"error": "Please provide valid detail"},
                    status=status.HTTP_400_BAD_REQUEST,
                )

            if isinstance(e, ObjectDoesNotExist):
                model_name = str(exc).split(" matching query does not exist.")[0]
                return Response(
                    {"error": f"{model_name} does not exist."},
                    status=status.HTTP_404_NOT_FOUND,
                )

            if isinstance(e, KeyError):
                return Response({"error": f"key {e} does not exist"}, status=status.HTTP_400_BAD_REQUEST)

            if settings.DEBUG:
                print(e)
            capture_exception(e)
            return Response({"error": "Something went wrong please try again later"}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
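The WebhookMixin added above fires `send_webhook.delay(...)` from `finalize_response` after any successful POST, PATCH or DELETE, and only when `settings.ENABLE_WEBHOOK` is set; it also reads `self.workspace_slug`, which the view is assumed to expose from its URL kwargs. A viewset opts in by inheriting the mixin ahead of BaseViewSet and naming its event, as the later hunks do for project, cycle, module, issue and issue-comment. A minimal sketch, with placeholder names that are not part of this changeset:

class ExampleViewSet(WebhookMixin, BaseViewSet):
    model = Example  # placeholder model
    serializer_class = ExampleSerializer  # placeholder serializer
    # Event name delivered to the workspace's webhook endpoints alongside the serialized response data.
    webhook_event = "example"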

View File

@@ -23,7 +23,7 @@ from rest_framework import status
from sentry_sdk import capture_exception

# Module imports
from . import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
    CycleSerializer,
    CycleIssueSerializer,
@@ -48,9 +48,10 @@ from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot


class CycleViewSet(WebhookMixin, BaseViewSet):
    serializer_class = CycleSerializer
    model = Cycle
    webhook_event = "cycle"
    permission_classes = [
        ProjectEntityPermission,
    ]
@@ -499,10 +500,10 @@ class CycleViewSet(BaseViewSet):
        return Response(status=status.HTTP_204_NO_CONTENT)


class CycleIssueViewSet(WebhookMixin, BaseViewSet):
    serializer_class = CycleIssueSerializer
    model = CycleIssue
    webhook_event = "cycle"
    permission_classes = [
        ProjectEntityPermission,
    ]

View File

@@ -64,9 +64,7 @@ class InboxViewSet(BaseViewSet):
        serializer.save(project_id=self.kwargs.get("project_id"))

    def destroy(self, request, slug, project_id, pk):
        inbox = Inbox.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
        # Handle default inbox delete
        if inbox.is_default:
            return Response(
@@ -128,9 +126,7 @@ class InboxIssueViewSet(BaseViewSet):
                .values("count")
            )
            .annotate(
                attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
                .order_by()
                .annotate(count=Func(F("id"), function="Count"))
                .values("count")
@@ -150,7 +146,6 @@ class InboxIssueViewSet(BaseViewSet):
            status=status.HTTP_200_OK,
        )

    def create(self, request, slug, project_id, inbox_id):
        if not request.data.get("issue", {}).get("name", False):
            return Response(
@@ -198,7 +193,7 @@ class InboxIssueViewSet(BaseViewSet):
            issue_id=str(issue.id),
            project_id=str(project_id),
            current_instance=None,
            epoch=int(timezone.now().timestamp()),
        )
        # create an inbox issue
        InboxIssue.objects.create(
@@ -216,10 +211,20 @@ class InboxIssueViewSet(BaseViewSet):
            pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
        )
        # Get the project member
        project_member = ProjectMember.objects.get(
            workspace__slug=slug,
            project_id=project_id,
            member=request.user,
            is_active=True,
        )
        # Only project members admins and created_by users can access this endpoint
        if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
            request.user.id
        ):
            return Response(
                {"error": "You cannot edit inbox issues"},
                status=status.HTTP_400_BAD_REQUEST,
            )

        # Get issue data
        issue_data = request.data.pop("issue", False)
@@ -233,8 +238,10 @@ class InboxIssueViewSet(BaseViewSet):
            # viewers and guests since only viewers and guests
            issue_data = {
                "name": issue_data.get("name", issue.name),
                "description_html": issue_data.get(
                    "description_html", issue.description_html
                ),
                "description": issue_data.get("description", issue.description),
            }

            issue_serializer = IssueCreateSerializer(
@@ -256,7 +263,7 @@ class InboxIssueViewSet(BaseViewSet):
                        IssueSerializer(current_instance).data,
                        cls=DjangoJSONEncoder,
                    ),
                    epoch=int(timezone.now().timestamp()),
                )
                issue_serializer.save()
        else:
@@ -307,7 +314,9 @@ class InboxIssueViewSet(BaseViewSet):
                return Response(serializer.data, status=status.HTTP_200_OK)
            return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
        else:
            return Response(
                InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
            )

    def retrieve(self, request, slug, project_id, inbox_id, pk):
        inbox_issue = InboxIssue.objects.get(
@@ -324,15 +333,27 @@ class InboxIssueViewSet(BaseViewSet):
            pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
        )
        # Get the project member
        project_member = ProjectMember.objects.get(
            workspace__slug=slug,
            project_id=project_id,
            member=request.user,
            is_active=True,
        )
        if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
            request.user.id
        ):
            return Response(
                {"error": "You cannot delete inbox issue"},
                status=status.HTTP_400_BAD_REQUEST,
            )
        # Check the issue status
        if inbox_issue.status in [-2, -1, 0, 2]:
            # Delete the issue also
            Issue.objects.filter(
                workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id
            ).delete()
        inbox_issue.delete()
        return Response(status=status.HTTP_204_NO_CONTENT)
@@ -347,7 +368,10 @@ class InboxIssuePublicViewSet(BaseViewSet):
    ]

    def get_queryset(self):
        project_deploy_board = ProjectDeployBoard.objects.get(
            workspace__slug=self.kwargs.get("slug"),
            project_id=self.kwargs.get("project_id"),
        )
        if project_deploy_board is not None:
            return self.filter_queryset(
                super()
@@ -363,9 +387,14 @@ class InboxIssuePublicViewSet(BaseViewSet):
        return InboxIssue.objects.none()

    def list(self, request, slug, project_id, inbox_id):
        project_deploy_board = ProjectDeployBoard.objects.get(
            workspace__slug=slug, project_id=project_id
        )
        if project_deploy_board.inbox is None:
            return Response(
                {"error": "Inbox is not enabled for this Project Board"},
                status=status.HTTP_400_BAD_REQUEST,
            )

        filters = issue_filters(request.query_params, "GET")
        issues = (
@@ -392,9 +421,7 @@ class InboxIssuePublicViewSet(BaseViewSet):
                .values("count")
            )
            .annotate(
                attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
                .order_by()
                .annotate(count=Func(F("id"), function="Count"))
                .values("count")
@@ -415,9 +442,14 @@ class InboxIssuePublicViewSet(BaseViewSet):
        )

    def create(self, request, slug, project_id, inbox_id):
        project_deploy_board = ProjectDeployBoard.objects.get(
            workspace__slug=slug, project_id=project_id
        )
        if project_deploy_board.inbox is None:
            return Response(
                {"error": "Inbox is not enabled for this Project Board"},
                status=status.HTTP_400_BAD_REQUEST,
            )

        if not request.data.get("issue", {}).get("name", False):
            return Response(
@@ -465,7 +497,7 @@ class InboxIssuePublicViewSet(BaseViewSet):
            issue_id=str(issue.id),
            project_id=str(project_id),
            current_instance=None,
            epoch=int(timezone.now().timestamp()),
        )
        # create an inbox issue
        InboxIssue.objects.create(
@@ -479,34 +511,41 @@ class InboxIssuePublicViewSet(BaseViewSet):
        return Response(serializer.data, status=status.HTTP_200_OK)

    def partial_update(self, request, slug, project_id, inbox_id, pk):
        project_deploy_board = ProjectDeployBoard.objects.get(
            workspace__slug=slug, project_id=project_id
        )
        if project_deploy_board.inbox is None:
            return Response(
                {"error": "Inbox is not enabled for this Project Board"},
                status=status.HTTP_400_BAD_REQUEST,
            )

        inbox_issue = InboxIssue.objects.get(
            pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
        )
        # Get the project member
        if str(inbox_issue.created_by_id) != str(request.user.id):
            return Response(
                {"error": "You cannot edit inbox issues"},
                status=status.HTTP_400_BAD_REQUEST,
            )

        # Get issue data
        issue_data = request.data.pop("issue", False)

        issue = Issue.objects.get(
            pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
        )
        # viewers and guests since only viewers and guests
        issue_data = {
            "name": issue_data.get("name", issue.name),
            "description_html": issue_data.get(
                "description_html", issue.description_html
            ),
            "description": issue_data.get("description", issue.description),
        }

        issue_serializer = IssueCreateSerializer(issue, data=issue_data, partial=True)

        if issue_serializer.is_valid():
            current_instance = issue
@@ -523,16 +562,21 @@ class InboxIssuePublicViewSet(BaseViewSet):
                    IssueSerializer(current_instance).data,
                    cls=DjangoJSONEncoder,
                ),
                epoch=int(timezone.now().timestamp()),
            )
            issue_serializer.save()
            return Response(issue_serializer.data, status=status.HTTP_200_OK)
        return Response(issue_serializer.errors, status=status.HTTP_400_BAD_REQUEST)

    def retrieve(self, request, slug, project_id, inbox_id, pk):
        project_deploy_board = ProjectDeployBoard.objects.get(
            workspace__slug=slug, project_id=project_id
        )
        if project_deploy_board.inbox is None:
            return Response(
                {"error": "Inbox is not enabled for this Project Board"},
                status=status.HTTP_400_BAD_REQUEST,
            )
        inbox_issue = InboxIssue.objects.get(
            pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
@@ -544,16 +588,24 @@ class InboxIssuePublicViewSet(BaseViewSet):
        return Response(serializer.data, status=status.HTTP_200_OK)

    def destroy(self, request, slug, project_id, inbox_id, pk):
        project_deploy_board = ProjectDeployBoard.objects.get(
            workspace__slug=slug, project_id=project_id
        )
        if project_deploy_board.inbox is None:
            return Response(
                {"error": "Inbox is not enabled for this Project Board"},
                status=status.HTTP_400_BAD_REQUEST,
            )
        inbox_issue = InboxIssue.objects.get(
            pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
        )
        if str(inbox_issue.created_by_id) != str(request.user.id):
            return Response(
                {"error": "You cannot delete inbox issue"},
                status=status.HTTP_400_BAD_REQUEST,
            )
        inbox_issue.delete()
        return Response(status=status.HTTP_204_NO_CONTENT)
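Both inbox viewsets above now resolve the acting ProjectMember with `is_active=True`, so deactivated members can no longer edit or delete intake issues. A minimal sketch of the same guard as a reusable check, assuming the names used in the viewsets (the helper itself is illustrative, not part of this changeset):

def can_touch_inbox_issue(user, inbox_issue, slug, project_id):
    # Mirrors the partial_update/destroy checks above: the acting member must be active,
    # and either hold a role above 10 or be the creator of the inbox issue.
    member = ProjectMember.objects.get(
        workspace__slug=slug,
        project_id=project_id,
        member=user,
        is_active=True,
    )
    return member.role > 10 or str(inbox_issue.created_by_id) == str(user.id)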

View File

@@ -33,7 +33,7 @@ from rest_framework.permissions import AllowAny, IsAuthenticated
from sentry_sdk import capture_exception

# Module imports
from . import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
    IssueCreateSerializer,
    IssueActivitySerializer,
@@ -84,7 +84,7 @@ from plane.utils.grouper import group_results
from plane.utils.issue_filters import issue_filters


class IssueViewSet(WebhookMixin, BaseViewSet):
    def get_serializer_class(self):
        return (
            IssueCreateSerializer
@@ -93,6 +93,7 @@ class IssueViewSet(BaseViewSet):
        )

    model = Issue
    webhook_event = "issue"
    permission_classes = [
        ProjectEntityPermission,
    ]
@@ -594,9 +595,10 @@ class IssueActivityEndpoint(BaseAPIView):
        return Response(result_list, status=status.HTTP_200_OK)


class IssueCommentViewSet(WebhookMixin, BaseViewSet):
    serializer_class = IssueCommentSerializer
    model = IssueComment
    webhook_event = "issue-comment"
    permission_classes = [
        ProjectLitePermission,
    ]
@@ -623,6 +625,7 @@ class IssueCommentViewSet(BaseViewSet):
                        workspace__slug=self.kwargs.get("slug"),
                        project_id=self.kwargs.get("project_id"),
                        member_id=self.request.user.id,
                        is_active=True,
                    )
                )
            )
@@ -753,8 +756,8 @@ class LabelViewSet(BaseViewSet):
            .select_related("project")
            .select_related("workspace")
            .select_related("parent")
            .distinct()
            .order_by("sort_order")
        )
@@ -1254,7 +1257,11 @@ class IssueSubscriberViewSet(BaseViewSet):
    def list(self, request, slug, project_id, issue_id):
        members = (
            ProjectMember.objects.filter(
                workspace__slug=slug,
                project_id=project_id,
                is_active=True,
            )
            .annotate(
                is_subscribed=Exists(
                    IssueSubscriber.objects.filter(
@@ -1498,6 +1505,7 @@ class IssueCommentPublicViewSet(BaseViewSet):
                        workspace__slug=self.kwargs.get("slug"),
                        project_id=self.kwargs.get("project_id"),
                        member_id=self.request.user.id,
                        is_active=True,
                    )
                )
            )
@@ -1538,6 +1546,7 @@ class IssueCommentPublicViewSet(BaseViewSet):
            if not ProjectMember.objects.filter(
                project_id=project_id,
                member=request.user,
                is_active=True,
            ).exists():
                # Add the user for workspace tracking
                _ = ProjectPublicMember.objects.get_or_create(
@@ -1651,6 +1660,7 @@ class IssueReactionPublicViewSet(BaseViewSet):
            if not ProjectMember.objects.filter(
                project_id=project_id,
                member=request.user,
                is_active=True,
            ).exists():
                # Add the user for workspace tracking
                _ = ProjectPublicMember.objects.get_or_create(
@@ -1744,7 +1754,9 @@ class CommentReactionPublicViewSet(BaseViewSet):
                project_id=project_id, comment_id=comment_id, actor=request.user
            )
            if not ProjectMember.objects.filter(
                project_id=project_id,
                member=request.user,
                is_active=True,
            ).exists():
                # Add the user for workspace tracking
                _ = ProjectPublicMember.objects.get_or_create(
@@ -1829,7 +1841,9 @@ class IssueVotePublicViewSet(BaseViewSet):
            )
            # Add the user for workspace tracking
            if not ProjectMember.objects.filter(
                project_id=project_id,
                member=request.user,
                is_active=True,
            ).exists():
                _ = ProjectPublicMember.objects.get_or_create(
                    project_id=project_id,

View File

@@ -15,7 +15,7 @@ from rest_framework import status
from sentry_sdk import capture_exception

# Module imports
from . import BaseViewSet, WebhookMixin
from plane.api.serializers import (
    ModuleWriteSerializer,
    ModuleSerializer,
@@ -41,11 +41,12 @@ from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot


class ModuleViewSet(WebhookMixin, BaseViewSet):
    model = Module
    permission_classes = [
        ProjectEntityPermission,
    ]
    webhook_event = "module"

    def get_serializer_class(self):
        return (

View File

@@ -85,7 +85,10 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
            # Created issues
            if type == "created":
                if WorkspaceMember.objects.filter(
                    workspace__slug=slug,
                    member=request.user,
                    role__lt=15,
                    is_active=True,
                ).exists():
                    notifications = Notification.objects.none()
                else:
@@ -255,7 +258,10 @@ class MarkAllReadNotificationViewSet(BaseViewSet):
            # Created issues
            if type == "created":
                if WorkspaceMember.objects.filter(
                    workspace__slug=slug,
                    member=request.user,
                    role__lt=15,
                    is_active=True,
                ).exists():
                    notifications = Notification.objects.none()
                else:

View File

@@ -2,6 +2,7 @@
import uuid
import requests
import os
from requests.exceptions import RequestException

# Django imports
from django.utils import timezone
@@ -20,7 +21,14 @@ from google.oauth2 import id_token
from google.auth.transport import requests as google_auth_request

# Module imports
from plane.db.models import (
    SocialLoginConnection,
    User,
    WorkspaceMemberInvite,
    WorkspaceMember,
    ProjectMemberInvite,
    ProjectMember,
)
from plane.api.serializers import UserSerializer
from .base import BaseAPIView
@@ -168,7 +176,6 @@ class OauthEndpoint(BaseAPIView):
            )

            ## Login Case
            if not user.is_active:
                return Response(
                    {
@@ -185,12 +192,61 @@ class OauthEndpoint(BaseAPIView):
            user.is_email_verified = email_verified
            user.save()

            # Check if user has any accepted invites for workspace and add them to workspace
            workspace_member_invites = WorkspaceMemberInvite.objects.filter(
                email=user.email, accepted=True
            )

            WorkspaceMember.objects.bulk_create(
                [
                    WorkspaceMember(
                        workspace_id=workspace_member_invite.workspace_id,
                        member=user,
                        role=workspace_member_invite.role,
                    )
                    for workspace_member_invite in workspace_member_invites
                ],
                ignore_conflicts=True,
            )

            # Check if user has any project invites
            project_member_invites = ProjectMemberInvite.objects.filter(
                email=user.email, accepted=True
            )

            # Add user to workspace
            WorkspaceMember.objects.bulk_create(
                [
                    WorkspaceMember(
                        workspace_id=project_member_invite.workspace_id,
                        role=project_member_invite.role
                        if project_member_invite.role in [5, 10, 15]
                        else 15,
                        member=user,
                        created_by_id=project_member_invite.created_by_id,
                    )
                    for project_member_invite in project_member_invites
                ],
                ignore_conflicts=True,
            )

            # Now add the users to project
            ProjectMember.objects.bulk_create(
                [
                    ProjectMember(
                        workspace_id=project_member_invite.workspace_id,
                        role=project_member_invite.role
                        if project_member_invite.role in [5, 10, 15]
                        else 15,
                        member=user,
                        created_by_id=project_member_invite.created_by_id,
                    )
                    for project_member_invite in project_member_invites
                ],
                ignore_conflicts=True,
            )

            # Delete all the invites
            workspace_member_invites.delete()
            project_member_invites.delete()

            SocialLoginConnection.objects.update_or_create(
                medium=medium,
@@ -201,26 +257,36 @@ class OauthEndpoint(BaseAPIView):
                    "last_login_at": timezone.now(),
                },
            )

            try:
                if settings.ANALYTICS_BASE_API:
                    _ = requests.post(
                        settings.ANALYTICS_BASE_API,
                        headers={
                            "Content-Type": "application/json",
                            "X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
                        },
                        json={
                            "event_id": uuid.uuid4().hex,
                            "event_data": {
                                "medium": f"oauth-{medium}",
                            },
                            "user": {"email": email, "id": str(user.id)},
                            "device_ctx": {
                                "ip": request.META.get("REMOTE_ADDR"),
                                "user_agent": request.META.get("HTTP_USER_AGENT"),
                            },
                            "event_type": "SIGN_IN",
                        },
                    )
            except RequestException as e:
                capture_exception(e)

            access_token, refresh_token = get_tokens_for_user(user)

            data = {
                "access_token": access_token,
                "refresh_token": refresh_token,
            }

            return Response(data, status=status.HTTP_200_OK)
        except User.DoesNotExist:
@@ -260,31 +326,85 @@ class OauthEndpoint(BaseAPIView):
            user.token_updated_at = timezone.now()
            user.save()

            # Check if user has any accepted invites for workspace and add them to workspace
            workspace_member_invites = WorkspaceMemberInvite.objects.filter(
                email=user.email, accepted=True
            )

            WorkspaceMember.objects.bulk_create(
                [
                    WorkspaceMember(
                        workspace_id=workspace_member_invite.workspace_id,
                        member=user,
                        role=workspace_member_invite.role,
                    )
                    for workspace_member_invite in workspace_member_invites
                ],
                ignore_conflicts=True,
            )

            # Check if user has any project invites
            project_member_invites = ProjectMemberInvite.objects.filter(
                email=user.email, accepted=True
            )

            # Add user to workspace
            WorkspaceMember.objects.bulk_create(
                [
                    WorkspaceMember(
                        workspace_id=project_member_invite.workspace_id,
                        role=project_member_invite.role
                        if project_member_invite.role in [5, 10, 15]
                        else 15,
                        member=user,
                        created_by_id=project_member_invite.created_by_id,
                    )
                    for project_member_invite in project_member_invites
                ],
                ignore_conflicts=True,
            )

            # Now add the users to project
            ProjectMember.objects.bulk_create(
                [
                    ProjectMember(
                        workspace_id=project_member_invite.workspace_id,
                        role=project_member_invite.role
                        if project_member_invite.role in [5, 10, 15]
                        else 15,
                        member=user,
                        created_by_id=project_member_invite.created_by_id,
                    )
                    for project_member_invite in project_member_invites
                ],
                ignore_conflicts=True,
            )

            # Delete all the invites
            workspace_member_invites.delete()
            project_member_invites.delete()

            try:
                if settings.ANALYTICS_BASE_API:
                    _ = requests.post(
                        settings.ANALYTICS_BASE_API,
                        headers={
                            "Content-Type": "application/json",
                            "X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
                        },
                        json={
                            "event_id": uuid.uuid4().hex,
                            "event_data": {
                                "medium": f"oauth-{medium}",
                            },
                            "user": {"email": email, "id": str(user.id)},
                            "device_ctx": {
                                "ip": request.META.get("REMOTE_ADDR"),
                                "user_agent": request.META.get("HTTP_USER_AGENT"),
                            },
                            "event_type": "SIGN_UP",
                        },
                    )
            except RequestException as e:
                capture_exception(e)

            SocialLoginConnection.objects.update_or_create(
                medium=medium,
@@ -295,4 +415,10 @@ class OauthEndpoint(BaseAPIView):
                    "last_login_at": timezone.now(),
                },
            )
            access_token, refresh_token = get_tokens_for_user(user)
            data = {
                "access_token": access_token,
                "refresh_token": refresh_token,
            }

            return Response(data, status=status.HTTP_201_CREATED)
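The accepted-invite block above is now repeated verbatim in SignInEndpoint, MagicSignInEndpoint and both branches of OauthEndpoint. A hedged sketch of the same logic pulled into one helper; `process_accepted_invites` is not part of this changeset, and it mirrors the diff as written, including the role cap of 15 for project invites:

def process_accepted_invites(user):
    # Workspace invites join the user with the role stored on the invite.
    workspace_member_invites = WorkspaceMemberInvite.objects.filter(
        email=user.email, accepted=True
    )
    WorkspaceMember.objects.bulk_create(
        [
            WorkspaceMember(
                workspace_id=invite.workspace_id,
                member=user,
                role=invite.role,
            )
            for invite in workspace_member_invites
        ],
        ignore_conflicts=True,
    )
    # Project invites: the diff also adds a WorkspaceMember row per project invite
    # (omitted here for brevity) before creating the ProjectMember rows below.
    project_member_invites = ProjectMemberInvite.objects.filter(
        email=user.email, accepted=True
    )
    ProjectMember.objects.bulk_create(
        [
            ProjectMember(
                workspace_id=invite.workspace_id,
                role=invite.role if invite.role in [5, 10, 15] else 15,
                member=user,
                created_by_id=invite.created_by_id,
            )
            for invite in project_member_invites
        ],
        ignore_conflicts=True,
    )
    # Consume the invites once they have been applied.
    workspace_member_invites.delete()
    project_member_invites.delete()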

View File

@@ -1,9 +1,19 @@
# Python imports
from datetime import timedelta, date, datetime

# Django imports
from django.db import connection
from django.db.models import Exists, OuterRef, Q, Prefetch
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
from django.db.models import (
    OuterRef,
    Func,
    F,
    Q,
    Exists,
)

# Third party imports
from rest_framework import status
@@ -15,20 +25,37 @@ from .base import BaseViewSet, BaseAPIView
from plane.api.permissions import ProjectEntityPermission
from plane.db.models import (
    Page,
    PageFavorite,
    Issue,
    IssueAssignee,
    IssueActivity,
    PageLog,
)
from plane.api.serializers import (
    PageSerializer,
    PageFavoriteSerializer,
    PageLogSerializer,
    IssueLiteSerializer,
    SubPageSerializer,
)


def unarchive_archive_page_and_descendants(page_id, archived_at):
    # Your SQL query
    sql = """
    WITH RECURSIVE descendants AS (
        SELECT id FROM pages WHERE id = %s
        UNION ALL
        SELECT pages.id FROM pages, descendants WHERE pages.parent_id = descendants.id
    )
    UPDATE pages SET archived_at = %s WHERE id IN (SELECT id FROM descendants);
    """

    # Execute the SQL query
    with connection.cursor() as cursor:
        cursor.execute(sql, [page_id, archived_at])
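`unarchive_archive_page_and_descendants` stamps or clears `archived_at` for a page and its entire subtree in a single recursive CTE against the `pages` table. The two call sites later in this file use it like this (the timestamp is whatever the archive request supplies):

# Archive: the page and every descendant receive the same archived_at value.
unarchive_archive_page_and_descendants(page_id, datetime.now())

# Unarchive: passing None clears archived_at across the whole subtree.
unarchive_archive_page_and_descendants(page_id, None)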


class PageViewSet(BaseViewSet):
    serializer_class = PageSerializer
    model = Page
@@ -52,6 +79,7 @@ class PageViewSet(BaseViewSet):
            .filter(workspace__slug=self.kwargs.get("slug"))
            .filter(project_id=self.kwargs.get("project_id"))
            .filter(project__project_projectmember__member=self.request.user)
            .filter(parent__isnull=True)
            .filter(Q(owned_by=self.request.user) | Q(access=0))
            .select_related("project")
            .select_related("workspace")
@@ -59,23 +87,10 @@ class PageViewSet(BaseViewSet):
            .annotate(is_favorite=Exists(subquery))
            .order_by(self.request.GET.get("order_by", "-created_at"))
            .prefetch_related("labels")
            .order_by("-is_favorite", "-created_at")
            .distinct()
        )

    def create(self, request, slug, project_id):
        serializer = PageSerializer(
            data=request.data,
@@ -88,34 +103,88 @@ class PageViewSet(BaseViewSet):
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

    def partial_update(self, request, slug, project_id, pk):
        try:
            page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)

            if page.is_locked:
                return Response(
                    {"error": "Page is locked"},
                    status=status.HTTP_400_BAD_REQUEST,
                )

            parent = request.data.get("parent", None)
            if parent:
                _ = Page.objects.get(
                    pk=parent, workspace__slug=slug, project_id=project_id
                )

            # Only update access if the page owner is the requesting user
            if (
                page.access != request.data.get("access", page.access)
                and page.owned_by_id != request.user.id
            ):
                return Response(
                    {
                        "error": "Access cannot be updated since this page is owned by someone else"
                    },
                    status=status.HTTP_400_BAD_REQUEST,
                )

            serializer = PageSerializer(page, data=request.data, partial=True)
            if serializer.is_valid():
                serializer.save()
                return Response(serializer.data, status=status.HTTP_200_OK)
            return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
        except Page.DoesNotExist:
            return Response(
                {
                    "error": "Access cannot be updated since this page is owned by someone else"
                },
                status=status.HTTP_400_BAD_REQUEST,
            )

    def lock(self, request, slug, project_id, page_id):
        page = Page.objects.get(pk=page_id, workspace__slug=slug, project_id=project_id)

        # only the owner can lock the page
        if request.user.id != page.owned_by_id:
            return Response(
                {"error": "Only the page owner can lock the page"},
            )

        page.is_locked = True
        page.save()
        return Response(status=status.HTTP_204_NO_CONTENT)

    def unlock(self, request, slug, project_id, page_id):
        page = Page.objects.get(pk=page_id, workspace__slug=slug, project_id=project_id)

        # only the owner can unlock the page
        if request.user.id != page.owned_by_id:
            return Response(
                {"error": "Only the page owner can unlock the page"},
                status=status.HTTP_400_BAD_REQUEST,
            )

        page.is_locked = False
        page.save()

        return Response(status=status.HTTP_204_NO_CONTENT)
    def list(self, request, slug, project_id):
        queryset = self.get_queryset().filter(archived_at__isnull=True)
        page_view = request.GET.get("page_view", False)

        if not page_view:
            return Response(
                {"error": "Page View parameter is required"},
                status=status.HTTP_400_BAD_REQUEST,
            )

        # All Pages
        if page_view == "all":
            return Response(
                PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
            )

        # Recent pages
        if page_view == "recent":
@@ -123,66 +192,130 @@ class PageViewSet(BaseViewSet):
            day_before = current_time - timedelta(days=1)
            todays_pages = queryset.filter(updated_at__date=date.today())
            yesterdays_pages = queryset.filter(updated_at__date=day_before)
            earlier_this_week = queryset.filter(
                updated_at__date__range=(
                    (timezone.now() - timedelta(days=7)),
                    (timezone.now() - timedelta(days=2)),
                )
            )
            return Response(
                {
                    "today": PageSerializer(todays_pages, many=True).data,
                    "yesterday": PageSerializer(yesterdays_pages, many=True).data,
                    "earlier_this_week": PageSerializer(
                        earlier_this_week, many=True
                    ).data,
                },
                status=status.HTTP_200_OK,
            )

        # Favorite Pages
        if page_view == "favorite":
            queryset = queryset.filter(is_favorite=True)
            return Response(
                PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
            )

        # My pages
        if page_view == "created_by_me":
            queryset = queryset.filter(owned_by=request.user)
            return Response(
                PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
            )

        # Created by other Pages
        if page_view == "created_by_other":
            queryset = queryset.filter(~Q(owned_by=request.user), access=0)
            return Response(
                PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
            )

        return Response(
            {"error": "No matching view found"}, status=status.HTTP_400_BAD_REQUEST
        )
    def archive(self, request, slug, project_id, page_id):
        page = Page.objects.get(pk=page_id, workspace__slug=slug, project_id=project_id)

        if page.owned_by_id != request.user.id:
            return Response(
                {"error": "Only the owner of the page can archive a page"},
                status=status.HTTP_400_BAD_REQUEST,
            )

        unarchive_archive_page_and_descendants(page_id, datetime.now())

        return Response(status=status.HTTP_204_NO_CONTENT)

    def unarchive(self, request, slug, project_id, page_id):
        page = Page.objects.get(pk=page_id, workspace__slug=slug, project_id=project_id)

        if page.owned_by_id != request.user.id:
            return Response(
                {"error": "Only the owner of the page can unarchive a page"},
                status=status.HTTP_400_BAD_REQUEST,
            )

        page.parent = None
        page.save()

        unarchive_archive_page_and_descendants(page_id, None)

        return Response(status=status.HTTP_204_NO_CONTENT)

    def archive_list(self, request, slug, project_id):
        pages = (
            Page.objects.filter(
                project_id=project_id,
                workspace__slug=slug,
            )
            .filter(archived_at__isnull=False)
            .filter(parent_id__isnull=True)
        )
        if not pages:
            return Response(
                {"error": "No pages found"}, status=status.HTTP_400_BAD_REQUEST
            )
        return Response(
            PageSerializer(pages, many=True).data, status=status.HTTP_200_OK
        )


class PageFavoriteViewSet(BaseViewSet):
    permission_classes = [
@@ -196,6 +329,7 @@ class PageFavoriteViewSet(BaseViewSet):
        return self.filter_queryset(
            super()
            .get_queryset()
            .filter(archived_at__isnull=True)
            .filter(workspace__slug=self.kwargs.get("slug"))
            .filter(user=self.request.user)
            .select_related("page", "page__owned_by")
@@ -218,24 +352,62 @@ class PageFavoriteViewSet(BaseViewSet):
        page_favorite.delete()
        return Response(status=status.HTTP_204_NO_CONTENT)


class PageLogEndpoint(BaseAPIView):
    permission_classes = [
        ProjectEntityPermission,
    ]

    serializer_class = PageLogSerializer
    model = PageLog

    def post(self, request, slug, project_id, page_id):
        serializer = PageLogSerializer(data=request.data)
        if serializer.is_valid():
            serializer.save(project_id=project_id, page_id=page_id)
            return Response(serializer.data, status=status.HTTP_201_CREATED)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

    def patch(self, request, slug, project_id, page_id, transaction):
        page_transaction = PageLog.objects.get(
            workspace__slug=slug,
            project_id=project_id,
            page_id=page_id,
            transaction=transaction,
        )
        serializer = PageLogSerializer(
            page_transaction, data=request.data, partial=True
        )
        if serializer.is_valid():
            serializer.save()
            return Response(serializer.data, status=status.HTTP_200_OK)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

    def delete(self, request, slug, project_id, page_id, transaction):
        transaction = PageLog.objects.get(
            workspace__slug=slug,
            project_id=project_id,
            page_id=page_id,
            transaction=transaction,
        )
        # Delete the transaction object
        transaction.delete()
        return Response(status=status.HTTP_204_NO_CONTENT)


class CreateIssueFromBlockEndpoint(BaseAPIView):
    permission_classes = [
        ProjectEntityPermission,
    ]

    def post(self, request, slug, project_id, page_id):
        page = Page.objects.get(
            workspace__slug=slug,
            project_id=project_id,
            pk=page_id,
        )
        issue = Issue.objects.create(
            name=request.data.get("name"),
            project_id=project_id,
        )
        _ = IssueAssignee.objects.create(
            issue=issue, assignee=request.user, project_id=project_id
@@ -245,11 +417,31 @@ class CreateIssueFromPageBlockEndpoint(BaseAPIView):
            issue=issue,
            actor=request.user,
            project_id=project_id,
            comment=f"created the issue from {page.name} block",
            verb="created",
        )

        return Response(IssueLiteSerializer(issue).data, status=status.HTTP_200_OK)


class SubPagesEndpoint(BaseAPIView):
    permission_classes = [
        ProjectEntityPermission,
    ]

    @method_decorator(gzip_page)
    def get(self, request, slug, project_id, page_id):
        pages = (
            PageLog.objects.filter(
                page_id=page_id,
                project_id=project_id,
                workspace__slug=slug,
                entity_name__in=["forward_link", "back_link"],
            )
            .filter(archived_at__isnull=True)
            .select_related("project")
            .select_related("workspace")
        )
        return Response(
            SubPageSerializer(pages, many=True).data, status=status.HTTP_200_OK
        )
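SubPagesEndpoint resolves a page's embedded links from PageLog rows whose entity_name is "forward_link" or "back_link", skipping archived entries. A minimal sketch of the same lookup outside the view; the names come from the code above, and the URL routing for these endpoints is not part of this diff:

links = (
    PageLog.objects.filter(
        page_id=page_id,
        project_id=project_id,
        workspace__slug=slug,
        entity_name__in=["forward_link", "back_link"],
    )
    .filter(archived_at__isnull=True)
    .select_related("project", "workspace")
)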

View File

@@ -17,16 +17,16 @@ from django.db.models import (
 )
 from django.core.validators import validate_email
 from django.conf import settings
+from django.utils import timezone

 # Third Party imports
 from rest_framework.response import Response
 from rest_framework import status
 from rest_framework import serializers
 from rest_framework.permissions import AllowAny
-from sentry_sdk import capture_exception

 # Module imports
-from .base import BaseViewSet, BaseAPIView
+from .base import BaseViewSet, BaseAPIView, WebhookMixin
 from plane.api.serializers import (
     ProjectSerializer,
     ProjectListSerializer,
@@ -39,6 +39,7 @@ from plane.api.serializers import (
 )
 from plane.api.permissions import (
+    WorkspaceUserPermission,
     ProjectBasePermission,
     ProjectEntityPermission,
     ProjectMemberPermission,
@@ -58,13 +59,6 @@ from plane.db.models import (
     ProjectIdentifier,
     Module,
     Cycle,
-    CycleFavorite,
-    ModuleFavorite,
-    PageFavorite,
-    IssueViewFavorite,
-    Page,
-    IssueAssignee,
-    ModuleMember,
     Inbox,
     ProjectDeployBoard,
     IssueProperty,
@@ -73,9 +67,10 @@ from plane.db.models import (
 from plane.bgtasks.project_invitation_task import project_invitation


-class ProjectViewSet(BaseViewSet):
+class ProjectViewSet(WebhookMixin, BaseViewSet):
     serializer_class = ProjectSerializer
     model = Project
+    webhook_event = "project"

     permission_classes = [
         ProjectBasePermission,
@@ -110,12 +105,15 @@ class ProjectViewSet(BaseViewSet):
                     member=self.request.user,
                     project_id=OuterRef("pk"),
                     workspace__slug=self.kwargs.get("slug"),
+                    is_active=True,
                 )
             )
         )
         .annotate(
             total_members=ProjectMember.objects.filter(
-                project_id=OuterRef("id"), member__is_bot=False
+                project_id=OuterRef("id"),
+                member__is_bot=False,
+                is_active=True,
             )
             .order_by()
             .annotate(count=Func(F("id"), function="Count"))
@@ -137,6 +135,7 @@ class ProjectViewSet(BaseViewSet):
             member_role=ProjectMember.objects.filter(
                 project_id=OuterRef("pk"),
                 member_id=self.request.user.id,
+                is_active=True,
             ).values("role")
         )
         .annotate(
@@ -157,6 +156,7 @@ class ProjectViewSet(BaseViewSet):
             member=request.user,
             project_id=OuterRef("pk"),
             workspace__slug=self.kwargs.get("slug"),
+            is_active=True,
         ).values("sort_order")

         projects = (
             self.get_queryset()
@@ -166,6 +166,7 @@ class ProjectViewSet(BaseViewSet):
                     "project_projectmember",
                     queryset=ProjectMember.objects.filter(
                         workspace__slug=slug,
+                        is_active=True,
                     ).select_related("member"),
                 )
             )
@@ -345,66 +346,104 @@ class ProjectViewSet(BaseViewSet):
         )


-class InviteProjectEndpoint(BaseAPIView):
+class ProjectInvitationsViewset(BaseViewSet):
+    serializer_class = ProjectMemberInviteSerializer
+    model = ProjectMemberInvite
+
+    search_fields = []
+
     permission_classes = [
         ProjectBasePermission,
     ]

-    def post(self, request, slug, project_id):
-        email = request.data.get("email", False)
-        role = request.data.get("role", False)
+    def get_queryset(self):
+        return self.filter_queryset(
+            super()
+            .get_queryset()
+            .filter(workspace__slug=self.kwargs.get("slug"))
+            .filter(project_id=self.kwargs.get("project_id"))
+            .select_related("project")
+            .select_related("workspace", "workspace__owner")
+        )
+
+    def create(self, request, slug, project_id):
+        emails = request.data.get("emails", [])

         # Check if email is provided
-        if not email:
+        if not emails:
             return Response(
-                {"error": "Email is required"}, status=status.HTTP_400_BAD_REQUEST
+                {"error": "Emails are required"}, status=status.HTTP_400_BAD_REQUEST
             )

-        validate_email(email)
-        # Check if user is already a member of workspace
-        if ProjectMember.objects.filter(
-            project_id=project_id,
-            member__email=email,
-            member__is_bot=False,
-        ).exists():
+        requesting_user = ProjectMember.objects.get(
+            workspace__slug=slug, project_id=project_id, member_id=request.user.id
+        )
+
+        # Check if any invited user has a higher role
+        if len(
+            [
+                email
+                for email in emails
+                if int(email.get("role", 10)) > requesting_user.role
+            ]
+        ):
             return Response(
-                {"error": "User is already member of workspace"},
+                {"error": "You cannot invite a user with higher role"},
                 status=status.HTTP_400_BAD_REQUEST,
             )

-        user = User.objects.filter(email=email).first()
-
-        if user is None:
-            token = jwt.encode(
-                {"email": email, "timestamp": datetime.now().timestamp()},
-                settings.SECRET_KEY,
-                algorithm="HS256",
-            )
-            project_invitation_obj = ProjectMemberInvite.objects.create(
-                email=email.strip().lower(),
-                project_id=project_id,
-                token=token,
-                role=role,
-            )
-            domain = settings.WEB_URL
-            project_invitation.delay(email, project_id, token, domain)
-
-            return Response(
-                {
-                    "message": "Email sent successfully",
-                    "id": project_invitation_obj.id,
-                },
-                status=status.HTTP_200_OK,
-            )
-
-        project_member = ProjectMember.objects.create(
-            member=user, project_id=project_id, role=role
-        )
-
-        _ = IssueProperty.objects.create(user=user, project_id=project_id)
-
-        return Response(
-            ProjectMemberSerializer(project_member).data, status=status.HTTP_200_OK
-        )
+        workspace = Workspace.objects.get(slug=slug)
+
+        project_invitations = []
+        for email in emails:
+            try:
+                validate_email(email.get("email"))
+                project_invitations.append(
+                    ProjectMemberInvite(
+                        email=email.get("email").strip().lower(),
+                        project_id=project_id,
+                        workspace_id=workspace.id,
+                        token=jwt.encode(
+                            {
+                                "email": email,
+                                "timestamp": datetime.now().timestamp(),
+                            },
+                            settings.SECRET_KEY,
+                            algorithm="HS256",
+                        ),
+                        role=email.get("role", 10),
+                        created_by=request.user,
+                    )
+                )
+            except ValidationError:
+                return Response(
+                    {
+                        "error": f"Invalid email - {email}; a valid email address is required to send the invite"
+                    },
+                    status=status.HTTP_400_BAD_REQUEST,
+                )
+
+        # Create project member invites
+        project_invitations = ProjectMemberInvite.objects.bulk_create(
+            project_invitations, batch_size=10, ignore_conflicts=True
+        )
+
+        current_site = f"{request.scheme}://{request.get_host()}"
+
+        # Send invitations
+        for invitation in project_invitations:
+            project_invitation.delay(
+                invitation.email,
+                project_id,
+                invitation.token,
+                current_site,
+                request.user.email,
+            )
+
+        return Response(
+            {
+                "message": "Email sent successfully",
+            },
+            status=status.HTTP_200_OK,
+        )
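The invitation token built in the create method above is a plain HS256 JWT over the invitee's email and a timestamp. A standalone sketch of the same pattern with PyJWT; the secret below is a placeholder standing in for settings.SECRET_KEY.

from datetime import datetime
import jwt

SECRET_KEY = "replace-me"  # placeholder; the view uses settings.SECRET_KEY

token = jwt.encode(
    {"email": "invitee@example.com", "timestamp": datetime.now().timestamp()},
    SECRET_KEY,
    algorithm="HS256",
)

# The same key and algorithm recover the payload when the invite link comes back.
payload = jwt.decode(token, SECRET_KEY, algorithms=["HS256"])
print(payload["email"])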
@@ -420,40 +459,134 @@ class UserProjectInvitationsViewset(BaseViewSet):
             .select_related("workspace", "workspace__owner", "project")
         )

-    def create(self, request):
-        invitations = request.data.get("invitations")
-        project_invitations = ProjectMemberInvite.objects.filter(
-            pk__in=invitations, accepted=True
-        )
+    def create(self, request, slug):
+        project_ids = request.data.get("project_ids", [])
+
+        # Get the workspace user role
+        workspace_member = WorkspaceMember.objects.get(
+            member=request.user,
+            workspace__slug=slug,
+            is_active=True,
+        )
+
+        workspace_role = workspace_member.role
+        workspace = workspace_member.workspace

         ProjectMember.objects.bulk_create(
             [
                 ProjectMember(
-                    project=invitation.project,
-                    workspace=invitation.project.workspace,
+                    project_id=project_id,
                     member=request.user,
-                    role=invitation.role,
+                    role=15 if workspace_role >= 15 else 10,
+                    workspace=workspace,
                     created_by=request.user,
                 )
-                for invitation in project_invitations
-            ]
+                for project_id in project_ids
+            ],
+            ignore_conflicts=True,
         )

         IssueProperty.objects.bulk_create(
             [
-                ProjectMember(
-                    project=invitation.project,
-                    workspace=invitation.project.workspace,
+                IssueProperty(
+                    project_id=project_id,
                     user=request.user,
+                    workspace=workspace,
                     created_by=request.user,
                 )
-                for invitation in project_invitations
-            ]
+                for project_id in project_ids
+            ],
+            ignore_conflicts=True,
         )

-        # Delete joined project invites
-        project_invitations.delete()
-
-        return Response(status=status.HTTP_204_NO_CONTENT)
+        return Response(
+            {"message": "Projects joined successfully"},
+            status=status.HTTP_201_CREATED,
+        )
class ProjectJoinEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request, slug, project_id, pk):
project_invite = ProjectMemberInvite.objects.get(
pk=pk,
project_id=project_id,
workspace__slug=slug,
)
email = request.data.get("email", "")
if email == "" or project_invite.email != email:
return Response(
{"error": "You do not have permission to join the project"},
status=status.HTTP_403_FORBIDDEN,
)
if project_invite.responded_at is None:
project_invite.accepted = request.data.get("accepted", False)
project_invite.responded_at = timezone.now()
project_invite.save()
if project_invite.accepted:
# Check if the user account exists
user = User.objects.filter(email=email).first()
# Check if user is a part of workspace
workspace_member = WorkspaceMember.objects.filter(
workspace__slug=slug, member=user
).first()
# Add him to workspace
if workspace_member is None:
_ = WorkspaceMember.objects.create(
workspace_id=project_invite.workspace_id,
member=user,
role=15 if project_invite.role >= 15 else project_invite.role,
)
else:
# Else make him active
workspace_member.is_active = True
workspace_member.save()
# Check if the user was already a member of project then activate the user
project_member = ProjectMember.objects.filter(
workspace_id=project_invite.workspace_id, member=user
).first()
if project_member is None:
# Create a Project Member
_ = ProjectMember.objects.create(
workspace_id=project_invite.workspace_id,
member=user,
role=project_invite.role,
)
else:
project_member.is_active = True
project_member.role = project_invite.role
project_member.save()
return Response(
{"message": "Project Invitation Accepted"},
status=status.HTTP_200_OK,
)
return Response(
{"message": "Project Invitation was not accepted"},
status=status.HTTP_200_OK,
)
return Response(
{"error": "You have already responded to the invitation request"},
status=status.HTTP_400_BAD_REQUEST,
)
def get(self, request, slug, project_id, pk):
project_invitation = ProjectMemberInvite.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
serializer = ProjectMemberInviteSerializer(project_invitation)
return Response(serializer.data, status=status.HTTP_200_OK)
class ProjectMemberViewSet(BaseViewSet): class ProjectMemberViewSet(BaseViewSet):
@ -475,6 +608,7 @@ class ProjectMemberViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug")) .filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id")) .filter(project_id=self.kwargs.get("project_id"))
.filter(member__is_bot=False) .filter(member__is_bot=False)
.filter()
.select_related("project") .select_related("project")
.select_related("member") .select_related("member")
.select_related("workspace", "workspace__owner") .select_related("workspace", "workspace__owner")
@@ -542,13 +676,17 @@ class ProjectMemberViewSet(BaseViewSet):
     def list(self, request, slug, project_id):
         project_member = ProjectMember.objects.get(
-            member=request.user, workspace__slug=slug, project_id=project_id
+            member=request.user,
+            workspace__slug=slug,
+            project_id=project_id,
+            is_active=True,
         )

         project_members = ProjectMember.objects.filter(
             project_id=project_id,
             workspace__slug=slug,
             member__is_bot=False,
+            is_active=True,
         ).select_related("project", "member", "workspace")

         if project_member.role > 10:
@@ -559,7 +697,10 @@ class ProjectMemberViewSet(BaseViewSet):
     def partial_update(self, request, slug, project_id, pk):
         project_member = ProjectMember.objects.get(
-            pk=pk, workspace__slug=slug, project_id=project_id
+            pk=pk,
+            workspace__slug=slug,
+            project_id=project_id,
+            is_active=True,
         )
         if request.user.id == project_member.member_id:
             return Response(
@@ -568,7 +709,10 @@ class ProjectMemberViewSet(BaseViewSet):
             )
         # Check while updating user roles
         requested_project_member = ProjectMember.objects.get(
-            project_id=project_id, workspace__slug=slug, member=request.user
+            project_id=project_id,
+            workspace__slug=slug,
+            member=request.user,
+            is_active=True,
         )
         if (
             "role" in request.data
@@ -591,54 +735,66 @@ class ProjectMemberViewSet(BaseViewSet):
     def destroy(self, request, slug, project_id, pk):
         project_member = ProjectMember.objects.get(
-            workspace__slug=slug, project_id=project_id, pk=pk
+            workspace__slug=slug,
+            project_id=project_id,
+            pk=pk,
+            member__is_bot=False,
+            is_active=True,
         )
         # check requesting user role
         requesting_project_member = ProjectMember.objects.get(
-            workspace__slug=slug, member=request.user, project_id=project_id
+            workspace__slug=slug,
+            member=request.user,
+            project_id=project_id,
+            is_active=True,
         )
+        # User cannot remove himself
+        if str(project_member.id) == str(requesting_project_member.id):
+            return Response(
+                {
+                    "error": "You cannot remove yourself from the workspace. Please use leave workspace"
+                },
+                status=status.HTTP_400_BAD_REQUEST,
+            )
+        # User cannot deactivate higher role
         if requesting_project_member.role < project_member.role:
             return Response(
-                {"error": "You cannot remove a user having role higher than yourself"},
+                {"error": "You cannot remove a user having role higher than you"},
                 status=status.HTTP_400_BAD_REQUEST,
             )

-        # Remove all favorites
-        ProjectFavorite.objects.filter(
-            workspace__slug=slug, project_id=project_id, user=project_member.member
-        ).delete()
-        CycleFavorite.objects.filter(
-            workspace__slug=slug, project_id=project_id, user=project_member.member
-        ).delete()
-        ModuleFavorite.objects.filter(
-            workspace__slug=slug, project_id=project_id, user=project_member.member
-        ).delete()
-        PageFavorite.objects.filter(
-            workspace__slug=slug, project_id=project_id, user=project_member.member
-        ).delete()
-        IssueViewFavorite.objects.filter(
-            workspace__slug=slug, project_id=project_id, user=project_member.member
-        ).delete()
-        # Also remove issue from issue assigned
-        IssueAssignee.objects.filter(
-            workspace__slug=slug,
-            project_id=project_id,
-            assignee=project_member.member,
-        ).delete()
-
-        # Remove if module member
-        ModuleMember.objects.filter(
-            workspace__slug=slug,
-            project_id=project_id,
-            member=project_member.member,
-        ).delete()
-        # Delete owned Pages
-        Page.objects.filter(
-            workspace__slug=slug,
-            project_id=project_id,
-            owned_by=project_member.member,
-        ).delete()
-        project_member.delete()
+        project_member.is_active = False
+        project_member.save()
+        return Response(status=status.HTTP_204_NO_CONTENT)
+
+    def leave(self, request, slug, project_id):
+        project_member = ProjectMember.objects.get(
+            workspace__slug=slug,
+            project_id=project_id,
+            member=request.user,
+            is_active=True,
+        )
+
+        # Check if the leaving user is the only admin of the project
+        if (
+            project_member.role == 20
+            and not ProjectMember.objects.filter(
+                workspace__slug=slug,
+                project_id=project_id,
+                role=20,
+                is_active=True,
+            ).count()
+            > 1
+        ):
+            return Response(
+                {
+                    "error": "You cannot leave the project as you are the only admin of the project; you will have to either delete the project or create another admin",
+                },
+                status=status.HTTP_400_BAD_REQUEST,
+            )
+        # Deactivate the user
+        project_member.is_active = False
+        project_member.save()
         return Response(status=status.HTTP_204_NO_CONTENT)
@ -691,46 +847,6 @@ class AddTeamToProjectEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED) return Response(serializer.data, status=status.HTTP_201_CREATED)
class ProjectMemberInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.select_related("project")
.select_related("workspace", "workspace__owner")
)
class ProjectMemberInviteDetailViewSet(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.select_related("project")
.select_related("workspace", "workspace__owner")
)
class ProjectIdentifierEndpoint(BaseAPIView): class ProjectIdentifierEndpoint(BaseAPIView):
permission_classes = [ permission_classes = [
ProjectBasePermission, ProjectBasePermission,
@ -774,59 +890,14 @@ class ProjectIdentifierEndpoint(BaseAPIView):
) )
class ProjectJoinEndpoint(BaseAPIView):
def post(self, request, slug):
project_ids = request.data.get("project_ids", [])
# Get the workspace user role
workspace_member = WorkspaceMember.objects.get(
member=request.user, workspace__slug=slug
)
workspace_role = workspace_member.role
workspace = workspace_member.workspace
ProjectMember.objects.bulk_create(
[
ProjectMember(
project_id=project_id,
member=request.user,
role=20
if workspace_role >= 15
else (15 if workspace_role == 10 else workspace_role),
workspace=workspace,
created_by=request.user,
)
for project_id in project_ids
],
ignore_conflicts=True,
)
IssueProperty.objects.bulk_create(
[
IssueProperty(
project_id=project_id,
user=request.user,
workspace=workspace,
created_by=request.user,
)
for project_id in project_ids
],
ignore_conflicts=True,
)
return Response(
{"message": "Projects joined successfully"},
status=status.HTTP_201_CREATED,
)
class ProjectUserViewsEndpoint(BaseAPIView): class ProjectUserViewsEndpoint(BaseAPIView):
def post(self, request, slug, project_id): def post(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug) project = Project.objects.get(pk=project_id, workspace__slug=slug)
        project_member = ProjectMember.objects.filter(
-            member=request.user, project=project
+            member=request.user,
+            project=project,
+            is_active=True,
        ).first()
if project_member is None: if project_member is None:
@ -850,7 +921,10 @@ class ProjectUserViewsEndpoint(BaseAPIView):
class ProjectMemberUserEndpoint(BaseAPIView): class ProjectMemberUserEndpoint(BaseAPIView):
def get(self, request, slug, project_id): def get(self, request, slug, project_id):
        project_member = ProjectMember.objects.get(
-            project_id=project_id, workspace__slug=slug, member=request.user
+            project_id=project_id,
+            workspace__slug=slug,
+            member=request.user,
+            is_active=True,
        )
serializer = ProjectMemberSerializer(project_member) serializer = ProjectMemberSerializer(project_member)
@ -983,39 +1057,6 @@ class WorkspaceProjectDeployBoardEndpoint(BaseAPIView):
return Response(projects, status=status.HTTP_200_OK) return Response(projects, status=status.HTTP_200_OK)
class LeaveProjectEndpoint(BaseAPIView):
permission_classes = [
ProjectLitePermission,
]
def delete(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
workspace__slug=slug,
member=request.user,
project_id=project_id,
)
# Only Admin case
if (
project_member.role == 20
and ProjectMember.objects.filter(
workspace__slug=slug,
role=20,
project_id=project_id,
).count()
== 1
):
return Response(
{
"error": "You cannot leave the project since you are the only admin of the project you should delete the project"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the member from workspace
project_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectPublicCoverImagesEndpoint(BaseAPIView): class ProjectPublicCoverImagesEndpoint(BaseAPIView):
permission_classes = [ permission_classes = [
AllowAny, AllowAny,

View File

@ -13,13 +13,7 @@ from plane.api.serializers import (
) )
 from plane.api.views.base import BaseViewSet, BaseAPIView
-from plane.db.models import (
-    User,
-    Workspace,
-    WorkspaceMemberInvite,
-    Issue,
-    IssueActivity,
-)
+from plane.db.models import User, IssueActivity, WorkspaceMember
 from plane.utils.paginator import BasePaginator
@ -41,10 +35,28 @@ class UserEndpoint(BaseViewSet):
serialized_data = UserMeSettingsSerializer(request.user).data serialized_data = UserMeSettingsSerializer(request.user).data
return Response(serialized_data, status=status.HTTP_200_OK) return Response(serialized_data, status=status.HTTP_200_OK)
def deactivate(self, request):
# Check all workspace user is active
user = self.get_object()
if WorkspaceMember.objects.filter(
member=request.user, is_active=True
).exists():
return Response(
{
"error": "User cannot deactivate account as user is active in some workspaces"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Deactivate the user
user.is_active = False
user.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class UpdateUserOnBoardedEndpoint(BaseAPIView): class UpdateUserOnBoardedEndpoint(BaseAPIView):
def patch(self, request): def patch(self, request):
-        user = User.objects.get(pk=request.user.id)
+        user = User.objects.get(pk=request.user.id, is_active=True)
user.is_onboarded = request.data.get("is_onboarded", False) user.is_onboarded = request.data.get("is_onboarded", False)
user.save() user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK) return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)
@ -52,7 +64,7 @@ class UpdateUserOnBoardedEndpoint(BaseAPIView):
class UpdateUserTourCompletedEndpoint(BaseAPIView): class UpdateUserTourCompletedEndpoint(BaseAPIView):
def patch(self, request): def patch(self, request):
-        user = User.objects.get(pk=request.user.id)
+        user = User.objects.get(pk=request.user.id, is_active=True)
user.is_tour_completed = request.data.get("is_tour_completed", False) user.is_tour_completed = request.data.get("is_tour_completed", False)
user.save() user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK) return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)

View File

@ -0,0 +1,130 @@
# Django imports
from django.db import IntegrityError
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.db.models import Webhook, WebhookLog, Workspace
from plane.db.models.webhook import generate_token
from .base import BaseAPIView
from plane.api.permissions import WorkspaceOwnerPermission
from plane.api.serializers import WebhookSerializer, WebhookLogSerializer
class WebhookEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug):
workspace = Workspace.objects.get(slug=slug)
try:
serializer = WebhookSerializer(data=request.data)
if serializer.is_valid():
serializer.save(workspace_id=workspace.id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError as e:
if "already exists" in str(e):
return Response(
{"error": "URL already exists for the workspace"},
status=status.HTTP_410_GONE,
)
raise IntegrityError
def get(self, request, slug, pk=None):
if pk is None:
webhooks = Webhook.objects.filter(workspace__slug=slug)
serializer = WebhookSerializer(
webhooks,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
many=True,
)
return Response(serializer.data, status=status.HTTP_200_OK)
else:
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
serializer = WebhookSerializer(
webhook,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, pk):
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
serializer = WebhookSerializer(
webhook,
data=request.data,
partial=True,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, pk):
webhook = Webhook.objects.get(pk=pk, workspace__slug=slug)
webhook.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class WebhookSecretRegenerateEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug, pk):
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
webhook.secret_key = generate_token()
webhook.save()
serializer = WebhookSerializer(webhook)
return Response(serializer.data, status=status.HTTP_200_OK)
class WebhookLogsEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def get(self, request, slug, webhook_id):
webhook_logs = WebhookLog.objects.filter(
workspace__slug=slug, webhook_id=webhook_id
)
serializer = WebhookLogSerializer(webhook_logs, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
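A hedged sketch of registering a webhook through WebhookEndpoint.post above. The payload keys mirror the serializer fields listed in the view; the route and the API key are assumptions, since the urlconf is not included in this hunk.

import requests

payload = {
    "url": "https://hooks.example.com/plane",  # delivery target
    "is_active": True,
    "project": True,        # event toggles, matching the serializer fields above
    "issue": True,
    "issue_comment": False,
    "cycle": False,
    "module": False,
}
resp = requests.post(
    "https://plane.example.com/api/workspaces/my-team/webhooks/",  # assumed path
    json=payload,
    headers={"X-Api-Key": "plane_api_xxxxxxxx"},                    # assumed credential
    timeout=30,
)
# 201 with the webhook body on success; 410 if the URL already exists for the workspace
print(resp.status_code, resp.json())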

View File

@ -2,7 +2,6 @@
import jwt import jwt
from datetime import date, datetime from datetime import date, datetime
from dateutil.relativedelta import relativedelta from dateutil.relativedelta import relativedelta
from uuid import uuid4
# Django imports # Django imports
from django.db import IntegrityError from django.db import IntegrityError
@@ -26,13 +25,11 @@ from django.db.models import (
 )
 from django.db.models.functions import ExtractWeek, Cast, ExtractDay
 from django.db.models.fields import DateField
-from django.contrib.auth.hashers import make_password

 # Third party modules
 from rest_framework import status
 from rest_framework.response import Response
-from rest_framework.permissions import AllowAny
-from sentry_sdk import capture_exception
+from rest_framework.permissions import AllowAny, IsAuthenticated

 # Module imports
from plane.api.serializers import ( from plane.api.serializers import (
@ -59,14 +56,6 @@ from plane.db.models import (
IssueActivity, IssueActivity,
Issue, Issue,
WorkspaceTheme, WorkspaceTheme,
IssueAssignee,
ProjectFavorite,
CycleFavorite,
ModuleMember,
ModuleFavorite,
PageFavorite,
Page,
IssueViewFavorite,
IssueLink, IssueLink,
IssueAttachment, IssueAttachment,
IssueSubscriber, IssueSubscriber,
@@ -106,7 +95,9 @@ class WorkSpaceViewSet(BaseViewSet):
     def get_queryset(self):
         member_count = (
             WorkspaceMember.objects.filter(
-                workspace=OuterRef("id"), member__is_bot=False
+                workspace=OuterRef("id"),
+                member__is_bot=False,
+                is_active=True,
             )
.order_by() .order_by()
.annotate(count=Func(F("id"), function="Count")) .annotate(count=Func(F("id"), function="Count"))
@@ -181,7 +172,9 @@ class UserWorkSpacesEndpoint(BaseAPIView):
     def get(self, request):
         member_count = (
             WorkspaceMember.objects.filter(
-                workspace=OuterRef("id"), member__is_bot=False
+                workspace=OuterRef("id"),
+                member__is_bot=False,
+                is_active=True,
             )
.order_by() .order_by()
.annotate(count=Func(F("id"), function="Count")) .annotate(count=Func(F("id"), function="Count"))
@@ -227,23 +220,40 @@ class WorkSpaceAvailabilityCheckEndpoint(BaseAPIView):
         return Response({"status": not workspace}, status=status.HTTP_200_OK)


-class InviteWorkspaceEndpoint(BaseAPIView):
-    """Endpoint for creating, listing and deleting workspaces"""
+class WorkspaceInvitationsViewset(BaseViewSet):
+    serializer_class = WorkSpaceMemberInviteSerializer
+    model = WorkspaceMemberInvite

     permission_classes = [
         WorkSpaceAdminPermission,
     ]

-    def post(self, request, slug):
-        emails = request.data.get("emails", False)
+    def get_queryset(self):
+        return self.filter_queryset(
+            super()
+            .get_queryset()
+            .filter(workspace__slug=self.kwargs.get("slug"))
+            .select_related("workspace", "workspace__owner", "created_by")
+        )
+
+    def create(self, request, slug):
+        emails = request.data.get("emails", [])
         # Check if email is provided
-        if not emails or not len(emails):
+        if not emails:
             return Response(
                 {"error": "Emails are required"}, status=status.HTTP_400_BAD_REQUEST
             )

-        # check for role level
+        # check for role level of the requesting user
         requesting_user = WorkspaceMember.objects.get(
-            workspace__slug=slug, member=request.user
+            workspace__slug=slug,
+            member=request.user,
+            is_active=True,
         )
+
+        # Check if any invited user has a higher role
         if len(
             [
                 email
@@ -256,15 +266,17 @@ class InviteWorkspaceEndpoint(BaseAPIView):
                 status=status.HTTP_400_BAD_REQUEST,
             )

+        # Get the workspace object
         workspace = Workspace.objects.get(slug=slug)

         # Check if user is already a member of workspace
         workspace_members = WorkspaceMember.objects.filter(
             workspace_id=workspace.id,
             member__email__in=[email.get("email") for email in emails],
+            is_active=True,
         ).select_related("member", "workspace", "workspace__owner")

-        if len(workspace_members):
+        if workspace_members:
             return Response(
                 {
                     "error": "Some users are already member of workspace",
@@ -302,35 +314,20 @@ class InviteWorkspaceEndpoint(BaseAPIView):
                 },
                 status=status.HTTP_400_BAD_REQUEST,
             )

-        WorkspaceMemberInvite.objects.bulk_create(
+        # Create workspace member invites
+        workspace_invitations = WorkspaceMemberInvite.objects.bulk_create(
             workspace_invitations, batch_size=10, ignore_conflicts=True
         )

-        workspace_invitations = WorkspaceMemberInvite.objects.filter(
-            email__in=[email.get("email") for email in emails]
-        ).select_related("workspace")
-
-        # create the user if signup is disabled
-        if settings.DOCKERIZED and not settings.ENABLE_SIGNUP:
-            _ = User.objects.bulk_create(
-                [
-                    User(
-                        username=str(uuid4().hex),
-                        email=invitation.email,
-                        password=make_password(uuid4().hex),
-                        is_password_autoset=True,
-                    )
-                    for invitation in workspace_invitations
-                ],
-                batch_size=100,
-            )
+        current_site = f"{request.scheme}://{request.get_host()}"

+        # Send invitations
         for invitation in workspace_invitations:
             workspace_invitation.delay(
                 invitation.email,
                 workspace.id,
                 invitation.token,
-                settings.WEB_URL,
+                current_site,
                 request.user.email,
             )

@@ -341,11 +338,19 @@ class InviteWorkspaceEndpoint(BaseAPIView):
             status=status.HTTP_200_OK,
         )

+    def destroy(self, request, slug, pk):
+        workspace_member_invite = WorkspaceMemberInvite.objects.get(
+            pk=pk, workspace__slug=slug
+        )
+        workspace_member_invite.delete()
+        return Response(status=status.HTTP_204_NO_CONTENT)
-class JoinWorkspaceEndpoint(BaseAPIView):
+class WorkspaceJoinEndpoint(BaseAPIView):
     permission_classes = [
         AllowAny,
     ]
+    """Invitation response endpoint the user can respond to the invitation"""
def post(self, request, slug, pk): def post(self, request, slug, pk):
workspace_invite = WorkspaceMemberInvite.objects.get( workspace_invite = WorkspaceMemberInvite.objects.get(
@ -354,12 +359,14 @@ class JoinWorkspaceEndpoint(BaseAPIView):
email = request.data.get("email", "") email = request.data.get("email", "")
# Check the email
if email == "" or workspace_invite.email != email: if email == "" or workspace_invite.email != email:
return Response( return Response(
{"error": "You do not have permission to join the workspace"}, {"error": "You do not have permission to join the workspace"},
status=status.HTTP_403_FORBIDDEN, status=status.HTTP_403_FORBIDDEN,
) )
# If already responded then return error
if workspace_invite.responded_at is None: if workspace_invite.responded_at is None:
workspace_invite.accepted = request.data.get("accepted", False) workspace_invite.accepted = request.data.get("accepted", False)
workspace_invite.responded_at = timezone.now() workspace_invite.responded_at = timezone.now()
@ -371,12 +378,23 @@ class JoinWorkspaceEndpoint(BaseAPIView):
             # If the user is present then create the workspace member
             if user is not None:
-                WorkspaceMember.objects.create(
-                    workspace=workspace_invite.workspace,
-                    member=user,
-                    role=workspace_invite.role,
-                )
+                # Check if the user was already a member of workspace then activate the user
+                workspace_member = WorkspaceMember.objects.filter(
+                    workspace=workspace_invite.workspace, member=user
+                ).first()
+                if workspace_member is not None:
+                    workspace_member.is_active = True
+                    workspace_member.role = workspace_invite.role
+                    workspace_member.save()
+                else:
+                    # Create a workspace member
+                    _ = WorkspaceMember.objects.create(
+                        workspace=workspace_invite.workspace,
+                        member=user,
+                        role=workspace_invite.role,
+                    )
+
+                # Set the user last_workspace_id to the accepted workspace
                 user.last_workspace_id = workspace_invite.workspace.id
                 user.save()
@ -388,6 +406,7 @@ class JoinWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_200_OK, status=status.HTTP_200_OK,
) )
# Workspace invitation rejected
return Response( return Response(
{"message": "Workspace Invitation was not accepted"}, {"message": "Workspace Invitation was not accepted"},
status=status.HTTP_200_OK, status=status.HTTP_200_OK,
@ -398,37 +417,13 @@ class JoinWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST, status=status.HTTP_400_BAD_REQUEST,
) )
+    def get(self, request, slug, pk):
+        workspace_invitation = WorkspaceMemberInvite.objects.get(workspace__slug=slug, pk=pk)
+        serializer = WorkSpaceMemberInviteSerializer(workspace_invitation)
+        return Response(serializer.data, status=status.HTTP_200_OK)

-class WorkspaceInvitationsViewset(BaseViewSet):
-    serializer_class = WorkSpaceMemberInviteSerializer
-    model = WorkspaceMemberInvite
-
-    permission_classes = [
-        WorkSpaceAdminPermission,
-    ]
-
-    def get_queryset(self):
-        return self.filter_queryset(
-            super()
-            .get_queryset()
-            .filter(workspace__slug=self.kwargs.get("slug"))
-            .select_related("workspace", "workspace__owner", "created_by")
-        )
-
-    def destroy(self, request, slug, pk):
-        workspace_member_invite = WorkspaceMemberInvite.objects.get(
-            pk=pk, workspace__slug=slug
-        )
-        # delete the user if signup is disabled
-        if settings.DOCKERIZED and not settings.ENABLE_SIGNUP:
-            user = User.objects.filter(email=workspace_member_invite.email).first()
-            if user is not None:
-                user.delete()
-        workspace_member_invite.delete()
-        return Response(status=status.HTTP_204_NO_CONTENT)


-class UserWorkspaceInvitationsEndpoint(BaseViewSet):
+class UserWorkspaceInvitationsViewSet(BaseViewSet):
     serializer_class = WorkSpaceMemberInviteSerializer
     model = WorkspaceMemberInvite
@ -442,9 +437,19 @@ class UserWorkspaceInvitationsEndpoint(BaseViewSet):
) )
def create(self, request): def create(self, request):
-        invitations = request.data.get("invitations")
-        workspace_invitations = WorkspaceMemberInvite.objects.filter(pk__in=invitations)
+        invitations = request.data.get("invitations", [])
+        workspace_invitations = WorkspaceMemberInvite.objects.filter(
+            pk__in=invitations, email=request.user.email
+        ).order_by("-created_at")
# If the user is already a member of workspace and was deactivated then activate the user
for invitation in workspace_invitations:
# Update the WorkspaceMember for this specific invitation
WorkspaceMember.objects.filter(
workspace_id=invitation.workspace_id, member=request.user
).update(is_active=True, role=invitation.role)
# Bulk create the user for all the workspaces
WorkspaceMember.objects.bulk_create( WorkspaceMember.objects.bulk_create(
[ [
WorkspaceMember( WorkspaceMember(
@ -481,20 +486,24 @@ class WorkSpaceMemberViewSet(BaseViewSet):
return self.filter_queryset( return self.filter_queryset(
super() super()
.get_queryset() .get_queryset()
-            .filter(workspace__slug=self.kwargs.get("slug"), member__is_bot=False)
+            .filter(
+                workspace__slug=self.kwargs.get("slug"),
+                member__is_bot=False,
+                is_active=True,
+            )
             .select_related("workspace", "workspace__owner")
             .select_related("member")
         )

     def list(self, request, slug):
         workspace_member = WorkspaceMember.objects.get(
-            member=request.user, workspace__slug=slug
+            member=request.user,
+            workspace__slug=slug,
+            is_active=True,
         )

-        workspace_members = WorkspaceMember.objects.filter(
-            workspace__slug=slug,
-            member__is_bot=False,
-        ).select_related("workspace", "member")
+        # Get all active workspace members
+        workspace_members = self.get_queryset()
if workspace_member.role > 10: if workspace_member.role > 10:
serializer = WorkspaceMemberAdminSerializer(workspace_members, many=True) serializer = WorkspaceMemberAdminSerializer(workspace_members, many=True)
@ -506,7 +515,12 @@ class WorkSpaceMemberViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK) return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, pk): def partial_update(self, request, slug, pk):
-        workspace_member = WorkspaceMember.objects.get(pk=pk, workspace__slug=slug)
+        workspace_member = WorkspaceMember.objects.get(
+            pk=pk,
+            workspace__slug=slug,
+            member__is_bot=False,
+            is_active=True,
+        )
if request.user.id == workspace_member.member_id: if request.user.id == workspace_member.member_id:
return Response( return Response(
{"error": "You cannot update your own role"}, {"error": "You cannot update your own role"},
@ -515,7 +529,9 @@ class WorkSpaceMemberViewSet(BaseViewSet):
# Get the requested user role # Get the requested user role
         requested_workspace_member = WorkspaceMember.objects.get(
-            workspace__slug=slug, member=request.user
+            workspace__slug=slug,
+            member=request.user,
+            is_active=True,
         )
# Check if role is being updated # Check if role is being updated
# One cannot update role higher than his own role # One cannot update role higher than his own role
@@ -540,68 +556,121 @@ class WorkSpaceMemberViewSet(BaseViewSet):
     def destroy(self, request, slug, pk):
         # Check the user role who is deleting the user
-        workspace_member = WorkspaceMember.objects.get(workspace__slug=slug, pk=pk)
+        workspace_member = WorkspaceMember.objects.get(
+            workspace__slug=slug,
+            pk=pk,
+            member__is_bot=False,
+            is_active=True,
+        )

         # check requesting user role
         requesting_workspace_member = WorkspaceMember.objects.get(
-            workspace__slug=slug, member=request.user
+            workspace__slug=slug,
+            member=request.user,
+            is_active=True,
         )
+
+        if str(workspace_member.id) == str(requesting_workspace_member.id):
+            return Response(
+                {
+                    "error": "You cannot remove yourself from the workspace. Please use leave workspace"
+                },
+                status=status.HTTP_400_BAD_REQUEST,
+            )
+
         if requesting_workspace_member.role < workspace_member.role:
             return Response(
                 {"error": "You cannot remove a user having role higher than you"},
                 status=status.HTTP_400_BAD_REQUEST,
             )

-        # Check for the only member in the workspace
         if (
-            workspace_member.role == 20
-            and WorkspaceMember.objects.filter(
-                workspace__slug=slug,
-                role=20,
-                member__is_bot=False,
-            ).count()
-            == 1
+            Project.objects.annotate(
+                total_members=Count("project_projectmember"),
+                member_with_role=Count(
+                    "project_projectmember",
+                    filter=Q(
+                        project_projectmember__member_id=request.user.id,
+                        project_projectmember__role=20,
+                    ),
+                ),
+            )
+            .filter(total_members=1, member_with_role=1, workspace__slug=slug)
+            .exists()
         ):
             return Response(
-                {"error": "Cannot delete the only Admin for the workspace"},
+                {
+                    "error": "User is part of some projects where they are the only admin; you should leave those projects first"
+                },
                 status=status.HTTP_400_BAD_REQUEST,
             )

-        # Delete the user also from all the projects
-        ProjectMember.objects.filter(
-            workspace__slug=slug, member=workspace_member.member
-        ).delete()
-        # Remove all favorites
-        ProjectFavorite.objects.filter(
-            workspace__slug=slug, user=workspace_member.member
-        ).delete()
-        CycleFavorite.objects.filter(
-            workspace__slug=slug, user=workspace_member.member
-        ).delete()
-        ModuleFavorite.objects.filter(
-            workspace__slug=slug, user=workspace_member.member
-        ).delete()
-        PageFavorite.objects.filter(
-            workspace__slug=slug, user=workspace_member.member
-        ).delete()
-        IssueViewFavorite.objects.filter(
-            workspace__slug=slug, user=workspace_member.member
-        ).delete()
-        # Also remove issue from issue assigned
-        IssueAssignee.objects.filter(
-            workspace__slug=slug, assignee=workspace_member.member
-        ).delete()
-
-        # Remove if module member
-        ModuleMember.objects.filter(
-            workspace__slug=slug, member=workspace_member.member
-        ).delete()
-        # Delete owned Pages
-        Page.objects.filter(
-            workspace__slug=slug, owned_by=workspace_member.member
-        ).delete()
-
-        workspace_member.delete()
-
+        # Deactivate the user in the projects where the user is a member
+        _ = ProjectMember.objects.filter(
+            workspace__slug=slug,
+            member_id=workspace_member.member_id,
+            is_active=True,
+        ).update(is_active=False)
+
+        workspace_member.is_active = False
+        workspace_member.save()
+        return Response(status=status.HTTP_204_NO_CONTENT)
+
+    def leave(self, request, slug):
+        workspace_member = WorkspaceMember.objects.get(
+            workspace__slug=slug,
+            member=request.user,
+            is_active=True,
+        )
+
+        # Check if the leaving user is the only admin of the workspace
+        if (
+            workspace_member.role == 20
+            and not WorkspaceMember.objects.filter(
+                workspace__slug=slug,
+                role=20,
+                is_active=True,
+            ).count()
+            > 1
+        ):
+            return Response(
+                {
+                    "error": "You cannot leave the workspace as you are the only admin of the workspace; you will have to either delete the workspace or create another admin"
+                },
+                status=status.HTTP_400_BAD_REQUEST,
+            )
+
+        if (
+            Project.objects.annotate(
+                total_members=Count("project_projectmember"),
+                member_with_role=Count(
+                    "project_projectmember",
+                    filter=Q(
+                        project_projectmember__member_id=request.user.id,
+                        project_projectmember__role=20,
+                    ),
+                ),
+            )
+            .filter(total_members=1, member_with_role=1, workspace__slug=slug)
+            .exists()
+        ):
+            return Response(
+                {
+                    "error": "User is part of some projects where they are the only admin; you should leave those projects first"
+                },
+                status=status.HTTP_400_BAD_REQUEST,
+            )
+
+        # Deactivate the user in the projects where the user is a member
+        _ = ProjectMember.objects.filter(
+            workspace__slug=slug,
+            member_id=workspace_member.member_id,
+            is_active=True,
+        ).update(is_active=False)
+
+        # Deactivate the user
+        workspace_member.is_active = False
+        workspace_member.save()
         return Response(status=status.HTTP_204_NO_CONTENT)
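Both destroy and leave above guard on the same "sole project admin" condition. A standalone sketch of that ORM pattern, assuming the module-level imports already present in this file (Project, Count, Q):

from django.db.models import Count, Q
from plane.db.models import Project

def is_sole_admin_of_some_project(user_id, slug):
    # Annotate every project in the workspace with its total membership and with the
    # number of memberships where this user holds the admin role (20); a project where
    # both counts are 1 has this user as its only member and only admin.
    return (
        Project.objects.annotate(
            total_members=Count("project_projectmember"),
            member_with_role=Count(
                "project_projectmember",
                filter=Q(
                    project_projectmember__member_id=user_id,
                    project_projectmember__role=20,
                ),
            ),
        )
        .filter(total_members=1, member_with_role=1, workspace__slug=slug)
        .exists()
    )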
@ -629,7 +698,9 @@ class TeamMemberViewSet(BaseViewSet):
def create(self, request, slug): def create(self, request, slug):
members = list( members = list(
             WorkspaceMember.objects.filter(
-                workspace__slug=slug, member__id__in=request.data.get("members", [])
+                workspace__slug=slug,
+                member__id__in=request.data.get("members", []),
+                is_active=True,
             )
.annotate(member_str_id=Cast("member", output_field=CharField())) .annotate(member_str_id=Cast("member", output_field=CharField()))
.distinct() .distinct()
@ -658,23 +729,6 @@ class TeamMemberViewSet(BaseViewSet):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST) return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class UserWorkspaceInvitationEndpoint(BaseViewSet):
model = WorkspaceMemberInvite
serializer_class = WorkSpaceMemberInviteSerializer
permission_classes = [
AllowAny,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(pk=self.kwargs.get("pk"))
.select_related("workspace")
)
class UserLastProjectWithWorkspaceEndpoint(BaseAPIView): class UserLastProjectWithWorkspaceEndpoint(BaseAPIView):
def get(self, request): def get(self, request):
user = User.objects.get(pk=request.user.id) user = User.objects.get(pk=request.user.id)
@ -711,7 +765,9 @@ class UserLastProjectWithWorkspaceEndpoint(BaseAPIView):
class WorkspaceMemberUserEndpoint(BaseAPIView): class WorkspaceMemberUserEndpoint(BaseAPIView):
def get(self, request, slug): def get(self, request, slug):
         workspace_member = WorkspaceMember.objects.get(
-            member=request.user, workspace__slug=slug
+            member=request.user,
+            workspace__slug=slug,
+            is_active=True,
         )
serializer = WorkspaceMemberMeSerializer(workspace_member) serializer = WorkspaceMemberMeSerializer(workspace_member)
return Response(serializer.data, status=status.HTTP_200_OK) return Response(serializer.data, status=status.HTTP_200_OK)
@ -720,7 +776,9 @@ class WorkspaceMemberUserEndpoint(BaseAPIView):
class WorkspaceMemberUserViewsEndpoint(BaseAPIView): class WorkspaceMemberUserViewsEndpoint(BaseAPIView):
def post(self, request, slug): def post(self, request, slug):
         workspace_member = WorkspaceMember.objects.get(
-            workspace__slug=slug, member=request.user
+            workspace__slug=slug,
+            member=request.user,
+            is_active=True,
         )
workspace_member.view_props = request.data.get("view_props", {}) workspace_member.view_props = request.data.get("view_props", {})
workspace_member.save() workspace_member.save()
@ -1046,7 +1104,9 @@ class WorkspaceUserProfileEndpoint(BaseAPIView):
user_data = User.objects.get(pk=user_id) user_data = User.objects.get(pk=user_id)
         requesting_workspace_member = WorkspaceMember.objects.get(
-            workspace__slug=slug, member=request.user
+            workspace__slug=slug,
+            member=request.user,
+            is_active=True,
         )
projects = [] projects = []
if requesting_workspace_member.role >= 10: if requesting_workspace_member.role >= 10:
@ -1250,9 +1310,7 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
status=status.HTTP_200_OK, status=status.HTTP_200_OK,
) )
-        return Response(
-            issues, status=status.HTTP_200_OK
-        )
+        return Response(issues, status=status.HTTP_200_OK)
class WorkspaceLabelsEndpoint(BaseAPIView): class WorkspaceLabelsEndpoint(BaseAPIView):
@ -1266,30 +1324,3 @@ class WorkspaceLabelsEndpoint(BaseAPIView):
project__project_projectmember__member=request.user, project__project_projectmember__member=request.user,
).values("parent", "name", "color", "id", "project_id", "workspace__slug") ).values("parent", "name", "color", "id", "project_id", "workspace__slug")
return Response(labels, status=status.HTTP_200_OK) return Response(labels, status=status.HTTP_200_OK)
class LeaveWorkspaceEndpoint(BaseAPIView):
permission_classes = [
WorkspaceEntityPermission,
]
def delete(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
)
# Only Admin case
if (
workspace_member.role == 20
and WorkspaceMember.objects.filter(workspace__slug=slug, role=20).count()
== 1
):
return Response(
{
"error": "You cannot leave the workspace since you are the only admin of the workspace you should delete the workspace"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the member from workspace
workspace_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@ -0,0 +1,47 @@
# Django imports
from django.utils import timezone
from django.db.models import Q
# Third party imports
from rest_framework import authentication
from rest_framework.exceptions import AuthenticationFailed
# Module imports
from plane.db.models import APIToken
class APIKeyAuthentication(authentication.BaseAuthentication):
"""
Authentication with an API Key
"""
www_authenticate_realm = "api"
media_type = "application/json"
auth_header_name = "X-Api-Key"
def get_api_token(self, request):
return request.headers.get(self.auth_header_name)
def validate_api_token(self, token):
try:
api_token = APIToken.objects.get(
Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
token=token,
is_active=True,
)
except APIToken.DoesNotExist:
raise AuthenticationFailed("Given API token is not valid")
# save api token last used
api_token.last_used = timezone.now()
api_token.save(update_fields=["last_used"])
return (api_token.user, api_token.token)
def authenticate(self, request):
token = self.get_api_token(request=request)
if not token:
return None
# Validate the API token
user, token = self.validate_api_token(token)
return user, token
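This class is a standard DRF authentication backend; one way it would typically be wired in is through the REST_FRAMEWORK settings. The dotted path below is an assumption, since the settings change is not part of this hunk:

REST_FRAMEWORK = {
    "DEFAULT_AUTHENTICATION_CLASSES": (
        "rest_framework.authentication.SessionAuthentication",
        "plane.api.authentication.APIKeyAuthentication",  # assumed import path
    ),
}

With that in place, a request carrying a valid, unexpired key in the X-Api-Key header resolves to the token's user, and last_used is stamped on every successful call.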

View File

@ -0,0 +1,5 @@
from django.apps import AppConfig
class ApiConfig(AppConfig):
name = "plane.authentication"

View File

@ -73,6 +73,12 @@ def service_importer(service, importer_id):
] ]
) )
# Check if any of the users are already member of workspace
_ = WorkspaceMember.objects.filter(
member__in=[user for user in workspace_users],
workspace_id=importer.workspace_id,
).update(is_active=True)
# Add new users to Workspace and project automatically # Add new users to Workspace and project automatically
WorkspaceMember.objects.bulk_create( WorkspaceMember.objects.bulk_create(
[ [

View File

@ -12,7 +12,7 @@ from celery import shared_task
from sentry_sdk import capture_exception from sentry_sdk import capture_exception
# Module imports # Module imports
from plane.db.models import Issue, Project, State from plane.db.models import Issue, Project, State, Page
from plane.bgtasks.issue_activites_task import issue_activity from plane.bgtasks.issue_activites_task import issue_activity
@ -20,6 +20,7 @@ from plane.bgtasks.issue_activites_task import issue_activity
def archive_and_close_old_issues(): def archive_and_close_old_issues():
archive_old_issues() archive_old_issues()
close_old_issues() close_old_issues()
delete_archived_pages()
def archive_old_issues(): def archive_old_issues():
@ -80,7 +81,7 @@ def archive_old_issues():
project_id=project_id, project_id=project_id,
current_instance=json.dumps({"archived_at": None}), current_instance=json.dumps({"archived_at": None}),
subscriber=False, subscriber=False,
-                    epoch=int(timezone.now().timestamp())
+                    epoch=int(timezone.now().timestamp()),
) )
for issue in issues_to_update for issue in issues_to_update
] ]
@ -142,17 +143,21 @@ def close_old_issues():
# Bulk Update the issues and log the activity # Bulk Update the issues and log the activity
if issues_to_update: if issues_to_update:
-            Issue.objects.bulk_update(issues_to_update, ["state"], batch_size=100)
+            Issue.objects.bulk_update(
+                issues_to_update, ["state"], batch_size=100
+            )
             [
                 issue_activity.delay(
                     type="issue.activity.updated",
-                    requested_data=json.dumps({"closed_to": str(issue.state_id)}),
+                    requested_data=json.dumps(
+                        {"closed_to": str(issue.state_id)}
+                    ),
                     actor_id=str(project.created_by_id),
                     issue_id=issue.id,
                     project_id=project_id,
                     current_instance=None,
                     subscriber=False,
-                    epoch=int(timezone.now().timestamp())
+                    epoch=int(timezone.now().timestamp()),
) )
for issue in issues_to_update for issue in issues_to_update
] ]
@ -162,3 +167,20 @@ def close_old_issues():
print(e) print(e)
capture_exception(e) capture_exception(e)
return return
def delete_archived_pages():
try:
pages_to_delete = Page.objects.filter(
archived_at__isnull=False,
archived_at__lte=(timezone.now() - timedelta(days=30)),
)
pages_to_delete._raw_delete(pages_to_delete.db)
return
except Exception as e:
if settings.DEBUG:
print(e)
capture_exception(e)
return

View File

@ -13,23 +13,24 @@ from plane.db.models import Project, User, ProjectMemberInvite
@shared_task @shared_task
-def project_invitation(email, project_id, token, current_site):
+def project_invitation(email, project_id, token, current_site, invitor):
     try:
+        user = User.objects.get(email=invitor)
         project = Project.objects.get(pk=project_id)
         project_member_invite = ProjectMemberInvite.objects.get(
             token=token, email=email
         )

-        relativelink = f"/project-member-invitation/{project_member_invite.id}"
+        relativelink = f"/project-invitations/?invitation_id={project_member_invite.id}&email={email}&slug={project.workspace.slug}&project_id={str(project_id)}"
         abs_url = current_site + relativelink

         from_email_string = settings.EMAIL_FROM

-        subject = f"{project.created_by.first_name or project.created_by.email} invited you to join {project.name} on Plane"
+        subject = f"{user.first_name or user.display_name or user.email} invited you to join {project.name} on Plane"

         context = {
             "email": email,
-            "first_name": project.created_by.first_name,
+            "first_name": user.first_name,
             "project_name": project.name,
             "invitation_url": abs_url,
         }

View File

@ -0,0 +1,139 @@
import requests
import uuid
import hashlib
import json

# Django imports
from django.conf import settings

# Third party imports
from celery import shared_task
from sentry_sdk import capture_exception

from plane.db.models import Webhook, WebhookLog


@shared_task(
    bind=True,
    autoretry_for=(requests.RequestException,),
    retry_backoff=600,
    max_retries=5,
    retry_jitter=True,
)
def webhook_task(self, webhook, slug, event, event_data, action):
    try:
        webhook = Webhook.objects.get(id=webhook, workspace__slug=slug)

        headers = {
            "Content-Type": "application/json",
            "User-Agent": "Autopilot",
            "X-Plane-Delivery": str(uuid.uuid4()),
            "X-Plane-Event": event,
        }

        # Your secret key
        if webhook.secret_key:
            # Concatenate the data and the secret key
            message = event_data + webhook.secret_key

            # Create a SHA-256 hash of the message
            sha256 = hashlib.sha256()
            sha256.update(message.encode("utf-8"))
            signature = sha256.hexdigest()
            headers["X-Plane-Signature"] = signature

        event_data = json.loads(event_data) if event_data is not None else None

        action = {
            "POST": "create",
            "PATCH": "update",
            "PUT": "update",
            "DELETE": "delete",
        }.get(action, action)

        payload = {
            "event": event,
            "action": action,
            "webhook_id": str(webhook.id),
            "workspace_id": str(webhook.workspace_id),
            "data": event_data,
        }

        # Send the webhook event
        response = requests.post(
            webhook.url,
            headers=headers,
            json=payload,
            timeout=30,
        )

        # Log the webhook request
        WebhookLog.objects.create(
            workspace_id=str(webhook.workspace_id),
            webhook_id=str(webhook.id),
            event_type=str(event),
            request_method=str(action),
            request_headers=str(headers),
            request_body=str(payload),
            response_status=str(response.status_code),
            response_headers=str(response.headers),
            response_body=str(response.text),
            retry_count=str(self.request.retries),
        )

    except requests.RequestException as e:
        # Log the failed webhook request
        WebhookLog.objects.create(
            workspace_id=str(webhook.workspace_id),
            webhook_id=str(webhook.id),
            event_type=str(event),
            request_method=str(action),
            request_headers=str(headers),
            request_body=str(payload),
            response_status=500,
            response_headers="",
            response_body=str(e),
            retry_count=str(self.request.retries),
        )

        # Retry logic
        if self.request.retries >= self.max_retries:
            Webhook.objects.filter(pk=webhook.id).update(is_active=False)
            return
        raise requests.RequestException()

    except Exception as e:
        if settings.DEBUG:
            print(e)
        capture_exception(e)
        return


@shared_task()
def send_webhook(event, event_data, action, slug):
    try:
        webhooks = Webhook.objects.filter(workspace__slug=slug, is_active=True)

        if event == "project":
            webhooks = webhooks.filter(project=True)

        if event == "issue":
            webhooks = webhooks.filter(issue=True)

        if event == "module":
            webhooks = webhooks.filter(module=True)

        if event == "cycle":
            webhooks = webhooks.filter(cycle=True)

        if event == "issue-comment":
            webhooks = webhooks.filter(issue_comment=True)

        for webhook in webhooks:
            webhook_task.delay(webhook.id, slug, event, event_data, action)

    except Exception as e:
        if settings.DEBUG:
            print(e)
        capture_exception(e)
        return
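
The X-Plane-Signature header above is a plain SHA-256 digest of the serialized event data concatenated with the webhook's secret key (not an HMAC), and it is computed over event_data before that string is wrapped in the event/action envelope. A minimal sketch of the same scheme in isolation, to make explicit what a consumer would have to reproduce (the function name and sample values are illustrative, not part of the changeset):

import hashlib
import json

def plane_style_signature(event_data: str, secret_key: str) -> str:
    # Same scheme as webhook_task: sha256(event_data + secret_key), hex-encoded.
    return hashlib.sha256((event_data + secret_key).encode("utf-8")).hexdigest()

# Made-up example: the sender signs the serialized data string...
event_data = json.dumps({"id": "123", "name": "My issue"})
secret = "super-secret-key"
signature = plane_style_signature(event_data, secret)

# ...so a receiver can only verify the header if it can re-serialize the
# "data" field of the delivered payload into exactly the same string.
payload = {"event": "issue", "action": "update", "data": json.loads(event_data)}
assert plane_style_signature(json.dumps(payload["data"]), secret) == signature

send_webhook is what the API layer would enqueue, e.g. send_webhook.delay("issue", serialized_issue, "PATCH", workspace_slug); that call site is an assumption for illustration and is not part of this diff.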


@ -11,25 +11,33 @@ from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError
# Module imports
-from plane.db.models import Workspace, WorkspaceMemberInvite
+from plane.db.models import User, Workspace, WorkspaceMemberInvite
@shared_task
def workspace_invitation(email, workspace_id, token, current_site, invitor):
    try:
+       user = User.objects.get(email=invitor)
        workspace = Workspace.objects.get(pk=workspace_id)
        workspace_member_invite = WorkspaceMemberInvite.objects.get(
            token=token, email=email
        )
-       realtivelink = (
-           f"/workspace-member-invitation/?invitation_id={workspace_member_invite.id}&email={email}"
+       # Relative link
+       relative_link = (
+           f"/workspace-invitations/?invitation_id={workspace_member_invite.id}&email={email}&slug={workspace.slug}"
        )
-       abs_url = current_site + realtivelink
+       # The complete url including the domain
+       abs_url = current_site + relative_link
+       # The email from
        from_email_string = settings.EMAIL_FROM
-       subject = f"{invitor or email} invited you to join {workspace.name} on Plane"
+       # Subject of the email
+       subject = f"{user.first_name or user.display_name or user.email} invited you to join {workspace.name} on Plane"
        context = {
            "email": email,

@ -3,7 +3,7 @@
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
-import plane.db.models.api_token
+import plane.db.models.api
import uuid
@ -40,8 +40,8 @@ class Migration(migrations.Migration):
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
-('token', models.CharField(default=plane.db.models.api_token.generate_token, max_length=255, unique=True)),
-('label', models.CharField(default=plane.db.models.api_token.generate_label_token, max_length=255)),
+('token', models.CharField(default=plane.db.models.api.generate_token, max_length=255, unique=True)),
+('label', models.CharField(default=plane.db.models.api.generate_label_token, max_length=255)),
('user_type', models.PositiveSmallIntegerField(choices=[(0, 'Human'), (1, 'Bot')], default=0)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='apitoken_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='apitoken_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),


@ -1,41 +0,0 @@
# Generated by Django 4.2.5 on 2023-10-18 12:04
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.issue
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0045_issueactivity_epoch_workspacemember_issue_props_and_more'),
]
operations = [
migrations.CreateModel(
name="issue_mentions",
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4,editable=False, primary_key=True, serialize=False, unique=True)),
('mention', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to=settings.AUTH_USER_MODEL)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL,related_name='issuemention_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('issue', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to='db.issue')),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_issuemention', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='issuemention_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_issuemention', to='db.workspace')),
],
options={
'verbose_name': 'IssueMention',
'verbose_name_plural': 'IssueMentions',
'db_table': 'issue_mentions',
'ordering': ('-created_at',),
},
),
migrations.AlterField(
model_name='issueproperty',
name='properties',
field=models.JSONField(default=plane.db.models.issue.get_default_properties),
),
]


@ -0,0 +1,984 @@
# Generated by Django 4.2.5 on 2023-11-15 09:47
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.issue
import uuid
import random
def random_sort_ordering(apps, schema_editor):
Label = apps.get_model("db", "Label")
bulk_labels = []
for label in Label.objects.all():
label.sort_order = random.randint(0,65535)
bulk_labels.append(label)
Label.objects.bulk_update(bulk_labels, ["sort_order"], batch_size=1000)
class Migration(migrations.Migration):
dependencies = [
('db', '0045_issueactivity_epoch_workspacemember_issue_props_and_more'),
]
operations = [
migrations.AddField(
model_name='label',
name='sort_order',
field=models.FloatField(default=65535),
),
migrations.AlterField(
model_name='analyticview',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='analyticview',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='apitoken',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='apitoken',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycle',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cycle',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cycle',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycle',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='cycleissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cycleissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cycleissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycleissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='estimate',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='estimate',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='estimate',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='estimate',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='estimatepoint',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='estimatepoint',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='estimatepoint',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='estimatepoint',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='fileasset',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='fileasset',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubissuesync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubissuesync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubissuesync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubissuesync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubrepository',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubrepository',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubrepository',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubrepository',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='importer',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='importer',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='importer',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='importer',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='inbox',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='inbox',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='inbox',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='inbox',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='inboxissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='inboxissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='inboxissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='inboxissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='integration',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='integration',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueactivity',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueactivity',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueactivity',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueactivity',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueassignee',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueassignee',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueassignee',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueassignee',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueattachment',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueattachment',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueattachment',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueattachment',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueblocker',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueblocker',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueblocker',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueblocker',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuecomment',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuecomment',
name='issue',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_comments', to='db.issue'),
),
migrations.AlterField(
model_name='issuecomment',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuecomment',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuecomment',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuelabel',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuelabel',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuelabel',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuelabel',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuelink',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuelink',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuelink',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuelink',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueproperty',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueproperty',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueproperty',
name='properties',
field=models.JSONField(default=plane.db.models.issue.get_default_properties),
),
migrations.AlterField(
model_name='issueproperty',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueproperty',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuesequence',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuesequence',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuesequence',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuesequence',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueview',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueview',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueview',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueview',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='label',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='label',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='label',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='label',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='module',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='module',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='module',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='module',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='moduleissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='moduleissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='moduleissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='moduleissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulelink',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulelink',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulelink',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulelink',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulemember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulemember',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulemember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulemember',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='page',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='page',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='page',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='page',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pageblock',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pageblock',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pageblock',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pageblock',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pagefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pagefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pagefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pagefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pagelabel',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pagelabel',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pagelabel',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pagelabel',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='project',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='project',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectfavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='projectidentifier',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectidentifier',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectmember',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectmember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmember',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='socialloginconnection',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='socialloginconnection',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='state',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='state',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='state',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='state',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='team',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='team',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='teammember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='teammember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspace',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspace',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspaceintegration',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspaceintegration',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacemember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacemember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacememberinvite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacememberinvite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacetheme',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacetheme',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.CreateModel(
name='IssueMention',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('issue', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to='db.issue')),
('mention', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to=settings.AUTH_USER_MODEL)),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace')),
],
options={
'verbose_name': 'Issue Mention',
'verbose_name_plural': 'Issue Mentions',
'db_table': 'issue_mentions',
'ordering': ('-created_at',),
'unique_together': {('issue', 'mention')},
},
),
migrations.RunPython(random_sort_ordering),
]
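
One detail worth noting: migrations.RunPython(random_sort_ordering) is passed without a reverse function, so this squashed migration cannot be unapplied past the data migration. If reversibility mattered, the usual Django pattern is a no-op reverse, shown here only as a sketch and not as part of this changeset:

operations = [
    # ... field alterations as above ...
    # Hypothetical variant: allow the data migration to be reversed as a no-op.
    migrations.RunPython(random_sort_ordering, migrations.RunPython.noop),
]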


@ -0,0 +1,131 @@
# Generated by Django 4.2.5 on 2023-11-15 11:20
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.api
import plane.db.models.webhook
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0046_label_sort_order_alter_analyticview_created_by_and_more'),
]
operations = [
migrations.CreateModel(
name='Webhook',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('url', models.URLField(validators=[plane.db.models.webhook.validate_schema, plane.db.models.webhook.validate_domain])),
('is_active', models.BooleanField(default=True)),
('secret_key', models.CharField(default=plane.db.models.webhook.generate_token, max_length=255)),
('project', models.BooleanField(default=False)),
('issue', models.BooleanField(default=False)),
('module', models.BooleanField(default=False)),
('cycle', models.BooleanField(default=False)),
('issue_comment', models.BooleanField(default=False)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_webhooks', to='db.workspace')),
],
options={
'verbose_name': 'Webhook',
'verbose_name_plural': 'Webhooks',
'db_table': 'webhooks',
'ordering': ('-created_at',),
'unique_together': {('workspace', 'url')},
},
),
migrations.AddField(
model_name='apitoken',
name='description',
field=models.TextField(blank=True),
),
migrations.AddField(
model_name='apitoken',
name='expired_at',
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name='apitoken',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AddField(
model_name='apitoken',
name='last_used',
field=models.DateTimeField(null=True),
),
migrations.AddField(
model_name='projectmember',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AddField(
model_name='workspacemember',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AlterField(
model_name='apitoken',
name='token',
field=models.CharField(db_index=True, default=plane.db.models.api.generate_token, max_length=255, unique=True),
),
migrations.CreateModel(
name='WebhookLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('event_type', models.CharField(blank=True, max_length=255, null=True)),
('request_method', models.CharField(blank=True, max_length=10, null=True)),
('request_headers', models.TextField(blank=True, null=True)),
('request_body', models.TextField(blank=True, null=True)),
('response_status', models.TextField(blank=True, null=True)),
('response_headers', models.TextField(blank=True, null=True)),
('response_body', models.TextField(blank=True, null=True)),
('retry_count', models.PositiveSmallIntegerField(default=0)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('webhook', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='logs', to='db.webhook')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='webhook_logs', to='db.workspace')),
],
options={
'verbose_name': 'Webhook Log',
'verbose_name_plural': 'Webhook Logs',
'db_table': 'webhook_logs',
'ordering': ('-created_at',),
},
),
migrations.CreateModel(
name='APIActivityLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('token_identifier', models.CharField(max_length=255)),
('path', models.CharField(max_length=255)),
('method', models.CharField(max_length=10)),
('query_params', models.TextField(blank=True, null=True)),
('headers', models.TextField(blank=True, null=True)),
('body', models.TextField(blank=True, null=True)),
('response_code', models.PositiveIntegerField()),
('response_body', models.TextField(blank=True, null=True)),
('ip_address', models.GenericIPAddressField(blank=True, null=True)),
('user_agent', models.CharField(blank=True, max_length=512, null=True)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
],
options={
'verbose_name': 'API Activity Log',
'verbose_name_plural': 'API Activity Logs',
'db_table': 'api_activity_logs',
'ordering': ('-created_at',),
},
),
]

View File

@ -0,0 +1,54 @@
# Generated by Django 4.2.5 on 2023-11-13 12:53
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0047_webhook_apitoken_description_apitoken_expired_at_and_more'),
]
operations = [
migrations.CreateModel(
name='PageLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('transaction', models.UUIDField(default=uuid.uuid4)),
('entity_identifier', models.UUIDField(null=True)),
('entity_name', models.CharField(choices=[('to_do', 'To Do'), ('issue', 'issue'), ('image', 'Image'), ('video', 'Video'), ('file', 'File'), ('link', 'Link'), ('cycle', 'Cycle'), ('module', 'Module'), ('back_link', 'Back Link'), ('forward_link', 'Forward Link'), ('mention', 'Mention')], max_length=30, verbose_name='Transaction Type')),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('page', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='page_log', to='db.page')),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace')),
],
options={
'verbose_name': 'Page Log',
'verbose_name_plural': 'Page Logs',
'db_table': 'page_logs',
'ordering': ('-created_at',),
'unique_together': {('page', 'transaction')}
},
),
migrations.AddField(
model_name='page',
name='archived_at',
field=models.DateField(null=True),
),
migrations.AddField(
model_name='page',
name='is_locked',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='page',
name='parent',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='child_page', to='db.page'),
),
]

View File

@ -0,0 +1,72 @@
# Generated by Django 4.2.5 on 2023-11-15 09:16
# Python imports
import uuid
from django.db import migrations
def update_pages(apps, schema_editor):
try:
Page = apps.get_model("db", "Page")
PageBlock = apps.get_model("db", "PageBlock")
PageLog = apps.get_model("db", "PageLog")
updated_pages = []
page_logs = []
# looping through all the pages
for page in Page.objects.all():
page_blocks = PageBlock.objects.filter(
page_id=page.id, project_id=page.project_id, workspace_id=page.workspace_id
).order_by("sort_order")
if page_blocks:
# looping through all the page blocks in a page
for page_block in page_blocks:
if page_block.issue is not None:
project_identifier = page.project.identifier
sequence_id = page_block.issue.sequence_id
transaction = uuid.uuid4().hex
embed_component = f'<issue-embed-component id="{transaction}" entity_name="issue" entity_identifier="{page_block.issue_id}" sequence_id="{sequence_id}" project_identifier="{project_identifier}" title="{page_block.name}"></issue-embed-component>'
page.description_html += embed_component
# create the page transaction for the issue
page_logs.append(
PageLog(
page_id=page_block.page_id,
transaction=transaction,
entity_identifier=page_block.issue_id,
entity_name="issue",
project_id=page.project_id,
workspace_id=page.workspace_id,
created_by_id=page_block.created_by_id,
updated_by_id=page_block.updated_by_id,
)
)
else:
# adding the page block name and description to the page description
page.description_html += f"<h2>{page_block.name}</h2>"
page.description_html += page_block.description_html
updated_pages.append(page)
Page.objects.bulk_update(
updated_pages,
["description_html"],
batch_size=100,
)
PageLog.objects.bulk_create(page_logs, batch_size=100)
except Exception as e:
print(e)
class Migration(migrations.Migration):
dependencies = [
("db", "0048_auto_20231116_0713"),
]
operations = [
migrations.RunPython(update_pages),
]

View File

@ -54,7 +54,7 @@ from .view import GlobalView, IssueView, IssueViewFavorite
from .module import Module, ModuleMember, ModuleIssue, ModuleLink, ModuleFavorite
from .api_token import APIToken
from .api import APIToken, APIActivityLog
from .integration import (
    WorkspaceIntegration,
@ -68,7 +68,7 @@ from .integration import (
from .importer import Importer
from .page import Page, PageBlock, PageFavorite, PageLabel
from .page import Page, PageLog, PageFavorite, PageLabel
from .estimate import Estimate, EstimatePoint
@ -79,3 +79,5 @@ from .analytic import AnalyticView
from .notification import Notification
from .exporter import ExporterHistory
from .webhook import Webhook, WebhookLog

View File

@ -0,0 +1,80 @@
# Python imports
from uuid import uuid4
# Django imports
from django.db import models
from django.conf import settings
from .base import BaseModel
def generate_label_token():
return uuid4().hex
def generate_token():
return "plane_api_" + uuid4().hex
class APIToken(BaseModel):
# Meta information
label = models.CharField(max_length=255, default=generate_label_token)
description = models.TextField(blank=True)
is_active = models.BooleanField(default=True)
last_used = models.DateTimeField(null=True)
# Token
token = models.CharField(
max_length=255, unique=True, default=generate_token, db_index=True
)
# User Information
user = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="bot_tokens",
)
user_type = models.PositiveSmallIntegerField(
choices=((0, "Human"), (1, "Bot")), default=0
)
workspace = models.ForeignKey(
"db.Workspace", related_name="api_tokens", on_delete=models.CASCADE, null=True
)
expired_at = models.DateTimeField(blank=True, null=True)
class Meta:
verbose_name = "API Token"
verbose_name_plural = "API Tokems"
db_table = "api_tokens"
ordering = ("-created_at",)
def __str__(self):
return str(self.user.id)
class APIActivityLog(BaseModel):
token_identifier = models.CharField(max_length=255)
# Request Info
path = models.CharField(max_length=255)
method = models.CharField(max_length=10)
query_params = models.TextField(null=True, blank=True)
headers = models.TextField(null=True, blank=True)
body = models.TextField(null=True, blank=True)
# Response info
response_code = models.PositiveIntegerField()
response_body = models.TextField(null=True, blank=True)
# Meta information
ip_address = models.GenericIPAddressField(null=True, blank=True)
user_agent = models.CharField(max_length=512, null=True, blank=True)
class Meta:
verbose_name = "API Activity Log"
verbose_name_plural = "API Activity Logs"
db_table = "api_activity_logs"
ordering = ("-created_at",)
def __str__(self):
return str(self.token_identifier)
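A quick usage sketch for the new token model (Django shell); `user` and `workspace` are assumed to be existing instances, only the field names come from the model above:
# Hypothetical usage sketch -- `user` and `workspace` must already exist.
from plane.db.models import APIToken

token = APIToken.objects.create(
    user=user,
    workspace=workspace,
    label="ci-pipeline",
    description="Token used by the release pipeline",
)
print(token.token)      # "plane_api_<hex>", produced by generate_token()
print(token.is_active)  # True by default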

View File

@ -1,41 +0,0 @@
# Python imports
from uuid import uuid4
# Django imports
from django.db import models
from django.conf import settings
from .base import BaseModel
def generate_label_token():
return uuid4().hex
def generate_token():
return uuid4().hex + uuid4().hex
class APIToken(BaseModel):
token = models.CharField(max_length=255, unique=True, default=generate_token)
label = models.CharField(max_length=255, default=generate_label_token)
user = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="bot_tokens",
)
user_type = models.PositiveSmallIntegerField(
choices=((0, "Human"), (1, "Bot")), default=0
)
workspace = models.ForeignKey(
"db.Workspace", related_name="api_tokens", on_delete=models.CASCADE, null=True
)
class Meta:
verbose_name = "API Token"
verbose_name_plural = "API Tokems"
db_table = "api_tokens"
ordering = ("-created_at",)
def __str__(self):
return str(self.user.name)

View File

@ -132,25 +132,7 @@ class Issue(ProjectBaseModel):
                self.state = default_state
            except ImportError:
                pass
else:
try:
from plane.db.models import State, PageBlock
# Check if the current issue state and completed state id are same
if self.state.group == "completed":
self.completed_at = timezone.now()
# check if there are any page blocks
PageBlock.objects.filter(issue_id=self.id).filter().update(
completed_at=timezone.now()
)
else:
PageBlock.objects.filter(issue_id=self.id).filter().update(
completed_at=None
)
self.completed_at = None
except ImportError:
pass
        if self._state.adding:
            # Get the maximum display_id value from the database
            last_id = IssueSequence.objects.filter(project=self.project).aggregate(
@ -431,6 +413,7 @@ class Label(ProjectBaseModel):
    name = models.CharField(max_length=255)
    description = models.TextField(blank=True)
    color = models.CharField(max_length=255, blank=True)
sort_order = models.FloatField(default=65535)
    class Meta:
        unique_together = ["name", "project"]
@ -439,6 +422,18 @@ class Label(ProjectBaseModel):
db_table = "labels" db_table = "labels"
ordering = ("-created_at",) ordering = ("-created_at",)
def save(self, *args, **kwargs):
if self._state.adding:
# Get the maximum sequence value from the database
last_id = Label.objects.filter(project=self.project).aggregate(
largest=models.Max("sort_order")
)["largest"]
# if last_id is not None
if last_id is not None:
self.sort_order = last_id + 10000
super(Label, self).save(*args, **kwargs)
    def __str__(self):
        return str(self.name)
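A small sketch of the new Label.sort_order behaviour (assumes a Django shell with an existing `project`, and assumes ProjectBaseModel fills in the workspace from the project):
# Sketch only: each new label lands 10000 after the project's current maximum.
from plane.db.models import Label

first = Label.objects.create(project=project, name="bug")
second = Label.objects.create(project=project, name="feature")
print(first.sort_order, second.sort_order)  # e.g. 65535.0 then 75535.0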

View File

@ -1,3 +1,5 @@
import uuid
# Django imports
from django.db import models
from django.conf import settings
@ -22,6 +24,15 @@ class Page(ProjectBaseModel):
    labels = models.ManyToManyField(
        "db.Label", blank=True, related_name="pages", through="db.PageLabel"
    )
parent = models.ForeignKey(
"self",
on_delete=models.CASCADE,
null=True,
blank=True,
related_name="child_page",
)
archived_at = models.DateField(null=True)
is_locked = models.BooleanField(default=False)
    class Meta:
        verbose_name = "Page"
@ -34,6 +45,42 @@ class Page(ProjectBaseModel):
return f"{self.owned_by.email} <{self.name}>" return f"{self.owned_by.email} <{self.name}>"
class PageLog(ProjectBaseModel):
TYPE_CHOICES = (
("to_do", "To Do"),
("issue", "issue"),
("image", "Image"),
("video", "Video"),
("file", "File"),
("link", "Link"),
("cycle","Cycle"),
("module", "Module"),
("back_link", "Back Link"),
("forward_link", "Forward Link"),
("mention", "Mention"),
)
transaction = models.UUIDField(default=uuid.uuid4)
page = models.ForeignKey(
Page, related_name="page_log", on_delete=models.CASCADE
)
entity_identifier = models.UUIDField(null=True)
entity_name = models.CharField(
max_length=30,
choices=TYPE_CHOICES,
verbose_name="Transaction Type",
)
class Meta:
unique_together = ["page", "transaction"]
verbose_name = "Page Log"
verbose_name_plural = "Page Logs"
db_table = "page_logs"
ordering = ("-created_at",)
def __str__(self):
return f"{self.page.name} {self.type}"
class PageBlock(ProjectBaseModel):
    page = models.ForeignKey("db.Page", on_delete=models.CASCADE, related_name="blocks")
    name = models.CharField(max_length=255)
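A sketch of recording an issue embed through the new PageLog model (Django shell); `page` and `issue` are assumed to exist, and the embed HTML itself is appended to page.description_html separately, as the data migration earlier in this change set does:
import uuid

from plane.db.models import PageLog

log = PageLog.objects.create(
    page=page,
    project=page.project,
    workspace=page.workspace,
    transaction=uuid.uuid4(),
    entity_identifier=issue.id,
    entity_name="issue",
)
print(log.transaction, log.entity_name)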

View File

@ -166,6 +166,7 @@ class ProjectMember(ProjectBaseModel):
    default_props = models.JSONField(default=get_default_props)
    preferences = models.JSONField(default=get_default_preferences)
    sort_order = models.FloatField(default=65535)
    is_active = models.BooleanField(default=True)

    def save(self, *args, **kwargs):
        if self._state.adding:

View File

@ -0,0 +1,90 @@
# Python imports
from uuid import uuid4
from urllib.parse import urlparse
# Django imports
from django.db import models
from django.core.exceptions import ValidationError
# Module imports
from plane.db.models import BaseModel
def generate_token():
return "plane_wh_" + uuid4().hex
def validate_schema(value):
parsed_url = urlparse(value)
if parsed_url.scheme not in ["http", "https"]:
raise ValidationError("Invalid schema. Only HTTP and HTTPS are allowed.")
def validate_domain(value):
parsed_url = urlparse(value)
domain = parsed_url.netloc
if domain in ["localhost", "127.0.0.1"]:
raise ValidationError("Local URLs are not allowed.")
class Webhook(BaseModel):
workspace = models.ForeignKey(
"db.Workspace",
on_delete=models.CASCADE,
related_name="workspace_webhooks",
)
url = models.URLField(
validators=[
validate_schema,
validate_domain,
]
)
is_active = models.BooleanField(default=True)
secret_key = models.CharField(max_length=255, default=generate_token)
project = models.BooleanField(default=False)
issue = models.BooleanField(default=False)
module = models.BooleanField(default=False)
cycle = models.BooleanField(default=False)
issue_comment = models.BooleanField(default=False)
def __str__(self):
return f"{self.workspace.slug} {self.url}"
class Meta:
unique_together = ["workspace", "url"]
verbose_name = "Webhook"
verbose_name_plural = "Webhooks"
db_table = "webhooks"
ordering = ("-created_at",)
class WebhookLog(BaseModel):
workspace = models.ForeignKey(
"db.Workspace", on_delete=models.CASCADE, related_name="webhook_logs"
)
# Associated webhook
webhook = models.ForeignKey(Webhook, on_delete=models.CASCADE, related_name="logs")
# Basic request details
event_type = models.CharField(max_length=255, blank=True, null=True)
request_method = models.CharField(max_length=10, blank=True, null=True)
request_headers = models.TextField(blank=True, null=True)
request_body = models.TextField(blank=True, null=True)
# Response details
response_status = models.TextField(blank=True, null=True)
response_headers = models.TextField(blank=True, null=True)
response_body = models.TextField(blank=True, null=True)
# Retry Count
retry_count = models.PositiveSmallIntegerField(default=0)
class Meta:
verbose_name = "Webhook Log"
verbose_name_plural = "Webhook Logs"
db_table = "webhook_logs"
ordering = ("-created_at",)
def __str__(self):
return f"{self.event_type} {str(self.webhook.url)}"

View File

@ -99,6 +99,7 @@ class WorkspaceMember(BaseModel):
    view_props = models.JSONField(default=get_default_props)
    default_props = models.JSONField(default=get_default_props)
    issue_props = models.JSONField(default=get_issue_props)
    is_active = models.BooleanField(default=True)

    class Meta:
        unique_together = ["workspace", "member"]

View File

@ -0,0 +1,40 @@
from plane.db.models import APIToken, APIActivityLog
class APITokenLogMiddleware:
def __init__(self, get_response):
self.get_response = get_response
def __call__(self, request):
request_body = request.body
response = self.get_response(request)
self.process_request(request, response, request_body)
return response
def process_request(self, request, response, request_body):
api_key_header = "X-Api-Key"
api_key = request.headers.get(api_key_header)
# If the API key is present, log the request
if api_key:
try:
APIActivityLog.objects.create(
token_identifier=api_key,
path=request.path,
method=request.method,
query_params=request.META.get("QUERY_STRING", ""),
headers=str(request.headers),
body=(request_body.decode('utf-8') if request_body else None),
response_body=(
response.content.decode("utf-8") if response.content else None
),
response_code=response.status_code,
ip_address=request.META.get("REMOTE_ADDR", None),
user_agent=request.META.get("HTTP_USER_AGENT", None),
)
except Exception as e:
print(e)
# If the token does not exist, you can decide whether to log this as an invalid attempt
pass
return None
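Once the middleware is wired into MIDDLEWARE (see the settings change further down), every request carrying X-Api-Key leaves a row behind; a quick inspection sketch from a Django shell:
from plane.db.models import APIActivityLog

for log in APIActivityLog.objects.order_by("-created_at")[:5]:
    print(log.method, log.path, log.response_code, log.token_identifier)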

View File

View File

@ -0,0 +1,5 @@
from django.apps import AppConfig
class ProxyConfig(AppConfig):
name = "plane.proxy"

View File

@ -0,0 +1,45 @@
from django.utils import timezone
from rest_framework.throttling import SimpleRateThrottle
class ApiKeyRateThrottle(SimpleRateThrottle):
scope = 'api_key'
def get_cache_key(self, request, view):
# Retrieve the API key from the request header
api_key = request.headers.get('X-Api-Key')
if not api_key:
return None # Allow the request if there's no API key
# Use the API key as part of the cache key
return f'{self.scope}:{api_key}'
def allow_request(self, request, view):
# Calculate the current time as a Unix timestamp
now = timezone.now().timestamp()
# Use the parent class's method to check if the request is allowed
allowed = super().allow_request(request, view)
if allowed:
# Calculate the remaining limit and reset time
history = self.cache.get(self.key, [])
# Remove old histories
while history and history[-1] <= now - self.duration:
history.pop()
# Calculate the requests
num_requests = len(history)
# Check available requests
available = self.num_requests - num_requests
# Unix timestamp for when the rate limit will reset
reset_time = int(now + self.duration)
# Add headers
request.META['X-RateLimit-Remaining'] = max(0, available)
request.META['X-RateLimit-Reset'] = reset_time
return allowed
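A minimal sketch of the throttle in isolation using DRF's request factory (assumes DEFAULT_THROTTLE_RATES contains an "api_key" entry such as "60/minute", as configured later in settings/common.py):
from rest_framework.test import APIRequestFactory

from plane.proxy.rate_limit import ApiKeyRateThrottle

factory = APIRequestFactory()
request = factory.get("/api/v1/workspaces/", HTTP_X_API_KEY="plane_api_example")

throttle = ApiKeyRateThrottle()
print(throttle.get_cache_key(request, view=None))                              # "api_key:plane_api_example"
print(throttle.get_cache_key(factory.get("/api/v1/workspaces/"), view=None))  # None -> request is not throttled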

View File

@ -0,0 +1,13 @@
from .cycle import urlpatterns as cycle_patterns
from .inbox import urlpatterns as inbox_patterns
from .issue import urlpatterns as issue_patterns
from .module import urlpatterns as module_patterns
from .project import urlpatterns as project_patterns
urlpatterns = [
*cycle_patterns,
*inbox_patterns,
*issue_patterns,
*module_patterns,
*project_patterns,
]

View File

@ -0,0 +1,35 @@
from django.urls import path
from plane.proxy.views.cycle import (
CycleAPIEndpoint,
CycleIssueAPIEndpoint,
TransferCycleIssueAPIEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/",
CycleAPIEndpoint.as_view(),
name="cycles",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/",
CycleAPIEndpoint.as_view(),
name="cycles",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/",
CycleIssueAPIEndpoint.as_view(),
name="cycle-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:pk>/",
CycleIssueAPIEndpoint.as_view(),
name="cycle-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/transfer-issues/",
TransferCycleIssueAPIEndpoint.as_view(),
name="transfer-issues",
),
]

View File

@ -0,0 +1,17 @@
from django.urls import path
from plane.proxy.views import InboxIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/<uuid:pk>/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
]

View File

@ -0,0 +1,51 @@
from django.urls import path
from plane.proxy.views import (
IssueAPIEndpoint,
LabelAPIEndpoint,
IssueLinkAPIEndpoint,
IssueCommentAPIEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueAPIEndpoint.as_view(),
name="issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/",
IssueAPIEndpoint.as_view(),
name="issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/",
LabelAPIEndpoint.as_view(),
name="labels",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/<uuid:pk>/",
LabelAPIEndpoint.as_view(),
name="labels",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/",
IssueLinkAPIEndpoint.as_view(),
name="issue-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/<uuid:pk>/",
IssueLinkAPIEndpoint.as_view(),
name="issue-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/",
IssueCommentAPIEndpoint.as_view(),
name="project-issue-comment",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/<uuid:pk>/",
IssueCommentAPIEndpoint.as_view(),
name="project-issue-comment",
),
]

View File

@ -0,0 +1,26 @@
from django.urls import path
from plane.proxy.views import ModuleAPIEndpoint, ModuleIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/",
ModuleAPIEndpoint.as_view(),
name="modules",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:pk>/",
ModuleAPIEndpoint.as_view(),
name="modules",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/",
ModuleIssueAPIEndpoint.as_view(),
name="module-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:pk>/",
ModuleIssueAPIEndpoint.as_view(),
name="module-issues",
),
]

View File

@ -0,0 +1,16 @@
from django.urls import path
from plane.proxy.views import ProjectAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/",
ProjectAPIEndpoint.as_view(),
name="project",
),
path(
"workspaces/<str:slug>/projects/<uuid:pk>/",
ProjectAPIEndpoint.as_view(),
name="project",
),
]

View File

@ -0,0 +1,18 @@
from .project import ProjectAPIEndpoint
from .issue import (
IssueAPIEndpoint,
LabelAPIEndpoint,
IssueLinkAPIEndpoint,
IssueCommentAPIEndpoint,
)
from .cycle import (
CycleAPIEndpoint,
CycleIssueAPIEndpoint,
TransferCycleIssueAPIEndpoint,
)
from .module import ModuleAPIEndpoint, ModuleIssueAPIEndpoint
from .inbox import InboxIssueAPIEndpoint

View File

@ -0,0 +1,101 @@
# Python imports
import re
import json
import requests
# Django imports
from django.conf import settings
# Third party imports
from rest_framework.views import APIView
from rest_framework.response import Response
from rest_framework.permissions import IsAuthenticated
from rest_framework_simplejwt.tokens import RefreshToken
# Module imports
from plane.authentication.api_authentication import APIKeyAuthentication
from plane.proxy.rate_limit import ApiKeyRateThrottle
class BaseAPIView(APIView):
authentication_classes = [
APIKeyAuthentication,
]
permission_classes = [
IsAuthenticated,
]
throttle_classes = [
ApiKeyRateThrottle,
]
def _get_jwt_token(self, request):
refresh = RefreshToken.for_user(request.user)
return str(refresh.access_token)
def _get_url_path(self, request):
match = re.search(r"/v1/(.*)", request.path)
return match.group(1) if match else ""
def _get_headers(self, request):
return {
"Authorization": f"Bearer {self._get_jwt_token(request=request)}",
"Content-Type": request.headers.get("Content-Type", "application/json"),
}
def _get_url(self, request):
path = self._get_url_path(request=request)
url = request.build_absolute_uri("/api/" + path)
return url
def _get_query_params(self, request):
query_params = request.GET
return query_params
def _get_payload(self, request):
content_type = request.headers.get("Content-Type", "application/json")
if content_type.startswith("multipart/form-data"):
files_dict = {k: v[0] for k, v in request.FILES.lists()}
return (None, files_dict)
else:
return (json.dumps(request.data), None)
def _make_request(self, request, method="GET"):
data_payload, files_payload = self._get_payload(request=request)
response = requests.request(
method=method,
url=self._get_url(request=request),
headers=self._get_headers(request=request),
params=self._get_query_params(request=request),
data=data_payload,
files=files_payload,
)
return response.json(), response.status_code
def finalize_response(self, request, response, *args, **kwargs):
# Call super to get the default response
response = super().finalize_response(request, response, *args, **kwargs)
# Add custom headers if they exist in the request META
ratelimit_remaining = request.META.get('X-RateLimit-Remaining')
if ratelimit_remaining is not None:
response['X-RateLimit-Remaining'] = ratelimit_remaining
ratelimit_reset = request.META.get('X-RateLimit-Reset')
if ratelimit_reset is not None:
response['X-RateLimit-Reset'] = ratelimit_reset
return response
def get(self, request, *args, **kwargs):
response, status_code = self._make_request(request=request, method="GET")
return Response(response, status=status_code)
def post(self, request, *args, **kwargs):
response, status_code = self._make_request(request=request, method="POST")
return Response(response, status=status_code)
def partial_update(self, request, *args, **kwargs):
response, status_code = self._make_request(request=request, method="PATCH")
return Response(response, status=status_code)
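A hedged sketch of what a consumer of the proxied API would send and receive; the host, the /api/v1/ prefix (inferred from _get_url_path above) and the token value are illustrative assumptions:
import requests

resp = requests.get(
    "https://plane.example.com/api/v1/workspaces/my-workspace/projects/",
    headers={"X-Api-Key": "plane_api_<your-token>"},
)
print(resp.status_code)
print(resp.headers.get("X-RateLimit-Remaining"), resp.headers.get("X-RateLimit-Reset"))
print(resp.json())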

View File

@ -0,0 +1,30 @@
from .base import BaseAPIView
class CycleAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to cycle.
"""
pass
class CycleIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to cycle issues.
"""
pass
class TransferCycleIssueAPIEndpoint(BaseAPIView):
"""
    This viewset provides `create` actions for transferring the issues into a particular cycle.
"""
pass

View File

@ -0,0 +1,10 @@
from .base import BaseAPIView
class InboxIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to inbox issues.
"""
pass

View File

@ -0,0 +1,37 @@
from .base import BaseAPIView
class IssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to issue.
"""
pass
class LabelAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to the labels.
"""
pass
class IssueLinkAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to the links of the particular issue.
"""
pass
class IssueCommentAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to comments of the particular issue.
"""
pass

View File

@ -0,0 +1,20 @@
from .base import BaseAPIView
class ModuleAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to module.
"""
pass
class ModuleIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to module issues.
"""
pass

View File

@ -0,0 +1,5 @@
from .base import BaseAPIView
class ProjectAPIEndpoint(BaseAPIView):
pass

View File

@ -1,22 +1,41 @@
"""Global Settings"""
# Python imports
import os
import datetime
import ssl
import certifi
from datetime import timedelta
from urllib.parse import urlparse

# Django imports
from django.core.management.utils import get_random_secret_key

# Third party imports
import dj_database_url
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.redis import RedisIntegration
from sentry_sdk.integrations.celery import CeleryIntegration

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# Secret Key
SECRET_KEY = os.environ.get("SECRET_KEY", get_random_secret_key())

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
DEBUG = False

ALLOWED_HOSTS = []
# Allowed Hosts
ALLOWED_HOSTS = ["*"]
# To access webhook
ENABLE_WEBHOOK = os.environ.get("ENABLE_WEBHOOK", "1") == "1"
# To access plane api through api tokens
ENABLE_API = os.environ.get("ENABLE_API", "1") == "1"
# Redirect if / is not present
APPEND_SLASH = True
# Application definition
INSTALLED_APPS = [
    "django.contrib.auth",
    "django.contrib.contenttypes",
@ -29,6 +48,7 @@ INSTALLED_APPS = [
"plane.utils", "plane.utils",
"plane.web", "plane.web",
"plane.middleware", "plane.middleware",
"plane.proxy",
# Third-party things # Third-party things
"rest_framework", "rest_framework",
"rest_framework.authtoken", "rest_framework.authtoken",
@ -36,12 +56,13 @@ INSTALLED_APPS = [
"corsheaders", "corsheaders",
"taggit", "taggit",
"django_celery_beat", "django_celery_beat",
"storages",
] ]
# Middlewares
MIDDLEWARE = [
    "corsheaders.middleware.CorsMiddleware",
    "django.middleware.security.SecurityMiddleware",
    # "whitenoise.middleware.WhiteNoiseMiddleware",
    "django.contrib.sessions.middleware.SessionMiddleware",
    "django.middleware.common.CommonMiddleware",
    "django.middleware.csrf.CsrfViewMiddleware",
@ -49,8 +70,10 @@ MIDDLEWARE = [
"django.middleware.clickjacking.XFrameOptionsMiddleware", "django.middleware.clickjacking.XFrameOptionsMiddleware",
"crum.CurrentRequestUserMiddleware", "crum.CurrentRequestUserMiddleware",
"django.middleware.gzip.GZipMiddleware", "django.middleware.gzip.GZipMiddleware",
] "plane.middleware.api_log_middleware.APITokenLogMiddleware",
]
# Rest Framework settings
REST_FRAMEWORK = {
    "DEFAULT_AUTHENTICATION_CLASSES": (
        "rest_framework_simplejwt.authentication.JWTAuthentication",
@ -58,15 +81,19 @@ REST_FRAMEWORK = {
"DEFAULT_PERMISSION_CLASSES": ("rest_framework.permissions.IsAuthenticated",), "DEFAULT_PERMISSION_CLASSES": ("rest_framework.permissions.IsAuthenticated",),
"DEFAULT_RENDERER_CLASSES": ("rest_framework.renderers.JSONRenderer",), "DEFAULT_RENDERER_CLASSES": ("rest_framework.renderers.JSONRenderer",),
"DEFAULT_FILTER_BACKENDS": ("django_filters.rest_framework.DjangoFilterBackend",), "DEFAULT_FILTER_BACKENDS": ("django_filters.rest_framework.DjangoFilterBackend",),
"DEFAULT_THROTTLE_CLASSES": ("plane.proxy.rate_limit.ApiKeyRateThrottle",),
"DEFAULT_THROTTLE_RATES": {
"api_key": "60/minute",
},
} }
AUTHENTICATION_BACKENDS = ( # Django Auth Backend
"django.contrib.auth.backends.ModelBackend", # default AUTHENTICATION_BACKENDS = ("django.contrib.auth.backends.ModelBackend",) # default
# "guardian.backends.ObjectPermissionBackend",
)
# Root Urls
ROOT_URLCONF = "plane.urls" ROOT_URLCONF = "plane.urls"
# Templates
TEMPLATES = [ TEMPLATES = [
{ {
"BACKEND": "django.template.backends.django.DjangoTemplates", "BACKEND": "django.template.backends.django.DjangoTemplates",
@ -85,52 +112,68 @@ TEMPLATES = [
    },
]
# Cookie Settings
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
# CORS Settings
CORS_ALLOW_CREDENTIALS = True
CORS_ALLOWED_ORIGINS = os.environ.get("CORS_ALLOWED_ORIGINS", "").split(",")

JWT_AUTH = {
    "JWT_ENCODE_HANDLER": "rest_framework_jwt.utils.jwt_encode_handler",
    "JWT_DECODE_HANDLER": "rest_framework_jwt.utils.jwt_decode_handler",
"JWT_PAYLOAD_HANDLER": "rest_framework_jwt.utils.jwt_payload_handler",
"JWT_PAYLOAD_GET_USER_ID_HANDLER": "rest_framework_jwt.utils.jwt_get_user_id_from_payload_handler",
"JWT_RESPONSE_PAYLOAD_HANDLER": "rest_framework_jwt.utils.jwt_response_payload_handler",
"JWT_SECRET_KEY": SECRET_KEY,
"JWT_GET_USER_SECRET_KEY": None,
"JWT_PUBLIC_KEY": None,
"JWT_PRIVATE_KEY": None,
"JWT_ALGORITHM": "HS256",
"JWT_VERIFY": True,
"JWT_VERIFY_EXPIRATION": True,
"JWT_LEEWAY": 0,
"JWT_EXPIRATION_DELTA": datetime.timedelta(seconds=604800),
"JWT_AUDIENCE": None,
"JWT_ISSUER": None,
"JWT_ALLOW_REFRESH": False,
"JWT_REFRESH_EXPIRATION_DELTA": datetime.timedelta(days=7),
"JWT_AUTH_HEADER_PREFIX": "JWT",
"JWT_AUTH_COOKIE": None,
}
# Application Settings
WSGI_APPLICATION = "plane.wsgi.application"
ASGI_APPLICATION = "plane.asgi.application"

# Django Sites
SITE_ID = 1

# User Model
AUTH_USER_MODEL = "db.User"

# Database
if bool(os.environ.get("DATABASE_URL")):
DATABASES = { # Parse database configuration from $DATABASE_URL
"default": { DATABASES = {
"ENGINE": "django.db.backends.sqlite3", "default": dj_database_url.config(),
"NAME": os.path.join(BASE_DIR, "db.sqlite3"), }
else:
DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql",
"NAME": os.environ.get("POSTGRES_DB"),
"USER": os.environ.get("POSTGRES_USER"),
"PASSWORD": os.environ.get("POSTGRES_PASSWORD"),
"HOST": os.environ.get("POSTGRES_HOST"),
}
} }
}
# Redis Config
REDIS_URL = os.environ.get("REDIS_URL")
REDIS_SSL = "rediss" in REDIS_URL
if REDIS_SSL:
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
"CONNECTION_POOL_KWARGS": {"ssl_cert_reqs": False},
},
}
}
else:
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
},
}
}
# Password validations
AUTH_PASSWORD_VALIDATORS = [
    {
        "NAME": "django.contrib.auth.password_validation.UserAttributeSimilarityValidator",
@ -147,7 +190,6 @@ AUTH_PASSWORD_VALIDATORS = [
]

# Static files (CSS, JavaScript, Images)
STATIC_URL = "/static/"
STATIC_ROOT = os.path.join(BASE_DIR, "static-assets", "collected-static")
STATICFILES_DIRS = (os.path.join(BASE_DIR, "static"),)
@ -156,21 +198,19 @@ STATICFILES_DIRS = (os.path.join(BASE_DIR, "static"),)
MEDIA_ROOT = "mediafiles" MEDIA_ROOT = "mediafiles"
MEDIA_URL = "/media/" MEDIA_URL = "/media/"
# Internationalization # Internationalization
LANGUAGE_CODE = "en-us" LANGUAGE_CODE = "en-us"
TIME_ZONE = "UTC"
USE_I18N = True USE_I18N = True
USE_L10N = True USE_L10N = True
# Timezones
USE_TZ = True USE_TZ = True
TIME_ZONE = "UTC"
# Default Auto Field
DEFAULT_AUTO_FIELD = "django.db.models.BigAutoField" DEFAULT_AUTO_FIELD = "django.db.models.BigAutoField"
# Email settings
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend" EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
# Host for sending e-mail. # Host for sending e-mail.
EMAIL_HOST = os.environ.get("EMAIL_HOST") EMAIL_HOST = os.environ.get("EMAIL_HOST")
@ -183,7 +223,30 @@ EMAIL_USE_TLS = os.environ.get("EMAIL_USE_TLS", "1") == "1"
EMAIL_USE_SSL = os.environ.get("EMAIL_USE_SSL", "0") == "1"
EMAIL_FROM = os.environ.get("EMAIL_FROM", "Team Plane <team@mailer.plane.so>")
# Storage Settings
STORAGES = {
"staticfiles": {
"BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
},
}
STORAGES["default"] = {"BACKEND": "storages.backends.s3boto3.S3Boto3Storage"}
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "access-key")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "secret-key")
AWS_STORAGE_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME", "uploads")
AWS_DEFAULT_ACL = "public-read"
AWS_QUERYSTRING_AUTH = False
AWS_S3_FILE_OVERWRITE = False
AWS_S3_ENDPOINT_URL = os.environ.get("AWS_S3_ENDPOINT_URL", None) or os.environ.get(
"MINIO_ENDPOINT_URL", None
)
if AWS_S3_ENDPOINT_URL:
parsed_url = urlparse(os.environ.get("WEB_URL", "http://localhost"))
AWS_S3_CUSTOM_DOMAIN = f"{parsed_url.netloc}/{AWS_STORAGE_BUCKET_NAME}"
AWS_S3_URL_PROTOCOL = f"{parsed_url.scheme}:"
# JWT Auth Configuration
SIMPLE_JWT = {
    "ACCESS_TOKEN_LIFETIME": timedelta(minutes=10080),
    "REFRESH_TOKEN_LIFETIME": timedelta(days=43200),
@ -211,7 +274,71 @@ SIMPLE_JWT = {
"SLIDING_TOKEN_REFRESH_LIFETIME": timedelta(days=1), "SLIDING_TOKEN_REFRESH_LIFETIME": timedelta(days=1),
} }
# Celery Configuration
CELERY_TIMEZONE = TIME_ZONE CELERY_TIMEZONE = TIME_ZONE
CELERY_TASK_SERIALIZER = 'json' CELERY_TASK_SERIALIZER = "json"
CELERY_ACCEPT_CONTENT = ['application/json'] CELERY_ACCEPT_CONTENT = ["application/json"]
CELERY_IMPORTS = ("plane.bgtasks.issue_automation_task","plane.bgtasks.exporter_expired_task")
if REDIS_SSL:
redis_url = os.environ.get("REDIS_URL")
broker_url = (
f"{redis_url}?ssl_cert_reqs={ssl.CERT_NONE.name}&ssl_ca_certs={certifi.where()}"
)
CELERY_BROKER_URL = broker_url
CELERY_RESULT_BACKEND = broker_url
else:
CELERY_BROKER_URL = REDIS_URL
CELERY_RESULT_BACKEND = REDIS_URL
CELERY_IMPORTS = (
"plane.bgtasks.issue_automation_task",
"plane.bgtasks.exporter_expired_task",
)
# Sentry Settings
# Enable Sentry Settings
if bool(os.environ.get("SENTRY_DSN", False)):
sentry_sdk.init(
dsn=os.environ.get("SENTRY_DSN", ""),
integrations=[
DjangoIntegration(),
RedisIntegration(),
CeleryIntegration(monitor_beat_tasks=True),
],
traces_sample_rate=1,
send_default_pii=True,
environment=os.environ.get("ENVIRONMENT", "development"),
profiles_sample_rate=1.0,
)
# Application Envs
PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False) # For External
SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Unsplash Access key
UNSPLASH_ACCESS_KEY = os.environ.get("UNSPLASH_ACCESS_KEY")
# Github Access Token
GITHUB_ACCESS_TOKEN = os.environ.get("GITHUB_ACCESS_TOKEN", False)
# Analytics
ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)
# Open AI Settings
OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")
# Scout Settings
SCOUT_MONITOR = os.environ.get("SCOUT_MONITOR", False)
SCOUT_KEY = os.environ.get("SCOUT_KEY", "")
SCOUT_NAME = "Plane"
# Set the variable true if running in docker environment
DOCKERIZED = int(os.environ.get("DOCKERIZED", 1)) == 1
USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1

View File

@ -1,123 +1,39 @@
"""Development settings and globals.""" """Development settings"""
from __future__ import absolute_import
import dj_database_url
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.redis import RedisIntegration
from .common import *  # noqa

DEBUG = int(os.environ.get("DEBUG", 1)) == 1
DEBUG = True

ALLOWED_HOSTS = [
    "*",
]

# Debug Toolbar settings
INSTALLED_APPS += ("debug_toolbar",)
MIDDLEWARE += ("debug_toolbar.middleware.DebugToolbarMiddleware",)
DEBUG_TOOLBAR_PATCH_SETTINGS = False

# Only show emails in console don't send it to smtp
EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"
DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql",
"NAME": os.environ.get("PGUSER", "plane"),
"USER": "",
"PASSWORD": "",
"HOST": os.environ.get("PGHOST", "localhost"),
}
}
DOCKERIZED = int(os.environ.get("DOCKERIZED", 0)) == 1
USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
if DOCKERIZED:
DATABASES["default"] = dj_database_url.config()
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.locmem.LocMemCache",
    }
}
INSTALLED_APPS += ("debug_toolbar",)
MIDDLEWARE += ("debug_toolbar.middleware.DebugToolbarMiddleware",)
DEBUG_TOOLBAR_PATCH_SETTINGS = False
INTERNAL_IPS = ("127.0.0.1",)
CORS_ORIGIN_ALLOW_ALL = True
if os.environ.get("SENTRY_DSN", False):
sentry_sdk.init(
dsn=os.environ.get("SENTRY_DSN"),
integrations=[DjangoIntegration(), RedisIntegration()],
# If you wish to associate users to errors (assuming you are using
# django.contrib.auth) you may enable sending PII data.
send_default_pii=True,
environment="local",
traces_sample_rate=0.7,
profiles_sample_rate=1.0,
)
else:
LOGGING = {
"version": 1,
"disable_existing_loggers": False,
"handlers": {
"console": {
"class": "logging.StreamHandler",
},
},
"root": {
"handlers": ["console"],
"level": "DEBUG",
},
"loggers": {
"*": {
"handlers": ["console"],
"level": "DEBUG",
"propagate": True,
},
},
}
REDIS_HOST = "localhost"
REDIS_PORT = 6379
REDIS_URL = os.environ.get("REDIS_URL")
MEDIA_URL = "/uploads/" MEDIA_URL = "/uploads/"
MEDIA_ROOT = os.path.join(BASE_DIR, "uploads") MEDIA_ROOT = os.path.join(BASE_DIR, "uploads")
if DOCKERIZED: # For local settings
REDIS_URL = os.environ.get("REDIS_URL") CORS_ALLOW_ALL_ORIGINS = True
CORS_ALLOWED_ORIGINS = [
WEB_URL = os.environ.get("WEB_URL", "http://localhost:3000") "http://localhost:3000",
PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False) "http://127.0.0.1:3000",
"http://localhost:4000",
ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False) "http://127.0.0.1:4000",
ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False) ]
OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")
SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)
LOGGER_BASE_URL = os.environ.get("LOGGER_BASE_URL", False)
CELERY_RESULT_BACKEND = os.environ.get("REDIS_URL")
CELERY_BROKER_URL = os.environ.get("REDIS_URL")
GITHUB_ACCESS_TOKEN = os.environ.get("GITHUB_ACCESS_TOKEN", False)
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Unsplash Access key
UNSPLASH_ACCESS_KEY = os.environ.get("UNSPLASH_ACCESS_KEY")

View File

@ -1,282 +1,13 @@
"""Production settings and globals.""" """Production settings"""
import ssl
import certifi
import dj_database_url
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.redis import RedisIntegration
from urllib.parse import urlparse
from .common import *  # noqa

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = int(os.environ.get("DEBUG", 0)) == 1

# Database
if bool(os.environ.get("DATABASE_URL")):
# Parse database configuration from $DATABASE_URL
DATABASES["default"] = dj_database_url.config()
else:
DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql",
"NAME": os.environ.get("POSTGRES_DB"),
"USER": os.environ.get("POSTGRES_USER"),
"PASSWORD": os.environ.get("POSTGRES_PASSWORD"),
"HOST": os.environ.get("POSTGRES_HOST"),
}
}
SITE_ID = 1
# Set the variable true if running in docker environment
DOCKERIZED = int(os.environ.get("DOCKERIZED", 0)) == 1
USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
# Enable Connection Pooling (if desired)
# DATABASES['default']['ENGINE'] = 'django_postgrespool'
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# TODO: Make it FALSE and LIST DOMAINS IN FULL PROD.
CORS_ALLOW_ALL_ORIGINS = True
CORS_ALLOW_METHODS = [
"DELETE",
"GET",
"OPTIONS",
"PATCH",
"POST",
"PUT",
]
CORS_ALLOW_HEADERS = [
"accept",
"accept-encoding",
"authorization",
"content-type",
"dnt",
"origin",
"user-agent",
"x-csrftoken",
"x-requested-with",
]
CORS_ALLOW_CREDENTIALS = True
INSTALLED_APPS += ("scout_apm.django",) INSTALLED_APPS += ("scout_apm.django",)
STORAGES = {
"staticfiles": {
"BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
},
}
if bool(os.environ.get("SENTRY_DSN", False)):
sentry_sdk.init(
dsn=os.environ.get("SENTRY_DSN", ""),
integrations=[DjangoIntegration(), RedisIntegration()],
# If you wish to associate users to errors (assuming you are using
# django.contrib.auth) you may enable sending PII data.
traces_sample_rate=1,
send_default_pii=True,
environment="production",
profiles_sample_rate=1.0,
)
if DOCKERIZED and USE_MINIO:
INSTALLED_APPS += ("storages",)
STORAGES["default"] = {"BACKEND": "storages.backends.s3boto3.S3Boto3Storage"}
# The AWS access key to use.
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "access-key")
# The AWS secret access key to use.
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "secret-key")
# The name of the bucket to store files in.
AWS_STORAGE_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME", "uploads")
# The full URL to the S3 endpoint. Leave blank to use the default region URL.
AWS_S3_ENDPOINT_URL = os.environ.get(
"AWS_S3_ENDPOINT_URL", "http://plane-minio:9000"
)
# Default permissions
AWS_DEFAULT_ACL = "public-read"
AWS_QUERYSTRING_AUTH = False
AWS_S3_FILE_OVERWRITE = False
# Custom Domain settings
parsed_url = urlparse(os.environ.get("WEB_URL", "http://localhost"))
AWS_S3_CUSTOM_DOMAIN = f"{parsed_url.netloc}/{AWS_STORAGE_BUCKET_NAME}"
AWS_S3_URL_PROTOCOL = f"{parsed_url.scheme}:"
else:
# The AWS region to connect to.
AWS_REGION = os.environ.get("AWS_REGION", "")
# The AWS access key to use.
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "")
# The AWS secret access key to use.
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "")
# The optional AWS session token to use.
# AWS_SESSION_TOKEN = ""
# The name of the bucket to store files in.
AWS_S3_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME")
# How to construct S3 URLs ("auto", "path", "virtual").
AWS_S3_ADDRESSING_STYLE = "auto"
# The full URL to the S3 endpoint. Leave blank to use the default region URL.
AWS_S3_ENDPOINT_URL = os.environ.get("AWS_S3_ENDPOINT_URL", "")
# A prefix to be applied to every stored file. This will be joined to every filename using the "/" separator.
AWS_S3_KEY_PREFIX = ""
# Whether to enable authentication for stored files. If True, then generated URLs will include an authentication
# token valid for `AWS_S3_MAX_AGE_SECONDS`. If False, then generated URLs will not include an authentication token,
# and their permissions will be set to "public-read".
AWS_S3_BUCKET_AUTH = False
# How long generated URLs are valid for. This affects the expiry of authentication tokens if `AWS_S3_BUCKET_AUTH`
# is True. It also affects the "Cache-Control" header of the files.
# Important: Changing this setting will not affect existing files.
AWS_S3_MAX_AGE_SECONDS = 60 * 60 # 1 hours.
# A URL prefix to be used for generated URLs. This is useful if your bucket is served through a CDN. This setting
# cannot be used with `AWS_S3_BUCKET_AUTH`.
AWS_S3_PUBLIC_URL = ""
# If True, then files will be stored with reduced redundancy. Check the S3 documentation and make sure you
# understand the consequences before enabling.
# Important: Changing this setting will not affect existing files.
AWS_S3_REDUCED_REDUNDANCY = False
# The Content-Disposition header used when the file is downloaded. This can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_CONTENT_DISPOSITION = ""
# The Content-Language header used when the file is downloaded. This can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_CONTENT_LANGUAGE = ""
# A mapping of custom metadata for each file. Each value can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_METADATA = {}
# If True, then files will be stored using AES256 server-side encryption.
# If this is a string value (e.g., "aws:kms"), that encryption type will be used.
# Otherwise, server-side encryption is not be enabled.
# Important: Changing this setting will not affect existing files.
AWS_S3_ENCRYPT_KEY = False
# The AWS S3 KMS encryption key ID (the `SSEKMSKeyId` parameter) is set from this string if present.
# This is only relevant if AWS S3 KMS server-side encryption is enabled (above).
# AWS_S3_KMS_ENCRYPTION_KEY_ID = ""
# If True, then text files will be stored using gzip content encoding. Files will only be gzipped if their
# compressed size is smaller than their uncompressed size.
# Important: Changing this setting will not affect existing files.
AWS_S3_GZIP = True
# The signature version to use for S3 requests.
AWS_S3_SIGNATURE_VERSION = None
# If True, then files with the same name will overwrite each other. By default it's set to False to have
# extra characters appended.
AWS_S3_FILE_OVERWRITE = False
STORAGES["default"] = {
"BACKEND": "django_s3_storage.storage.S3Storage",
}
# AWS Settings End
# Enable Connection Pooling (if desired)
# DATABASES['default']['ENGINE'] = 'django_postgrespool'
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# Allow all host headers
ALLOWED_HOSTS = [
"*",
]
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
REDIS_URL = os.environ.get("REDIS_URL")
if DOCKERIZED:
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
},
}
}
else:
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
"CONNECTION_POOL_KWARGS": {"ssl_cert_reqs": False},
},
}
}
WEB_URL = os.environ.get("WEB_URL", "https://app.plane.so")
PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False)
ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)
OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")
SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)
LOGGER_BASE_URL = os.environ.get("LOGGER_BASE_URL", False)
redis_url = os.environ.get("REDIS_URL")
broker_url = (
f"{redis_url}?ssl_cert_reqs={ssl.CERT_NONE.name}&ssl_ca_certs={certifi.where()}"
)
if DOCKERIZED:
CELERY_BROKER_URL = REDIS_URL
CELERY_RESULT_BACKEND = REDIS_URL
else:
CELERY_BROKER_URL = broker_url
CELERY_RESULT_BACKEND = broker_url
GITHUB_ACCESS_TOKEN = os.environ.get("GITHUB_ACCESS_TOKEN", False)
# Enable or Disable signups
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Scout Settings
SCOUT_MONITOR = os.environ.get("SCOUT_MONITOR", False)
SCOUT_KEY = os.environ.get("SCOUT_KEY", "")
SCOUT_NAME = "Plane"
# Unsplash Access key
UNSPLASH_ACCESS_KEY = os.environ.get("UNSPLASH_ACCESS_KEY")

View File

@ -6,13 +6,7 @@ from urllib.parse import urlparse
def redis_instance():
    # connect to redis
    if (
        settings.DOCKERIZED
        or os.environ.get("DJANGO_SETTINGS_MODULE", "plane.settings.production")
        == "plane.settings.local"
    ):
        ri = redis.Redis.from_url(settings.REDIS_URL, db=0)
    else:
    if settings.REDIS_SSL:
        url = urlparse(settings.REDIS_URL)
        ri = redis.Redis(
            host=url.hostname,
@ -21,5 +15,7 @@ def redis_instance():
            ssl=True,
            ssl_cert_reqs=None,
        )
    else:
        ri = redis.Redis.from_url(settings.REDIS_URL, db=0)
    return ri
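A sketch of calling the helper above; the import path is an assumption about where redis_instance lives and may need adjusting:
from plane.settings.redis import redis_instance  # path assumed, not confirmed by this diff

ri = redis_instance()
ri.set("plane:healthcheck", "ok", ex=30)
print(ri.get("plane:healthcheck"))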

View File

@ -1,129 +0,0 @@
"""Self hosted settings and globals."""
from urllib.parse import urlparse
import dj_database_url
from urllib.parse import urlparse
from .common import * # noqa
# Database
DEBUG = int(os.environ.get("DEBUG", 0)) == 1
# Docker configurations
DOCKERIZED = 1
USE_MINIO = 1
DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql",
"NAME": "plane",
"USER": os.environ.get("PGUSER", ""),
"PASSWORD": os.environ.get("PGPASSWORD", ""),
"HOST": os.environ.get("PGHOST", ""),
}
}
# Parse database configuration from $DATABASE_URL
DATABASES["default"] = dj_database_url.config()
SITE_ID = 1
# File size limit
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
CORS_ALLOW_METHODS = [
"DELETE",
"GET",
"OPTIONS",
"PATCH",
"POST",
"PUT",
]
CORS_ALLOW_HEADERS = [
"accept",
"accept-encoding",
"authorization",
"content-type",
"dnt",
"origin",
"user-agent",
"x-csrftoken",
"x-requested-with",
]
CORS_ALLOW_CREDENTIALS = True
CORS_ALLOW_ALL_ORIGINS = True
STORAGES = {
"staticfiles": {
"BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
},
}
INSTALLED_APPS += ("storages",)
STORAGES["default"] = {"BACKEND": "storages.backends.s3boto3.S3Boto3Storage"}
# The AWS access key to use.
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "access-key")
# The AWS secret access key to use.
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "secret-key")
# The name of the bucket to store files in.
AWS_STORAGE_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME", "uploads")
# The full URL to the S3 endpoint. Leave blank to use the default region URL.
AWS_S3_ENDPOINT_URL = os.environ.get(
"AWS_S3_ENDPOINT_URL", "http://plane-minio:9000"
)
# Default permissions
AWS_DEFAULT_ACL = "public-read"
AWS_QUERYSTRING_AUTH = False
AWS_S3_FILE_OVERWRITE = False
# Custom Domain settings
parsed_url = urlparse(os.environ.get("WEB_URL", "http://localhost"))
AWS_S3_CUSTOM_DOMAIN = f"{parsed_url.netloc}/{AWS_STORAGE_BUCKET_NAME}"
AWS_S3_URL_PROTOCOL = f"{parsed_url.scheme}:"
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# Allow all host headers
ALLOWED_HOSTS = [
"*",
]
# Security settings
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
# Redis URL
REDIS_URL = os.environ.get("REDIS_URL")
# Caches
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
},
}
}
# URL used for email redirects
WEB_URL = os.environ.get("WEB_URL", "http://localhost")
# Celery settings
CELERY_BROKER_URL = REDIS_URL
CELERY_RESULT_BACKEND = REDIS_URL
# Enable or Disable signups
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Analytics
ANALYTICS_BASE_API = False
# OPEN AI Settings
OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")

View File

@ -1,223 +0,0 @@
"""Production settings and globals."""
from urllib.parse import urlparse
import ssl
import certifi
import dj_database_url
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.redis import RedisIntegration
from .common import * # noqa
# Database
DEBUG = int(os.environ.get("DEBUG", 1)) == 1
DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql",
"NAME": os.environ.get("PGUSER", "plane"),
"USER": "",
"PASSWORD": "",
"HOST": os.environ.get("PGHOST", "localhost"),
}
}
# CORS WHITELIST ON PROD
CORS_ORIGIN_WHITELIST = [
# "https://example.com",
# "https://sub.example.com",
# "http://localhost:8080",
# "http://127.0.0.1:9000"
]
# Parse database configuration from $DATABASE_URL
DATABASES["default"] = dj_database_url.config()
SITE_ID = 1
# Enable Connection Pooling (if desired)
# DATABASES['default']['ENGINE'] = 'django_postgrespool'
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# Allow all host headers
ALLOWED_HOSTS = ["*"]
# TODO: Make it FALSE and LIST DOMAINS IN FULL PROD.
CORS_ALLOW_ALL_ORIGINS = True
STORAGES = {
"staticfiles": {
"BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
},
}
# Make true if running in a docker environment
DOCKERIZED = int(os.environ.get("DOCKERIZED", 0)) == 1
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1
sentry_sdk.init(
dsn=os.environ.get("SENTRY_DSN"),
integrations=[DjangoIntegration(), RedisIntegration()],
# If you wish to associate users to errors (assuming you are using
# django.contrib.auth) you may enable sending PII data.
traces_sample_rate=1,
send_default_pii=True,
environment="staging",
profiles_sample_rate=1.0,
)
# The AWS region to connect to.
AWS_REGION = os.environ.get("AWS_REGION")
# The AWS access key to use.
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID")
# The AWS secret access key to use.
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY")
# The optional AWS session token to use.
# AWS_SESSION_TOKEN = ""
# The name of the bucket to store files in.
AWS_S3_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME")
# How to construct S3 URLs ("auto", "path", "virtual").
AWS_S3_ADDRESSING_STYLE = "auto"
# The full URL to the S3 endpoint. Leave blank to use the default region URL.
AWS_S3_ENDPOINT_URL = os.environ.get("AWS_S3_ENDPOINT_URL", "")
# A prefix to be applied to every stored file. This will be joined to every filename using the "/" separator.
AWS_S3_KEY_PREFIX = ""
# Whether to enable authentication for stored files. If True, then generated URLs will include an authentication
# token valid for `AWS_S3_MAX_AGE_SECONDS`. If False, then generated URLs will not include an authentication token,
# and their permissions will be set to "public-read".
AWS_S3_BUCKET_AUTH = False
# How long generated URLs are valid for. This affects the expiry of authentication tokens if `AWS_S3_BUCKET_AUTH`
# is True. It also affects the "Cache-Control" header of the files.
# Important: Changing this setting will not affect existing files.
AWS_S3_MAX_AGE_SECONDS = 60 * 60 # 1 hours.
# A URL prefix to be used for generated URLs. This is useful if your bucket is served through a CDN. This setting
# cannot be used with `AWS_S3_BUCKET_AUTH`.
AWS_S3_PUBLIC_URL = ""
# If True, then files will be stored with reduced redundancy. Check the S3 documentation and make sure you
# understand the consequences before enabling.
# Important: Changing this setting will not affect existing files.
AWS_S3_REDUCED_REDUNDANCY = False
# The Content-Disposition header used when the file is downloaded. This can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_CONTENT_DISPOSITION = ""
# The Content-Language header used when the file is downloaded. This can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_CONTENT_LANGUAGE = ""
# A mapping of custom metadata for each file. Each value can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_METADATA = {}
# If True, then files will be stored using AES256 server-side encryption.
# If this is a string value (e.g., "aws:kms"), that encryption type will be used.
# Otherwise, server-side encryption is not be enabled.
# Important: Changing this setting will not affect existing files.
AWS_S3_ENCRYPT_KEY = False
# The AWS S3 KMS encryption key ID (the `SSEKMSKeyId` parameter) is set from this string if present.
# This is only relevant if AWS S3 KMS server-side encryption is enabled (above).
# AWS_S3_KMS_ENCRYPTION_KEY_ID = ""
# If True, then text files will be stored using gzip content encoding. Files will only be gzipped if their
# compressed size is smaller than their uncompressed size.
# Important: Changing this setting will not affect existing files.
AWS_S3_GZIP = True
# The signature version to use for S3 requests.
AWS_S3_SIGNATURE_VERSION = None
# If True, then files with the same name will overwrite each other. By default it's set to False to have
# extra characters appended.
AWS_S3_FILE_OVERWRITE = False
# AWS Settings End
STORAGES["default"] = {
"BACKEND": "django_s3_storage.storage.S3Storage",
}
# Enable Connection Pooling (if desired)
# DATABASES['default']['ENGINE'] = 'django_postgrespool'
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# Allow all host headers
ALLOWED_HOSTS = [
"*",
]
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
REDIS_URL = os.environ.get("REDIS_URL")
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
"CONNECTION_POOL_KWARGS": {"ssl_cert_reqs": False},
},
}
}
RQ_QUEUES = {
"default": {
"USE_REDIS_CACHE": "default",
}
}
WEB_URL = os.environ.get("WEB_URL")
PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False)
ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)
OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")
SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)
LOGGER_BASE_URL = os.environ.get("LOGGER_BASE_URL", False)
redis_url = os.environ.get("REDIS_URL")
broker_url = (
f"{redis_url}?ssl_cert_reqs={ssl.CERT_NONE.name}&ssl_ca_certs={certifi.where()}"
)
CELERY_RESULT_BACKEND = broker_url
CELERY_BROKER_URL = broker_url
GITHUB_ACCESS_TOKEN = os.environ.get("GITHUB_ACCESS_TOKEN", False)
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Unsplash Access key
UNSPLASH_ACCESS_KEY = os.environ.get("UNSPLASH_ACCESS_KEY")
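One detail worth keeping from the deleted production settings is how the Celery broker URL gains TLS query parameters. A standalone sketch of that pattern; the rediss:// guard is an added assumption (the original applied the suffix unconditionally):

import ssl

import certifi


def celery_broker_url(redis_url: str) -> str:
    # Append cert handling only for TLS Redis URLs; plain redis:// URLs pass through.
    if redis_url.startswith("rediss://"):
        return (
            f"{redis_url}?ssl_cert_reqs={ssl.CERT_NONE.name}"
            f"&ssl_ca_certs={certifi.where()}"
        )
    return redis_url


# Usage in settings:
# CELERY_BROKER_URL = CELERY_RESULT_BACKEND = celery_broker_url(os.environ["REDIS_URL"])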


@@ -1,45 +1,9 @@
-from __future__ import absolute_import
+"""Test Settings"""
 from .common import * # noqa
 DEBUG = True
-INSTALLED_APPS.append("plane.tests")
-if os.environ.get('GITHUB_WORKFLOW'):
-    DATABASES = {
-        'default': {
-            'ENGINE': 'django.db.backends.postgresql',
-            'NAME': 'github_actions',
-            'USER': 'postgres',
-            'PASSWORD': 'postgres',
-            'HOST': '127.0.0.1',
-            'PORT': '5432',
-        }
-    }
-else:
-    DATABASES = {
-        'default': {
-            'ENGINE': 'django.db.backends.postgresql',
-            'NAME': 'plane_test',
-            'USER': 'postgres',
-            'PASSWORD': 'password123',
-            'HOST': '127.0.0.1',
-            'PORT': '5432',
-        }
-    }
-REDIS_HOST = "localhost"
-REDIS_PORT = 6379
-REDIS_URL = False
-RQ_QUEUES = {
-    "default": {
-        "HOST": "localhost",
-        "PORT": 6379,
-        "DB": 0,
-        "DEFAULT_TIMEOUT": 360,
-    },
-}
-WEB_URL = "http://localhost:3000"
+# Send it in a dummy outbox
+EMAIL_BACKEND = "django.core.mail.backends.locmem.EmailBackend"
+INSTALLED_APPS.append("plane.tests",)
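The rewritten test settings route outgoing mail to Django's in-memory locmem backend, so tests can assert against the dummy outbox. A small illustrative test (subject and addresses are hypothetical):

from django.core import mail
from django.test import TestCase


class DummyOutboxTest(TestCase):
    def test_mail_is_captured_in_outbox(self):
        # With the locmem EMAIL_BACKEND, nothing leaves the process; messages
        # accumulate in django.core.mail.outbox for assertions.
        mail.send_mail("Welcome to Plane", "Hello!", "noreply@example.com", ["member@example.com"])
        self.assertEqual(len(mail.outbox), 1)
        self.assertEqual(mail.outbox[0].subject, "Welcome to Plane")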


@@ -14,6 +14,8 @@ urlpatterns = [
     path("", include("plane.web.urls")),
 ]
+if settings.ENABLE_API:
+    urlpatterns += path("api/v1/", include("plane.proxy.urls")),
 if settings.DEBUG:
     import debug_toolbar
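The urls.py hunk mounts the proxy app only when the API flag is enabled. A self-contained sketch of the resulting pattern; only the two added lines come from the diff, the rest is assumed context:

from django.conf import settings
from django.urls import include, path

urlpatterns = [
    path("", include("plane.web.urls")),
]

if settings.ENABLE_API:
    # The trailing comma makes the right-hand side a one-element tuple,
    # which the urlpatterns list is extended with.
    urlpatterns += path("api/v1/", include("plane.proxy.urls")),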


@@ -1,9 +1,9 @@
 # base requirements
-Django==4.2.5
+Django==4.2.7
 django-braces==1.15.0
 django-taggit==4.0.0
-psycopg==3.1.10
+psycopg==3.1.12
 django-oauth-toolkit==2.3.0
 mistune==3.0.1
 djangorestframework==3.14.0
@@ -17,7 +17,7 @@ django-filter==23.2
 jsonmodels==2.6.0
 djangorestframework-simplejwt==5.3.0
 sentry-sdk==1.30.0
-django-s3-storage==0.14.0
+django-storages==1.14
 django-crum==0.7.9
 django-guardian==2.4.0
 dj_rest_auth==2.2.5
@@ -30,8 +30,9 @@ openai==0.28.0
 slack-sdk==3.21.3
 celery==5.3.4
 django_celery_beat==2.5.0
-psycopg-binary==3.1.10
-psycopg-c==3.1.10
+psycopg-binary==3.1.12
+psycopg-c==3.1.12
 scout-apm==2.26.1
 openpyxl==3.1.2
 beautifulsoup4==4.12.2
+dj-database-url==2.1.0
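base.txt now carries django-storages (and dj-database-url) instead of the production-only django-s3-storage; the matching storage configuration already appears in the self-hosted settings above and condenses to roughly this sketch (values are the defaults shown there, not new recommendations):

import os

# Static files via WhiteNoise, uploads via django-storages' S3/MinIO backend
STORAGES = {
    "staticfiles": {
        "BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
    },
    "default": {
        "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
    },
}
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "access-key")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "secret-key")
AWS_STORAGE_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME", "uploads")
AWS_S3_ENDPOINT_URL = os.environ.get("AWS_S3_ENDPOINT_URL", "http://plane-minio:9000")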


@@ -1,9 +1,7 @@
 -r base.txt
-dj-database-url==2.1.0
 gunicorn==21.2.0
 whitenoise==6.5.0
-django-storages==1.14
 boto3==1.28.40
 django-anymail==10.1
 django-debug-toolbar==4.1.0


@@ -5,7 +5,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge">
 <meta name="format-detection" content="telephone=no">
 <meta name="viewport" content="width=device-width, initial-scale=1.0">
-<title>{{ Inviter }} invited you to join {{ Workspace-Name }} on Plane</title>
+<title>{{ first_name }} invited you to join {{ project_name }} on Plane</title>
<style type="text/css" emogrify="no">#outlook a { padding:0; } .ExternalClass { width:100%; } .ExternalClass, .ExternalClass p, .ExternalClass span, .ExternalClass font, .ExternalClass td, .ExternalClass div { line-height: 100%; } table td { border-collapse: collapse; mso-line-height-rule: exactly; } .editable.image { font-size: 0 !important; line-height: 0 !important; } .nl2go_preheader { display: none !important; mso-hide:all !important; mso-line-height-rule: exactly; visibility: hidden !important; line-height: 0px !important; font-size: 0px !important; } body { width:100% !important; -webkit-text-size-adjust:100%; -ms-text-size-adjust:100%; margin:0; padding:0; } img { outline:none; text-decoration:none; -ms-interpolation-mode: bicubic; } a img { border:none; } table { border-collapse:collapse; mso-table-lspace:0pt; mso-table-rspace:0pt; } th { font-weight: normal; text-align: left; } *[class="gmail-fix"] { display: none !important; } </style> <style type="text/css" emogrify="no">#outlook a { padding:0; } .ExternalClass { width:100%; } .ExternalClass, .ExternalClass p, .ExternalClass span, .ExternalClass font, .ExternalClass td, .ExternalClass div { line-height: 100%; } table td { border-collapse: collapse; mso-line-height-rule: exactly; } .editable.image { font-size: 0 !important; line-height: 0 !important; } .nl2go_preheader { display: none !important; mso-hide:all !important; mso-line-height-rule: exactly; visibility: hidden !important; line-height: 0px !important; font-size: 0px !important; } body { width:100% !important; -webkit-text-size-adjust:100%; -ms-text-size-adjust:100%; margin:0; padding:0; } img { outline:none; text-decoration:none; -ms-interpolation-mode: bicubic; } a img { border:none; } table { border-collapse:collapse; mso-table-lspace:0pt; mso-table-rspace:0pt; } th { font-weight: normal; text-align: left; } *[class="gmail-fix"] { display: none !important; } </style>
<style type="text/css" emogrify="no"> @media (max-width: 600px) { .gmx-killpill { content: ' \03D1';} } </style> <style type="text/css" emogrify="no"> @media (max-width: 600px) { .gmx-killpill { content: ' \03D1';} } </style>
<style type="text/css" emogrify="no">@media (max-width: 600px) { .gmx-killpill { content: ' \03D1';} .r0-c { box-sizing: border-box !important; text-align: center !important; valign: top !important; width: 320px !important } .r1-o { border-style: solid !important; margin: 0 auto 0 auto !important; width: 320px !important } .r2-i { background-color: #ffffff !important } .r3-c { box-sizing: border-box !important; text-align: center !important; valign: top !important; width: 100% !important } .r4-o { border-style: solid !important; margin: 0 auto 0 auto !important; margin-top: 20px !important; width: 100% !important } .r5-i { background-color: #f8f9fa !important; padding-bottom: 20px !important; padding-left: 10px !important; padding-right: 10px !important; padding-top: 20px !important } .r6-c { box-sizing: border-box !important; display: block !important; valign: top !important; width: 100% !important } .r7-o { border-style: solid !important; width: 100% !important } .r8-i { padding-left: 0px !important; padding-right: 0px !important } .r9-o { border-style: solid !important; margin: 0 auto 0 auto !important; width: 100% !important } .r10-i { padding-bottom: 35px !important; padding-top: 15px !important } .r11-c { box-sizing: border-box !important; text-align: left !important; valign: top !important; width: 100% !important } .r12-o { border-style: solid !important; margin: 0 auto 0 0 !important; width: 100% !important } .r13-i { padding-left: 20px !important; padding-right: 20px !important; padding-top: 0px !important; text-align: center !important } .r14-o { border-style: solid !important; margin: 0 auto 0 auto !important; margin-bottom: 20px !important; margin-top: 20px !important; width: 100% !important } .r15-i { text-align: center !important } .r16-r { background-color: #ffffff !important; border-color: #3f76ff !important; border-radius: 4px !important; border-width: 1px !important; box-sizing: border-box; height: initial !important; padding-bottom: 7px !important; padding-left: 20px !important; padding-right: 20px !important; padding-top: 7px !important; text-align: center !important; width: 100% !important } .r17-i { padding-bottom: 15px !important; padding-left: 20px !important; padding-right: 20px !important; padding-top: 15px !important; text-align: left !important } .r18-i { background-color: #eff2f7 !important; padding-bottom: 20px !important; padding-left: 15px !important; padding-right: 15px !important; padding-top: 20px !important } .r19-i { padding-bottom: 15px !important; padding-top: 15px !important } .r20-i { color: #3b3f44 !important; padding-bottom: 0px !important; padding-top: 0px !important; text-align: center !important } .r21-c { box-sizing: border-box !important; text-align: center !important; width: 100% !important } .r22-c { box-sizing: border-box !important; width: 100% !important } .r23-i { font-size: 0px !important; padding-bottom: 15px !important; padding-left: 65px !important; padding-right: 65px !important; padding-top: 15px !important } .r24-c { box-sizing: border-box !important; width: 32px !important } .r25-o { border-style: solid !important; margin-right: 8px !important; width: 32px !important } .r26-i { padding-bottom: 5px !important; padding-top: 5px !important } .r27-o { border-style: solid !important; margin-right: 0px !important; width: 32px !important } .r28-i { color: #3b3f44 !important; padding-bottom: 15px !important; padding-top: 15px !important; text-align: center !important } .r29-i { padding-bottom: 15px !important; padding-left: 0px 
!important; padding-right: 0px !important; padding-top: 0px !important } .r30-c { box-sizing: border-box !important; text-align: center !important; valign: top !important; width: 129px !important } .r31-o { border-style: solid !important; margin: 0 auto 0 auto !important; width: 129px !important } body { -webkit-text-size-adjust: none } .nl2go-responsive-hide { display: none } .nl2go-body-table { min-width: unset !important } .mobshow { height: auto !important; overflow: visible !important; max-height: unset !important; visibility: visible !important; border: none !important } .resp-table { display: inline-table !important } .magic-resp { display: table-cell !important } } </style> <style type="text/css" emogrify="no">@media (max-width: 600px) { .gmx-killpill { content: ' \03D1';} .r0-c { box-sizing: border-box !important; text-align: center !important; valign: top !important; width: 320px !important } .r1-o { border-style: solid !important; margin: 0 auto 0 auto !important; width: 320px !important } .r2-i { background-color: #ffffff !important } .r3-c { box-sizing: border-box !important; text-align: center !important; valign: top !important; width: 100% !important } .r4-o { border-style: solid !important; margin: 0 auto 0 auto !important; margin-top: 20px !important; width: 100% !important } .r5-i { background-color: #f8f9fa !important; padding-bottom: 20px !important; padding-left: 10px !important; padding-right: 10px !important; padding-top: 20px !important } .r6-c { box-sizing: border-box !important; display: block !important; valign: top !important; width: 100% !important } .r7-o { border-style: solid !important; width: 100% !important } .r8-i { padding-left: 0px !important; padding-right: 0px !important } .r9-o { border-style: solid !important; margin: 0 auto 0 auto !important; width: 100% !important } .r10-i { padding-bottom: 35px !important; padding-top: 15px !important } .r11-c { box-sizing: border-box !important; text-align: left !important; valign: top !important; width: 100% !important } .r12-o { border-style: solid !important; margin: 0 auto 0 0 !important; width: 100% !important } .r13-i { padding-left: 20px !important; padding-right: 20px !important; padding-top: 0px !important; text-align: center !important } .r14-o { border-style: solid !important; margin: 0 auto 0 auto !important; margin-bottom: 20px !important; margin-top: 20px !important; width: 100% !important } .r15-i { text-align: center !important } .r16-r { background-color: #ffffff !important; border-color: #3f76ff !important; border-radius: 4px !important; border-width: 1px !important; box-sizing: border-box; height: initial !important; padding-bottom: 7px !important; padding-left: 20px !important; padding-right: 20px !important; padding-top: 7px !important; text-align: center !important; width: 100% !important } .r17-i { padding-bottom: 15px !important; padding-left: 20px !important; padding-right: 20px !important; padding-top: 15px !important; text-align: left !important } .r18-i { background-color: #eff2f7 !important; padding-bottom: 20px !important; padding-left: 15px !important; padding-right: 15px !important; padding-top: 20px !important } .r19-i { padding-bottom: 15px !important; padding-top: 15px !important } .r20-i { color: #3b3f44 !important; padding-bottom: 0px !important; padding-top: 0px !important; text-align: center !important } .r21-c { box-sizing: border-box !important; text-align: center !important; width: 100% !important } .r22-c { box-sizing: border-box !important; width: 100% !important } 
.r23-i { font-size: 0px !important; padding-bottom: 15px !important; padding-left: 65px !important; padding-right: 65px !important; padding-top: 15px !important } .r24-c { box-sizing: border-box !important; width: 32px !important } .r25-o { border-style: solid !important; margin-right: 8px !important; width: 32px !important } .r26-i { padding-bottom: 5px !important; padding-top: 5px !important } .r27-o { border-style: solid !important; margin-right: 0px !important; width: 32px !important } .r28-i { color: #3b3f44 !important; padding-bottom: 15px !important; padding-top: 15px !important; text-align: center !important } .r29-i { padding-bottom: 15px !important; padding-left: 0px !important; padding-right: 0px !important; padding-top: 0px !important } .r30-c { box-sizing: border-box !important; text-align: center !important; valign: top !important; width: 129px !important } .r31-o { border-style: solid !important; margin: 0 auto 0 auto !important; width: 129px !important } body { -webkit-text-size-adjust: none } .nl2go-responsive-hide { display: none } .nl2go-body-table { min-width: unset !important } .mobshow { height: auto !important; overflow: visible !important; max-height: unset !important; visibility: visible !important; border: none !important } .resp-table { display: inline-table !important } .magic-resp { display: table-cell !important } } </style>


@@ -2,7 +2,8 @@ version: "3.8"
 x-app-env : &app-env
   environment:
-    - NGINX_PORT=${NGINX_PORT:-84}
+    - NGINX_PORT=${NGINX_PORT:-80}
+    - WEB_URL=${WEB_URL:-http://localhost}
     - DEBUG=${DEBUG:-0}
     - DJANGO_SETTINGS_MODULE=${DJANGO_SETTINGS_MODULE:-plane.settings.selfhosted}
     - NEXT_PUBLIC_ENABLE_OAUTH=${NEXT_PUBLIC_ENABLE_OAUTH:-0}
@@ -10,6 +11,11 @@ x-app-env : &app-env
     - SENTRY_DSN=${SENTRY_DSN:-""}
     - GITHUB_CLIENT_SECRET=${GITHUB_CLIENT_SECRET:-""}
     - DOCKERIZED=${DOCKERIZED:-1}
+    # BASE WEBHOOK
+    - ENABLE_WEBHOOK=${ENABLE_WEBHOOK:-1}
+    # BASE API
+    - ENABLE_API=${ENABLE_API:-1}
+    - CORS_ALLOWED_ORIGINS=${CORS_ALLOWED_ORIGINS:-http://localhost}
     # Gunicorn Workers
     - GUNICORN_WORKERS=${GUNICORN_WORKERS:-2}
     #DB SETTINGS
@@ -55,6 +61,8 @@ x-app-env : &app-env
     - BUCKET_NAME=${BUCKET_NAME:-uploads}
     - FILE_SIZE_LIMIT=${FILE_SIZE_LIMIT:-5242880}
 services:
   web:
     <<: *app-env
@@ -143,14 +151,6 @@ services:
     volumes:
       - uploads:/export
-  createbuckets:
-    <<: *app-env
-    image: minio/mc
-    entrypoint: >
-      /bin/sh -c " /usr/bin/mc config host add plane-minio http://plane-minio:9000 \$AWS_ACCESS_KEY_ID \$AWS_SECRET_ACCESS_KEY; /usr/bin/mc mb plane-minio/\$AWS_S3_BUCKET_NAME; /usr/bin/mc anonymous set download plane-minio/\$AWS_S3_BUCKET_NAME; exit 0; "
-    depends_on:
-      - plane-minio
   # Comment this if you already have a reverse proxy running
   proxy:
     <<: *app-env
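The compose file (and the .env sample below) now inject ENABLE_WEBHOOK, ENABLE_API and CORS_ALLOWED_ORIGINS into the containers. A hedged sketch of how the Django settings would typically turn these "0"/"1" strings into usable values; only ENABLE_API is confirmed by the urls.py hunk earlier, the rest mirrors the ENABLE_SIGNUP pattern already in the settings files:

import os

# "1"/"0" flags from docker-compose.yml / .env become booleans in settings
ENABLE_API = os.environ.get("ENABLE_API", "1") == "1"
ENABLE_WEBHOOK = os.environ.get("ENABLE_WEBHOOK", "1") == "1"

# Comma-separated origins become the list django-cors-headers expects (assumed consumer)
CORS_ALLOWED_ORIGINS = os.environ.get("CORS_ALLOWED_ORIGINS", "http://localhost").split(",")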


@@ -5,6 +5,7 @@ SPACE_REPLICAS=1
 API_REPLICAS=1
 NGINX_PORT=80
+WEB_URL=http://localhost
 DEBUG=0
 DJANGO_SETTINGS_MODULE=plane.settings.selfhosted
 NEXT_PUBLIC_ENABLE_OAUTH=0
@@ -12,6 +13,12 @@ NEXT_PUBLIC_DEPLOY_URL=http://localhost/spaces
 SENTRY_DSN=""
 GITHUB_CLIENT_SECRET=""
 DOCKERIZED=1
+CORS_ALLOWED_ORIGINS="http://localhost"
+# Webhook
+ENABLE_WEBHOOK=1
+# API
+ENABLE_API=1
 #DB SETTINGS
 PGHOST=plane-db


@@ -35,17 +35,6 @@ services:
       MINIO_ROOT_USER: ${AWS_ACCESS_KEY_ID}
       MINIO_ROOT_PASSWORD: ${AWS_SECRET_ACCESS_KEY}
-  createbuckets:
-    image: minio/mc
-    networks:
-      - dev_env
-    entrypoint: >
-      /bin/sh -c " /usr/bin/mc config host add plane-minio http://plane-minio:9000 \$AWS_ACCESS_KEY_ID \$AWS_SECRET_ACCESS_KEY; /usr/bin/mc mb plane-minio/\$AWS_S3_BUCKET_NAME; /usr/bin/mc anonymous set download plane-minio/\$AWS_S3_BUCKET_NAME; exit 0; "
-    env_file:
-      - .env
-    depends_on:
-      - plane-minio
   plane-db:
     container_name: plane-db
     image: postgres:15.2-alpine


@@ -108,15 +108,6 @@ services:
       MINIO_ROOT_USER: ${AWS_ACCESS_KEY_ID}
       MINIO_ROOT_PASSWORD: ${AWS_SECRET_ACCESS_KEY}
-  createbuckets:
-    image: minio/mc
-    entrypoint: >
-      /bin/sh -c " /usr/bin/mc config host add plane-minio http://plane-minio:9000 \$AWS_ACCESS_KEY_ID \$AWS_SECRET_ACCESS_KEY; /usr/bin/mc mb plane-minio/\$AWS_S3_BUCKET_NAME; /usr/bin/mc anonymous set download plane-minio/\$AWS_S3_BUCKET_NAME; exit 0; "
-    env_file:
-      - .env
-    depends_on:
-      - plane-minio
   # Comment this if you already have a reverse proxy running
   proxy:
     container_name: proxy


@@ -138,7 +138,10 @@ const MenuItem: React.FC<ICustomMenuItemProps> = (props) => {
       className={`w-full select-none truncate rounded px-1 py-1.5 text-left text-custom-text-200 hover:bg-custom-background-80 ${
         active ? "bg-custom-background-80" : ""
       } ${className}`}
-      onClick={onClick}
+      onClick={(e) => {
+        close();
+        onClick && onClick(e);
+      }}
     >
       {children}
     </button>


@@ -15,24 +15,42 @@ import { IPriorityIcon } from "./type";
 export const PriorityIcon: React.FC<IPriorityIcon> = ({
   priority,
   className = "",
+  transparentBg = false
 }) => {
-  if (!className || className === "") className = "h-3.5 w-3.5";
+  if (!className || className === "") className = "h-4 w-4";
   // Convert to lowercase for string comparison
   const lowercasePriority = priority?.toLowerCase();
+  //get priority icon
+  const getPriorityIcon = (): React.ReactNode => {
+    switch (lowercasePriority) {
+      case 'urgent':
+        return <AlertCircle className={`text-red-500 ${ transparentBg ? '' : 'p-0.5' } ${className}`} />;
+      case 'high':
+        return <SignalHigh className={`text-orange-500 ${ transparentBg ? '' : 'pl-1' } ${className}`} />;
+      case 'medium':
+        return <SignalMedium className={`text-yellow-500 ${ transparentBg ? '' : 'ml-1.5' } ${className}`} />;
+      case 'low':
+        return <SignalLow className={`text-green-500 ${ transparentBg ? '' : 'ml-2' } ${className}`} />;
+      default:
+        return <Ban className={`text-custom-text-200 ${ transparentBg ? '' : 'p-0.5' } ${className}`} />;
+    }
+  };
   return (
     <>
-      {lowercasePriority === "urgent" ? (
-        <AlertCircle className={`text-red-500 ${className}`} />
-      ) : lowercasePriority === "high" ? (
-        <SignalHigh className={`text-orange-500 ${className}`} />
-      ) : lowercasePriority === "medium" ? (
-        <SignalMedium className={`text-yellow-500 ${className}`} />
-      ) : lowercasePriority === "low" ? (
-        <SignalLow className={`text-green-500 ${className}`} />
+      { transparentBg ? (
+        getPriorityIcon()
       ) : (
-        <Ban className={`text-custom-text-200 ${className}`} />
+        <div className={`grid h-5 w-5 place-items-center rounded border items-center ${
+          lowercasePriority === "urgent"
+            ? "border-red-500/20 bg-red-500/20"
+            : "border-custom-border-200"
+        }`}
+        >
+          { getPriorityIcon() }
+        </div>
       )}
     </>
   );


@@ -7,4 +7,5 @@ export type TIssuePriorities = "urgent" | "high" | "medium" | "low" | "none";
 export interface IPriorityIcon {
   priority: TIssuePriorities | null;
   className?: string;
+  transparentBg?: boolean | false;
 }


@@ -0,0 +1,55 @@
import { TextArea } from "@plane/ui";
import { Control, Controller, FieldErrors } from "react-hook-form";
import { IApiToken } from "types/api_token";
import { IApiFormFields } from "./types";
import { Dispatch, SetStateAction } from "react";
interface IApiTokenDescription {
generatedToken: IApiToken | null | undefined;
control: Control<IApiFormFields, any>;
focusDescription: boolean;
setFocusTitle: Dispatch<SetStateAction<boolean>>;
setFocusDescription: Dispatch<SetStateAction<boolean>>;
}
export const ApiTokenDescription = ({
generatedToken,
control,
focusDescription,
setFocusTitle,
setFocusDescription,
}: IApiTokenDescription) => (
<Controller
control={control}
name="description"
render={({ field: { value, onChange } }) =>
focusDescription ? (
<TextArea
id="description"
name="description"
autoFocus={true}
onBlur={() => {
setFocusDescription(false);
}}
value={value}
defaultValue={value}
onChange={onChange}
placeholder="Description"
className="mt-3"
rows={3}
/>
) : (
<p
onClick={() => {
if (generatedToken != null) return;
setFocusTitle(false);
setFocusDescription(true);
}}
className={`${value.length === 0 ? "text-custom-text-400/60" : "text-custom-text-300"} text-lg pt-3`}
>
{value.length != 0 ? value : "Description"}
</p>
)
}
/>
);


@@ -0,0 +1,110 @@
import { Menu, Transition } from "@headlessui/react";
import { ToggleSwitch } from "@plane/ui";
import { Dispatch, Fragment, SetStateAction } from "react";
import { Control, Controller } from "react-hook-form";
import { IApiFormFields } from "./types";
interface IApiTokenExpiry {
neverExpires: boolean;
selectedExpiry: number;
setSelectedExpiry: Dispatch<SetStateAction<number>>;
setNeverExpire: Dispatch<SetStateAction<boolean>>;
renderExpiry: () => string;
control: Control<IApiFormFields, any>;
}
export const expiryOptions = [
{
title: "7 Days",
days: 7,
},
{
title: "30 Days",
days: 30,
},
{
title: "1 Month",
days: 30,
},
{
title: "3 Months",
days: 90,
},
{
title: "1 Year",
days: 365,
},
];
export const ApiTokenExpiry = ({
neverExpires,
selectedExpiry,
setSelectedExpiry,
setNeverExpire,
renderExpiry,
control,
}: IApiTokenExpiry) => (
<>
<Menu>
<p className="text-sm font-medium mb-2"> Expiration Date</p>
<Menu.Button className={"w-[40%]"} disabled={neverExpires}>
<div className="py-3 w-full font-medium px-3 flex border border-custom-border-200 rounded-md justify-center items-baseline">
<p className={`text-base ${neverExpires ? "text-custom-text-400/40" : ""}`}>
{expiryOptions[selectedExpiry].title.toLocaleLowerCase()}
</p>
<p className={`text-sm mr-auto ml-2 text-custom-text-400${neverExpires ? "/40" : ""}`}>({renderExpiry()})</p>
</div>
</Menu.Button>
<Transition
as={Fragment}
enter="transition ease-out duration-100"
enterFrom="transform opacity-0 scale-95"
enterTo="transform opacity-100 scale-100"
leave="transition ease-in duration-75"
leaveFrom="transform opacity-100 scale-100"
leaveTo="transform opacity-0 scale-95"
>
<Menu.Items className="absolute z-10 overflow-y-scroll whitespace-nowrap rounded-sm max-h-36 border origin-top-right mt-1 overflow-auto min-w-[10rem] border-custom-border-100 p-1 shadow-lg focus:outline-none bg-custom-background-100">
{expiryOptions.map((option, index) => (
<Menu.Item key={index}>
{({ active }) => (
<div className="py-1">
<button
type="button"
onClick={() => {
setSelectedExpiry(index);
}}
className={`w-full text-sm select-none truncate rounded px-3 py-1.5 text-left text-custom-text-300 hover:bg-custom-background-80 ${
active ? "bg-custom-background-80" : ""
}`}
>
{option.title}
</button>
</div>
)}
</Menu.Item>
))}
</Menu.Items>
</Transition>
</Menu>
<div className="mt-4 mb-6 flex items-center">
<span className="text-sm font-medium"> Never Expires</span>
<Controller
control={control}
name="never_expires"
render={({ field: { onChange, value } }) => (
<ToggleSwitch
className="ml-3"
value={value}
onChange={(val) => {
onChange(val);
setNeverExpire(val);
}}
size="sm"
/>
)}
/>
</div>
</>
);


@@ -0,0 +1,53 @@
import { Button } from "@plane/ui";
import useToast from "hooks/use-toast";
import { Copy } from "lucide-react";
import { Dispatch, SetStateAction } from "react";
import { IApiToken } from "types/api_token";
interface IApiTokenKeySection {
generatedToken: IApiToken | null | undefined;
renderExpiry: () => string;
setDeleteTokenModal: Dispatch<SetStateAction<boolean>>;
}
export const ApiTokenKeySection = ({ generatedToken, renderExpiry, setDeleteTokenModal }: IApiTokenKeySection) => {
const { setToastAlert } = useToast();
return generatedToken ? (
<div className={`mt-${generatedToken ? "8" : "16"}`}>
<p className="font-medium text-base pb-2">Api key created successfully</p>
<p className="text-sm pb-4 w-[80%] text-custom-text-400/60">
Save this API key somewhere safe. You will not be able to view it again once you close this page or reload this
page.
</p>
<Button variant="neutral-primary" className="py-3 w-[85%] flex justify-between items-center">
<p className="font-medium text-base">{generatedToken.token}</p>
<Copy
size={18}
color="#B9B9B9"
onClick={() => {
navigator.clipboard.writeText(generatedToken.token);
setToastAlert({
message: "The Secret key has been successfully copied to your clipboard",
type: "success",
title: "Copied to clipboard",
});
}}
/>
</Button>
<p className="mt-2 text-sm text-custom-text-400/60">
{generatedToken.expired_at ? "Expires on " + renderExpiry() : "Never Expires"}
</p>
<button
className="border py-3 px-5 text-custom-primary-100 text-sm mt-8 rounded-md border-custom-primary-100 w-fit font-medium"
onClick={(e) => {
e.preventDefault();
setDeleteTokenModal(true);
}}
>
Revoke
</button>
</div>
) : null;
};


@@ -0,0 +1,69 @@
import { Input } from "@plane/ui";
import { Dispatch, SetStateAction } from "react";
import { Control, Controller, FieldErrors } from "react-hook-form";
import { IApiToken } from "types/api_token";
import { IApiFormFields } from "./types";
interface IApiTokenTitle {
generatedToken: IApiToken | null | undefined;
errors: FieldErrors<IApiFormFields>;
control: Control<IApiFormFields, any>;
focusTitle: boolean;
setFocusTitle: Dispatch<SetStateAction<boolean>>;
setFocusDescription: Dispatch<SetStateAction<boolean>>;
}
export const ApiTokenTitle = ({
generatedToken,
errors,
control,
focusTitle,
setFocusTitle,
setFocusDescription,
}: IApiTokenTitle) => (
<Controller
control={control}
name="title"
rules={{
required: "Title is required",
maxLength: {
value: 255,
message: "Title should be less than 255 characters",
},
}}
render={({ field: { value, onChange, ref } }) =>
focusTitle ? (
<Input
id="title"
name="title"
type="text"
inputSize="md"
onBlur={() => {
setFocusTitle(false);
}}
onError={() => {
console.log("error");
}}
autoFocus={true}
value={value}
onChange={onChange}
ref={ref}
hasError={!!errors.title}
placeholder="Title"
className="resize-none text-xl w-full"
/>
) : (
<p
onClick={() => {
if (generatedToken != null) return;
setFocusDescription(false);
setFocusTitle(true);
}}
className={`${value.length === 0 ? "text-custom-text-400/60" : ""} font-medium text-[24px]`}
>
{value.length != 0 ? value : "Api Title"}
</p>
)
}
/>
);


@@ -0,0 +1,143 @@
import { useForm } from "react-hook-form";
import { addDays, renderDateFormat } from "helpers/date-time.helper";
import { IApiToken } from "types/api_token";
import { csvDownload } from "helpers/download.helper";
import { useRouter } from "next/router";
import { Dispatch, SetStateAction, useState } from "react";
import useToast from "hooks/use-toast";
import { useMobxStore } from "lib/mobx/store-provider";
import { ApiTokenService } from "services/api_token.service";
import { ApiTokenTitle } from "./ApiTokenTitle";
import { ApiTokenDescription } from "./ApiTokenDescription";
import { ApiTokenExpiry, expiryOptions } from "./ApiTokenExpiry";
import { Button } from "@plane/ui";
import { ApiTokenKeySection } from "./ApiTokenKeySection";
interface IApiTokenForm {
generatedToken: IApiToken | null | undefined;
setGeneratedToken: Dispatch<SetStateAction<IApiToken | null | undefined>>;
setDeleteTokenModal: Dispatch<SetStateAction<boolean>>;
}
const apiTokenService = new ApiTokenService();
export const ApiTokenForm = ({ generatedToken, setGeneratedToken, setDeleteTokenModal }: IApiTokenForm) => {
const [loading, setLoading] = useState<boolean>(false);
const [neverExpires, setNeverExpire] = useState<boolean>(false);
const [focusTitle, setFocusTitle] = useState<boolean>(false);
const [focusDescription, setFocusDescription] = useState<boolean>(false);
const [selectedExpiry, setSelectedExpiry] = useState<number>(1);
const { setToastAlert } = useToast();
const { theme: themStore } = useMobxStore();
const router = useRouter();
const { workspaceSlug } = router.query;
const {
control,
handleSubmit,
formState: { errors },
} = useForm({
defaultValues: {
never_expires: false,
title: "",
description: "",
},
});
const getExpiryDate = (): string | null => {
if (neverExpires === true) return null;
return addDays({ date: new Date(), days: expiryOptions[selectedExpiry].days }).toISOString();
};
function renderExpiry(): string {
return renderDateFormat(addDays({ date: new Date(), days: expiryOptions[selectedExpiry].days }), true);
}
const downloadSecretKey = (token: IApiToken) => {
const csvData = {
Label: token.label,
Description: token.description,
Expiry: renderDateFormat(token.expired_at ?? null),
"Secret Key": token.token,
};
csvDownload(csvData, `Secret-key-${Date.now()}`);
};
const generateToken = async (data: any) => {
if (!workspaceSlug) return;
setLoading(true);
await apiTokenService
.createApiToken(workspaceSlug.toString(), {
label: data.title,
description: data.description,
expired_at: getExpiryDate(),
})
.then((res) => {
setGeneratedToken(res);
downloadSecretKey(res);
setLoading(false);
})
.catch((err) => {
setToastAlert({
message: err.message,
type: "error",
title: "Error",
});
});
};
return (
<form
onSubmit={handleSubmit(generateToken, (err) => {
if (err.title) {
setFocusTitle(true);
}
})}
className={`${themStore.sidebarCollapsed ? "xl:w-[50%] lg:w-[60%] " : "w-[60%]"} mx-auto py-8`}
>
<div className="border-b border-custom-border-200 pb-4">
<ApiTokenTitle
generatedToken={generatedToken}
control={control}
errors={errors}
focusTitle={focusTitle}
setFocusTitle={setFocusTitle}
setFocusDescription={setFocusDescription}
/>
{errors.title && focusTitle && <p className=" text-red-600">{errors.title.message}</p>}
<ApiTokenDescription
generatedToken={generatedToken}
control={control}
focusDescription={focusDescription}
setFocusTitle={setFocusTitle}
setFocusDescription={setFocusDescription}
/>
</div>
{!generatedToken && (
<div className="mt-12">
<>
<ApiTokenExpiry
neverExpires={neverExpires}
selectedExpiry={selectedExpiry}
setSelectedExpiry={setSelectedExpiry}
setNeverExpire={setNeverExpire}
renderExpiry={renderExpiry}
control={control}
/>
<Button variant="primary" type="submit">
{loading ? "generating..." : "Add Api key"}
</Button>
</>
</div>
)}
<ApiTokenKeySection
generatedToken={generatedToken}
renderExpiry={renderExpiry}
setDeleteTokenModal={setDeleteTokenModal}
/>
</form>
);
};
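For clarity, the expiry handling in ApiTokenForm reduces to: never_expires sends a null expired_at, otherwise expired_at is today plus the selected option's day count. The same calculation as a small Python sketch (function and constant names are illustrative only, not part of the codebase):

from datetime import datetime, timedelta, timezone
from typing import Optional

# Mirrors expiryOptions above: 7 Days, 30 Days, 1 Month, 3 Months, 1 Year
EXPIRY_OPTION_DAYS = [7, 30, 30, 90, 365]


def get_expiry_date(selected_index: int, never_expires: bool) -> Optional[str]:
    # None means the token never expires; otherwise an ISO-8601 timestamp N days out.
    if never_expires:
        return None
    expires = datetime.now(timezone.utc) + timedelta(days=EXPIRY_OPTION_DAYS[selected_index])
    return expires.isoformat()


# Example: get_expiry_date(1, False) -> a timestamp 30 days from now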

Some files were not shown because too many files have changed in this diff.