Compare commits

...

23 Commits

Author SHA1 Message Date
pablohashescobar
89a5088f31 dev: update cors setup and local settings 2023-11-15 19:22:56 +05:30
pablohashescobar
9789757d2d dev: fix current site domain registration 2023-11-15 18:18:00 +05:30
pablohashescobar
2aee627616 dev: remove deprecated variables 2023-11-15 17:54:33 +05:30
Nikhil
26b1e9d5f1
dev: squashed migrations (#2779)
* dev: migration squash

* dev: migrations squashed for apis and webhooks

* dev: packages updated and moved dj-database-url to local settings

* dev: update package changes
2023-11-15 17:15:02 +05:30
Bavisetti Narayan
79347ec62b
feat: api webhooks (#2543)
* dev: initiate external apis

* dev: external api

* dev: external public api implementation

* dev: add prefix to all api tokens

* dev: flag to enable/disable api token api access

* dev: webhook model create and apis

* dev: webhook settings

* fix: webhook logs

* chore: removed drf spectacular

* dev: remove retry_count and fix api logging for get requests

* dev: refactor webhook logic

* fix: celery retry mechanism

* chore: event and action change

* chore: migrations changes

* dev: proxy setup for apis

* chore: changed retry time and cleanup

* chore: added issue comment and inbox issue api endpoints

* fix: migration files

* fix: added env variables

* fix: removed issue attachment from proxy

* fix: added new migration file

* fix: restricted webhook access

* chore: changed urls

* chore: fixed project serializer

* fix: set expire for api token

* fix: retrieve endpoint for api token

* feat: Api Token screens & api integration

* dev: webhook endpoint changes

* dev: add fields for webhook updates

* feat: Download Api secret key

* chore: removed BASE API URL

* feat: revoke token access

* dev: migration fixes

* feat: workspace webhooks (#2748)

* feat: workspace webhook store, services integration and rendered webhook list and create

* chore: handled webhook update and regenerate token in workspace webhooks

* feat: regenerate key and delete functionality

---------

Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>

* fix: url validation added

* fix: separated env for webhook and api

* Webhooks refactoring

* add show option for generated hook key

* Api token restructure

* webhook minor fixes

* fix build errors

* chore: improvements in file structuring

* dev: rate limiting the open apis

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: LAKHAN BAHETI <lakhanbaheti9@gmail.com>
Co-authored-by: rahulramesha <71900764+rahulramesha@users.noreply.github.com>
Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>
Co-authored-by: Nikhil <118773738+pablohashescobar@users.noreply.github.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-15 15:56:57 +05:30
Nikhil
7b965179d8
dev: update bucket script to make the bucket public (#2767)
* dev: update bucket script to make the bucket public

* dev: remove auto bucket script from docker compose
2023-11-15 15:56:08 +05:30
Nikhil
fc51ffc589
chore: user workflow (#2762)
* dev: workspace member deactivation and leave endpoints and filters

* dev: deactivated for project members

* dev: project members leave

* dev: project member check on workspace deactivation

* dev: project member queryset update and remove leave project endpoint

* dev: rename is_deactivated to is_active and user deactivation apis

* dev: check if the user is already part of workspace then make them active

* dev: workspace and project save

* dev: update project members to make them active

* dev: project invitation

* dev: automatic workspace and project member creation when a user signs in/up

* dev: fix member invites

* dev: rename deactivation variable

* dev: update project member invitation

* dev: additional permission layer for workspace

* dev: update the url for workspace invitations

* dev: remove invitation urls from users

* dev: cleanup workspace invitation workflow

* dev: workspace and project invitation
2023-11-15 15:53:16 +05:30
sabith-tu
96f6e37cc5
fix: Delete estimate popup is not closing automatically (#2777) 2023-11-15 14:08:52 +05:30
Nikhil
29774ce84a
dev: API settings (#2594)
* dev: update settings file structure and added extra settings for CORS

* dev: remove WEB_URL variable and add celery integration for sentry

* dev: aws and minio settings

* dev: add cors origins to env

* dev: update settings
2023-11-15 12:31:52 +05:30
Nikhil
8cbe9c26fc
enhancement: label sort order (#2763)
* chore: label sort ordering

* dev: ordering

* fix: sort order

* fix: save of labels

* dev: remove ordering by name

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-15 12:25:44 +05:30
Prateek Shourya
7f42566207
Fix: Custom menu item not automatically closing, affecting delete popup behavior. (#2771) 2023-11-14 23:05:30 +05:30
Ankush Deshmukh
b60237b676
Standardizing priority icons across the platform (#2776) 2023-11-14 20:52:43 +05:30
Prateek Shourya
1fe09d369f
style: text overflow fix and border color update (#2769)
* style: fix text overflow in:
* Issue activity
* Cycle and Module Select in Create Issue form
* Delete Module modal
* Join Project modal

* style: update assignee select border as per design.
2023-11-14 18:34:51 +05:30
Dakshesh Jain
b7757c6b1a
fix: bugs (#2761)
* fix: semicolon on estimate settings page

* refactor: project settings automations store implementation

* fix: active cycle stuck on infinite loading

* fix: removed delete project option from sidebar

* fix: disclosure not opening when navigating to project

* fix: clear filter not working & filter appearing even if nothing is selected

* refactor: select label store implementation

* refactor: select state store implementation
2023-11-14 18:33:01 +05:30
Anmol Singh Bhatia
1a25bacce1
style: create update view modal consistency (#2775) 2023-11-14 18:30:10 +05:30
Anmol Singh Bhatia
6797df239d
chore: no lead option added in lead select dropdown (#2774) 2023-11-14 18:29:39 +05:30
Anmol Singh Bhatia
43e7c10eb7
chore: spreadsheet layout column responsiveness (#2768) 2023-11-14 18:28:49 +05:30
Anmol Singh Bhatia
bdc9c9c2a8
chore: create update issue modal improvement (#2765) 2023-11-14 18:28:15 +05:30
Anmol Singh Bhatia
f0c72bf249
fix: breadcrumb project icon improvement (#2764) 2023-11-14 18:27:47 +05:30
sabith-tu
a8904bfc48
style: ui fixes for pages and views (#2770) 2023-11-14 18:26:50 +05:30
Nikhil
b31041726b
dev: create bucket through application (#2720) 2023-11-13 15:57:19 +05:30
Prateek Shourya
e6f947ad90
style: ui improvements and bug fixes (#2758)
* style: add transition to favorite projects dropdown.

* style: update project integration settings borders.

* style: fix text overflow issue in project views.

* fix: issue with non-functional cancel button in leave project modal.
2023-11-13 14:42:45 +05:30
Dakshesh Jain
7963993171
fix: workspace settings bugs (#2743)
* fix: double layout in exports

* fix: typo in jira email address section

* fix: workspace members not mutating

* fix: removed unused variable

* fix: workspace members can't be filtered using email

* fix: autocomplete in workspace delete

* fix: autocomplete in project delete modal

* fix: update member function in store

* fix: sidebar link not active when in github/jira

* style: margin top & icon inconsistency

* fix: typo in create workspace

* fix: workspace leave flow

* fix: redirection to delete issue

* fix: autocomplete off in jira api token

* refactor: reduced api call, added optional chaining & removed variable with low scope
2023-11-13 13:34:05 +05:30
178 changed files with 6690 additions and 2219 deletions

View File

@ -21,15 +21,15 @@ AWS_S3_BUCKET_NAME="uploads"
FILE_SIZE_LIMIT=5242880
# GPT settings
OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
OPENAI_API_KEY="sk-" # add your openai key here
GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# Settings related to Docker
DOCKERIZED=1
DOCKERIZED=1 # deprecated
# set to 1 If using the pre-configured minio setup
USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80

View File

@ -44,7 +44,6 @@ ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PIP_DISABLE_PIP_VERSION_CHECK=1
ENV DJANGO_SETTINGS_MODULE plane.settings.production
ENV DOCKERIZED 1
WORKDIR /code

View File

@ -31,12 +31,10 @@ AWS_S3_BUCKET_NAME="uploads"
FILE_SIZE_LIMIT=5242880
# GPT settings
OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
OPENAI_API_KEY="sk-" # add your openai key here
GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# Settings related to Docker
DOCKERIZED=1
# set to 1 If using the pre-configured minio setup
USE_MINIO=1
@ -115,15 +113,16 @@ AWS_S3_BUCKET_NAME="uploads"
FILE_SIZE_LIMIT=5242880
# GPT settings
OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
OPENAI_API_KEY="sk-" # add your openai key here
GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# Settings related to Docker
DOCKERIZED=1 # Deprecated
# Github
GITHUB_CLIENT_SECRET="" # For fetching release notes
# Settings related to Docker
DOCKERIZED=1
# set to 1 If using the pre-configured minio setup
USE_MINIO=1

View File

@ -1,7 +1,7 @@
# Backend
# Debug value for api server use it as 0 for production use
DEBUG=0
DJANGO_SETTINGS_MODULE="plane.settings.production"
CORS_ALLOWED_ORIGINS=""
# Error logs
SENTRY_DSN=""
@ -38,9 +38,9 @@ AWS_S3_BUCKET_NAME="uploads"
FILE_SIZE_LIMIT=5242880
# GPT settings
OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
OPENAI_API_KEY="sk-" # add your openai key here
GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# Github
GITHUB_CLIENT_SECRET="" # For fetching release notes
@ -70,6 +70,6 @@ ENABLE_MAGIC_LINK_LOGIN="0"
# Email redirections and minio domain settings
WEB_URL="http://localhost"
# Gunicorn Workers
GUNICORN_WORKERS=2
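
Note: the diff above drops the WEB_URL-based setup in favor of an explicit CORS_ALLOWED_ORIGINS variable. A minimal sketch of how a Django settings module could consume it, assuming the value is a comma-separated list of origins (this parsing helper is illustrative, not the project's actual code):

import os

# Hypothetical parsing of the new CORS_ALLOWED_ORIGINS env variable into
# the list format expected by django-cors-headers.
cors_allowed_origins = [
    origin.strip()
    for origin in os.environ.get("CORS_ALLOWED_ORIGINS", "").split(",")
    if origin.strip()
]

if cors_allowed_origins:
    CORS_ALLOWED_ORIGINS = cors_allowed_origins
else:
    # Assumption: fall back to allowing all origins for local development only.
    CORS_ALLOW_ALL_ORIGINS = True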

View File

@ -0,0 +1,83 @@
import os, sys
import boto3
import json
from botocore.exceptions import ClientError
sys.path.append("/code")
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "plane.settings.production")
import django
django.setup()
def set_bucket_public_policy(s3_client, bucket_name):
public_policy = {
"Version": "2012-10-17",
"Statement": [{
"Effect": "Allow",
"Principal": "*",
"Action": ["s3:GetObject"],
"Resource": [f"arn:aws:s3:::{bucket_name}/*"]
}]
}
try:
s3_client.put_bucket_policy(
Bucket=bucket_name,
Policy=json.dumps(public_policy)
)
print(f"Public read access policy set for bucket '{bucket_name}'.")
except ClientError as e:
print(f"Error setting public read access policy: {e}")
def create_bucket():
try:
from django.conf import settings
# Create a session using the credentials from Django settings
session = boto3.session.Session(
aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
)
# Create an S3 client using the session
s3_client = session.client('s3', endpoint_url=settings.AWS_S3_ENDPOINT_URL)
bucket_name = settings.AWS_STORAGE_BUCKET_NAME
print("Checking bucket...")
# Check if the bucket exists
s3_client.head_bucket(Bucket=bucket_name)
# If head_bucket does not raise an exception, the bucket exists
print(f"Bucket '{bucket_name}' already exists.")
set_bucket_public_policy(s3_client, bucket_name)
except ClientError as e:
error_code = int(e.response['Error']['Code'])
bucket_name = settings.AWS_STORAGE_BUCKET_NAME
if error_code == 404:
# Bucket does not exist, create it
print(f"Bucket '{bucket_name}' does not exist. Creating bucket...")
try:
s3_client.create_bucket(Bucket=bucket_name)
print(f"Bucket '{bucket_name}' created successfully.")
set_bucket_public_policy(s3_client, bucket_name)
except ClientError as create_error:
print(f"Failed to create bucket: {create_error}")
elif error_code == 403:
# Access to the bucket is forbidden
print(f"Access to the bucket '{bucket_name}' is forbidden. Check permissions.")
else:
# Another ClientError occurred
print(f"Failed to check bucket: {e}")
except Exception as ex:
# Handle any other exception
print(f"An error occurred: {ex}")
if __name__ == "__main__":
create_bucket()

View File

@ -5,5 +5,7 @@ python manage.py migrate
# Create a Default User
python bin/user_script.py
# Create the default bucket
python bin/bucket_script.py
exec gunicorn -w $GUNICORN_WORKERS -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:8000 --max-requests 1200 --max-requests-jitter 1000 --access-logfile -

View File

@ -1,2 +1,17 @@
from .workspace import WorkSpaceBasePermission, WorkSpaceAdminPermission, WorkspaceEntityPermission, WorkspaceViewerPermission
from .project import ProjectBasePermission, ProjectEntityPermission, ProjectMemberPermission, ProjectLitePermission
from .workspace import (
WorkSpaceBasePermission,
WorkspaceOwnerPermission,
WorkSpaceAdminPermission,
WorkspaceEntityPermission,
WorkspaceViewerPermission,
WorkspaceUserPermission,
)
from .project import (
ProjectBasePermission,
ProjectEntityPermission,
ProjectMemberPermission,
ProjectLitePermission,
)

View File

@ -13,14 +13,15 @@ Guest = 5
class ProjectBasePermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug, member=request.user
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
).exists()
## Only workspace owners or admins can create the projects
@ -29,6 +30,7 @@ class ProjectBasePermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
role__in=[Admin, Member],
is_active=True,
).exists()
## Only Project Admins can update project attributes
@ -37,19 +39,21 @@ class ProjectBasePermission(BasePermission):
member=request.user,
role=Admin,
project_id=view.project_id,
is_active=True,
).exists()
class ProjectMemberPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return ProjectMember.objects.filter(
workspace__slug=view.workspace_slug, member=request.user
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
).exists()
## Only workspace owners or admins can create the projects
if request.method == "POST":
@ -57,6 +61,7 @@ class ProjectMemberPermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
role__in=[Admin, Member],
is_active=True,
).exists()
## Only Project Admins can update project attributes
@ -65,12 +70,12 @@ class ProjectMemberPermission(BasePermission):
member=request.user,
role__in=[Admin, Member],
project_id=view.project_id,
is_active=True,
).exists()
class ProjectEntityPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
@ -80,6 +85,7 @@ class ProjectEntityPermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
project_id=view.project_id,
is_active=True,
).exists()
## Only project members or admins can create and edit the project attributes
@ -88,17 +94,18 @@ class ProjectEntityPermission(BasePermission):
member=request.user,
role__in=[Admin, Member],
project_id=view.project_id,
is_active=True,
).exists()
class ProjectLitePermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
return ProjectMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
project_id=view.project_id,
is_active=True,
).exists()

View File

@ -32,15 +32,31 @@ class WorkSpaceBasePermission(BasePermission):
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
is_active=True,
).exists()
# allow only owner to delete the workspace
if request.method == "DELETE":
return WorkspaceMember.objects.filter(
member=request.user, workspace__slug=view.workspace_slug, role=Owner
member=request.user,
workspace__slug=view.workspace_slug,
role=Owner,
is_active=True,
).exists()
class WorkspaceOwnerPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
role=Owner,
).exists()
class WorkSpaceAdminPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
@ -50,6 +66,7 @@ class WorkSpaceAdminPermission(BasePermission):
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
is_active=True,
).exists()
@ -63,12 +80,14 @@ class WorkspaceEntityPermission(BasePermission):
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
).exists()
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
is_active=True,
).exists()
@ -78,5 +97,20 @@ class WorkspaceViewerPermission(BasePermission):
return False
return WorkspaceMember.objects.filter(
member=request.user, workspace__slug=view.workspace_slug, role__gte=10
member=request.user,
workspace__slug=view.workspace_slug,
role__gte=10,
is_active=True,
).exists()
class WorkspaceUserPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
is_active=True,
).exists()

View File

@ -71,7 +71,7 @@ from .module import (
ModuleFavoriteSerializer,
)
from .api_token import APITokenSerializer
from .api import APITokenSerializer, APITokenReadSerializer
from .integration import (
IntegrationSerializer,
@ -100,3 +100,5 @@ from .analytic import AnalyticViewSerializer
from .notification import NotificationSerializer
from .exporter import ExporterHistorySerializer
from .webhook import WebhookSerializer, WebhookLogSerializer

View File

@ -0,0 +1,31 @@
from .base import BaseSerializer
from plane.db.models import APIToken, APIActivityLog
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = "__all__"
read_only_fields = [
"token",
"expired_at",
"created_at",
"updated_at",
"workspace",
"user",
]
class APITokenReadSerializer(BaseSerializer):
class Meta:
model = APIToken
exclude = ('token',)
class APIActivityLogSerializer(BaseSerializer):
class Meta:
model = APIActivityLog
fields = "__all__"

View File

@ -1,14 +0,0 @@
from .base import BaseSerializer
from plane.db.models import APIToken
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = [
"label",
"user",
"user_type",
"workspace",
"created_at",
]

View File

@ -103,13 +103,16 @@ class ProjectListSerializer(DynamicBaseSerializer):
members = serializers.SerializerMethodField()
def get_members(self, obj):
project_members = ProjectMember.objects.filter(project_id=obj.id).values(
project_members = ProjectMember.objects.filter(
project_id=obj.id,
is_active=True,
).values(
"id",
"member_id",
"member__display_name",
"member__avatar",
)
return project_members
return list(project_members)
class Meta:
model = Project

View File

@ -0,0 +1,30 @@
# Third party imports
from rest_framework import serializers
# Module imports
from .base import DynamicBaseSerializer
from plane.db.models import Webhook, WebhookLog
from plane.db.models.webhook import validate_domain, validate_schema
class WebhookSerializer(DynamicBaseSerializer):
url = serializers.URLField(validators=[validate_schema, validate_domain])
class Meta:
model = Webhook
fields = "__all__"
read_only_fields = [
"workspace",
"secret_key",
]
class WebhookLogSerializer(DynamicBaseSerializer):
class Meta:
model = WebhookLog
fields = "__all__"
read_only_fields = [
"workspace",
"webhook"
]

View File

@ -19,6 +19,12 @@ from .state import urlpatterns as state_urls
from .user import urlpatterns as user_urls
from .views import urlpatterns as view_urls
from .workspace import urlpatterns as workspace_urls
from .api import urlpatterns as api_urls
from .webhook import urlpatterns as webhook_urls
# Django imports
from django.conf import settings
urlpatterns = [
@ -43,4 +49,6 @@ urlpatterns = [
*user_urls,
*view_urls,
*workspace_urls,
*api_urls,
*webhook_urls,
]

View File

@ -0,0 +1,17 @@
from django.urls import path
from plane.api.views import ApiTokenEndpoint
urlpatterns = [
# API Tokens
path(
"workspaces/<str:slug>/api-tokens/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
path(
"workspaces/<str:slug>/api-tokens/<uuid:pk>/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
## End API Tokens
]
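
For reference, a hedged example of exercising these API token routes with the requests library. The base URL, workspace slug, and bearer token are placeholders; the payload fields follow APITokenSerializer, and the raw token is only returned on creation:

import requests

BASE_URL = "http://localhost:8000/api"          # placeholder host/prefix
SLUG = "my-workspace"                           # placeholder workspace slug
HEADERS = {"Authorization": "Bearer <access>"}  # placeholder credentials

# Create a workspace-scoped token; label and description are optional.
created = requests.post(
    f"{BASE_URL}/workspaces/{SLUG}/api-tokens/",
    json={"label": "ci-token", "description": "Token for CI"},
    headers=HEADERS,
).json()

# List tokens; reads use APITokenReadSerializer, which excludes the token field.
tokens = requests.get(
    f"{BASE_URL}/workspaces/{SLUG}/api-tokens/",
    headers=HEADERS,
).json()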

View File

@ -2,17 +2,16 @@ from django.urls import path
from plane.api.views import (
ProjectViewSet,
InviteProjectEndpoint,
ProjectInvitationsViewset,
ProjectMemberViewSet,
ProjectMemberInvitationsViewset,
ProjectMemberUserEndpoint,
ProjectJoinEndpoint,
AddTeamToProjectEndpoint,
ProjectUserViewsEndpoint,
ProjectIdentifierEndpoint,
ProjectFavoritesViewSet,
LeaveProjectEndpoint,
ProjectPublicCoverImagesEndpoint,
UserProjectInvitationsViewset,
)
@ -45,13 +44,48 @@ urlpatterns = [
name="project-identifiers",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invite/",
InviteProjectEndpoint.as_view(),
name="invite-project",
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/",
ProjectInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/<uuid:pk>/",
ProjectInvitationsViewset.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-member-invite",
),
path(
"users/me/invitations/projects/",
UserProjectInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="user-project-invitations",
),
path(
"workspaces/<str:slug>/projects/join/",
ProjectJoinEndpoint.as_view(),
name="project-join",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/",
ProjectMemberViewSet.as_view({"get": "list", "post": "create"}),
ProjectMemberViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-member",
),
path(
@ -66,30 +100,19 @@ urlpatterns = [
name="project-member",
),
path(
"workspaces/<str:slug>/projects/join/",
ProjectJoinEndpoint.as_view(),
name="project-join",
"workspaces/<str:slug>/projects/<uuid:project_id>/members/leave/",
ProjectMemberViewSet.as_view(
{
"post": "leave",
}
),
name="project-member",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/team-invite/",
AddTeamToProjectEndpoint.as_view(),
name="projects",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/",
ProjectMemberInvitationsViewset.as_view({"get": "list"}),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/<uuid:pk>/",
ProjectMemberInvitationsViewset.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-views/",
ProjectUserViewsEndpoint.as_view(),
@ -119,11 +142,6 @@ urlpatterns = [
),
name="project-favorite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/leave/",
LeaveProjectEndpoint.as_view(),
name="leave-project",
),
path(
"project-covers/",
ProjectPublicCoverImagesEndpoint.as_view(),

View File

@ -9,15 +9,10 @@ from plane.api.views import (
ChangePasswordEndpoint,
## End User
## Workspaces
UserWorkspaceInvitationsEndpoint,
UserWorkSpacesEndpoint,
JoinWorkspaceEndpoint,
UserWorkspaceInvitationsEndpoint,
UserWorkspaceInvitationEndpoint,
UserActivityGraphEndpoint,
UserIssueCompletedGraphEndpoint,
UserWorkspaceDashboardEndpoint,
UserProjectInvitationsViewset,
## End Workspaces
)
@ -26,7 +21,11 @@ urlpatterns = [
path(
"users/me/",
UserEndpoint.as_view(
{"get": "retrieve", "patch": "partial_update", "delete": "destroy"}
{
"get": "retrieve",
"patch": "partial_update",
"delete": "deactivate",
}
),
name="users",
),
@ -65,23 +64,6 @@ urlpatterns = [
UserWorkSpacesEndpoint.as_view(),
name="user-workspace",
),
# user workspace invitations
path(
"users/me/invitations/workspaces/",
UserWorkspaceInvitationsEndpoint.as_view({"get": "list", "post": "create"}),
name="user-workspace-invitations",
),
# user workspace invitation
path(
"users/me/invitations/<uuid:pk>/",
UserWorkspaceInvitationEndpoint.as_view(
{
"get": "retrieve",
}
),
name="user-workspace-invitation",
),
# user join workspace
# User Graphs
path(
"users/me/workspaces/<str:slug>/activity-graph/",
@ -99,15 +81,4 @@ urlpatterns = [
name="user-workspace-dashboard",
),
## End User Graph
path(
"users/me/invitations/workspaces/<str:slug>/<uuid:pk>/join/",
JoinWorkspaceEndpoint.as_view(),
name="user-join-workspace",
),
# user project invitations
path(
"users/me/invitations/projects/",
UserProjectInvitationsViewset.as_view({"get": "list", "post": "create"}),
name="user-project-invitations",
),
]

View File

@ -0,0 +1,31 @@
from django.urls import path
from plane.api.views import (
WebhookEndpoint,
WebhookLogsEndpoint,
WebhookSecretRegenerateEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/webhooks/",
WebhookEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhooks/<uuid:pk>/",
WebhookEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhooks/<uuid:pk>/regenerate/",
WebhookSecretRegenerateEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhook-logs/<uuid:webhook_id>/",
WebhookLogsEndpoint.as_view(),
name="webhooks",
),
]
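
Likewise, a sketch of driving the new webhook routes (same placeholder base URL and credentials as the API token example above; the url value must satisfy the validate_schema and validate_domain validators on WebhookSerializer):

import requests

BASE_URL = "http://localhost:8000/api"          # placeholder host/prefix
SLUG = "my-workspace"                           # placeholder workspace slug
HEADERS = {"Authorization": "Bearer <access>"}  # placeholder credentials

# Register a webhook target; secret_key is generated server-side and read-only.
webhook = requests.post(
    f"{BASE_URL}/workspaces/{SLUG}/webhooks/",
    json={"url": "https://example.com/plane-webhook"},
    headers=HEADERS,
).json()

# Rotate the signing secret for an existing webhook.
requests.post(
    f"{BASE_URL}/workspaces/{SLUG}/webhooks/{webhook['id']}/regenerate/",
    headers=HEADERS,
)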

View File

@ -2,8 +2,9 @@ from django.urls import path
from plane.api.views import (
UserWorkspaceInvitationsViewSet,
WorkSpaceViewSet,
InviteWorkspaceEndpoint,
WorkspaceJoinEndpoint,
WorkSpaceMemberViewSet,
WorkspaceInvitationsViewset,
WorkspaceMemberUserEndpoint,
@ -17,7 +18,6 @@ from plane.api.views import (
WorkspaceUserProfileEndpoint,
WorkspaceUserProfileIssuesEndpoint,
WorkspaceLabelsEndpoint,
LeaveWorkspaceEndpoint,
)
@ -49,14 +49,14 @@ urlpatterns = [
),
name="workspace",
),
path(
"workspaces/<str:slug>/invite/",
InviteWorkspaceEndpoint.as_view(),
name="invite-workspace",
),
path(
"workspaces/<str:slug>/invitations/",
WorkspaceInvitationsViewset.as_view({"get": "list"}),
WorkspaceInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="workspace-invitations",
),
path(
@ -69,6 +69,23 @@ urlpatterns = [
),
name="workspace-invitations",
),
# user workspace invitations
path(
"users/me/workspaces/invitations/",
UserWorkspaceInvitationsViewSet.as_view(
{
"get": "list",
"post": "create",
},
),
name="user-workspace-invitations",
),
path(
"workspaces/<str:slug>/invitations/<uuid:pk>/join/",
WorkspaceJoinEndpoint.as_view(),
name="workspace-join",
),
# user join workspace
path(
"workspaces/<str:slug>/members/",
WorkSpaceMemberViewSet.as_view({"get": "list"}),
@ -85,6 +102,15 @@ urlpatterns = [
),
name="workspace-member",
),
path(
"workspaces/<str:slug>/members/leave/",
WorkSpaceMemberViewSet.as_view(
{
"post": "leave",
},
),
name="leave-workspace-members",
),
path(
"workspaces/<str:slug>/teams/",
TeamMemberViewSet.as_view(
@ -168,9 +194,4 @@ urlpatterns = [
WorkspaceLabelsEndpoint.as_view(),
name="workspace-labels",
),
path(
"workspaces/<str:slug>/members/leave/",
LeaveWorkspaceEndpoint.as_view(),
name="leave-workspace-members",
),
]

View File

@ -2,10 +2,8 @@ from .project import (
ProjectViewSet,
ProjectMemberViewSet,
UserProjectInvitationsViewset,
InviteProjectEndpoint,
ProjectInvitationsViewset,
AddTeamToProjectEndpoint,
ProjectMemberInvitationsViewset,
ProjectMemberInviteDetailViewSet,
ProjectIdentifierEndpoint,
ProjectJoinEndpoint,
ProjectUserViewsEndpoint,
@ -14,7 +12,6 @@ from .project import (
ProjectDeployBoardViewSet,
ProjectDeployBoardPublicSettingsEndpoint,
WorkspaceProjectDeployBoardEndpoint,
LeaveProjectEndpoint,
ProjectPublicCoverImagesEndpoint,
)
from .user import (
@ -26,19 +23,17 @@ from .user import (
from .oauth import OauthEndpoint
from .base import BaseAPIView, BaseViewSet
from .base import BaseAPIView, BaseViewSet, WebhookMixin
from .workspace import (
WorkSpaceViewSet,
UserWorkSpacesEndpoint,
WorkSpaceAvailabilityCheckEndpoint,
InviteWorkspaceEndpoint,
JoinWorkspaceEndpoint,
WorkspaceJoinEndpoint,
WorkSpaceMemberViewSet,
TeamMemberViewSet,
WorkspaceInvitationsViewset,
UserWorkspaceInvitationsEndpoint,
UserWorkspaceInvitationEndpoint,
UserWorkspaceInvitationsViewSet,
UserLastProjectWithWorkspaceEndpoint,
WorkspaceMemberUserEndpoint,
WorkspaceMemberUserViewsEndpoint,
@ -51,7 +46,6 @@ from .workspace import (
WorkspaceUserProfileEndpoint,
WorkspaceUserProfileIssuesEndpoint,
WorkspaceLabelsEndpoint,
LeaveWorkspaceEndpoint,
)
from .state import StateViewSet
from .view import (
@ -121,7 +115,7 @@ from .module import (
ModuleFavoriteViewSet,
)
from .api_token import ApiTokenEndpoint
from .api import ApiTokenEndpoint
from .integration import (
WorkspaceIntegrationViewSet,
@ -178,3 +172,5 @@ from .notification import (
from .exporter import ExportIssuesEndpoint
from .config import ConfigurationEndpoint
from .webhook import WebhookEndpoint, WebhookLogsEndpoint, WebhookSecretRegenerateEndpoint

View File

@ -0,0 +1,78 @@
# Python import
from uuid import uuid4
# Third party
from rest_framework.response import Response
from rest_framework import status
# Module import
from .base import BaseAPIView
from plane.db.models import APIToken, Workspace
from plane.api.serializers import APITokenSerializer, APITokenReadSerializer
from plane.api.permissions import WorkspaceOwnerPermission
class ApiTokenEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug):
label = request.data.get("label", str(uuid4().hex))
description = request.data.get("description", "")
workspace = Workspace.objects.get(slug=slug)
expired_at = request.data.get("expired_at", None)
# Check the user type
user_type = 1 if request.user.is_bot else 0
api_token = APIToken.objects.create(
label=label,
description=description,
user=request.user,
workspace=workspace,
user_type=user_type,
expired_at=expired_at,
)
serializer = APITokenSerializer(api_token)
# Token will be only visible while creating
return Response(
serializer.data,
status=status.HTTP_201_CREATED,
)
def get(self, request, slug, pk=None):
if pk == None:
api_tokens = APIToken.objects.filter(
user=request.user, workspace__slug=slug
)
serializer = APITokenReadSerializer(api_tokens, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
else:
api_tokens = APIToken.objects.get(
user=request.user, workspace__slug=slug, pk=pk
)
serializer = APITokenReadSerializer(api_tokens)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, slug, pk):
api_token = APIToken.objects.get(
workspace__slug=slug,
user=request.user,
pk=pk,
)
api_token.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, pk):
api_token = APIToken.objects.get(
workspace__slug=slug,
user=request.user,
pk=pk,
)
serializer = APITokenSerializer(api_token, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

View File

@ -1,47 +0,0 @@
# Python import
from uuid import uuid4
# Third party
from rest_framework.response import Response
from rest_framework import status
from sentry_sdk import capture_exception
# Module import
from .base import BaseAPIView
from plane.db.models import APIToken
from plane.api.serializers import APITokenSerializer
class ApiTokenEndpoint(BaseAPIView):
def post(self, request):
label = request.data.get("label", str(uuid4().hex))
workspace = request.data.get("workspace", False)
if not workspace:
return Response(
{"error": "Workspace is required"}, status=status.HTTP_200_OK
)
api_token = APIToken.objects.create(
label=label, user=request.user, workspace_id=workspace
)
serializer = APITokenSerializer(api_token)
# Token will be only vissible while creating
return Response(
{"api_token": serializer.data, "token": api_token.token},
status=status.HTTP_201_CREATED,
)
def get(self, request):
api_tokens = APIToken.objects.filter(user=request.user)
serializer = APITokenSerializer(api_tokens, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, pk):
api_token = APIToken.objects.get(pk=pk)
api_token.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@ -33,7 +33,7 @@ from plane.bgtasks.forgot_password_task import forgot_password
class RequestEmailVerificationEndpoint(BaseAPIView):
def get(self, request):
token = RefreshToken.for_user(request.user).access_token
current_site = settings.WEB_URL
current_site = request.META.get('HTTP_ORIGIN')
email_verification.delay(
request.user.first_name, request.user.email, token, current_site
)
@ -76,7 +76,7 @@ class ForgotPasswordEndpoint(BaseAPIView):
uidb64 = urlsafe_base64_encode(smart_bytes(user.id))
token = PasswordResetTokenGenerator().make_token(user)
current_site = settings.WEB_URL
current_site = request.META.get('HTTP_ORIGIN')
forgot_password.delay(
user.first_name, user.email, uidb64, token, current_site

View File

@ -4,7 +4,7 @@ import random
import string
import json
import requests
from requests.exceptions import RequestException
# Django imports
from django.utils import timezone
from django.core.exceptions import ValidationError
@ -22,8 +22,13 @@ from sentry_sdk import capture_exception, capture_message
# Module imports
from . import BaseAPIView
from plane.db.models import User
from plane.api.serializers import UserSerializer
from plane.db.models import (
User,
WorkspaceMemberInvite,
WorkspaceMember,
ProjectMemberInvite,
ProjectMember,
)
from plane.settings.redis import redis_instance
from plane.bgtasks.magic_link_code_task import magic_link
@ -86,35 +91,93 @@ class SignUpEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
return Response(data, status=status.HTTP_200_OK)
@ -176,33 +239,92 @@ class SignInEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
access_token, refresh_token = get_tokens_for_user(user)
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
"event_type": "SIGN_IN",
},
)
)
except RequestException as e:
capture_exception(e)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
access_token, refresh_token = get_tokens_for_user(user)
return Response(data, status=status.HTTP_200_OK)
@ -287,7 +409,8 @@ class MagicSignInGenerateEndpoint(BaseAPIView):
ri.set(key, json.dumps(value), ex=expiry)
current_site = settings.WEB_URL
current_site = request.META.get('HTTP_ORIGIN')
magic_link.delay(email, key, token, current_site)
return Response({"key": key}, status=status.HTTP_200_OK)
@ -319,27 +442,37 @@ class MagicSignInEndpoint(BaseAPIView):
if str(token) == str(user_token):
if User.objects.filter(email=email).exists():
user = User.objects.get(email=email)
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
try:
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
else:
user = User.objects.create(
email=email,
@ -347,27 +480,30 @@ class MagicSignInEndpoint(BaseAPIView):
password=make_password(uuid.uuid4().hex),
is_password_autoset=True,
)
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
try:
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
"event_type": "SIGN_UP",
},
)
)
except RequestException as e:
capture_exception(e)
user.last_active = timezone.now()
user.last_login_time = timezone.now()
@ -376,6 +512,63 @@ class MagicSignInEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,

View File

@ -1,5 +1,6 @@
# Python imports
import zoneinfo
import json
# Django imports
from django.urls import resolve
@ -7,6 +8,7 @@ from django.conf import settings
from django.utils import timezone
from django.db import IntegrityError
from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.core.serializers.json import DjangoJSONEncoder
# Third part imports
from rest_framework import status
@ -22,6 +24,7 @@ from django_filters.rest_framework import DjangoFilterBackend
# Module imports
from plane.utils.paginator import BasePaginator
from plane.bgtasks.webhook_task import send_webhook
class TimezoneMixin:
@ -29,6 +32,7 @@ class TimezoneMixin:
This enables timezone conversion according
to the user set timezone
"""
def initial(self, request, *args, **kwargs):
super().initial(request, *args, **kwargs)
if request.user.is_authenticated:
@ -37,8 +41,28 @@ class TimezoneMixin:
timezone.deactivate()
class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
class WebhookMixin:
webhook_event = None
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
if (
self.webhook_event
and self.request.method in ["POST", "PATCH", "DELETE"]
and response.status_code in [200, 201, 204]
):
send_webhook.delay(
event=self.webhook_event,
event_data=json.dumps(response.data, cls=DjangoJSONEncoder),
action=self.request.method,
slug=self.workspace_slug,
)
return response
class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
model = None
permission_classes = [
@ -60,7 +84,7 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
except Exception as e:
capture_exception(e)
raise APIException("Please check the view", status.HTTP_400_BAD_REQUEST)
def handle_exception(self, exc):
"""
Handle any exception that occurs, by returning an appropriate response,
@ -71,18 +95,30 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
return response
except Exception as e:
if isinstance(e, IntegrityError):
return Response({"error": "The payload is not valid"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "The payload is not valid"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ValidationError):
return Response({"error": "Please provide valid detail"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Please provide valid detail"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ObjectDoesNotExist):
model_name = str(exc).split(" matching query does not exist.")[0]
return Response({"error": f"{model_name} does not exist."}, status=status.HTTP_404_NOT_FOUND)
return Response(
{"error": f"{model_name} does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
if isinstance(e, KeyError):
capture_exception(e)
return Response({"error": f"key {e} does not exist"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": f"key {e} does not exist"},
status=status.HTTP_400_BAD_REQUEST,
)
print(e) if settings.DEBUG else print("Server Error")
capture_exception(e)
@ -99,8 +135,8 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
print(
f"{request.method} - {request.get_full_path()} of Queries: {len(connection.queries)}"
)
return response
return response
except Exception as exc:
response = self.handle_exception(exc)
return exc
@ -120,7 +156,6 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
permission_classes = [
IsAuthenticated,
]
@ -139,7 +174,6 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
queryset = backend().filter_queryset(self.request, queryset, self)
return queryset
def handle_exception(self, exc):
"""
Handle any exception that occurs, by returning an appropriate response,
@ -150,19 +184,29 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
return response
except Exception as e:
if isinstance(e, IntegrityError):
return Response({"error": "The payload is not valid"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "The payload is not valid"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ValidationError):
return Response({"error": "Please provide valid detail"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Please provide valid detail"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ObjectDoesNotExist):
model_name = str(exc).split(" matching query does not exist.")[0]
return Response({"error": f"{model_name} does not exist."}, status=status.HTTP_404_NOT_FOUND)
return Response(
{"error": f"{model_name} does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
if isinstance(e, KeyError):
return Response({"error": f"key {e} does not exist"}, status=status.HTTP_400_BAD_REQUEST)
print(e) if settings.DEBUG else print("Server Error")
if settings.DEBUG:
print(e)
capture_exception(e)
return Response({"error": "Something went wrong please try again later"}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)

View File

@ -23,7 +23,7 @@ from rest_framework import status
from sentry_sdk import capture_exception
# Module imports
from . import BaseViewSet, BaseAPIView
from . import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
CycleSerializer,
CycleIssueSerializer,
@ -48,9 +48,10 @@ from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot
class CycleViewSet(BaseViewSet):
class CycleViewSet(WebhookMixin, BaseViewSet):
serializer_class = CycleSerializer
model = Cycle
webhook_event = "cycle"
permission_classes = [
ProjectEntityPermission,
]
@ -499,10 +500,10 @@ class CycleViewSet(BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT)
class CycleIssueViewSet(BaseViewSet):
class CycleIssueViewSet(WebhookMixin, BaseViewSet):
serializer_class = CycleIssueSerializer
model = CycleIssue
webhook_event = "cycle"
permission_classes = [
ProjectEntityPermission,
]

View File

@ -64,9 +64,7 @@ class InboxViewSet(BaseViewSet):
serializer.save(project_id=self.kwargs.get("project_id"))
def destroy(self, request, slug, project_id, pk):
inbox = Inbox.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
inbox = Inbox.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
# Handle default inbox delete
if inbox.is_default:
return Response(
@ -128,9 +126,7 @@ class InboxIssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@ -150,7 +146,6 @@ class InboxIssueViewSet(BaseViewSet):
status=status.HTTP_200_OK,
)
def create(self, request, slug, project_id, inbox_id):
if not request.data.get("issue", {}).get("name", False):
return Response(
@ -198,7 +193,7 @@ class InboxIssueViewSet(BaseViewSet):
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
# create an inbox issue
InboxIssue.objects.create(
@ -216,10 +211,20 @@ class InboxIssueViewSet(BaseViewSet):
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
# Get the project member
project_member = ProjectMember.objects.get(workspace__slug=slug, project_id=project_id, member=request.user)
project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=request.user,
is_active=True,
)
# Only project members admins and created_by users can access this endpoint
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot edit inbox issues"}, status=status.HTTP_400_BAD_REQUEST)
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
):
return Response(
{"error": "You cannot edit inbox issues"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get issue data
issue_data = request.data.pop("issue", False)
@ -230,11 +235,13 @@ class InboxIssueViewSet(BaseViewSet):
)
# Only allow guests and viewers to edit name and description
if project_member.role <= 10:
# viewers and guests since only viewers and guests
# viewers and guests since only viewers and guests
issue_data = {
"name": issue_data.get("name", issue.name),
"description_html": issue_data.get("description_html", issue.description_html),
"description": issue_data.get("description", issue.description)
"description_html": issue_data.get(
"description_html", issue.description_html
),
"description": issue_data.get("description", issue.description),
}
issue_serializer = IssueCreateSerializer(
@ -256,7 +263,7 @@ class InboxIssueViewSet(BaseViewSet):
IssueSerializer(current_instance).data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
issue_serializer.save()
else:
@ -307,7 +314,9 @@ class InboxIssueViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
else:
return Response(InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK)
return Response(
InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
)
def retrieve(self, request, slug, project_id, inbox_id, pk):
inbox_issue = InboxIssue.objects.get(
@ -324,15 +333,27 @@ class InboxIssueViewSet(BaseViewSet):
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
# Get the project member
project_member = ProjectMember.objects.get(workspace__slug=slug, project_id=project_id, member=request.user)
project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=request.user,
is_active=True,
)
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot delete inbox issue"}, status=status.HTTP_400_BAD_REQUEST)
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
):
return Response(
{"error": "You cannot delete inbox issue"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check the issue status
if inbox_issue.status in [-2, -1, 0, 2]:
# Delete the issue also
Issue.objects.filter(workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id).delete()
Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id
).delete()
inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
@ -347,7 +368,10 @@ class InboxIssuePublicViewSet(BaseViewSet):
]
def get_queryset(self):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=self.kwargs.get("slug"), project_id=self.kwargs.get("project_id"))
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
)
if project_deploy_board is not None:
return self.filter_queryset(
super()
@ -363,9 +387,14 @@ class InboxIssuePublicViewSet(BaseViewSet):
return InboxIssue.objects.none()
def list(self, request, slug, project_id, inbox_id):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
filters = issue_filters(request.query_params, "GET")
issues = (
@ -392,9 +421,7 @@ class InboxIssuePublicViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@ -415,9 +442,14 @@ class InboxIssuePublicViewSet(BaseViewSet):
)
def create(self, request, slug, project_id, inbox_id):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
if not request.data.get("issue", {}).get("name", False):
return Response(
@ -465,7 +497,7 @@ class InboxIssuePublicViewSet(BaseViewSet):
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
# create an inbox issue
InboxIssue.objects.create(
@ -479,34 +511,41 @@ class InboxIssuePublicViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
# Get the project member
if str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot edit inbox issues"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "You cannot edit inbox issues"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get issue data
issue_data = request.data.pop("issue", False)
issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
)
# Only the name and description fields can be updated since the public board is used by viewers and guests
issue_data = {
"name": issue_data.get("name", issue.name),
"description_html": issue_data.get("description_html", issue.description_html),
"description": issue_data.get("description", issue.description)
"description_html": issue_data.get(
"description_html", issue.description_html
),
"description": issue_data.get("description", issue.description),
}
issue_serializer = IssueCreateSerializer(
issue, data=issue_data, partial=True
)
issue_serializer = IssueCreateSerializer(issue, data=issue_data, partial=True)
if issue_serializer.is_valid():
current_instance = issue
@ -523,17 +562,22 @@ class InboxIssuePublicViewSet(BaseViewSet):
IssueSerializer(current_instance).data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
issue_serializer.save()
return Response(issue_serializer.data, status=status.HTTP_200_OK)
return Response(issue_serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def retrieve(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
@ -544,16 +588,24 @@ class InboxIssuePublicViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
def destroy(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
if str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot delete inbox issue"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "You cannot delete inbox issue"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@ -33,7 +33,7 @@ from rest_framework.permissions import AllowAny, IsAuthenticated
from sentry_sdk import capture_exception
# Module imports
from . import BaseViewSet, BaseAPIView
from . import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
IssueCreateSerializer,
IssueActivitySerializer,
@ -84,7 +84,7 @@ from plane.utils.grouper import group_results
from plane.utils.issue_filters import issue_filters
class IssueViewSet(BaseViewSet):
class IssueViewSet(WebhookMixin, BaseViewSet):
def get_serializer_class(self):
return (
IssueCreateSerializer
@ -93,6 +93,7 @@ class IssueViewSet(BaseViewSet):
)
model = Issue
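# Event name passed to WebhookMixin when webhook deliveries are dispatched for this viewset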
webhook_event = "issue"
permission_classes = [
ProjectEntityPermission,
]
@ -594,9 +595,10 @@ class IssueActivityEndpoint(BaseAPIView):
return Response(result_list, status=status.HTTP_200_OK)
class IssueCommentViewSet(BaseViewSet):
class IssueCommentViewSet(WebhookMixin, BaseViewSet):
serializer_class = IssueCommentSerializer
model = IssueComment
webhook_event = "issue-comment"
permission_classes = [
ProjectLitePermission,
]
@ -623,6 +625,7 @@ class IssueCommentViewSet(BaseViewSet):
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
member_id=self.request.user.id,
is_active=True,
)
)
)
@ -753,8 +756,8 @@ class LabelViewSet(BaseViewSet):
.select_related("project")
.select_related("workspace")
.select_related("parent")
.order_by("name")
.distinct()
.order_by("sort_order")
)
@ -1254,7 +1257,11 @@ class IssueSubscriberViewSet(BaseViewSet):
def list(self, request, slug, project_id, issue_id):
members = (
ProjectMember.objects.filter(workspace__slug=slug, project_id=project_id)
ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
.annotate(
is_subscribed=Exists(
IssueSubscriber.objects.filter(
@ -1498,6 +1505,7 @@ class IssueCommentPublicViewSet(BaseViewSet):
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
member_id=self.request.user.id,
is_active=True,
)
)
)
@ -1538,6 +1546,7 @@ class IssueCommentPublicViewSet(BaseViewSet):
if not ProjectMember.objects.filter(
project_id=project_id,
member=request.user,
is_active=True,
).exists():
# Add the user for workspace tracking
_ = ProjectPublicMember.objects.get_or_create(
@ -1651,6 +1660,7 @@ class IssueReactionPublicViewSet(BaseViewSet):
if not ProjectMember.objects.filter(
project_id=project_id,
member=request.user,
is_active=True,
).exists():
# Add the user for workspace tracking
_ = ProjectPublicMember.objects.get_or_create(
@ -1744,7 +1754,9 @@ class CommentReactionPublicViewSet(BaseViewSet):
project_id=project_id, comment_id=comment_id, actor=request.user
)
if not ProjectMember.objects.filter(
project_id=project_id, member=request.user
project_id=project_id,
member=request.user,
is_active=True,
).exists():
# Add the user for workspace tracking
_ = ProjectPublicMember.objects.get_or_create(
@ -1829,7 +1841,9 @@ class IssueVotePublicViewSet(BaseViewSet):
)
# Add the user for workspace tracking
if not ProjectMember.objects.filter(
project_id=project_id, member=request.user
project_id=project_id,
member=request.user,
is_active=True,
).exists():
_ = ProjectPublicMember.objects.get_or_create(
project_id=project_id,

View File

@ -15,7 +15,7 @@ from rest_framework import status
from sentry_sdk import capture_exception
# Module imports
from . import BaseViewSet
from . import BaseViewSet, WebhookMixin
from plane.api.serializers import (
ModuleWriteSerializer,
ModuleSerializer,
@ -41,11 +41,12 @@ from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot
class ModuleViewSet(BaseViewSet):
class ModuleViewSet(WebhookMixin, BaseViewSet):
model = Module
permission_classes = [
ProjectEntityPermission,
]
webhook_event = "module"
def get_serializer_class(self):
return (

View File

@ -85,7 +85,10 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
# Created issues
if type == "created":
if WorkspaceMember.objects.filter(
workspace__slug=slug, member=request.user, role__lt=15
workspace__slug=slug,
member=request.user,
role__lt=15,
is_active=True,
).exists():
notifications = Notification.objects.none()
else:
@ -255,7 +258,10 @@ class MarkAllReadNotificationViewSet(BaseViewSet):
# Created issues
if type == "created":
if WorkspaceMember.objects.filter(
workspace__slug=slug, member=request.user, role__lt=15
workspace__slug=slug,
member=request.user,
role__lt=15,
is_active=True,
).exists():
notifications = Notification.objects.none()
else:

View File

@ -2,6 +2,7 @@
import uuid
import requests
import os
from requests.exceptions import RequestException
# Django imports
from django.utils import timezone
@ -20,7 +21,14 @@ from google.oauth2 import id_token
from google.auth.transport import requests as google_auth_request
# Module imports
from plane.db.models import SocialLoginConnection, User
from plane.db.models import (
SocialLoginConnection,
User,
WorkspaceMemberInvite,
WorkspaceMember,
ProjectMemberInvite,
ProjectMember,
)
from plane.api.serializers import UserSerializer
from .base import BaseAPIView
@ -168,7 +176,6 @@ class OauthEndpoint(BaseAPIView):
)
## Login Case
if not user.is_active:
return Response(
{
@ -185,12 +192,61 @@ class OauthEndpoint(BaseAPIView):
user.is_email_verified = email_verified
user.save()
access_token, refresh_token = get_tokens_for_user(user)
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
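# Accepted project invites also grant workspace membership; roles outside 5/10/15 fall back to 15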
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
SocialLoginConnection.objects.update_or_create(
medium=medium,
@ -201,26 +257,36 @@ class OauthEndpoint(BaseAPIView):
"last_login_at": timezone.now(),
},
)
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
try:
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
"event_type": "SIGN_IN",
},
)
)
except RequestException as e:
capture_exception(e)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_200_OK)
except User.DoesNotExist:
@ -260,31 +326,85 @@ class OauthEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
"event_type": "SIGN_UP",
},
)
)
except RequestException as e:
capture_exception(e)
SocialLoginConnection.objects.update_or_create(
medium=medium,
@ -295,4 +415,10 @@ class OauthEndpoint(BaseAPIView):
"last_login_at": timezone.now(),
},
)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_201_CREATED)

View File

@ -17,16 +17,16 @@ from django.db.models import (
)
from django.core.validators import validate_email
from django.conf import settings
from django.utils import timezone
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
from rest_framework import serializers
from rest_framework.permissions import AllowAny
from sentry_sdk import capture_exception
# Module imports
from .base import BaseViewSet, BaseAPIView
from .base import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
ProjectSerializer,
ProjectListSerializer,
@ -39,6 +39,7 @@ from plane.api.serializers import (
)
from plane.api.permissions import (
WorkspaceUserPermission,
ProjectBasePermission,
ProjectEntityPermission,
ProjectMemberPermission,
@ -58,13 +59,6 @@ from plane.db.models import (
ProjectIdentifier,
Module,
Cycle,
CycleFavorite,
ModuleFavorite,
PageFavorite,
IssueViewFavorite,
Page,
IssueAssignee,
ModuleMember,
Inbox,
ProjectDeployBoard,
IssueProperty,
@ -73,9 +67,10 @@ from plane.db.models import (
from plane.bgtasks.project_invitation_task import project_invitation
class ProjectViewSet(BaseViewSet):
class ProjectViewSet(WebhookMixin, BaseViewSet):
serializer_class = ProjectSerializer
model = Project
webhook_event = "project"
permission_classes = [
ProjectBasePermission,
@ -110,12 +105,15 @@ class ProjectViewSet(BaseViewSet):
member=self.request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
is_active=True,
)
)
)
.annotate(
total_members=ProjectMember.objects.filter(
project_id=OuterRef("id"), member__is_bot=False
project_id=OuterRef("id"),
member__is_bot=False,
is_active=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@ -137,6 +135,7 @@ class ProjectViewSet(BaseViewSet):
member_role=ProjectMember.objects.filter(
project_id=OuterRef("pk"),
member_id=self.request.user.id,
is_active=True,
).values("role")
)
.annotate(
@ -157,6 +156,7 @@ class ProjectViewSet(BaseViewSet):
member=request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
is_active=True,
).values("sort_order")
projects = (
self.get_queryset()
@ -166,6 +166,7 @@ class ProjectViewSet(BaseViewSet):
"project_projectmember",
queryset=ProjectMember.objects.filter(
workspace__slug=slug,
is_active=True,
).select_related("member"),
)
)
@ -345,66 +346,104 @@ class ProjectViewSet(BaseViewSet):
)
class InviteProjectEndpoint(BaseAPIView):
class ProjectInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def post(self, request, slug, project_id):
email = request.data.get("email", False)
role = request.data.get("role", False)
# Check if email is provided
if not email:
return Response(
{"error": "Email is required"}, status=status.HTTP_400_BAD_REQUEST
)
validate_email(email)
# Check if user is already a member of workspace
if ProjectMember.objects.filter(
project_id=project_id,
member__email=email,
member__is_bot=False,
).exists():
return Response(
{"error": "User is already member of workspace"},
status=status.HTTP_400_BAD_REQUEST,
)
user = User.objects.filter(email=email).first()
if user is None:
token = jwt.encode(
{"email": email, "timestamp": datetime.now().timestamp()},
settings.SECRET_KEY,
algorithm="HS256",
)
project_invitation_obj = ProjectMemberInvite.objects.create(
email=email.strip().lower(),
project_id=project_id,
token=token,
role=role,
)
domain = settings.WEB_URL
project_invitation.delay(email, project_id, token, domain)
return Response(
{
"message": "Email sent successfully",
"id": project_invitation_obj.id,
},
status=status.HTTP_200_OK,
)
project_member = ProjectMember.objects.create(
member=user, project_id=project_id, role=role
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.select_related("project")
.select_related("workspace", "workspace__owner")
)
_ = IssueProperty.objects.create(user=user, project_id=project_id)
def create(self, request, slug, project_id):
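# Bulk-invite users to the project by email; invites requesting a role above the inviter's own are rejected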
emails = request.data.get("emails", [])
# Check if email is provided
if not emails:
return Response(
{"error": "Emails are required"}, status=status.HTTP_400_BAD_REQUEST
)
requesting_user = ProjectMember.objects.get(
workspace__slug=slug, project_id=project_id, member_id=request.user.id
)
# Check if any invited user has a higher role
if len(
[
email
for email in emails
if int(email.get("role", 10)) > requesting_user.role
]
):
return Response(
{"error": "You cannot invite a user with higher role"},
status=status.HTTP_400_BAD_REQUEST,
)
workspace = Workspace.objects.get(slug=slug)
project_invitations = []
for email in emails:
try:
validate_email(email.get("email"))
project_invitations.append(
ProjectMemberInvite(
email=email.get("email").strip().lower(),
project_id=project_id,
workspace_id=workspace.id,
token=jwt.encode(
{
"email": email,
"timestamp": datetime.now().timestamp(),
},
settings.SECRET_KEY,
algorithm="HS256",
),
role=email.get("role", 10),
created_by=request.user,
)
)
except ValidationError:
return Response(
{
"error": f"Invalid email - {email} provided a valid email address is required to send the invite"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Create workspace member invite
project_invitations = ProjectMemberInvite.objects.bulk_create(
project_invitations, batch_size=10, ignore_conflicts=True
)
current_site = request.META.get("HTTP_ORIGIN")
# Send invitations
for invitation in project_invitations:
project_invitation.delay(
invitation.email,
project_id,
invitation.token,
current_site,
request.user.email,
)
return Response(
ProjectMemberSerializer(project_member).data, status=status.HTTP_200_OK
{
"message": "Email sent successfully",
},
status=status.HTTP_200_OK,
)
@ -420,40 +459,134 @@ class UserProjectInvitationsViewset(BaseViewSet):
.select_related("workspace", "workspace__owner", "project")
)
def create(self, request):
invitations = request.data.get("invitations")
project_invitations = ProjectMemberInvite.objects.filter(
pk__in=invitations, accepted=True
def create(self, request, slug):
project_ids = request.data.get("project_ids", [])
# Get the workspace user role
workspace_member = WorkspaceMember.objects.get(
member=request.user,
workspace__slug=slug,
is_active=True,
)
workspace_role = workspace_member.role
workspace = workspace_member.workspace
ProjectMember.objects.bulk_create(
[
ProjectMember(
project=invitation.project,
workspace=invitation.project.workspace,
project_id=project_id,
member=request.user,
role=invitation.role,
role=15 if workspace_role >= 15 else 10,
workspace=workspace,
created_by=request.user,
)
for invitation in project_invitations
]
for project_id in project_ids
],
ignore_conflicts=True,
)
IssueProperty.objects.bulk_create(
[
ProjectMember(
project=invitation.project,
workspace=invitation.project.workspace,
IssueProperty(
project_id=project_id,
user=request.user,
workspace=workspace,
created_by=request.user,
)
for invitation in project_invitations
]
for project_id in project_ids
],
ignore_conflicts=True,
)
# Delete joined project invites
project_invitations.delete()
return Response(
{"message": "Projects joined successfully"},
status=status.HTTP_201_CREATED,
)
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectJoinEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request, slug, project_id, pk):
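# Respond to a project invitation; on acceptance the workspace and project memberships are created or reactivated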
project_invite = ProjectMemberInvite.objects.get(
pk=pk,
project_id=project_id,
workspace__slug=slug,
)
email = request.data.get("email", "")
if email == "" or project_invite.email != email:
return Response(
{"error": "You do not have permission to join the project"},
status=status.HTTP_403_FORBIDDEN,
)
if project_invite.responded_at is None:
project_invite.accepted = request.data.get("accepted", False)
project_invite.responded_at = timezone.now()
project_invite.save()
if project_invite.accepted:
# Check if the user account exists
user = User.objects.filter(email=email).first()
# Check if user is a part of workspace
workspace_member = WorkspaceMember.objects.filter(
workspace__slug=slug, member=user
).first()
# Add them to the workspace
if workspace_member is None:
_ = WorkspaceMember.objects.create(
workspace_id=project_invite.workspace_id,
member=user,
role=15 if project_invite.role >= 15 else project_invite.role,
)
else:
# Otherwise reactivate the existing membership
workspace_member.is_active = True
workspace_member.save()
# If the user was already a member of the project, reactivate the membership
project_member = ProjectMember.objects.filter(
workspace_id=project_invite.workspace_id, member=user
).first()
if project_member is None:
# Create a Project Member
_ = ProjectMember.objects.create(
workspace_id=project_invite.workspace_id,
member=user,
role=project_invite.role,
)
else:
project_member.is_active = True
project_member.role = project_invite.role
project_member.save()
return Response(
{"message": "Project Invitation Accepted"},
status=status.HTTP_200_OK,
)
return Response(
{"message": "Project Invitation was not accepted"},
status=status.HTTP_200_OK,
)
return Response(
{"error": "You have already responded to the invitation request"},
status=status.HTTP_400_BAD_REQUEST,
)
def get(self, request, slug, project_id, pk):
project_invitation = ProjectMemberInvite.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
serializer = ProjectMemberInviteSerializer(project_invitation)
return Response(serializer.data, status=status.HTTP_200_OK)
class ProjectMemberViewSet(BaseViewSet):
@ -475,6 +608,7 @@ class ProjectMemberViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(member__is_bot=False)
.select_related("project")
.select_related("member")
.select_related("workspace", "workspace__owner")
@ -542,13 +676,17 @@ class ProjectMemberViewSet(BaseViewSet):
def list(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
member=request.user, workspace__slug=slug, project_id=project_id
member=request.user,
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
project_members = ProjectMember.objects.filter(
project_id=project_id,
workspace__slug=slug,
member__is_bot=False,
is_active=True,
).select_related("project", "member", "workspace")
if project_member.role > 10:
@ -559,7 +697,10 @@ class ProjectMemberViewSet(BaseViewSet):
def partial_update(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
pk=pk,
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
if request.user.id == project_member.member_id:
return Response(
@ -568,7 +709,10 @@ class ProjectMemberViewSet(BaseViewSet):
)
# Check while updating user roles
requested_project_member = ProjectMember.objects.get(
project_id=project_id, workspace__slug=slug, member=request.user
project_id=project_id,
workspace__slug=slug,
member=request.user,
is_active=True,
)
if (
"role" in request.data
@ -591,54 +735,66 @@ class ProjectMemberViewSet(BaseViewSet):
def destroy(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
workspace__slug=slug,
project_id=project_id,
pk=pk,
member__is_bot=False,
is_active=True,
)
# check requesting user role
requesting_project_member = ProjectMember.objects.get(
workspace__slug=slug, member=request.user, project_id=project_id
workspace__slug=slug,
member=request.user,
project_id=project_id,
is_active=True,
)
# User cannot remove himself
if str(project_member.id) == str(requesting_project_member.id):
return Response(
{
"error": "You cannot remove yourself from the workspace. Please use leave workspace"
},
status=status.HTTP_400_BAD_REQUEST,
)
# User cannot deactivate higher role
if requesting_project_member.role < project_member.role:
return Response(
{"error": "You cannot remove a user having role higher than yourself"},
{"error": "You cannot remove a user having role higher than you"},
status=status.HTTP_400_BAD_REQUEST,
)
# Remove all favorites
ProjectFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
CycleFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
ModuleFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
PageFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
IssueViewFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
# Also remove issue from issue assigned
IssueAssignee.objects.filter(
workspace__slug=slug,
project_id=project_id,
assignee=project_member.member,
).delete()
project_member.is_active = False
project_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
# Remove if module member
ModuleMember.objects.filter(
def leave(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=project_member.member,
).delete()
# Delete owned Pages
Page.objects.filter(
workspace__slug=slug,
project_id=project_id,
owned_by=project_member.member,
).delete()
project_member.delete()
member=request.user,
is_active=True,
)
# Check if the leaving user is the only admin of the project
if (
project_member.role == 20
and not ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
role=20,
is_active=True,
).count()
> 1
):
return Response(
{
"error": "You cannot leave the project as your the only admin of the project you will have to either delete the project or create an another admin",
},
status=status.HTTP_400_BAD_REQUEST,
)
# Deactivate the user
project_member.is_active = False
project_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@ -691,46 +847,6 @@ class AddTeamToProjectEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED)
class ProjectMemberInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.select_related("project")
.select_related("workspace", "workspace__owner")
)
class ProjectMemberInviteDetailViewSet(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.select_related("project")
.select_related("workspace", "workspace__owner")
)
class ProjectIdentifierEndpoint(BaseAPIView):
permission_classes = [
ProjectBasePermission,
@ -774,59 +890,14 @@ class ProjectIdentifierEndpoint(BaseAPIView):
)
class ProjectJoinEndpoint(BaseAPIView):
def post(self, request, slug):
project_ids = request.data.get("project_ids", [])
# Get the workspace user role
workspace_member = WorkspaceMember.objects.get(
member=request.user, workspace__slug=slug
)
workspace_role = workspace_member.role
workspace = workspace_member.workspace
ProjectMember.objects.bulk_create(
[
ProjectMember(
project_id=project_id,
member=request.user,
role=20
if workspace_role >= 15
else (15 if workspace_role == 10 else workspace_role),
workspace=workspace,
created_by=request.user,
)
for project_id in project_ids
],
ignore_conflicts=True,
)
IssueProperty.objects.bulk_create(
[
IssueProperty(
project_id=project_id,
user=request.user,
workspace=workspace,
created_by=request.user,
)
for project_id in project_ids
],
ignore_conflicts=True,
)
return Response(
{"message": "Projects joined successfully"},
status=status.HTTP_201_CREATED,
)
class ProjectUserViewsEndpoint(BaseAPIView):
def post(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project_member = ProjectMember.objects.filter(
member=request.user, project=project
member=request.user,
project=project,
is_active=True,
).first()
if project_member is None:
@ -850,7 +921,10 @@ class ProjectUserViewsEndpoint(BaseAPIView):
class ProjectMemberUserEndpoint(BaseAPIView):
def get(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
project_id=project_id, workspace__slug=slug, member=request.user
project_id=project_id,
workspace__slug=slug,
member=request.user,
is_active=True,
)
serializer = ProjectMemberSerializer(project_member)
@ -983,39 +1057,6 @@ class WorkspaceProjectDeployBoardEndpoint(BaseAPIView):
return Response(projects, status=status.HTTP_200_OK)
class LeaveProjectEndpoint(BaseAPIView):
permission_classes = [
ProjectLitePermission,
]
def delete(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
workspace__slug=slug,
member=request.user,
project_id=project_id,
)
# Only Admin case
if (
project_member.role == 20
and ProjectMember.objects.filter(
workspace__slug=slug,
role=20,
project_id=project_id,
).count()
== 1
):
return Response(
{
"error": "You cannot leave the project since you are the only admin of the project you should delete the project"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the member from workspace
project_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectPublicCoverImagesEndpoint(BaseAPIView):
permission_classes = [
AllowAny,

View File

@ -13,13 +13,7 @@ from plane.api.serializers import (
)
from plane.api.views.base import BaseViewSet, BaseAPIView
from plane.db.models import (
User,
Workspace,
WorkspaceMemberInvite,
Issue,
IssueActivity,
)
from plane.db.models import User, IssueActivity, WorkspaceMember
from plane.utils.paginator import BasePaginator
@ -41,10 +35,28 @@ class UserEndpoint(BaseViewSet):
serialized_data = UserMeSettingsSerializer(request.user).data
return Response(serialized_data, status=status.HTTP_200_OK)
def deactivate(self, request):
# Check whether the user is still active in any workspace
user = self.get_object()
if WorkspaceMember.objects.filter(
member=request.user, is_active=True
).exists():
return Response(
{
"error": "User cannot deactivate account as user is active in some workspaces"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Deactivate the user
user.is_active = False
user.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class UpdateUserOnBoardedEndpoint(BaseAPIView):
def patch(self, request):
user = User.objects.get(pk=request.user.id)
user = User.objects.get(pk=request.user.id, is_active=True)
user.is_onboarded = request.data.get("is_onboarded", False)
user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)
@ -52,7 +64,7 @@ class UpdateUserOnBoardedEndpoint(BaseAPIView):
class UpdateUserTourCompletedEndpoint(BaseAPIView):
def patch(self, request):
user = User.objects.get(pk=request.user.id)
user = User.objects.get(pk=request.user.id, is_active=True)
user.is_tour_completed = request.data.get("is_tour_completed", False)
user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)

View File

@ -0,0 +1,130 @@
# Django imports
from django.db import IntegrityError
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.db.models import Webhook, WebhookLog, Workspace
from plane.db.models.webhook import generate_token
from .base import BaseAPIView
from plane.api.permissions import WorkspaceOwnerPermission
from plane.api.serializers import WebhookSerializer, WebhookLogSerializer
class WebhookEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug):
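# Register a new webhook for the workspace; a duplicate URL is reported through the IntegrityError branch below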
workspace = Workspace.objects.get(slug=slug)
try:
serializer = WebhookSerializer(data=request.data)
if serializer.is_valid():
serializer.save(workspace_id=workspace.id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError as e:
if "already exists" in str(e):
return Response(
{"error": "URL already exists for the workspace"},
status=status.HTTP_410_GONE,
)
raise
def get(self, request, slug, pk=None):
if pk is None:
webhooks = Webhook.objects.filter(workspace__slug=slug)
serializer = WebhookSerializer(
webhooks,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
many=True,
)
return Response(serializer.data, status=status.HTTP_200_OK)
else:
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
serializer = WebhookSerializer(
webhook,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, pk):
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
serializer = WebhookSerializer(
webhook,
data=request.data,
partial=True,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, pk):
webhook = Webhook.objects.get(pk=pk, workspace__slug=slug)
webhook.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class WebhookSecretRegenerateEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug, pk):
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
webhook.secret_key = generate_token()
webhook.save()
serializer = WebhookSerializer(webhook)
return Response(serializer.data, status=status.HTTP_200_OK)
class WebhookLogsEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def get(self, request, slug, webhook_id):
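# Return all logged delivery attempts recorded for the given webhook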
webhook_logs = WebhookLog.objects.filter(
workspace__slug=slug, webhook_id=webhook_id
)
serializer = WebhookLogSerializer(webhook_logs, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)

View File

@ -2,7 +2,6 @@
import jwt
from datetime import date, datetime
from dateutil.relativedelta import relativedelta
from uuid import uuid4
# Django imports
from django.db import IntegrityError
@ -26,13 +25,11 @@ from django.db.models import (
)
from django.db.models.functions import ExtractWeek, Cast, ExtractDay
from django.db.models.fields import DateField
from django.contrib.auth.hashers import make_password
# Third party modules
from rest_framework import status
from rest_framework.response import Response
from rest_framework.permissions import AllowAny
from sentry_sdk import capture_exception
from rest_framework.permissions import AllowAny, IsAuthenticated
# Module imports
from plane.api.serializers import (
@ -59,14 +56,6 @@ from plane.db.models import (
IssueActivity,
Issue,
WorkspaceTheme,
IssueAssignee,
ProjectFavorite,
CycleFavorite,
ModuleMember,
ModuleFavorite,
PageFavorite,
Page,
IssueViewFavorite,
IssueLink,
IssueAttachment,
IssueSubscriber,
@ -106,7 +95,9 @@ class WorkSpaceViewSet(BaseViewSet):
def get_queryset(self):
member_count = (
WorkspaceMember.objects.filter(
workspace=OuterRef("id"), member__is_bot=False
workspace=OuterRef("id"),
member__is_bot=False,
is_active=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@ -181,7 +172,9 @@ class UserWorkSpacesEndpoint(BaseAPIView):
def get(self, request):
member_count = (
WorkspaceMember.objects.filter(
workspace=OuterRef("id"), member__is_bot=False
workspace=OuterRef("id"),
member__is_bot=False,
is_active=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@ -227,23 +220,40 @@ class WorkSpaceAvailabilityCheckEndpoint(BaseAPIView):
return Response({"status": not workspace}, status=status.HTTP_200_OK)
class InviteWorkspaceEndpoint(BaseAPIView):
class WorkspaceInvitationsViewset(BaseViewSet):
"""Endpoint for creating, listing and deleting workspaces"""
serializer_class = WorkSpaceMemberInviteSerializer
model = WorkspaceMemberInvite
permission_classes = [
WorkSpaceAdminPermission,
]
def post(self, request, slug):
emails = request.data.get("emails", False)
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "workspace__owner", "created_by")
)
def create(self, request, slug):
emails = request.data.get("emails", [])
# Check if email is provided
if not emails or not len(emails):
if not emails:
return Response(
{"error": "Emails are required"}, status=status.HTTP_400_BAD_REQUEST
)
# check for role level
# check for role level of the requesting user
requesting_user = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
# Check if any invited user has an higher role
if len(
[
email
@ -256,15 +266,17 @@ class InviteWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# Get the workspace object
workspace = Workspace.objects.get(slug=slug)
# Check if user is already a member of workspace
workspace_members = WorkspaceMember.objects.filter(
workspace_id=workspace.id,
member__email__in=[email.get("email") for email in emails],
is_active=True,
).select_related("member", "workspace", "workspace__owner")
if len(workspace_members):
if workspace_members:
return Response(
{
"error": "Some users are already member of workspace",
@ -302,35 +314,20 @@ class InviteWorkspaceEndpoint(BaseAPIView):
},
status=status.HTTP_400_BAD_REQUEST,
)
WorkspaceMemberInvite.objects.bulk_create(
# Create workspace member invite
workspace_invitations = WorkspaceMemberInvite.objects.bulk_create(
workspace_invitations, batch_size=10, ignore_conflicts=True
)
workspace_invitations = WorkspaceMemberInvite.objects.filter(
email__in=[email.get("email") for email in emails]
).select_related("workspace")
# create the user if signup is disabled
if settings.DOCKERIZED and not settings.ENABLE_SIGNUP:
_ = User.objects.bulk_create(
[
User(
username=str(uuid4().hex),
email=invitation.email,
password=make_password(uuid4().hex),
is_password_autoset=True,
)
for invitation in workspace_invitations
],
batch_size=100,
)
current_site = request.META.get("HTTP_ORIGIN")
# Send invitations
for invitation in workspace_invitations:
workspace_invitation.delay(
invitation.email,
workspace.id,
invitation.token,
settings.WEB_URL,
current_site,
request.user.email,
)
@ -341,11 +338,19 @@ class InviteWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
def destroy(self, request, slug, pk):
workspace_member_invite = WorkspaceMemberInvite.objects.get(
pk=pk, workspace__slug=slug
)
workspace_member_invite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class JoinWorkspaceEndpoint(BaseAPIView):
class WorkspaceJoinEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
"""Invitation response endpoint the user can respond to the invitation"""
def post(self, request, slug, pk):
workspace_invite = WorkspaceMemberInvite.objects.get(
@ -354,12 +359,14 @@ class JoinWorkspaceEndpoint(BaseAPIView):
email = request.data.get("email", "")
# Check the email
if email == "" or workspace_invite.email != email:
return Response(
{"error": "You do not have permission to join the workspace"},
status=status.HTTP_403_FORBIDDEN,
)
# If already responded then return error
if workspace_invite.responded_at is None:
workspace_invite.accepted = request.data.get("accepted", False)
workspace_invite.responded_at = timezone.now()
@ -371,12 +378,23 @@ class JoinWorkspaceEndpoint(BaseAPIView):
# If the user is present then create the workspace member
if user is not None:
WorkspaceMember.objects.create(
workspace=workspace_invite.workspace,
member=user,
role=workspace_invite.role,
)
# If the user was already a member of the workspace, reactivate the membership
workspace_member = WorkspaceMember.objects.filter(
workspace=workspace_invite.workspace, member=user
).first()
if workspace_member is not None:
workspace_member.is_active = True
workspace_member.role = workspace_invite.role
workspace_member.save()
else:
# Create a new workspace member
_ = WorkspaceMember.objects.create(
workspace=workspace_invite.workspace,
member=user,
role=workspace_invite.role,
)
# Set the user last_workspace_id to the accepted workspace
user.last_workspace_id = workspace_invite.workspace.id
user.save()
@ -388,6 +406,7 @@ class JoinWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
# Workspace invitation rejected
return Response(
{"message": "Workspace Invitation was not accepted"},
status=status.HTTP_200_OK,
@ -398,37 +417,13 @@ class JoinWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
class WorkspaceInvitationsViewset(BaseViewSet):
serializer_class = WorkSpaceMemberInviteSerializer
model = WorkspaceMemberInvite
permission_classes = [
WorkSpaceAdminPermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "workspace__owner", "created_by")
)
def destroy(self, request, slug, pk):
workspace_member_invite = WorkspaceMemberInvite.objects.get(
pk=pk, workspace__slug=slug
)
# delete the user if signup is disabled
if settings.DOCKERIZED and not settings.ENABLE_SIGNUP:
user = User.objects.filter(email=workspace_member_invite.email).first()
if user is not None:
user.delete()
workspace_member_invite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def get(self, request, slug, pk):
workspace_invitation = WorkspaceMemberInvite.objects.get(workspace__slug=slug, pk=pk)
serializer = WorkSpaceMemberInviteSerializer(workspace_invitation)
return Response(serializer.data, status=status.HTTP_200_OK)
class UserWorkspaceInvitationsEndpoint(BaseViewSet):
class UserWorkspaceInvitationsViewSet(BaseViewSet):
serializer_class = WorkSpaceMemberInviteSerializer
model = WorkspaceMemberInvite
@ -442,9 +437,19 @@ class UserWorkspaceInvitationsEndpoint(BaseViewSet):
)
def create(self, request):
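# Accept the selected invitations for the requesting user and reactivate any previously deactivated memberships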
invitations = request.data.get("invitations")
workspace_invitations = WorkspaceMemberInvite.objects.filter(pk__in=invitations)
invitations = request.data.get("invitations", [])
workspace_invitations = WorkspaceMemberInvite.objects.filter(
pk__in=invitations, email=request.user.email
).order_by("-created_at")
# If the user is already a member of workspace and was deactivated then activate the user
for invitation in workspace_invitations:
# Update the WorkspaceMember for this specific invitation
WorkspaceMember.objects.filter(
workspace_id=invitation.workspace_id, member=request.user
).update(is_active=True, role=invitation.role)
# Bulk create the user for all the workspaces
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
@ -481,20 +486,24 @@ class WorkSpaceMemberViewSet(BaseViewSet):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"), member__is_bot=False)
.filter(
workspace__slug=self.kwargs.get("slug"),
member__is_bot=False,
is_active=True,
)
.select_related("workspace", "workspace__owner")
.select_related("member")
)
def list(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
member=request.user, workspace__slug=slug
member=request.user,
workspace__slug=slug,
is_active=True,
)
workspace_members = WorkspaceMember.objects.filter(
workspace__slug=slug,
member__is_bot=False,
).select_related("workspace", "member")
# Get all active workspace members
workspace_members = self.get_queryset()
if workspace_member.role > 10:
serializer = WorkspaceMemberAdminSerializer(workspace_members, many=True)
@ -506,7 +515,12 @@ class WorkSpaceMemberViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, pk):
workspace_member = WorkspaceMember.objects.get(pk=pk, workspace__slug=slug)
workspace_member = WorkspaceMember.objects.get(
pk=pk,
workspace__slug=slug,
member__is_bot=False,
is_active=True,
)
if request.user.id == workspace_member.member_id:
return Response(
{"error": "You cannot update your own role"},
@ -515,7 +529,9 @@ class WorkSpaceMemberViewSet(BaseViewSet):
# Get the requested user role
requested_workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
# Check if role is being updated
# One cannot update role higher than his own role
@ -540,68 +556,121 @@ class WorkSpaceMemberViewSet(BaseViewSet):
def destroy(self, request, slug, pk):
# Check the user role who is deleting the user
workspace_member = WorkspaceMember.objects.get(workspace__slug=slug, pk=pk)
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug,
pk=pk,
member__is_bot=False,
is_active=True,
)
# check requesting user role
requesting_workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
if str(workspace_member.id) == str(requesting_workspace_member.id):
return Response(
{
"error": "You cannot remove yourself from the workspace. Please use leave workspace"
},
status=status.HTTP_400_BAD_REQUEST,
)
if requesting_workspace_member.role < workspace_member.role:
return Response(
{"error": "You cannot remove a user having role higher than you"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check for the only member in the workspace
if (
workspace_member.role == 20
and WorkspaceMember.objects.filter(
workspace__slug=slug,
role=20,
member__is_bot=False,
).count()
== 1
Project.objects.annotate(
total_members=Count("project_projectmember"),
member_with_role=Count(
"project_projectmember",
filter=Q(
project_projectmember__member_id=request.user.id,
project_projectmember__role=20,
),
),
)
.filter(total_members=1, member_with_role=1, workspace__slug=slug)
.exists()
):
return Response(
{"error": "Cannot delete the only Admin for the workspace"},
{
"error": "User is part of some projects where they are the only admin you should leave that project first"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the user also from all the projects
ProjectMember.objects.filter(
workspace__slug=slug, member=workspace_member.member
).delete()
# Remove all favorites
ProjectFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
CycleFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
ModuleFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
PageFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
IssueViewFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
# Also remove issue from issue assigned
IssueAssignee.objects.filter(
workspace__slug=slug, assignee=workspace_member.member
).delete()
# Deactivate the user's memberships in all projects of the workspace
_ = ProjectMember.objects.filter(
workspace__slug=slug,
member_id=workspace_member.member_id,
is_active=True,
).update(is_active=False)
# Remove if module member
ModuleMember.objects.filter(
workspace__slug=slug, member=workspace_member.member
).delete()
# Delete owned Pages
Page.objects.filter(
workspace__slug=slug, owned_by=workspace_member.member
).delete()
workspace_member.is_active = False
workspace_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
workspace_member.delete()
def leave(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug,
member=request.user,
is_active=True,
)
# Check if the leaving user is the only admin of the workspace
if (
workspace_member.role == 20
and not WorkspaceMember.objects.filter(
workspace__slug=slug,
role=20,
is_active=True,
).count()
> 1
):
return Response(
{
"error": "You cannot leave the workspace as your the only admin of the workspace you will have to either delete the workspace or create an another admin"
},
status=status.HTTP_400_BAD_REQUEST,
)
if (
Project.objects.annotate(
total_members=Count("project_projectmember"),
member_with_role=Count(
"project_projectmember",
filter=Q(
project_projectmember__member_id=request.user.id,
project_projectmember__role=20,
),
),
)
.filter(total_members=1, member_with_role=1, workspace__slug=slug)
.exists()
):
return Response(
{
"error": "User is part of some projects where they are the only admin you should leave that project first"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Deactivate the user's memberships in all projects of the workspace
_ = ProjectMember.objects.filter(
workspace__slug=slug,
member_id=workspace_member.member_id,
is_active=True,
).update(is_active=False)
# Deactivate the user
workspace_member.is_active = False
workspace_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@ -629,7 +698,9 @@ class TeamMemberViewSet(BaseViewSet):
def create(self, request, slug):
members = list(
WorkspaceMember.objects.filter(
workspace__slug=slug, member__id__in=request.data.get("members", [])
workspace__slug=slug,
member__id__in=request.data.get("members", []),
is_active=True,
)
.annotate(member_str_id=Cast("member", output_field=CharField()))
.distinct()
@ -658,23 +729,6 @@ class TeamMemberViewSet(BaseViewSet):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class UserWorkspaceInvitationEndpoint(BaseViewSet):
model = WorkspaceMemberInvite
serializer_class = WorkSpaceMemberInviteSerializer
permission_classes = [
AllowAny,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(pk=self.kwargs.get("pk"))
.select_related("workspace")
)
class UserLastProjectWithWorkspaceEndpoint(BaseAPIView):
def get(self, request):
user = User.objects.get(pk=request.user.id)
@ -711,7 +765,9 @@ class UserLastProjectWithWorkspaceEndpoint(BaseAPIView):
class WorkspaceMemberUserEndpoint(BaseAPIView):
def get(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
member=request.user, workspace__slug=slug
member=request.user,
workspace__slug=slug,
is_active=True,
)
serializer = WorkspaceMemberMeSerializer(workspace_member)
return Response(serializer.data, status=status.HTTP_200_OK)
@ -720,7 +776,9 @@ class WorkspaceMemberUserEndpoint(BaseAPIView):
class WorkspaceMemberUserViewsEndpoint(BaseAPIView):
def post(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
workspace_member.view_props = request.data.get("view_props", {})
workspace_member.save()
@ -1046,7 +1104,9 @@ class WorkspaceUserProfileEndpoint(BaseAPIView):
user_data = User.objects.get(pk=user_id)
requesting_workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
projects = []
if requesting_workspace_member.role >= 10:
@ -1250,9 +1310,7 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
return Response(
issues, status=status.HTTP_200_OK
)
return Response(issues, status=status.HTTP_200_OK)
class WorkspaceLabelsEndpoint(BaseAPIView):
@ -1266,30 +1324,3 @@ class WorkspaceLabelsEndpoint(BaseAPIView):
project__project_projectmember__member=request.user,
).values("parent", "name", "color", "id", "project_id", "workspace__slug")
return Response(labels, status=status.HTTP_200_OK)
class LeaveWorkspaceEndpoint(BaseAPIView):
permission_classes = [
WorkspaceEntityPermission,
]
def delete(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
)
# Only Admin case
if (
workspace_member.role == 20
and WorkspaceMember.objects.filter(workspace__slug=slug, role=20).count()
== 1
):
return Response(
{
"error": "You cannot leave the workspace since you are the only admin of the workspace you should delete the workspace"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the member from workspace
workspace_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@ -0,0 +1,47 @@
# Django imports
from django.utils import timezone
from django.db.models import Q
# Third party imports
from rest_framework import authentication
from rest_framework.exceptions import AuthenticationFailed
# Module imports
from plane.db.models import APIToken
class APIKeyAuthentication(authentication.BaseAuthentication):
"""
Authentication with an API Key
"""
www_authenticate_realm = "api"
media_type = "application/json"
auth_header_name = "X-Api-Key"
def get_api_token(self, request):
return request.headers.get(self.auth_header_name)
def validate_api_token(self, token):
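# A token is valid only when it is active and either never expires or expires in the future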
try:
api_token = APIToken.objects.get(
Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
token=token,
is_active=True,
)
except APIToken.DoesNotExist:
raise AuthenticationFailed("Given API token is not valid")
# save api token last used
api_token.last_used = timezone.now()
api_token.save(update_fields=["last_used"])
return (api_token.user, api_token.token)
def authenticate(self, request):
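# Return None when the X-Api-Key header is missing so DRF can fall through to other authenticators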
token = self.get_api_token(request=request)
if not token:
return None
# Validate the API token
user, token = self.validate_api_token(token)
return user, token

View File

@ -0,0 +1,5 @@
from django.apps import AppConfig
class ApiConfig(AppConfig):
name = "plane.authentication"

View File

@ -73,6 +73,12 @@ def service_importer(service, importer_id):
]
)
# Check if any of the users are already member of workspace
_ = WorkspaceMember.objects.filter(
member__in=[user for user in workspace_users],
workspace_id=importer.workspace_id,
).update(is_active=True)
# Add new users to Workspace and project automatically
WorkspaceMember.objects.bulk_create(
[


@ -13,23 +13,24 @@ from plane.db.models import Project, User, ProjectMemberInvite
@shared_task
- def project_invitation(email, project_id, token, current_site):
+ def project_invitation(email, project_id, token, current_site, invitor):
try:
user = User.objects.get(email=invitor)
project = Project.objects.get(pk=project_id)
project_member_invite = ProjectMemberInvite.objects.get(
token=token, email=email
)
relativelink = f"/project-member-invitation/{project_member_invite.id}"
relativelink = f"/project-invitations/?invitation_id={project_member_invite.id}&email={email}&slug={project.workspace.slug}&project_id={str(project_id)}"
abs_url = current_site + relativelink
from_email_string = settings.EMAIL_FROM
subject = f"{project.created_by.first_name or project.created_by.email} invited you to join {project.name} on Plane"
subject = f"{user.first_name or user.display_name or user.email} invited you to join {project.name} on Plane"
context = {
"email": email,
"first_name": project.created_by.first_name,
"first_name": user.first_name,
"project_name": project.name,
"invitation_url": abs_url,
}


@ -0,0 +1,139 @@
import requests
import uuid
import hashlib
import json
# Django imports
from django.conf import settings
# Third party imports
from celery import shared_task
from sentry_sdk import capture_exception
from plane.db.models import Webhook, WebhookLog
@shared_task(
bind=True,
autoretry_for=(requests.RequestException,),
retry_backoff=600,
max_retries=5,
retry_jitter=True,
)
def webhook_task(self, webhook, slug, event, event_data, action):
try:
webhook = Webhook.objects.get(id=webhook, workspace__slug=slug)
headers = {
"Content-Type": "application/json",
"User-Agent": "Autopilot",
"X-Plane-Delivery": str(uuid.uuid4()),
"X-Plane-Event": event,
}
# Your secret key
if webhook.secret_key:
# Concatenate the data and the secret key
message = event_data + webhook.secret_key
# Create a SHA-256 hash of the message
sha256 = hashlib.sha256()
sha256.update(message.encode("utf-8"))
signature = sha256.hexdigest()
headers["X-Plane-Signature"] = signature
event_data = json.loads(event_data) if event_data is not None else None
action = {
"POST": "create",
"PATCH": "update",
"PUT": "update",
"DELETE": "delete",
}.get(action, action)
payload = {
"event": event,
"action": action,
"webhook_id": str(webhook.id),
"workspace_id": str(webhook.workspace_id),
"data": event_data,
}
# Send the webhook event
response = requests.post(
webhook.url,
headers=headers,
json=payload,
timeout=30,
)
# Log the webhook request
WebhookLog.objects.create(
workspace_id=str(webhook.workspace_id),
webhook_id=str(webhook.id),
event_type=str(event),
request_method=str(action),
request_headers=str(headers),
request_body=str(payload),
response_status=str(response.status_code),
response_headers=str(response.headers),
response_body=str(response.text),
retry_count=str(self.request.retries),
)
except requests.RequestException as e:
# Log the failed webhook request
WebhookLog.objects.create(
workspace_id=str(webhook.workspace_id),
webhook_id=str(webhook.id),
event_type=str(event),
request_method=str(action),
request_headers=str(headers),
request_body=str(payload),
response_status=500,
response_headers="",
response_body=str(e),
retry_count=str(self.request.retries),
)
# Retry logic
if self.request.retries >= self.max_retries:
Webhook.objects.filter(pk=webhook.id).update(is_active=False)
return
raise requests.RequestException()
except Exception as e:
if settings.DEBUG:
print(e)
capture_exception(e)
return
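
When a secret_key is configured, the task above signs each delivery by hashing the serialized event data concatenated with the secret and sends the hex digest in the X-Plane-Signature header. A minimal sketch of that computation (mirroring the hashing above; not an official verification recipe, since a receiver must hash exactly the same serialization of the data):

import hashlib

def plane_signature(event_data: str, secret_key: str) -> str:
    # Same scheme as webhook_task: SHA-256 over event_data + secret_key
    sha256 = hashlib.sha256()
    sha256.update((event_data + secret_key).encode("utf-8"))
    return sha256.hexdigest()

# Illustrative values only
assert plane_signature('{"id": "1"}', "s3cret") == hashlib.sha256(b'{"id": "1"}s3cret').hexdigest()
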
@shared_task()
def send_webhook(event, event_data, action, slug):
try:
webhooks = Webhook.objects.filter(workspace__slug=slug, is_active=True)
if event == "project":
webhooks = webhooks.filter(project=True)
if event == "issue":
webhooks = webhooks.filter(issue=True)
if event == "module":
webhooks = webhooks.filter(module=True)
if event == "cycle":
webhooks = webhooks.filter(cycle=True)
if event == "issue-comment":
webhooks = webhooks.filter(issue_comment=True)
for webhook in webhooks:
webhook_task.delay(webhook.id, slug, event, event_data, action)
except Exception as e:
if settings.DEBUG:
print(e)
capture_exception(e)
return
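
send_webhook fans an event out to every active webhook whose matching flag (project, issue, module, cycle, issue_comment) is enabled, queuing one webhook_task per webhook. A usage sketch (the import path and payload are assumptions for illustration only):

import json

from plane.bgtasks.webhook_task import send_webhook  # module path assumed

# Queue deliveries for all active "issue" webhooks in the workspace.
# event_data must already be a JSON string, because webhook_task hashes it as text.
send_webhook.delay(
    "issue",
    json.dumps({"id": "11111111-2222-3333-4444-555555555555", "name": "Fix login bug"}),
    "POST",          # mapped to the "create" action inside webhook_task
    "my-workspace",  # workspace slug
)
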


@ -11,25 +11,33 @@ from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError
# Module imports
- from plane.db.models import Workspace, WorkspaceMemberInvite
+ from plane.db.models import User, Workspace, WorkspaceMemberInvite
@shared_task
def workspace_invitation(email, workspace_id, token, current_site, invitor):
try:
user = User.objects.get(email=invitor)
workspace = Workspace.objects.get(pk=workspace_id)
workspace_member_invite = WorkspaceMemberInvite.objects.get(
token=token, email=email
)
- realtivelink = (
-     f"/workspace-member-invitation/?invitation_id={workspace_member_invite.id}&email={email}"
- )
+ # Relative link
+ relative_link = (
+     f"/workspace-invitations/?invitation_id={workspace_member_invite.id}&email={email}&slug={workspace.slug}"
+ )
- abs_url = current_site + realtivelink
+ # The complete url including the domain
+ abs_url = current_site + relative_link
# The email from
from_email_string = settings.EMAIL_FROM
subject = f"{invitor or email} invited you to join {workspace.name} on Plane"
# Subject of the email
subject = f"{user.first_name or user.display_name or user.email} invited you to join {workspace.name} on Plane"
context = {
"email": email,


@ -3,7 +3,7 @@
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
- import plane.db.models.api_token
+ import plane.db.models.api
import uuid
@ -40,8 +40,8 @@ class Migration(migrations.Migration):
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
- ('token', models.CharField(default=plane.db.models.api_token.generate_token, max_length=255, unique=True)),
- ('label', models.CharField(default=plane.db.models.api_token.generate_label_token, max_length=255)),
+ ('token', models.CharField(default=plane.db.models.api.generate_token, max_length=255, unique=True)),
+ ('label', models.CharField(default=plane.db.models.api.generate_label_token, max_length=255)),
('user_type', models.PositiveSmallIntegerField(choices=[(0, 'Human'), (1, 'Bot')], default=0)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='apitoken_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='apitoken_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),


@ -1,41 +0,0 @@
# Generated by Django 4.2.5 on 2023-10-18 12:04
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.issue
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0045_issueactivity_epoch_workspacemember_issue_props_and_more'),
]
operations = [
migrations.CreateModel(
name="issue_mentions",
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4,editable=False, primary_key=True, serialize=False, unique=True)),
('mention', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to=settings.AUTH_USER_MODEL)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL,related_name='issuemention_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('issue', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to='db.issue')),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_issuemention', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='issuemention_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_issuemention', to='db.workspace')),
],
options={
'verbose_name': 'IssueMention',
'verbose_name_plural': 'IssueMentions',
'db_table': 'issue_mentions',
'ordering': ('-created_at',),
},
),
migrations.AlterField(
model_name='issueproperty',
name='properties',
field=models.JSONField(default=plane.db.models.issue.get_default_properties),
),
]


@ -0,0 +1,984 @@
# Generated by Django 4.2.5 on 2023-11-15 09:47
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.issue
import uuid
import random
def random_sort_ordering(apps, schema_editor):
Label = apps.get_model("db", "Label")
bulk_labels = []
for label in Label.objects.all():
label.sort_order = random.randint(0,65535)
bulk_labels.append(label)
Label.objects.bulk_update(bulk_labels, ["sort_order"], batch_size=1000)
class Migration(migrations.Migration):
dependencies = [
('db', '0045_issueactivity_epoch_workspacemember_issue_props_and_more'),
]
operations = [
migrations.AddField(
model_name='label',
name='sort_order',
field=models.FloatField(default=65535),
),
migrations.AlterField(
model_name='analyticview',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='analyticview',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='apitoken',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='apitoken',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycle',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cycle',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cycle',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycle',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='cycleissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cycleissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cycleissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycleissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='estimate',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='estimate',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='estimate',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='estimate',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='estimatepoint',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='estimatepoint',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='estimatepoint',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='estimatepoint',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='fileasset',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='fileasset',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubissuesync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubissuesync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubissuesync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubissuesync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubrepository',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubrepository',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubrepository',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubrepository',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='importer',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='importer',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='importer',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='importer',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='inbox',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='inbox',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='inbox',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='inbox',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='inboxissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='inboxissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='inboxissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='inboxissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='integration',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='integration',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueactivity',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueactivity',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueactivity',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueactivity',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueassignee',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueassignee',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueassignee',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueassignee',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueattachment',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueattachment',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueattachment',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueattachment',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueblocker',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueblocker',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueblocker',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueblocker',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuecomment',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuecomment',
name='issue',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_comments', to='db.issue'),
),
migrations.AlterField(
model_name='issuecomment',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuecomment',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuecomment',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuelabel',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuelabel',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuelabel',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuelabel',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuelink',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuelink',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuelink',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuelink',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueproperty',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueproperty',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueproperty',
name='properties',
field=models.JSONField(default=plane.db.models.issue.get_default_properties),
),
migrations.AlterField(
model_name='issueproperty',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueproperty',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuesequence',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuesequence',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuesequence',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuesequence',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueview',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueview',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueview',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueview',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='label',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='label',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='label',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='label',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='module',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='module',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='module',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='module',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='moduleissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='moduleissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='moduleissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='moduleissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulelink',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulelink',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulelink',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulelink',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulemember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulemember',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulemember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulemember',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='page',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='page',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='page',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='page',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pageblock',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pageblock',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pageblock',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pageblock',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pagefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pagefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pagefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pagefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pagelabel',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pagelabel',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pagelabel',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pagelabel',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='project',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='project',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectfavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='projectidentifier',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectidentifier',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectmember',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectmember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmember',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='socialloginconnection',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='socialloginconnection',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='state',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='state',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='state',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='state',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='team',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='team',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='teammember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='teammember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspace',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspace',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspaceintegration',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspaceintegration',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacemember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacemember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacememberinvite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacememberinvite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacetheme',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacetheme',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.CreateModel(
name='IssueMention',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('issue', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to='db.issue')),
('mention', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to=settings.AUTH_USER_MODEL)),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace')),
],
options={
'verbose_name': 'Issue Mention',
'verbose_name_plural': 'Issue Mentions',
'db_table': 'issue_mentions',
'ordering': ('-created_at',),
'unique_together': {('issue', 'mention')},
},
),
migrations.RunPython(random_sort_ordering),
]


@ -0,0 +1,131 @@
# Generated by Django 4.2.5 on 2023-11-15 11:20
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.api
import plane.db.models.webhook
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0046_label_sort_order_alter_analyticview_created_by_and_more'),
]
operations = [
migrations.CreateModel(
name='Webhook',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('url', models.URLField(validators=[plane.db.models.webhook.validate_schema, plane.db.models.webhook.validate_domain])),
('is_active', models.BooleanField(default=True)),
('secret_key', models.CharField(default=plane.db.models.webhook.generate_token, max_length=255)),
('project', models.BooleanField(default=False)),
('issue', models.BooleanField(default=False)),
('module', models.BooleanField(default=False)),
('cycle', models.BooleanField(default=False)),
('issue_comment', models.BooleanField(default=False)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_webhooks', to='db.workspace')),
],
options={
'verbose_name': 'Webhook',
'verbose_name_plural': 'Webhooks',
'db_table': 'webhooks',
'ordering': ('-created_at',),
'unique_together': {('workspace', 'url')},
},
),
migrations.AddField(
model_name='apitoken',
name='description',
field=models.TextField(blank=True),
),
migrations.AddField(
model_name='apitoken',
name='expired_at',
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name='apitoken',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AddField(
model_name='apitoken',
name='last_used',
field=models.DateTimeField(null=True),
),
migrations.AddField(
model_name='projectmember',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AddField(
model_name='workspacemember',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AlterField(
model_name='apitoken',
name='token',
field=models.CharField(db_index=True, default=plane.db.models.api.generate_token, max_length=255, unique=True),
),
migrations.CreateModel(
name='WebhookLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('event_type', models.CharField(blank=True, max_length=255, null=True)),
('request_method', models.CharField(blank=True, max_length=10, null=True)),
('request_headers', models.TextField(blank=True, null=True)),
('request_body', models.TextField(blank=True, null=True)),
('response_status', models.TextField(blank=True, null=True)),
('response_headers', models.TextField(blank=True, null=True)),
('response_body', models.TextField(blank=True, null=True)),
('retry_count', models.PositiveSmallIntegerField(default=0)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('webhook', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='logs', to='db.webhook')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='webhook_logs', to='db.workspace')),
],
options={
'verbose_name': 'Webhook Log',
'verbose_name_plural': 'Webhook Logs',
'db_table': 'webhook_logs',
'ordering': ('-created_at',),
},
),
migrations.CreateModel(
name='APIActivityLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('token_identifier', models.CharField(max_length=255)),
('path', models.CharField(max_length=255)),
('method', models.CharField(max_length=10)),
('query_params', models.TextField(blank=True, null=True)),
('headers', models.TextField(blank=True, null=True)),
('body', models.TextField(blank=True, null=True)),
('response_code', models.PositiveIntegerField()),
('response_body', models.TextField(blank=True, null=True)),
('ip_address', models.GenericIPAddressField(blank=True, null=True)),
('user_agent', models.CharField(blank=True, max_length=512, null=True)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
],
options={
'verbose_name': 'API Activity Log',
'verbose_name_plural': 'API Activity Logs',
'db_table': 'api_activity_logs',
'ordering': ('-created_at',),
},
),
]

View File

@ -54,7 +54,7 @@ from .view import GlobalView, IssueView, IssueViewFavorite
from .module import Module, ModuleMember, ModuleIssue, ModuleLink, ModuleFavorite
from .api_token import APIToken
from .api import APIToken, APIActivityLog
from .integration import (
WorkspaceIntegration,
@ -79,3 +79,5 @@ from .analytic import AnalyticView
from .notification import Notification
from .exporter import ExporterHistory
from .webhook import Webhook, WebhookLog

View File

@ -0,0 +1,80 @@
# Python imports
from uuid import uuid4
# Django imports
from django.db import models
from django.conf import settings
from .base import BaseModel
def generate_label_token():
return uuid4().hex
def generate_token():
return "plane_api_" + uuid4().hex
class APIToken(BaseModel):
# Meta information
label = models.CharField(max_length=255, default=generate_label_token)
description = models.TextField(blank=True)
is_active = models.BooleanField(default=True)
last_used = models.DateTimeField(null=True)
# Token
token = models.CharField(
max_length=255, unique=True, default=generate_token, db_index=True
)
# User Information
user = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="bot_tokens",
)
user_type = models.PositiveSmallIntegerField(
choices=((0, "Human"), (1, "Bot")), default=0
)
workspace = models.ForeignKey(
"db.Workspace", related_name="api_tokens", on_delete=models.CASCADE, null=True
)
expired_at = models.DateTimeField(blank=True, null=True)
class Meta:
verbose_name = "API Token"
verbose_name_plural = "API Tokems"
db_table = "api_tokens"
ordering = ("-created_at",)
def __str__(self):
return str(self.user.id)
class APIActivityLog(BaseModel):
token_identifier = models.CharField(max_length=255)
# Request Info
path = models.CharField(max_length=255)
method = models.CharField(max_length=10)
query_params = models.TextField(null=True, blank=True)
headers = models.TextField(null=True, blank=True)
body = models.TextField(null=True, blank=True)
# Response info
response_code = models.PositiveIntegerField()
response_body = models.TextField(null=True, blank=True)
# Meta information
ip_address = models.GenericIPAddressField(null=True, blank=True)
user_agent = models.CharField(max_length=512, null=True, blank=True)
class Meta:
verbose_name = "API Activity Log"
verbose_name_plural = "API Activity Logs"
db_table = "api_activity_logs"
ordering = ("-created_at",)
def __str__(self):
return str(self.token_identifier)
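
Illustrative sketch (not part of the diff): one way the new `token`, `is_active` and `expired_at` fields might be consumed when resolving an incoming key. The helper name `resolve_token` is hypothetical, not code from the repository.

from django.utils import timezone

from plane.db.models import APIToken


def resolve_token(raw_token):
    # Look up an active token and treat a past expired_at as invalid.
    token = APIToken.objects.filter(token=raw_token, is_active=True).first()
    if token is None:
        return None
    if token.expired_at and token.expired_at < timezone.now():
        return None
    return token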

View File

@ -1,41 +0,0 @@
# Python imports
from uuid import uuid4
# Django imports
from django.db import models
from django.conf import settings
from .base import BaseModel
def generate_label_token():
return uuid4().hex
def generate_token():
return uuid4().hex + uuid4().hex
class APIToken(BaseModel):
token = models.CharField(max_length=255, unique=True, default=generate_token)
label = models.CharField(max_length=255, default=generate_label_token)
user = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="bot_tokens",
)
user_type = models.PositiveSmallIntegerField(
choices=((0, "Human"), (1, "Bot")), default=0
)
workspace = models.ForeignKey(
"db.Workspace", related_name="api_tokens", on_delete=models.CASCADE, null=True
)
class Meta:
verbose_name = "API Token"
verbose_name_plural = "API Tokems"
db_table = "api_tokens"
ordering = ("-created_at",)
def __str__(self):
return str(self.user.name)

View File

@ -431,6 +431,7 @@ class Label(ProjectBaseModel):
name = models.CharField(max_length=255)
description = models.TextField(blank=True)
color = models.CharField(max_length=255, blank=True)
sort_order = models.FloatField(default=65535)
class Meta:
unique_together = ["name", "project"]
@ -439,6 +440,18 @@ class Label(ProjectBaseModel):
db_table = "labels"
ordering = ("-created_at",)
def save(self, *args, **kwargs):
if self._state.adding:
# Get the maximum sequence value from the database
last_id = Label.objects.filter(project=self.project).aggregate(
largest=models.Max("sort_order")
)["largest"]
# if last_id is not None
if last_id is not None:
self.sort_order = last_id + 10000
super(Label, self).save(*args, **kwargs)
def __str__(self):
return str(self.name)
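
Worked example of the ordering logic above (illustrative, assuming a project with no existing labels):

# 1st label: no existing rows -> last_id is None -> keeps the default 65535
# 2nd label: max(sort_order) = 65535 -> saved with 75535
# 3rd label: max(sort_order) = 75535 -> saved with 85535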

View File

@ -166,6 +166,7 @@ class ProjectMember(ProjectBaseModel):
default_props = models.JSONField(default=get_default_props)
preferences = models.JSONField(default=get_default_preferences)
sort_order = models.FloatField(default=65535)
is_active = models.BooleanField(default=True)
def save(self, *args, **kwargs):
if self._state.adding:

View File

@ -0,0 +1,90 @@
# Python imports
from uuid import uuid4
from urllib.parse import urlparse
# Django imports
from django.db import models
from django.core.exceptions import ValidationError
# Module imports
from plane.db.models import BaseModel
def generate_token():
return "plane_wh_" + uuid4().hex
def validate_schema(value):
parsed_url = urlparse(value)
if parsed_url.scheme not in ["http", "https"]:
raise ValidationError("Invalid schema. Only HTTP and HTTPS are allowed.")
def validate_domain(value):
parsed_url = urlparse(value)
domain = parsed_url.netloc
if domain in ["localhost", "127.0.0.1"]:
raise ValidationError("Local URLs are not allowed.")
class Webhook(BaseModel):
workspace = models.ForeignKey(
"db.Workspace",
on_delete=models.CASCADE,
related_name="workspace_webhooks",
)
url = models.URLField(
validators=[
validate_schema,
validate_domain,
]
)
is_active = models.BooleanField(default=True)
secret_key = models.CharField(max_length=255, default=generate_token)
project = models.BooleanField(default=False)
issue = models.BooleanField(default=False)
module = models.BooleanField(default=False)
cycle = models.BooleanField(default=False)
issue_comment = models.BooleanField(default=False)
def __str__(self):
return f"{self.workspace.slug} {self.url}"
class Meta:
unique_together = ["workspace", "url"]
verbose_name = "Webhook"
verbose_name_plural = "Webhooks"
db_table = "webhooks"
ordering = ("-created_at",)
class WebhookLog(BaseModel):
workspace = models.ForeignKey(
"db.Workspace", on_delete=models.CASCADE, related_name="webhook_logs"
)
# Associated webhook
webhook = models.ForeignKey(Webhook, on_delete=models.CASCADE, related_name="logs")
# Basic request details
event_type = models.CharField(max_length=255, blank=True, null=True)
request_method = models.CharField(max_length=10, blank=True, null=True)
request_headers = models.TextField(blank=True, null=True)
request_body = models.TextField(blank=True, null=True)
# Response details
response_status = models.TextField(blank=True, null=True)
response_headers = models.TextField(blank=True, null=True)
response_body = models.TextField(blank=True, null=True)
# Retry Count
retry_count = models.PositiveSmallIntegerField(default=0)
class Meta:
verbose_name = "Webhook Log"
verbose_name_plural = "Webhook Logs"
db_table = "webhook_logs"
ordering = ("-created_at",)
def __str__(self):
return f"{self.event_type} {str(self.webhook.url)}"

View File

@ -99,6 +99,7 @@ class WorkspaceMember(BaseModel):
view_props = models.JSONField(default=get_default_props)
default_props = models.JSONField(default=get_default_props)
issue_props = models.JSONField(default=get_issue_props)
is_active = models.BooleanField(default=True)
class Meta:
unique_together = ["workspace", "member"]

View File

@ -0,0 +1,40 @@
from plane.db.models import APIToken, APIActivityLog
class APITokenLogMiddleware:
def __init__(self, get_response):
self.get_response = get_response
def __call__(self, request):
request_body = request.body
response = self.get_response(request)
self.process_request(request, response, request_body)
return response
def process_request(self, request, response, request_body):
api_key_header = "X-Api-Key"
api_key = request.headers.get(api_key_header)
# If the API key is present, log the request
if api_key:
try:
APIActivityLog.objects.create(
token_identifier=api_key,
path=request.path,
method=request.method,
query_params=request.META.get("QUERY_STRING", ""),
headers=str(request.headers),
body=(request_body.decode('utf-8') if request_body else None),
response_body=(
response.content.decode("utf-8") if response.content else None
),
response_code=response.status_code,
ip_address=request.META.get("REMOTE_ADDR", None),
user_agent=request.META.get("HTTP_USER_AGENT", None),
)
except Exception as e:
print(e)
# If logging fails (for example the key is invalid), swallow the error so the API response is not affected
pass
return None
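
Illustrative usage (assumes the middleware is registered in MIDDLEWARE as in the settings change further down; the workspace slug and token value are placeholders):

from django.test import Client

client = Client()
# Only requests that carry an X-Api-Key header are written to APIActivityLog.
client.get(
    "/api/v1/workspaces/acme/projects/",
    HTTP_X_API_KEY="plane_api_<token>",
)
# -> one APIActivityLog row recording the path, method, headers and response code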

View File

View File

@ -0,0 +1,5 @@
from django.apps import AppConfig
class ProxyConfig(AppConfig):
name = "plane.proxy"

View File

@ -0,0 +1,45 @@
from django.utils import timezone
from rest_framework.throttling import SimpleRateThrottle
class ApiKeyRateThrottle(SimpleRateThrottle):
scope = 'api_key'
def get_cache_key(self, request, view):
# Retrieve the API key from the request header
api_key = request.headers.get('X-Api-Key')
if not api_key:
return None # Allow the request if there's no API key
# Use the API key as part of the cache key
return f'{self.scope}:{api_key}'
def allow_request(self, request, view):
# Calculate the current time as a Unix timestamp
now = timezone.now().timestamp()
# Use the parent class's method to check if the request is allowed
allowed = super().allow_request(request, view)
if allowed:
# Calculate the remaining limit and reset time
history = self.cache.get(self.key, [])
# Remove old histories
while history and history[-1] <= now - self.duration:
history.pop()
# Calculate the requests
num_requests = len(history)
# Check available requests
available = self.num_requests - num_requests
# Unix timestamp for when the rate limit will reset
reset_time = int(now + self.duration)
# Add headers
request.META['X-RateLimit-Remaining'] = max(0, available)
request.META['X-RateLimit-Reset'] = reset_time
return allowed
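
The `api_key` scope above is resolved against DRF's throttle rates; the corresponding configuration added to common.py later in this diff is:

REST_FRAMEWORK = {
    # ...
    "DEFAULT_THROTTLE_CLASSES": ("plane.proxy.rate_limit.ApiKeyRateThrottle",),
    "DEFAULT_THROTTLE_RATES": {
        "api_key": "60/minute",  # 60 requests per API key per minute
    },
}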

View File

@ -0,0 +1,13 @@
from .cycle import urlpatterns as cycle_patterns
from .inbox import urlpatterns as inbox_patterns
from .issue import urlpatterns as issue_patterns
from .module import urlpatterns as module_patterns
from .project import urlpatterns as project_patterns
urlpatterns = [
*cycle_patterns,
*inbox_patterns,
*issue_patterns,
*module_patterns,
*project_patterns,
]

View File

@ -0,0 +1,35 @@
from django.urls import path
from plane.proxy.views.cycle import (
CycleAPIEndpoint,
CycleIssueAPIEndpoint,
TransferCycleIssueAPIEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/",
CycleAPIEndpoint.as_view(),
name="cycles",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/",
CycleAPIEndpoint.as_view(),
name="cycles",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/",
CycleIssueAPIEndpoint.as_view(),
name="cycle-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:pk>/",
CycleIssueAPIEndpoint.as_view(),
name="cycle-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/transfer-issues/",
TransferCycleIssueAPIEndpoint.as_view(),
name="transfer-issues",
),
]

View File

@ -0,0 +1,17 @@
from django.urls import path
from plane.proxy.views import InboxIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/<uuid:pk>/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
]

View File

@ -0,0 +1,51 @@
from django.urls import path
from plane.proxy.views import (
IssueAPIEndpoint,
LabelAPIEndpoint,
IssueLinkAPIEndpoint,
IssueCommentAPIEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueAPIEndpoint.as_view(),
name="issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/",
IssueAPIEndpoint.as_view(),
name="issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/",
LabelAPIEndpoint.as_view(),
name="labels",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/<uuid:pk>/",
LabelAPIEndpoint.as_view(),
name="labels",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/",
IssueLinkAPIEndpoint.as_view(),
name="issue-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/<uuid:pk>/",
IssueLinkAPIEndpoint.as_view(),
name="issue-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/",
IssueCommentAPIEndpoint.as_view(),
name="project-issue-comment",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/<uuid:pk>/",
IssueCommentAPIEndpoint.as_view(),
name="project-issue-comment",
),
]

View File

@ -0,0 +1,26 @@
from django.urls import path
from plane.proxy.views import ModuleAPIEndpoint, ModuleIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/",
ModuleAPIEndpoint.as_view(),
name="modules",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:pk>/",
ModuleAPIEndpoint.as_view(),
name="modules",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/",
ModuleIssueAPIEndpoint.as_view(),
name="module-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:pk>/",
ModuleIssueAPIEndpoint.as_view(),
name="module-issues",
),
]

View File

@ -0,0 +1,16 @@
from django.urls import path
from plane.proxy.views import ProjectAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/",
ProjectAPIEndpoint.as_view(),
name="project",
),
path(
"workspaces/<str:slug>/projects/<uuid:pk>/",
ProjectAPIEndpoint.as_view(),
name="project",
),
]

View File

@ -0,0 +1,18 @@
from .project import ProjectAPIEndpoint
from .issue import (
IssueAPIEndpoint,
LabelAPIEndpoint,
IssueLinkAPIEndpoint,
IssueCommentAPIEndpoint,
)
from .cycle import (
CycleAPIEndpoint,
CycleIssueAPIEndpoint,
TransferCycleIssueAPIEndpoint,
)
from .module import ModuleAPIEndpoint, ModuleIssueAPIEndpoint
from .inbox import InboxIssueAPIEndpoint

View File

@ -0,0 +1,101 @@
# Python imports
import re
import json
import requests
# Django imports
from django.conf import settings
# Third party imports
from rest_framework.views import APIView
from rest_framework.response import Response
from rest_framework.permissions import IsAuthenticated
from rest_framework_simplejwt.tokens import RefreshToken
# Module imports
from plane.authentication.api_authentication import APIKeyAuthentication
from plane.proxy.rate_limit import ApiKeyRateThrottle
class BaseAPIView(APIView):
authentication_classes = [
APIKeyAuthentication,
]
permission_classes = [
IsAuthenticated,
]
throttle_classes = [
ApiKeyRateThrottle,
]
def _get_jwt_token(self, request):
refresh = RefreshToken.for_user(request.user)
return str(refresh.access_token)
def _get_url_path(self, request):
match = re.search(r"/v1/(.*)", request.path)
return match.group(1) if match else ""
def _get_headers(self, request):
return {
"Authorization": f"Bearer {self._get_jwt_token(request=request)}",
"Content-Type": request.headers.get("Content-Type", "application/json"),
}
def _get_url(self, request):
path = self._get_url_path(request=request)
url = request.build_absolute_uri("/api/" + path)
return url
def _get_query_params(self, request):
query_params = request.GET
return query_params
def _get_payload(self, request):
content_type = request.headers.get("Content-Type", "application/json")
if content_type.startswith("multipart/form-data"):
files_dict = {k: v[0] for k, v in request.FILES.lists()}
return (None, files_dict)
else:
return (json.dumps(request.data), None)
def _make_request(self, request, method="GET"):
data_payload, files_payload = self._get_payload(request=request)
response = requests.request(
method=method,
url=self._get_url(request=request),
headers=self._get_headers(request=request),
params=self._get_query_params(request=request),
data=data_payload,
files=files_payload,
)
return response.json(), response.status_code
def finalize_response(self, request, response, *args, **kwargs):
# Call super to get the default response
response = super().finalize_response(request, response, *args, **kwargs)
# Add custom headers if they exist in the request META
ratelimit_remaining = request.META.get('X-RateLimit-Remaining')
if ratelimit_remaining is not None:
response['X-RateLimit-Remaining'] = ratelimit_remaining
ratelimit_reset = request.META.get('X-RateLimit-Reset')
if ratelimit_reset is not None:
response['X-RateLimit-Reset'] = ratelimit_reset
return response
def get(self, request, *args, **kwargs):
response, status_code = self._make_request(request=request, method="GET")
return Response(response, status=status_code)
def post(self, request, *args, **kwargs):
response, status_code = self._make_request(request=request, method="POST")
return Response(response, status=status_code)
def partial_update(self, request, *args, **kwargs):
response, status_code = self._make_request(request=request, method="PATCH")
return Response(response, status=status_code)
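
Illustrative client-side call (hostname, workspace slug and token are placeholders): a request to the /api/v1/ proxy authenticates with X-Api-Key, is re-issued internally against /api/, and the rate-limit headers set in finalize_response come back on the response.

import requests

resp = requests.get(
    "https://plane.example.com/api/v1/workspaces/acme/projects/",
    headers={"X-Api-Key": "plane_api_<token>"},
)
print(resp.status_code, resp.headers.get("X-RateLimit-Remaining"))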

View File

@ -0,0 +1,30 @@
from .base import BaseAPIView
class CycleAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to cycles.
"""
pass
class CycleIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to cycle issues.
"""
pass
class TransferCycleIssueAPIEndpoint(BaseAPIView):
"""
This viewset provides a `create` action for transferring issues into a particular cycle.
"""
pass

View File

@ -0,0 +1,10 @@
from .base import BaseAPIView
class InboxIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to inbox issues.
"""
pass

View File

@ -0,0 +1,37 @@
from .base import BaseAPIView
class IssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to issues.
"""
pass
class LabelAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to the labels.
"""
pass
class IssueLinkAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to the links of the particular issue.
"""
pass
class IssueCommentAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to comments of the particular issue.
"""
pass

View File

@ -0,0 +1,20 @@
from .base import BaseAPIView
class ModuleAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to modules.
"""
pass
class ModuleIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to module issues.
"""
pass

View File

@ -0,0 +1,5 @@
from .base import BaseAPIView
class ProjectAPIEndpoint(BaseAPIView):
pass

View File

@ -1,22 +1,36 @@
"""Global Settings"""
# Python imports
import os
import datetime
import ssl
import certifi
from datetime import timedelta
from urllib.parse import urlparse
# Django imports
from django.core.management.utils import get_random_secret_key
# Third party imports
import dj_database_url
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.redis import RedisIntegration
from sentry_sdk.integrations.celery import CeleryIntegration
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Secret Key
SECRET_KEY = os.environ.get("SECRET_KEY", get_random_secret_key())
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
DEBUG = False
ALLOWED_HOSTS = []
# Allowed Hosts
ALLOWED_HOSTS = ["*"]
# Redirect if / is not present
APPEND_SLASH = True
# Application definition
INSTALLED_APPS = [
"django.contrib.auth",
"django.contrib.contenttypes",
@ -29,6 +43,7 @@ INSTALLED_APPS = [
"plane.utils",
"plane.web",
"plane.middleware",
"plane.proxy",
# Third-party things
"rest_framework",
"rest_framework.authtoken",
@ -36,12 +51,13 @@ INSTALLED_APPS = [
"corsheaders",
"taggit",
"django_celery_beat",
"storages",
]
# Middlewares
MIDDLEWARE = [
"corsheaders.middleware.CorsMiddleware",
"django.middleware.security.SecurityMiddleware",
# "whitenoise.middleware.WhiteNoiseMiddleware",
"django.contrib.sessions.middleware.SessionMiddleware",
"django.middleware.common.CommonMiddleware",
"django.middleware.csrf.CsrfViewMiddleware",
@ -49,8 +65,10 @@ MIDDLEWARE = [
"django.middleware.clickjacking.XFrameOptionsMiddleware",
"crum.CurrentRequestUserMiddleware",
"django.middleware.gzip.GZipMiddleware",
]
"plane.middleware.api_log_middleware.APITokenLogMiddleware",
]
# Rest Framework settings
REST_FRAMEWORK = {
"DEFAULT_AUTHENTICATION_CLASSES": (
"rest_framework_simplejwt.authentication.JWTAuthentication",
@ -58,15 +76,19 @@ REST_FRAMEWORK = {
"DEFAULT_PERMISSION_CLASSES": ("rest_framework.permissions.IsAuthenticated",),
"DEFAULT_RENDERER_CLASSES": ("rest_framework.renderers.JSONRenderer",),
"DEFAULT_FILTER_BACKENDS": ("django_filters.rest_framework.DjangoFilterBackend",),
"DEFAULT_THROTTLE_CLASSES": ("plane.proxy.rate_limit.ApiKeyRateThrottle",),
"DEFAULT_THROTTLE_RATES": {
"api_key": "60/minute",
},
}
AUTHENTICATION_BACKENDS = (
"django.contrib.auth.backends.ModelBackend", # default
# "guardian.backends.ObjectPermissionBackend",
)
# Django Auth Backend
AUTHENTICATION_BACKENDS = ("django.contrib.auth.backends.ModelBackend",) # default
# Root Urls
ROOT_URLCONF = "plane.urls"
# Templates
TEMPLATES = [
{
"BACKEND": "django.template.backends.django.DjangoTemplates",
@ -85,52 +107,74 @@ TEMPLATES = [
},
]
# Cookie Settings
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
JWT_AUTH = {
"JWT_ENCODE_HANDLER": "rest_framework_jwt.utils.jwt_encode_handler",
"JWT_DECODE_HANDLER": "rest_framework_jwt.utils.jwt_decode_handler",
"JWT_PAYLOAD_HANDLER": "rest_framework_jwt.utils.jwt_payload_handler",
"JWT_PAYLOAD_GET_USER_ID_HANDLER": "rest_framework_jwt.utils.jwt_get_user_id_from_payload_handler",
"JWT_RESPONSE_PAYLOAD_HANDLER": "rest_framework_jwt.utils.jwt_response_payload_handler",
"JWT_SECRET_KEY": SECRET_KEY,
"JWT_GET_USER_SECRET_KEY": None,
"JWT_PUBLIC_KEY": None,
"JWT_PRIVATE_KEY": None,
"JWT_ALGORITHM": "HS256",
"JWT_VERIFY": True,
"JWT_VERIFY_EXPIRATION": True,
"JWT_LEEWAY": 0,
"JWT_EXPIRATION_DELTA": datetime.timedelta(seconds=604800),
"JWT_AUDIENCE": None,
"JWT_ISSUER": None,
"JWT_ALLOW_REFRESH": False,
"JWT_REFRESH_EXPIRATION_DELTA": datetime.timedelta(days=7),
"JWT_AUTH_HEADER_PREFIX": "JWT",
"JWT_AUTH_COOKIE": None,
}
# CORS Settings
CORS_ALLOW_CREDENTIALS = True
cors_origins_raw = os.environ.get("CORS_ALLOWED_ORIGINS", "")
# filter out empty strings
cors_allowed_origins = [origin.strip() for origin in cors_origins_raw.split(",") if origin.strip()]
if cors_allowed_origins:
CORS_ALLOWED_ORIGINS = cors_allowed_origins
else:
CORS_ALLOW_ALL_ORIGINS = True
# Application Settings
WSGI_APPLICATION = "plane.wsgi.application"
ASGI_APPLICATION = "plane.asgi.application"
# Django Sites
SITE_ID = 1
# User Model
AUTH_USER_MODEL = "db.User"
# Database
DATABASES = {
"default": {
"ENGINE": "django.db.backends.sqlite3",
"NAME": os.path.join(BASE_DIR, "db.sqlite3"),
if bool(os.environ.get("DATABASE_URL")):
# Parse database configuration from $DATABASE_URL
DATABASES = {
"default": dj_database_url.config(),
}
else:
DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql",
"NAME": os.environ.get("POSTGRES_DB"),
"USER": os.environ.get("POSTGRES_USER"),
"PASSWORD": os.environ.get("POSTGRES_PASSWORD"),
"HOST": os.environ.get("POSTGRES_HOST"),
}
}
}
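
Illustrative DATABASE_URL value (hypothetical credentials) and how dj-database-url splits it:

import dj_database_url

# postgres://USER:PASSWORD@HOST:PORT/NAME
print(dj_database_url.parse("postgres://plane:plane@plane-db:5432/plane"))
# -> {'ENGINE': 'django.db.backends.postgresql', 'NAME': 'plane', 'USER': 'plane', ...}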
# Redis Config
REDIS_URL = os.environ.get("REDIS_URL")
REDIS_SSL = "rediss" in REDIS_URL
# Password validation
if REDIS_SSL:
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
"CONNECTION_POOL_KWARGS": {"ssl_cert_reqs": False},
},
}
}
else:
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
},
}
}
# Password validations
AUTH_PASSWORD_VALIDATORS = [
{
"NAME": "django.contrib.auth.password_validation.UserAttributeSimilarityValidator",
@ -147,7 +191,6 @@ AUTH_PASSWORD_VALIDATORS = [
]
# Static files (CSS, JavaScript, Images)
STATIC_URL = "/static/"
STATIC_ROOT = os.path.join(BASE_DIR, "static-assets", "collected-static")
STATICFILES_DIRS = (os.path.join(BASE_DIR, "static"),)
@ -156,21 +199,19 @@ STATICFILES_DIRS = (os.path.join(BASE_DIR, "static"),)
MEDIA_ROOT = "mediafiles"
MEDIA_URL = "/media/"
# Internationalization
LANGUAGE_CODE = "en-us"
TIME_ZONE = "UTC"
USE_I18N = True
USE_L10N = True
# Timezones
USE_TZ = True
TIME_ZONE = "UTC"
# Default Auto Field
DEFAULT_AUTO_FIELD = "django.db.models.BigAutoField"
# Email settings
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
# Host for sending e-mail.
EMAIL_HOST = os.environ.get("EMAIL_HOST")
@ -183,7 +224,31 @@ EMAIL_USE_TLS = os.environ.get("EMAIL_USE_TLS", "1") == "1"
EMAIL_USE_SSL = os.environ.get("EMAIL_USE_SSL", "0") == "1"
EMAIL_FROM = os.environ.get("EMAIL_FROM", "Team Plane <team@mailer.plane.so>")
# Storage Settings
STORAGES = {
"staticfiles": {
"BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
},
}
STORAGES["default"] = {
"BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
}
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "access-key")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "secret-key")
AWS_STORAGE_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME", "uploads")
AWS_DEFAULT_ACL = "public-read"
AWS_QUERYSTRING_AUTH = False
AWS_S3_FILE_OVERWRITE = False
AWS_S3_ENDPOINT_URL = os.environ.get("AWS_S3_ENDPOINT_URL", None) or os.environ.get(
"MINIO_ENDPOINT_URL", None
)
if AWS_S3_ENDPOINT_URL:
parsed_url = urlparse(os.environ.get("WEB_URL", "http://localhost"))
AWS_S3_CUSTOM_DOMAIN = f"{parsed_url.netloc}/{AWS_STORAGE_BUCKET_NAME}"
AWS_S3_URL_PROTOCOL = f"{parsed_url.scheme}:"
# JWT Auth Configuration
SIMPLE_JWT = {
"ACCESS_TOKEN_LIFETIME": timedelta(minutes=10080),
"REFRESH_TOKEN_LIFETIME": timedelta(days=43200),
@ -211,7 +276,59 @@ SIMPLE_JWT = {
"SLIDING_TOKEN_REFRESH_LIFETIME": timedelta(days=1),
}
# Celery Configuration
CELERY_TIMEZONE = TIME_ZONE
CELERY_TASK_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_IMPORTS = ("plane.bgtasks.issue_automation_task","plane.bgtasks.exporter_expired_task")
CELERY_TASK_SERIALIZER = "json"
CELERY_ACCEPT_CONTENT = ["application/json"]
if REDIS_SSL:
redis_url = os.environ.get("REDIS_URL")
broker_url = (
f"{redis_url}?ssl_cert_reqs={ssl.CERT_NONE.name}&ssl_ca_certs={certifi.where()}"
)
CELERY_BROKER_URL = broker_url
CELERY_RESULT_BACKEND = broker_url
else:
CELERY_BROKER_URL = REDIS_URL
CELERY_RESULT_BACKEND = REDIS_URL
CELERY_IMPORTS = (
"plane.bgtasks.issue_automation_task",
"plane.bgtasks.exporter_expired_task",
)
# Sentry Settings
# Enable Sentry Settings
if bool(os.environ.get("SENTRY_DSN", False)):
sentry_sdk.init(
dsn=os.environ.get("SENTRY_DSN", ""),
integrations=[
DjangoIntegration(),
RedisIntegration(),
CeleryIntegration(monitor_beat_tasks=True),
],
traces_sample_rate=1,
send_default_pii=True,
environment=os.environ.get("ENVIRONMENT", "development"),
profiles_sample_rate=1.0,
)
# Application Envs
PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False) # For External
SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Unsplash Access key
UNSPLASH_ACCESS_KEY = os.environ.get("UNSPLASH_ACCESS_KEY")
# Github Access Token
GITHUB_ACCESS_TOKEN = os.environ.get("GITHUB_ACCESS_TOKEN", False)
# Analytics
ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)
# Use Minio settings
USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1

View File

@ -1,123 +1,31 @@
"""Development settings and globals."""
from __future__ import absolute_import
import dj_database_url
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.redis import RedisIntegration
"""Development settings"""
from .common import * # noqa
DEBUG = int(os.environ.get("DEBUG", 1)) == 1
DEBUG = True
ALLOWED_HOSTS = [
"*",
]
# Debug Toolbar settings
INSTALLED_APPS += ("debug_toolbar",)
MIDDLEWARE += ("debug_toolbar.middleware.DebugToolbarMiddleware",)
DEBUG_TOOLBAR_PATCH_SETTINGS = False
# Only show emails in the console; don't send them over SMTP
EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"
DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql",
"NAME": os.environ.get("PGUSER", "plane"),
"USER": "",
"PASSWORD": "",
"HOST": os.environ.get("PGHOST", "localhost"),
}
}
DOCKERIZED = int(os.environ.get("DOCKERIZED", 0)) == 1
USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
if DOCKERIZED:
DATABASES["default"] = dj_database_url.config()
CACHES = {
"default": {
"BACKEND": "django.core.cache.backends.locmem.LocMemCache",
}
}
INSTALLED_APPS += ("debug_toolbar",)
MIDDLEWARE += ("debug_toolbar.middleware.DebugToolbarMiddleware",)
DEBUG_TOOLBAR_PATCH_SETTINGS = False
INTERNAL_IPS = ("127.0.0.1",)
CORS_ORIGIN_ALLOW_ALL = True
if os.environ.get("SENTRY_DSN", False):
sentry_sdk.init(
dsn=os.environ.get("SENTRY_DSN"),
integrations=[DjangoIntegration(), RedisIntegration()],
# If you wish to associate users to errors (assuming you are using
# django.contrib.auth) you may enable sending PII data.
send_default_pii=True,
environment="local",
traces_sample_rate=0.7,
profiles_sample_rate=1.0,
)
else:
LOGGING = {
"version": 1,
"disable_existing_loggers": False,
"handlers": {
"console": {
"class": "logging.StreamHandler",
},
},
"root": {
"handlers": ["console"],
"level": "DEBUG",
},
"loggers": {
"*": {
"handlers": ["console"],
"level": "DEBUG",
"propagate": True,
},
},
}
REDIS_HOST = "localhost"
REDIS_PORT = 6379
REDIS_URL = os.environ.get("REDIS_URL")
MEDIA_URL = "/uploads/"
MEDIA_ROOT = os.path.join(BASE_DIR, "uploads")
if DOCKERIZED:
REDIS_URL = os.environ.get("REDIS_URL")
WEB_URL = os.environ.get("WEB_URL", "http://localhost:3000")
PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False)
ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)
OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")
SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)
LOGGER_BASE_URL = os.environ.get("LOGGER_BASE_URL", False)
CELERY_RESULT_BACKEND = os.environ.get("REDIS_URL")
CELERY_BROKER_URL = os.environ.get("REDIS_URL")
GITHUB_ACCESS_TOKEN = os.environ.get("GITHUB_ACCESS_TOKEN", False)
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Unsplash Access key
UNSPLASH_ACCESS_KEY = os.environ.get("UNSPLASH_ACCESS_KEY")
CORS_ALLOWED_ORIGINS = [
"http://localhost:3000",
"http://127.0.0.1:3000",
"http://localhost:4000",
"http://127.0.0.1:4000",
]

View File

@ -1,282 +1,18 @@
"""Production settings and globals."""
import ssl
import certifi
import dj_database_url
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.redis import RedisIntegration
from urllib.parse import urlparse
"""Production settings"""
from .common import * # noqa
# Database
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = int(os.environ.get("DEBUG", 0)) == 1
if bool(os.environ.get("DATABASE_URL")):
# Parse database configuration from $DATABASE_URL
DATABASES["default"] = dj_database_url.config()
else:
DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql",
"NAME": os.environ.get("POSTGRES_DB"),
"USER": os.environ.get("POSTGRES_USER"),
"PASSWORD": os.environ.get("POSTGRES_PASSWORD"),
"HOST": os.environ.get("POSTGRES_HOST"),
}
}
SITE_ID = 1
# Set the variable true if running in docker environment
DOCKERIZED = int(os.environ.get("DOCKERIZED", 0)) == 1
USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
# Enable Connection Pooling (if desired)
# DATABASES['default']['ENGINE'] = 'django_postgrespool'
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# TODO: Make it FALSE and LIST DOMAINS IN FULL PROD.
CORS_ALLOW_ALL_ORIGINS = True
CORS_ALLOW_METHODS = [
"DELETE",
"GET",
"OPTIONS",
"PATCH",
"POST",
"PUT",
]
CORS_ALLOW_HEADERS = [
"accept",
"accept-encoding",
"authorization",
"content-type",
"dnt",
"origin",
"user-agent",
"x-csrftoken",
"x-requested-with",
]
CORS_ALLOW_CREDENTIALS = True
INSTALLED_APPS += ("scout_apm.django",)
STORAGES = {
"staticfiles": {
"BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
},
}
if bool(os.environ.get("SENTRY_DSN", False)):
sentry_sdk.init(
dsn=os.environ.get("SENTRY_DSN", ""),
integrations=[DjangoIntegration(), RedisIntegration()],
# If you wish to associate users to errors (assuming you are using
# django.contrib.auth) you may enable sending PII data.
traces_sample_rate=1,
send_default_pii=True,
environment="production",
profiles_sample_rate=1.0,
)
if DOCKERIZED and USE_MINIO:
INSTALLED_APPS += ("storages",)
STORAGES["default"] = {"BACKEND": "storages.backends.s3boto3.S3Boto3Storage"}
# The AWS access key to use.
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "access-key")
# The AWS secret access key to use.
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "secret-key")
# The name of the bucket to store files in.
AWS_STORAGE_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME", "uploads")
# The full URL to the S3 endpoint. Leave blank to use the default region URL.
AWS_S3_ENDPOINT_URL = os.environ.get(
"AWS_S3_ENDPOINT_URL", "http://plane-minio:9000"
)
# Default permissions
AWS_DEFAULT_ACL = "public-read"
AWS_QUERYSTRING_AUTH = False
AWS_S3_FILE_OVERWRITE = False
# Custom Domain settings
parsed_url = urlparse(os.environ.get("WEB_URL", "http://localhost"))
AWS_S3_CUSTOM_DOMAIN = f"{parsed_url.netloc}/{AWS_STORAGE_BUCKET_NAME}"
AWS_S3_URL_PROTOCOL = f"{parsed_url.scheme}:"
else:
# The AWS region to connect to.
AWS_REGION = os.environ.get("AWS_REGION", "")
# The AWS access key to use.
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "")
# The AWS secret access key to use.
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "")
# The optional AWS session token to use.
# AWS_SESSION_TOKEN = ""
# The name of the bucket to store files in.
AWS_S3_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME")
# How to construct S3 URLs ("auto", "path", "virtual").
AWS_S3_ADDRESSING_STYLE = "auto"
# The full URL to the S3 endpoint. Leave blank to use the default region URL.
AWS_S3_ENDPOINT_URL = os.environ.get("AWS_S3_ENDPOINT_URL", "")
# A prefix to be applied to every stored file. This will be joined to every filename using the "/" separator.
AWS_S3_KEY_PREFIX = ""
# Whether to enable authentication for stored files. If True, then generated URLs will include an authentication
# token valid for `AWS_S3_MAX_AGE_SECONDS`. If False, then generated URLs will not include an authentication token,
# and their permissions will be set to "public-read".
AWS_S3_BUCKET_AUTH = False
# How long generated URLs are valid for. This affects the expiry of authentication tokens if `AWS_S3_BUCKET_AUTH`
# is True. It also affects the "Cache-Control" header of the files.
# Important: Changing this setting will not affect existing files.
AWS_S3_MAX_AGE_SECONDS = 60 * 60 # 1 hours.
# A URL prefix to be used for generated URLs. This is useful if your bucket is served through a CDN. This setting
# cannot be used with `AWS_S3_BUCKET_AUTH`.
AWS_S3_PUBLIC_URL = ""
# If True, then files will be stored with reduced redundancy. Check the S3 documentation and make sure you
# understand the consequences before enabling.
# Important: Changing this setting will not affect existing files.
AWS_S3_REDUCED_REDUNDANCY = False
# The Content-Disposition header used when the file is downloaded. This can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_CONTENT_DISPOSITION = ""
# The Content-Language header used when the file is downloaded. This can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_CONTENT_LANGUAGE = ""
# A mapping of custom metadata for each file. Each value can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_METADATA = {}
# If True, then files will be stored using AES256 server-side encryption.
# If this is a string value (e.g., "aws:kms"), that encryption type will be used.
# Otherwise, server-side encryption is not be enabled.
# Important: Changing this setting will not affect existing files.
AWS_S3_ENCRYPT_KEY = False
# The AWS S3 KMS encryption key ID (the `SSEKMSKeyId` parameter) is set from this string if present.
# This is only relevant if AWS S3 KMS server-side encryption is enabled (above).
# AWS_S3_KMS_ENCRYPTION_KEY_ID = ""
# If True, then text files will be stored using gzip content encoding. Files will only be gzipped if their
# compressed size is smaller than their uncompressed size.
# Important: Changing this setting will not affect existing files.
AWS_S3_GZIP = True
# The signature version to use for S3 requests.
AWS_S3_SIGNATURE_VERSION = None
# If True, then files with the same name will overwrite each other. By default it's set to False to have
# extra characters appended.
AWS_S3_FILE_OVERWRITE = False
STORAGES["default"] = {
"BACKEND": "django_s3_storage.storage.S3Storage",
}
# AWS Settings End
# Enable Connection Pooling (if desired)
# DATABASES['default']['ENGINE'] = 'django_postgrespool'
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# Allow all host headers
ALLOWED_HOSTS = [
"*",
]
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
REDIS_URL = os.environ.get("REDIS_URL")
if DOCKERIZED:
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
},
}
}
else:
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
"CONNECTION_POOL_KWARGS": {"ssl_cert_reqs": False},
},
}
}
WEB_URL = os.environ.get("WEB_URL", "https://app.plane.so")
PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False)
ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)
OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")
SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)
LOGGER_BASE_URL = os.environ.get("LOGGER_BASE_URL", False)
redis_url = os.environ.get("REDIS_URL")
broker_url = (
f"{redis_url}?ssl_cert_reqs={ssl.CERT_NONE.name}&ssl_ca_certs={certifi.where()}"
)
if DOCKERIZED:
CELERY_BROKER_URL = REDIS_URL
CELERY_RESULT_BACKEND = REDIS_URL
else:
CELERY_BROKER_URL = broker_url
CELERY_RESULT_BACKEND = broker_url
GITHUB_ACCESS_TOKEN = os.environ.get("GITHUB_ACCESS_TOKEN", False)
# Enable or Disable signups
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Scout Settings
SCOUT_MONITOR = os.environ.get("SCOUT_MONITOR", False)
SCOUT_KEY = os.environ.get("SCOUT_KEY", "")
SCOUT_NAME = "Plane"
# Unsplash Access key
UNSPLASH_ACCESS_KEY = os.environ.get("UNSPLASH_ACCESS_KEY")

View File

@ -6,13 +6,7 @@ from urllib.parse import urlparse
def redis_instance():
# connect to redis
if (
settings.DOCKERIZED
or os.environ.get("DJANGO_SETTINGS_MODULE", "plane.settings.production")
== "plane.settings.local"
):
ri = redis.Redis.from_url(settings.REDIS_URL, db=0)
else:
if settings.REDIS_SSL:
url = urlparse(settings.REDIS_URL)
ri = redis.Redis(
host=url.hostname,
@ -21,5 +15,7 @@ def redis_instance():
ssl=True,
ssl_cert_reqs=None,
)
else:
ri = redis.Redis.from_url(settings.REDIS_URL, db=0)
return ri

View File

@ -1,129 +0,0 @@
"""Self hosted settings and globals."""
from urllib.parse import urlparse
import dj_database_url
from urllib.parse import urlparse
from .common import * # noqa
# Database
DEBUG = int(os.environ.get("DEBUG", 0)) == 1
# Docker configurations
DOCKERIZED = 1
USE_MINIO = 1
DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql",
"NAME": "plane",
"USER": os.environ.get("PGUSER", ""),
"PASSWORD": os.environ.get("PGPASSWORD", ""),
"HOST": os.environ.get("PGHOST", ""),
}
}
# Parse database configuration from $DATABASE_URL
DATABASES["default"] = dj_database_url.config()
SITE_ID = 1
# File size limit
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
CORS_ALLOW_METHODS = [
"DELETE",
"GET",
"OPTIONS",
"PATCH",
"POST",
"PUT",
]
CORS_ALLOW_HEADERS = [
"accept",
"accept-encoding",
"authorization",
"content-type",
"dnt",
"origin",
"user-agent",
"x-csrftoken",
"x-requested-with",
]
CORS_ALLOW_CREDENTIALS = True
CORS_ALLOW_ALL_ORIGINS = True
STORAGES = {
"staticfiles": {
"BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
},
}
INSTALLED_APPS += ("storages",)
STORAGES["default"] = {"BACKEND": "storages.backends.s3boto3.S3Boto3Storage"}
# The AWS access key to use.
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "access-key")
# The AWS secret access key to use.
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "secret-key")
# The name of the bucket to store files in.
AWS_STORAGE_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME", "uploads")
# The full URL to the S3 endpoint. Leave blank to use the default region URL.
AWS_S3_ENDPOINT_URL = os.environ.get(
"AWS_S3_ENDPOINT_URL", "http://plane-minio:9000"
)
# Default permissions
AWS_DEFAULT_ACL = "public-read"
AWS_QUERYSTRING_AUTH = False
AWS_S3_FILE_OVERWRITE = False
# Custom Domain settings
parsed_url = urlparse(os.environ.get("WEB_URL", "http://localhost"))
AWS_S3_CUSTOM_DOMAIN = f"{parsed_url.netloc}/{AWS_STORAGE_BUCKET_NAME}"
AWS_S3_URL_PROTOCOL = f"{parsed_url.scheme}:"
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# Allow all host headers
ALLOWED_HOSTS = [
"*",
]
# Security settings
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
# Redis URL
REDIS_URL = os.environ.get("REDIS_URL")
# Caches
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
},
}
}
# URL used for email redirects
WEB_URL = os.environ.get("WEB_URL", "http://localhost")
# Celery settings
CELERY_BROKER_URL = REDIS_URL
CELERY_RESULT_BACKEND = REDIS_URL
# Enable or Disable signups
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Analytics
ANALYTICS_BASE_API = False
# OPEN AI Settings
OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")

View File

@ -1,223 +0,0 @@
"""Production settings and globals."""
from urllib.parse import urlparse
import ssl
import certifi
import dj_database_url
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.redis import RedisIntegration
from .common import * # noqa
# Database
DEBUG = int(os.environ.get("DEBUG", 1)) == 1
DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql",
"NAME": os.environ.get("PGUSER", "plane"),
"USER": "",
"PASSWORD": "",
"HOST": os.environ.get("PGHOST", "localhost"),
}
}
# CORS WHITELIST ON PROD
CORS_ORIGIN_WHITELIST = [
# "https://example.com",
# "https://sub.example.com",
# "http://localhost:8080",
# "http://127.0.0.1:9000"
]
# Parse database configuration from $DATABASE_URL
DATABASES["default"] = dj_database_url.config()
SITE_ID = 1
# Enable Connection Pooling (if desired)
# DATABASES['default']['ENGINE'] = 'django_postgrespool'
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# Allow all host headers
ALLOWED_HOSTS = ["*"]
# TODO: Make it FALSE and LIST DOMAINS IN FULL PROD.
CORS_ALLOW_ALL_ORIGINS = True
STORAGES = {
"staticfiles": {
"BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
},
}
# Make true if running in a docker environment
DOCKERIZED = int(os.environ.get("DOCKERIZED", 0)) == 1
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1
sentry_sdk.init(
dsn=os.environ.get("SENTRY_DSN"),
integrations=[DjangoIntegration(), RedisIntegration()],
# If you wish to associate users to errors (assuming you are using
# django.contrib.auth) you may enable sending PII data.
traces_sample_rate=1,
send_default_pii=True,
environment="staging",
profiles_sample_rate=1.0,
)
# The AWS region to connect to.
AWS_REGION = os.environ.get("AWS_REGION")
# The AWS access key to use.
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID")
# The AWS secret access key to use.
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY")
# The optional AWS session token to use.
# AWS_SESSION_TOKEN = ""
# The name of the bucket to store files in.
AWS_S3_BUCKET_NAME = os.environ.get("AWS_S3_BUCKET_NAME")
# How to construct S3 URLs ("auto", "path", "virtual").
AWS_S3_ADDRESSING_STYLE = "auto"
# The full URL to the S3 endpoint. Leave blank to use the default region URL.
AWS_S3_ENDPOINT_URL = os.environ.get("AWS_S3_ENDPOINT_URL", "")
# A prefix to be applied to every stored file. This will be joined to every filename using the "/" separator.
AWS_S3_KEY_PREFIX = ""
# Whether to enable authentication for stored files. If True, then generated URLs will include an authentication
# token valid for `AWS_S3_MAX_AGE_SECONDS`. If False, then generated URLs will not include an authentication token,
# and their permissions will be set to "public-read".
AWS_S3_BUCKET_AUTH = False
# How long generated URLs are valid for. This affects the expiry of authentication tokens if `AWS_S3_BUCKET_AUTH`
# is True. It also affects the "Cache-Control" header of the files.
# Important: Changing this setting will not affect existing files.
AWS_S3_MAX_AGE_SECONDS = 60 * 60 # 1 hours.
# A URL prefix to be used for generated URLs. This is useful if your bucket is served through a CDN. This setting
# cannot be used with `AWS_S3_BUCKET_AUTH`.
AWS_S3_PUBLIC_URL = ""
# If True, then files will be stored with reduced redundancy. Check the S3 documentation and make sure you
# understand the consequences before enabling.
# Important: Changing this setting will not affect existing files.
AWS_S3_REDUCED_REDUNDANCY = False
# The Content-Disposition header used when the file is downloaded. This can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_CONTENT_DISPOSITION = ""
# The Content-Language header used when the file is downloaded. This can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_CONTENT_LANGUAGE = ""
# A mapping of custom metadata for each file. Each value can be a string, or a function taking a
# single `name` argument.
# Important: Changing this setting will not affect existing files.
AWS_S3_METADATA = {}
# If True, then files will be stored using AES256 server-side encryption.
# If this is a string value (e.g., "aws:kms"), that encryption type will be used.
# Otherwise, server-side encryption is not be enabled.
# Important: Changing this setting will not affect existing files.
AWS_S3_ENCRYPT_KEY = False
# The AWS S3 KMS encryption key ID (the `SSEKMSKeyId` parameter) is set from this string if present.
# This is only relevant if AWS S3 KMS server-side encryption is enabled (above).
# AWS_S3_KMS_ENCRYPTION_KEY_ID = ""
# If True, then text files will be stored using gzip content encoding. Files will only be gzipped if their
# compressed size is smaller than their uncompressed size.
# Important: Changing this setting will not affect existing files.
AWS_S3_GZIP = True
# The signature version to use for S3 requests.
AWS_S3_SIGNATURE_VERSION = None
# If True, then files with the same name will overwrite each other. By default it's set to False to have
# extra characters appended.
AWS_S3_FILE_OVERWRITE = False
# AWS Settings End
STORAGES["default"] = {
"BACKEND": "django_s3_storage.storage.S3Storage",
}
# Enable Connection Pooling (if desired)
# DATABASES['default']['ENGINE'] = 'django_postgrespool'
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# Allow all host headers
ALLOWED_HOSTS = [
"*",
]
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
REDIS_URL = os.environ.get("REDIS_URL")
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": REDIS_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
"CONNECTION_POOL_KWARGS": {"ssl_cert_reqs": False},
},
}
}
RQ_QUEUES = {
"default": {
"USE_REDIS_CACHE": "default",
}
}
WEB_URL = os.environ.get("WEB_URL")
PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False)
ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)
OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")
SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)
LOGGER_BASE_URL = os.environ.get("LOGGER_BASE_URL", False)
redis_url = os.environ.get("REDIS_URL")
broker_url = (
f"{redis_url}?ssl_cert_reqs={ssl.CERT_NONE.name}&ssl_ca_certs={certifi.where()}"
)
CELERY_RESULT_BACKEND = broker_url
CELERY_BROKER_URL = broker_url
GITHUB_ACCESS_TOKEN = os.environ.get("GITHUB_ACCESS_TOKEN", False)
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Unsplash Access key
UNSPLASH_ACCESS_KEY = os.environ.get("UNSPLASH_ACCESS_KEY")

View File

@ -1,45 +1,9 @@
from __future__ import absolute_import
"""Test Settings"""
from .common import * # noqa
DEBUG = True
INSTALLED_APPS.append("plane.tests")
# Send it in a dummy outbox
EMAIL_BACKEND = "django.core.mail.backends.locmem.EmailBackend"
if os.environ.get('GITHUB_WORKFLOW'):
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': 'github_actions',
'USER': 'postgres',
'PASSWORD': 'postgres',
'HOST': '127.0.0.1',
'PORT': '5432',
}
}
else:
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': 'plane_test',
'USER': 'postgres',
'PASSWORD': 'password123',
'HOST': '127.0.0.1',
'PORT': '5432',
}
}
REDIS_HOST = "localhost"
REDIS_PORT = 6379
REDIS_URL = False
RQ_QUEUES = {
"default": {
"HOST": "localhost",
"PORT": 6379,
"DB": 0,
"DEFAULT_TIMEOUT": 360,
},
}
WEB_URL = "http://localhost:3000"
INSTALLED_APPS.append("plane.tests",)

View File

@ -11,6 +11,7 @@ from django.conf import settings
urlpatterns = [
path("", TemplateView.as_view(template_name="index.html")),
path("api/", include("plane.api.urls")),
path("api/v1/", include("plane.proxy.urls")),
path("", include("plane.web.urls")),
]

View File

@ -1,9 +1,9 @@
# base requirements
Django==4.2.5
Django==4.2.7
django-braces==1.15.0
django-taggit==4.0.0
psycopg==3.1.10
psycopg==3.1.12
django-oauth-toolkit==2.3.0
mistune==3.0.1
djangorestframework==3.14.0
@ -17,7 +17,7 @@ django-filter==23.2
jsonmodels==2.6.0
djangorestframework-simplejwt==5.3.0
sentry-sdk==1.30.0
django-s3-storage==0.14.0
django-storages==1.14
django-crum==0.7.9
django-guardian==2.4.0
dj_rest_auth==2.2.5
@ -30,8 +30,9 @@ openai==0.28.0
slack-sdk==3.21.3
celery==5.3.4
django_celery_beat==2.5.0
psycopg-binary==3.1.10
psycopg-c==3.1.10
psycopg-binary==3.1.12
psycopg-c==3.1.12
scout-apm==2.26.1
openpyxl==3.1.2
beautifulsoup4==4.12.2
beautifulsoup4==4.12.2
dj-database-url==2.1.0

View File

@ -1,9 +1,7 @@
-r base.txt
dj-database-url==2.1.0
gunicorn==21.2.0
whitenoise==6.5.0
django-storages==1.14
boto3==1.28.40
django-anymail==10.1
django-debug-toolbar==4.1.0

View File

@ -5,7 +5,7 @@
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="format-detection" content="telephone=no">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>{{ Inviter }} invited you to join {{ Workspace-Name }} on Plane</title>
<title>{{ first_name }} invited you to join {{ project_name }} on Plane</title>
<style type="text/css" emogrify="no">#outlook a { padding:0; } .ExternalClass { width:100%; } .ExternalClass, .ExternalClass p, .ExternalClass span, .ExternalClass font, .ExternalClass td, .ExternalClass div { line-height: 100%; } table td { border-collapse: collapse; mso-line-height-rule: exactly; } .editable.image { font-size: 0 !important; line-height: 0 !important; } .nl2go_preheader { display: none !important; mso-hide:all !important; mso-line-height-rule: exactly; visibility: hidden !important; line-height: 0px !important; font-size: 0px !important; } body { width:100% !important; -webkit-text-size-adjust:100%; -ms-text-size-adjust:100%; margin:0; padding:0; } img { outline:none; text-decoration:none; -ms-interpolation-mode: bicubic; } a img { border:none; } table { border-collapse:collapse; mso-table-lspace:0pt; mso-table-rspace:0pt; } th { font-weight: normal; text-align: left; } *[class="gmail-fix"] { display: none !important; } </style>
<style type="text/css" emogrify="no"> @media (max-width: 600px) { .gmx-killpill { content: ' \03D1';} } </style>
<style type="text/css" emogrify="no">@media (max-width: 600px) { .gmx-killpill { content: ' \03D1';} .r0-c { box-sizing: border-box !important; text-align: center !important; valign: top !important; width: 320px !important } .r1-o { border-style: solid !important; margin: 0 auto 0 auto !important; width: 320px !important } .r2-i { background-color: #ffffff !important } .r3-c { box-sizing: border-box !important; text-align: center !important; valign: top !important; width: 100% !important } .r4-o { border-style: solid !important; margin: 0 auto 0 auto !important; margin-top: 20px !important; width: 100% !important } .r5-i { background-color: #f8f9fa !important; padding-bottom: 20px !important; padding-left: 10px !important; padding-right: 10px !important; padding-top: 20px !important } .r6-c { box-sizing: border-box !important; display: block !important; valign: top !important; width: 100% !important } .r7-o { border-style: solid !important; width: 100% !important } .r8-i { padding-left: 0px !important; padding-right: 0px !important } .r9-o { border-style: solid !important; margin: 0 auto 0 auto !important; width: 100% !important } .r10-i { padding-bottom: 35px !important; padding-top: 15px !important } .r11-c { box-sizing: border-box !important; text-align: left !important; valign: top !important; width: 100% !important } .r12-o { border-style: solid !important; margin: 0 auto 0 0 !important; width: 100% !important } .r13-i { padding-left: 20px !important; padding-right: 20px !important; padding-top: 0px !important; text-align: center !important } .r14-o { border-style: solid !important; margin: 0 auto 0 auto !important; margin-bottom: 20px !important; margin-top: 20px !important; width: 100% !important } .r15-i { text-align: center !important } .r16-r { background-color: #ffffff !important; border-color: #3f76ff !important; border-radius: 4px !important; border-width: 1px !important; box-sizing: border-box; height: initial !important; padding-bottom: 7px !important; padding-left: 20px !important; padding-right: 20px !important; padding-top: 7px !important; text-align: center !important; width: 100% !important } .r17-i { padding-bottom: 15px !important; padding-left: 20px !important; padding-right: 20px !important; padding-top: 15px !important; text-align: left !important } .r18-i { background-color: #eff2f7 !important; padding-bottom: 20px !important; padding-left: 15px !important; padding-right: 15px !important; padding-top: 20px !important } .r19-i { padding-bottom: 15px !important; padding-top: 15px !important } .r20-i { color: #3b3f44 !important; padding-bottom: 0px !important; padding-top: 0px !important; text-align: center !important } .r21-c { box-sizing: border-box !important; text-align: center !important; width: 100% !important } .r22-c { box-sizing: border-box !important; width: 100% !important } .r23-i { font-size: 0px !important; padding-bottom: 15px !important; padding-left: 65px !important; padding-right: 65px !important; padding-top: 15px !important } .r24-c { box-sizing: border-box !important; width: 32px !important } .r25-o { border-style: solid !important; margin-right: 8px !important; width: 32px !important } .r26-i { padding-bottom: 5px !important; padding-top: 5px !important } .r27-o { border-style: solid !important; margin-right: 0px !important; width: 32px !important } .r28-i { color: #3b3f44 !important; padding-bottom: 15px !important; padding-top: 15px !important; text-align: center !important } .r29-i { padding-bottom: 15px !important; padding-left: 0px 
!important; padding-right: 0px !important; padding-top: 0px !important } .r30-c { box-sizing: border-box !important; text-align: center !important; valign: top !important; width: 129px !important } .r31-o { border-style: solid !important; margin: 0 auto 0 auto !important; width: 129px !important } body { -webkit-text-size-adjust: none } .nl2go-responsive-hide { display: none } .nl2go-body-table { min-width: unset !important } .mobshow { height: auto !important; overflow: visible !important; max-height: unset !important; visibility: visible !important; border: none !important } .resp-table { display: inline-table !important } .magic-resp { display: table-cell !important } } </style>

View File

@ -9,8 +9,9 @@ x-app-env : &app-env
- NEXT_PUBLIC_DEPLOY_URL=${NEXT_PUBLIC_DEPLOY_URL:-http://localhost/spaces}
- SENTRY_DSN=${SENTRY_DSN:-""}
- GITHUB_CLIENT_SECRET=${GITHUB_CLIENT_SECRET:-""}
- DOCKERIZED=${DOCKERIZED:-1}
# Gunicorn Workers
- DOCKERIZED=${DOCKERIZED:-1} # deprecated
- CORS_ALLOWED_ORIGINS=${CORS_ALLOWED_ORIGINS:-""}
# Gunicorn Workers
- GUNICORN_WORKERS=${GUNICORN_WORKERS:-2}
#DB SETTINGS
- PGHOST=${PGHOST:-plane-db}
@ -34,7 +35,7 @@ x-app-env : &app-env
- EMAIL_USE_SSL=${EMAIL_USE_SSL:-0}
- DEFAULT_EMAIL=${DEFAULT_EMAIL:-captain@plane.so}
- DEFAULT_PASSWORD=${DEFAULT_PASSWORD:-password123}
# OPENAI SETTINGS
# OPENAI SETTINGS - Deprecated can be configured through admin panel
- OPENAI_API_BASE=${OPENAI_API_BASE:-https://api.openai.com/v1}
- OPENAI_API_KEY=${OPENAI_API_KEY:-"sk-"}
- GPT_ENGINE=${GPT_ENGINE:-"gpt-3.5-turbo"}
@ -55,6 +56,8 @@ x-app-env : &app-env
- BUCKET_NAME=${BUCKET_NAME:-uploads}
- FILE_SIZE_LIMIT=${FILE_SIZE_LIMIT:-5242880}
services:
web:
<<: *app-env
@ -143,14 +146,6 @@ services:
volumes:
- uploads:/export
createbuckets:
<<: *app-env
image: minio/mc
entrypoint: >
/bin/sh -c " /usr/bin/mc config host add plane-minio http://plane-minio:9000 \$AWS_ACCESS_KEY_ID \$AWS_SECRET_ACCESS_KEY; /usr/bin/mc mb plane-minio/\$AWS_S3_BUCKET_NAME; /usr/bin/mc anonymous set download plane-minio/\$AWS_S3_BUCKET_NAME; exit 0; "
depends_on:
- plane-minio
# Comment this if you already have a reverse proxy running
proxy:
<<: *app-env

View File

@ -11,7 +11,9 @@ NEXT_PUBLIC_ENABLE_OAUTH=0
NEXT_PUBLIC_DEPLOY_URL=http://localhost/spaces
SENTRY_DSN=""
GITHUB_CLIENT_SECRET=""
DOCKERIZED=1
DOCKERIZED=1 # deprecated
CORS_ALLOWED_ORIGINS=""
#DB SETTINGS
PGHOST=plane-db
@ -39,9 +41,9 @@ DEFAULT_EMAIL=captain@plane.so
DEFAULT_PASSWORD=password123
# OPENAI SETTINGS
OPENAI_API_BASE=https://api.openai.com/v1
OPENAI_API_KEY="sk-"
GPT_ENGINE="gpt-3.5-turbo"
OPENAI_API_BASE=https://api.openai.com/v1 # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# LOGIN/SIGNUP SETTINGS
ENABLE_SIGNUP=1

View File

@ -35,17 +35,6 @@ services:
MINIO_ROOT_USER: ${AWS_ACCESS_KEY_ID}
MINIO_ROOT_PASSWORD: ${AWS_SECRET_ACCESS_KEY}
createbuckets:
image: minio/mc
networks:
- dev_env
entrypoint: >
/bin/sh -c " /usr/bin/mc config host add plane-minio http://plane-minio:9000 \$AWS_ACCESS_KEY_ID \$AWS_SECRET_ACCESS_KEY; /usr/bin/mc mb plane-minio/\$AWS_S3_BUCKET_NAME; /usr/bin/mc anonymous set download plane-minio/\$AWS_S3_BUCKET_NAME; exit 0; "
env_file:
- .env
depends_on:
- plane-minio
plane-db:
container_name: plane-db
image: postgres:15.2-alpine

View File

@ -108,15 +108,6 @@ services:
MINIO_ROOT_USER: ${AWS_ACCESS_KEY_ID}
MINIO_ROOT_PASSWORD: ${AWS_SECRET_ACCESS_KEY}
createbuckets:
image: minio/mc
entrypoint: >
/bin/sh -c " /usr/bin/mc config host add plane-minio http://plane-minio:9000 \$AWS_ACCESS_KEY_ID \$AWS_SECRET_ACCESS_KEY; /usr/bin/mc mb plane-minio/\$AWS_S3_BUCKET_NAME; /usr/bin/mc anonymous set download plane-minio/\$AWS_S3_BUCKET_NAME; exit 0; "
env_file:
- .env
depends_on:
- plane-minio
# Comment this if you already have a reverse proxy running
proxy:
container_name: proxy

View File

@ -138,7 +138,10 @@ const MenuItem: React.FC<ICustomMenuItemProps> = (props) => {
className={`w-full select-none truncate rounded px-1 py-1.5 text-left text-custom-text-200 hover:bg-custom-background-80 ${
active ? "bg-custom-background-80" : ""
} ${className}`}
onClick={onClick}
onClick={(e) => {
close();
onClick && onClick(e);
}}
>
{children}
</button>
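For context, the menu item now closes the surrounding dropdown itself before delegating to the caller's handler, so consumers no longer need to close the menu in their own onClick. A minimal usage sketch, assuming the menu is exported as CustomMenu from @plane/ui; the component name, import path, ellipsis prop, and handler are illustrative, not taken from this changeset:

import { CustomMenu } from "@plane/ui";

// Hypothetical consumer: the item closes the menu before running this handler.
export const IssueQuickActions = () => (
  <CustomMenu ellipsis>
    <CustomMenu.MenuItem onClick={() => navigator.clipboard.writeText(window.location.href)}>
      Copy link
    </CustomMenu.MenuItem>
  </CustomMenu>
);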

View File

@ -15,24 +15,42 @@ import { IPriorityIcon } from "./type";
export const PriorityIcon: React.FC<IPriorityIcon> = ({
priority,
className = "",
transparentBg = false
}) => {
if (!className || className === "") className = "h-3.5 w-3.5";
if (!className || className === "") className = "h-4 w-4";
// Convert to lowercase for string comparison
const lowercasePriority = priority?.toLowerCase();
//get priority icon
const getPriorityIcon = (): React.ReactNode => {
switch (lowercasePriority) {
case 'urgent':
return <AlertCircle className={`text-red-500 ${ transparentBg ? '' : 'p-0.5' } ${className}`} />;
case 'high':
return <SignalHigh className={`text-orange-500 ${ transparentBg ? '' : 'pl-1' } ${className}`} />;
case 'medium':
return <SignalMedium className={`text-yellow-500 ${ transparentBg ? '' : 'ml-1.5' } ${className}`} />;
case 'low':
return <SignalLow className={`text-green-500 ${ transparentBg ? '' : 'ml-2' } ${className}`} />;
default:
return <Ban className={`text-custom-text-200 ${ transparentBg ? '' : 'p-0.5' } ${className}`} />;
}
};
return (
<>
{lowercasePriority === "urgent" ? (
<AlertCircle className={`text-red-500 ${className}`} />
) : lowercasePriority === "high" ? (
<SignalHigh className={`text-orange-500 ${className}`} />
) : lowercasePriority === "medium" ? (
<SignalMedium className={`text-yellow-500 ${className}`} />
) : lowercasePriority === "low" ? (
<SignalLow className={`text-green-500 ${className}`} />
{ transparentBg ? (
getPriorityIcon()
) : (
<Ban className={`text-custom-text-200 ${className}`} />
<div className={`grid h-5 w-5 place-items-center rounded border items-center ${
lowercasePriority === "urgent"
? "border-red-500/20 bg-red-500/20"
: "border-custom-border-200"
}`}
>
{ getPriorityIcon() }
</div>
)}
</>
);
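For reference, a minimal sketch of the two rendering modes this change introduces, assuming the icon is exported as PriorityIcon (the import path is illustrative): by default the icon is wrapped in the small bordered square shown above, while transparentBg renders only the bare icon for surfaces that supply their own background.

import { PriorityIcon } from "@plane/ui";

// Hypothetical demo: boxed default variant next to the bare, transparent-background variant.
export const PriorityIconDemo = () => (
  <div className="flex items-center gap-2">
    <PriorityIcon priority="urgent" />
    <PriorityIcon priority="high" transparentBg className="h-4 w-4" />
  </div>
);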

View File

@ -7,4 +7,5 @@ export type TIssuePriorities = "urgent" | "high" | "medium" | "low" | "none";
export interface IPriorityIcon {
priority: TIssuePriorities | null;
className?: string;
transparentBg?: boolean | false;
}

View File

@ -0,0 +1,55 @@
import { TextArea } from "@plane/ui";
import { Control, Controller, FieldErrors } from "react-hook-form";
import { IApiToken } from "types/api_token";
import { IApiFormFields } from "./types";
import { Dispatch, SetStateAction } from "react";
interface IApiTokenDescription {
generatedToken: IApiToken | null | undefined;
control: Control<IApiFormFields, any>;
focusDescription: boolean;
setFocusTitle: Dispatch<SetStateAction<boolean>>;
setFocusDescription: Dispatch<SetStateAction<boolean>>;
}
export const ApiTokenDescription = ({
generatedToken,
control,
focusDescription,
setFocusTitle,
setFocusDescription,
}: IApiTokenDescription) => (
<Controller
control={control}
name="description"
render={({ field: { value, onChange } }) =>
focusDescription ? (
<TextArea
id="description"
name="description"
autoFocus={true}
onBlur={() => {
setFocusDescription(false);
}}
value={value}
defaultValue={value}
onChange={onChange}
placeholder="Description"
className="mt-3"
rows={3}
/>
) : (
<p
onClick={() => {
if (generatedToken != null) return;
setFocusTitle(false);
setFocusDescription(true);
}}
className={`${value.length === 0 ? "text-custom-text-400/60" : "text-custom-text-300"} text-lg pt-3`}
>
{value.length != 0 ? value : "Description"}
</p>
)
}
/>
);

View File

@ -0,0 +1,110 @@
import { Menu, Transition } from "@headlessui/react";
import { ToggleSwitch } from "@plane/ui";
import { Dispatch, Fragment, SetStateAction } from "react";
import { Control, Controller } from "react-hook-form";
import { IApiFormFields } from "./types";
interface IApiTokenExpiry {
neverExpires: boolean;
selectedExpiry: number;
setSelectedExpiry: Dispatch<SetStateAction<number>>;
setNeverExpire: Dispatch<SetStateAction<boolean>>;
renderExpiry: () => string;
control: Control<IApiFormFields, any>;
}
export const expiryOptions = [
{
title: "7 Days",
days: 7,
},
{
title: "30 Days",
days: 30,
},
{
title: "1 Month",
days: 30,
},
{
title: "3 Months",
days: 90,
},
{
title: "1 Year",
days: 365,
},
];
export const ApiTokenExpiry = ({
neverExpires,
selectedExpiry,
setSelectedExpiry,
setNeverExpire,
renderExpiry,
control,
}: IApiTokenExpiry) => (
<>
<Menu>
<p className="text-sm font-medium mb-2"> Expiration Date</p>
<Menu.Button className={"w-[40%]"} disabled={neverExpires}>
<div className="py-3 w-full font-medium px-3 flex border border-custom-border-200 rounded-md justify-center items-baseline">
<p className={`text-base ${neverExpires ? "text-custom-text-400/40" : ""}`}>
{expiryOptions[selectedExpiry].title.toLocaleLowerCase()}
</p>
<p className={`text-sm mr-auto ml-2 text-custom-text-400${neverExpires ? "/40" : ""}`}>({renderExpiry()})</p>
</div>
</Menu.Button>
<Transition
as={Fragment}
enter="transition ease-out duration-100"
enterFrom="transform opacity-0 scale-95"
enterTo="transform opacity-100 scale-100"
leave="transition ease-in duration-75"
leaveFrom="transform opacity-100 scale-100"
leaveTo="transform opacity-0 scale-95"
>
<Menu.Items className="absolute z-10 overflow-y-scroll whitespace-nowrap rounded-sm max-h-36 border origin-top-right mt-1 overflow-auto min-w-[10rem] border-custom-border-100 p-1 shadow-lg focus:outline-none bg-custom-background-100">
{expiryOptions.map((option, index) => (
<Menu.Item key={index}>
{({ active }) => (
<div className="py-1">
<button
type="button"
onClick={() => {
setSelectedExpiry(index);
}}
className={`w-full text-sm select-none truncate rounded px-3 py-1.5 text-left text-custom-text-300 hover:bg-custom-background-80 ${
active ? "bg-custom-background-80" : ""
}`}
>
{option.title}
</button>
</div>
)}
</Menu.Item>
))}
</Menu.Items>
</Transition>
</Menu>
<div className="mt-4 mb-6 flex items-center">
<span className="text-sm font-medium"> Never Expires</span>
<Controller
control={control}
name="never_expires"
render={({ field: { onChange, value } }) => (
<ToggleSwitch
className="ml-3"
value={value}
onChange={(val) => {
onChange(val);
setNeverExpire(val);
}}
size="sm"
/>
)}
/>
</div>
</>
);

View File

@ -0,0 +1,53 @@
import { Button } from "@plane/ui";
import useToast from "hooks/use-toast";
import { Copy } from "lucide-react";
import { Dispatch, SetStateAction } from "react";
import { IApiToken } from "types/api_token";
interface IApiTokenKeySection {
generatedToken: IApiToken | null | undefined;
renderExpiry: () => string;
setDeleteTokenModal: Dispatch<SetStateAction<boolean>>;
}
export const ApiTokenKeySection = ({ generatedToken, renderExpiry, setDeleteTokenModal }: IApiTokenKeySection) => {
const { setToastAlert } = useToast();
return generatedToken ? (
<div className={`mt-${generatedToken ? "8" : "16"}`}>
<p className="font-medium text-base pb-2">Api key created successfully</p>
<p className="text-sm pb-4 w-[80%] text-custom-text-400/60">
Save this API key somewhere safe. You will not be able to view it again once you close this page or reload this
page.
</p>
<Button variant="neutral-primary" className="py-3 w-[85%] flex justify-between items-center">
<p className="font-medium text-base">{generatedToken.token}</p>
<Copy
size={18}
color="#B9B9B9"
onClick={() => {
navigator.clipboard.writeText(generatedToken.token);
setToastAlert({
message: "The Secret key has been successfully copied to your clipboard",
type: "success",
title: "Copied to clipboard",
});
}}
/>
</Button>
<p className="mt-2 text-sm text-custom-text-400/60">
{generatedToken.expired_at ? "Expires on " + renderExpiry() : "Never Expires"}
</p>
<button
className="border py-3 px-5 text-custom-primary-100 text-sm mt-8 rounded-md border-custom-primary-100 w-fit font-medium"
onClick={(e) => {
e.preventDefault();
setDeleteTokenModal(true);
}}
>
Revoke
</button>
</div>
) : null;
};

View File

@ -0,0 +1,69 @@
import { Input } from "@plane/ui";
import { Dispatch, SetStateAction } from "react";
import { Control, Controller, FieldErrors } from "react-hook-form";
import { IApiToken } from "types/api_token";
import { IApiFormFields } from "./types";
interface IApiTokenTitle {
generatedToken: IApiToken | null | undefined;
errors: FieldErrors<IApiFormFields>;
control: Control<IApiFormFields, any>;
focusTitle: boolean;
setFocusTitle: Dispatch<SetStateAction<boolean>>;
setFocusDescription: Dispatch<SetStateAction<boolean>>;
}
export const ApiTokenTitle = ({
generatedToken,
errors,
control,
focusTitle,
setFocusTitle,
setFocusDescription,
}: IApiTokenTitle) => (
<Controller
control={control}
name="title"
rules={{
required: "Title is required",
maxLength: {
value: 255,
message: "Title should be less than 255 characters",
},
}}
render={({ field: { value, onChange, ref } }) =>
focusTitle ? (
<Input
id="title"
name="title"
type="text"
inputSize="md"
onBlur={() => {
setFocusTitle(false);
}}
onError={() => {
console.log("error");
}}
autoFocus={true}
value={value}
onChange={onChange}
ref={ref}
hasError={!!errors.title}
placeholder="Title"
className="resize-none text-xl w-full"
/>
) : (
<p
onClick={() => {
if (generatedToken != null) return;
setFocusDescription(false);
setFocusTitle(true);
}}
className={`${value.length === 0 ? "text-custom-text-400/60" : ""} font-medium text-[24px]`}
>
{value.length != 0 ? value : "Api Title"}
</p>
)
}
/>
);

View File

@ -0,0 +1,143 @@
import { useForm } from "react-hook-form";
import { addDays, renderDateFormat } from "helpers/date-time.helper";
import { IApiToken } from "types/api_token";
import { csvDownload } from "helpers/download.helper";
import { useRouter } from "next/router";
import { Dispatch, SetStateAction, useState } from "react";
import useToast from "hooks/use-toast";
import { useMobxStore } from "lib/mobx/store-provider";
import { ApiTokenService } from "services/api_token.service";
import { ApiTokenTitle } from "./ApiTokenTitle";
import { ApiTokenDescription } from "./ApiTokenDescription";
import { ApiTokenExpiry, expiryOptions } from "./ApiTokenExpiry";
import { Button } from "@plane/ui";
import { ApiTokenKeySection } from "./ApiTokenKeySection";
interface IApiTokenForm {
generatedToken: IApiToken | null | undefined;
setGeneratedToken: Dispatch<SetStateAction<IApiToken | null | undefined>>;
setDeleteTokenModal: Dispatch<SetStateAction<boolean>>;
}
const apiTokenService = new ApiTokenService();
export const ApiTokenForm = ({ generatedToken, setGeneratedToken, setDeleteTokenModal }: IApiTokenForm) => {
const [loading, setLoading] = useState<boolean>(false);
const [neverExpires, setNeverExpire] = useState<boolean>(false);
const [focusTitle, setFocusTitle] = useState<boolean>(false);
const [focusDescription, setFocusDescription] = useState<boolean>(false);
const [selectedExpiry, setSelectedExpiry] = useState<number>(1);
const { setToastAlert } = useToast();
const { theme: themStore } = useMobxStore();
const router = useRouter();
const { workspaceSlug } = router.query;
const {
control,
handleSubmit,
formState: { errors },
} = useForm({
defaultValues: {
never_expires: false,
title: "",
description: "",
},
});
const getExpiryDate = (): string | null => {
if (neverExpires === true) return null;
return addDays({ date: new Date(), days: expiryOptions[selectedExpiry].days }).toISOString();
};
function renderExpiry(): string {
return renderDateFormat(addDays({ date: new Date(), days: expiryOptions[selectedExpiry].days }), true);
}
const downloadSecretKey = (token: IApiToken) => {
const csvData = {
Label: token.label,
Description: token.description,
Expiry: renderDateFormat(token.expired_at ?? null),
"Secret Key": token.token,
};
csvDownload(csvData, `Secret-key-${Date.now()}`);
};
const generateToken = async (data: any) => {
if (!workspaceSlug) return;
setLoading(true);
await apiTokenService
.createApiToken(workspaceSlug.toString(), {
label: data.title,
description: data.description,
expired_at: getExpiryDate(),
})
.then((res) => {
setGeneratedToken(res);
downloadSecretKey(res);
setLoading(false);
})
.catch((err) => {
setToastAlert({
message: err.message,
type: "error",
title: "Error",
});
});
};
return (
<form
onSubmit={handleSubmit(generateToken, (err) => {
if (err.title) {
setFocusTitle(true);
}
})}
className={`${themStore.sidebarCollapsed ? "xl:w-[50%] lg:w-[60%] " : "w-[60%]"} mx-auto py-8`}
>
<div className="border-b border-custom-border-200 pb-4">
<ApiTokenTitle
generatedToken={generatedToken}
control={control}
errors={errors}
focusTitle={focusTitle}
setFocusTitle={setFocusTitle}
setFocusDescription={setFocusDescription}
/>
{errors.title && focusTitle && <p className=" text-red-600">{errors.title.message}</p>}
<ApiTokenDescription
generatedToken={generatedToken}
control={control}
focusDescription={focusDescription}
setFocusTitle={setFocusTitle}
setFocusDescription={setFocusDescription}
/>
</div>
{!generatedToken && (
<div className="mt-12">
<>
<ApiTokenExpiry
neverExpires={neverExpires}
selectedExpiry={selectedExpiry}
setSelectedExpiry={setSelectedExpiry}
setNeverExpire={setNeverExpire}
renderExpiry={renderExpiry}
control={control}
/>
<Button variant="primary" type="submit">
{loading ? "generating..." : "Add Api key"}
</Button>
</>
</div>
)}
<ApiTokenKeySection
generatedToken={generatedToken}
renderExpiry={renderExpiry}
setDeleteTokenModal={setDeleteTokenModal}
/>
</form>
);
};
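For reference, the payload the form submits reduces to a label, a description, and an ISO expired_at string (or null when the token never expires). A minimal sketch of the same call outside the form, assuming only the (workspaceSlug, data) shape used above; the label and description values are illustrative:

import { ApiTokenService } from "services/api_token.service";

const apiTokenService = new ApiTokenService();

// Creates a token that expires in 30 days; pass expired_at: null for a non-expiring token.
export const createThirtyDayToken = async (workspaceSlug: string) => {
  const expiredAt = new Date(Date.now() + 30 * 24 * 60 * 60 * 1000).toISOString();
  return apiTokenService.createApiToken(workspaceSlug, {
    label: "CI token",
    description: "Used by the deployment pipeline",
    expired_at: expiredAt,
  });
};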

View File

@ -0,0 +1,5 @@
export interface IApiFormFields {
never_expires: boolean;
title: string;
description: string;
}

View File

@ -0,0 +1,43 @@
import Link from "next/link";
// helpers
import { formatLongDateDistance, timeAgo } from "helpers/date-time.helper";
// icons
import { XCircle } from "lucide-react";
import { IApiToken } from "types/api_token";
interface IApiTokenListItem {
workspaceSlug: string | string[] | undefined;
token: IApiToken;
}
export const ApiTokenListItem = ({ token, workspaceSlug }: IApiTokenListItem) => (
<Link href={`/${workspaceSlug}/settings/api-tokens/${token.id}`} key={token.id}>
<div className="border-b flex flex-col relative justify-center items-start border-custom-border-200 py-5 hover:cursor-pointer">
<XCircle className="absolute right-5 opacity-0 pointer-events-none group-hover:opacity-100 group-hover:pointer-events-auto justify-self-center stroke-custom-text-400 h-[15px] w-[15px]" />
<div className="flex items-center px-4">
<span className="text-sm font-medium leading-6">{token.label}</span>
<span
className={`${
token.is_active ? "bg-green-600/10 text-green-600" : "bg-custom-text-400/20 text-custom-text-400"
} flex items-center px-2 h-4 rounded-sm max-h-fit ml-2 text-xs font-medium`}
>
{token.is_active ? "Active" : "Expired"}
</span>
</div>
<div className="flex items-center px-4 w-full">
{token.description.length != 0 && (
<p className="text-sm mb-1 mr-3 font-medium leading-6 truncate max-w-[50%]">{token.description}</p>
)}
{
<p className="text-xs mb-1 leading-6 text-custom-text-400">
{token.is_active
? token.expired_at === null
? "Never Expires"
: `Expires in ${formatLongDateDistance(token.expired_at!)}`
: timeAgo(token.expired_at)}
</p>
}
</div>
</div>
</Link>
);
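A minimal sketch of how the list item might be consumed, rendering one row per token; the wrapper component name and relative import path are illustrative:

import { useRouter } from "next/router";
import { IApiToken } from "types/api_token";
import { ApiTokenListItem } from "./ApiTokenListItem";

// Hypothetical list wrapper: each row links to /:workspaceSlug/settings/api-tokens/:tokenId.
export const ApiTokenList = ({ tokens }: { tokens: IApiToken[] }) => {
  const router = useRouter();
  const { workspaceSlug } = router.query;

  return (
    <div>
      {tokens.map((token) => (
        <ApiTokenListItem key={token.id} token={token} workspaceSlug={workspaceSlug} />
      ))}
    </div>
  );
};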

View File

@ -0,0 +1,111 @@
//react
import { useState, Fragment, FC } from "react";
//next
import { useRouter } from "next/router";
//ui
import { Button } from "@plane/ui";
//hooks
import useToast from "hooks/use-toast";
//services
import { ApiTokenService } from "services/api_token.service";
//headless ui
import { Dialog, Transition } from "@headlessui/react";
type Props = {
isOpen: boolean;
handleClose: () => void;
tokenId?: string;
};
const apiTokenService = new ApiTokenService();
const DeleteTokenModal: FC<Props> = ({ isOpen, handleClose, tokenId }) => {
const [deleteLoading, setDeleteLoading] = useState<boolean>(false);
const { setToastAlert } = useToast();
const router = useRouter();
const { workspaceSlug, tokenId: tokenIdFromQuery } = router.query;
const handleDeletion = () => {
if (!workspaceSlug || (!tokenIdFromQuery && !tokenId)) return;
const token = tokenId || tokenIdFromQuery;
setDeleteLoading(true);
apiTokenService
.deleteApiToken(workspaceSlug.toString(), token!.toString())
.then(() => {
setToastAlert({
message: "Token deleted successfully",
type: "success",
title: "Success",
});
router.replace(`/${workspaceSlug}/settings/api-tokens/`);
})
.catch((err) => {
setToastAlert({
message: err?.message,
type: "error",
title: "Error",
});
})
.finally(() => {
setDeleteLoading(false);
handleClose();
});
};
return (
<Transition.Root show={isOpen} as={Fragment}>
<Dialog as="div" className="relative z-20" onClose={handleClose}>
<Transition.Child
as={Fragment}
enter="ease-out duration-300"
enterFrom="opacity-0"
enterTo="opacity-100"
leave="ease-in duration-200"
leaveFrom="opacity-100"
leaveTo="opacity-0"
>
<div className="fixed inset-0 bg-custom-backdrop bg-opacity-50 transition-opacity" />
</Transition.Child>
<div className="fixed inset-0 z-20 overflow-y-auto">
<div className="flex min-h-full items-end justify-center p-4 text-center sm:items-center sm:p-0">
<Transition.Child
as={Fragment}
enter="ease-out duration-300"
enterFrom="opacity-0 translate-y-4 sm:translate-y-0 sm:scale-95"
enterTo="opacity-100 translate-y-0 sm:scale-100"
leave="ease-in duration-200"
leaveFrom="opacity-100 translate-y-0 sm:scale-100"
leaveTo="opacity-0 translate-y-4 sm:translate-y-0 sm:scale-95"
>
<Dialog.Panel className="relative transform overflow-hidden rounded-lg border border-custom-border-200 bg-custom-background-100 text-left shadow-xl transition-all sm:my-8 sm:w-full sm:max-w-2xl">
<div className="flex flex-col gap-3 p-6">
<div className="flex w-full items-center justify-start">
<h3 className="text-xl font-semibold 2xl:text-2xl">Are you sure you want to revoke access?</h3>
</div>
<span>
<p className="text-base font-normal text-custom-text-400">
Any applications Using this developer key will no longer have the access to Plane Data. This
Action cannot be undone.
</p>
</span>
<div className="flex justify-end mt-2 gap-2">
<Button variant="neutral-primary" onClick={handleClose} disabled={deleteLoading}>
Cancel
</Button>
<Button variant="primary" onClick={handleDeletion} loading={deleteLoading} disabled={deleteLoading}>
{deleteLoading ? "Revoking..." : "Revoke"}
</Button>
</div>
</div>
</Dialog.Panel>
</Transition.Child>
</div>
</div>
</Dialog>
</Transition.Root>
);
};
export default DeleteTokenModal;
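A minimal wiring sketch for the modal, assuming it is opened from a revoke button on the token detail page; when tokenId is omitted the modal falls back to the tokenId route query as shown above. The button component and relative import path are illustrative:

import { useState } from "react";
import DeleteTokenModal from "./delete-token-modal";

// Hypothetical detail-page action: opens the confirmation modal for the given token.
export const RevokeTokenButton = ({ tokenId }: { tokenId: string }) => {
  const [isOpen, setIsOpen] = useState(false);

  return (
    <>
      <button type="button" onClick={() => setIsOpen(true)}>
        Revoke
      </button>
      <DeleteTokenModal isOpen={isOpen} handleClose={() => setIsOpen(false)} tokenId={tokenId} />
    </>
  );
};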

View File

@ -0,0 +1,36 @@
// react
import React from "react";
// next
import Image from "next/image";
import { useRouter } from "next/router";
// ui
import { Button } from "@plane/ui";
// assets
import emptyApiTokens from "public/empty-state/api-token.svg";
const ApiTokenEmptyState = () => {
const router = useRouter();
return (
<div className={`flex items-center justify-center mx-auto border bg-custom-background-90 py-10 px-16 w-full`}>
<div className="text-center flex flex-col items-center w-full">
<Image src={emptyApiTokens} className="w-52 sm:w-60" alt="empty" />
<h6 className="text-xl font-semibold mt-6 sm:mt-8 mb-3">No API Tokens</h6>
{
<p className="text-custom-text-300 mb-7 sm:mb-8">
Create API tokens for safe and easy data sharing with external apps, maintaining control and security
</p>
}
<Button
className="flex items-center gap-1.5"
onClick={() => {
router.push(`${router.asPath}/create/`);
}}
>
Add Token
</Button>
</div>
</div>
);
};
export default ApiTokenEmptyState;

View File

@ -42,13 +42,16 @@ const IssueLink = ({ activity }: { activity: IIssueActivity }) => {
return (
<Tooltip tooltipContent={activity.issue_detail ? activity.issue_detail.name : "This issue has been deleted"}>
<a
href={`/${workspaceSlug}/projects/${activity.project}/issues/${activity.issue}`}
target="_blank"
rel="noopener noreferrer"
aria-disabled={activity.issue === null}
href={`${
activity.issue_detail ? `/${workspaceSlug}/projects/${activity.project}/issues/${activity.issue}` : "#"
}`}
target={activity.issue === null ? "_self" : "_blank"}
rel={activity.issue === null ? "" : "noopener noreferrer"}
className="font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
>
{activity.issue_detail ? `${activity.project_detail.identifier}-${activity.issue_detail.sequence_id}` : "Issue"}
<RocketIcon size={12} color="#6b7280" />
<RocketIcon size={12} color="#6b7280" className="flex-shrink-0" />
</a>
</Tooltip>
);
@ -268,10 +271,10 @@ const activityDetails: {
href={`/${workspaceSlug}/projects/${activity.project}/cycles/${activity.new_identifier}`}
target="_blank"
rel="noopener noreferrer"
className="font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
className="w-full font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
>
{activity.new_value}
<RocketIcon size={12} color="#6b7280" />
<span className="truncate">{activity.new_value}</span>
<RocketIcon size={12} color="#6b7280" className="flex-shrink-0" />
</a>
</>
);
@ -283,10 +286,10 @@ const activityDetails: {
href={`/${workspaceSlug}/projects/${activity.project}/cycles/${activity.new_identifier}`}
target="_blank"
rel="noopener noreferrer"
className="font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
className="w-full font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
>
{activity.new_value}
<RocketIcon size={12} color="#6b7280" />
<span className="truncate">{activity.new_value}</span>
<RocketIcon size={12} color="#6b7280" className="flex-shrink-0" />
</a>
</>
);
@ -298,10 +301,10 @@ const activityDetails: {
href={`/${workspaceSlug}/projects/${activity.project}/cycles/${activity.old_identifier}`}
target="_blank"
rel="noopener noreferrer"
className="font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
className="w-full font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
>
{activity.old_value}
<RocketIcon size={12} color="#6b7280" />
<span className="truncate">{activity.old_value}</span>
<RocketIcon size={12} color="#6b7280" className="flex-shrink-0" />
</a>
</>
);
@ -479,10 +482,10 @@ const activityDetails: {
href={`/${workspaceSlug}/projects/${activity.project}/modules/${activity.new_identifier}`}
target="_blank"
rel="noopener noreferrer"
className="font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
className="w-full font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
>
{activity.new_value}
<RocketIcon size={12} color="#6b7280" />
<span className="truncate">{activity.new_value}</span>
<RocketIcon size={12} color="#6b7280" className="flex-shrink-0" />
</a>
</>
);
@ -494,10 +497,10 @@ const activityDetails: {
href={`/${workspaceSlug}/projects/${activity.project}/modules/${activity.new_identifier}`}
target="_blank"
rel="noopener noreferrer"
className="font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
className="w-full font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
>
{activity.new_value}
<RocketIcon size={12} color="#6b7280" />
<span className="truncate">{activity.new_value}</span>
<RocketIcon size={12} color="#6b7280" className="flex-shrink-0" />
</a>
</>
);
@ -509,10 +512,10 @@ const activityDetails: {
href={`/${workspaceSlug}/projects/${activity.project}/modules/${activity.old_identifier}`}
target="_blank"
rel="noopener noreferrer"
className="font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
className="w-full font-medium text-custom-text-100 inline-flex items-center gap-1 hover:underline"
>
{activity.old_value}
<RocketIcon size={12} color="#6b7280" />
<span className="truncate">{activity.old_value}</span>
<RocketIcon size={12} color="#6b7280" className="flex-shrink-0" />
</a>
</>
);

View File

@ -80,7 +80,7 @@ export const ActiveCycleDetails: React.FC<IActiveCycleDetails> = observer((props
workspaceSlug && projectId ? () => cycleStore.fetchCycles(workspaceSlug, projectId, "current") : null
);
const activeCycle = cycleStore.cycles?.[projectId]?.active || null;
const activeCycle = cycleStore.cycles?.[projectId]?.current || null;
const cycle = activeCycle ? activeCycle[0] : null;
const issues = (cycleStore?.active_cycle_issues as any) || null;

Some files were not shown because too many files have changed in this diff.