Compare commits


216 Commits

Author SHA1 Message Date
pablohashescobar
1e4bf9f997 dev: update takeoff script to take custom port values 2023-12-07 14:04:47 +05:30
Henit Chobisa
6c8df73ad4
[ FEATURE ] New Issue Widget for displaying issues inside document-editor (#2920)
* feat: added heading 3 in the editor summary markings

* feat: fixed editor and summary bar sizing

* feat: added `issue-embed` extension

* feat: exposed issue embed extension

* feat: added main embed config configuration to document editor body

* feat: added peek overview and issue embed fetch function

* feat: enabled slash commands to take additional suggestions from editors

* chore: replaced `IssueEmbedWidget` into widget extension

* chore: removed issue embed from previous places

* feat: added issue embed suggestion extension

* feat: added issue embed suggestion renderer

* feat: added issue embed suggestions into extensions module

* feat: added issues in issueEmbedConfiguration in document editor

* chore: package fixes

* chore: removed log statements

* feat: added title update logic to the document editor

* fix: issue suggestion items, not rendering issue widget on enter

* feat: added error card for issue widget

* feat: improved focus logic for issue search and navigate

* feat: appended transactionid for issueWidgetTransaction

* chore: packages update

* feat: disabled editing of title in readonly mode

* feat: added issueEmbedConfig in readonly editor

* fix: issue suggestions not loading after structure changed to object

* feat: added toast messages for success/error messages from doc editor

* fix: issue suggestions sorting issue

* fix: formatting errors resolved

* fix: infinite reloading of the readonly document editor

* fix: css in avatar of issue widget card

* feat: added show alert on pages reload

* feat: added saving state for the pages editor

* fix: issue with heading 3 in side bar view

* style: updated issue suggestions dropdown ui

* fix: Pages initialization and mutation with updated MobX store

* fixed image uploads being cancelled on refocus due to swr

* fix: issue with same description rerendering empty content fixed

* fix: scroll in issue suggestion view

* fix: added submission prop

* fix: Updated the comment update to take issue id in inbox issues

* feat: changed date representation in IssueEmbedCard

* fix: page details mutation with optimistic updates using swr

* fix: menu options in read only editor with auth fixed

* fix: add error handling for title and page desc

* fixed yarn.lock

* fix: read-only editor title wrapping

* fix: build error with rich text editor

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
Co-authored-by: Palanikannan1437 <73993394+Palanikannan1437@users.noreply.github.com>
2023-12-07 12:04:21 +05:30
Lakhan Baheti
074e35525c
fix: spreadsheet layout bugs (#3016)
* fix: date picker z-index visibility

* fix: typo in empty issue screen

* fix: spreadsheet column rightmost border
2023-12-07 11:59:46 +05:30
Aaryan Khandelwal
8ea42aa0b6
fix: removed all unused variables and added dependencies to useEffect and useCallback (#3013) 2023-12-06 20:31:42 +05:30
sriram veeraghanta
1df067ff61
fix: upgrading types react package (#3014) 2023-12-06 20:07:39 +05:30
guru_sainath
a79dbdadb5
fix: issue layouts bugs and ui fixes (#3012)
* fix: initial issue creation issue in the list layout

* fix kanban drag n drop and updating properties

* reduce z index of spreadsheet bottom row to not overlap with other elements

* fix state update by using state id instead of state detail's id

* fix add default use state for description

* add create issue button for project views to be at par with production

* save draft issues from modal

* chore: added save view button in all layouts applied filters

* use useEffect instead of swr for fetching issue details for peek overview

* fix: resolved kanban dnd

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-12-06 19:58:47 +05:30
Bavisetti Narayan
bffba6b9dc
chore: Page auth and other improvements (#3011)
* chore: project query optimised

* chore: page permissions changed
2023-12-06 19:30:40 +05:30
Aaryan Khandelwal
bd4b5040b2
chore: remove unused fields from the god mode (#3007) 2023-12-06 19:29:45 +05:30
Lakhan Baheti
0d762d74dc
fix: kanban board block's menu & drop delete. (#2987)
* fix: kanban board block menu click

* fix: menu active/disable

* fix: drag n drop delete modal

* fix: quick action button in all the layouts

* chore: toast for drag & drop api
2023-12-06 19:21:24 +05:30
Henit Chobisa
567c3fadc0
feat: added custom blockquote extension for resolving enter key behaviour (#2997) 2023-12-06 19:17:47 +05:30
Lakhan Baheti
cb0af255f9
fix: custom analytic grouped bar tooltip value as ID (#3003)
* fix: tooltip value is coming as ID

* fix lint named module
2023-12-06 19:15:07 +05:30
Aaryan Khandelwal
d204cc7d6c
chore: added authorization to pages (#3006)
* chore: updated pages authorization

* chore: updated pages empty state image
2023-12-06 19:13:42 +05:30
Anmol Singh Bhatia
5317de5919
chore: plane logo without text updated (#3008) 2023-12-06 19:11:34 +05:30
Anmol Singh Bhatia
b499e2c5a5
fix: bug fixes (#3010)
* fix: project view modal auto close bug fix

* fix: issue peek overview label select permission validation added
2023-12-06 19:10:53 +05:30
Nikhil
956d7acfa6
dev: user password reset management command (#3000) 2023-12-06 18:29:53 +05:30
Nikhil
4040441ab1
dev: remove unused packages (#3009)
* dev: remove unused packages

* dev: remove gunicorn config
2023-12-06 18:28:34 +05:30
Bavisetti Narayan
dc5c0fe5bf
chore: posthog event for workspace invite (#2989)
* chore: posthog event for workspace invite

* chore: updated event names, added all the existing events to workspace metrics group

* chore: separated workspace invite

* fix: workspace invite accept event updated

---------

Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>
2023-12-06 17:15:14 +05:30
Jorge
675e747fed
Add CodeQL workflow (#1452) 2023-12-06 17:07:55 +05:30
Manish Gupta
c9517610da
modified docker image repo names (#3004) 2023-12-06 16:47:14 +05:30
Anmol Singh Bhatia
0e1ee80512
chore: updated plane deploy sign-in workflows for cloud and self-hosted instances (#2999)
* chore: deploy onboarding workflow

* chore: sign in workflow improvement

* fix: build error
2023-12-06 16:42:57 +05:30
guru_sainath
0fc273aca6
clean-up: removed labels in the filters and handled redirection issue from peek overview and ui changes (#3002) 2023-12-06 16:27:33 +05:30
Lakhan Baheti
d6b23fe380
fix: bugs & improvements (#2998)
* fix: create more toggle in update issue modal

* fix: spreadsheet estimate column hide

* fix: flickering in all the layouts

* fix: logs
2023-12-06 14:29:07 +05:30
Aaryan Khandelwal
11987994a1
chore: updated sign-in workflows for cloud and self-hosted instances (#2994)
* chore: update onboarding workflow

* dev: update user count tasks

* fix: forgot password endpoint

* dev: instance and onboarding updates

* chore: update sign-in workflow for cloud and self-hosted instances (#2993)

* chore: updated auth services

* chore: new signin workflow updated

* chore: updated content

* chore: instance admin setup

* dev: update instance verification task

* dev: run the instance verification task every 4 hours

* dev: update migrations

* chore: update latest features image

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2023-12-06 14:22:59 +05:30
Anmol Singh Bhatia
b957e4e935
fix: bug fixes and improvement (#2992)
* chore: issue sidebar permission bug fix and not authorized page redirection added

* chore: unauthorized project setting page improvement

* fix: build error fix
2023-12-06 14:22:06 +05:30
M. Palanikannan
e7ac7e1da8
fix: Image Resizing and PR (#2996)
* added image min width and height programmatically

* fixed editor initialization for peek view and inbox issues

* fixed ts issues with issue id in inbox
2023-12-06 12:16:34 +05:30
rahulramesha
ac18906604
fix: bugs related to issues (#2995)
* hide properties in list and kanban with 0 or nil values

* module and cycle mutation from peek overlay

* fix peek over view title change while switching

* fix create issue fetching

* fix build errors by mutating the values as well
2023-12-06 01:02:18 +05:30
Anmol Singh Bhatia
aec50e2c48
[FED-1147] chore: module link mobx integration (#2990)
* chore: link type updated

* chore: mobx implementation for module link

* chore: update module mutation logic updated and toast alert added
2023-12-05 19:32:25 +05:30
guru_sainath
8dee7e51ca
chore: workspace profile issues, kanban DND upgrade, implemented filters in plane deploy (#2991) 2023-12-05 17:26:57 +05:30
Anmol Singh Bhatia
8b2d78ef92
chore: space ui component revamp and bug fixes (#2980)
* chore: replace space ui component with plane ui component

* fix: space project icon and user pic bug

* chore: code refactor

* fix: profile section navbar fix
2023-12-05 16:07:25 +05:30
Lakhan Baheti
9649f42ff3
chore: email invite accept validation (#2965)
* fix: empty state flickering on accepting only invitation

* fix: redirection from workspace-invitation to onboarding

* chore: onboarding step 1 skip on accepting invite from email

* fix: dashboard redirection path
2023-12-05 16:06:43 +05:30
sabith-tu
db1166411a
style: image picker, spreadsheet view title, icons (#2988)
* style: image picker, spreadsheet view title, icons

* fix: build error fix
2023-12-05 16:05:50 +05:30
Lakhan Baheti
effad33f67
chore: issue update status in peekview & detail (#2985) 2023-12-05 14:40:29 +05:30
Aaryan Khandelwal
fd73f18d95
fix: leave project mutation (#2976) 2023-12-05 13:44:01 +05:30
Nikhil
32bfa4d7cf
fix: sentry dsn error (#2981) 2023-12-05 13:42:16 +05:30
Anmol Singh Bhatia
7b12d54d83
chore: draft issue layout and permission validation (#2982)
* chore: create draft issue option added in draft issue layout and permission validation added

* chore: create draft issue option added in draft issue list layout and permission validation added
2023-12-05 13:41:31 +05:30
Anmol Singh Bhatia
2cf847e8a7
chore: module and cycle sidebar date mutation fix (#2986) 2023-12-05 13:40:29 +05:30
Bavisetti Narayan
e2524799d2
chore: html validation (#2970)
* chore: changed api serializers

* chore: state status code

* chore: removed sorted keys
2023-12-05 13:39:09 +05:30
rahulramesha
8a8eea38f9
fix: Permission levels for project settings (#2978)
* fix add subgroup issue FED-1101

* fix subgroup by None assignee FED-1100

* fix grouping by assignee or labels FED-1096

* fix create view popup FED-1093

* fix subgroup exception in swimlanes

* fix show sub issue filter FED-1102

* use Enums instead of numbers

* fix Estimates setting permission for admin

* disable access to project settings for viewers and guests

* fix project unauthorized flicker

* add observer to estimates

* add permissions to member list
2023-12-04 20:03:23 +05:30
rahulramesha
b527ec582f
fix: mutation on transfer issues from cycle (#2979)
* fix cycle issues mutation on transferring issues

* fix transfer issues from cycle
2023-12-04 20:02:14 +05:30
Aaryan Khandelwal
fa990ed444
refactor: custom hook for sign in redirection (#2969) 2023-12-04 15:04:04 +05:30
Nikhil
79fa860b28
chore: instance (#2955) 2023-12-04 14:48:40 +05:30
rahulramesha
59110c7205
fix: sub display filter params for fetching issues (#2972)
* fix add subgroup issue FED-1101

* fix subgroup by None assignee FED-1100

* fix grouping by assignee or labels FED-1096

* fix create view popup FED-1093

* fix subgroup exception in swimlanes

* fix show sub issue filter FED-1102
2023-12-02 18:23:07 +05:30
Aaryan Khandelwal
9756abece4
chore: update instance admin sign in endpoint (#2973) 2023-12-02 18:19:59 +05:30
guru_sainath
f4aa059da6
fix: global issue property update issue resolved (#2974) 2023-12-02 18:19:29 +05:30
rahulramesha
d22c258c00
fix: V3 release blocker bugs (#2968)
* fix add subgroup issue FED-1101

* fix subgroup by None assignee FED-1100

* fix grouping by assignee or labels FED-1096

* fix create view popup FED-1093

* fix subgroup exception in swimlanes
2023-12-01 18:51:52 +05:30
sabith-tu
65253c6383
style: empty state for analytics, views and pages (#2967) 2023-12-01 18:19:17 +05:30
guru_sainath
8c462f96ee
fix: corrected rendering of workspace-level labels, members, and states in project view (#2966)
* fix: dynamic issue properties filters in project, workspace and profile level

* clean-up: removed logs from the project store
2023-12-01 17:35:33 +05:30
Aaryan Khandelwal
332e56bb0d
style: updated the UI of the instance admin setup and the sign in workflow (#2962)
* style: updated the UI of the signin and instance setups

* fix: form validations and mutations

* fix: updated Link tags in accordance to next v14

* chore: latest features image, reset password redirection
2023-12-01 15:50:01 +05:30
Lakhan Baheti
eca96da837
fix: create workspace form validation (#2958)
* fix: create workspace form validation

* fix: textfield placeholder typo

* fix: name field onchange
2023-12-01 15:18:48 +05:30
guru_sainath
5446447231
chore: workspace global issues (#2964)
* dev: global issues store

* build-error: all issues render

* build-error: build error resolved in global view store
2023-12-01 15:16:36 +05:30
Anmol Singh Bhatia
dec1fb5a63
chore: profile issue display filters content updated (#2963) 2023-12-01 15:14:18 +05:30
Anmol Singh Bhatia
2b63827caa
chore: create issue modal improvement (#2960) 2023-12-01 15:12:25 +05:30
Manish Gupta
5993fc38f0
branch build fixes (#2961) 2023-12-01 13:40:16 +05:30
Anmol Singh Bhatia
ab37ce979a
fix: view modal overlapping (#2956) 2023-12-01 13:27:16 +05:30
sriram veeraghanta
c6764111a3
fix: upgrading to nextjs 14 (#2959) 2023-12-01 13:25:48 +05:30
Manish Gupta
4a2f8d93b1
fix: branch build 2 (#2957)
* branch build fix

* removed quotes
2023-12-01 13:23:50 +05:30
Manish Gupta
5db9b2b638
branch build fix (#2954) 2023-12-01 12:46:18 +05:30
Bavisetti Narayan
e2b704ea6f
chore: removed django settings module (#2953) 2023-12-01 12:21:17 +05:30
Nikhil
452c9f0d5b
dev: update email templates (#2948)
* dev: update magic link email

* dev: forgot password mail

* dev: workspace invitation update

* dev: update email templates and task

* dev: remove email verification template

* dev: change all conversation links to issues
2023-11-30 13:30:13 +05:30
rahulramesha
139c0857eb
fix: quick add positioning (#2949)
* fix quick add positioning for kanban and spreadsheet

* fix kanban quick add project identifier
2023-11-30 12:22:43 +05:30
Nikhil
2556d052de
chore: status code changed (#2947) 2023-11-29 21:46:54 +05:30
Nikhil
c65915665b
dev: transactional emails (#2946) 2023-11-29 21:46:18 +05:30
Nikhil
6ece5a57e2
dev: instance registration (#2912)
* dev: remove auto script for registration

* dev: make all instance admins owners when adding an instance admin

* dev: remove sign out endpoint

* dev: update takeoff script to register the instance

* dev: reapply instance model

* dev: check none for instance configuration encryptions

* dev: encrypting secrets configuration

* dev: user workflow for registration in instances

* dev: add email automation configuration

* dev: remove unused imports

* dev: realign migrations

* dev: reconfigure license engine registrations

* dev: move email check to background worker

* dev: add sign up

* chore: signup error message

* dev: updated onboarding workflows and instance setting

* dev: updated template for magic login

* chore: page migration changed

* dev: updated migrations and authentication for license and update template for workspace invite

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-29 20:33:37 +05:30
Anmol Singh Bhatia
a779fa1497
dev: instance setup workflow (#2935)
* chore: instance type updated

* chore: instance not ready screen added

* chore: instance layout added

* chore: instance magic sign in endpoint and type added

* chore: instance admin password endpoint added

* chore: instance setup page added

* chore: instance setup form added

* chore: instance layout updated

* fix: instance admin workflow setup

* fix: admin workflow setup

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-29 20:33:08 +05:30
sriram veeraghanta
5eb7740833
fix: removed unused packages and upgraded to next 14 (#2944)
* fix: upgrading next package and removed unused deps

* chore: unused variable removed

* chore: next image icon fix

* chore: unused component removed

* chore: next image icon fix

* chore: replace use-debounce with lodash debounce

* chore: unused component removed

* resolved: fixed issue with next link component

* fix: updates in next config

* fix: updating types pages

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
2023-11-29 20:32:10 +05:30
guru_sainath
86408fcd46
chore: replaced v3 issues endpoints (#2945)
* chore: removed v3 endpoints

* chore: replace v3 issues to normal issues endpoints

* build-error: build error in new issue structure

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-29 20:26:43 +05:30
Anmol Singh Bhatia
86de3188b1
chore: cycle and module status indicator improvement (#2942) 2023-11-29 20:02:41 +05:30
Anmol Singh Bhatia
c5996382bc
style: empty state improvement (#2943) 2023-11-29 20:02:13 +05:30
rahulramesha
2796754e5f
fix: v3 issues for the layouts (#2941)
* fix drag n drop exception error

* fix peek overlay close buttons

* fix project empty state view

* fix cycle and module empty state view

* add ai options to inbox issue creation

* fix inbox filters for viewers

* fix inbox filters for viewers for project

* disable editing permission for members and viewers

* define accurate types for drag and drop
2023-11-29 19:58:27 +05:30
Lakhan Baheti
3488dc3014
style: deactivate acount modal (#2940) 2023-11-29 19:12:12 +05:30
Bavisetti Narayan
c7a3d8f0a0
chore: v3 global issues (#2938) 2023-11-29 19:08:14 +05:30
Aaryan Khandelwal
6d97dd814a
chore: updated sign in workflow (#2939)
* chore: new sign in workflow

* chore: request new code button added

* chore: create new password form added

* fix: build errors

* chore: remove unused components

* chore: update submitting state texts

* fix: oauth sign in process
2023-11-29 19:07:33 +05:30
Lakhan Baheti
804c03bd15
chore: redirection to profile after workpace delete/leave (#2937) 2023-11-29 16:52:54 +05:30
Manish Gupta
59981e3e68
fix: Branch Build and Self hosting fixes (#2930)
* Branch build yml modified to create preview and latest tags

* self host install modified to handle public image only

* testing update-docker

* testing

* wip

* rolled back to original

* selfhosting readme updated
2023-11-29 14:58:31 +05:30
Aaryan Khandelwal
1be1b9f4a3
chore: issue peek overview (#2918)
* chore: autorun for the issue detail store

* fix: labels mutation

* chore: remove old peek overview code

* chore: move add to cycle and module logic to store

* fix: build errors

* chore: add peekProjectId query param for the peek overview

* chore: update profile layout

* fix: multiple workspaces

* style: Issue activity and link design improvements in Peek overview.
* fix issue with labels not occupying full width.
* fix links overflow issue.
* add tooltip in links to display entire link.
* add functionality to copy links to clipboard.

* chore: peek overview for all the layouts

* fix: build errors

---------

Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-29 14:25:57 +05:30
Lakhan Baheti
7cd02401c1
fix: spreadsheet layout issue properties (#2936)
* fix: spreadsheet layout state column state name & tooltip

* fix: label select dropdown first item auto active state

* fix: priority column padding & tooltip position
2023-11-29 14:20:58 +05:30
Bavisetti Narayan
6d175c0e56
chore: api rate limiting (#2933) 2023-11-29 14:14:26 +05:30
Bavisetti Narayan
6a4f521b47
chore: api webhooks validation (#2928)
* chore: api webhooks update

* chore: webhooks signature validation
2023-11-29 14:13:03 +05:30
rahulramesha
cdc4fd27a5
fix issue sorting and add sorting to other properties (#2931) 2023-11-29 14:02:39 +05:30
Aaryan Khandelwal
f4af3db7b6
chore: update the content of webhooks form (#2932) 2023-11-29 14:02:13 +05:30
Aaryan Khandelwal
4f64f431a7
chore: update profile settings layout (#2925)
* chore: update profile layout

* fix: multiple workspaces

* chore: removed breadcrumbs

* chore: fix sidebar collapsed state
2023-11-29 13:48:07 +05:30
Anmol Singh Bhatia
537901f046
style: workspace sidebar scroll fix and improvement (#2934) 2023-11-29 13:32:35 +05:30
Lakhan Baheti
404a61aee5
fix: google auth button content alignment (#2929) 2023-11-29 13:29:38 +05:30
Lakhan Baheti
1a2b1e648d
style: switch or delete account modal (#2926)
* style: switch or delete account modal

* fix: popover text color

* fix: typo
2023-11-29 13:28:24 +05:30
guru_sainath
3400c119bc
fix: drag and drop implementation in calendar layout and kanban layout (#2921)
* fix profile issue filters and kanban

* chore: calendar drag and drop

* chore: kanban drag and drop

* dev: remove issue from the kanban layout and resolved build errors

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-28 19:17:38 +05:30
sabith-tu
d5853405ca
style: new empty state ui (#2923) 2023-11-28 18:47:52 +05:30
rahulramesha
db510dcfcd
fix: all functionalities for profile, archived and draft issues (#2922)
* fix profile issue filters and kanban

* fix profile draft and archived issues
2023-11-28 18:15:46 +05:30
Lakhan Baheti
62c0615012
style: member role visibility (#2919)
* style: member role visibility

* fix: build errors
2023-11-28 17:11:04 +05:30
Bavisetti Narayan
3914a75334
chore: deactivated user workflow change (#2888)
* chore: deactivated user workflow change

* chore: removed archived and draft from v3 by default

* chore: draft and archive update

* chore: bool field

* chore: fall back workspace updated

* chore: workspace member active
2023-11-28 17:08:05 +05:30
Aaryan Khandelwal
0cbb201348
fix: workspace settings pages authorization (#2915)
* fix: workspace settings pages authorization

* chore: user cannot add a member with a higher role than theirs

* chore: update workspace general settings auth
2023-11-28 17:05:42 +05:30
rahulramesha
f7264364bd
add functionality for adding existing issues to modules and cycles (#2913)
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-28 14:50:37 +05:30
Manish Gupta
35f8ffa5ab
migration script added (#2914) 2023-11-28 13:29:08 +05:30
M. Palanikannan
6607caade7
fix: Image restoration fixed (marks/unmarks an image to be deleted after a week) (#2859)
* image restoration fixed (marks an image to be deleted after a week)

* removed console logs

* added image constraints

* formatted editor-core package using yarn format

* lite-text-editor nothing to format

* rich-text-editor nothing to format

* formatted document-editor with prettier

* modified file service to follow api change

* fixed more formatting in document editor

* fixed all instances of types with that from the package

* fixed delete to work consistently (minor optimizations turned off)

* stop duplicate images inside editor

* restore image on editor creation

Say user A deletes image number 2 while user B has the same issue open with the image still on their screen. If user B then makes changes that get saved to the backend, image 2 should still exist from user B's perspective, but since user A deleted it, it would never get restored and would be deleted in 7 days. Hence I've added a check such that whenever an issue loads, we restore all of its images by default.

* added restore image function with types

* replaced all instances to have restore image logic

* fixed issue detail for peek view

* disabled option to insert table inside a table
2023-11-28 11:34:20 +05:30
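The restore-on-load behaviour described in the commit above (re-restoring every image a document still references whenever an editor instance loads, so one user's delete cannot silently expire an asset another user's saved copy still uses) can be sketched roughly as below. This is an illustrative model, not Plane's actual code: `DocNode`, `extractImageSources`, and the injected `restoreImage` callback are hypothetical stand-ins for the editor's real document type and file service.

```typescript
// Minimal model of a saved editor document: nested nodes, some of type "image".
type DocNode = { type: string; attrs?: { src?: string }; content?: DocNode[] };

// Walk the document tree and collect every image src it still references.
function extractImageSources(node: DocNode, out: string[] = []): string[] {
  if (node.type === "image" && node.attrs?.src) out.push(node.attrs.src);
  for (const child of node.content ?? []) extractImageSources(child, out);
  return out;
}

// On editor creation, unmark every referenced image so the 7-day deletion
// sweep skips assets the currently loaded document still uses.
async function restoreImagesOnLoad(
  doc: DocNode,
  restoreImage: (src: string) => Promise<void>, // stand-in for the file service call
): Promise<number> {
  const sources = extractImageSources(doc);
  await Promise.all(sources.map((src) => restoreImage(src)));
  return sources.length;
}
```

Restoring unconditionally on every load trades a few redundant API calls for correctness under concurrent edits, which matches the reasoning in the commit message.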
Manish Gupta
8ee8270697
removed container names for selfhosting (#2907) 2023-11-28 11:31:04 +05:30
Aaryan Khandelwal
41ab962dd7
chore: update get invitation details endpoint (#2902) 2023-11-28 11:30:03 +05:30
Prateek Shourya
c22c6bb9b2
Fix: bug fixes and UI / UX improvements (#2906)
* Fix: issue with project publish modal data not updating immediately.

* fix: issue with workspace list not scrollable in profile settings.

* fix: update redirect workspace slug logic to redirect to prev workspace instead of `/`.

* style: update API tokens and webhooks empty state designs.
2023-11-28 11:29:01 +05:30
sriram veeraghanta
67de6d0729
fix: adding ai assistance to pages (#2905)
* fix: adding ai modal to pages

* fix: pages overflow

* chore: update pages UI

* fix: updating page description while using ai assistance

* fix: gpt assistant modal height and position

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-27 20:39:18 +05:30
M. Palanikannan
f361cd045e
image can't be inserted inside table (#2904)
* image can't be inserted inside table

Now we've disabled the image icon from showing up if the cursor is inside a table node or if a table cell is selected

* added drag drop support for document editor

* fixed missing dependencies
2023-11-27 20:37:40 +05:30
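The table guard described in the commit above (hide the image option whenever the cursor sits anywhere inside a table) boils down to an ancestor check on the cursor position. The sketch below models that check with a plain list of ancestor node types; it is a deliberate simplification of what a ProseMirror/TipTap editor would do with resolved positions, and the names are illustrative rather than Plane's real identifiers.

```typescript
// Simplified stand-in for a resolved cursor position: the chain of node
// types from the document root down to the node containing the cursor.
type AncestorChain = string[];

// Node types that mean "the cursor is inside a table".
const TABLE_NODES = new Set(["table", "tableRow", "tableCell", "tableHeader"]);

function isCursorInsideTable(ancestors: AncestorChain): boolean {
  return ancestors.some((type) => TABLE_NODES.has(type));
}

// The toolbar shows the image button only outside tables, mirroring the
// behaviour the commit describes.
function shouldShowImageButton(ancestors: AncestorChain): boolean {
  return !isCursorInsideTable(ancestors);
}
```
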
Prateek Shourya
06d3cd7e73
refactor: Instance admin setting and UI updates. (#2889)
* refactor: shift instance admin restriction content to separate component.
fix: instance components export logic.

* style: fix sidebar dropdown `God Mode` icon padding.

* style: update profile settings user dropdown menu width.

* fix: update input type to `password` for Client Secret and API/ Access Key fields.

* style: update loader design for all forms.

* fix: typo

* style: ui updates.

* chore: add show/ hide button for all password fields.
2023-11-27 19:41:47 +05:30
Aaryan Khandelwal
10c52bf89b
refactor: keyboard shortcuts modal (#2822)
* refactor: keyboard shortcuts modal

* chore: updated search logic

* refactor: divided the modal component into granular components
2023-11-27 18:41:59 +05:30
Anmol Singh Bhatia
b717518fbe
fix: workspace dropdown scroll (#2900) 2023-11-27 18:25:40 +05:30
Aaryan Khandelwal
f48cd6f50c
fix: profile layout flicker (#2898)
* fix: user profile layout flicker

* chore: update import statements
2023-11-27 18:25:16 +05:30
Lakhan Baheti
3203ae6549
fix: progress panel default open (#2894) 2023-11-27 17:21:14 +05:30
Lakhan Baheti
b8f603f920
chore: signup removed (#2890) 2023-11-27 17:17:55 +05:30
Anmol Singh Bhatia
96ff76af94
[FED-1018] chore: workspace dropdown improvement (#2891)
* fix: workspace dropdown improvement

* style: sidebar workspace icon alignment
2023-11-27 17:17:09 +05:30
Anmol Singh Bhatia
830675741f
[FED-1054] fix: join project mutation (#2892)
* fix: join project mutation

* chore: code refactor
2023-11-27 17:16:46 +05:30
srinivas pendem
e489ad50dc
Update README.md (#2893)
./setup.sh removed

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-27 17:16:13 +05:30
Aaryan Khandelwal
3dc18bc8fd
refactor: webhooks (#2896)
* refactor: webhooks workflow

* chore: update delete modal content
2023-11-27 17:15:48 +05:30
Anmol Singh Bhatia
2d04917951
chore: instance admins endpoint added and ui/ux improvement (#2895)
* style: sidebar improvement

* style: header height consistency

* chore: layout consistency and general page improvement

* chore: layout, email form and image form improvement

* chore: instance admins endpoint integrated and code refactor

* chore: code refactor

* chore: google client secret section removed
2023-11-27 17:15:11 +05:30
guru_sainath
2bf7e63625
issue rendering in all issue layouts for profile and project issues, and global issues store implementation (#2886)
* dev: draft and archived issue store

* connect draft and archived issues

* kanban for draft issues

* fix filter store for calendar and kanban

* dev: profile issues store and draft issues filters in header

* disable issue creation for draft issues

* dev: profile issues store filters

* disable kanban properties in draft issues

* dev: profile issues store filters

* dev: separated adding issues to the cycle and module as separate methods in cycle and module store

* dev: workspace profile issues store

* dev: sub group issues in the swimlanes

* profile issues and create issue connection

* fix profile issues

* fix spreadsheet issues

* fix disappearing project from create issue modal

* page level modifications

* fix additional bugs

* dev: profile and global issues and filters update

* fix issue related bugs

* fix project views for list and kanban

* fix build errors

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-27 14:15:33 +05:30
Anmol Singh Bhatia
eb78fd6088
fix: resolve modal overlapping issue (#2885) 2023-11-27 12:16:59 +05:30
Lakhan Baheti
202ecd21df
fix: bug fixes & UI improvements (#2884)
* chore: access restriction for api tokens

* fix: on create module total issues undefined

* fix: cycle board card typo

* chore: fetch modules after creation

* fix: peek module on delete

* fix: peek cycle on delete

* fix: cycle detail sidebar copy link toast

* chore: router replace -> push
2023-11-27 12:15:10 +05:30
Aaryan Khandelwal
b2ac7b9ac6
chore: revamp the API tokens workflow (#2880)
* chore: added getLayout method to api tokens pages

* revamp: api tokens workflow

* chore: add title validation and update types

* chore: minor UI updates

* chore: update route
2023-11-27 12:14:06 +05:30
Lakhan Baheti
51dff31926
fix: user state after logout (#2849)
* fix: user state after logout

* chore: user state handle with mobx

* chore: signout update for profile setting

* fix: minor fixes

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-25 23:04:56 +05:30
sriram veeraghanta
e89f152779
fix: remove slack notification on build branch workflow (#2881) 2023-11-25 22:43:27 +05:30
Lakhan Baheti
3c9f57f8f4
fix: workspace & user avatar tooltip (#2851)
* fix: workspace & user avatar tooltip

* chore: user name update while typing on top right avatar

* chore: imports placement

* fix: rendering condition

* chore: component re-arrangement

* fix: imports
2023-11-25 21:31:09 +05:30
Bavisetti Narayan
1bc859c68c
chore: separated delete endpoint for file upload (#2870) 2023-11-25 21:28:03 +05:30
Ramesh Kumar Chandra
11d57a5bf0
fix: track events updated, extra parameters added, added events for issues, pages, states, cycles (#2875)
* fix: event tracking method updated to store, chore: updated and added events for workspace, projects and create issue

* fix: posthog auth event tracking

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-25 21:26:26 +05:30
Prateek Shourya
2980c7b00d
Feat: God Mode UI Updates and More Config Settings (#2877)
* feat: Images in Plane config screen.
* feat: Enable/ Disable Magic Login config toggle.
* style: UX copy and design updates across all screens.
* style: SSO and OAuth Screen revamp.
* style: Enter God Mode button for Profile Settings sidebar.
* fix: update input type to password for password fields.
2023-11-25 21:23:50 +05:30
Anmol Singh Bhatia
5c6a59ba35
dev: badge component added in plane ui package (#2876) 2023-11-25 21:21:03 +05:30
Anmol Singh Bhatia
a3ea7c8f10
fix: issue peek overview state select dropdown overflow fix (#2873) 2023-11-25 21:18:54 +05:30
Anmol Singh Bhatia
cb922fb113
fix: module sidebar date select fix and code refactor (#2872) 2023-11-25 21:18:16 +05:30
sriram veeraghanta
06564ee856
fix: remove slack notify (#2871)
* fix: remove slack notifications on workflows

* fix: bugfix
2023-11-24 14:31:44 +05:30
Nikhil
c7e6118804
refactor: image upload modals, file size limit added to config (#2868)
* chore: add file size limit as config in the config api

* refactor: image upload modals

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-24 13:23:46 +05:30
Manish Gupta
069b8b3ed9
Updated the slack notification message to PR Title (#2869) 2023-11-24 13:11:21 +05:30
Lakhan Baheti
38a5b7bec0
chore: added error toast for invitation (#2853) 2023-11-24 12:47:02 +05:30
Bavisetti Narayan
236caaafe8
chore: user deactivation and login restriction (#2855)
* chore: user deactivation

* chore: deactivation and login disabled

* chore: added get configuration value

* chore: serializer message change

* chore: instance admin password change

* chore: removed triage

* chore: v3 endpoint for user profile

* chore: added enable signin

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-24 12:22:24 +05:30
Nikhil
a6d5eab634
chore: api and webhook refactor (#2861)
* chore: bug fix

* dev: changes in api endpoints for invitations and inbox

* chore: improvements

* dev: update webhook send

* dev: webhook validation and fix webhook flow for app

* dev: error messages for deactivation

* chore: api fixes

* dev: update webhook and workspace leave

* chore: issue comment

* dev: default values for environment variables

* dev: make the user active if they were already a project member

* chore: webhook cycle and module event

* dev: disable ssl for emails

* dev: webhooks restructuring

* dev: updated webhook configuration

* dev: webhooks

* dev: state get object

* dev: update workspace slug validation

* dev: remove deactivation flag if max retries exceeded

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-24 12:19:26 +05:30
sriram veeraghanta
8d76c96a6f
fix: adding slack notification when build is failed to upload to docker (#2862)
* fix: removing logs

* fix: adding slack notification when build is failed to upload to docker

* minor changes

---------

Co-authored-by: Manish Gupta <59428681+manishg3@users.noreply.github.com>
2023-11-24 12:17:31 +05:30
Aaryan Khandelwal
97be4b60ae
chore: update profile and God mode routes (#2860)
* chore: update profile and god mode routes

* fix: profile activity loader

* chore: update profile route in the change password page
2023-11-24 12:16:37 +05:30
Bavisetti Narayan
dece103873
chore: user activity in profile page (#2856)
* chore: user activity endpoint change

* chore: added workspace detail in activity serializer
2023-11-23 21:00:49 +05:30
Anmol Singh Bhatia
c6125876be
fix: view date filter select fix (#2858) 2023-11-23 20:40:41 +05:30
Anmol Singh Bhatia
1f85bf2302
style: module ui improvement (#2838) 2023-11-23 20:39:58 +05:30
Anmol Singh Bhatia
20baba3bb0
style: issue activity section improvement (#2836) 2023-11-23 20:39:18 +05:30
Ramesh Kumar Chandra
85907b32d1
feat: change password page (#2847) 2023-11-23 20:38:50 +05:30
sabith-tu
ef2bef83dc
style: removing extra options heading and drop down icon (#2852) 2023-11-23 20:38:05 +05:30
Aaryan Khandelwal
6e7a96394a
fix: page scroll area (#2850) 2023-11-23 18:22:25 +05:30
Aaryan Khandelwal
5726f6955c
dev: added tailwind merge helper function (#2844) 2023-11-23 17:21:47 +05:30
Aaryan Khandelwal
82665a35ee
fix: archived issues infinite call (#2848) 2023-11-23 16:58:08 +05:30
sriram veeraghanta
4efd225599
fix: updated document editor package in web and space apps (#2846) 2023-11-23 15:27:20 +05:30
sriram veeraghanta
2481706581
chore: optimizations and file name changes (#2845)
* fix: deepsource antipatterns

* fix: deepsource exclude file patterns

* chore: file name changes and removed unwanted variables

* fix: changing version number for editor
2023-11-23 15:09:46 +05:30
guru_sainath
a17b08dd15
chore: implemented new store and issue layouts for issues and updated new data structure for issues (#2843)
* fix: Implemented new workflow in the issue store and updated the quick add workflow in list layout

* fix: initial load and mutation of issues in list layout

* dev: implemented the new project issues store with grouped, subGrouped and unGrouped issue computed functions

* dev: default display properties data made as a function

* conflict: merge conflict resolved

* dev: implemented quick add logic in kanban

* chore: implemented quick add logic in calendar and spreadsheet layout

* fix: spreadsheet layout quick add fix

* dev: optimised the issues workflow and handled the issues order_by filter

* dev: project issue CRUD operations in new issue store architecture

* dev: issues filtering in calendar layout

* fix: build error

* dev/issue_filters_store

* chore: updated filters computed structure

* conflict: merge conflicts resolved in project issues

* dev: implemented gantt chart for project issues using the new mobx store

* dev: initialized cycle and module issue filters store

* dev: issue store and list layout store updates

* dev: quick add and update, delete issue in the list

* refactor list root changes

* dev: store new structure

* refactor spreadsheet and gantt project roots

* fix errors for base gantt and spreadsheet roots

* connect Calendar project view

* minor housekeeping

* connect Kanban View to the new store

* generalise base calendar issue actions

* dev: store project issues and issue filters

* dev: store project issues and filters

* dev: updated undefined with displayFilters in project issue store

* Add Quick add to all the layouts

* connect module views to store

* dev: Rendering list issues in project issues

* dev: removed console log

* dev: module filters store

* fix errors and connect modules list and quick add for list

* dev: module issue store

* dev: module filter store issue fixed and updated cycle issue filters

* minor housekeeping changes

* dev: cycle issues and cycle filters

* connect cycles to the store

* dev: project view issues and issue filters

* connect project views

* dev: updated applied filters in layouts

* dev: replaced project id with view id in project views

* dev: in cycle and module store made cycleId and moduleId optional

* fix minor issues and build errors

* dev: project draft and archived issues store and filters

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-23 14:47:04 +05:30
Aaryan Khandelwal
a7d6b528bd
chore: deactivate user option added (#2841)
* dev: deactivate user option added

* chore: new layout for profile settings

* fix: build errors

* fix: user profile activity
2023-11-23 14:44:06 +05:30
Lakhan Baheti
9ba724b78d
fix: onboarding bugs & improvements (#2839)
* fix: terms & condition alignment

* fix: onboarding page scrolling

* fix: create workspace name clear

* fix: setup profile sidebar workspace name

* fix: invite team screen button text

* fix: inner div min height

* fix: allow single invite also in invite member

* fix: UI clipping in invite members

* fix: signin screen scroll

* fix: sidebar notification icon

* fix: sidebar project name & icon

* fix: user detail bottom image alignment

* fix: step indicator in invite member

* fix: try different account modal state

* fix: setup profile remove image

* fix: workspace slug clear

* fix: invite member UI & focus

* fix: step indicator size

* fix: inner div placement

* fix: invite member validation logic

* fix: current user data persistence

* fix: sidebar animation colors

* feat: signup & resend

* fix: sign out theme persist from popover

* fix: imports

* chore: signin responsiveness

* fix: sign-in, sign-up top padding
2023-11-23 13:45:00 +05:30
Bavisetti Narayan
c2da9783a3
chore: change password endpoint (#2842) 2023-11-23 13:44:50 +05:30
Anmol Singh Bhatia
784be47e91
[FED-888] fix: parent issue select modal improvement (#2837)
This PR includes improvements for the parent issue select modal.
2023-11-22 16:16:52 +05:30
Anmol Singh Bhatia
0fdd9c28bf
fix: project setting ui consistency (#2835) 2023-11-22 15:36:34 +05:30
Anmol Singh Bhatia
644b06749b
fix: profile setting overflow (#2834) 2023-11-22 15:35:51 +05:30
Anmol Singh Bhatia
dd8c7a7487
fix: cycle and module create/update modal fix (#2833) 2023-11-22 15:35:24 +05:30
Anmol Singh Bhatia
e6a1f34713
fix: module and cycle sidebar loading state (#2831) 2023-11-22 15:34:39 +05:30
sabith-tu
1dff6b63f8
style: new empty project screen (#2832) 2023-11-22 15:34:06 +05:30
Anmol Singh Bhatia
59dbbb29cd
fix: custom analytics project dropdown fix (#2828) 2023-11-22 14:55:18 +05:30
Anmol Singh Bhatia
6cb3939835
style: project card improvement (#2827) 2023-11-22 14:54:52 +05:30
Anmol Singh Bhatia
021c0675b7
fix: module sidebar link section (#2830) 2023-11-22 14:36:29 +05:30
Anmol Singh Bhatia
67000892e5
chore: dashboard redirection fix (#2826) 2023-11-22 13:47:59 +05:30
sriram veeraghanta
3df4794e77
fix: AI Assistance hide/unhide depending on the configuration (#2825)
* fix: gpt error handling

* fix: enabling ai assistance only when it is configured.
2023-11-22 13:20:59 +05:30
Prateek Shourya
42ccd1de58
Style: UI improvements (#2824)
* style: update notification Read status toast alert description.

* style: update issue subscribe button design.

* fix: remove group_by `none` display filter from the kanban view in profile and draft issues.

* style: design improvement in members settings.
* style: add display name for all user role.
* style: remove email for user roles other than admin.
* style: fix border color as per designs.
2023-11-22 12:58:55 +05:30
Aaryan Khandelwal
c8c89007c0
style: revamped page details UI (#2823)
* style: revamp page details UI

* chore: updated the info popover date format

* fix: page actions mutation

* style: made the page content responsive
2023-11-22 12:32:49 +05:30
Bavisetti Narayan
4cf3e69e22
chore: file asset update (#2816)
* chore: endpoint to update file asset

* chore: aws storage endpoint change
2023-11-21 17:52:19 +05:30
Lakhan Baheti
fb1f65c2c1
fix: sidebar project section hover (#2818)
* fix: sidebar project section hover

* fix: icons alignment
2023-11-21 17:37:17 +05:30
Lakhan Baheti
d91b4e6fa1
fix: bug fixes & UI improvements (#2819)
* fix: profile setting fields border

* fix: webhooks empty state UI

* fix: cycle delete redirection from cycle detail

* fix: integration access restriction
2023-11-21 17:35:29 +05:30
Aaryan Khandelwal
561223ea71
chore: update join project endpoint (#2821) 2023-11-21 17:35:15 +05:30
Aaryan Khandelwal
982eba0bd1
fix: complete pages editor not clickable, recent pages calculation logic (#2820)
* fix: whole editor not clickable

* fix: recent pages calculation

* chore: update older pages calculation logic in recent pages list

* fix: archived pages computed function

* chore: add type for older pages
2023-11-21 15:47:34 +05:30
Aaryan Khandelwal
7aaf840fb1
refactor: command k modal (#2803)
* refactor: command palette file structure

* fix: identifier search
2023-11-21 15:46:41 +05:30
Nikhil
15927c9cae
dev: change url for the license engine instance registration (#2810) 2023-11-20 21:32:45 +05:30
Bavisetti Narayan
d46d70fcd5
chore: removed DOCKERIZED value and changed REDIS_SSL (#2813)
* chore: removed DOCKERIZED value

* chore: changed redis ssl
2023-11-20 21:32:00 +05:30
Henit Chobisa
de581102e3
feat: New Pages with Enhanced Document Editor Packages made over Editor Core 📝 (#2784)
* fix: page transaction model

* fix: page transaction model

* feat: updated ui for page route

* chore: initialized `document-editor` package for plane

* fix: format persistence while pasting markdown in editor

* feat: Inititalized Document-Editor and Editor with Ref

* feat: added tooltip component and slash command for editor

* feat: added `document-editor` extensions

* feat: added custom search component for embedding labels

* feat: added top bar menu component

* feat: created document-editor exposed components

* feat: integrated `document-editor` in `pages` route

* chore: updated dependencies

* feat: merge conflict resolution

* chore: modified configuration for document editor

* feat: added content browser menu for document editor summary

* feat: added fixed menu and editor instances

* feat: added document editor instances and summary table

* feat: implemented document-editor in PageDetail

* chore: css and export fixes

* fix: migration and optimisation

* fix: added `on_create` hook in the core editor

* feat: added conditional menu bar action in document-editor

* feat: added menu actions from single page view

* feat: added services for archiving, unarchiving and retrieving archived pages

* feat: added services for page archives

* feat: implemented page archives in page list view

* feat: implemented page archives in document-editor

* feat: added editor marking hook

* chore: separated editor header from the main content

* chore: separated editor summary utilities from the main editor

* chore: refactored necessary components from the document editor

* chore: removed summary sidebar component from the main content editor

* chore: removed scrollSummaryDependency from Header and Sidebar

* feat: separated page renderer as a separate component

* chore: separated page_renderer and sidebar as components from index

* feat: added locked property to IPage type

* feat: added lock/unlock services in page service

* chore: separated DocumentDetails as exported interface from index

* feat: separated document editor configs as separate interfaces

* chore: separated menu options from the editor header component

* fix: fixed page_lock performing lock/unlock operation on queryset instead of single instance

* fix: css positioning changes

* feat: added archive/lock alert labels

* feat: added boolean props in menu-actions/options

* feat: added lock/unlock & archive/unarchive services

* feat: added on update mutations for archived pages in page-view

* feat: added archive/lock on_update mutations in single page view

* feat: exported readonly editor for locked pages

* chore: separated kanban menu props to avoid passing redundant data

* fix: readonly editor not generating markings on first render

* fix: chevron overflowing from editor-header

* chore: removed unused utility actions

* fix: enabled sidebar view by default

* feat: removed locking on pages in archived state

* feat: added indentation in heading component

* fix: button classnames in vertical dropdowns

* feat: added `last_archived_at` and `last_edited_at` details in editor-header

* feat: changed types for archived updates and document last updates

* feat: updated editor and header props

* feat: updated queryset according to new page query format

* feat: added parameters in page view for shared / private pages

* feat: updated other-page-view to shared page view && same with private pages

* feat: added page-view as shared / private

* fix: replaced deleting with archiving for pages

* feat: handle restoring of page from archived section from list view

* feat: made privilege-based option rendering for pages

* feat: removed layout view for page list view

* feat: linting changes

* fix: adding mobx changes to pages

* fix: removed unnecessary migrations

* fix: mobx store changes

* fix: adding date-fns package

* fix: updating yarn lock

* fix: removing unnecessary method params

* chore: added access specifier to the create/update page modal

* fix: tab view layout changes

* chore: delete endpoint for page

* fix: page actions, including- archive, favorite, access control, delete

* chore: remove archive page modal

* fix: build errors

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: sriramveeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-20 21:31:12 +05:30
Prateek Shourya
b903126e5a
feat: Instance Admin Panel: Configuration Settings (#2800)
* feat: Instance Admin Panel: Configuration Settings

* refactor: separate Google and GitHub forms into independent components.

* feat: add admin auth wrapper and access denied page.

* style: design updates.
2023-11-20 20:46:49 +05:30
sabith-tu
f44f70168f
style: changing profile screen title (#2814) 2023-11-20 20:46:15 +05:30
sriram veeraghanta
3c10f00b04
fix: minor fix (#2815) 2023-11-20 20:24:35 +05:30
Lakhan Baheti
f1de05e4de
chore: onboarding (#2790)
* style: onboarding light version

* style: dark mode

* fix: onboarding gradient

* refactor: imports

* chore: add use case field in users api

* feat: delete account

* fix: delete modal points alignment

* feat: usecase in profile

* fix: build error

* fix: typos & hardcoded strings

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2023-11-20 19:31:19 +05:30
Prashant Indurkar
61d4e2e016
Fixed: auto-focus the field while creating new Add Labels #2437 (#2438)
* bug: fix recent page hiding last item on scroll #1468

* bug: fix recent page hiding last item on scroll #1468 (#2411)

* fixed add label autofocus

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-20 19:00:55 +05:30
Lakhan Baheti
c1eb5055e5
fix: bug fixes & ui improvements. (#2772)
* fix: create project modal member select

* fix: overflow in workspace activity

* fix: member selected state
2023-11-20 16:36:50 +05:30
Bavisetti Narayan
8d942e28da
chore: ams url name changed (#2808) 2023-11-20 16:34:57 +05:30
Nikhil
f7461af3f5
dev: open ai configuration (#2807) 2023-11-20 16:03:31 +05:30
Aaryan Khandelwal
29f3e02adc
refactor: project estimates store (#2801)
* refactor: remove estimates from project store

* chore: update all the instances of the old store

* chore: update store declaration structure
2023-11-20 15:58:40 +05:30
Nikhil
9a704458b3
dev: external apis (#2806)
* dev: new proxy api setup

* dev: updated endpoints with serializers and structure

* dev: external apis for cycles, modules and inbox issue

* dev: order by for all the apis

* dev: enable webhooks for external apis

* dev: fields and expand for the apis

* dev: move authentication to proxy middleware

* dev: fix imports

* dev: api serializer updates and paginator

* dev: renamed api to app

* dev: renamed proxy to api

* dev: validation for project, issues, modules and cycles

* dev: remove favourites from project apis

* dev: states api

* dev: rewrite the url endpoints

* dev: exception handling for the apis

* dev: merge updated structure

* dev: remove attachment apis

* dev: issue activities endpoints
2023-11-20 15:58:17 +05:30
Aaryan Khandelwal
668dfd2e38
chore: update exception detected screen action button (#2805) 2023-11-20 15:00:36 +05:30
Bavisetti Narayan
3b3f94ed03
fix: file asset delete (#2804) 2023-11-20 14:53:06 +05:30
onFire(Abhi)
e945aa9b71
fix: newly added cycle doesn't appear unless the page is manually reloaded (#2673)
* fix: newly added cycle doesn't appear unless the page is manually reloaded

* Delete \

* Delete web/layouts/profile-layout/profile-sidebar.tsx

* Update cycles.store.ts

* fix: remove duplicate type declaration

---------

Co-authored-by: Aaryan Khandelwal <65252264+aaryan610@users.noreply.github.com>
2023-11-20 14:36:46 +05:30
sriram veeraghanta
6595a387d0
feat: event tracking using posthog and created application provider to render multiple wrappers (#2757)
* fix: event tracker changes

* fix: App provider implementation using wrappers

* fix: updating packages

* fix: handling warning

* fix: wrapper fixes and minor optimization changes

* chore: app-provider cleanup

* fix: cleanup

* fix: removing jitsu tracking

* fix: minor updates

* fix: adding event to posthog event tracker (#2802)

* dev: posthog event tracker update initiate

* fix: adding events for posthog integration

* fix: event payload

---------

Co-authored-by: Ramesh Kumar Chandra <31303617+rameshkumarchandra@users.noreply.github.com>
2023-11-20 13:29:54 +05:30
Dakshesh Jain
8839e42dc0
fix: archive issue bugs (#2712)
* fix: blur on side/modal peek view

* fix: delete archive not working on list layout when group by is none

* fix: show empty group has no effect

* fix: filter/display options same as production

* fix: disabling full-screen peek-overview for archive issues

* fix: truncate in calendar view
2023-11-20 12:48:30 +05:30
Nikhil
9db6312081
fix: self hosted instance (#2795)
* dev: update create bucket script

* dev: update patch endpoint for instance configuration

* dev: add google client secret and default values for ADMIN_EMAIL and LICENSE_ENGINE_BASE_URL
2023-11-20 12:36:48 +05:30
Dakshesh Jain
779ef2a4aa
fix: delete issues in spreadsheet doesn't work (#2718)
Co-authored-by: Aaryan Khandelwal <65252264+aaryan610@users.noreply.github.com>
2023-11-20 12:22:43 +05:30
Bavisetti Narayan
51e17643a2
fix: file structuring (#2797)
* fix: file structure changes

* fix: pages update

* fix: license imports changed
2023-11-20 11:59:20 +05:30
Nikhil
4c2074b6ff
dev: environment settings (#2794)
* dev: update environment configuration

* dev: update the takeoff script for instance registration
2023-11-19 01:48:05 +05:30
Lakhan Baheti
c9ffc9465f
fix: Labels delete & reordering (#2729)
* fix: Labels reordering inconsistency

* fix: Delete child labels

* feat: multi-select while grouping labels

* refactor: label sorting in mobx computed function

* feat: drag & drop label grouping, un-grouping

* chore: removed label select modal

* fix: moving labels from project store to project label store

* fix: typo changes and build tree function added

* labels feature

* disable dropping group into a group

* fix build errors

* fix more issues

* chore: added combining state UI, fixed scroll issue for label groups

* chore: group icon for label groups

* fix: group cannot be dropped in another group

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-19 01:46:11 +05:30
Bavisetti Narayan
2b6c489513
feat: v3 endpoint for module and cycle (#2786)
* feat: v3 endpoint for module and cycle

* fix: removed the str
2023-11-18 16:30:35 +05:30
M. Palanikannan
0c63f21718
fix: Task List Behaviour in Editor (#2789)
* better variable names and comments

* drag drop migrated

* custom horizontal rule created

* init transaction hijack

* fixed code block with better contrast, keyboard triple enter press disabled and syntax highlighting

* fixed link selector closing on open behaviour

* added better keymaps and syntax highlights

* made drag and drop working for code blocks

* fixed drag drop for code blocks

* moved drag drop only to rich text editor

* fixed drag and drop only for description

* enabled drag handles for peek overview and main issues

* got images to old state

* fixed task lists to be smaller

* removed validate image functions and unnecessary imports

* table icons svg attributes fixed

* custom list keymap extension added

* more unnecessary imports of validate image removed

* removed console logs

* fixed drag-handle styles

* space styles updated for the editor

* removed showing quotes from blockquotes

* removed validateImage for now

* added better comments and improved redundant renders

* removed unnecessary console logs

* created util for creating the drag handle element

* fixed file names
2023-11-18 16:20:35 +05:30
Nikhil
a987df38f4
chore: user onboarding workflow (#2791) 2023-11-18 16:18:06 +05:30
sriram veeraghanta
878707f444
feat: Instance Registration and Configuration (#2793)
* dev: remove default user

* dev: initiate licensing

* dev: remove migration file 0046

* feat: self hosted licensing initialize

* dev: instance licenses

* dev: change license response structure

* dev: add default properties and issue mention migration

* dev: reset migrations

* dev: instance configuration

* dev: instance configuration migration

* dev: update instance configuration model to take null and empty values

* dev: instance configuration variables

* dev: set default values

* dev: update instance configuration load

* dev: email configuration settings moved to database

* dev: instance configuration on instance bootup

* dev: auto instance registration script

* dev: instance admin

* dev: enable instance configuration and instance admin roles

* dev: instance owner fix

* dev: instance configuration values

* dev: fix instance permissions and serializer

* dev: fix email senders

* dev: remove deprecated variables

* dev: fix current site domain registration

* dev: update cors setup and local settings

* dev: migrate instance registration and configuration to manage commands

* dev: check email validity

* dev: update script to use manage command

* dev: default bucket creation script

* dev: instance admin routes and initial set of screens

* dev: admin api to check if the current user is admin

* dev: instance admin unique constraints

* dev: check magic link login

* dev: fix email sending for ssl

* dev: create instance activation route if the instance is not activated during startup

* dev: removed DJANGO_SETTINGS_MODULE from environment files and deleted auto bucket create script

* dev: environment configuration for backend

* dev: fix access token variable error

* feat: Instance Admin Panel: General Settings (#2792)

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
2023-11-18 16:17:01 +05:30
M. Palanikannan
9369ee5008
[feat]: Drag and Drop Handles for all Data Structures (#2745)
* better variable names and comments

* drag drop migrated

* custom horizontal rule created

* init transaction hijack

* fixed code block with better contrast, keyboard triple enter press disabled and syntax highlighting

* fixed link selector closing on open behaviour

* added better keymaps and syntax highlights

* made drag and drop working for code blocks

* fixed drag drop for code blocks

* moved drag drop only to rich text editor

* fixed drag and drop only for description

* enabled drag handles for peek overview and main issues

* got images to old state
2023-11-17 12:29:30 +05:30
Manish Gupta
0a88db975a
dev: Self Hosting with private repo fixes (#2787)
* fixes to self hosting

* self hosting fixes

* removed .temp

* wip

* wip

* self install private repo

* folder change

* fix

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-17 11:51:54 +05:30
Manish Gupta
dd60dec887
Dev/mg selfhosting fix (#2782)
* fixes to self hosting

* self hosting fixes

* removed .temp

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-16 14:38:55 +05:30
Bavisetti Narayan
0c1097592e
fix: pages revamping (#2760)
* fix: page transaction model

* fix: page transaction model

* fix: migration and optimisation

* fix: back migration of page blocks

* fix: added issue embed

* fix: migration fixes

* fix: resolved changes
2023-11-16 14:38:12 +05:30
Anmol Singh Bhatia
bed66235f2
style: workspace sidebar dropdown improvement (#2783) 2023-11-16 14:11:33 +05:30
Nikhil
26b1e9d5f1
dev: squashed migrations (#2779)
* dev: migration squash

* dev: migrations squashed for apis and webhooks

* dev: packages updated and moved dj-database-url to local settings

* dev: update package changes
2023-11-15 17:15:02 +05:30
Bavisetti Narayan
79347ec62b
feat: api webhooks (#2543)
* dev: initiate external apis

* dev: external api

* dev: external public api implementation

* dev: add prefix to all api tokens

* dev: flag to enable disable api token api access

* dev: webhook model create and apis

* dev: webhook settings

* fix: webhook logs

* chore: removed drf spectacular

* dev: remove retry_count and fix api logging for get requests

* dev: refactor webhook logic

* fix: celery retry mechanism

* chore: event and action change

* chore: migrations changes

* dev: proxy setup for apis

* chore: changed retry time and cleanup

* chore: added issue comment and inbox issue api endpoints

* fix: migration files

* fix: added env variables

* fix: removed issue attachment from proxy

* fix: added new migration file

* fix: restricted webhook access

* chore: changed urls

* chore: fixed project serializer

* fix: set expire for api token

* fix: retrieve endpoint for api token

* feat: Api Token screens & api integration

* dev: webhook endpoint changes

* dev: add fields for webhook updates

* feat: Download Api secret key

* chore: removed BASE API URL

* feat: revoke token access

* dev: migration fixes

* feat: workspace webhooks (#2748)

* feat: workspace webhook store, services integration and rendered webhook list and create

* chore: handled webhook update and regenerate token in workspace webhooks

* feat: regenerate key and delete functionality

---------

Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>

* fix: url validation added

* fix: separated env for webhook and api

* Webhooks refactoring

* add show option for generated hook key

* Api token restructure

* webhook minor fixes

* fix build errors

* chore: improvements in file structring

* dev: rate limiting the open apis

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: LAKHAN BAHETI <lakhanbaheti9@gmail.com>
Co-authored-by: rahulramesha <71900764+rahulramesha@users.noreply.github.com>
Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>
Co-authored-by: Nikhil <118773738+pablohashescobar@users.noreply.github.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-15 15:56:57 +05:30
Nikhil
7b965179d8
dev: update bucket script to make the bucket public (#2767)
* dev: update bucket script to make the bucket public

* dev: remove auto bucket script from docker compose
2023-11-15 15:56:08 +05:30
Nikhil
fc51ffc589
chore: user workflow (#2762)
* dev: workspace member deactivation and leave endpoints and filters

* dev: deactivated for project members

* dev: project members leave

* dev: project member check on workspace deactivation

* dev: project member queryset update and remove leave project endpoint

* dev: rename is_deactivated to is_active and user deactivation apis

* dev: check if the user is already part of workspace then make them active

* dev: workspace and project save

* dev: update project members to make them active

* dev: project invitation

* dev: automatic user workspace and project member create when user sign in/up

* dev: fix member invites

* dev: rename deactivation variable

* dev: update project member invitation

* dev: additional permission layer for workspace

* dev: update the url for  workspace invitations

* dev: remove invitation urls from users

* dev: cleanup workspace invitation workflow

* dev: workspace and project invitation
2023-11-15 15:53:16 +05:30
sabith-tu
96f6e37cc5
fix: Delete estimate popup is not closing automatically (#2777) 2023-11-15 14:08:52 +05:30
Nikhil
29774ce84a
dev: API settings (#2594)
* dev: update settings file structure and added extra settings for CORS

* dev: remove WEB_URL variable and add celery integration for sentry

* dev: aws and minio settings

* dev: add cors origins to env

* dev: update settings
2023-11-15 12:31:52 +05:30
Nikhil
8cbe9c26fc
enhancement: label sort order (#2763)
* chore: label sort ordering

* dev: ordering

* fix: sort order

* fix: save of labels

* dev: remove ordering by name

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-15 12:25:44 +05:30
Prateek Shourya
7f42566207
Fix: Custom menu item not automatically closing, affecting delete popup behavior. (#2771) 2023-11-14 23:05:30 +05:30
Ankush Deshmukh
b60237b676
Standardizing priority icons across the platform (#2776) 2023-11-14 20:52:43 +05:30
Prateek Shourya
1fe09d369f
style: text overflow fix and border color update (#2769)
* style: fix text overflow in:
* Issue activity
* Cycle and Module Select in Create Issue form
* Delete Module modal
* Join Project modal

* style: update assignee select border as per design.
2023-11-14 18:34:51 +05:30
Dakshesh Jain
b7757c6b1a
fix: bugs (#2761)
* fix: semicolon on estimate settings page

* refactor: project settings automations store implementation

* fix: active cycle stuck on infinite loading

* fix: removed delete project option from sidebar

* fix: disclosure not opening when navigating to project

* fix: clear filter not working & filter appearing even if nothing is selected

* refactor: select label store implementation

* refactor: select state store implementation
2023-11-14 18:33:01 +05:30
Anmol Singh Bhatia
1a25bacce1
style: create update view modal consistency (#2775) 2023-11-14 18:30:10 +05:30
Anmol Singh Bhatia
6797df239d
chore: no lead option added in lead select dropdown (#2774) 2023-11-14 18:29:39 +05:30
Anmol Singh Bhatia
43e7c10eb7
chore: spreadsheet layout column responsiveness (#2768) 2023-11-14 18:28:49 +05:30
Anmol Singh Bhatia
bdc9c9c2a8
chore: create update issue modal improvement (#2765) 2023-11-14 18:28:15 +05:30
Anmol Singh Bhatia
f0c72bf249
fix: breadcrumb project icon improvement (#2764) 2023-11-14 18:27:47 +05:30
sabith-tu
a8904bfc48
style: ui fixes for pages and views (#2770) 2023-11-14 18:26:50 +05:30
Nikhil
b31041726b
dev: create bucket through application (#2720) 2023-11-13 15:57:19 +05:30
Prateek Shourya
e6f947ad90
style: ui improvements and bug fixes (#2758)
* style: add transition to favorite projects dropdown.

* style: update project integration settings borders.

* style: fix text overflow issue in project views.

* fix: issue with non-functional cancel button in leave project modal.
2023-11-13 14:42:45 +05:30
Dakshesh Jain
7963993171
fix: workspace settings bugs (#2743)
* fix: double layout in exports

* fix: typo in jira email address section

* fix: workspace members not mutating

* fix: removed unused variable

* fix: workspace members can't be filtered using email

* fix: autocomplete in workspace delete

* fix: autocomplete in project delete modal

* fix: update member function in store

* fix: sidebar link not active when in github/jira

* style: margin top & icon inconsistency

* fix: typo in create workspace

* fix: workspace leave flow

* fix: redirection to delete issue

* fix: autocomplete off in jira api token

* refactor: reduced api call, added optional chaining & removed variable with low scope
2023-11-13 13:34:05 +05:30
2517 changed files with 87305 additions and 119636 deletions

.deepsource.toml (new file, 23 lines)

@@ -0,0 +1,23 @@
version = 1
exclude_patterns = [
"bin/**",
"**/node_modules/",
"**/*.min.js"
]
[[analyzers]]
name = "shell"
[[analyzers]]
name = "javascript"
[analyzers.meta]
plugins = ["react"]
environment = ["nodejs"]
[[analyzers]]
name = "python"
[analyzers.meta]
runtime_version = "3.x.x"


@@ -2,16 +2,5 @@
 *.pyc
 .env
 venv
-node_modules/
-**/node_modules/
+node_modules
 npm-debug.log
-.next/
-**/.next/
-.turbo/
-**/.turbo/
-build/
-**/build/
-out/
-**/out/
-dist/
-**/dist/


@@ -1,12 +1,14 @@
 # Database Settings
-POSTGRES_USER="plane"
-POSTGRES_PASSWORD="plane"
-POSTGRES_DB="plane"
-PGDATA="/var/lib/postgresql/data"
+PGUSER="plane"
+PGPASSWORD="plane"
+PGHOST="plane-db"
+PGDATABASE="plane"
+DATABASE_URL=postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}/${PGDATABASE}
 
 # Redis Settings
 REDIS_HOST="plane-redis"
 REDIS_PORT="6379"
+REDIS_URL="redis://${REDIS_HOST}:6379/"
 
 # AWS Settings
 AWS_REGION=""
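The two URL-style settings added above are composed from the surrounding variables. A quick shell sketch of that interpolation, using the default values from the diff (docker compose expands the `${VAR}` references the same way the shell does here):

```shell
# Default values taken from the env diff above.
PGUSER="plane"
PGPASSWORD="plane"
PGHOST="plane-db"
PGDATABASE="plane"
REDIS_HOST="plane-redis"

# The composed settings expand to concrete connection URLs.
DATABASE_URL="postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}/${PGDATABASE}"
REDIS_URL="redis://${REDIS_HOST}:6379/"

echo "$DATABASE_URL"   # postgresql://plane:plane@plane-db/plane
echo "$REDIS_URL"      # redis://plane-redis:6379/
```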


@@ -1,8 +1,7 @@
 name: Bug report
 description: Create a bug report to help us improve Plane
 title: "[bug]: "
-labels: [🐛bug]
-assignees: [srinivaspendem, pushya22]
+labels: [bug, need testing]
 body:
   - type: markdown
     attributes:
@@ -45,7 +44,7 @@ body:
       - Deploy preview
     validations:
       required: true
-  - type: dropdown
+    type: dropdown
     id: browser
     attributes:
       label: Browser


@@ -1,8 +1,7 @@
 name: Feature request
 description: Suggest a feature to improve Plane
 title: "[feature]: "
-labels: [✨feature]
-assignees: [srinivaspendem, pushya22]
+labels: [feature]
 body:
   - type: markdown
     attributes:


@@ -1,84 +0,0 @@
name: Auto Merge or Create PR on Push
on:
  workflow_dispatch:
  push:
    branches:
      - "sync/**"
env:
  CURRENT_BRANCH: ${{ github.ref_name }}
  SOURCE_BRANCH: ${{ secrets.SYNC_SOURCE_BRANCH_NAME }} # The sync branch such as "sync/ce"
  TARGET_BRANCH: ${{ secrets.SYNC_TARGET_BRANCH_NAME }} # The target branch that you would like to merge changes like develop
  GITHUB_TOKEN: ${{ secrets.ACCESS_TOKEN }} # Personal access token required to modify contents and workflows
  REVIEWER: ${{ secrets.SYNC_PR_REVIEWER }}
jobs:
  Check_Branch:
    runs-on: ubuntu-latest
    outputs:
      BRANCH_MATCH: ${{ steps.check-branch.outputs.MATCH }}
    steps:
      - name: Check if current branch matches the secret
        id: check-branch
        run: |
          if [ "$CURRENT_BRANCH" = "$SOURCE_BRANCH" ]; then
            echo "MATCH=true" >> $GITHUB_OUTPUT
          else
            echo "MATCH=false" >> $GITHUB_OUTPUT
          fi
  Auto_Merge:
    if: ${{ needs.Check_Branch.outputs.BRANCH_MATCH == 'true' }}
    needs: [Check_Branch]
    runs-on: ubuntu-latest
    permissions:
      pull-requests: write
      contents: write
    steps:
      - name: Checkout code
        uses: actions/checkout@v4.1.1
        with:
          fetch-depth: 0 # Fetch all history for all branches and tags
      - name: Setup Git
        run: |
          git config user.name "GitHub Actions"
          git config user.email "actions@github.com"
      - name: Setup GH CLI and Git Config
        run: |
          type -p curl >/dev/null || (sudo apt update && sudo apt install curl -y)
          curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg | sudo dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg
          sudo chmod go+r /usr/share/keyrings/githubcli-archive-keyring.gpg
          echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages stable main" | sudo tee /etc/apt/sources.list.d/github-cli.list > /dev/null
          sudo apt update
          sudo apt install gh -y
      - name: Check for merge conflicts
        id: conflicts
        run: |
          git fetch origin $TARGET_BRANCH
          git checkout $TARGET_BRANCH
          # Attempt to merge the main branch into the current branch
          if $(git merge --no-commit --no-ff $SOURCE_BRANCH); then
            echo "No merge conflicts detected."
            echo "HAS_CONFLICTS=false" >> $GITHUB_ENV
          else
            echo "Merge conflicts detected."
            echo "HAS_CONFLICTS=true" >> $GITHUB_ENV
            git merge --abort
          fi
      - name: Merge Change to Target Branch
        if: env.HAS_CONFLICTS == 'false'
        run: |
          git commit -m "Merge branch '$SOURCE_BRANCH' into $TARGET_BRANCH"
          git push origin $TARGET_BRANCH
      - name: Create PR to Target Branch
        if: env.HAS_CONFLICTS == 'true'
        run: |
          # Replace 'username' with the actual GitHub username of the reviewer.
          PR_URL=$(gh pr create --base $TARGET_BRANCH --head $SOURCE_BRANCH --title "sync: merge conflicts need to be resolved" --body "" --reviewer $REVIEWER)
          echo "Pull Request created: $PR_URL"
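The conflict probe in the workflow above is a reusable pattern. A self-contained shell sketch that reproduces it in a throwaway repository; the repository layout and branch names here are illustrative, only the `git merge --no-commit --no-ff` probe and the abort-on-conflict handling come from the workflow:

```shell
# Build a tiny repo in which two branches edit the same line,
# then run the workflow's dry-run merge probe against it.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "ci@example.com"
git config user.name "ci"
git checkout -qb develop
echo base > file.txt
git add file.txt
git commit -qm "base"

# Diverge: both branches rewrite the same line of file.txt.
git checkout -qb sync/ce
echo sync-change > file.txt
git commit -qam "sync change"
git checkout -q develop
echo develop-change > file.txt
git commit -qam "develop change"

# Same probe as the "Check for merge conflicts" step: attempt the
# merge without committing, and abort it if conflicts appear.
if git merge --no-commit --no-ff sync/ce >/dev/null 2>&1; then
  HAS_CONFLICTS=false
else
  git merge --abort
  HAS_CONFLICTS=true
fi
echo "HAS_CONFLICTS=$HAS_CONFLICTS"
```

Because both branches touch the same line, the probe reports a conflict and the repository is left clean after the abort.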


@@ -1,122 +1,98 @@
 name: Branch Build
 on:
-  workflow_dispatch:
-  push:
+  pull_request:
+    types:
+      - closed
     branches:
       - master
       - preview
-      - qa
-      - develop
   release:
     types: [released, prereleased]
 env:
-  TARGET_BRANCH: ${{ github.ref_name || github.event.release.target_commitish }}
+  TARGET_BRANCH: ${{ github.event.pull_request.base.ref || github.event.release.target_commitish }}
 jobs:
   branch_build_setup:
+    if: ${{ (github.event_name == 'pull_request' && github.event.action =='closed' && github.event.pull_request.merged == true) || github.event_name == 'release' }}
     name: Build-Push Web/Space/API/Proxy Docker Image
-    runs-on: ubuntu-latest
-    outputs:
-      gh_branch_name: ${{ steps.set_env_variables.outputs.TARGET_BRANCH }}
-      gh_buildx_driver: ${{ steps.set_env_variables.outputs.BUILDX_DRIVER }}
-      gh_buildx_version: ${{ steps.set_env_variables.outputs.BUILDX_VERSION }}
-      gh_buildx_platforms: ${{ steps.set_env_variables.outputs.BUILDX_PLATFORMS }}
-      gh_buildx_endpoint: ${{ steps.set_env_variables.outputs.BUILDX_ENDPOINT }}
-      build_frontend: ${{ steps.changed_files.outputs.frontend_any_changed }}
-      build_space: ${{ steps.changed_files.outputs.space_any_changed }}
-      build_backend: ${{ steps.changed_files.outputs.backend_any_changed }}
-      build_proxy: ${{ steps.changed_files.outputs.proxy_any_changed }}
+    runs-on: ubuntu-20.04
     steps:
-      - id: set_env_variables
-        name: Set Environment Variables
-        run: |
-          if [ "${{ env.TARGET_BRANCH }}" == "master" ] || [ "${{ github.event_name }}" == "release" ]; then
-            echo "BUILDX_DRIVER=cloud" >> $GITHUB_OUTPUT
-            echo "BUILDX_VERSION=lab:latest" >> $GITHUB_OUTPUT
-            echo "BUILDX_PLATFORMS=linux/amd64,linux/arm64" >> $GITHUB_OUTPUT
-            echo "BUILDX_ENDPOINT=makeplane/plane-dev" >> $GITHUB_OUTPUT
-          else
-            echo "BUILDX_DRIVER=docker-container" >> $GITHUB_OUTPUT
-            echo "BUILDX_VERSION=latest" >> $GITHUB_OUTPUT
-            echo "BUILDX_PLATFORMS=linux/amd64" >> $GITHUB_OUTPUT
-            echo "BUILDX_ENDPOINT=" >> $GITHUB_OUTPUT
-          fi
-          echo "TARGET_BRANCH=${{ env.TARGET_BRANCH }}" >> $GITHUB_OUTPUT
-      - id: checkout_files
-        name: Checkout Files
-        uses: actions/checkout@v4
-      - name: Get changed files
-        id: changed_files
-        uses: tj-actions/changed-files@v42
-        with:
-          files_yaml: |
-            frontend:
-              - web/**
-              - packages/**
-              - 'package.json'
-              - 'yarn.lock'
-              - 'tsconfig.json'
-              - 'turbo.json'
-            space:
-              - space/**
-              - packages/**
-              - 'package.json'
-              - 'yarn.lock'
-              - 'tsconfig.json'
-              - 'turbo.json'
-            backend:
-              - apiserver/**
-            proxy:
-              - nginx/**
+      - name: Check out the repo
+        uses: actions/checkout@v3.3.0
+      - name: Uploading Proxy Source
+        uses: actions/upload-artifact@v3
+        with:
+          name: proxy-src-code
+          path: ./nginx
+      - name: Uploading Backend Source
+        uses: actions/upload-artifact@v3
+        with:
+          name: backend-src-code
+          path: ./apiserver
+      - name: Uploading Web Source
+        uses: actions/upload-artifact@v3
+        with:
+          name: web-src-code
+          path: |
+            ./
+            !./apiserver
+            !./nginx
+            !./deploy
+            !./space
+      - name: Uploading Space Source
+        uses: actions/upload-artifact@v3
+        with:
+          name: space-src-code
+          path: |
+            ./
+            !./apiserver
+            !./nginx
+            !./deploy
+            !./web
+    outputs:
+      gh_branch_name: ${{ env.TARGET_BRANCH }}
   branch_build_push_frontend:
-    if: ${{ needs.branch_build_setup.outputs.build_frontend == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
     runs-on: ubuntu-20.04
     needs: [branch_build_setup]
     env:
-      FRONTEND_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend:${{ needs.branch_build_setup.outputs.gh_branch_name }}
-      TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
-      BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
-      BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
-      BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
-      BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
+      FRONTEND_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-ce:${{ needs.branch_build_setup.outputs.gh_branch_name }}
     steps:
       - name: Set Frontend Docker Tag
         run: |
-          if [ "${{ github.event_name }}" == "release" ]; then
-            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend:stable,${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend:${{ github.event.release.tag_name }}
-          elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
-            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend:latest
+          if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ] && [ "${{ github.event_name }}" == "release" ]; then
+            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-ce:latest,${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-ce:${{ github.event.release.tag_name }}
+          elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
+            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-ce:stable
           else
             TAG=${{ env.FRONTEND_TAG }}
           fi
           echo "FRONTEND_TAG=${TAG}" >> $GITHUB_ENV
+      - name: Set up Docker Buildx
+        uses: docker/setup-buildx-action@v2.5.0
       - name: Login to Docker Hub
-        uses: docker/login-action@v3
+        uses: docker/login-action@v2.1.0
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}
-      - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
-        with:
-          driver: ${{ env.BUILDX_DRIVER }}
-          version: ${{ env.BUILDX_VERSION }}
-          endpoint: ${{ env.BUILDX_ENDPOINT }}
-      - name: Check out the repo
-        uses: actions/checkout@v4
+      - name: Downloading Web Source Code
+        uses: actions/download-artifact@v3
+        with:
+          name: web-src-code
       - name: Build and Push Frontend to Docker Container Registry
-        uses: docker/build-push-action@v5.1.0
+        uses: docker/build-push-action@v4.0.0
         with:
           context: .
           file: ./web/Dockerfile.web
-          platforms: ${{ env.BUILDX_PLATFORMS }}
+          platforms: linux/amd64
           tags: ${{ env.FRONTEND_TAG }}
           push: true
         env:
@@ -125,50 +101,40 @@ jobs:
           DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
   branch_build_push_space:
-    if: ${{ needs.branch_build_setup.outputs.build_space == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
     runs-on: ubuntu-20.04
     needs: [branch_build_setup]
     env:
-      SPACE_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space:${{ needs.branch_build_setup.outputs.gh_branch_name }}
-      TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
-      BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
-      BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
-      BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
-      BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
+      SPACE_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space-ce:${{ needs.branch_build_setup.outputs.gh_branch_name }}
     steps:
       - name: Set Space Docker Tag
         run: |
-          if [ "${{ github.event_name }}" == "release" ]; then
-            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-space:stable,${{ secrets.DOCKERHUB_USERNAME }}/plane-space:${{ github.event.release.tag_name }}
-          elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
-            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-space:latest
+          if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ] && [ "${{ github.event_name }}" == "release" ]; then
+            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-space-ce:latest,${{ secrets.DOCKERHUB_USERNAME }}/plane-space-ce:${{ github.event.release.tag_name }}
+          elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
+            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-space-ce:stable
           else
             TAG=${{ env.SPACE_TAG }}
           fi
           echo "SPACE_TAG=${TAG}" >> $GITHUB_ENV
+      - name: Set up Docker Buildx
+        uses: docker/setup-buildx-action@v2.5.0
       - name: Login to Docker Hub
-        uses: docker/login-action@v3
+        uses: docker/login-action@v2.1.0
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}
-      - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
-        with:
-          driver: ${{ env.BUILDX_DRIVER }}
-          version: ${{ env.BUILDX_VERSION }}
-          endpoint: ${{ env.BUILDX_ENDPOINT }}
-      - name: Check out the repo
-        uses: actions/checkout@v4
+      - name: Downloading Space Source Code
+        uses: actions/download-artifact@v3
+        with:
+          name: space-src-code
       - name: Build and Push Space to Docker Hub
-        uses: docker/build-push-action@v5.1.0
+        uses: docker/build-push-action@v4.0.0
         with:
           context: .
           file: ./space/Dockerfile.space
-          platforms: ${{ env.BUILDX_PLATFORMS }}
+          platforms: linux/amd64
           tags: ${{ env.SPACE_TAG }}
           push: true
         env:
@@ -177,50 +143,40 @@ jobs:
           DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
   branch_build_push_backend:
-    if: ${{ needs.branch_build_setup.outputs.build_backend == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
     runs-on: ubuntu-20.04
     needs: [branch_build_setup]
     env:
-      BACKEND_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend:${{ needs.branch_build_setup.outputs.gh_branch_name }}
-      TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
-      BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
-      BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
-      BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
-      BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
+      BACKEND_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-ce:${{ needs.branch_build_setup.outputs.gh_branch_name }}
     steps:
       - name: Set Backend Docker Tag
         run: |
-          if [ "${{ github.event_name }}" == "release" ]; then
-            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-backend:stable,${{ secrets.DOCKERHUB_USERNAME }}/plane-backend:${{ github.event.release.tag_name }}
-          elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
-            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-backend:latest
+          if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ] && [ "${{ github.event_name }}" == "release" ]; then
+            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-ce:latest,${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-ce:${{ github.event.release.tag_name }}
+          elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
+            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-ce:stable
           else
             TAG=${{ env.BACKEND_TAG }}
           fi
           echo "BACKEND_TAG=${TAG}" >> $GITHUB_ENV
+      - name: Set up Docker Buildx
+        uses: docker/setup-buildx-action@v2.5.0
       - name: Login to Docker Hub
-        uses: docker/login-action@v3
+        uses: docker/login-action@v2.1.0
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}
-      - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
-        with:
-          driver: ${{ env.BUILDX_DRIVER }}
-          version: ${{ env.BUILDX_VERSION }}
-          endpoint: ${{ env.BUILDX_ENDPOINT }}
-      - name: Check out the repo
-        uses: actions/checkout@v4
+      - name: Downloading Backend Source Code
+        uses: actions/download-artifact@v3
+        with:
+          name: backend-src-code
       - name: Build and Push Backend to Docker Hub
-        uses: docker/build-push-action@v5.1.0
+        uses: docker/build-push-action@v4.0.0
         with:
-          context: ./apiserver
-          file: ./apiserver/Dockerfile.api
-          platforms: ${{ env.BUILDX_PLATFORMS }}
+          context: .
+          file: ./Dockerfile.api
+          platforms: linux/amd64
           push: true
           tags: ${{ env.BACKEND_TAG }}
         env:
@@ -229,50 +185,41 @@ jobs:
           DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
   branch_build_push_proxy:
-    if: ${{ needs.branch_build_setup.outputs.build_proxy == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
     runs-on: ubuntu-20.04
     needs: [branch_build_setup]
     env:
-      PROXY_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy:${{ needs.branch_build_setup.outputs.gh_branch_name }}
-      TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
-      BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
-      BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
-      BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
-      BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
+      PROXY_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-ce:${{ needs.branch_build_setup.outputs.gh_branch_name }}
     steps:
       - name: Set Proxy Docker Tag
         run: |
-          if [ "${{ github.event_name }}" == "release" ]; then
-            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy:stable,${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy:${{ github.event.release.tag_name }}
-          elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
-            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy:latest
+          if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ] && [ "${{ github.event_name }}" == "release" ]; then
+            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-ce:latest,${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-ce:${{ github.event.release.tag_name }}
+          elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
+            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-ce:stable
           else
            TAG=${{ env.PROXY_TAG }}
           fi
           echo "PROXY_TAG=${TAG}" >> $GITHUB_ENV
+      - name: Set up Docker Buildx
+        uses: docker/setup-buildx-action@v2.5.0
       - name: Login to Docker Hub
-        uses: docker/login-action@v3
+        uses: docker/login-action@v2.1.0
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}
-      - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
-        with:
-          driver: ${{ env.BUILDX_DRIVER }}
-          version: ${{ env.BUILDX_VERSION }}
-          endpoint: ${{ env.BUILDX_ENDPOINT }}
-      - name: Check out the repo
-        uses: actions/checkout@v4
+      - name: Downloading Proxy Source Code
+        uses: actions/download-artifact@v3
+        with:
+          name: proxy-src-code
       - name: Build and Push Plane-Proxy to Docker Hub
-        uses: docker/build-push-action@v5.1.0
+        uses: docker/build-push-action@v4.0.0
         with:
-          context: ./nginx
-          file: ./nginx/Dockerfile
-          platforms: ${{ env.BUILDX_PLATFORMS }}
+          context: .
+          file: ./Dockerfile
+          platforms: linux/amd64
           tags: ${{ env.PROXY_TAG }}
           push: true
         env:
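Every build job in this workflow picks its Docker tag with the same shell pattern shown above: release events get a `stable` tag plus the release tag, `master` gets `latest`, and everything else gets the branch-named tag. A standalone sketch of that selection logic, with stand-in values for the secrets and event contexts (the `makeplane` username and `v0.14.0` release tag are made up for illustration):

```shell
# Stand-ins for ${{ secrets.* }} and ${{ github.* }} contexts.
DOCKERHUB_USERNAME="makeplane"
TARGET_BRANCH="master"
EVENT_NAME="release"
RELEASE_TAG="v0.14.0"
FRONTEND_TAG="$DOCKERHUB_USERNAME/plane-frontend:$TARGET_BRANCH"

# Same branching as the "Set Frontend Docker Tag" step.
if [ "$EVENT_NAME" = "release" ]; then
  TAG="$DOCKERHUB_USERNAME/plane-frontend:stable,$DOCKERHUB_USERNAME/plane-frontend:$RELEASE_TAG"
elif [ "$TARGET_BRANCH" = "master" ]; then
  TAG="$DOCKERHUB_USERNAME/plane-frontend:latest"
else
  TAG="$FRONTEND_TAG"
fi
echo "$TAG"
```

With these stand-in values the release branch wins, so `TAG` carries both the `stable` and versioned tags in one comma-separated list, which is the format `docker/build-push-action` accepts for multiple tags.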


@@ -1,104 +1,48 @@
-name: Build and Lint on Pull Request
+name: Build Pull Request Contents
 on:
-  workflow_dispatch:
   pull_request:
     types: ["opened", "synchronize"]
 jobs:
-  get-changed-files:
-    runs-on: ubuntu-latest
-    outputs:
-      apiserver_changed: ${{ steps.changed-files.outputs.apiserver_any_changed }}
-      web_changed: ${{ steps.changed-files.outputs.web_any_changed }}
-      space_changed: ${{ steps.changed-files.outputs.deploy_any_changed }}
+  build-pull-request-contents:
+    name: Build Pull Request Contents
+    runs-on: ubuntu-20.04
+    permissions:
+      pull-requests: read
     steps:
-      - uses: actions/checkout@v3
+      - name: Checkout Repository to Actions
+        uses: actions/checkout@v3.3.0
+      - name: Setup Node.js 18.x
+        uses: actions/setup-node@v2
+        with:
+          node-version: 18.x
+          cache: 'yarn'
       - name: Get changed files
         id: changed-files
-        uses: tj-actions/changed-files@v41
+        uses: tj-actions/changed-files@v38
         with:
           files_yaml: |
             apiserver:
               - apiserver/**
             web:
               - web/**
-              - packages/**
-              - 'package.json'
-              - 'yarn.lock'
-              - 'tsconfig.json'
-              - 'turbo.json'
             deploy:
               - space/**
-              - packages/**
-              - 'package.json'
-              - 'yarn.lock'
-              - 'tsconfig.json'
-              - 'turbo.json'
-  lint-apiserver:
-    needs: get-changed-files
-    runs-on: ubuntu-latest
-    if: needs.get-changed-files.outputs.apiserver_changed == 'true'
-    steps:
-      - uses: actions/checkout@v3
-      - name: Set up Python
-        uses: actions/setup-python@v4
-        with:
-          python-version: '3.x' # Specify the Python version you need
-      - name: Install Pylint
-        run: python -m pip install ruff
-      - name: Install Apiserver Dependencies
-        run: cd apiserver && pip install -r requirements.txt
-      - name: Lint apiserver
-        run: ruff check --fix apiserver
-  lint-web:
-    needs: get-changed-files
-    if: needs.get-changed-files.outputs.web_changed == 'true'
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v3
-      - name: Setup Node.js
-        uses: actions/setup-node@v2
-        with:
-          node-version: 18.x
-      - run: yarn install
-      - run: yarn lint --filter=web
-  lint-space:
-    needs: get-changed-files
-    if: needs.get-changed-files.outputs.space_changed == 'true'
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v3
-      - name: Setup Node.js
-        uses: actions/setup-node@v2
-        with:
-          node-version: 18.x
-      - run: yarn install
-      - run: yarn lint --filter=space
-  build-web:
-    needs: lint-web
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v3
-      - name: Setup Node.js
-        uses: actions/setup-node@v2
-        with:
-          node-version: 18.x
-      - run: yarn install
-      - run: yarn build --filter=web
-  build-space:
-    needs: lint-space
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v3
-      - name: Setup Node.js
-        uses: actions/setup-node@v2
-        with:
-          node-version: 18.x
-      - run: yarn install
-      - run: yarn build --filter=space
+      - name: Build Plane's Main App
+        if: steps.changed-files.outputs.web_any_changed == 'true'
+        run: |
+          yarn
+          yarn build --filter=web
+      - name: Build Plane's Deploy App
+        if: steps.changed-files.outputs.deploy_any_changed == 'true'
+        run: |
+          yarn
+          yarn build --filter=space


@@ -1,45 +0,0 @@
name: Version Change Before Release
on:
  pull_request:
    branches:
      - master
jobs:
  check-version:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          ref: ${{ github.head_ref }}
          fetch-depth: 0
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
      - name: Get PR Branch version
        run: echo "PR_VERSION=$(node -p "require('./package.json').version")" >> $GITHUB_ENV
      - name: Fetch base branch
        run: git fetch origin master:master
      - name: Get Master Branch version
        run: |
          git checkout master
          echo "MASTER_VERSION=$(node -p "require('./package.json').version")" >> $GITHUB_ENV
      - name: Get master branch version and compare
        run: |
          echo "Comparing versions: PR version is $PR_VERSION, Master version is $MASTER_VERSION"
          if [ "$PR_VERSION" == "$MASTER_VERSION" ]; then
            echo "Version in PR branch is the same as in master. Failing the CI."
            exit 1
          else
            echo "Version check passed. Versions are different."
          fi
        env:
          PR_VERSION: ${{ env.PR_VERSION }}
          MASTER_VERSION: ${{ env.MASTER_VERSION }}
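The removed version gate boils down to reading the `version` field of two `package.json` files with the same `node -p` one-liner the workflow uses, then failing when they match. A local dry run of that check against two made-up snapshots (the version numbers and temp-dir layout are illustrative, not from the repo):

```shell
# Two throwaway snapshots standing in for the PR branch and master.
pr_dir=$(mktemp -d)
master_dir=$(mktemp -d)
printf '{ "version": "0.14.0" }\n' > "$pr_dir/package.json"
printf '{ "version": "0.13.2" }\n' > "$master_dir/package.json"

# Same extraction one-liner as the workflow's "Get ... version" steps.
PR_VERSION=$(node -p "require('$pr_dir/package.json').version")
MASTER_VERSION=$(node -p "require('$master_dir/package.json').version")

echo "Comparing versions: PR version is $PR_VERSION, Master version is $MASTER_VERSION"
if [ "$PR_VERSION" = "$MASTER_VERSION" ]; then
  echo "Version in PR branch is the same as in master. Failing the CI."
  exit 1
else
  echo "Version check passed. Versions are different."
fi
```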


@@ -1,13 +1,13 @@
 name: "CodeQL"
 on:
-  workflow_dispatch:
   push:
-    branches: ["develop", "preview", "master"]
+    branches: [ 'develop', 'hot-fix', 'stage-release' ]
   pull_request:
-    branches: ["develop", "preview", "master"]
+    # The branches below must be a subset of the branches above
+    branches: [ 'develop' ]
   schedule:
-    - cron: "53 19 * * 5"
+    - cron: '53 19 * * 5'
 jobs:
   analyze:
@@ -21,44 +21,45 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        language: ["python", "javascript"]
+        language: [ 'python', 'javascript' ]
         # CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
         # Use only 'java' to analyze code written in Java, Kotlin or both
         # Use only 'javascript' to analyze code written in JavaScript, TypeScript or both
         # Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
     steps:
       - name: Checkout repository
         uses: actions/checkout@v3
       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
         uses: github/codeql-action/init@v2
         with:
           languages: ${{ matrix.language }}
           # If you wish to specify custom queries, you can do so here or in a config file.
           # By default, queries listed here will override any specified in a config file.
           # Prefix the list here with "+" to use these queries and those in the config file.
           # For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
           # queries: security-extended,security-and-quality
       # Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
       # If this step fails, then you should remove it and run the build manually (see below)
       - name: Autobuild
         uses: github/codeql-action/autobuild@v2
       # Command-line programs to run using the OS shell.
       # 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
       # If the Autobuild fails above, remove it and uncomment the following three lines.
       # modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.
       # - run: |
       #   echo "Run, Build Application using script"
       #   ./location_of_script_within_repo/buildscript.sh
       - name: Perform CodeQL Analysis
         uses: github/codeql-action/analyze@v2
         with:
           category: "/language:${{matrix.language}}"


@@ -1,28 +1,42 @@
-name: Create Sync Action
+name: Create PR in Plane EE Repository to sync the changes

 on:
-  workflow_dispatch:
-  push:
-    branches:
-      - preview
+  pull_request:
+    branches:
+      - master
+    types:
+      - closed

-env:
-  SOURCE_BRANCH_NAME: ${{ github.ref_name }}
-
 jobs:
-  sync_changes:
+  create_pr:
+    # Only run the job when a PR is merged
+    if: github.event.pull_request.merged == true
     runs-on: ubuntu-latest
     permissions:
       pull-requests: write
       contents: read
     steps:
+      - name: Check SOURCE_REPO
+        id: check_repo
+        env:
+          SOURCE_REPO: ${{ secrets.SOURCE_REPO_NAME }}
+        run: |
+          echo "::set-output name=is_correct_repo::$(if [[ "$SOURCE_REPO" == "makeplane/plane" ]]; then echo 'true'; else echo 'false'; fi)"
       - name: Checkout Code
-        uses: actions/checkout@v4.1.1
+        if: steps.check_repo.outputs.is_correct_repo == 'true'
+        uses: actions/checkout@v2
         with:
           persist-credentials: false
           fetch-depth: 0
+      - name: Set up Branch Name
+        if: steps.check_repo.outputs.is_correct_repo == 'true'
+        run: |
+          echo "SOURCE_BRANCH_NAME=${{ github.head_ref }}" >> $GITHUB_ENV
       - name: Setup GH CLI
+        if: steps.check_repo.outputs.is_correct_repo == 'true'
         run: |
           type -p curl >/dev/null || (sudo apt update && sudo apt install curl -y)
           curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg | sudo dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg
@@ -31,25 +45,35 @@ jobs:
           sudo apt update
           sudo apt install gh -y
-      - name: Push Changes to Target Repo A
+      - name: Create Pull Request
+        if: steps.check_repo.outputs.is_correct_repo == 'true'
         env:
           GH_TOKEN: ${{ secrets.ACCESS_TOKEN }}
         run: |
-          TARGET_REPO="${{ secrets.TARGET_REPO_A }}"
-          TARGET_BRANCH="${{ secrets.TARGET_REPO_A_BRANCH_NAME }}"
+          TARGET_REPO="${{ secrets.TARGET_REPO_NAME }}"
+          TARGET_BRANCH="${{ secrets.TARGET_REPO_BRANCH }}"
           SOURCE_BRANCH="${{ env.SOURCE_BRANCH_NAME }}"
           git checkout $SOURCE_BRANCH
-          git remote add target-origin-a "https://$GH_TOKEN@github.com/$TARGET_REPO.git"
-          git push target-origin-a $SOURCE_BRANCH:$TARGET_BRANCH
-      - name: Push Changes to Target Repo B
-        env:
-          GH_TOKEN: ${{ secrets.ACCESS_TOKEN }}
-        run: |
-          TARGET_REPO="${{ secrets.TARGET_REPO_B }}"
-          TARGET_BRANCH="${{ secrets.TARGET_REPO_B_BRANCH_NAME }}"
-          SOURCE_BRANCH="${{ env.SOURCE_BRANCH_NAME }}"
-          git remote add target-origin-b "https://$GH_TOKEN@github.com/$TARGET_REPO.git"
-          git push target-origin-b $SOURCE_BRANCH:$TARGET_BRANCH
+          git remote add target "https://$GH_TOKEN@github.com/$TARGET_REPO.git"
+          git push target $SOURCE_BRANCH:$SOURCE_BRANCH
+
+          PR_TITLE="${{ github.event.pull_request.title }}"
+          PR_BODY="${{ github.event.pull_request.body }}"
+
+          # Remove double quotes
+          PR_TITLE_CLEANED="${PR_TITLE//\"/}"
+          PR_BODY_CLEANED="${PR_BODY//\"/}"
+
+          # Construct PR_BODY_CONTENT using a here-document
+          PR_BODY_CONTENT=$(cat <<EOF
+          $PR_BODY_CLEANED
+          EOF
+          )
+
+          gh pr create \
+            --base $TARGET_BRANCH \
+            --head $SOURCE_BRANCH \
+            --title "[SYNC] $PR_TITLE_CLEANED" \
+            --body "$PR_BODY_CONTENT" \
+            --repo $TARGET_REPO
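The sanitization in the `Create Pull Request` step is plain Bash parameter expansion, so it can be checked in isolation. A small sketch (the sample title below is made up, not from any real PR):

```shell
#!/usr/bin/env bash
# Mirror of the workflow's quote-stripping: remove double quotes from a PR
# title before interpolating it into a `gh pr create` argument.
PR_TITLE='fix: handle "edge" cases'   # hypothetical PR title

# ${var//pattern/replacement} replaces every occurrence of the pattern;
# an empty replacement deletes all double quotes.
PR_TITLE_CLEANED="${PR_TITLE//\"/}"

echo "[SYNC] $PR_TITLE_CLEANED"   # → [SYNC] fix: handle edge cases
```

Note that the `::set-output` command used in the `Check SOURCE_REPO` step is deprecated in GitHub Actions in favor of appending `name=value` lines to the file referenced by `$GITHUB_OUTPUT`.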


@@ -1,199 +0,0 @@
name: Feature Preview
on:
workflow_dispatch:
inputs:
web-build:
required: false
description: 'Build Web'
type: boolean
default: true
space-build:
required: false
description: 'Build Space'
type: boolean
default: false
env:
BUILD_WEB: ${{ github.event.inputs.web-build }}
BUILD_SPACE: ${{ github.event.inputs.space-build }}
jobs:
setup-feature-build:
name: Feature Build Setup
runs-on: ubuntu-latest
steps:
- name: Checkout
run: |
echo "BUILD_WEB=$BUILD_WEB"
echo "BUILD_SPACE=$BUILD_SPACE"
outputs:
web-build: ${{ env.BUILD_WEB}}
space-build: ${{env.BUILD_SPACE}}
feature-build-web:
if: ${{ needs.setup-feature-build.outputs.web-build == 'true' }}
needs: setup-feature-build
name: Feature Build Web
runs-on: ubuntu-latest
env:
AWS_ACCESS_KEY_ID: ${{ vars.FEATURE_PREVIEW_AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.FEATURE_PREVIEW_AWS_SECRET_ACCESS_KEY }}
AWS_BUCKET: ${{ vars.FEATURE_PREVIEW_AWS_BUCKET }}
NEXT_PUBLIC_API_BASE_URL: ${{ vars.FEATURE_PREVIEW_NEXT_PUBLIC_API_BASE_URL }}
steps:
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: '18'
- name: Install AWS cli
run: |
sudo apt-get update
sudo apt-get install -y python3-pip
pip3 install awscli
- name: Checkout
uses: actions/checkout@v4
with:
path: plane
- name: Install Dependencies
run: |
cd $GITHUB_WORKSPACE/plane
yarn install
- name: Build Web
id: build-web
run: |
cd $GITHUB_WORKSPACE/plane
yarn build --filter=web
cd $GITHUB_WORKSPACE
TAR_NAME="web.tar.gz"
tar -czf $TAR_NAME ./plane
FILE_EXPIRY=$(date -u -d "+2 days" +"%Y-%m-%dT%H:%M:%SZ")
aws s3 cp $TAR_NAME s3://${{ env.AWS_BUCKET }}/${{github.sha}}/$TAR_NAME --expires $FILE_EXPIRY
feature-build-space:
if: ${{ needs.setup-feature-build.outputs.space-build == 'true' }}
needs: setup-feature-build
name: Feature Build Space
runs-on: ubuntu-latest
env:
AWS_ACCESS_KEY_ID: ${{ vars.FEATURE_PREVIEW_AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.FEATURE_PREVIEW_AWS_SECRET_ACCESS_KEY }}
AWS_BUCKET: ${{ vars.FEATURE_PREVIEW_AWS_BUCKET }}
NEXT_PUBLIC_DEPLOY_WITH_NGINX: 1
NEXT_PUBLIC_API_BASE_URL: ${{ vars.FEATURE_PREVIEW_NEXT_PUBLIC_API_BASE_URL }}
outputs:
do-build: ${{ needs.setup-feature-build.outputs.space-build }}
s3-url: ${{ steps.build-space.outputs.S3_PRESIGNED_URL }}
steps:
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: '18'
- name: Install AWS cli
run: |
sudo apt-get update
sudo apt-get install -y python3-pip
pip3 install awscli
- name: Checkout
uses: actions/checkout@v4
with:
path: plane
- name: Install Dependencies
run: |
cd $GITHUB_WORKSPACE/plane
yarn install
- name: Build Space
id: build-space
run: |
cd $GITHUB_WORKSPACE/plane
yarn build --filter=space
cd $GITHUB_WORKSPACE
TAR_NAME="space.tar.gz"
tar -czf $TAR_NAME ./plane
FILE_EXPIRY=$(date -u -d "+2 days" +"%Y-%m-%dT%H:%M:%SZ")
aws s3 cp $TAR_NAME s3://${{ env.AWS_BUCKET }}/${{github.sha}}/$TAR_NAME --expires $FILE_EXPIRY
feature-deploy:
if: ${{ always() && (needs.setup-feature-build.outputs.web-build == 'true' || needs.setup-feature-build.outputs.space-build == 'true') }}
needs: [feature-build-web, feature-build-space]
name: Feature Deploy
runs-on: ubuntu-latest
env:
AWS_ACCESS_KEY_ID: ${{ vars.FEATURE_PREVIEW_AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.FEATURE_PREVIEW_AWS_SECRET_ACCESS_KEY }}
AWS_BUCKET: ${{ vars.FEATURE_PREVIEW_AWS_BUCKET }}
KUBE_CONFIG_FILE: ${{ secrets.FEATURE_PREVIEW_KUBE_CONFIG }}
steps:
- name: Install AWS cli
run: |
sudo apt-get update
sudo apt-get install -y python3-pip
pip3 install awscli
- name: Tailscale
uses: tailscale/github-action@v2
with:
oauth-client-id: ${{ secrets.TAILSCALE_OAUTH_CLIENT_ID }}
oauth-secret: ${{ secrets.TAILSCALE_OAUTH_SECRET }}
tags: tag:ci
- name: Kubectl Setup
run: |
curl -LO "https://dl.k8s.io/release/${{ vars.FEATURE_PREVIEW_KUBE_VERSION }}/bin/linux/amd64/kubectl"
chmod +x kubectl
mkdir -p ~/.kube
echo "$KUBE_CONFIG_FILE" > ~/.kube/config
chmod 600 ~/.kube/config
- name: HELM Setup
run: |
curl -fsSL -o get_helm.sh https://raw.githubusercontent.com/helm/helm/main/scripts/get-helm-3
chmod 700 get_helm.sh
./get_helm.sh
- name: App Deploy
run: |
WEB_S3_URL=""
if [ ${{ env.BUILD_WEB }} == true ]; then
WEB_S3_URL=$(aws s3 presign s3://${{ vars.FEATURE_PREVIEW_AWS_BUCKET }}/${{github.sha}}/web.tar.gz --expires-in 3600)
fi
SPACE_S3_URL=""
if [ ${{ env.BUILD_SPACE }} == true ]; then
SPACE_S3_URL=$(aws s3 presign s3://${{ vars.FEATURE_PREVIEW_AWS_BUCKET }}/${{github.sha}}/space.tar.gz --expires-in 3600)
fi
if [ ${{ env.BUILD_WEB }} == true ] || [ ${{ env.BUILD_SPACE }} == true ]; then
helm --kube-insecure-skip-tls-verify repo add feature-preview ${{ vars.FEATURE_PREVIEW_HELM_CHART_URL }}
APP_NAMESPACE="${{ vars.FEATURE_PREVIEW_NAMESPACE }}"
DEPLOY_SCRIPT_URL="${{ vars.FEATURE_PREVIEW_DEPLOY_SCRIPT_URL }}"
METADATA=$(helm --kube-insecure-skip-tls-verify install feature-preview/${{ vars.FEATURE_PREVIEW_HELM_CHART_NAME }} \
--generate-name \
--namespace $APP_NAMESPACE \
--set ingress.primaryDomain=${{vars.FEATURE_PREVIEW_PRIMARY_DOMAIN || 'feature.plane.tools' }} \
--set web.image=${{vars.FEATURE_PREVIEW_DOCKER_BASE}} \
--set web.enabled=${{ env.BUILD_WEB || false }} \
--set web.artifact_url=$WEB_S3_URL \
--set space.image=${{vars.FEATURE_PREVIEW_DOCKER_BASE}} \
--set space.enabled=${{ env.BUILD_SPACE || false }} \
--set space.artifact_url=$SPACE_S3_URL \
--set shared_config.deploy_script_url=$DEPLOY_SCRIPT_URL \
--set shared_config.api_base_url=${{vars.FEATURE_PREVIEW_NEXT_PUBLIC_API_BASE_URL}} \
--output json \
--timeout 1000s)
APP_NAME=$(echo $METADATA | jq -r '.name')
INGRESS_HOSTNAME=$(kubectl get ingress -n feature-builds --insecure-skip-tls-verify \
-o jsonpath='{.items[?(@.metadata.annotations.meta\.helm\.sh\/release-name=="'$APP_NAME'")]}' | \
jq -r '.spec.rules[0].host')
echo "****************************************"
echo "APP NAME ::: $APP_NAME"
echo "INGRESS HOSTNAME ::: $INGRESS_HOSTNAME"
echo "****************************************"
fi
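Both build jobs in the deleted workflow above stamp the uploaded tarball with an expiry computed by GNU `date`. The expression can be sanity-checked on its own (this assumes a GNU userland, as on the `ubuntu-latest` runners):

```shell
#!/usr/bin/env bash
# Reproduce the workflow's FILE_EXPIRY computation: a timestamp two days
# ahead, in the ISO-8601 UTC format passed to `aws s3 cp --expires`.
FILE_EXPIRY=$(date -u -d "+2 days" +"%Y-%m-%dT%H:%M:%SZ")

echo "$FILE_EXPIRY"   # e.g. 2023-12-09T08:34:17Z
```

BSD/macOS `date` does not accept `-d "+2 days"`; there the equivalent would be `date -u -v+2d`, which is one reason this only runs on Linux runners.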

.gitignore

@@ -1,7 +1,3 @@
-pg_data
-redis_data
-minio_data
 node_modules
 .next
@@ -55,7 +51,6 @@ staticfiles
 mediafiles
 .env
 .DS_Store
-logs/
 node_modules/
 assets/dist/


@@ -33,8 +33,8 @@ The backend is a django project which is kept inside apiserver
 1. Clone the repo
 ```bash
-git clone https://github.com/makeplane/plane.git [folder-name]
-cd [folder-name]
+git clone https://github.com/makeplane/plane
+cd plane
 chmod +x setup.sh
 ```
@@ -44,10 +44,32 @@ chmod +x setup.sh
 ./setup.sh
 ```
-3. Start the containers
+3. Define `NEXT_PUBLIC_API_BASE_URL=http://localhost` in **web/.env** and **space/.env** file
 ```bash
-docker compose -f docker-compose-local.yml up
+echo "\nNEXT_PUBLIC_API_BASE_URL=http://localhost\n" >> ./web/.env
+```
+```bash
+echo "\nNEXT_PUBLIC_API_BASE_URL=http://localhost\n" >> ./space/.env
+```
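A caveat on the `echo "\n…"` commands above: Bash's builtin `echo` prints `\n` literally unless given `-e`, while some `sh` implementations (e.g. dash) do expand it, so whether a real newline lands in the `.env` file depends on the shell. `printf` behaves the same everywhere; a portable equivalent for the first command would be:

```shell
#!/usr/bin/env bash
# Append the variable on its own line, preceded by a blank line,
# regardless of which shell runs this.
mkdir -p ./web
printf '\nNEXT_PUBLIC_API_BASE_URL=%s\n' "http://localhost" >> ./web/.env
```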
+4. Run Docker compose up
+```bash
+docker compose up -d
+```
+5. Install dependencies
+```bash
+yarn install
+```
+6. Run the web app in development mode
+```bash
+yarn dev
 ```
 ## Missing a Feature?


@@ -1,110 +1,129 @@
The whole Dockerfile is replaced. Removed version (asdf/s6-overlay-based image):

FROM git.orionkindel.com/tpl/asdf:bookworm AS system

ARG S6_OVERLAY_VERSION=3.1.6.2

ADD https://github.com/just-containers/s6-overlay/releases/download/v${S6_OVERLAY_VERSION}/s6-overlay-noarch.tar.xz /tmp
RUN tar -C / -Jxpf /tmp/s6-overlay-noarch.tar.xz
ADD https://github.com/just-containers/s6-overlay/releases/download/v${S6_OVERLAY_VERSION}/s6-overlay-x86_64.tar.xz /tmp
RUN tar -C / -Jxpf /tmp/s6-overlay-x86_64.tar.xz

RUN apt-get update
RUN apt-get install -y \
    build-essential \
    zlib1g-dev \
    libncurses5-dev \
    libgdbm-dev \
    libnss3-dev \
    libssl-dev \
    libreadline-dev \
    libffi-dev \
    libsqlite3-dev \
    wget \
    libbz2-dev \
    uuid-dev \
    nginx \
    procps

RUN asdf plugin add nodejs \
    && asdf plugin add python \
    && asdf plugin add postgres

RUN --mount=type=cache,target=/.asdf-build \
    export ASDF_DOWNLOAD_PATH=/.asdf-build \
    && export TMPDIR=/.asdf-build \
    && export POSTGRES_SKIP_INITDB=y \
    && asdf install nodejs 20.9.0 \
    && asdf install python 3.11.1 \
    && asdf install postgres 15.3

RUN asdf global nodejs 20.9.0 \
    && asdf global postgres 15.3 \
    && asdf global python 3.11.1

RUN useradd -m postgres && passwd -d postgres

ADD https://dl.min.io/server/minio/release/linux-amd64/minio /usr/bin
RUN chmod +x /usr/bin/minio

RUN set -eo pipefail; \
    curl -fsSL https://packages.redis.io/gpg | gpg --dearmor -o /usr/share/keyrings/redis-archive-keyring.gpg; \
    echo "deb [signed-by=/usr/share/keyrings/redis-archive-keyring.gpg] https://packages.redis.io/deb bookworm main" | tee /etc/apt/sources.list.d/redis.list; \
    apt-get update; \
    apt-get install -y redis

FROM system AS next_prebuild

RUN npm i -g yarn
RUN --mount=type=cache,target=/.yarn-cache \
    yarn config set cache-folder /.yarn-cache

COPY package.json turbo.json yarn.lock app.json ./
COPY packages packages
COPY web web
COPY space space

RUN --mount=type=cache,target=/.yarn-cache \
    yarn install

FROM next_prebuild AS next_build

RUN --mount=type=cache,target=/.yarn-cache \
    --mount=type=cache,target=/web/.next \
    --mount=type=cache,target=/space/.next \
    yarn build && \
    cp -R /web/.next /web/_next && \
    cp -R /space/.next /space/_next

RUN mv /web/_next /web/.next && \
    mv /space/_next /space/.next && \
    cp -R /web/.next/standalone/web/* /web/ && \
    cp -R /space/.next/standalone/space/* /space/

FROM next_build AS api_build
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
ENV PIP_DISABLE_PIP_VERSION_CHECK=1

COPY apiserver apiserver
RUN --mount=type=cache,target=/root/.cache/pip \
    cd /apiserver \
    && pip install -r requirements.txt --compile

FROM api_build AS s6

COPY docker/etc/ /etc/

RUN chmod -R 777 /root \
    && chmod -R 777 /root/.asdf \
    && chmod -x /root/.asdf/lib/commands/* \
    && chmod -R 777 /apiserver \
    && chmod -R 777 /web \
    && chmod -R 777 /space \
    && ln $(asdf which postgres) /usr/bin/postgres \
    && ln $(asdf which initdb) /usr/bin/initdb \
    && ln $(asdf which node) /usr/bin/node \
    && ln $(asdf which npm) /usr/bin/npm \
    && ln $(asdf which python) /usr/bin/python

ENV S6_KEEP_ENV=1
ENTRYPOINT ["/init"]

Added version (node:18-alpine multi-stage image):

FROM node:18-alpine AS builder
RUN apk add --no-cache libc6-compat
# Set working directory
WORKDIR /app
ENV NEXT_PUBLIC_API_BASE_URL=http://NEXT_PUBLIC_API_BASE_URL_PLACEHOLDER

RUN yarn global add turbo
RUN apk add tree
COPY . .

RUN turbo prune --scope=app --scope=plane-deploy --docker
CMD tree -I node_modules/

# Add lockfile and package.json's of isolated subworkspace
FROM node:18-alpine AS installer

RUN apk add --no-cache libc6-compat
WORKDIR /app
ARG NEXT_PUBLIC_API_BASE_URL=http://localhost:8000
# First install the dependencies (as they change less often)
COPY .gitignore .gitignore
COPY --from=builder /app/out/json/ .
COPY --from=builder /app/out/yarn.lock ./yarn.lock
RUN yarn install

# # Build the project
COPY --from=builder /app/out/full/ .
COPY turbo.json turbo.json
COPY replace-env-vars.sh /usr/local/bin/
USER root
RUN chmod +x /usr/local/bin/replace-env-vars.sh

RUN yarn turbo run build

ENV NEXT_PUBLIC_API_BASE_URL=$NEXT_PUBLIC_API_BASE_URL \
    BUILT_NEXT_PUBLIC_API_BASE_URL=$NEXT_PUBLIC_API_BASE_URL

RUN /usr/local/bin/replace-env-vars.sh http://NEXT_PUBLIC_WEBAPP_URL_PLACEHOLDER ${NEXT_PUBLIC_API_BASE_URL}

FROM python:3.11.1-alpine3.17 AS backend

# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PIP_DISABLE_PIP_VERSION_CHECK=1

WORKDIR /code

RUN apk --no-cache add \
    "libpq~=15" \
    "libxslt~=1.1" \
    "nodejs-current~=19" \
    "xmlsec~=1.2" \
    "nginx" \
    "nodejs" \
    "npm" \
    "supervisor"

COPY apiserver/requirements.txt ./
COPY apiserver/requirements ./requirements
RUN apk add --no-cache libffi-dev
RUN apk add --no-cache --virtual .build-deps \
    "bash~=5.2" \
    "g++~=12.2" \
    "gcc~=12.2" \
    "cargo~=1.64" \
    "git~=2" \
    "make~=4.3" \
    "postgresql13-dev~=13" \
    "libc-dev" \
    "linux-headers" \
    && \
    pip install -r requirements.txt --compile --no-cache-dir \
    && \
    apk del .build-deps

# Add in Django deps and generate Django's static files
COPY apiserver/manage.py manage.py
COPY apiserver/plane plane/
COPY apiserver/templates templates/

RUN apk --no-cache add "bash~=5.2"
COPY apiserver/bin ./bin/

RUN chmod +x ./bin/takeoff ./bin/worker
RUN chmod -R 777 /code

# Expose container port and run entry point script

WORKDIR /app

# Don't run production as root
RUN addgroup --system --gid 1001 plane
RUN adduser --system --uid 1001 captain

COPY --from=installer /app/apps/app/next.config.js .
COPY --from=installer /app/apps/app/package.json .
COPY --from=installer /app/apps/space/next.config.js .
COPY --from=installer /app/apps/space/package.json .

COPY --from=installer --chown=captain:plane /app/apps/app/.next/standalone ./
COPY --from=installer --chown=captain:plane /app/apps/app/.next/static ./apps/app/.next/static

COPY --from=installer --chown=captain:plane /app/apps/space/.next/standalone ./
COPY --from=installer --chown=captain:plane /app/apps/space/.next ./apps/space/.next

ENV NEXT_TELEMETRY_DISABLED 1
# RUN rm /etc/nginx/conf.d/default.conf
#######################################################################
COPY nginx/nginx-single-docker-image.conf /etc/nginx/http.d/default.conf
#######################################################################
COPY nginx/supervisor.conf /code/supervisor.conf

ARG NEXT_PUBLIC_API_BASE_URL=http://localhost:8000
ENV NEXT_PUBLIC_API_BASE_URL=$NEXT_PUBLIC_API_BASE_URL \
    BUILT_NEXT_PUBLIC_API_BASE_URL=$NEXT_PUBLIC_API_BASE_URL

USER root
COPY replace-env-vars.sh /usr/local/bin/
COPY start.sh /usr/local/bin/
RUN chmod +x /usr/local/bin/replace-env-vars.sh
RUN chmod +x /usr/local/bin/start.sh

EXPOSE 80

CMD ["supervisord","-c","/code/supervisor.conf"]


@@ -49,10 +49,25 @@ NGINX_PORT=80
 ```
+# Enable/Disable OAUTH - default 0 for selfhosted instance
+NEXT_PUBLIC_ENABLE_OAUTH=0
 # Public boards deploy URL
 NEXT_PUBLIC_DEPLOY_URL="http://localhost/spaces"
 ```
+
+## {PROJECT_FOLDER}/spaces/.env.example
+
+```
+# Flag to toggle OAuth
+NEXT_PUBLIC_ENABLE_OAUTH=0
+```
+
 ## {PROJECT_FOLDER}/apiserver/.env

README.md

@@ -7,7 +7,7 @@
 </p>
 <h3 align="center"><b>Plane</b></h3>
-<p align="center"><b>Open-source project management that unlocks customer value.</b></p>
+<p align="center"><b>Flexible, extensible open-source project management</b></p>
 <p align="center">
 <a href="https://discord.com/invite/A92xrEGCge">
@@ -16,13 +16,6 @@
 <img alt="Commit activity per month" src="https://img.shields.io/github/commit-activity/m/makeplane/plane?style=for-the-badge" />
 </p>
-<p align="center">
-    <a href="https://dub.sh/plane-website-readme"><b>Website</b></a>
-    <a href="https://git.new/releases"><b>Releases</b></a>
-    <a href="https://dub.sh/planepowershq"><b>Twitter</b></a>
-    <a href="https://dub.sh/planedocs"><b>Documentation</b></a>
-</p>
 <p>
 <a href="https://app.plane.so/#gh-light-mode-only" target="_blank">
 <img
@@ -40,90 +33,56 @@
 </a>
 </p>
-Meet [Plane](https://dub.sh/plane-website-readme). An open-source software development tool to manage issues, sprints, and product roadmaps with peace of mind. 🧘‍♀️
+Meet [Plane](https://plane.so). An open-source software development tool to manage issues, sprints, and product roadmaps with peace of mind 🧘‍♀️.
-> Plane is still in its early days, not everything will be perfect yet, and hiccups may happen. Please let us know of any suggestions, ideas, or bugs that you encounter on our [Discord](https://discord.com/invite/A92xrEGCge) or GitHub issues, and we will use your feedback to improve in our upcoming releases.
+> Plane is still in its early days, not everything will be perfect yet, and hiccups may happen. Please let us know of any suggestions, ideas, or bugs that you encounter on our [Discord](https://discord.com/invite/A92xrEGCge) or GitHub issues, and we will use your feedback to improve on our upcoming releases.
-## ⚡ Installation
-The easiest way to get started with Plane is by creating a [Plane Cloud](https://app.plane.so) account where we offer a hosted solution for users.
-If you want more control over your data, prefer to self-host Plane, please refer to our [deployment documentation](https://docs.plane.so/docker-compose).
-| Installation Methods | Documentation Link |
-| -------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| Docker | [![Docker](https://img.shields.io/badge/docker-%230db7ed.svg?style=for-the-badge&logo=docker&logoColor=white)](https://docs.plane.so/self-hosting/methods/docker-compose) |
-| Kubernetes | [![Kubernetes](https://img.shields.io/badge/kubernetes-%23326ce5.svg?style=for-the-badge&logo=kubernetes&logoColor=white)](https://docs.plane.so/kubernetes) |
-`Instance admin` can configure instance settings using our [God-mode](https://docs.plane.so/instance-admin) feature.
-## 🚀 Features
-- **Issues**: Quickly create issues and add details using a powerful rich text editor that supports file uploads. Add sub-properties and references to problems for better organization and tracking.
-- **Cycles**:
-  Keep up your team's momentum with Cycles. Gain insights into your project's progress with burn-down charts and other valuable features.
-- **Modules**: Break down your large projects into smaller, more manageable modules. Assign modules between teams to track and plan your project's progress easily.
-- **Views**: Create custom filters to display only the issues that matter to you. Save and share your filters in just a few clicks.
-- **Pages**: Plane pages, equipped with AI and a rich text editor, let you jot down your thoughts on the fly. Format your text, upload images, hyperlink, or sync your existing ideas into an actionable item or issue.
-- **Analytics**: Get insights into all your Plane data in real-time. Visualize issue data to spot trends, remove blockers, and progress your work.
-- **Drive** (_coming soon_): The drive helps you share documents, images, videos, or any other files that make sense to you or your team and align on the problem/solution.
-## 🛠️ Quick start for contributors
-> Development system must have docker engine installed and running.
-Setting up local environment is extremely easy and straight forward. Follow the below step and you will be ready to contribute -
-1. Clone the code locally using:
-   ```
-   git clone https://github.com/makeplane/plane.git
-   ```
-2. Switch to the code folder:
-   ```
-   cd plane
-   ```
-3. Create your feature or fix branch you plan to work on using:
-   ```
-   git checkout -b <feature-branch-name>
-   ```
-4. Open terminal and run:
-   ```
-   ./setup.sh
-   ```
-5. Open the code on VSCode or similar equivalent IDE.
-6. Review the `.env` files available in various folders.
-   Visit [Environment Setup](./ENV_SETUP.md) to know about various environment variables used in system.
-7. Run the docker command to initiate services:
-   ```
-   docker compose -f docker-compose-local.yml up -d
-   ```
-You are ready to make changes to the code. Do not forget to refresh the browser (in case it does not auto-reload).
+The easiest way to get started with Plane is by creating a [Plane Cloud](https://app.plane.so) account. Plane Cloud offers a hosted solution for Plane. If you prefer to self-host Plane, please refer to our [deployment documentation](https://docs.plane.so/self-hosting).
+## ⚡️ Contributors Quick Start
+### Prerequisite
+Development system must have docker engine installed and running.
+### Steps
+Setting up local environment is extremely easy and straight forward. Follow the below step and you will be ready to contribute
+1. Clone the code locally using `git clone https://github.com/makeplane/plane.git`
+1. Switch to the code folder `cd plane`
+1. Create your feature or fix branch you plan to work on using `git checkout -b <feature-branch-name>`
+1. Open terminal and run `./setup.sh`
+1. Open the code on VSCode or similar equivalent IDE
+1. Review the `.env` files available in various folders. Visit [Environment Setup](./ENV_SETUP.md) to know about various environment variables used in system
+1. Run the docker command to initiate various services `docker compose -f docker-compose-local.yml up -d`
+You are ready to make changes to the code. Do not forget to refresh the browser (in case id does not auto-reload)
 Thats it!
-## ❤️ Community
-The Plane community can be found on [GitHub Discussions](https://github.com/orgs/makeplane/discussions), and our [Discord server](https://discord.com/invite/A92xrEGCge). Our [Code of conduct](https://github.com/makeplane/plane/blob/master/CODE_OF_CONDUCT.md) applies to all Plane community chanels.
-Ask questions, report bugs, join discussions, voice ideas, make feature requests, or share your projects.
-### Repo Activity
-![Plane Repo Activity](https://repobeats.axiom.co/api/embed/2523c6ed2f77c082b7908c33e2ab208981d76c39.svg "Repobeats analytics image")
+## 🍙 Self Hosting
+For self hosting environment setup, visit the [Self Hosting](https://docs.plane.so/self-hosting) documentation page
+## 🚀 Features
+- **Issue Planning and Tracking**: Quickly create issues and add details using a powerful rich text editor that supports file uploads. Add sub-properties and references to issues for better organization and tracking.
+- **Issue Attachments**: Collaborate effectively by attaching files to issues, making it easy for your team to find and share important project-related documents.
+- **Layouts**: Customize your project view with your preferred layout - choose from List, Kanban, or Calendar to visualize your project in a way that makes sense to you.
+- **Cycles**: Plan sprints with Cycles to keep your team on track and productive. Gain insights into your project's progress with burn-down charts and other useful features.
+- **Modules**: Break down your large projects into smaller, more manageable modules. Assign modules between teams to easily track and plan your project's progress.
+- **Views**: Create custom filters to display only the issues that matter to you. Save and share your filters in just a few clicks.
+- **Pages**: Plane pages function as an AI-powered notepad, allowing you to easily document issues, cycle plans, and module details, and then synchronize them with your issues.
+- **Command K**: Enjoy a better user experience with the new Command + K menu. Easily manage and navigate through your projects from one convenient location.
+- **GitHub Sync**: Streamline your planning process by syncing your GitHub issues with Plane. Keep all your issues in one place for better tracking and collaboration.
 ## 📸 Screenshots
 <p>
 <a href="https://plane.so" target="_blank">
 <img
-src="https://ik.imagekit.io/w2okwbtu2/Issues_rNZjrGgFl.png?updatedAt=1709298765880"
+src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_views_dark_mode.webp"
 alt="Plane Views"
 width="100%"
 />
@@ -132,7 +91,8 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
 <p>
 <a href="https://plane.so" target="_blank">
 <img
-src="https://ik.imagekit.io/w2okwbtu2/Cycles_jCDhqmTl9.png?updatedAt=1709298780697"
+src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_issue_detail_dark_mode.webp"
+alt="Plane Issue Details"
 width="100%"
 />
 </a>
@@ -140,7 +100,7 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
 <p>
 <a href="https://plane.so" target="_blank">
 <img
-src="https://ik.imagekit.io/w2okwbtu2/Modules_PSCVsbSfI.png?updatedAt=1709298796783"
+src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_cycles_modules_dark_mode.webp"
 alt="Plane Cycles and Modules"
 width="100%"
 />
@@ -149,7 +109,7 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
 <p>
 <a href="https://plane.so" target="_blank">
 <img
-src="https://ik.imagekit.io/w2okwbtu2/Views_uxXsRatS4.png?updatedAt=1709298834522"
+src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_analytics_dark_mode.webp"
 alt="Plane Analytics"
 width="100%"
 />
@@ -158,7 +118,7 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
 <p>
 <a href="https://plane.so" target="_blank">
 <img
-src="https://ik.imagekit.io/w2okwbtu2/Analytics_0o22gLRtp.png?updatedAt=1709298834389"
+src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_pages_dark_mode.webp"
 alt="Plane Pages"
 width="100%"
 />
@@ -168,7 +128,7 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
 <p>
 <a href="https://plane.so" target="_blank">
 <img
-src="https://ik.imagekit.io/w2okwbtu2/Drive_LlfeY4xn3.png?updatedAt=1709298837917"
+src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_commad_k_dark_mode.webp"
 alt="Plane Command Menu"
 width="100%"
 />
@@ -176,23 +136,20 @@
 </p>
 </p>
+## 📚Documentation
+For full documentation, visit [docs.plane.so](https://docs.plane.so/)
+To see how to Contribute, visit [here](https://github.com/makeplane/plane/blob/master/CONTRIBUTING.md).
+## ❤️ Community
+The Plane community can be found on GitHub Discussions, where you can ask questions, voice ideas, and share your projects.
+To chat with other community members you can join the [Plane Discord](https://discord.com/invite/A92xrEGCge).
+Our [Code of Conduct](https://github.com/makeplane/plane/blob/master/CODE_OF_CONDUCT.md) applies to all Plane community channels.
 ## ⛓️ Security
-If you believe you have found a security vulnerability in Plane, we encourage you to responsibly disclose this and not open a public issue. We will investigate all legitimate reports.
-Email squawk@plane.so to disclose any security vulnerabilities.
-## ❤️ Contribute
-There are many ways to contribute to Plane, including:
-- Submitting [bugs](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%F0%9F%90%9Bbug&projects=&template=--bug-report.yaml&title=%5Bbug%5D%3A+) and [feature requests](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%E2%9C%A8feature&projects=&template=--feature-request.yaml&title=%5Bfeature%5D%3A+) for various components.
-- Reviewing [the documentation](https://docs.plane.so/) and submitting [pull requests](https://github.com/makeplane/plane), from fixing typos to adding new features.
-- Speaking or writing about Plane or any other ecosystem integration and [letting us know](https://discord.com/invite/A92xrEGCge)!
-- Upvoting [popular feature requests](https://github.com/makeplane/plane/issues) to show your support.
-### We couldn't have done this without you.
-<a href="https://github.com/makeplane/plane/graphs/contributors">
-  <img src="https://contrib.rocks/image?repo=makeplane/plane" />
-</a>
+If you believe you have found a security vulnerability in Plane, we encourage you to responsibly disclose this and not open a public issue. We will investigate all legitimate reports. Email engineering@plane.so to disclose any security vulnerabilities.


@@ -1,44 +0,0 @@
# Security Policy
This document outlines security procedures and vulnerabilities reporting for the Plane project.
At Plane, safeguarding the security of our systems is a top priority. Despite our efforts, vulnerabilities may still exist. We greatly appreciate your assistance in identifying and reporting any such vulnerabilities to help us maintain the integrity of our systems and protect our clients.
To report a security vulnerability, please email us directly at security@plane.so with a detailed description of the vulnerability and steps to reproduce it. Please refrain from disclosing the vulnerability publicly until we have had an opportunity to review and address it.
## Out of Scope Vulnerabilities
We appreciate your help in identifying vulnerabilities. However, please note that the following types of vulnerabilities are considered out of scope:
- Attacks requiring MITM or physical access to a user's device.
- Content spoofing and text injection issues without demonstrating an attack vector or ability to modify HTML/CSS.
- Email spoofing.
- Missing DNSSEC, CAA, CSP headers.
- Lack of Secure or HTTP only flag on non-sensitive cookies.
## Reporting Process
If you discover a vulnerability, please adhere to the following reporting process:
1. Email your findings to security@plane.so.
2. Refrain from running automated scanners on our infrastructure or dashboard without prior consent. Contact us to set up a sandbox environment if necessary.
3. Do not exploit the vulnerability for malicious purposes, such as downloading excessive data or altering user data.
4. Maintain confidentiality and refrain from disclosing the vulnerability until it has been resolved.
5. Avoid using physical security attacks, social engineering, distributed denial of service, spam, or third-party applications.
When reporting a vulnerability, please provide sufficient information to allow us to reproduce and address the issue promptly. Include the IP address or URL of the affected system, along with a detailed description of the vulnerability.
## Our Commitment
We are committed to promptly addressing reported vulnerabilities and maintaining open communication throughout the resolution process. Here's what you can expect from us:
- **Response Time:** We will acknowledge receipt of your report within three business days and provide an expected resolution date.
- **Legal Protection:** We will not pursue legal action against you for reporting vulnerabilities, provided you adhere to the reporting guidelines.
- **Confidentiality:** Your report will be treated with strict confidentiality. We will not disclose your personal information to third parties without your consent.
- **Progress Updates:** We will keep you informed of our progress in resolving the reported vulnerability.
- **Recognition:** With your permission, we will publicly acknowledge you as the discoverer of the vulnerability.
- **Timely Resolution:** We strive to resolve all reported vulnerabilities promptly and will actively participate in the publication process once the issue is resolved.
We appreciate your cooperation in helping us maintain the security of our systems and protecting our clients. Thank you for your contributions to our security efforts.
reference: https://supabase.com/.well-known/security.txt

@@ -8,12 +8,16 @@ SENTRY_DSN=""
 SENTRY_ENVIRONMENT="development"
 # Database Settings
-POSTGRES_USER="plane"
-POSTGRES_PASSWORD="plane"
-POSTGRES_HOST="plane-db"
-POSTGRES_DB="plane"
-DATABASE_URL=postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}/${POSTGRES_DB}
+PGUSER="plane"
+PGPASSWORD="plane"
+PGHOST="plane-db"
+PGDATABASE="plane"
+DATABASE_URL=postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}/${PGDATABASE}
+# Oauth variables
+GOOGLE_CLIENT_ID=""
+GITHUB_CLIENT_ID=""
+GITHUB_CLIENT_SECRET=""
 # Redis Settings
 REDIS_HOST="plane-redis"
@@ -30,6 +34,14 @@ AWS_S3_BUCKET_NAME="uploads"
 # Maximum file upload limit
 FILE_SIZE_LIMIT=5242880
+# GPT settings
+OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
+OPENAI_API_KEY="sk-" # deprecated
+GPT_ENGINE="gpt-3.5-turbo" # deprecated
+# Github
+GITHUB_CLIENT_SECRET="" # For fetching release notes
 # Settings related to Docker
 DOCKERIZED=1 # deprecated
@@ -39,8 +51,19 @@ USE_MINIO=1
 # Nginx Configuration
 NGINX_PORT=80
+# SignUps
+ENABLE_SIGNUP="1"
+# Enable Email/Password Signup
+ENABLE_EMAIL_PASSWORD="1"
+# Enable Magic link Login
+ENABLE_MAGIC_LINK_LOGIN="0"
 # Email redirections and minio domain settings
 WEB_URL="http://localhost"
 # Gunicorn Workers
 GUNICORN_WORKERS=2
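The hunk above renames the discrete Postgres variables and recomposes `DATABASE_URL` from them. A minimal Python sketch of the same composition (the helper function is illustrative, not part of the repo):

```python
def build_database_url(env: dict) -> str:
    # Mirrors the .env line: postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}/${PGDATABASE}
    return (
        f"postgresql://{env['PGUSER']}:{env['PGPASSWORD']}"
        f"@{env['PGHOST']}/{env['PGDATABASE']}"
    )

env = {"PGUSER": "plane", "PGPASSWORD": "plane", "PGHOST": "plane-db", "PGDATABASE": "plane"}
print(build_database_url(env))  # postgresql://plane:plane@plane-db/plane
```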

@@ -32,19 +32,27 @@ RUN apk add --no-cache --virtual .build-deps \
 apk del .build-deps
+RUN addgroup -S plane && \
+    adduser -S captain -G plane
+RUN chown captain.plane /code
+USER captain
 # Add in Django deps and generate Django's static files
 COPY manage.py manage.py
 COPY plane plane/
 COPY templates templates/
 COPY package.json package.json
+USER root
 RUN apk --no-cache add "bash~=5.2"
 COPY ./bin ./bin/
+RUN mkdir -p /code/plane/logs
 RUN chmod +x ./bin/takeoff ./bin/worker ./bin/beat
 RUN chmod -R 777 /code
+USER captain
 # Expose container port and run entry point script
 EXPOSE 8000

@@ -27,19 +27,26 @@ WORKDIR /code
 COPY requirements.txt ./requirements.txt
 ADD requirements ./requirements
-# Install the local development settings
-RUN pip install -r requirements/local.txt --compile --no-cache-dir
-COPY . .
-RUN mkdir -p /code/plane/logs
-RUN chmod -R +x /code/bin
-USER root
-# RUN chmod +x ./bin/takeoff ./bin/worker ./bin/beat
+RUN pip install -r requirements.txt --compile --no-cache-dir
+RUN addgroup -S plane && \
+    adduser -S captain -G plane
+RUN chown captain.plane /code
+USER captain
+# Add in Django deps and generate Django's static files
 RUN chmod -R 777 /code
-USER captain
 # Expose container port and run entry point script
 EXPOSE 8000
-CMD [ "./bin/takeoff.local" ]
+# CMD [ "./bin/takeoff" ]

@@ -26,9 +26,7 @@ def update_description():
             updated_issues.append(issue)
         Issue.objects.bulk_update(
-            updated_issues,
-            ["description_html", "description_stripped"],
-            batch_size=100,
+            updated_issues, ["description_html", "description_stripped"], batch_size=100
         )
         print("Success")
     except Exception as e:
@@ -42,9 +40,7 @@ def update_comments():
         updated_issue_comments = []
         for issue_comment in issue_comments:
-            issue_comment.comment_html = (
-                f"<p>{issue_comment.comment_stripped}</p>"
-            )
+            issue_comment.comment_html = f"<p>{issue_comment.comment_stripped}</p>"
             updated_issue_comments.append(issue_comment)
         IssueComment.objects.bulk_update(
@@ -103,9 +99,7 @@ def updated_issue_sort_order():
             issue.sort_order = issue.sequence_id * random.randint(100, 500)
             updated_issues.append(issue)
-        Issue.objects.bulk_update(
-            updated_issues, ["sort_order"], batch_size=100
-        )
+        Issue.objects.bulk_update(updated_issues, ["sort_order"], batch_size=100)
         print("Success")
     except Exception as e:
         print(e)
@@ -143,9 +137,7 @@ def update_project_cover_images():
             project.cover_image = project_cover_images[random.randint(0, 19)]
             updated_projects.append(project)
-        Project.objects.bulk_update(
-            updated_projects, ["cover_image"], batch_size=100
-        )
+        Project.objects.bulk_update(updated_projects, ["cover_image"], batch_size=100)
         print("Success")
     except Exception as e:
         print(e)
@@ -182,7 +174,7 @@ def update_label_color():
         labels = Label.objects.filter(color="")
         updated_labels = []
         for label in labels:
-            label.color = f"#{random.randint(0, 0xFFFFFF+1):06X}"
+            label.color = "#" + "%06x" % random.randint(0, 0xFFFFFF)
             updated_labels.append(label)
         Label.objects.bulk_update(updated_labels, ["color"], batch_size=100)
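Both sides of the last hunk generate a random six-digit hex color for labels. Note that `random.randint` is inclusive at both ends, so an upper bound of `0xFFFFFF + 1` can occasionally yield a seven-digit value; a sketch of the safe form (the helper name is illustrative):

```python
import random

def random_hex_color(rng: random.Random) -> str:
    # randint is inclusive, so 0xFFFFFF keeps the value within six hex digits
    return "#%06x" % rng.randint(0, 0xFFFFFF)

print(random_hex_color(random.Random(42)))  # always "#" plus six hex digits
```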
@@ -194,9 +186,7 @@ def update_label_color():
 def create_slack_integration():
     try:
-        _ = Integration.objects.create(
-            provider="slack", network=2, title="Slack"
-        )
+        _ = Integration.objects.create(provider="slack", network=2, title="Slack")
         print("Success")
     except Exception as e:
         print(e)
@@ -222,16 +212,12 @@ def update_integration_verified():
 def update_start_date():
     try:
-        issues = Issue.objects.filter(
-            state__group__in=["started", "completed"]
-        )
+        issues = Issue.objects.filter(state__group__in=["started", "completed"])
         updated_issues = []
         for issue in issues:
             issue.start_date = issue.created_at.date()
             updated_issues.append(issue)
-        Issue.objects.bulk_update(
-            updated_issues, ["start_date"], batch_size=500
-        )
+        Issue.objects.bulk_update(updated_issues, ["start_date"], batch_size=500)
         print("Success")
     except Exception as e:
         print(e)
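Every `bulk_update(...)` call in this script passes `batch_size`, which makes Django flush the updates in fixed-size chunks rather than a single giant statement. The chunking itself can be sketched in plain Python (the helper name is illustrative, not Django API):

```python
def chunked(items, batch_size):
    # Yield successive batch_size-sized slices, the way bulk_update batches rows
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

print([len(batch) for batch in chunked(list(range(250)), 100)])  # [100, 100, 50]
```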

apiserver/bin/beat Executable file → Normal file
@@ -2,7 +2,4 @@
 set -e
 python manage.py wait_for_db
-# Wait for migrations
-python manage.py wait_for_migrations
-# Run the processes
 celery -A plane beat -l info

@@ -1,8 +1,7 @@
 #!/bin/bash
 set -e
 python manage.py wait_for_db
-# Wait for migrations
-python manage.py wait_for_migrations
+python manage.py migrate
 # Create the default bucket
 #!/bin/bash
@@ -21,15 +20,11 @@ SIGNATURE=$(echo "$HOSTNAME$MAC_ADDRESS$CPU_INFO$MEMORY_INFO$DISK_INFO" | sha256
 export MACHINE_SIGNATURE=$SIGNATURE
 # Register instance
-python manage.py register_instance "$MACHINE_SIGNATURE"
+python manage.py register_instance $MACHINE_SIGNATURE
 # Load the configuration variable
 python manage.py configure_instance
 # Create the default bucket
 python manage.py create_bucket
-# Clear Cache before starting to remove stale values
-python manage.py clear_cache
-exec gunicorn -w "$GUNICORN_WORKERS" -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:"${PORT:-8000}" --max-requests 1200 --max-requests-jitter 1000 --access-logfile -
+exec gunicorn -w $GUNICORN_WORKERS -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:${PORT:-8000} --max-requests 1200 --max-requests-jitter 1000 --access-logfile -


@@ -1,35 +0,0 @@
#!/bin/bash
set -e
python manage.py wait_for_db
# Wait for migrations
python manage.py wait_for_migrations
# Create the default bucket
#!/bin/bash
# Collect system information
HOSTNAME=$(hostname)
MAC_ADDRESS=$(ip link show | awk '/ether/ {print $2}' | head -n 1)
CPU_INFO=$(cat /proc/cpuinfo)
MEMORY_INFO=$(free -h)
DISK_INFO=$(df -h)
# Concatenate information and compute SHA-256 hash
SIGNATURE=$(echo "$HOSTNAME$MAC_ADDRESS$CPU_INFO$MEMORY_INFO$DISK_INFO" | sha256sum | awk '{print $1}')
# Export the variables
export MACHINE_SIGNATURE=$SIGNATURE
# Register instance
python manage.py register_instance "$MACHINE_SIGNATURE"
# Load the configuration variable
python manage.py configure_instance
# Create the default bucket
python manage.py create_bucket
# Clear Cache before starting to remove stale values
python manage.py clear_cache
python manage.py runserver 0.0.0.0:8000 --settings=plane.settings.local
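The deleted `takeoff.local` (like `takeoff` above) derives a machine signature by concatenating host facts and hashing them with `sha256sum`. A hedged Python equivalent of that pipeline, with stand-in values for the real `hostname`/`ip`/`/proc` outputs:

```python
import hashlib

def machine_signature(hostname, mac, cpu_info, memory_info, disk_info):
    # echo appends a trailing newline before piping into sha256sum
    payload = f"{hostname}{mac}{cpu_info}{memory_info}{disk_info}\n".encode()
    return hashlib.sha256(payload).hexdigest()

sig = machine_signature("plane-api", "02:42:ac:11:00:02", "cpu", "mem", "disk")
print(len(sig))  # 64
```

The signature is deterministic for a given host, which is what lets `register_instance` recognize the same machine across restarts.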

@@ -2,7 +2,4 @@
 set -e
 python manage.py wait_for_db
-# Wait for migrations
-python manage.py wait_for_migrations
-# Run the processes
 celery -A plane worker -l info

@@ -2,10 +2,10 @@
 import os
 import sys
-if __name__ == "__main__":
+if __name__ == '__main__':
     os.environ.setdefault(
-        "DJANGO_SETTINGS_MODULE", "plane.settings.production"
-    )
+        'DJANGO_SETTINGS_MODULE',
+        'plane.settings.production')
     try:
         from django.core.management import execute_from_command_line
     except ImportError as exc:
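Both versions of `manage.py` rely on `os.environ.setdefault`, which only assigns when the variable is absent, so an exported `DJANGO_SETTINGS_MODULE` always wins over the hard-coded default:

```python
import os

os.environ.pop("DJANGO_SETTINGS_MODULE", None)
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "plane.settings.production")
print(os.environ["DJANGO_SETTINGS_MODULE"])  # plane.settings.production

# A value exported beforehand is left untouched
os.environ["DJANGO_SETTINGS_MODULE"] = "plane.settings.local"
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "plane.settings.production")
print(os.environ["DJANGO_SETTINGS_MODULE"])  # plane.settings.local
```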

@@ -1,4 +1,4 @@
 {
   "name": "plane-api",
-  "version": "0.17.0"
+  "version": "0.13.2"
 }

@@ -1,3 +1,3 @@
 from .celery import app as celery_app
-__all__ = ("celery_app",)
+__all__ = ('celery_app',)

@@ -2,4 +2,4 @@ from django.apps import AppConfig
 class AnalyticsConfig(AppConfig):
-    name = "plane.analytics"
+    name = 'plane.analytics'

@@ -2,4 +2,4 @@ from django.apps import AppConfig
 class ApiConfig(AppConfig):
     name = "plane.api"

@@ -25,10 +25,7 @@ class APIKeyAuthentication(authentication.BaseAuthentication):
     def validate_api_token(self, token):
         try:
             api_token = APIToken.objects.get(
-                Q(
-                    Q(expired_at__gt=timezone.now())
-                    | Q(expired_at__isnull=True)
-                ),
+                Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
                 token=token,
                 is_active=True,
             )
@@ -47,4 +44,4 @@ class APIKeyAuthentication(authentication.BaseAuthentication):
         # Validate the API token
         user, token = self.validate_api_token(token)
         return user, token
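The `Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)` filter accepts a token that either never expires or expires in the future. The same predicate in plain Python, with datetimes standing in for the model field:

```python
from datetime import datetime, timedelta, timezone

def token_is_live(expired_at, now):
    # None means the token never expires; otherwise it must still be in the future
    return expired_at is None or expired_at > now

now = datetime.now(timezone.utc)
print(token_is_live(None, now))                        # True
print(token_is_live(now + timedelta(days=30), now))    # True
print(token_is_live(now - timedelta(seconds=1), now))  # False
```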

@@ -1,18 +1,17 @@
 from rest_framework.throttling import SimpleRateThrottle
 class ApiKeyRateThrottle(SimpleRateThrottle):
-    scope = "api_key"
-    rate = "60/minute"
+    scope = 'api_key'
+    rate = '60/minute'
     def get_cache_key(self, request, view):
         # Retrieve the API key from the request header
-        api_key = request.headers.get("X-Api-Key")
+        api_key = request.headers.get('X-Api-Key')
         if not api_key:
             return None  # Allow the request if there's no API key
         # Use the API key as part of the cache key
-        return f"{self.scope}:{api_key}"
+        return f'{self.scope}:{api_key}'
     def allow_request(self, request, view):
         allowed = super().allow_request(request, view)
@@ -25,7 +24,7 @@ class ApiKeyRateThrottle(SimpleRateThrottle):
         # Remove old histories
         while history and history[-1] <= now - self.duration:
             history.pop()
         # Calculate the requests
         num_requests = len(history)
@@ -36,7 +35,7 @@ class ApiKeyRateThrottle(SimpleRateThrottle):
         reset_time = int(now + self.duration)
         # Add headers
-        request.META["X-RateLimit-Remaining"] = max(0, available)
-        request.META["X-RateLimit-Reset"] = reset_time
+        request.META['X-RateLimit-Remaining'] = max(0, available)
+        request.META['X-RateLimit-Reset'] = reset_time
         return allowed
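`allow_request` above prunes the request history down to the throttle window before counting. That pruning step can be exercised on its own (plain numbers stand in for the timestamps, which DRF stores newest-first):

```python
def prune_history(history, now, duration):
    # Drop timestamps at or before the window start, as the while-loop above does
    history = list(history)
    while history and history[-1] <= now - duration:
        history.pop()
    return history

# 60-second window at t=100: entries at or before t=40 are dropped
print(prune_history([95, 70, 40, 10], now=100, duration=60))  # [95, 70]
```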

@@ -13,9 +13,5 @@ from .issue import (
 )
 from .state import StateLiteSerializer, StateSerializer
 from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer
-from .module import (
-    ModuleSerializer,
-    ModuleIssueSerializer,
-    ModuleLiteSerializer,
-)
+from .module import ModuleSerializer, ModuleIssueSerializer, ModuleLiteSerializer
 from .inbox import InboxIssueSerializer

@@ -66,11 +66,11 @@ class BaseSerializer(serializers.ModelSerializer):
         if expand in self.fields:
             # Import all the expandable serializers
             from . import (
-                IssueSerializer,
-                ProjectLiteSerializer,
-                StateLiteSerializer,
-                UserLiteSerializer,
                 WorkspaceLiteSerializer,
+                ProjectLiteSerializer,
+                UserLiteSerializer,
+                StateLiteSerializer,
+                IssueSerializer,
             )
             # Expansion mapper
@@ -97,11 +97,9 @@ class BaseSerializer(serializers.ModelSerializer):
                 exp_serializer = expansion[expand](
                     getattr(instance, expand)
                 )
                 response[expand] = exp_serializer.data
             else:
                 # You might need to handle this case differently
-                response[expand] = getattr(
-                    instance, f"{expand}_id", None
-                )
+                response[expand] = getattr(instance, f"{expand}_id", None)
         return response

@@ -23,9 +23,7 @@ class CycleSerializer(BaseSerializer):
             and data.get("end_date", None) is not None
             and data.get("start_date", None) > data.get("end_date", None)
         ):
-            raise serializers.ValidationError(
-                "Start date cannot exceed end date"
-            )
+            raise serializers.ValidationError("Start date cannot exceed end date")
         return data
     class Meta:
@@ -57,6 +55,7 @@ class CycleIssueSerializer(BaseSerializer):
 class CycleLiteSerializer(BaseSerializer):
     class Meta:
         model = Cycle
         fields = "__all__"

@@ -2,8 +2,8 @@
 from .base import BaseSerializer
 from plane.db.models import InboxIssue
 class InboxIssueSerializer(BaseSerializer):
     class Meta:
         model = InboxIssue
         fields = "__all__"
@@ -16,4 +16,4 @@ class InboxIssueSerializer(BaseSerializer):
             "updated_by",
             "created_at",
             "updated_at",
         ]

@@ -1,34 +1,31 @@
-from django.core.exceptions import ValidationError
-from django.core.validators import URLValidator
+from lxml import html
 # Django imports
 from django.utils import timezone
-from lxml import html
 # Third party imports
 from rest_framework import serializers
 # Module imports
 from plane.db.models import (
+    User,
     Issue,
-    IssueActivity,
+    State,
     IssueAssignee,
-    IssueAttachment,
-    IssueComment,
+    Label,
     IssueLabel,
     IssueLink,
-    Label,
+    IssueComment,
+    IssueAttachment,
+    IssueActivity,
     ProjectMember,
-    State,
-    User,
 )
 from .base import BaseSerializer
-from .cycle import CycleLiteSerializer, CycleSerializer
-from .module import ModuleLiteSerializer, ModuleSerializer
-from .state import StateLiteSerializer
+from .cycle import CycleSerializer, CycleLiteSerializer
+from .module import ModuleSerializer, ModuleLiteSerializer
 from .user import UserLiteSerializer
+from .state import StateLiteSerializer
 class IssueSerializer(BaseSerializer):
     assignees = serializers.ListField(
@@ -69,18 +66,16 @@ class IssueSerializer(BaseSerializer):
             and data.get("target_date", None) is not None
             and data.get("start_date", None) > data.get("target_date", None)
         ):
-            raise serializers.ValidationError(
-                "Start date cannot exceed target date"
-            )
+            raise serializers.ValidationError("Start date cannot exceed target date")
         try:
-            if data.get("description_html", None) is not None:
+            if(data.get("description_html", None) is not None):
                 parsed = html.fromstring(data["description_html"])
-                parsed_str = html.tostring(parsed, encoding="unicode")
+                parsed_str = html.tostring(parsed, encoding='unicode')
                 data["description_html"] = parsed_str
-        except Exception:
-            raise serializers.ValidationError("Invalid HTML passed")
+        except Exception as e:
+            raise serializers.ValidationError(f"Invalid HTML: {str(e)}")
         # Validate assignees are from project
         if data.get("assignees", []):
@@ -101,8 +96,7 @@ class IssueSerializer(BaseSerializer):
         if (
             data.get("state")
             and not State.objects.filter(
-                project_id=self.context.get("project_id"),
-                pk=data.get("state").id,
+                project_id=self.context.get("project_id"), pk=data.get("state")
             ).exists()
         ):
             raise serializers.ValidationError(
@@ -113,8 +107,7 @@ class IssueSerializer(BaseSerializer):
         if (
             data.get("parent")
            and not Issue.objects.filter(
-                workspace_id=self.context.get("workspace_id"),
-                pk=data.get("parent").id,
+                workspace_id=self.context.get("workspace_id"), pk=data.get("parent")
             ).exists()
         ):
             raise serializers.ValidationError(
@@ -245,13 +238,9 @@ class IssueSerializer(BaseSerializer):
             ]
         if "labels" in self.fields:
             if "labels" in self.expand:
-                data["labels"] = LabelSerializer(
-                    instance.labels.all(), many=True
-                ).data
+                data["labels"] = LabelSerializer(instance.labels.all(), many=True).data
             else:
-                data["labels"] = [
-                    str(label.id) for label in instance.labels.all()
-                ]
+                data["labels"] = [str(label.id) for label in instance.labels.all()]
         return data
@@ -286,42 +275,16 @@ class IssueLinkSerializer(BaseSerializer):
         "updated_at",
     ]
-    def validate_url(self, value):
-        # Check URL format
-        validate_url = URLValidator()
-        try:
-            validate_url(value)
-        except ValidationError:
-            raise serializers.ValidationError("Invalid URL format.")
-        # Check URL scheme
-        if not value.startswith(("http://", "https://")):
-            raise serializers.ValidationError("Invalid URL scheme.")
-        return value
     # Validation if url already exists
     def create(self, validated_data):
         if IssueLink.objects.filter(
-            url=validated_data.get("url"),
-            issue_id=validated_data.get("issue_id"),
+            url=validated_data.get("url"), issue_id=validated_data.get("issue_id")
         ).exists():
             raise serializers.ValidationError(
                 {"error": "URL already exists for this Issue"}
             )
         return IssueLink.objects.create(**validated_data)
-    def update(self, instance, validated_data):
-        if IssueLink.objects.filter(
-            url=validated_data.get("url"),
-            issue_id=instance.issue_id,
-        ).exists():
-            raise serializers.ValidationError(
-                {"error": "URL already exists for this Issue"}
-            )
-        return super().update(instance, validated_data)
 class IssueAttachmentSerializer(BaseSerializer):
     class Meta:
@@ -361,13 +324,13 @@ class IssueCommentSerializer(BaseSerializer):
     def validate(self, data):
         try:
-            if data.get("comment_html", None) is not None:
+            if(data.get("comment_html", None) is not None):
                 parsed = html.fromstring(data["comment_html"])
-                parsed_str = html.tostring(parsed, encoding="unicode")
+                parsed_str = html.tostring(parsed, encoding='unicode')
                 data["comment_html"] = parsed_str
-        except Exception:
-            raise serializers.ValidationError("Invalid HTML passed")
+        except Exception as e:
+            raise serializers.ValidationError(f"Invalid HTML: {str(e)}")
         return data
@@ -399,6 +362,7 @@ class ModuleIssueSerializer(BaseSerializer):
 class LabelLiteSerializer(BaseSerializer):
     class Meta:
         model = Label
         fields = [

@@ -52,9 +52,7 @@ class ModuleSerializer(BaseSerializer):
             and data.get("target_date", None) is not None
             and data.get("start_date", None) > data.get("target_date", None)
         ):
-            raise serializers.ValidationError(
-                "Start date cannot exceed target date"
-            )
+            raise serializers.ValidationError("Start date cannot exceed target date")
         if data.get("members", []):
             data["members"] = ProjectMember.objects.filter(
@@ -67,18 +65,18 @@ class ModuleSerializer(BaseSerializer):
     def create(self, validated_data):
         members = validated_data.pop("members", None)
-        project_id = self.context["project_id"]
-        workspace_id = self.context["workspace_id"]
-        module = Module.objects.create(**validated_data, project_id=project_id)
+        project = self.context["project"]
+        module = Module.objects.create(**validated_data, project=project)
         if members is not None:
             ModuleMember.objects.bulk_create(
                 [
                     ModuleMember(
                         module=module,
-                        member_id=str(member),
-                        project_id=project_id,
-                        workspace_id=workspace_id,
+                        member=member,
+                        project=project,
+                        workspace=project.workspace,
                         created_by=module.created_by,
                         updated_by=module.updated_by,
                     )
@@ -99,7 +97,7 @@ class ModuleSerializer(BaseSerializer):
                 [
                     ModuleMember(
                         module=instance,
-                        member_id=str(member),
+                        member=member,
                         project=instance.project,
                         workspace=instance.project.workspace,
                         created_by=instance.created_by,
@@ -148,16 +146,16 @@ class ModuleLinkSerializer(BaseSerializer):
     # Validation if url already exists
     def create(self, validated_data):
         if ModuleLink.objects.filter(
-            url=validated_data.get("url"),
-            module_id=validated_data.get("module_id"),
+            url=validated_data.get("url"), module_id=validated_data.get("module_id")
         ).exists():
             raise serializers.ValidationError(
                 {"error": "URL already exists for this Issue"}
             )
         return ModuleLink.objects.create(**validated_data)
 class ModuleLiteSerializer(BaseSerializer):
     class Meta:
         model = Module
         fields = "__all__"

@@ -2,16 +2,12 @@
 from rest_framework import serializers
 # Module imports
-from plane.db.models import (
-    Project,
-    ProjectIdentifier,
-    WorkspaceMember,
-)
+from plane.db.models import Project, ProjectIdentifier, WorkspaceMember, State, Estimate
 from .base import BaseSerializer
 class ProjectSerializer(BaseSerializer):
     total_members = serializers.IntegerField(read_only=True)
     total_cycles = serializers.IntegerField(read_only=True)
     total_modules = serializers.IntegerField(read_only=True)
@@ -25,7 +21,7 @@ class ProjectSerializer(BaseSerializer):
         fields = "__all__"
         read_only_fields = [
             "id",
-            "emoji",
+            'emoji',
             "workspace",
             "created_at",
             "updated_at",
@@ -63,16 +59,12 @@ class ProjectSerializer(BaseSerializer):
     def create(self, validated_data):
         identifier = validated_data.get("identifier", "").strip().upper()
         if identifier == "":
-            raise serializers.ValidationError(
-                detail="Project Identifier is required"
-            )
+            raise serializers.ValidationError(detail="Project Identifier is required")
         if ProjectIdentifier.objects.filter(
             name=identifier, workspace_id=self.context["workspace_id"]
         ).exists():
-            raise serializers.ValidationError(
-                detail="Project Identifier is taken"
-            )
+            raise serializers.ValidationError(detail="Project Identifier is taken")
         project = Project.objects.create(
             **validated_data, workspace_id=self.context["workspace_id"]
@@ -97,4 +89,4 @@ class ProjectLiteSerializer(BaseSerializer):
             "emoji",
             "description",
         ]
         read_only_fields = fields
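`ProjectSerializer.create` normalizes the identifier with `.strip().upper()` and rejects empty values before the uniqueness check. The normalization step alone, as a standalone sketch (the function name is illustrative, not repo API):

```python
def normalize_identifier(raw: str) -> str:
    # Trim surrounding whitespace and upper-case, as the serializer does
    identifier = raw.strip().upper()
    if identifier == "":
        raise ValueError("Project Identifier is required")
    return identifier

print(normalize_identifier("  plane "))  # PLANE
```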

@@ -7,9 +7,9 @@ class StateSerializer(BaseSerializer):
     def validate(self, data):
         # If the default is being provided then make all other states default False
         if data.get("default", False):
-            State.objects.filter(
-                project_id=self.context.get("project_id")
-            ).update(default=False)
+            State.objects.filter(project_id=self.context.get("project_id")).update(
+                default=False
+            )
         return data
     class Meta:
@@ -35,4 +35,4 @@ class StateLiteSerializer(BaseSerializer):
             "color",
             "group",
         ]
         read_only_fields = fields

@@ -1,6 +1,5 @@
 # Module imports
 from plane.db.models import User
 from .base import BaseSerializer
@@ -11,9 +10,7 @@ class UserLiteSerializer(BaseSerializer):
             "id",
             "first_name",
             "last_name",
-            "email",
             "avatar",
             "display_name",
+            "email",
         ]
         read_only_fields = fields

@@ -5,7 +5,6 @@ from .base import BaseSerializer
 class WorkspaceLiteSerializer(BaseSerializer):
     """Lite serializer with only required fields"""
     class Meta:
         model = Workspace
         fields = [
@@ -13,4 +12,4 @@ class WorkspaceLiteSerializer(BaseSerializer):
             "slug",
             "id",
         ]
         read_only_fields = fields

@@ -12,4 +12,4 @@ urlpatterns = [
     *cycle_patterns,
     *module_patterns,
     *inbox_patterns,
 ]

@@ -4,7 +4,6 @@ from plane.api.views.cycle import (
     CycleAPIEndpoint,
     CycleIssueAPIEndpoint,
     TransferCycleIssueAPIEndpoint,
-    CycleArchiveUnarchiveAPIEndpoint,
 )
 urlpatterns = [
@@ -33,14 +32,4 @@ urlpatterns = [
         TransferCycleIssueAPIEndpoint.as_view(),
         name="transfer-issues",
     ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:pk>/archive/",
-        CycleArchiveUnarchiveAPIEndpoint.as_view(),
-        name="cycle-archive-unarchive",
-    ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/archived-cycles/",
-        CycleArchiveUnarchiveAPIEndpoint.as_view(),
-        name="cycle-archive-unarchive",
-    ),
 ]

@@ -14,4 +14,4 @@ urlpatterns = [
         InboxIssueAPIEndpoint.as_view(),
         name="inbox-issue",
     ),
 ]

@@ -1,10 +1,6 @@
 from django.urls import path
-from plane.api.views import (
-    ModuleAPIEndpoint,
-    ModuleIssueAPIEndpoint,
-    ModuleArchiveUnarchiveAPIEndpoint,
-)
+from plane.api.views import ModuleAPIEndpoint, ModuleIssueAPIEndpoint
 urlpatterns = [
     path(
@@ -27,14 +23,4 @@ urlpatterns = [
         ModuleIssueAPIEndpoint.as_view(),
         name="module-issues",
     ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:pk>/archive/",
-        ModuleArchiveUnarchiveAPIEndpoint.as_view(),
-        name="module-archive-unarchive",
-    ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/archived-modules/",
-        ModuleArchiveUnarchiveAPIEndpoint.as_view(),
-        name="module-archive-unarchive",
-    ),
 ]

View File

@@ -1,24 +1,16 @@
 from django.urls import path
-from plane.api.views import (
-    ProjectAPIEndpoint,
-    ProjectArchiveUnarchiveAPIEndpoint,
-)
+from plane.api.views import ProjectAPIEndpoint

 urlpatterns = [
     path(
         "workspaces/<str:slug>/projects/",
         ProjectAPIEndpoint.as_view(),
         name="project",
     ),
     path(
-        "workspaces/<str:slug>/projects/<uuid:pk>/",
+        "workspaces/<str:slug>/projects/<uuid:project_id>/",
         ProjectAPIEndpoint.as_view(),
         name="project",
     ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/archive/",
-        ProjectArchiveUnarchiveAPIEndpoint.as_view(),
-        name="project-archive-unarchive",
-    ),
 ]
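One visible behavioral edit in this hunk is the URL converter rename from `<uuid:pk>` to `<uuid:project_id>`, which changes the keyword argument name the view receives. A toy route matcher (not Django's real implementation; the converter table and helper names below are invented for illustration) shows how the converter's second segment becomes the kwarg key:

```python
import re
import uuid

# Hypothetical mini-version of Django's route-to-regex translation:
# "<uuid:project_id>" becomes a named regex group keyed "project_id".
CONVERTERS = {"str": r"[^/]+", "uuid": r"[0-9a-f-]{36}"}

def match(route, path):
    """Return captured kwargs when path matches the route, else None."""
    pattern = re.sub(
        r"<(\w+):(\w+)>",
        lambda m: f"(?P<{m.group(2)}>{CONVERTERS[m.group(1)]})",
        route,
    )
    m = re.fullmatch(pattern, path)
    return m.groupdict() if m else None

pid = str(uuid.uuid4())
kwargs = match("workspaces/<str:slug>/projects/<uuid:project_id>/",
               f"workspaces/acme/projects/{pid}/")
print(kwargs["project_id"] == pid)  # True
```

Under the `<uuid:pk>` spelling the same capture arrives as `kwargs["pk"]` instead, which is why the `project_id` property in `base.py` on one side of this compare has to special-case routes named "project".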

View File

@@ -13,4 +13,4 @@ urlpatterns = [
         StateAPIEndpoint.as_view(),
         name="states",
     ),
 ]

View File

@ -1,4 +1,4 @@
from .project import ProjectAPIEndpoint, ProjectArchiveUnarchiveAPIEndpoint from .project import ProjectAPIEndpoint
from .state import StateAPIEndpoint from .state import StateAPIEndpoint
@ -14,13 +14,8 @@ from .cycle import (
CycleAPIEndpoint, CycleAPIEndpoint,
CycleIssueAPIEndpoint, CycleIssueAPIEndpoint,
TransferCycleIssueAPIEndpoint, TransferCycleIssueAPIEndpoint,
CycleArchiveUnarchiveAPIEndpoint,
) )
from .module import ( from .module import ModuleAPIEndpoint, ModuleIssueAPIEndpoint
ModuleAPIEndpoint,
ModuleIssueAPIEndpoint,
ModuleArchiveUnarchiveAPIEndpoint,
)
from .inbox import InboxIssueAPIEndpoint from .inbox import InboxIssueAPIEndpoint

View File

@@ -1,27 +1,25 @@
 # Python imports
-from urllib.parse import urlparse
 import zoneinfo
-import json

 # Django imports
 from django.conf import settings
-from django.core.exceptions import ObjectDoesNotExist, ValidationError
 from django.db import IntegrityError
-from django.urls import resolve
+from django.core.exceptions import ObjectDoesNotExist, ValidationError
 from django.utils import timezone
-from rest_framework import status
-from rest_framework.permissions import IsAuthenticated
-from rest_framework.response import Response

 # Third party imports
 from rest_framework.views import APIView
+from rest_framework.response import Response
+from rest_framework.permissions import IsAuthenticated
+from rest_framework import status
+from sentry_sdk import capture_exception

 # Module imports
 from plane.api.middleware.api_authentication import APIKeyAuthentication
 from plane.api.rate_limit import ApiKeyRateThrottle
-from plane.bgtasks.webhook_task import send_webhook
-from plane.utils.exception_logger import log_exception
 from plane.utils.paginator import BasePaginator
+from plane.bgtasks.webhook_task import send_webhook


 class TimezoneMixin:
@@ -43,9 +41,7 @@ class WebhookMixin:
     bulk = False

     def finalize_response(self, request, response, *args, **kwargs):
-        response = super().finalize_response(
-            request, response, *args, **kwargs
-        )
+        response = super().finalize_response(request, response, *args, **kwargs)

         # Check for the case should webhook be sent
         if (
@@ -53,11 +49,6 @@ class WebhookMixin:
             and self.request.method in ["POST", "PATCH", "DELETE"]
             and response.status_code in [200, 201, 204]
         ):
-            url = request.build_absolute_uri()
-            parsed_url = urlparse(url)
-            # Extract the scheme and netloc
-            scheme = parsed_url.scheme
-            netloc = parsed_url.netloc
             # Push the object to delay
             send_webhook.delay(
                 event=self.webhook_event,
@@ -66,7 +57,6 @@ class WebhookMixin:
                 action=self.request.method,
                 slug=self.workspace_slug,
                 bulk=self.bulk,
-                current_site=f"{scheme}://{netloc}",
             )

         return response
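The lines dropped on one side of this hunk derive a `current_site` origin from the request URL before queuing the webhook task. The scheme/netloc extraction itself is plain standard library; in isolation (with a literal URL standing in for `request.build_absolute_uri()`):

```python
from urllib.parse import urlparse

# Rebuild the "<scheme>://<netloc>" origin that the webhook payload carried.
url = "https://app.example.com/api/workspaces/acme/projects/?page=2"
parsed = urlparse(url)
current_site = f"{parsed.scheme}://{parsed.netloc}"
print(current_site)  # https://app.example.com
```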
@@ -107,23 +97,28 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
             if isinstance(e, ValidationError):
                 return Response(
-                    {"error": "Please provide valid detail"},
+                    {
+                        "error": "The provided payload is not valid please try with a valid payload"
+                    },
                     status=status.HTTP_400_BAD_REQUEST,
                 )

             if isinstance(e, ObjectDoesNotExist):
+                model_name = str(exc).split(" matching query does not exist.")[0]
                 return Response(
-                    {"error": "The requested resource does not exist."},
+                    {"error": f"{model_name} does not exist."},
                     status=status.HTTP_404_NOT_FOUND,
                 )

             if isinstance(e, KeyError):
                 return Response(
-                    {"error": "The required key does not exist."},
+                    {"error": f"key {e} does not exist"},
                     status=status.HTTP_400_BAD_REQUEST,
                 )

-            log_exception(e)
+            if settings.DEBUG:
+                print(e)
+            capture_exception(e)
             return Response(
                 {"error": "Something went wrong please try again later"},
                 status=status.HTTP_500_INTERNAL_SERVER_ERROR,
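Both sides of this hunk use the same dispatch shape in `handle_exception`: test the exception class from most specific to least, and fall back to a 500. A framework-free sketch of that shape (the exception classes here are stand-ins for Django's, and the messages are illustrative):

```python
# Stand-ins for the Django exception types checked in handle_exception.
class ValidationError(Exception): ...
class ObjectDoesNotExist(Exception): ...

def exception_to_response(e):
    # Mirrors the branch order above: most specific classes first,
    # generic 500 as the fallback.
    if isinstance(e, ValidationError):
        return 400, "Please provide valid detail"
    if isinstance(e, ObjectDoesNotExist):
        return 404, "The requested resource does not exist."
    if isinstance(e, KeyError):
        return 400, "The required key does not exist."
    return 500, "Something went wrong please try again later"

print(exception_to_response(ObjectDoesNotExist())[0])  # 404
```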
@@ -145,9 +140,7 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
     def finalize_response(self, request, response, *args, **kwargs):
         # Call super to get the default response
-        response = super().finalize_response(
-            request, response, *args, **kwargs
-        )
+        response = super().finalize_response(request, response, *args, **kwargs)

         # Add custom headers if they exist in the request META
         ratelimit_remaining = request.META.get("X-RateLimit-Remaining")
@@ -166,27 +159,18 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
     @property
     def project_id(self):
-        project_id = self.kwargs.get("project_id", None)
-        if project_id:
-            return project_id
-
-        if resolve(self.request.path_info).url_name == "project":
-            return self.kwargs.get("pk", None)
+        return self.kwargs.get("project_id", None)

     @property
     def fields(self):
         fields = [
-            field
-            for field in self.request.GET.get("fields", "").split(",")
-            if field
+            field for field in self.request.GET.get("fields", "").split(",") if field
         ]
         return fields if fields else None

     @property
     def expand(self):
         expand = [
-            expand
-            for expand in self.request.GET.get("expand", "").split(",")
-            if expand
+            expand for expand in self.request.GET.get("expand", "").split(",") if expand
         ]
         return expand if expand else None
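The `fields` and `expand` properties differ across the two sides only in line wrapping; both reduce to splitting a comma-separated query value, discarding empty entries, and returning `None` for an empty result. Extracted as a plain function:

```python
def parse_csv_param(raw):
    """Split a comma-separated query value, dropping empty entries;
    return None when nothing remains (mirrors the fields/expand properties)."""
    values = [v for v in raw.split(",") if v]
    return values if values else None

print(parse_csv_param("id,name,,state"))  # ['id', 'name', 'state']
print(parse_csv_param(""))                # None
```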

View File

@@ -2,31 +2,23 @@
 import json

 # Django imports
-from django.core import serializers
-from django.db.models import Count, F, Func, OuterRef, Q, Sum
+from django.db.models import Q, Count, Sum, Prefetch, F, OuterRef, Func
 from django.utils import timezone
+from django.core import serializers

 # Third party imports
-from rest_framework import status
 from rest_framework.response import Response
+from rest_framework import status

 # Module imports
-from plane.api.serializers import (
-    CycleIssueSerializer,
-    CycleSerializer,
-)
-from plane.app.permissions import ProjectEntityPermission
-from plane.bgtasks.issue_activites_task import issue_activity
-from plane.db.models import (
-    Cycle,
-    CycleIssue,
-    Issue,
-    IssueAttachment,
-    IssueLink,
-)
-from plane.utils.analytics_plot import burndown_plot
 from .base import BaseAPIView, WebhookMixin
+from plane.db.models import Cycle, Issue, CycleIssue, IssueLink, IssueAttachment
+from plane.app.permissions import ProjectEntityPermission
+from plane.api.serializers import (
+    CycleSerializer,
+    CycleIssueSerializer,
+)
+from plane.bgtasks.issue_activites_task import issue_activity


 class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
@@ -47,10 +39,7 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
         return (
             Cycle.objects.filter(workspace__slug=self.kwargs.get("slug"))
             .filter(project_id=self.kwargs.get("project_id"))
-            .filter(
-                project__project_projectmember__member=self.request.user,
-                project__project_projectmember__is_active=True,
-            )
+            .filter(project__project_projectmember__member=self.request.user)
             .select_related("project")
             .select_related("workspace")
             .select_related("owned_by")
@@ -113,9 +102,7 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
                 ),
             )
         )
-        .annotate(
-            total_estimates=Sum("issue_cycle__issue__estimate_point")
-        )
+        .annotate(total_estimates=Sum("issue_cycle__issue__estimate_point"))
         .annotate(
             completed_estimates=Sum(
                 "issue_cycle__issue__estimate_point",
@@ -142,9 +129,7 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
     def get(self, request, slug, project_id, pk=None):
         if pk:
-            queryset = (
-                self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
-            )
+            queryset = self.get_queryset().get(pk=pk)
             data = CycleSerializer(
                 queryset,
                 fields=self.fields,
@@ -154,9 +139,7 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
                 data,
                 status=status.HTTP_200_OK,
             )
-        queryset = (
-            self.get_queryset().filter(archived_at__isnull=True)
-        )
+        queryset = self.get_queryset()
         cycle_view = request.GET.get("cycle_view", "all")

         # Current Cycle
@@ -218,8 +201,7 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
         # Incomplete Cycles
         if cycle_view == "incomplete":
             queryset = queryset.filter(
-                Q(end_date__gte=timezone.now().date())
-                | Q(end_date__isnull=True),
+                Q(end_date__gte=timezone.now().date()) | Q(end_date__isnull=True),
             )
             return self.paginate(
                 request=request,
@@ -252,39 +234,12 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
         ):
             serializer = CycleSerializer(data=request.data)
             if serializer.is_valid():
-                if (
-                    request.data.get("external_id")
-                    and request.data.get("external_source")
-                    and Cycle.objects.filter(
-                        project_id=project_id,
-                        workspace__slug=slug,
-                        external_source=request.data.get("external_source"),
-                        external_id=request.data.get("external_id"),
-                    ).exists()
-                ):
-                    cycle = Cycle.objects.filter(
-                        workspace__slug=slug,
-                        project_id=project_id,
-                        external_source=request.data.get("external_source"),
-                        external_id=request.data.get("external_id"),
-                    ).first()
-                    return Response(
-                        {
-                            "error": "Cycle with the same external id and external source already exists",
-                            "id": str(cycle.id),
-                        },
-                        status=status.HTTP_409_CONFLICT,
-                    )
                 serializer.save(
                     project_id=project_id,
                     owned_by=request.user,
                 )
-                return Response(
-                    serializer.data, status=status.HTTP_201_CREATED
-                )
-            return Response(
-                serializer.errors, status=status.HTTP_400_BAD_REQUEST
-            )
+                return Response(serializer.data, status=status.HTTP_201_CREATED)
+            return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
         else:
             return Response(
                 {
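The conflict guard removed in this hunk refuses to create a cycle when another cycle in the project already carries the same `external_source`/`external_id` pair, answering 409 with the existing record's id. The check reduces to a lookup over existing records; a minimal in-memory sketch (the list of dicts stands in for the `Cycle` queryset):

```python
# Each dict stands in for an existing Cycle row.
existing = [
    {"id": "c1", "external_source": "github", "external_id": "42"},
]

def find_conflict(cycles, payload):
    """Return the conflicting cycle when the payload reuses an
    (external_source, external_id) pair, else None."""
    if not (payload.get("external_id") and payload.get("external_source")):
        return None
    for cycle in cycles:
        if (cycle["external_source"] == payload["external_source"]
                and cycle["external_id"] == payload["external_id"]):
            return cycle
    return None

print(find_conflict(existing, {"external_source": "github", "external_id": "42"}))  # conflict -> 409
print(find_conflict(existing, {"external_source": "github", "external_id": "43"}))  # None -> create
```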
@@ -294,27 +249,15 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
             )

     def patch(self, request, slug, project_id, pk):
-        cycle = Cycle.objects.get(
-            workspace__slug=slug, project_id=project_id, pk=pk
-        )
-
-        if cycle.archived_at:
-            return Response(
-                {"error": "Archived cycle cannot be edited"},
-                status=status.HTTP_400_BAD_REQUEST,
-            )
+        cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)

         request_data = request.data

-        if (
-            cycle.end_date is not None
-            and cycle.end_date < timezone.now().date()
-        ):
+        if cycle.end_date is not None and cycle.end_date < timezone.now().date():
             if "sort_order" in request_data:
                 # Can only change sort order
                 request_data = {
-                    "sort_order": request_data.get(
-                        "sort_order", cycle.sort_order
-                    )
+                    "sort_order": request_data.get("sort_order", cycle.sort_order)
                 }
             else:
                 return Response(
@@ -326,38 +269,17 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
         serializer = CycleSerializer(cycle, data=request.data, partial=True)
         if serializer.is_valid():
-            if (
-                request.data.get("external_id")
-                and (cycle.external_id != request.data.get("external_id"))
-                and Cycle.objects.filter(
-                    project_id=project_id,
-                    workspace__slug=slug,
-                    external_source=request.data.get(
-                        "external_source", cycle.external_source
-                    ),
-                    external_id=request.data.get("external_id"),
-                ).exists()
-            ):
-                return Response(
-                    {
-                        "error": "Cycle with the same external id and external source already exists",
-                        "id": str(cycle.id),
-                    },
-                    status=status.HTTP_409_CONFLICT,
-                )
             serializer.save()
             return Response(serializer.data, status=status.HTTP_200_OK)
         return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

     def delete(self, request, slug, project_id, pk):
         cycle_issues = list(
-            CycleIssue.objects.filter(
-                cycle_id=self.kwargs.get("pk")
-            ).values_list("issue", flat=True)
-        )
-        cycle = Cycle.objects.get(
-            workspace__slug=slug, project_id=project_id, pk=pk
+            CycleIssue.objects.filter(cycle_id=self.kwargs.get("pk")).values_list(
+                "issue", flat=True
+            )
         )
+        cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)

         issue_activity.delay(
             type="cycle.activity.deleted",
@@ -379,139 +301,6 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
         return Response(status=status.HTTP_204_NO_CONTENT)


-class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
-
-    permission_classes = [
-        ProjectEntityPermission,
-    ]
-
-    def get_queryset(self):
-        return (
-            Cycle.objects.filter(workspace__slug=self.kwargs.get("slug"))
-            .filter(project_id=self.kwargs.get("project_id"))
-            .filter(
-                project__project_projectmember__member=self.request.user,
-                project__project_projectmember__is_active=True,
-            )
-            .filter(archived_at__isnull=False)
-            .select_related("project")
-            .select_related("workspace")
-            .select_related("owned_by")
-            .annotate(
-                total_issues=Count(
-                    "issue_cycle",
-                    filter=Q(
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                completed_issues=Count(
-                    "issue_cycle__issue__state__group",
-                    filter=Q(
-                        issue_cycle__issue__state__group="completed",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                cancelled_issues=Count(
-                    "issue_cycle__issue__state__group",
-                    filter=Q(
-                        issue_cycle__issue__state__group="cancelled",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                started_issues=Count(
-                    "issue_cycle__issue__state__group",
-                    filter=Q(
-                        issue_cycle__issue__state__group="started",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                unstarted_issues=Count(
-                    "issue_cycle__issue__state__group",
-                    filter=Q(
-                        issue_cycle__issue__state__group="unstarted",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                backlog_issues=Count(
-                    "issue_cycle__issue__state__group",
-                    filter=Q(
-                        issue_cycle__issue__state__group="backlog",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                total_estimates=Sum("issue_cycle__issue__estimate_point")
-            )
-            .annotate(
-                completed_estimates=Sum(
-                    "issue_cycle__issue__estimate_point",
-                    filter=Q(
-                        issue_cycle__issue__state__group="completed",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                started_estimates=Sum(
-                    "issue_cycle__issue__estimate_point",
-                    filter=Q(
-                        issue_cycle__issue__state__group="started",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .order_by(self.kwargs.get("order_by", "-created_at"))
-            .distinct()
-        )
-
-    def get(self, request, slug, project_id):
-        return self.paginate(
-            request=request,
-            queryset=(self.get_queryset()),
-            on_results=lambda cycles: CycleSerializer(
-                cycles,
-                many=True,
-                fields=self.fields,
-                expand=self.expand,
-            ).data,
-        )
-
-    def post(self, request, slug, project_id, pk):
-        cycle = Cycle.objects.get(
-            pk=pk, project_id=project_id, workspace__slug=slug
-        )
-        cycle.archived_at = timezone.now()
-        cycle.save()
-        return Response(status=status.HTTP_204_NO_CONTENT)
-
-    def delete(self, request, slug, project_id, pk):
-        cycle = Cycle.objects.get(
-            pk=pk, project_id=project_id, workspace__slug=slug
-        )
-        cycle.archived_at = None
-        cycle.save()
-        return Response(status=status.HTTP_204_NO_CONTENT)


 class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
     """
     This viewset automatically provides `list`, `create`,
@@ -530,19 +319,14 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
     def get_queryset(self):
         return (
             CycleIssue.objects.annotate(
-                sub_issues_count=Issue.issue_objects.filter(
-                    parent=OuterRef("issue_id")
-                )
+                sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue_id"))
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
             )
             .filter(workspace__slug=self.kwargs.get("slug"))
             .filter(project_id=self.kwargs.get("project_id"))
-            .filter(
-                project__project_projectmember__member=self.request.user,
-                project__project_projectmember__is_active=True,
-            )
+            .filter(project__project_projectmember__member=self.request.user)
             .filter(cycle_id=self.kwargs.get("cycle_id"))
             .select_related("project")
             .select_related("workspace")
@@ -553,28 +337,12 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
             .distinct()
         )

-    def get(self, request, slug, project_id, cycle_id, issue_id=None):
-        # Get
-        if issue_id:
-            cycle_issue = CycleIssue.objects.get(
-                workspace__slug=slug,
-                project_id=project_id,
-                cycle_id=cycle_id,
-                issue_id=issue_id,
-            )
-            serializer = CycleIssueSerializer(
-                cycle_issue, fields=self.fields, expand=self.expand
-            )
-            return Response(serializer.data, status=status.HTTP_200_OK)
-
-        # List
+    def get(self, request, slug, project_id, cycle_id):
         order_by = request.GET.get("order_by", "created_at")
         issues = (
             Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
             .annotate(
-                sub_issues_count=Issue.issue_objects.filter(
-                    parent=OuterRef("id")
-                )
+                sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
@@ -596,9 +364,7 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
                 .values("count")
             )
             .annotate(
-                attachment_count=IssueAttachment.objects.filter(
-                    issue=OuterRef("id")
-                )
+                attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
@@ -621,18 +387,14 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
         if not issues:
             return Response(
-                {"error": "Issues are required"},
-                status=status.HTTP_400_BAD_REQUEST,
+                {"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
             )

         cycle = Cycle.objects.get(
             workspace__slug=slug, project_id=project_id, pk=cycle_id
         )

-        if (
-            cycle.end_date is not None
-            and cycle.end_date < timezone.now().date()
-        ):
+        if cycle.end_date is not None and cycle.end_date < timezone.now().date():
             return Response(
                 {
                     "error": "The Cycle has already been completed so no new issues can be added"
@@ -717,10 +479,7 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
     def delete(self, request, slug, project_id, cycle_id, issue_id):
         cycle_issue = CycleIssue.objects.get(
-            issue_id=issue_id,
-            workspace__slug=slug,
-            project_id=project_id,
-            cycle_id=cycle_id,
+            issue_id=issue_id, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
         )
         issue_id = cycle_issue.issue_id
         cycle_issue.delete()
@@ -764,209 +523,6 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
             workspace__slug=slug, project_id=project_id, pk=new_cycle_id
         )

-        old_cycle = (
-            Cycle.objects.filter(
-                workspace__slug=slug, project_id=project_id, pk=cycle_id
-            )
-            .annotate(
-                total_issues=Count(
-                    "issue_cycle",
-                    filter=Q(
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                completed_issues=Count(
-                    "issue_cycle__issue__state__group",
-                    filter=Q(
-                        issue_cycle__issue__state__group="completed",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                cancelled_issues=Count(
-                    "issue_cycle__issue__state__group",
-                    filter=Q(
-                        issue_cycle__issue__state__group="cancelled",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                started_issues=Count(
-                    "issue_cycle__issue__state__group",
-                    filter=Q(
-                        issue_cycle__issue__state__group="started",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                unstarted_issues=Count(
-                    "issue_cycle__issue__state__group",
-                    filter=Q(
-                        issue_cycle__issue__state__group="unstarted",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                backlog_issues=Count(
-                    "issue_cycle__issue__state__group",
-                    filter=Q(
-                        issue_cycle__issue__state__group="backlog",
-                        issue_cycle__issue__archived_at__isnull=True,
-                        issue_cycle__issue__is_draft=False,
-                    ),
-                )
-            )
-        )
-
-        # Pass the new_cycle queryset to burndown_plot
-        completion_chart = burndown_plot(
-            queryset=old_cycle.first(),
-            slug=slug,
-            project_id=project_id,
-            cycle_id=cycle_id,
-        )
-
-        # Get the assignee distribution
-        assignee_distribution = (
-            Issue.objects.filter(
-                issue_cycle__cycle_id=cycle_id,
-                workspace__slug=slug,
-                project_id=project_id,
-            )
-            .annotate(display_name=F("assignees__display_name"))
-            .annotate(assignee_id=F("assignees__id"))
-            .annotate(avatar=F("assignees__avatar"))
-            .values("display_name", "assignee_id", "avatar")
-            .annotate(
-                total_issues=Count(
-                    "id",
-                    filter=Q(archived_at__isnull=True, is_draft=False),
-                ),
-            )
-            .annotate(
-                completed_issues=Count(
-                    "id",
-                    filter=Q(
-                        completed_at__isnull=False,
-                        archived_at__isnull=True,
-                        is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                pending_issues=Count(
-                    "id",
-                    filter=Q(
-                        completed_at__isnull=True,
-                        archived_at__isnull=True,
-                        is_draft=False,
-                    ),
-                )
-            )
-            .order_by("display_name")
-        )
-
-        # assignee distribution serialized
-        assignee_distribution_data = [
-            {
-                "display_name": item["display_name"],
-                "assignee_id": (
-                    str(item["assignee_id"]) if item["assignee_id"] else None
-                ),
-                "avatar": item["avatar"],
-                "total_issues": item["total_issues"],
-                "completed_issues": item["completed_issues"],
-                "pending_issues": item["pending_issues"],
-            }
-            for item in assignee_distribution
-        ]
-
-        # Get the label distribution
-        label_distribution = (
-            Issue.objects.filter(
-                issue_cycle__cycle_id=cycle_id,
-                workspace__slug=slug,
-                project_id=project_id,
-            )
-            .annotate(label_name=F("labels__name"))
-            .annotate(color=F("labels__color"))
-            .annotate(label_id=F("labels__id"))
-            .values("label_name", "color", "label_id")
-            .annotate(
-                total_issues=Count(
-                    "id",
-                    filter=Q(archived_at__isnull=True, is_draft=False),
-                )
-            )
-            .annotate(
-                completed_issues=Count(
-                    "id",
-                    filter=Q(
-                        completed_at__isnull=False,
-                        archived_at__isnull=True,
-                        is_draft=False,
-                    ),
-                )
-            )
-            .annotate(
-                pending_issues=Count(
-                    "id",
-                    filter=Q(
-                        completed_at__isnull=True,
-                        archived_at__isnull=True,
-                        is_draft=False,
-                    ),
-                )
-            )
-            .order_by("label_name")
-        )
-
-        # Label distribution serilization
-        label_distribution_data = [
-            {
-                "label_name": item["label_name"],
-                "color": item["color"],
-                "label_id": (
-                    str(item["label_id"]) if item["label_id"] else None
-                ),
-                "total_issues": item["total_issues"],
-                "completed_issues": item["completed_issues"],
-                "pending_issues": item["pending_issues"],
-            }
-            for item in label_distribution
-        ]
-
-        current_cycle = Cycle.objects.filter(
-            workspace__slug=slug, project_id=project_id, pk=cycle_id
-        ).first()
-
-        if current_cycle:
-            current_cycle.progress_snapshot = {
-                "total_issues": old_cycle.first().total_issues,
-                "completed_issues": old_cycle.first().completed_issues,
-                "cancelled_issues": old_cycle.first().cancelled_issues,
-                "started_issues": old_cycle.first().started_issues,
-                "unstarted_issues": old_cycle.first().unstarted_issues,
-                "backlog_issues": old_cycle.first().backlog_issues,
-                "distribution": {
-                    "labels": label_distribution_data,
-                    "assignees": assignee_distribution_data,
-                    "completion_chart": completion_chart,
-                },
-            }
-            # Save the snapshot of the current cycle
-            current_cycle.save(update_fields=["progress_snapshot"])
-
         if (
             new_cycle.end_date is not None
             and new_cycle.end_date < timezone.now().date()
@@ -994,4 +550,4 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
             updated_cycles, ["cycle_id"], batch_size=100
         )
         return Response({"message": "Success"}, status=status.HTTP_200_OK)

View File

@@ -2,28 +2,20 @@
 import json

 # Django improts
-from django.core.serializers.json import DjangoJSONEncoder
-from django.db.models import Q
 from django.utils import timezone
+from django.db.models import Q
+from django.core.serializers.json import DjangoJSONEncoder

 # Third party imports
 from rest_framework import status
 from rest_framework.response import Response

 # Module imports
-from plane.api.serializers import InboxIssueSerializer, IssueSerializer
-from plane.app.permissions import ProjectLitePermission
-from plane.bgtasks.issue_activites_task import issue_activity
-from plane.db.models import (
-    Inbox,
-    InboxIssue,
-    Issue,
-    Project,
-    ProjectMember,
-    State,
-)
 from .base import BaseAPIView
+from plane.app.permissions import ProjectLitePermission
+from plane.api.serializers import InboxIssueSerializer, IssueSerializer
+from plane.db.models import InboxIssue, Issue, State, ProjectMember, Project, Inbox
+from plane.bgtasks.issue_activites_task import issue_activity


 class InboxIssueAPIEndpoint(BaseAPIView):
@@ -51,8 +43,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
         ).first()

         project = Project.objects.get(
-            workspace__slug=self.kwargs.get("slug"),
-            pk=self.kwargs.get("project_id"),
+            workspace__slug=self.kwargs.get("slug"), pk=self.kwargs.get("project_id")
         )

         if inbox is None and not project.inbox_view:
@@ -60,8 +51,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
         return (
             InboxIssue.objects.filter(
-                Q(snoozed_till__gte=timezone.now())
-                | Q(snoozed_till__isnull=True),
+                Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
                 workspace__slug=self.kwargs.get("slug"),
                 project_id=self.kwargs.get("project_id"),
                 inbox_id=inbox.id,
@@ -97,8 +87,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
     def post(self, request, slug, project_id):
         if not request.data.get("issue", {}).get("name", False):
             return Response(
-                {"error": "Name is required"},
-                status=status.HTTP_400_BAD_REQUEST,
+                {"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
             )

         inbox = Inbox.objects.filter(
@@ -120,7 +109,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
         )

         # Check for valid priority
-        if request.data.get("issue", {}).get("priority", "none") not in [
+        if not request.data.get("issue", {}).get("priority", "none") in [
             "low",
             "medium",
             "high",
@@ -128,18 +117,16 @@ class InboxIssueAPIEndpoint(BaseAPIView):
             "none",
         ]:
             return Response(
-                {"error": "Invalid priority"},
-                status=status.HTTP_400_BAD_REQUEST,
+                {"error": "Invalid priority"}, status=status.HTTP_400_BAD_REQUEST
             )

         # Create or get state
         state, _ = State.objects.get_or_create(
             name="Triage",
-            group="triage",
+            group="backlog",
             description="Default state for managing all Inbox Issues",
             project_id=project_id,
             color="#ff7700",
-            is_triage=True,
         )

         # create an issue
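Two behavioral points sit in this hunk: the priority guard is spelled `not x in [...]` on one side and `x not in [...]` on the other (identical semantics, since `not x in seq` parses as `not (x in seq)`), and the Triage state differs (`group="triage"` with `is_triage=True` versus `group="backlog"` identified by name). The membership guard in isolation:

```python
VALID_PRIORITIES = ["low", "medium", "high", "urgent", "none"]

def priority_ok(payload):
    # "not x in seq" parses as "not (x in seq)", so both spellings of the
    # guard behave identically; "not in" is simply the idiomatic form.
    priority = payload.get("issue", {}).get("priority", "none")
    return priority in VALID_PRIORITIES

print(priority_ok({"issue": {"priority": "high"}}))     # True
print(priority_ok({"issue": {"priority": "blocker"}}))  # False
print(priority_ok({}))                                  # True (defaults to "none")
```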
@@ -235,14 +222,10 @@ class InboxIssueAPIEndpoint(BaseAPIView):
                     "description_html": issue_data.get(
                         "description_html", issue.description_html
                     ),
-                    "description": issue_data.get(
-                        "description", issue.description
-                    ),
+                    "description": issue_data.get("description", issue.description),
                 }

-                issue_serializer = IssueSerializer(
-                    issue, data=issue_data, partial=True
-                )
+                issue_serializer = IssueSerializer(issue, data=issue_data, partial=True)

                 if issue_serializer.is_valid():
                     current_instance = issue
@@ -283,9 +266,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
                     project_id=project_id,
                 )
                 state = State.objects.filter(
-                    group="cancelled",
-                    workspace__slug=slug,
-                    project_id=project_id,
+                    group="cancelled", workspace__slug=slug, project_id=project_id
                 ).first()
                 if state is not None:
                     issue.state = state
@ -300,25 +281,20 @@ class InboxIssueAPIEndpoint(BaseAPIView):
) )
# Update the issue state only if it is in triage state # Update the issue state only if it is in triage state
if issue.state.is_triage: if issue.state.name == "Triage":
# Move to default state # Move to default state
state = State.objects.filter( state = State.objects.filter(
workspace__slug=slug, workspace__slug=slug, project_id=project_id, default=True
project_id=project_id,
default=True,
).first() ).first()
if state is not None: if state is not None:
issue.state = state issue.state = state
issue.save() issue.save()
return Response(serializer.data, status=status.HTTP_200_OK) return Response(serializer.data, status=status.HTTP_200_OK)
return Response( return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
else: else:
return Response( return Response(
InboxIssueSerializer(inbox_issue).data, InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
status=status.HTTP_200_OK,
) )
def delete(self, request, slug, project_id, issue_id): def delete(self, request, slug, project_id, issue_id):
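Aside from the diff itself, the priority check in the hunk above is just a membership test against a fixed allow-list. A minimal standalone sketch of that validation, outside Django (`validate_priority` is a hypothetical helper, not part of Plane's API; the allow-list is assumed to include `"urgent"`, which the extracted hunk truncates):

```python
# Allowed priorities, mirroring the list validated in InboxIssueAPIEndpoint.post
# ("urgent" is assumed; the diff hunk cuts the list off between "high" and "none").
VALID_PRIORITIES = ["urgent", "high", "medium", "low", "none"]


def validate_priority(payload: dict) -> bool:
    # Mirrors request.data.get("issue", {}).get("priority", "none"):
    # a missing priority falls back to "none", which is valid.
    priority = payload.get("issue", {}).get("priority", "none")
    return priority in VALID_PRIORITIES


print(validate_priority({"issue": {"priority": "high"}}))    # True
print(validate_priority({"issue": {"priority": "urgent!"}})) # False
print(validate_priority({}))                                 # True (defaults to "none")
```

The endpoint returns HTTP 400 when this check fails; the sketch only isolates the membership logic.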


@@ -1,22 +1,22 @@
 # Python imports
 import json
-from itertools import chain
-from django.core.serializers.json import DjangoJSONEncoder

 # Django imports
 from django.db import IntegrityError
 from django.db.models import (
-    Case,
-    CharField,
-    Exists,
-    F,
-    Func,
-    Max,
     OuterRef,
+    Func,
     Q,
-    Value,
+    F,
+    Case,
     When,
+    Value,
+    CharField,
+    Max,
+    Exists,
 )
+from django.core.serializers.json import DjangoJSONEncoder
 from django.utils import timezone

 # Third party imports
@@ -24,31 +24,30 @@ from rest_framework import status
 from rest_framework.response import Response

 # Module imports
-from plane.api.serializers import (
-    IssueActivitySerializer,
-    IssueCommentSerializer,
-    IssueLinkSerializer,
-    IssueSerializer,
-    LabelSerializer,
-)
+from .base import BaseAPIView, WebhookMixin
 from plane.app.permissions import (
     ProjectEntityPermission,
-    ProjectLitePermission,
     ProjectMemberPermission,
+    ProjectLitePermission,
 )
-from plane.bgtasks.issue_activites_task import issue_activity
 from plane.db.models import (
     Issue,
-    IssueActivity,
     IssueAttachment,
-    IssueComment,
     IssueLink,
-    Label,
     Project,
+    Label,
     ProjectMember,
+    IssueComment,
+    IssueActivity,
+)
+from plane.bgtasks.issue_activites_task import issue_activity
+from plane.api.serializers import (
+    IssueSerializer,
+    LabelSerializer,
+    IssueLinkSerializer,
+    IssueCommentSerializer,
+    IssueActivitySerializer,
 )
-from .base import BaseAPIView, WebhookMixin


 class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
@@ -68,9 +67,7 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
     def get_queryset(self):
         return (
             Issue.issue_objects.annotate(
-                sub_issues_count=Issue.issue_objects.filter(
-                    parent=OuterRef("id")
-                )
+                sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
@@ -89,9 +86,7 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
     def get(self, request, slug, project_id, pk=None):
         if pk:
             issue = Issue.issue_objects.annotate(
-                sub_issues_count=Issue.issue_objects.filter(
-                    parent=OuterRef("id")
-                )
+                sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
@@ -107,13 +102,7 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
             # Custom ordering for priority and state
             priority_order = ["urgent", "high", "medium", "low", "none"]
-            state_order = [
-                "backlog",
-                "unstarted",
-                "started",
-                "completed",
-                "cancelled",
-            ]
+            state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]

             order_by_param = request.GET.get("order_by", "-created_at")
@@ -128,9 +117,7 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
                     .values("count")
                 )
                 .annotate(
-                    attachment_count=IssueAttachment.objects.filter(
-                        issue=OuterRef("id")
-                    )
+                    attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
                     .order_by()
                     .annotate(count=Func(F("id"), function="Count"))
                     .values("count")
@@ -140,9 +127,7 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
             # Priority Ordering
             if order_by_param == "priority" or order_by_param == "-priority":
                 priority_order = (
-                    priority_order
-                    if order_by_param == "priority"
-                    else priority_order[::-1]
+                    priority_order if order_by_param == "priority" else priority_order[::-1]
                 )
                 issue_queryset = issue_queryset.annotate(
                     priority_order=Case(
@@ -190,9 +175,7 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
                         else order_by_param
                     )
                 ).order_by(
-                    "-max_values"
-                    if order_by_param.startswith("-")
-                    else "max_values"
+                    "-max_values" if order_by_param.startswith("-") else "max_values"
                 )
             else:
                 issue_queryset = issue_queryset.order_by(order_by_param)
@@ -221,38 +204,12 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
         )
         if serializer.is_valid():
-            if (
-                request.data.get("external_id")
-                and request.data.get("external_source")
-                and Issue.objects.filter(
-                    project_id=project_id,
-                    workspace__slug=slug,
-                    external_source=request.data.get("external_source"),
-                    external_id=request.data.get("external_id"),
-                ).exists()
-            ):
-                issue = Issue.objects.filter(
-                    workspace__slug=slug,
-                    project_id=project_id,
-                    external_id=request.data.get("external_id"),
-                    external_source=request.data.get("external_source"),
-                ).first()
-                return Response(
-                    {
-                        "error": "Issue with the same external id and external source already exists",
-                        "id": str(issue.id),
-                    },
-                    status=status.HTTP_409_CONFLICT,
-                )
             serializer.save()
             # Track the issue
             issue_activity.delay(
                 type="issue.activity.created",
-                requested_data=json.dumps(
-                    self.request.data, cls=DjangoJSONEncoder
-                ),
+                requested_data=json.dumps(self.request.data, cls=DjangoJSONEncoder),
                 actor_id=str(request.user.id),
                 issue_id=str(serializer.data.get("id", None)),
                 project_id=str(project_id),
@@ -263,44 +220,13 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
         return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

     def patch(self, request, slug, project_id, pk=None):
-        issue = Issue.objects.get(
-            workspace__slug=slug, project_id=project_id, pk=pk
-        )
-        project = Project.objects.get(pk=project_id)
+        issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
         current_instance = json.dumps(
             IssueSerializer(issue).data, cls=DjangoJSONEncoder
         )
         requested_data = json.dumps(self.request.data, cls=DjangoJSONEncoder)
-        serializer = IssueSerializer(
-            issue,
-            data=request.data,
-            context={
-                "project_id": project_id,
-                "workspace_id": project.workspace_id,
-            },
-            partial=True,
-        )
+        serializer = IssueSerializer(issue, data=request.data, partial=True)
         if serializer.is_valid():
-            if (
-                str(request.data.get("external_id"))
-                and (issue.external_id != str(request.data.get("external_id")))
-                and Issue.objects.filter(
-                    project_id=project_id,
-                    workspace__slug=slug,
-                    external_source=request.data.get(
-                        "external_source", issue.external_source
-                    ),
-                    external_id=request.data.get("external_id"),
-                ).exists()
-            ):
-                return Response(
-                    {
-                        "error": "Issue with the same external id and external source already exists",
-                        "id": str(issue.id),
-                    },
-                    status=status.HTTP_409_CONFLICT,
-                )
             serializer.save()
             issue_activity.delay(
                 type="issue.activity.updated",
@@ -315,9 +241,7 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
         return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

     def delete(self, request, slug, project_id, pk=None):
-        issue = Issue.objects.get(
-            workspace__slug=slug, project_id=project_id, pk=pk
-        )
+        issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
         current_instance = json.dumps(
             IssueSerializer(issue).data, cls=DjangoJSONEncoder
         )
@@ -351,11 +275,7 @@ class LabelAPIEndpoint(BaseAPIView):
         return (
             Label.objects.filter(workspace__slug=self.kwargs.get("slug"))
             .filter(project_id=self.kwargs.get("project_id"))
-            .filter(
-                project__project_projectmember__member=self.request.user,
-                project__project_projectmember__is_active=True,
-            )
-            .filter(project__archived_at__isnull=True)
+            .filter(project__project_projectmember__member=self.request.user)
             .select_related("project")
             .select_related("workspace")
             .select_related("parent")
@@ -367,49 +287,13 @@ class LabelAPIEndpoint(BaseAPIView):
         try:
             serializer = LabelSerializer(data=request.data)
             if serializer.is_valid():
-                if (
-                    request.data.get("external_id")
-                    and request.data.get("external_source")
-                    and Label.objects.filter(
-                        project_id=project_id,
-                        workspace__slug=slug,
-                        external_source=request.data.get("external_source"),
-                        external_id=request.data.get("external_id"),
-                    ).exists()
-                ):
-                    label = Label.objects.filter(
-                        workspace__slug=slug,
-                        project_id=project_id,
-                        external_id=request.data.get("external_id"),
-                        external_source=request.data.get("external_source"),
-                    ).first()
-                    return Response(
-                        {
-                            "error": "Label with the same external id and external source already exists",
-                            "id": str(label.id),
-                        },
-                        status=status.HTTP_409_CONFLICT,
-                    )
                 serializer.save(project_id=project_id)
-                return Response(
-                    serializer.data, status=status.HTTP_201_CREATED
-                )
-            return Response(
-                serializer.errors, status=status.HTTP_400_BAD_REQUEST
-            )
+                return Response(serializer.data, status=status.HTTP_201_CREATED)
+            return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
         except IntegrityError:
-            label = Label.objects.filter(
-                workspace__slug=slug,
-                project_id=project_id,
-                name=request.data.get("name"),
-            ).first()
             return Response(
-                {
-                    "error": "Label with the same name already exists in the project",
-                    "id": str(label.id),
-                },
-                status=status.HTTP_409_CONFLICT,
+                {"error": "Label with the same name already exists in the project"},
+                status=status.HTTP_400_BAD_REQUEST,
             )

     def get(self, request, slug, project_id, pk=None):
@@ -425,39 +309,17 @@ class LabelAPIEndpoint(BaseAPIView):
                 ).data,
             )
         label = self.get_queryset().get(pk=pk)
-        serializer = LabelSerializer(
-            label,
-            fields=self.fields,
-            expand=self.expand,
-        )
+        serializer = LabelSerializer(label, fields=self.fields, expand=self.expand,)
         return Response(serializer.data, status=status.HTTP_200_OK)

     def patch(self, request, slug, project_id, pk=None):
         label = self.get_queryset().get(pk=pk)
         serializer = LabelSerializer(label, data=request.data, partial=True)
         if serializer.is_valid():
-            if (
-                str(request.data.get("external_id"))
-                and (label.external_id != str(request.data.get("external_id")))
-                and Issue.objects.filter(
-                    project_id=project_id,
-                    workspace__slug=slug,
-                    external_source=request.data.get(
-                        "external_source", label.external_source
-                    ),
-                    external_id=request.data.get("external_id"),
-                ).exists()
-            ):
-                return Response(
-                    {
-                        "error": "Label with the same external id and external source already exists",
-                        "id": str(label.id),
-                    },
-                    status=status.HTTP_409_CONFLICT,
-                )
             serializer.save()
             return Response(serializer.data, status=status.HTTP_200_OK)
         return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

     def delete(self, request, slug, project_id, pk=None):
         label = self.get_queryset().get(pk=pk)
@@ -484,11 +346,7 @@ class IssueLinkAPIEndpoint(BaseAPIView):
             IssueLink.objects.filter(workspace__slug=self.kwargs.get("slug"))
             .filter(project_id=self.kwargs.get("project_id"))
             .filter(issue_id=self.kwargs.get("issue_id"))
-            .filter(
-                project__project_projectmember__member=self.request.user,
-                project__project_projectmember__is_active=True,
-            )
-            .filter(project__archived_at__isnull=True)
+            .filter(project__project_projectmember__member=self.request.user)
             .order_by(self.kwargs.get("order_by", "-created_at"))
             .distinct()
         )
@@ -528,9 +386,7 @@ class IssueLinkAPIEndpoint(BaseAPIView):
             )
             issue_activity.delay(
                 type="link.activity.created",
-                requested_data=json.dumps(
-                    serializer.data, cls=DjangoJSONEncoder
-                ),
+                requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
                 actor_id=str(self.request.user.id),
                 issue_id=str(self.kwargs.get("issue_id")),
                 project_id=str(self.kwargs.get("project_id")),
@@ -542,19 +398,14 @@ class IssueLinkAPIEndpoint(BaseAPIView):
     def patch(self, request, slug, project_id, issue_id, pk):
         issue_link = IssueLink.objects.get(
-            workspace__slug=slug,
-            project_id=project_id,
-            issue_id=issue_id,
-            pk=pk,
+            workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
         )
         requested_data = json.dumps(request.data, cls=DjangoJSONEncoder)
         current_instance = json.dumps(
             IssueLinkSerializer(issue_link).data,
             cls=DjangoJSONEncoder,
         )
-        serializer = IssueLinkSerializer(
-            issue_link, data=request.data, partial=True
-        )
+        serializer = IssueLinkSerializer(issue_link, data=request.data, partial=True)
         if serializer.is_valid():
             serializer.save()
             issue_activity.delay(
@@ -571,10 +422,7 @@ class IssueLinkAPIEndpoint(BaseAPIView):
     def delete(self, request, slug, project_id, issue_id, pk):
         issue_link = IssueLink.objects.get(
-            workspace__slug=slug,
-            project_id=project_id,
-            issue_id=issue_id,
-            pk=pk,
+            workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
         )
         current_instance = json.dumps(
             IssueLinkSerializer(issue_link).data,
@@ -609,17 +457,14 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
     def get_queryset(self):
         return (
-            IssueComment.objects.filter(
-                workspace__slug=self.kwargs.get("slug")
-            )
+            IssueComment.objects.filter(workspace__slug=self.kwargs.get("slug"))
             .filter(project_id=self.kwargs.get("project_id"))
             .filter(issue_id=self.kwargs.get("issue_id"))
-            .filter(
-                project__project_projectmember__member=self.request.user,
-                project__project_projectmember__is_active=True,
-            )
-            .filter(project__archived_at__isnull=True)
-            .select_related("workspace", "project", "issue", "actor")
+            .filter(project__project_projectmember__member=self.request.user)
+            .select_related("project")
+            .select_related("workspace")
+            .select_related("issue")
+            .select_related("actor")
             .annotate(
                 is_member=Exists(
                     ProjectMember.objects.filter(
@@ -655,31 +500,6 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
             )

     def post(self, request, slug, project_id, issue_id):
-        # Validation check if the issue already exists
-        if (
-            request.data.get("external_id")
-            and request.data.get("external_source")
-            and IssueComment.objects.filter(
-                project_id=project_id,
-                workspace__slug=slug,
-                external_source=request.data.get("external_source"),
-                external_id=request.data.get("external_id"),
-            ).exists()
-        ):
-            issue_comment = IssueComment.objects.filter(
-                workspace__slug=slug,
-                project_id=project_id,
-                external_id=request.data.get("external_id"),
-                external_source=request.data.get("external_source"),
-            ).first()
-            return Response(
-                {
-                    "error": "Issue Comment with the same external id and external source already exists",
-                    "id": str(issue_comment.id),
-                },
-                status=status.HTTP_409_CONFLICT,
-            )
         serializer = IssueCommentSerializer(data=request.data)
         if serializer.is_valid():
             serializer.save(
@@ -689,9 +509,7 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
             )
             issue_activity.delay(
                 type="comment.activity.created",
-                requested_data=json.dumps(
-                    serializer.data, cls=DjangoJSONEncoder
-                ),
+                requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
                 actor_id=str(self.request.user.id),
                 issue_id=str(self.kwargs.get("issue_id")),
                 project_id=str(self.kwargs.get("project_id")),
@@ -703,41 +521,13 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
     def patch(self, request, slug, project_id, issue_id, pk):
         issue_comment = IssueComment.objects.get(
-            workspace__slug=slug,
-            project_id=project_id,
-            issue_id=issue_id,
-            pk=pk,
+            workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
         )
         requested_data = json.dumps(self.request.data, cls=DjangoJSONEncoder)
         current_instance = json.dumps(
             IssueCommentSerializer(issue_comment).data,
             cls=DjangoJSONEncoder,
         )
-        # Validation check if the issue already exists
-        if (
-            request.data.get("external_id")
-            and (
-                issue_comment.external_id
-                != str(request.data.get("external_id"))
-            )
-            and IssueComment.objects.filter(
-                project_id=project_id,
-                workspace__slug=slug,
-                external_source=request.data.get(
-                    "external_source", issue_comment.external_source
-                ),
-                external_id=request.data.get("external_id"),
-            ).exists()
-        ):
-            return Response(
-                {
-                    "error": "Issue Comment with the same external id and external source already exists",
-                    "id": str(issue_comment.id),
-                },
-                status=status.HTTP_409_CONFLICT,
-            )
         serializer = IssueCommentSerializer(
             issue_comment, data=request.data, partial=True
         )
@@ -757,10 +547,7 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
     def delete(self, request, slug, project_id, issue_id, pk):
         issue_comment = IssueComment.objects.get(
-            workspace__slug=slug,
-            project_id=project_id,
-            issue_id=issue_id,
-            pk=pk,
+            workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
         )
         current_instance = json.dumps(
             IssueCommentSerializer(issue_comment).data,
@@ -792,12 +579,10 @@ class IssueActivityAPIEndpoint(BaseAPIView):
             .filter(
                 ~Q(field__in=["comment", "vote", "reaction", "draft"]),
                 project__project_projectmember__member=self.request.user,
-                project__project_projectmember__is_active=True,
             )
-            .filter(project__archived_at__isnull=True)
             .select_related("actor", "workspace", "issue", "project")
         ).order_by(request.GET.get("order_by", "created_at"))

         if pk:
             issue_activities = issue_activities.get(pk=pk)
             serializer = IssueActivitySerializer(issue_activities)
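The priority-ordering hunks in `IssueAPIEndpoint.get` above keep a base order and reverse it for a descending `order_by` parameter. A pure-Python sketch of that logic (the endpoint does this with Django `Case`/`When` annotations; `sort_by_priority` is a hypothetical helper for illustration only):

```python
# Base order from the endpoint; reversed when the client passes "-priority".
PRIORITY_ORDER = ["urgent", "high", "medium", "low", "none"]


def sort_by_priority(issues, order_by_param="priority"):
    # Mirrors: priority_order if order_by_param == "priority" else priority_order[::-1]
    order = PRIORITY_ORDER if order_by_param == "priority" else PRIORITY_ORDER[::-1]
    rank = {priority: index for index, priority in enumerate(order)}
    # Plain sort stands in for the Case/When annotation + .order_by().
    return sorted(issues, key=lambda issue: rank[issue["priority"]])


issues = [{"priority": "low"}, {"priority": "urgent"}, {"priority": "medium"}]
print(sort_by_priority(issues))
# [{'priority': 'urgent'}, {'priority': 'medium'}, {'priority': 'low'}]
```

The same reversal pattern appears again for `max_values`, where the leading `-` on the parameter flips the sort direction.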


@@ -2,33 +2,32 @@
 import json

 # Django imports
-from django.core import serializers
-from django.db.models import Count, F, Func, OuterRef, Prefetch, Q
+from django.db.models import Count, Prefetch, Q, F, Func, OuterRef
 from django.utils import timezone
+from django.core import serializers

 # Third party imports
 from rest_framework import status
 from rest_framework.response import Response

 # Module imports
-from plane.api.serializers import (
-    IssueSerializer,
-    ModuleIssueSerializer,
-    ModuleSerializer,
-)
+from .base import BaseAPIView, WebhookMixin
 from plane.app.permissions import ProjectEntityPermission
-from plane.bgtasks.issue_activites_task import issue_activity
 from plane.db.models import (
+    Project,
+    Module,
+    ModuleLink,
     Issue,
+    ModuleIssue,
     IssueAttachment,
     IssueLink,
-    Module,
-    ModuleIssue,
-    ModuleLink,
-    Project,
 )
-
-from .base import BaseAPIView, WebhookMixin
+from plane.api.serializers import (
+    ModuleSerializer,
+    ModuleIssueSerializer,
+    IssueSerializer,
+)
+from plane.bgtasks.issue_activites_task import issue_activity


 class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
@@ -56,9 +55,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
             .prefetch_related(
                 Prefetch(
                     "link_module",
-                    queryset=ModuleLink.objects.select_related(
-                        "module", "created_by"
-                    ),
+                    queryset=ModuleLink.objects.select_related("module", "created_by"),
                 )
             )
             .annotate(
@@ -68,7 +65,6 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
                         issue_module__issue__archived_at__isnull=True,
                         issue_module__issue__is_draft=False,
                     ),
-                    distinct=True,
                 ),
             )
             .annotate(
@@ -79,7 +75,6 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
                         issue_module__issue__archived_at__isnull=True,
                         issue_module__issue__is_draft=False,
                     ),
-                    distinct=True,
                 )
             )
             .annotate(
@@ -90,7 +85,6 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
                         issue_module__issue__archived_at__isnull=True,
                         issue_module__issue__is_draft=False,
                     ),
-                    distinct=True,
                 )
             )
             .annotate(
@@ -101,7 +95,6 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
                         issue_module__issue__archived_at__isnull=True,
                         issue_module__issue__is_draft=False,
                     ),
-                    distinct=True,
                 )
             )
             .annotate(
@@ -112,7 +105,6 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
                         issue_module__issue__archived_at__isnull=True,
                         issue_module__issue__is_draft=False,
                     ),
-                    distinct=True,
                 )
             )
             .annotate(
@@ -123,95 +115,32 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
                         issue_module__issue__archived_at__isnull=True,
                         issue_module__issue__is_draft=False,
                     ),
-                    distinct=True,
                 )
             )
             .order_by(self.kwargs.get("order_by", "-created_at"))
         )
     def post(self, request, slug, project_id):
-        project = Project.objects.get(pk=project_id, workspace__slug=slug)
-        serializer = ModuleSerializer(
-            data=request.data,
-            context={
-                "project_id": project_id,
-                "workspace_id": project.workspace_id,
-            },
-        )
+        project = Project.objects.get(workspace__slug=slug, pk=project_id)
+        serializer = ModuleSerializer(data=request.data, context={"project": project})
         if serializer.is_valid():
-            if (
-                request.data.get("external_id")
-                and request.data.get("external_source")
-                and Module.objects.filter(
-                    project_id=project_id,
-                    workspace__slug=slug,
-                    external_source=request.data.get("external_source"),
-                    external_id=request.data.get("external_id"),
-                ).exists()
-            ):
-                module = Module.objects.filter(
-                    project_id=project_id,
-                    workspace__slug=slug,
-                    external_source=request.data.get("external_source"),
-                    external_id=request.data.get("external_id"),
-                ).first()
-                return Response(
-                    {
-                        "error": "Module with the same external id and external source already exists",
-                        "id": str(module.id),
-                    },
-                    status=status.HTTP_409_CONFLICT,
-                )
             serializer.save()
             module = Module.objects.get(pk=serializer.data["id"])
             serializer = ModuleSerializer(module)
             return Response(serializer.data, status=status.HTTP_201_CREATED)
         return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

     def patch(self, request, slug, project_id, pk):
-        module = Module.objects.get(
-            pk=pk, project_id=project_id, workspace__slug=slug
-        )
-        if module.archived_at:
-            return Response(
-                {"error": "Archived module cannot be edited"},
-                status=status.HTTP_400_BAD_REQUEST,
-            )
-        serializer = ModuleSerializer(
-            module,
-            data=request.data,
-            context={"project_id": project_id},
-            partial=True,
-        )
+        module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
+        serializer = ModuleSerializer(module, data=request.data)
         if serializer.is_valid():
-            if (
-                request.data.get("external_id")
-                and (module.external_id != request.data.get("external_id"))
-                and Module.objects.filter(
-                    project_id=project_id,
-                    workspace__slug=slug,
-                    external_source=request.data.get(
-                        "external_source", module.external_source
-                    ),
-                    external_id=request.data.get("external_id"),
-                ).exists()
-            ):
-                return Response(
-                    {
-                        "error": "Module with the same external id and external source already exists",
-                        "id": str(module.id),
-                    },
-                    status=status.HTTP_409_CONFLICT,
-                )
             serializer.save()
-            return Response(serializer.data, status=status.HTTP_200_OK)
+            return Response(serializer.data, status=status.HTTP_201_CREATED)
         return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

     def get(self, request, slug, project_id, pk=None):
         if pk:
-            queryset = (
-                self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
-            )
+            queryset = self.get_queryset().get(pk=pk)
             data = ModuleSerializer(
                 queryset,
                 fields=self.fields,
@@ -223,7 +152,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
         )
         return self.paginate(
             request=request,
-            queryset=(self.get_queryset().filter(archived_at__isnull=True)),
+            queryset=(self.get_queryset()),
             on_results=lambda modules: ModuleSerializer(
                 modules,
                 many=True,
@@ -233,13 +162,9 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
         )

     def delete(self, request, slug, project_id, pk):
-        module = Module.objects.get(
-            workspace__slug=slug, project_id=project_id, pk=pk
-        )
+        module = Module.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
         module_issues = list(
-            ModuleIssue.objects.filter(module_id=pk).values_list(
-                "issue", flat=True
-            )
+            ModuleIssue.objects.filter(module_id=pk).values_list("issue", flat=True)
         )
         issue_activity.delay(
             type="module.activity.deleted",
@@ -279,9 +204,7 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
     def get_queryset(self):
         return (
             ModuleIssue.objects.annotate(
-                sub_issues_count=Issue.issue_objects.filter(
-                    parent=OuterRef("issue")
-                )
+                sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue"))
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
@@ -289,11 +212,7 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
             .filter(workspace__slug=self.kwargs.get("slug"))
             .filter(project_id=self.kwargs.get("project_id"))
             .filter(module_id=self.kwargs.get("module_id"))
-            .filter(
-                project__project_projectmember__member=self.request.user,
-                project__project_projectmember__is_active=True,
-            )
-            .filter(project__archived_at__isnull=True)
+            .filter(project__project_projectmember__member=self.request.user)
             .select_related("project")
             .select_related("workspace")
             .select_related("module")
@@ -309,9 +228,7 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
         issues = (
             Issue.issue_objects.filter(issue_module__module_id=module_id)
             .annotate(
-                sub_issues_count=Issue.issue_objects.filter(
-                    parent=OuterRef("id")
-                )
+                sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
@@ -333,9 +250,7 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
                 .values("count")
             )
             .annotate(
-                attachment_count=IssueAttachment.objects.filter(
-                    issue=OuterRef("id")
-                )
+                attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
@@ -356,8 +271,7 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
         issues = request.data.get("issues", [])
         if not len(issues):
             return Response(
-                {"error": "Issues are required"},
-                status=status.HTTP_400_BAD_REQUEST,
+                {"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
             )
         module = Module.objects.get(
             workspace__slug=slug, project_id=project_id, pk=module_id
@@ -440,10 +354,7 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
     def delete(self, request, slug, project_id, module_id, issue_id):
         module_issue = ModuleIssue.objects.get(
-            workspace__slug=slug,
-            project_id=project_id,
-            module_id=module_id,
-            issue_id=issue_id,
+            workspace__slug=slug, project_id=project_id, module_id=module_id, issue_id=issue_id
         )
         module_issue.delete()
         issue_activity.delay(
@@ -460,124 +371,4 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
             current_instance=None,
             epoch=int(timezone.now().timestamp()),
         )
         return Response(status=status.HTTP_204_NO_CONTENT)
-
-
-class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
-    permission_classes = [
-        ProjectEntityPermission,
-    ]
-
-    def get_queryset(self):
-        return (
-            Module.objects.filter(project_id=self.kwargs.get("project_id"))
-            .filter(workspace__slug=self.kwargs.get("slug"))
-            .filter(archived_at__isnull=False)
-            .select_related("project")
-            .select_related("workspace")
-            .select_related("lead")
-            .prefetch_related("members")
-            .prefetch_related(
-                Prefetch(
-                    "link_module",
-                    queryset=ModuleLink.objects.select_related(
-                        "module", "created_by"
-                    ),
-                )
-            )
-            .annotate(
-                total_issues=Count(
-                    "issue_module",
-                    filter=Q(
-                        issue_module__issue__archived_at__isnull=True,
-                        issue_module__issue__is_draft=False,
-                    ),
-                    distinct=True,
-                ),
-            )
-            .annotate(
-                completed_issues=Count(
-                    "issue_module__issue__state__group",
-                    filter=Q(
-                        issue_module__issue__state__group="completed",
-                        issue_module__issue__archived_at__isnull=True,
-                        issue_module__issue__is_draft=False,
-                    ),
-                    distinct=True,
-                )
-            )
-            .annotate(
-                cancelled_issues=Count(
-                    "issue_module__issue__state__group",
-                    filter=Q(
-                        issue_module__issue__state__group="cancelled",
-                        issue_module__issue__archived_at__isnull=True,
-                        issue_module__issue__is_draft=False,
-                    ),
-                    distinct=True,
-                )
-            )
-            .annotate(
-                started_issues=Count(
-                    "issue_module__issue__state__group",
-                    filter=Q(
-                        issue_module__issue__state__group="started",
-                        issue_module__issue__archived_at__isnull=True,
-                        issue_module__issue__is_draft=False,
-                    ),
-                    distinct=True,
-                )
-            )
-            .annotate(
-                unstarted_issues=Count(
-                    "issue_module__issue__state__group",
-                    filter=Q(
-                        issue_module__issue__state__group="unstarted",
-                        issue_module__issue__archived_at__isnull=True,
-                        issue_module__issue__is_draft=False,
-                    ),
-                    distinct=True,
-                )
-            )
-            .annotate(
-                backlog_issues=Count(
-                    "issue_module__issue__state__group",
-                    filter=Q(
-                        issue_module__issue__state__group="backlog",
-                        issue_module__issue__archived_at__isnull=True,
-                        issue_module__issue__is_draft=False,
-                    ),
-                    distinct=True,
-                )
-            )
-            .order_by(self.kwargs.get("order_by", "-created_at"))
-        )
-
-    def get(self, request, slug, project_id):
-        return self.paginate(
-            request=request,
-            queryset=(self.get_queryset()),
-            on_results=lambda modules: ModuleSerializer(
-                modules,
-                many=True,
-                fields=self.fields,
-                expand=self.expand,
-            ).data,
-        )
-
-    def post(self, request, slug, project_id, pk):
-        module = Module.objects.get(
-            pk=pk, project_id=project_id, workspace__slug=slug
-        )
-        module.archived_at = timezone.now()
-        module.save()
-        return Response(status=status.HTTP_204_NO_CONTENT)
-
-    def delete(self, request, slug, project_id, pk):
-        module = Module.objects.get(
-            pk=pk, project_id=project_id, workspace__slug=slug
-        )
-        module.archived_at = None
-        module.save()
-        return Response(status=status.HTTP_204_NO_CONTENT)


@@ -1,29 +1,27 @@
 # Django imports
 from django.db import IntegrityError
-from django.db.models import Exists, F, Func, OuterRef, Prefetch, Q, Subquery
-from django.utils import timezone
+from django.db.models import Exists, OuterRef, Q, F, Func, Subquery, Prefetch

 # Third party imports
 from rest_framework import status
 from rest_framework.response import Response
 from rest_framework.serializers import ValidationError

-from plane.api.serializers import ProjectSerializer
-from plane.app.permissions import ProjectBasePermission
-
 # Module imports
 from plane.db.models import (
-    Cycle,
-    Inbox,
-    IssueProperty,
-    Module,
-    Project,
-    ProjectDeployBoard,
-    ProjectMember,
-    State,
     Workspace,
+    Project,
+    ProjectFavorite,
+    ProjectMember,
+    ProjectDeployBoard,
+    State,
+    Cycle,
+    Module,
+    IssueProperty,
+    Inbox,
 )
+from plane.app.permissions import ProjectBasePermission
+from plane.api.serializers import ProjectSerializer
 from .base import BaseAPIView, WebhookMixin
@@ -41,18 +39,9 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
     def get_queryset(self):
         return (
             Project.objects.filter(workspace__slug=self.kwargs.get("slug"))
-            .filter(
-                Q(
-                    project_projectmember__member=self.request.user,
-                    project_projectmember__is_active=True,
-                )
-                | Q(network=2)
-            )
+            .filter(Q(project_projectmember__member=self.request.user) | Q(network=2))
             .select_related(
-                "workspace",
-                "workspace__owner",
-                "default_assignee",
-                "project_lead",
+                "workspace", "workspace__owner", "default_assignee", "project_lead"
             )
             .annotate(
                 is_member=Exists(
@@ -105,8 +94,8 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
             .distinct()
         )

-    def get(self, request, slug, pk=None):
-        if pk is None:
+    def get(self, request, slug, project_id=None):
+        if project_id is None:
             sort_order_query = ProjectMember.objects.filter(
                 member=request.user,
                 project_id=OuterRef("pk"),
@@ -131,18 +120,11 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
                 request=request,
                 queryset=(projects),
                 on_results=lambda projects: ProjectSerializer(
-                    projects,
-                    many=True,
-                    fields=self.fields,
-                    expand=self.expand,
+                    projects, many=True, fields=self.fields, expand=self.expand,
                 ).data,
             )
-        project = self.get_queryset().get(workspace__slug=slug, pk=pk)
-        serializer = ProjectSerializer(
-            project,
-            fields=self.fields,
-            expand=self.expand,
-        )
+        project = self.get_queryset().get(workspace__slug=slug, pk=project_id)
+        serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand,)
         return Response(serializer.data, status=status.HTTP_200_OK)

     def post(self, request, slug):
@@ -155,10 +137,8 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
             serializer.save()

             # Add the user as Administrator to the project
-            _ = ProjectMember.objects.create(
-                project_id=serializer.data["id"],
-                member=request.user,
-                role=20,
+            project_member = ProjectMember.objects.create(
+                project_id=serializer.data["id"], member=request.user, role=20
             )
             # Also create the issue property for the user
             _ = IssueProperty.objects.create(
@@ -231,15 +211,9 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
                 ]
             )

-            project = (
-                self.get_queryset()
-                .filter(pk=serializer.data["id"])
-                .first()
-            )
+            project = self.get_queryset().filter(pk=serializer.data["id"]).first()
             serializer = ProjectSerializer(project)
-            return Response(
-                serializer.data, status=status.HTTP_201_CREATED
-            )
+            return Response(serializer.data, status=status.HTTP_201_CREATED)
         return Response(
             serializer.errors,
             status=status.HTTP_400_BAD_REQUEST,
@@ -250,27 +224,20 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
                 {"name": "The project name is already taken"},
                 status=status.HTTP_410_GONE,
             )
-        except Workspace.DoesNotExist:
+        except Workspace.DoesNotExist as e:
             return Response(
-                {"error": "Workspace does not exist"},
-                status=status.HTTP_404_NOT_FOUND,
+                {"error": "Workspace does not exist"}, status=status.HTTP_404_NOT_FOUND
             )
-        except ValidationError:
+        except ValidationError as e:
             return Response(
                 {"identifier": "The project identifier is already taken"},
                 status=status.HTTP_410_GONE,
             )

-    def patch(self, request, slug, pk):
+    def patch(self, request, slug, project_id=None):
         try:
             workspace = Workspace.objects.get(slug=slug)
-            project = Project.objects.get(pk=pk)
-
-            if project.archived_at:
-                return Response(
-                    {"error": "Archived project cannot be updated"},
-                    status=status.HTTP_400_BAD_REQUEST,
-                )
+            project = Project.objects.get(pk=project_id)

             serializer = ProjectSerializer(
                 project,
@@ -283,31 +250,22 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
                 serializer.save()

                 if serializer.data["inbox_view"]:
                     Inbox.objects.get_or_create(
-                        name=f"{project.name} Inbox",
-                        project=project,
-                        is_default=True,
+                        name=f"{project.name} Inbox", project=project, is_default=True
                     )

                     # Create the triage state in Backlog group
                     State.objects.get_or_create(
                         name="Triage",
-                        group="triage",
+                        group="backlog",
                         description="Default state for managing all Inbox Issues",
-                        project_id=pk,
+                        project_id=project_id,
                         color="#ff7700",
-                        is_triage=True,
                     )

-                project = (
-                    self.get_queryset()
-                    .filter(pk=serializer.data["id"])
-                    .first()
-                )
+                project = self.get_queryset().filter(pk=serializer.data["id"]).first()
                 serializer = ProjectSerializer(project)
                 return Response(serializer.data, status=status.HTTP_200_OK)
-            return Response(
-                serializer.errors, status=status.HTTP_400_BAD_REQUEST
-            )
+            return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
         except IntegrityError as e:
             if "already exists" in str(e):
                 return Response(
@@ -316,35 +274,15 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
                 )
         except (Project.DoesNotExist, Workspace.DoesNotExist):
             return Response(
-                {"error": "Project does not exist"},
-                status=status.HTTP_404_NOT_FOUND,
+                {"error": "Project does not exist"}, status=status.HTTP_404_NOT_FOUND
             )
-        except ValidationError:
+        except ValidationError as e:
             return Response(
                 {"identifier": "The project identifier is already taken"},
                 status=status.HTTP_410_GONE,
             )

-    def delete(self, request, slug, pk):
-        project = Project.objects.get(pk=pk, workspace__slug=slug)
-        project.delete()
-        return Response(status=status.HTTP_204_NO_CONTENT)
-
-
-class ProjectArchiveUnarchiveAPIEndpoint(BaseAPIView):
-    permission_classes = [
-        ProjectBasePermission,
-    ]
-
-    def post(self, request, slug, project_id):
-        project = Project.objects.get(pk=project_id, workspace__slug=slug)
-        project.archived_at = timezone.now()
-        project.save()
-        return Response(status=status.HTTP_204_NO_CONTENT)
-
     def delete(self, request, slug, project_id):
         project = Project.objects.get(pk=project_id, workspace__slug=slug)
-        project.archived_at = None
-        project.save()
-        return Response(status=status.HTTP_204_NO_CONTENT)
+        project.delete()
+        return Response(status=status.HTTP_204_NO_CONTENT)


@@ -1,16 +1,18 @@
-# Python imports
-from itertools import groupby
-
 # Django imports
-from django.db import IntegrityError
+from django.db.models import Q

 # Third party imports
-from rest_framework import status
 from rest_framework.response import Response
+from rest_framework import status

-from plane.api.serializers import StateSerializer
-from plane.app.permissions import ProjectEntityPermission
-from plane.db.models import Issue, State

 # Module imports
 from .base import BaseAPIView
+from plane.api.serializers import StateSerializer
+from plane.app.permissions import ProjectEntityPermission
+from plane.db.models import State, Issue


 class StateAPIEndpoint(BaseAPIView):
@@ -24,73 +26,23 @@ class StateAPIEndpoint(BaseAPIView):
         return (
             State.objects.filter(workspace__slug=self.kwargs.get("slug"))
             .filter(project_id=self.kwargs.get("project_id"))
-            .filter(
-                project__project_projectmember__member=self.request.user,
-                project__project_projectmember__is_active=True,
-            )
-            .filter(is_triage=False)
-            .filter(project__archived_at__isnull=True)
+            .filter(project__project_projectmember__member=self.request.user)
+            .filter(~Q(name="Triage"))
             .select_related("project")
             .select_related("workspace")
             .distinct()
         )

     def post(self, request, slug, project_id):
-        try:
-            serializer = StateSerializer(
-                data=request.data, context={"project_id": project_id}
-            )
-            if serializer.is_valid():
-                if (
-                    request.data.get("external_id")
-                    and request.data.get("external_source")
-                    and State.objects.filter(
-                        project_id=project_id,
-                        workspace__slug=slug,
-                        external_source=request.data.get("external_source"),
-                        external_id=request.data.get("external_id"),
-                    ).exists()
-                ):
-                    state = State.objects.filter(
-                        workspace__slug=slug,
-                        project_id=project_id,
-                        external_id=request.data.get("external_id"),
-                        external_source=request.data.get("external_source"),
-                    ).first()
-                    return Response(
-                        {
-                            "error": "State with the same external id and external source already exists",
-                            "id": str(state.id),
-                        },
-                        status=status.HTTP_409_CONFLICT,
-                    )
-                serializer.save(project_id=project_id)
-                return Response(serializer.data, status=status.HTTP_200_OK)
-            return Response(
-                serializer.errors, status=status.HTTP_400_BAD_REQUEST
-            )
-        except IntegrityError:
-            state = State.objects.filter(
-                workspace__slug=slug,
-                project_id=project_id,
-                name=request.data.get("name"),
-            ).first()
-            return Response(
-                {
-                    "error": "State with the same name already exists in the project",
-                    "id": str(state.id),
-                },
-                status=status.HTTP_409_CONFLICT,
-            )
+        serializer = StateSerializer(data=request.data, context={"project_id": project_id})
+        if serializer.is_valid():
+            serializer.save(project_id=project_id)
+            return Response(serializer.data, status=status.HTTP_200_OK)
+        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

     def get(self, request, slug, project_id, state_id=None):
         if state_id:
-            serializer = StateSerializer(
-                self.get_queryset().get(pk=state_id),
-                fields=self.fields,
-                expand=self.expand,
-            )
+            serializer = StateSerializer(self.get_queryset().get(pk=state_id))
             return Response(serializer.data, status=status.HTTP_200_OK)
         return self.paginate(
             request=request,
@@ -105,26 +57,21 @@ class StateAPIEndpoint(BaseAPIView):
     def delete(self, request, slug, project_id, state_id):
         state = State.objects.get(
-            is_triage=False,
+            ~Q(name="Triage"),
             pk=state_id,
             project_id=project_id,
             workspace__slug=slug,
         )

         if state.default:
-            return Response(
-                {"error": "Default state cannot be deleted"},
-                status=status.HTTP_400_BAD_REQUEST,
-            )
+            return Response({"error": "Default state cannot be deleted"}, status=status.HTTP_400_BAD_REQUEST)

         # Check for any issues in the state
         issue_exist = Issue.issue_objects.filter(state=state_id).exists()

         if issue_exist:
             return Response(
-                {
-                    "error": "The state is not empty, only empty states can be deleted"
-                },
+                {"error": "The state is not empty, only empty states can be deleted"},
                 status=status.HTTP_400_BAD_REQUEST,
             )
@@ -132,30 +79,9 @@ class StateAPIEndpoint(BaseAPIView):
         return Response(status=status.HTTP_204_NO_CONTENT)

     def patch(self, request, slug, project_id, state_id=None):
-        state = State.objects.get(
-            workspace__slug=slug, project_id=project_id, pk=state_id
-        )
+        state = State.objects.get(workspace__slug=slug, project_id=project_id, pk=state_id)
         serializer = StateSerializer(state, data=request.data, partial=True)
         if serializer.is_valid():
-            if (
-                str(request.data.get("external_id"))
-                and (state.external_id != str(request.data.get("external_id")))
-                and State.objects.filter(
-                    project_id=project_id,
-                    workspace__slug=slug,
-                    external_source=request.data.get(
-                        "external_source", state.external_source
-                    ),
-                    external_id=request.data.get("external_id"),
-                ).exists()
-            ):
-                return Response(
-                    {
-                        "error": "State with the same external id and external source already exists",
-                        "id": str(state.id),
-                    },
-                    status=status.HTTP_409_CONFLICT,
-                )
             serializer.save()
             return Response(serializer.data, status=status.HTTP_200_OK)
         return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)


@@ -25,10 +25,7 @@ class APIKeyAuthentication(authentication.BaseAuthentication):
     def validate_api_token(self, token):
         try:
             api_token = APIToken.objects.get(
-                Q(
-                    Q(expired_at__gt=timezone.now())
-                    | Q(expired_at__isnull=True)
-                ),
+                Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
                 token=token,
                 is_active=True,
             )
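The lookup above accepts a token only when it is active and either has no expiry (`expired_at__isnull=True`) or expires in the future. A plain-Python restatement of that `Q` expression, as a standalone helper rather than the actual `APIKeyAuthentication` class:

```python
from datetime import datetime, timedelta, timezone


def token_is_valid(expired_at, is_active, now=None):
    """Mirror the ORM filter: active, and either never expiring
    (expired_at is None) or expiring strictly in the future."""
    if now is None:
        now = datetime.now(timezone.utc)
    return is_active and (expired_at is None or expired_at > now)


now = datetime.now(timezone.utc)
```

Wrapping the two expiry conditions in a single outer `Q(...)` keeps the OR group separate from the AND-ed `token` and `is_active` conditions.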


@@ -1,3 +1,4 @@
+
 from .workspace import (
     WorkSpaceBasePermission,
     WorkspaceOwnerPermission,
@@ -12,3 +13,5 @@ from .project import (
     ProjectMemberPermission,
     ProjectLitePermission,
 )
+
+


@@ -1,8 +1,8 @@
 # Third Party imports
-from rest_framework.permissions import SAFE_METHODS, BasePermission
+from rest_framework.permissions import BasePermission, SAFE_METHODS

 # Module import
-from plane.db.models import ProjectMember, WorkspaceMember
+from plane.db.models import WorkspaceMember, ProjectMember

 # Permission Mappings
 Admin = 20


@@ -17,7 +17,6 @@ from .workspace import (
     WorkspaceThemeSerializer,
     WorkspaceMemberAdminSerializer,
     WorkspaceMemberMeSerializer,
-    WorkspaceUserPropertiesSerializer,
 )
 from .project import (
     ProjectSerializer,
@@ -32,20 +31,14 @@ from .project import (
     ProjectDeployBoardSerializer,
     ProjectMemberAdminSerializer,
     ProjectPublicMemberSerializer,
-    ProjectMemberRoleSerializer,
 )
 from .state import StateSerializer, StateLiteSerializer
-from .view import (
-    GlobalViewSerializer,
-    IssueViewSerializer,
-    IssueViewFavoriteSerializer,
-)
+from .view import GlobalViewSerializer, IssueViewSerializer, IssueViewFavoriteSerializer
 from .cycle import (
     CycleSerializer,
     CycleIssueSerializer,
     CycleFavoriteSerializer,
     CycleWriteSerializer,
-    CycleUserPropertiesSerializer,
 )
 from .asset import FileAssetSerializer
 from .issue import (
@@ -59,7 +52,6 @@ from .issue import (
     IssueFlatSerializer,
     IssueStateSerializer,
     IssueLinkSerializer,
-    IssueInboxSerializer,
     IssueLiteSerializer,
     IssueAttachmentSerializer,
     IssueSubscriberSerializer,
@@ -69,57 +61,44 @@ from .issue import (
     IssueRelationSerializer,
     RelatedIssueSerializer,
     IssuePublicSerializer,
-    IssueDetailSerializer,
-    IssueReactionLiteSerializer,
-    IssueAttachmentLiteSerializer,
-    IssueLinkLiteSerializer,
 )
 from .module import (
-    ModuleDetailSerializer,
     ModuleWriteSerializer,
     ModuleSerializer,
     ModuleIssueSerializer,
     ModuleLinkSerializer,
     ModuleFavoriteSerializer,
-    ModuleUserPropertiesSerializer,
 )
 from .api import APITokenSerializer, APITokenReadSerializer
+from .integration import (
+    IntegrationSerializer,
+    WorkspaceIntegrationSerializer,
+    GithubIssueSyncSerializer,
+    GithubRepositorySerializer,
+    GithubRepositorySyncSerializer,
+    GithubCommentSyncSerializer,
+    SlackProjectSyncSerializer,
+)
 from .importer import ImporterSerializer
-from .page import (
-    PageSerializer,
-    PageLogSerializer,
-    SubPageSerializer,
-    PageFavoriteSerializer,
-)
+from .page import PageSerializer, PageLogSerializer, SubPageSerializer, PageFavoriteSerializer
 from .estimate import (
     EstimateSerializer,
     EstimatePointSerializer,
     EstimateReadSerializer,
-    WorkspaceEstimateSerializer,
 )
-from .inbox import (
-    InboxSerializer,
-    InboxIssueSerializer,
-    IssueStateInboxSerializer,
-    InboxIssueLiteSerializer,
-    InboxIssueDetailSerializer,
-)
+from .inbox import InboxSerializer, InboxIssueSerializer, IssueStateInboxSerializer
 from .analytic import AnalyticViewSerializer
-from .notification import (
-    NotificationSerializer,
-    UserNotificationPreferenceSerializer,
-)
+from .notification import NotificationSerializer
 from .exporter import ExporterHistorySerializer
 from .webhook import WebhookSerializer, WebhookLogSerializer
-from .dashboard import DashboardSerializer, WidgetSerializer


@@ -3,6 +3,7 @@ from plane.db.models import APIToken, APIActivityLog
+
 class APITokenSerializer(BaseSerializer):
     class Meta:
         model = APIToken
         fields = "__all__"
@@ -17,12 +18,14 @@ class APITokenSerializer(BaseSerializer):
 class APITokenReadSerializer(BaseSerializer):
     class Meta:
         model = APIToken
-        exclude = ("token",)
+        exclude = ('token',)
+
+
 class APIActivityLogSerializer(BaseSerializer):
     class Meta:
         model = APIActivityLog
         fields = "__all__"


@@ -4,17 +4,16 @@ from rest_framework import serializers

 class BaseSerializer(serializers.ModelSerializer):
     id = serializers.PrimaryKeyRelatedField(read_only=True)


 class DynamicBaseSerializer(BaseSerializer):
     def __init__(self, *args, **kwargs):
         # If 'fields' is provided in the arguments, remove it and store it separately.
         # This is done so as not to pass this custom argument up to the superclass.
-        fields = kwargs.pop("fields", [])
-        self.expand = kwargs.pop("expand", []) or []
-        fields = self.expand
+        fields = kwargs.pop("fields", None)

         # Call the initialization of the superclass.
         super().__init__(*args, **kwargs)

         # If 'fields' was provided, filter the fields of the serializer accordingly.
         if fields is not None:
             self.fields = self._filter_fields(fields)
@@ -32,7 +31,7 @@ class DynamicBaseSerializer(BaseSerializer):
             # loop through its keys and values.
             if isinstance(field_name, dict):
                 for key, value in field_name.items():
                     # If the value of this nested field is a list,
                     # perform a recursive filter on it.
                     if isinstance(value, list):
                         self._filter_fields(self.fields[key], value)
@@ -48,134 +47,12 @@ class DynamicBaseSerializer(BaseSerializer):
                 elif isinstance(item, dict):
                     allowed.append(list(item.keys())[0])

-        for field in allowed:
-            if field not in self.fields:
-                from . import (
-                    WorkspaceLiteSerializer,
-                    ProjectLiteSerializer,
-                    UserLiteSerializer,
-                    StateLiteSerializer,
-                    IssueSerializer,
-                    LabelSerializer,
-                    CycleIssueSerializer,
-                    IssueLiteSerializer,
-                    IssueRelationSerializer,
-                    InboxIssueLiteSerializer,
-                    IssueReactionLiteSerializer,
-                    IssueAttachmentLiteSerializer,
-                    IssueLinkLiteSerializer,
-                )
-
-                # Expansion mapper
-                expansion = {
-                    "user": UserLiteSerializer,
-                    "workspace": WorkspaceLiteSerializer,
-                    "project": ProjectLiteSerializer,
-                    "default_assignee": UserLiteSerializer,
-                    "project_lead": UserLiteSerializer,
-                    "state": StateLiteSerializer,
-                    "created_by": UserLiteSerializer,
-                    "issue": IssueSerializer,
-                    "actor": UserLiteSerializer,
-                    "owned_by": UserLiteSerializer,
-                    "members": UserLiteSerializer,
-                    "assignees": UserLiteSerializer,
-                    "labels": LabelSerializer,
-                    "issue_cycle": CycleIssueSerializer,
-                    "parent": IssueLiteSerializer,
-                    "issue_relation": IssueRelationSerializer,
-                    "issue_inbox": InboxIssueLiteSerializer,
-                    "issue_reactions": IssueReactionLiteSerializer,
-                    "issue_attachment": IssueAttachmentLiteSerializer,
-                    "issue_link": IssueLinkLiteSerializer,
-                    "sub_issues": IssueLiteSerializer,
-                }
-
-                self.fields[field] = expansion[field](
-                    many=(
-                        True
-                        if field
-                        in [
-                            "members",
-                            "assignees",
-                            "labels",
-                            "issue_cycle",
-                            "issue_relation",
-                            "issue_inbox",
-                            "issue_reactions",
-                            "issue_attachment",
-                            "issue_link",
-                            "sub_issues",
-                        ]
-                        else False
-                    )
-                )
+        # Convert the current serializer's fields and the allowed fields to sets.
+        existing = set(self.fields)
+        allowed = set(allowed)
+
+        # Remove fields from the serializer that aren't in the 'allowed' list.
+        for field_name in (existing - allowed):
+            self.fields.pop(field_name)

         return self.fields
-
-    def to_representation(self, instance):
-        response = super().to_representation(instance)
-
-        # Ensure 'expand' is iterable before processing
-        if self.expand:
-            for expand in self.expand:
-                if expand in self.fields:
-                    # Import all the expandable serializers
-                    from . import (
-                        WorkspaceLiteSerializer,
-                        ProjectLiteSerializer,
-                        UserLiteSerializer,
-                        StateLiteSerializer,
-                        IssueSerializer,
-                        LabelSerializer,
-                        CycleIssueSerializer,
-                        IssueRelationSerializer,
-                        InboxIssueLiteSerializer,
-                        IssueLiteSerializer,
-                        IssueReactionLiteSerializer,
-                        IssueAttachmentLiteSerializer,
-                        IssueLinkLiteSerializer,
-                    )
-
-                    # Expansion mapper
-                    expansion = {
-                        "user": UserLiteSerializer,
-                        "workspace": WorkspaceLiteSerializer,
-                        "project": ProjectLiteSerializer,
-                        "default_assignee": UserLiteSerializer,
-                        "project_lead": UserLiteSerializer,
-                        "state": StateLiteSerializer,
-                        "created_by": UserLiteSerializer,
-                        "issue": IssueSerializer,
-                        "actor": UserLiteSerializer,
-                        "owned_by": UserLiteSerializer,
-                        "members": UserLiteSerializer,
-                        "assignees": UserLiteSerializer,
-                        "labels": LabelSerializer,
-                        "issue_cycle": CycleIssueSerializer,
-                        "parent": IssueLiteSerializer,
-                        "issue_relation": IssueRelationSerializer,
-                        "issue_inbox": InboxIssueLiteSerializer,
-                        "issue_reactions": IssueReactionLiteSerializer,
-                        "issue_attachment": IssueAttachmentLiteSerializer,
-                        "issue_link": IssueLinkLiteSerializer,
-                        "sub_issues": IssueLiteSerializer,
-                    }
-                    # Check if field in expansion then expand the field
-                    if expand in expansion:
-                        if isinstance(response.get(expand), list):
-                            exp_serializer = expansion[expand](
-                                getattr(instance, expand), many=True
-                            )
-                        else:
-                            exp_serializer = expansion[expand](
-                                getattr(instance, expand)
-                            )
-                        response[expand] = exp_serializer.data
-                    else:
-                        # You might need to handle this case differently
-                        response[expand] = getattr(
-                            instance, f"{expand}_id", None
-                        )
-
-        return response
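The `+` side of `_filter_fields` above prunes the serializer's field mapping by set difference against the allowed list. A minimal sketch of that pruning step in isolation, using a plain dict in place of DRF's ordered field mapping:

```python
def filter_fields(field_map, allowed):
    """Drop every entry of field_map whose name is not in `allowed`,
    computed as a set difference, as in the rewritten _filter_fields."""
    for name in set(field_map) - set(allowed):
        field_map.pop(name)
    return field_map


# Illustrative field mapping; real DRF serializers hold Field instances here.
fields = {"id": 1, "name": 2, "internal_token": 3}
filter_fields(fields, ["id", "name"])
```

Iterating over the difference set, rather than over `field_map` itself, avoids mutating the dict while looping over it.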


@@ -3,13 +3,11 @@ from rest_framework import serializers
 # Module imports
 from .base import BaseSerializer
+from .user import UserLiteSerializer
 from .issue import IssueStateSerializer
-from plane.db.models import (
-    Cycle,
-    CycleIssue,
-    CycleFavorite,
-    CycleUserProperties,
-)
+from .workspace import WorkspaceLiteSerializer
+from .project import ProjectLiteSerializer
+from plane.db.models import Cycle, CycleIssue, CycleFavorite
@@ -19,67 +17,69 @@ class CycleWriteSerializer(BaseSerializer):
             and data.get("end_date", None) is not None
             and data.get("start_date", None) > data.get("end_date", None)
         ):
-            raise serializers.ValidationError(
-                "Start date cannot exceed end date"
-            )
+            raise serializers.ValidationError("Start date cannot exceed end date")
         return data

     class Meta:
         model = Cycle
         fields = "__all__"
+
+
+class CycleSerializer(BaseSerializer):
+    owned_by = UserLiteSerializer(read_only=True)
+    is_favorite = serializers.BooleanField(read_only=True)
+    total_issues = serializers.IntegerField(read_only=True)
+    cancelled_issues = serializers.IntegerField(read_only=True)
+    completed_issues = serializers.IntegerField(read_only=True)
+    started_issues = serializers.IntegerField(read_only=True)
+    unstarted_issues = serializers.IntegerField(read_only=True)
+    backlog_issues = serializers.IntegerField(read_only=True)
+    assignees = serializers.SerializerMethodField(read_only=True)
+    total_estimates = serializers.IntegerField(read_only=True)
+    completed_estimates = serializers.IntegerField(read_only=True)
+    started_estimates = serializers.IntegerField(read_only=True)
+    workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
+    project_detail = ProjectLiteSerializer(read_only=True, source="project")
+
+    def validate(self, data):
+        if (
+            data.get("start_date", None) is not None
+            and data.get("end_date", None) is not None
+            and data.get("start_date", None) > data.get("end_date", None)
+        ):
+            raise serializers.ValidationError("Start date cannot exceed end date")
+        return data
+
+    def get_assignees(self, obj):
+        members = [
+            {
+                "avatar": assignee.avatar,
+                "display_name": assignee.display_name,
+                "id": assignee.id,
+            }
+            for issue_cycle in obj.issue_cycle.prefetch_related(
+                "issue__assignees"
+            ).all()
+            for assignee in issue_cycle.issue.assignees.all()
+        ]
+        # Use a set comprehension to return only the unique objects
+        unique_objects = {frozenset(item.items()) for item in members}
+        # Convert the set back to a list of dictionaries
+        unique_list = [dict(item) for item in unique_objects]
+        return unique_list
+
+    class Meta:
+        model = Cycle
+        fields = "__all__"
         read_only_fields = [
             "workspace",
             "project",
             "owned_by",
-            "archived_at",
         ]
-
-
-class CycleSerializer(BaseSerializer):
-    # favorite
-    is_favorite = serializers.BooleanField(read_only=True)
-    total_issues = serializers.IntegerField(read_only=True)
-    # state group wise distribution
-    cancelled_issues = serializers.IntegerField(read_only=True)
-    completed_issues = serializers.IntegerField(read_only=True)
-    started_issues = serializers.IntegerField(read_only=True)
-    unstarted_issues = serializers.IntegerField(read_only=True)
-    backlog_issues = serializers.IntegerField(read_only=True)
-    # active | draft | upcoming | completed
-    status = serializers.CharField(read_only=True)
-
-    class Meta:
-        model = Cycle
-        fields = [
-            # necessary fields
-            "id",
-            "workspace_id",
-            "project_id",
-            # model fields
-            "name",
-            "description",
-            "start_date",
-            "end_date",
-            "owned_by_id",
-            "view_props",
-            "sort_order",
-            "external_source",
-            "external_id",
-            "progress_snapshot",
-            # meta fields
-            "is_favorite",
-            "total_issues",
-            "cancelled_issues",
-            "completed_issues",
-            "started_issues",
-            "unstarted_issues",
-            "backlog_issues",
-            "status",
-        ]
-        read_only_fields = fields


 class CycleIssueSerializer(BaseSerializer):
     issue_detail = IssueStateSerializer(read_only=True, source="issue")
     sub_issues_count = serializers.IntegerField(read_only=True)
@@ -105,14 +105,3 @@ class CycleFavoriteSerializer(BaseSerializer):
             "project",
             "user",
         ]
-
-
-class CycleUserPropertiesSerializer(BaseSerializer):
-    class Meta:
-        model = CycleUserProperties
-        fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "project",
-            "cycle",
-            "user",
-        ]
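The `get_assignees` method in the cycle serializer above dedupes assignee dicts with a frozenset comprehension: dicts are unhashable, so each dict is first frozen into a hashable set of items, deduped, then converted back. A standalone illustration of the trick:

```python
# Dicts are not hashable, so duplicates cannot be removed with set() directly.
# Freezing each dict's items makes it hashable; dict() restores the mapping.
# Note that set ordering is arbitrary, so the original order is not preserved.

members = [
    {"id": 1, "display_name": "alice", "avatar": ""},
    {"id": 2, "display_name": "bob", "avatar": ""},
    {"id": 1, "display_name": "alice", "avatar": ""},  # duplicate assignee
]

unique_objects = {frozenset(item.items()) for item in members}
unique_list = [dict(item) for item in unique_objects]
```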

View File

@@ -1,21 +0,0 @@
-# Module imports
-from .base import BaseSerializer
-from plane.db.models import Dashboard, Widget
-
-# Third party frameworks
-from rest_framework import serializers
-
-
-class DashboardSerializer(BaseSerializer):
-    class Meta:
-        model = Dashboard
-        fields = "__all__"
-
-
-class WidgetSerializer(BaseSerializer):
-    is_visible = serializers.BooleanField(read_only=True)
-    widget_filters = serializers.JSONField(read_only=True)
-
-    class Meta:
-        model = Widget
-        fields = ["id", "key", "is_visible", "widget_filters"]

View File

@@ -2,18 +2,11 @@
 from .base import BaseSerializer
 from plane.db.models import Estimate, EstimatePoint
-from plane.app.serializers import (
-    WorkspaceLiteSerializer,
-    ProjectLiteSerializer,
-)
-from rest_framework import serializers
+from plane.app.serializers import WorkspaceLiteSerializer, ProjectLiteSerializer


 class EstimateSerializer(BaseSerializer):
-    workspace_detail = WorkspaceLiteSerializer(
-        read_only=True, source="workspace"
-    )
+    workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
     project_detail = ProjectLiteSerializer(read_only=True, source="project")

     class Meta:
@@ -26,16 +19,6 @@ class EstimateSerializer(BaseSerializer):
 class EstimatePointSerializer(BaseSerializer):
-    def validate(self, data):
-        if not data:
-            raise serializers.ValidationError("Estimate points are required")
-        value = data.get("value")
-        if value and len(value) > 20:
-            raise serializers.ValidationError(
-                "Value can't be more than 20 characters"
-            )
-        return data
-
     class Meta:
         model = EstimatePoint
         fields = "__all__"
@@ -48,9 +31,7 @@ class EstimatePointSerializer(BaseSerializer):
 class EstimateReadSerializer(BaseSerializer):
     points = EstimatePointSerializer(read_only=True, many=True)
-    workspace_detail = WorkspaceLiteSerializer(
-        read_only=True, source="workspace"
-    )
+    workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
     project_detail = ProjectLiteSerializer(read_only=True, source="project")

     class Meta:
@@ -61,16 +42,3 @@ class EstimateReadSerializer(BaseSerializer):
             "name",
             "description",
         ]
-
-
-class WorkspaceEstimateSerializer(BaseSerializer):
-    points = EstimatePointSerializer(read_only=True, many=True)
-
-    class Meta:
-        model = Estimate
-        fields = "__all__"
-        read_only_fields = [
-            "points",
-            "name",
-            "description",
-        ]

View File

@@ -5,9 +5,7 @@ from .user import UserLiteSerializer
 class ExporterHistorySerializer(BaseSerializer):
-    initiated_by_detail = UserLiteSerializer(
-        source="initiated_by", read_only=True
-    )
+    initiated_by_detail = UserLiteSerializer(source="initiated_by", read_only=True)

     class Meta:
         model = ExporterHistory

View File

@@ -7,13 +7,9 @@ from plane.db.models import Importer
 class ImporterSerializer(BaseSerializer):
-    initiated_by_detail = UserLiteSerializer(
-        source="initiated_by", read_only=True
-    )
+    initiated_by_detail = UserLiteSerializer(source="initiated_by", read_only=True)
     project_detail = ProjectLiteSerializer(source="project", read_only=True)
-    workspace_detail = WorkspaceLiteSerializer(
-        source="workspace", read_only=True
-    )
+    workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)

     class Meta:
         model = Importer

View File

@@ -3,11 +3,7 @@ from rest_framework import serializers
 # Module imports
 from .base import BaseSerializer
-from .issue import (
-    IssueInboxSerializer,
-    LabelLiteSerializer,
-    IssueDetailSerializer,
-)
+from .issue import IssueFlatSerializer, LabelLiteSerializer
 from .project import ProjectLiteSerializer
 from .state import StateLiteSerializer
 from .user import UserLiteSerializer
@@ -28,62 +24,17 @@ class InboxSerializer(BaseSerializer):
 class InboxIssueSerializer(BaseSerializer):
-    issue = IssueInboxSerializer(read_only=True)
+    issue_detail = IssueFlatSerializer(source="issue", read_only=True)
+    project_detail = ProjectLiteSerializer(source="project", read_only=True)

     class Meta:
         model = InboxIssue
-        fields = [
-            "id",
-            "status",
-            "duplicate_to",
-            "snoozed_till",
-            "source",
-            "issue",
-            "created_by",
-        ]
+        fields = "__all__"
         read_only_fields = [
             "project",
             "workspace",
         ]
-
-    def to_representation(self, instance):
-        # Pass the annotated fields to the Issue instance if they exist
-        if hasattr(instance, "label_ids"):
-            instance.issue.label_ids = instance.label_ids
-        return super().to_representation(instance)
-
-
-class InboxIssueDetailSerializer(BaseSerializer):
-    issue = IssueDetailSerializer(read_only=True)
-    duplicate_issue_detail = IssueInboxSerializer(
-        read_only=True, source="duplicate_to"
-    )
-
-    class Meta:
-        model = InboxIssue
-        fields = [
-            "id",
-            "status",
-            "duplicate_to",
-            "snoozed_till",
-            "duplicate_issue_detail",
-            "source",
-            "issue",
-        ]
-        read_only_fields = [
-            "project",
-            "workspace",
-        ]
-
-    def to_representation(self, instance):
-        # Pass the annotated fields to the Issue instance if they exist
-        if hasattr(instance, "assignee_ids"):
-            instance.issue.assignee_ids = instance.assignee_ids
-        if hasattr(instance, "label_ids"):
-            instance.issue.label_ids = instance.label_ids
-        return super().to_representation(instance)


 class InboxIssueLiteSerializer(BaseSerializer):
     class Meta:
@@ -95,13 +46,10 @@ class InboxIssueLiteSerializer(BaseSerializer):
 class IssueStateInboxSerializer(BaseSerializer):
     state_detail = StateLiteSerializer(read_only=True, source="state")
     project_detail = ProjectLiteSerializer(read_only=True, source="project")
-    label_details = LabelLiteSerializer(
-        read_only=True, source="labels", many=True
-    )
-    assignee_details = UserLiteSerializer(
-        read_only=True, source="assignees", many=True
-    )
+    label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
+    assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
     sub_issues_count = serializers.IntegerField(read_only=True)
+    bridge_id = serializers.UUIDField(read_only=True)
     issue_inbox = InboxIssueLiteSerializer(read_only=True, many=True)

     class Meta:

View File

@@ -0,0 +1,8 @@
+from .base import IntegrationSerializer, WorkspaceIntegrationSerializer
+from .github import (
+    GithubRepositorySerializer,
+    GithubRepositorySyncSerializer,
+    GithubIssueSyncSerializer,
+    GithubCommentSyncSerializer,
+)
+from .slack import SlackProjectSyncSerializer

View File

@@ -0,0 +1,20 @@
+# Module imports
+from plane.app.serializers import BaseSerializer
+from plane.db.models import Integration, WorkspaceIntegration
+
+
+class IntegrationSerializer(BaseSerializer):
+    class Meta:
+        model = Integration
+        fields = "__all__"
+        read_only_fields = [
+            "verified",
+        ]
+
+
+class WorkspaceIntegrationSerializer(BaseSerializer):
+    integration_detail = IntegrationSerializer(read_only=True, source="integration")
+
+    class Meta:
+        model = WorkspaceIntegration
+        fields = "__all__"

View File

@@ -0,0 +1,45 @@
+# Module imports
+from plane.app.serializers import BaseSerializer
+from plane.db.models import (
+    GithubIssueSync,
+    GithubRepository,
+    GithubRepositorySync,
+    GithubCommentSync,
+)
+
+
+class GithubRepositorySerializer(BaseSerializer):
+    class Meta:
+        model = GithubRepository
+        fields = "__all__"
+
+
+class GithubRepositorySyncSerializer(BaseSerializer):
+    repo_detail = GithubRepositorySerializer(source="repository")
+
+    class Meta:
+        model = GithubRepositorySync
+        fields = "__all__"
+
+
+class GithubIssueSyncSerializer(BaseSerializer):
+    class Meta:
+        model = GithubIssueSync
+        fields = "__all__"
+        read_only_fields = [
+            "project",
+            "workspace",
+            "repository_sync",
+        ]
+
+
+class GithubCommentSyncSerializer(BaseSerializer):
+    class Meta:
+        model = GithubCommentSync
+        fields = "__all__"
+        read_only_fields = [
+            "project",
+            "workspace",
+            "repository_sync",
+            "issue_sync",
+        ]

View File

@@ -0,0 +1,14 @@
+# Module imports
+from plane.app.serializers import BaseSerializer
+from plane.db.models import SlackProjectSync
+
+
+class SlackProjectSyncSerializer(BaseSerializer):
+    class Meta:
+        model = SlackProjectSync
+        fields = "__all__"
+        read_only_fields = [
+            "project",
+            "workspace",
+            "workspace_integration",
+        ]

View File

@@ -1,7 +1,5 @@
 # Django imports
 from django.utils import timezone
-from django.core.validators import URLValidator
-from django.core.exceptions import ValidationError

 # Third Party imports
 from rest_framework import serializers
@@ -9,7 +7,7 @@ from rest_framework import serializers
 # Module imports
 from .base import BaseSerializer, DynamicBaseSerializer
 from .user import UserLiteSerializer
-from .state import StateLiteSerializer
+from .state import StateSerializer, StateLiteSerializer
 from .project import ProjectLiteSerializer
 from .workspace import WorkspaceLiteSerializer
 from plane.db.models import (
@@ -32,7 +30,6 @@ from plane.db.models import (
     CommentReaction,
     IssueVote,
     IssueRelation,
-    State,
 )
@@ -72,26 +69,19 @@ class IssueProjectLiteSerializer(BaseSerializer):
 ##TODO: Find a better way to write this serializer
 ## Find a better approach to save manytomany?
 class IssueCreateSerializer(BaseSerializer):
-    # ids
-    state_id = serializers.PrimaryKeyRelatedField(
-        source="state",
-        queryset=State.objects.all(),
-        required=False,
-        allow_null=True,
-    )
-    parent_id = serializers.PrimaryKeyRelatedField(
-        source="parent",
-        queryset=Issue.objects.all(),
-        required=False,
-        allow_null=True,
-    )
-    label_ids = serializers.ListField(
-        child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
-        write_only=True,
-        required=False,
-    )
-    assignee_ids = serializers.ListField(
+    state_detail = StateSerializer(read_only=True, source="state")
+    created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
+    project_detail = ProjectLiteSerializer(read_only=True, source="project")
+    workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
+
+    assignees = serializers.ListField(
         child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
         write_only=True,
         required=False,
     )
+
+    labels = serializers.ListField(
+        child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
+        write_only=True,
+        required=False,
+    )
@@ -110,10 +100,8 @@ class IssueCreateSerializer(BaseSerializer):
     def to_representation(self, instance):
         data = super().to_representation(instance)
-        assignee_ids = self.initial_data.get("assignee_ids")
-        data["assignee_ids"] = assignee_ids if assignee_ids else []
-        label_ids = self.initial_data.get("label_ids")
-        data["label_ids"] = label_ids if label_ids else []
+        data['assignees'] = [str(assignee.id) for assignee in instance.assignees.all()]
+        data['labels'] = [str(label.id) for label in instance.labels.all()]
         return data
@@ -122,14 +110,12 @@ class IssueCreateSerializer(BaseSerializer):
             and data.get("target_date", None) is not None
             and data.get("start_date", None) > data.get("target_date", None)
         ):
-            raise serializers.ValidationError(
-                "Start date cannot exceed target date"
-            )
+            raise serializers.ValidationError("Start date cannot exceed target date")
         return data

     def create(self, validated_data):
-        assignees = validated_data.pop("assignee_ids", None)
-        labels = validated_data.pop("label_ids", None)
+        assignees = validated_data.pop("assignees", None)
+        labels = validated_data.pop("labels", None)

         project_id = self.context["project_id"]
         workspace_id = self.context["workspace_id"]
@@ -187,8 +173,8 @@ class IssueCreateSerializer(BaseSerializer):
         return issue

     def update(self, instance, validated_data):
-        assignees = validated_data.pop("assignee_ids", None)
-        labels = validated_data.pop("label_ids", None)
+        assignees = validated_data.pop("assignees", None)
+        labels = validated_data.pop("labels", None)

         # Related models
         project_id = instance.project_id
@@ -239,15 +225,14 @@ class IssueActivitySerializer(BaseSerializer):
     actor_detail = UserLiteSerializer(read_only=True, source="actor")
     issue_detail = IssueFlatSerializer(read_only=True, source="issue")
     project_detail = ProjectLiteSerializer(read_only=True, source="project")
-    workspace_detail = WorkspaceLiteSerializer(
-        read_only=True, source="workspace"
-    )
+    workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")

     class Meta:
         model = IssueActivity
         fields = "__all__"


 class IssuePropertySerializer(BaseSerializer):
     class Meta:
         model = IssueProperty
@@ -260,17 +245,12 @@ class IssuePropertySerializer(BaseSerializer):
 class LabelSerializer(BaseSerializer):
+    workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
+    project_detail = ProjectLiteSerializer(source="project", read_only=True)
+
     class Meta:
         model = Label
-        fields = [
-            "parent",
-            "name",
-            "color",
-            "id",
-            "project_id",
-            "workspace_id",
-            "sort_order",
-        ]
+        fields = "__all__"
         read_only_fields = [
             "workspace",
             "project",
@@ -288,6 +268,7 @@ class LabelLiteSerializer(BaseSerializer):
 class IssueLabelSerializer(BaseSerializer):
+
     class Meta:
         model = IssueLabel
         fields = "__all__"
@@ -298,50 +279,33 @@ class IssueLabelSerializer(BaseSerializer):
 class IssueRelationSerializer(BaseSerializer):
-    id = serializers.UUIDField(source="related_issue.id", read_only=True)
-    project_id = serializers.PrimaryKeyRelatedField(
-        source="related_issue.project_id", read_only=True
-    )
-    sequence_id = serializers.IntegerField(
-        source="related_issue.sequence_id", read_only=True
-    )
-    name = serializers.CharField(source="related_issue.name", read_only=True)
-    relation_type = serializers.CharField(read_only=True)
+    issue_detail = IssueProjectLiteSerializer(read_only=True, source="related_issue")

     class Meta:
         model = IssueRelation
         fields = [
-            "id",
-            "project_id",
-            "sequence_id",
+            "issue_detail",
             "relation_type",
-            "name",
+            "related_issue",
+            "issue",
+            "id"
         ]
         read_only_fields = [
             "workspace",
             "project",
         ]


 class RelatedIssueSerializer(BaseSerializer):
-    id = serializers.UUIDField(source="issue.id", read_only=True)
-    project_id = serializers.PrimaryKeyRelatedField(
-        source="issue.project_id", read_only=True
-    )
-    sequence_id = serializers.IntegerField(
-        source="issue.sequence_id", read_only=True
-    )
-    name = serializers.CharField(source="issue.name", read_only=True)
-    relation_type = serializers.CharField(read_only=True)
+    issue_detail = IssueProjectLiteSerializer(read_only=True, source="issue")

     class Meta:
         model = IssueRelation
         fields = [
-            "id",
-            "project_id",
-            "sequence_id",
+            "issue_detail",
             "relation_type",
-            "name",
+            "related_issue",
+            "issue",
+            "id"
         ]
         read_only_fields = [
             "workspace",
@@ -433,57 +397,16 @@ class IssueLinkSerializer(BaseSerializer):
             "issue",
         ]

-    def validate_url(self, value):
-        # Check URL format
-        validate_url = URLValidator()
-        try:
-            validate_url(value)
-        except ValidationError:
-            raise serializers.ValidationError("Invalid URL format.")
-
-        # Check URL scheme
-        if not value.startswith(('http://', 'https://')):
-            raise serializers.ValidationError("Invalid URL scheme.")
-
-        return value
-
     # Validation if url already exists
     def create(self, validated_data):
         if IssueLink.objects.filter(
-            url=validated_data.get("url"),
-            issue_id=validated_data.get("issue_id"),
+            url=validated_data.get("url"), issue_id=validated_data.get("issue_id")
         ).exists():
             raise serializers.ValidationError(
                 {"error": "URL already exists for this Issue"}
             )
         return IssueLink.objects.create(**validated_data)

-    def update(self, instance, validated_data):
-        if IssueLink.objects.filter(
-            url=validated_data.get("url"),
-            issue_id=instance.issue_id,
-        ).exists():
-            raise serializers.ValidationError(
-                {"error": "URL already exists for this Issue"}
-            )
-        return super().update(instance, validated_data)
-
-
-class IssueLinkLiteSerializer(BaseSerializer):
-    class Meta:
-        model = IssueLink
-        fields = [
-            "id",
-            "issue_id",
-            "title",
-            "url",
-            "metadata",
-            "created_by_id",
-            "created_at",
-        ]
-        read_only_fields = fields


 class IssueAttachmentSerializer(BaseSerializer):
     class Meta:
@@ -500,23 +423,10 @@ class IssueAttachmentSerializer(BaseSerializer):
         ]

-
-class IssueAttachmentLiteSerializer(DynamicBaseSerializer):
-    class Meta:
-        model = IssueAttachment
-        fields = [
-            "id",
-            "asset",
-            "attributes",
-            "issue_id",
-            "updated_at",
-            "updated_by_id",
-        ]
-        read_only_fields = fields
-

 class IssueReactionSerializer(BaseSerializer):
     actor_detail = UserLiteSerializer(read_only=True, source="actor")

     class Meta:
         model = IssueReaction
         fields = "__all__"
@@ -528,14 +438,16 @@ class IssueReactionSerializer(BaseSerializer):
         ]


-class IssueReactionLiteSerializer(DynamicBaseSerializer):
+class CommentReactionLiteSerializer(BaseSerializer):
+    actor_detail = UserLiteSerializer(read_only=True, source="actor")
+
     class Meta:
-        model = IssueReaction
+        model = CommentReaction
         fields = [
             "id",
-            "actor",
-            "issue",
             "reaction",
+            "comment",
+            "actor_detail",
         ]
@@ -547,18 +459,12 @@ class CommentReactionSerializer(BaseSerializer):
 class IssueVoteSerializer(BaseSerializer):
     actor_detail = UserLiteSerializer(read_only=True, source="actor")

     class Meta:
         model = IssueVote
-        fields = [
-            "issue",
-            "vote",
-            "workspace",
-            "project",
-            "actor",
-            "actor_detail",
-        ]
+        fields = ["issue", "vote", "workspace", "project", "actor", "actor_detail"]
         read_only_fields = fields
@@ -566,10 +472,8 @@ class IssueCommentSerializer(BaseSerializer):
     actor_detail = UserLiteSerializer(read_only=True, source="actor")
     issue_detail = IssueFlatSerializer(read_only=True, source="issue")
     project_detail = ProjectLiteSerializer(read_only=True, source="project")
-    workspace_detail = WorkspaceLiteSerializer(
-        read_only=True, source="workspace"
-    )
-    comment_reactions = CommentReactionSerializer(read_only=True, many=True)
+    workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
+    comment_reactions = CommentReactionLiteSerializer(read_only=True, many=True)
     is_member = serializers.BooleanField(read_only=True)

     class Meta:
@@ -603,15 +507,12 @@ class IssueStateFlatSerializer(BaseSerializer):
 # Issue Serializer with state details
 class IssueStateSerializer(DynamicBaseSerializer):
-    label_details = LabelLiteSerializer(
-        read_only=True, source="labels", many=True
-    )
+    label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
     state_detail = StateLiteSerializer(read_only=True, source="state")
     project_detail = ProjectLiteSerializer(read_only=True, source="project")
-    assignee_details = UserLiteSerializer(
-        read_only=True, source="assignees", many=True
-    )
+    assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
     sub_issues_count = serializers.IntegerField(read_only=True)
+    bridge_id = serializers.UUIDField(read_only=True)
     attachment_count = serializers.IntegerField(read_only=True)
     link_count = serializers.IntegerField(read_only=True)
@@ -620,110 +521,67 @@ class IssueStateSerializer(DynamicBaseSerializer):
         fields = "__all__"


-class IssueInboxSerializer(DynamicBaseSerializer):
-    label_ids = serializers.ListField(
-        child=serializers.UUIDField(),
-        required=False,
-    )
-
-    class Meta:
-        model = Issue
-        fields = [
-            "id",
-            "name",
-            "priority",
-            "sequence_id",
-            "project_id",
-            "created_at",
-            "label_ids",
-        ]
-        read_only_fields = fields
-
-
-class IssueSerializer(DynamicBaseSerializer):
-    # ids
-    cycle_id = serializers.PrimaryKeyRelatedField(read_only=True)
-    module_ids = serializers.ListField(
-        child=serializers.UUIDField(),
-        required=False,
-    )
-
-    # Many to many
-    label_ids = serializers.ListField(
-        child=serializers.UUIDField(),
-        required=False,
-    )
-    assignee_ids = serializers.ListField(
-        child=serializers.UUIDField(),
-        required=False,
-    )
-
-    # Count items
+class IssueSerializer(BaseSerializer):
+    project_detail = ProjectLiteSerializer(read_only=True, source="project")
+    state_detail = StateSerializer(read_only=True, source="state")
+    parent_detail = IssueStateFlatSerializer(read_only=True, source="parent")
+    label_details = LabelSerializer(read_only=True, source="labels", many=True)
+    assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
+    related_issues = IssueRelationSerializer(read_only=True, source="issue_relation", many=True)
+    issue_relations = RelatedIssueSerializer(read_only=True, source="issue_related", many=True)
+    issue_cycle = IssueCycleDetailSerializer(read_only=True)
+    issue_module = IssueModuleDetailSerializer(read_only=True)
+    issue_link = IssueLinkSerializer(read_only=True, many=True)
+    issue_attachment = IssueAttachmentSerializer(read_only=True, many=True)
     sub_issues_count = serializers.IntegerField(read_only=True)
-    attachment_count = serializers.IntegerField(read_only=True)
-    link_count = serializers.IntegerField(read_only=True)
+    issue_reactions = IssueReactionSerializer(read_only=True, many=True)

     class Meta:
         model = Issue
-        fields = [
-            "id",
-            "name",
-            "state_id",
-            "sort_order",
-            "completed_at",
-            "estimate_point",
-            "priority",
-            "start_date",
-            "target_date",
-            "sequence_id",
-            "project_id",
-            "parent_id",
-            "cycle_id",
-            "module_ids",
-            "label_ids",
-            "assignee_ids",
-            "sub_issues_count",
-            "created_at",
-            "updated_at",
+        fields = "__all__"
+        read_only_fields = [
+            "workspace",
+            "project",
             "created_by",
             "updated_by",
-            "attachment_count",
-            "link_count",
-            "is_draft",
-            "archived_at",
+            "created_at",
+            "updated_at",
         ]
-        read_only_fields = fields


 class IssueLiteSerializer(DynamicBaseSerializer):
+    workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
+    project_detail = ProjectLiteSerializer(read_only=True, source="project")
+    state_detail = StateLiteSerializer(read_only=True, source="state")
+    label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
+    assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
+    sub_issues_count = serializers.IntegerField(read_only=True)
+    cycle_id = serializers.UUIDField(read_only=True)
+    module_id = serializers.UUIDField(read_only=True)
+    attachment_count = serializers.IntegerField(read_only=True)
+    link_count = serializers.IntegerField(read_only=True)
+    issue_reactions = IssueReactionSerializer(read_only=True, many=True)

     class Meta:
         model = Issue
-        fields = [
-            "id",
-            "sequence_id",
-            "project_id",
-        ]
-        read_only_fields = fields
-
-
-class IssueDetailSerializer(IssueSerializer):
-    description_html = serializers.CharField()
-    is_subscribed = serializers.BooleanField(read_only=True)
-
-    class Meta(IssueSerializer.Meta):
-        fields = IssueSerializer.Meta.fields + [
-            "description_html",
-            "is_subscribed",
-        ]
-        read_only_fields = fields
+        fields = "__all__"
+        read_only_fields = [
+            "start_date",
+            "target_date",
+            "completed_at",
+            "workspace",
+            "project",
+            "created_by",
+            "updated_by",
+            "created_at",
+            "updated_at",
+        ]


 class IssuePublicSerializer(BaseSerializer):
     project_detail = ProjectLiteSerializer(read_only=True, source="project")
     state_detail = StateLiteSerializer(read_only=True, source="state")
-    reactions = IssueReactionSerializer(
-        read_only=True, many=True, source="issue_reactions"
-    )
+    reactions = IssueReactionSerializer(read_only=True, many=True, source="issue_reactions")
     votes = IssueVoteSerializer(read_only=True, many=True)

     class Meta:
@@ -746,6 +604,7 @@ class IssuePublicSerializer(BaseSerializer):
         read_only_fields = fields

+
 class IssueSubscriberSerializer(BaseSerializer):
     class Meta:
         model = IssueSubscriber
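The diff above removes a `validate_url` hook that combined Django's `URLValidator` with an explicit scheme check. A rough framework-free equivalent using only the standard library (the helper name is ours, and `urlparse` is looser than Django's `URLValidator`):

```python
from urllib.parse import urlparse


def check_link_url(value: str) -> str:
    """Reject malformed URLs and non-http(s) schemes, approximating the
    validate_url hook shown in the diff above. This is a sketch: Django's
    URLValidator enforces stricter format rules than this parse check."""
    parsed = urlparse(value)
    # A usable absolute URL needs both a scheme and a network location
    if not parsed.scheme or not parsed.netloc:
        raise ValueError("Invalid URL format.")
    if parsed.scheme not in ("http", "https"):
        raise ValueError("Invalid URL scheme.")
    return value
```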

View File

@@ -2,8 +2,10 @@
 from rest_framework import serializers

 # Module imports
-from .base import BaseSerializer, DynamicBaseSerializer
+from .base import BaseSerializer
+from .user import UserLiteSerializer
 from .project import ProjectLiteSerializer
+from .workspace import WorkspaceLiteSerializer
 from plane.db.models import (
     User,
@@ -12,23 +14,19 @@ from plane.db.models import (
     ModuleIssue,
     ModuleLink,
     ModuleFavorite,
-    ModuleUserProperties,
 )


 class ModuleWriteSerializer(BaseSerializer):
-    lead_id = serializers.PrimaryKeyRelatedField(
-        source="lead",
-        queryset=User.objects.all(),
-        required=False,
-        allow_null=True,
-    )
-    member_ids = serializers.ListField(
+    members = serializers.ListField(
         child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
         write_only=True,
         required=False,
     )
+    project_detail = ProjectLiteSerializer(source="project", read_only=True)
+    workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)

     class Meta:
         model = Module
         fields = "__all__"
@@ -39,32 +37,25 @@ class ModuleWriteSerializer(BaseSerializer):
             "updated_by",
             "created_at",
             "updated_at",
-            "archived_at",
         ]

     def to_representation(self, instance):
         data = super().to_representation(instance)
-        data["member_ids"] = [
-            str(member.id) for member in instance.members.all()
-        ]
+        data['members'] = [str(member.id) for member in instance.members.all()]
         return data

     def validate(self, data):
-        if (
-            data.get("start_date", None) is not None
-            and data.get("target_date", None) is not None
-            and data.get("start_date", None) > data.get("target_date", None)
-        ):
-            raise serializers.ValidationError(
-                "Start date cannot exceed target date"
-            )
-        return data
+        if data.get("start_date", None) is not None and data.get("target_date", None) is not None and data.get("start_date", None) > data.get("target_date", None):
+            raise serializers.ValidationError("Start date cannot exceed target date")
+        return data

     def create(self, validated_data):
-        members = validated_data.pop("member_ids", None)
+        members = validated_data.pop("members", None)

         project = self.context["project"]

         module = Module.objects.create(**validated_data, project=project)

         if members is not None:
             ModuleMember.objects.bulk_create(
                 [
@@ -85,7 +76,7 @@ class ModuleWriteSerializer(BaseSerializer):
         return module

     def update(self, instance, validated_data):
-        members = validated_data.pop("member_ids", None)
+        members = validated_data.pop("members", None)

         if members is not None:
             ModuleMember.objects.filter(module=instance).delete()
@@ -142,6 +133,8 @@ class ModuleIssueSerializer(BaseSerializer):

 class ModuleLinkSerializer(BaseSerializer):
+    created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
+
     class Meta:
         model = ModuleLink
         fields = "__all__"
@@ -158,8 +151,7 @@ class ModuleLinkSerializer(BaseSerializer):
     # Validation if url already exists
     def create(self, validated_data):
         if ModuleLink.objects.filter(
-            url=validated_data.get("url"),
-            module_id=validated_data.get("module_id"),
+            url=validated_data.get("url"), module_id=validated_data.get("module_id")
         ).exists():
             raise serializers.ValidationError(
                 {"error": "URL already exists for this Issue"}
@@ -167,10 +159,11 @@ class ModuleLinkSerializer(BaseSerializer):
         return ModuleLink.objects.create(**validated_data)


-class ModuleSerializer(DynamicBaseSerializer):
-    member_ids = serializers.ListField(
-        child=serializers.UUIDField(), required=False, allow_null=True
-    )
+class ModuleSerializer(BaseSerializer):
+    project_detail = ProjectLiteSerializer(read_only=True, source="project")
+    lead_detail = UserLiteSerializer(read_only=True, source="lead")
+    members_detail = UserLiteSerializer(read_only=True, many=True, source="members")
+    link_module = ModuleLinkSerializer(read_only=True, many=True)
     is_favorite = serializers.BooleanField(read_only=True)
     total_issues = serializers.IntegerField(read_only=True)
     cancelled_issues = serializers.IntegerField(read_only=True)
@@ -181,45 +174,15 @@ class ModuleSerializer(DynamicBaseSerializer):

     class Meta:
         model = Module
-        fields = [
-            # Required fields
-            "id",
-            "workspace_id",
-            "project_id",
-            # Model fields
-            "name",
-            "description",
-            "description_text",
-            "description_html",
-            "start_date",
-            "target_date",
-            "status",
-            "lead_id",
-            "member_ids",
-            "view_props",
-            "sort_order",
-            "external_source",
-            "external_id",
-            # computed fields
-            "is_favorite",
-            "total_issues",
-            "cancelled_issues",
-            "completed_issues",
-            "started_issues",
-            "unstarted_issues",
-            "backlog_issues",
+        fields = "__all__"
+        read_only_fields = [
+            "workspace",
+            "project",
+            "created_by",
+            "updated_by",
             "created_at",
             "updated_at",
         ]
-        read_only_fields = fields
-
-
-class ModuleDetailSerializer(ModuleSerializer):
-    link_module = ModuleLinkSerializer(read_only=True, many=True)
-    sub_issues = serializers.IntegerField(read_only=True)
-
-    class Meta(ModuleSerializer.Meta):
-        fields = ModuleSerializer.Meta.fields + ["link_module", "sub_issues"]


 class ModuleFavoriteSerializer(BaseSerializer):
@@ -233,10 +196,3 @@ class ModuleFavoriteSerializer(BaseSerializer):
             "project",
             "user",
         ]
-
-
-class ModuleUserPropertiesSerializer(BaseSerializer):
-    class Meta:
-        model = ModuleUserProperties
-        fields = "__all__"
-        read_only_fields = ["workspace", "project", "module", "user"]
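The `validate()` hook in `ModuleWriteSerializer` above rejects modules whose start date falls after the target date. A plain-Python sketch of that rule (hypothetical function name, stripped of the DRF serializer machinery so it runs standalone):

```python
from datetime import date


def validate_date_range(start_date, target_date):
    """Mirror of the serializer rule: start must not exceed target.

    Either bound may be None, matching the serializer's behavior of
    only comparing when both dates are present.
    """
    if start_date is not None and target_date is not None and start_date > target_date:
        raise ValueError("Start date cannot exceed target date")
    return {"start_date": start_date, "target_date": target_date}
```

In the real serializer the same check runs inside `validate(self, data)` and raises `serializers.ValidationError` instead of `ValueError`.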


@@ -1,20 +1,12 @@
 # Module imports
 from .base import BaseSerializer
 from .user import UserLiteSerializer
-from plane.db.models import Notification, UserNotificationPreference
+from plane.db.models import Notification


 class NotificationSerializer(BaseSerializer):
-    triggered_by_details = UserLiteSerializer(
-        read_only=True, source="triggered_by"
-    )
+    triggered_by_details = UserLiteSerializer(read_only=True, source="triggered_by")

     class Meta:
         model = Notification
         fields = "__all__"
-
-
-class UserNotificationPreferenceSerializer(BaseSerializer):
-    class Meta:
-        model = UserNotificationPreference
-        fields = "__all__"


@@ -3,32 +3,22 @@ from rest_framework import serializers

 # Module imports
 from .base import BaseSerializer
-from .issue import LabelLiteSerializer
+from .issue import IssueFlatSerializer, LabelLiteSerializer
 from .workspace import WorkspaceLiteSerializer
 from .project import ProjectLiteSerializer
-from plane.db.models import (
-    Page,
-    PageLog,
-    PageFavorite,
-    PageLabel,
-    Label,
-)
+from plane.db.models import Page, PageLog, PageFavorite, PageLabel, Label, Issue, Module


 class PageSerializer(BaseSerializer):
     is_favorite = serializers.BooleanField(read_only=True)
-    label_details = LabelLiteSerializer(
-        read_only=True, source="labels", many=True
-    )
+    label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
     labels = serializers.ListField(
         child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
         write_only=True,
         required=False,
     )
     project_detail = ProjectLiteSerializer(source="project", read_only=True)
-    workspace_detail = WorkspaceLiteSerializer(
-        source="workspace", read_only=True
-    )
+    workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)

     class Meta:
         model = Page
@@ -38,10 +28,9 @@ class PageSerializer(BaseSerializer):
             "project",
             "owned_by",
         ]

     def to_representation(self, instance):
         data = super().to_representation(instance)
-        data["labels"] = [str(label.id) for label in instance.labels.all()]
+        data['labels'] = [str(label.id) for label in instance.labels.all()]
         return data

     def create(self, validated_data):
@@ -105,7 +94,7 @@ class SubPageSerializer(BaseSerializer):
     def get_entity_details(self, obj):
         entity_name = obj.entity_name
-        if entity_name == "forward_link" or entity_name == "back_link":
+        if entity_name == 'forward_link' or entity_name == 'back_link':
             try:
                 page = Page.objects.get(pk=obj.entity_identifier)
                 return PageSerializer(page).data
@@ -115,6 +104,7 @@ class SubPageSerializer(BaseSerializer):

 class PageLogSerializer(BaseSerializer):
     class Meta:
         model = PageLog
         fields = "__all__"
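The write-only `labels` ListField paired with the `to_representation` override above is a common DRF pattern: accept primary keys on write, emit id strings on read. A minimal standalone sketch of the read half (the `FakeLabel` stand-in replaces a Django model instance, so no ORM is needed):

```python
class FakeLabel:
    """Stand-in for a Django model instance exposing an ``id`` attribute."""

    def __init__(self, id):
        self.id = id


def represent_labels(labels):
    # Matches the serializer line:
    # data['labels'] = [str(label.id) for label in instance.labels.all()]
    return [str(label.id) for label in labels]
```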


@@ -4,10 +4,7 @@ from rest_framework import serializers

 # Module imports
 from .base import BaseSerializer, DynamicBaseSerializer
 from plane.app.serializers.workspace import WorkspaceLiteSerializer
-from plane.app.serializers.user import (
-    UserLiteSerializer,
-    UserAdminLiteSerializer,
-)
+from plane.app.serializers.user import UserLiteSerializer, UserAdminLiteSerializer
 from plane.db.models import (
     Project,
     ProjectMember,
@@ -20,9 +17,7 @@ from plane.db.models import (

 class ProjectSerializer(BaseSerializer):
-    workspace_detail = WorkspaceLiteSerializer(
-        source="workspace", read_only=True
-    )
+    workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)

     class Meta:
         model = Project
@@ -34,16 +29,12 @@ class ProjectSerializer(BaseSerializer):
     def create(self, validated_data):
         identifier = validated_data.get("identifier", "").strip().upper()
         if identifier == "":
-            raise serializers.ValidationError(
-                detail="Project Identifier is required"
-            )
+            raise serializers.ValidationError(detail="Project Identifier is required")

         if ProjectIdentifier.objects.filter(
             name=identifier, workspace_id=self.context["workspace_id"]
         ).exists():
-            raise serializers.ValidationError(
-                detail="Project Identifier is taken"
-            )
+            raise serializers.ValidationError(detail="Project Identifier is taken")

         project = Project.objects.create(
             **validated_data, workspace_id=self.context["workspace_id"]
         )
@@ -82,9 +73,7 @@ class ProjectSerializer(BaseSerializer):
             return project

         # If not same fail update
-        raise serializers.ValidationError(
-            detail="Project Identifier is already taken"
-        )
+        raise serializers.ValidationError(detail="Project Identifier is already taken")


 class ProjectLiteSerializer(BaseSerializer):
@@ -95,19 +84,14 @@ class ProjectLiteSerializer(BaseSerializer):
             "identifier",
             "name",
             "cover_image",
-            "logo_props",
+            "icon_prop",
+            "emoji",
             "description",
         ]
         read_only_fields = fields


 class ProjectListSerializer(DynamicBaseSerializer):
-    total_issues = serializers.IntegerField(read_only=True)
-    archived_issues = serializers.IntegerField(read_only=True)
-    archived_sub_issues = serializers.IntegerField(read_only=True)
-    draft_issues = serializers.IntegerField(read_only=True)
-    draft_sub_issues = serializers.IntegerField(read_only=True)
-    sub_issues = serializers.IntegerField(read_only=True)
     is_favorite = serializers.BooleanField(read_only=True)
     total_members = serializers.IntegerField(read_only=True)
     total_cycles = serializers.IntegerField(read_only=True)
@@ -176,12 +160,6 @@ class ProjectMemberAdminSerializer(BaseSerializer):
         fields = "__all__"


-class ProjectMemberRoleSerializer(DynamicBaseSerializer):
-    class Meta:
-        model = ProjectMember
-        fields = ("id", "role", "member", "project")
-
-
 class ProjectMemberInviteSerializer(BaseSerializer):
     project = ProjectLiteSerializer(read_only=True)
     workspace = WorkspaceLiteSerializer(read_only=True)
@@ -219,9 +197,7 @@ class ProjectMemberLiteSerializer(BaseSerializer):

 class ProjectDeployBoardSerializer(BaseSerializer):
     project_details = ProjectLiteSerializer(read_only=True, source="project")
-    workspace_detail = WorkspaceLiteSerializer(
-        read_only=True, source="workspace"
-    )
+    workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")

     class Meta:
         model = ProjectDeployBoard
@@ -241,4 +217,4 @@ class ProjectPublicMemberSerializer(BaseSerializer):
             "workspace",
             "project",
             "member",
         ]
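`ProjectSerializer.create()` above normalizes the identifier with `.strip().upper()` before checking uniqueness. A sketch of just the normalization and emptiness check (hypothetical helper name; the per-workspace `ProjectIdentifier` uniqueness lookup is a database query and is omitted here):

```python
def normalize_project_identifier(raw):
    """Uppercase and trim a project identifier, rejecting blanks.

    The real serializer additionally checks that no ProjectIdentifier
    with this name exists in the workspace; that ORM lookup is skipped
    so the sketch stays self-contained.
    """
    identifier = (raw or "").strip().upper()
    if identifier == "":
        raise ValueError("Project Identifier is required")
    return identifier
```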


@@ -6,19 +6,10 @@ from plane.db.models import State

 class StateSerializer(BaseSerializer):
     class Meta:
         model = State
-        fields = [
-            "id",
-            "project_id",
-            "workspace_id",
-            "name",
-            "color",
-            "group",
-            "default",
-            "description",
-            "sequence",
-        ]
+        fields = "__all__"
         read_only_fields = [
             "workspace",
             "project",
@@ -34,4 +25,4 @@ class StateLiteSerializer(BaseSerializer):
             "color",
             "group",
         ]
         read_only_fields = fields


@@ -4,6 +4,7 @@ from rest_framework import serializers

 # Module import
 from .base import BaseSerializer
 from plane.db.models import User, Workspace, WorkspaceMemberInvite
+from plane.license.models import InstanceAdmin, Instance


 class UserSerializer(BaseSerializer):
@@ -98,20 +99,17 @@ class UserMeSettingsSerializer(BaseSerializer):
             ).first()
             return {
                 "last_workspace_id": obj.last_workspace_id,
-                "last_workspace_slug": (
-                    workspace.slug if workspace is not None else ""
-                ),
+                "last_workspace_slug": workspace.slug if workspace is not None else "",
                 "fallback_workspace_id": obj.last_workspace_id,
-                "fallback_workspace_slug": (
-                    workspace.slug if workspace is not None else ""
-                ),
+                "fallback_workspace_slug": workspace.slug
+                if workspace is not None
+                else "",
                 "invites": workspace_invites,
             }
         else:
             fallback_workspace = (
                 Workspace.objects.filter(
-                    workspace_member__member_id=obj.id,
-                    workspace_member__is_active=True,
+                    workspace_member__member_id=obj.id, workspace_member__is_active=True
                 )
                 .order_by("created_at")
                 .first()
@@ -119,16 +117,12 @@ class UserMeSettingsSerializer(BaseSerializer):
             return {
                 "last_workspace_id": None,
                 "last_workspace_slug": None,
-                "fallback_workspace_id": (
-                    fallback_workspace.id
-                    if fallback_workspace is not None
-                    else None
-                ),
-                "fallback_workspace_slug": (
-                    fallback_workspace.slug
-                    if fallback_workspace is not None
-                    else None
-                ),
+                "fallback_workspace_id": fallback_workspace.id
+                if fallback_workspace is not None
+                else None,
+                "fallback_workspace_slug": fallback_workspace.slug
+                if fallback_workspace is not None
+                else None,
                 "invites": workspace_invites,
             }
@@ -186,9 +180,7 @@ class ChangePasswordSerializer(serializers.Serializer):
         if data.get("new_password") != data.get("confirm_password"):
             raise serializers.ValidationError(
-                {
-                    "error": "Confirm password should be same as the new password."
-                }
+                {"error": "Confirm password should be same as the new password."}
             )
         return data
@@ -198,5 +190,4 @@ class ResetPasswordSerializer(serializers.Serializer):
     """
     Serializer for password change endpoint.
     """

     new_password = serializers.CharField(required=True, min_length=8)


@@ -2,7 +2,7 @@
 from rest_framework import serializers

 # Module imports
-from .base import BaseSerializer, DynamicBaseSerializer
+from .base import BaseSerializer
 from .workspace import WorkspaceLiteSerializer
 from .project import ProjectLiteSerializer
 from plane.db.models import GlobalView, IssueView, IssueViewFavorite
@@ -10,9 +10,7 @@ from plane.utils.issue_filters import issue_filters


 class GlobalViewSerializer(BaseSerializer):
-    workspace_detail = WorkspaceLiteSerializer(
-        source="workspace", read_only=True
-    )
+    workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)

     class Meta:
         model = GlobalView
@@ -40,12 +38,10 @@ class GlobalViewSerializer(BaseSerializer):
         return super().update(instance, validated_data)


-class IssueViewSerializer(DynamicBaseSerializer):
+class IssueViewSerializer(BaseSerializer):
     is_favorite = serializers.BooleanField(read_only=True)
     project_detail = ProjectLiteSerializer(source="project", read_only=True)
-    workspace_detail = WorkspaceLiteSerializer(
-        source="workspace", read_only=True
-    )
+    workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)

     class Meta:
         model = IssueView


@@ -1,4 +1,5 @@
 # Python imports
+import urllib
 import socket
 import ipaddress
 from urllib.parse import urlparse
@@ -9,113 +10,78 @@ from rest_framework import serializers

 # Module imports
 from .base import DynamicBaseSerializer
 from plane.db.models import Webhook, WebhookLog
 from plane.db.models.webhook import validate_domain, validate_schema


 class WebhookSerializer(DynamicBaseSerializer):
     url = serializers.URLField(validators=[validate_schema, validate_domain])

     def create(self, validated_data):
         url = validated_data.get("url", None)

         # Extract the hostname from the URL
         hostname = urlparse(url).hostname
         if not hostname:
-            raise serializers.ValidationError(
-                {"url": "Invalid URL: No hostname found."}
-            )
+            raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})

         # Resolve the hostname to IP addresses
         try:
             ip_addresses = socket.getaddrinfo(hostname, None)
         except socket.gaierror:
-            raise serializers.ValidationError(
-                {"url": "Hostname could not be resolved."}
-            )
+            raise serializers.ValidationError({"url": "Hostname could not be resolved."})

         if not ip_addresses:
-            raise serializers.ValidationError(
-                {"url": "No IP addresses found for the hostname."}
-            )
+            raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})

         for addr in ip_addresses:
             ip = ipaddress.ip_address(addr[4][0])
             if ip.is_private or ip.is_loopback:
-                raise serializers.ValidationError(
-                    {"url": "URL resolves to a blocked IP address."}
-                )
+                raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})

         # Additional validation for multiple request domains and their subdomains
-        request = self.context.get("request")
-        disallowed_domains = [
-            "plane.so",
-        ]  # Add your disallowed domains here
+        request = self.context.get('request')
+        disallowed_domains = ['plane.so',]  # Add your disallowed domains here
         if request:
-            request_host = request.get_host().split(":")[
-                0
-            ]  # Remove port if present
+            request_host = request.get_host().split(':')[0]  # Remove port if present
             disallowed_domains.append(request_host)

         # Check if hostname is a subdomain or exact match of any disallowed domain
-        if any(
-            hostname == domain or hostname.endswith("." + domain)
-            for domain in disallowed_domains
-        ):
-            raise serializers.ValidationError(
-                {"url": "URL domain or its subdomain is not allowed."}
-            )
+        if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
+            raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})

         return Webhook.objects.create(**validated_data)

     def update(self, instance, validated_data):
         url = validated_data.get("url", None)
         if url:
             # Extract the hostname from the URL
             hostname = urlparse(url).hostname
             if not hostname:
-                raise serializers.ValidationError(
-                    {"url": "Invalid URL: No hostname found."}
-                )
+                raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})

             # Resolve the hostname to IP addresses
             try:
                 ip_addresses = socket.getaddrinfo(hostname, None)
             except socket.gaierror:
-                raise serializers.ValidationError(
-                    {"url": "Hostname could not be resolved."}
-                )
+                raise serializers.ValidationError({"url": "Hostname could not be resolved."})

             if not ip_addresses:
-                raise serializers.ValidationError(
-                    {"url": "No IP addresses found for the hostname."}
-                )
+                raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})

             for addr in ip_addresses:
                 ip = ipaddress.ip_address(addr[4][0])
                 if ip.is_private or ip.is_loopback:
-                    raise serializers.ValidationError(
-                        {"url": "URL resolves to a blocked IP address."}
-                    )
+                    raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})

             # Additional validation for multiple request domains and their subdomains
-            request = self.context.get("request")
-            disallowed_domains = [
-                "plane.so",
-            ]  # Add your disallowed domains here
+            request = self.context.get('request')
+            disallowed_domains = ['plane.so',]  # Add your disallowed domains here
             if request:
-                request_host = request.get_host().split(":")[
-                    0
-                ]  # Remove port if present
+                request_host = request.get_host().split(':')[0]  # Remove port if present
                 disallowed_domains.append(request_host)

             # Check if hostname is a subdomain or exact match of any disallowed domain
-            if any(
-                hostname == domain or hostname.endswith("." + domain)
-                for domain in disallowed_domains
-            ):
-                raise serializers.ValidationError(
-                    {"url": "URL domain or its subdomain is not allowed."}
-                )
+            if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
+                raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})

         return super().update(instance, validated_data)

@@ -129,7 +95,12 @@ class WebhookSerializer(DynamicBaseSerializer):

 class WebhookLogSerializer(DynamicBaseSerializer):
     class Meta:
         model = WebhookLog
         fields = "__all__"
-        read_only_fields = ["workspace", "webhook"]
+        read_only_fields = [
+            "workspace",
+            "webhook"
+        ]
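The webhook serializer above guards against SSRF by rejecting URLs that resolve to private or loopback addresses, and it blocks the deploy's own domains and their subdomains. Both tests can be isolated as pure functions (hypothetical names; DNS resolution is left out so the sketch stays offline):

```python
import ipaddress


def is_blocked_ip(ip_text):
    # Same per-address test the serializer applies after getaddrinfo().
    ip = ipaddress.ip_address(ip_text)
    return ip.is_private or ip.is_loopback


def is_disallowed_host(hostname, disallowed_domains):
    # Exact match, or subdomain of, any disallowed domain.
    return any(
        hostname == domain or hostname.endswith("." + domain)
        for domain in disallowed_domains
    )
```

Note the `endswith("." + domain)` form: it matches `api.plane.so` but not a lookalike such as `notplane.so`, which a bare `endswith(domain)` would incorrectly block.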


@@ -2,7 +2,7 @@
 from rest_framework import serializers

 # Module imports
-from .base import BaseSerializer, DynamicBaseSerializer
+from .base import BaseSerializer
 from .user import UserLiteSerializer, UserAdminLiteSerializer

 from plane.db.models import (
@@ -13,11 +13,10 @@ from plane.db.models import (
     TeamMember,
     WorkspaceMemberInvite,
     WorkspaceTheme,
-    WorkspaceUserProperties,
 )


-class WorkSpaceSerializer(DynamicBaseSerializer):
+class WorkSpaceSerializer(BaseSerializer):
     owner = UserLiteSerializer(read_only=True)
     total_members = serializers.IntegerField(read_only=True)
     total_issues = serializers.IntegerField(read_only=True)
@@ -51,7 +50,6 @@ class WorkSpaceSerializer(DynamicBaseSerializer):
             "owner",
         ]

 class WorkspaceLiteSerializer(BaseSerializer):
     class Meta:
         model = Workspace
@@ -63,7 +61,8 @@ class WorkspaceLiteSerializer(BaseSerializer):
         read_only_fields = fields


-class WorkSpaceMemberSerializer(DynamicBaseSerializer):
+
+class WorkSpaceMemberSerializer(BaseSerializer):
     member = UserLiteSerializer(read_only=True)
     workspace = WorkspaceLiteSerializer(read_only=True)

@@ -73,12 +72,13 @@ class WorkSpaceMemberSerializer(BaseSerializer):

 class WorkspaceMemberMeSerializer(BaseSerializer):
     class Meta:
         model = WorkspaceMember
         fields = "__all__"


-class WorkspaceMemberAdminSerializer(DynamicBaseSerializer):
+class WorkspaceMemberAdminSerializer(BaseSerializer):
     member = UserAdminLiteSerializer(read_only=True)
     workspace = WorkspaceLiteSerializer(read_only=True)

@@ -95,22 +95,10 @@ class WorkSpaceMemberInviteSerializer(BaseSerializer):
     class Meta:
         model = WorkspaceMemberInvite
         fields = "__all__"
-        read_only_fields = [
-            "id",
-            "email",
-            "token",
-            "workspace",
-            "message",
-            "responded_at",
-            "created_at",
-            "updated_at",
-        ]


 class TeamSerializer(BaseSerializer):
-    members_detail = UserLiteSerializer(
-        read_only=True, source="members", many=True
-    )
+    members_detail = UserLiteSerializer(read_only=True, source="members", many=True)
     members = serializers.ListField(
         child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
         write_only=True,
@@ -147,9 +135,7 @@ class TeamSerializer(BaseSerializer):
             members = validated_data.pop("members")
             TeamMember.objects.filter(team=instance).delete()
             team_members = [
-                TeamMember(
-                    member=member, team=instance, workspace=instance.workspace
-                )
+                TeamMember(member=member, team=instance, workspace=instance.workspace)
                 for member in members
             ]
             TeamMember.objects.bulk_create(team_members, batch_size=10)
@@ -165,13 +151,3 @@ class WorkspaceThemeSerializer(BaseSerializer):
             "workspace",
             "actor",
         ]
-
-
-class WorkspaceUserPropertiesSerializer(BaseSerializer):
-    class Meta:
-        model = WorkspaceUserProperties
-        fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "user",
-        ]
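`TeamSerializer.update()` above replaces team memberships wholesale: it deletes the existing `TeamMember` rows, then bulk-creates one row per submitted member. The same replace-all semantics in plain Python (an in-memory list of tuples stands in for the ORM table; the helper name is hypothetical):

```python
def replace_team_members(store, team_id, members):
    """Replace-all update: drop existing rows for the team, add new ones.

    ``store`` is an in-memory stand-in for the TeamMember table,
    held as a list of (team_id, member) tuples.
    """
    # Mirrors TeamMember.objects.filter(team=instance).delete()
    store[:] = [row for row in store if row[0] != team_id]
    # Mirrors TeamMember.objects.bulk_create(...)
    store.extend((team_id, member) for member in members)
    return store
```

Replace-all is simpler than diffing old and new member sets, at the cost of rewriting rows that did not change.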


@@ -3,10 +3,11 @@ from .asset import urlpatterns as asset_urls
 from .authentication import urlpatterns as authentication_urls
 from .config import urlpatterns as configuration_urls
 from .cycle import urlpatterns as cycle_urls
-from .dashboard import urlpatterns as dashboard_urls
 from .estimate import urlpatterns as estimate_urls
 from .external import urlpatterns as external_urls
+from .importer import urlpatterns as importer_urls
 from .inbox import urlpatterns as inbox_urls
+from .integration import urlpatterns as integration_urls
 from .issue import urlpatterns as issue_urls
 from .module import urlpatterns as module_urls
 from .notification import urlpatterns as notification_urls
@@ -27,10 +28,11 @@ urlpatterns = [
     *authentication_urls,
     *configuration_urls,
     *cycle_urls,
-    *dashboard_urls,
     *estimate_urls,
     *external_urls,
+    *importer_urls,
     *inbox_urls,
+    *integration_urls,
     *issue_urls,
     *module_urls,
     *notification_urls,
@@ -43,4 +45,4 @@ urlpatterns = [
     *workspace_urls,
     *api_urls,
     *webhook_urls,
 ]


@@ -31,14 +31,8 @@ urlpatterns = [
     path("sign-in/", SignInEndpoint.as_view(), name="sign-in"),
     path("sign-out/", SignOutEndpoint.as_view(), name="sign-out"),
     # magic sign in
-    path(
-        "magic-generate/",
-        MagicGenerateEndpoint.as_view(),
-        name="magic-generate",
-    ),
-    path(
-        "magic-sign-in/", MagicSignInEndpoint.as_view(), name="magic-sign-in"
-    ),
+    path("magic-generate/", MagicGenerateEndpoint.as_view(), name="magic-generate"),
+    path("magic-sign-in/", MagicSignInEndpoint.as_view(), name="magic-sign-in"),
     path("token/refresh/", TokenRefreshView.as_view(), name="token_refresh"),
     # Password Manipulation
     path(
@@ -58,8 +52,6 @@ urlpatterns = [
     ),
     # API Tokens
     path("api-tokens/", ApiTokenEndpoint.as_view(), name="api-tokens"),
-    path(
-        "api-tokens/<uuid:pk>/", ApiTokenEndpoint.as_view(), name="api-tokens"
-    ),
+    path("api-tokens/<uuid:pk>/", ApiTokenEndpoint.as_view(), name="api-tokens"),
     ## End API Tokens
 ]


@@ -1,7 +1,7 @@
 from django.urls import path


-from plane.app.views import ConfigurationEndpoint, MobileConfigurationEndpoint
+from plane.app.views import ConfigurationEndpoint

 urlpatterns = [
     path(
@@ -9,9 +9,4 @@ urlpatterns = [
         ConfigurationEndpoint.as_view(),
         name="configuration",
     ),
-    path(
-        "mobile-configs/",
-        MobileConfigurationEndpoint.as_view(),
-        name="configuration",
-    ),
 ]


@@ -7,8 +7,6 @@ from plane.app.views import (
     CycleDateCheckEndpoint,
     CycleFavoriteViewSet,
     TransferCycleIssueEndpoint,
-    CycleUserPropertiesEndpoint,
-    CycleArchiveUnarchiveEndpoint,
 )
@@ -46,7 +44,7 @@ urlpatterns = [
         name="project-issue-cycle",
     ),
     path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:issue_id>/",
+        "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:pk>/",
         CycleIssueViewSet.as_view(
             {
                 "get": "retrieve",
@@ -86,19 +84,4 @@ urlpatterns = [
         TransferCycleIssueEndpoint.as_view(),
         name="transfer-issues",
     ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/user-properties/",
-        CycleUserPropertiesEndpoint.as_view(),
-        name="cycle-user-filters",
-    ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/archive/",
-        CycleArchiveUnarchiveEndpoint.as_view(),
-        name="cycle-archive-unarchive",
-    ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/archived-cycles/",
-        CycleArchiveUnarchiveEndpoint.as_view(),
-        name="cycle-archive-unarchive",
-    ),
 ]


@@ -1,23 +0,0 @@
-from django.urls import path
-from plane.app.views import DashboardEndpoint, WidgetsEndpoint
-urlpatterns = [
-    path(
-        "workspaces/<str:slug>/dashboard/",
-        DashboardEndpoint.as_view(),
-        name="dashboard",
-    ),
-    path(
-        "workspaces/<str:slug>/dashboard/<uuid:dashboard_id>/",
-        DashboardEndpoint.as_view(),
-        name="dashboard",
-    ),
-    path(
-        "dashboard/<uuid:dashboard_id>/widgets/<uuid:widget_id>/",
-        WidgetsEndpoint.as_view(),
-        name="widgets",
-    ),
-]


@@ -2,6 +2,7 @@ from django.urls import path
 from plane.app.views import UnsplashEndpoint
+from plane.app.views import ReleaseNotesEndpoint
 from plane.app.views import GPTIntegrationEndpoint
@@ -11,6 +12,11 @@ urlpatterns = [
         UnsplashEndpoint.as_view(),
         name="unsplash",
     ),
+    path(
+        "release-notes/",
+        ReleaseNotesEndpoint.as_view(),
+        name="release-notes",
+    ),
     path(
         "workspaces/<str:slug>/projects/<uuid:project_id>/ai-assistant/",
         GPTIntegrationEndpoint.as_view(),


@@ -0,0 +1,37 @@
+from django.urls import path
+from plane.app.views import (
+    ServiceIssueImportSummaryEndpoint,
+    ImportServiceEndpoint,
+    UpdateServiceImportStatusEndpoint,
+)
+urlpatterns = [
+    path(
+        "workspaces/<str:slug>/importers/<str:service>/",
+        ServiceIssueImportSummaryEndpoint.as_view(),
+        name="importer-summary",
+    ),
+    path(
+        "workspaces/<str:slug>/projects/importers/<str:service>/",
+        ImportServiceEndpoint.as_view(),
+        name="importer",
+    ),
+    path(
+        "workspaces/<str:slug>/importers/",
+        ImportServiceEndpoint.as_view(),
+        name="importer",
+    ),
+    path(
+        "workspaces/<str:slug>/importers/<str:service>/<uuid:pk>/",
+        ImportServiceEndpoint.as_view(),
+        name="importer",
+    ),
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/service/<str:service>/importers/<uuid:importer_id>/",
+        UpdateServiceImportStatusEndpoint.as_view(),
+        name="importer-status",
+    ),
+]


@@ -30,7 +30,7 @@ urlpatterns = [
         name="inbox",
     ),
     path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
+        "workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/",
         InboxIssueViewSet.as_view(
             {
                 "get": "list",
@@ -40,7 +40,7 @@ urlpatterns = [
         name="inbox-issue",
     ),
     path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:issue_id>/",
+        "workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/<uuid:pk>/",
         InboxIssueViewSet.as_view(
             {
                 "get": "retrieve",


@@ -0,0 +1,150 @@
+from django.urls import path
+from plane.app.views import (
+    IntegrationViewSet,
+    WorkspaceIntegrationViewSet,
+    GithubRepositoriesEndpoint,
+    GithubRepositorySyncViewSet,
+    GithubIssueSyncViewSet,
+    GithubCommentSyncViewSet,
+    BulkCreateGithubIssueSyncEndpoint,
+    SlackProjectSyncViewSet,
+)
+urlpatterns = [
+    path(
+        "integrations/",
+        IntegrationViewSet.as_view(
+            {
+                "get": "list",
+                "post": "create",
+            }
+        ),
+        name="integrations",
+    ),
+    path(
+        "integrations/<uuid:pk>/",
+        IntegrationViewSet.as_view(
+            {
+                "get": "retrieve",
+                "patch": "partial_update",
+                "delete": "destroy",
+            }
+        ),
+        name="integrations",
+    ),
+    path(
+        "workspaces/<str:slug>/workspace-integrations/",
+        WorkspaceIntegrationViewSet.as_view(
+            {
+                "get": "list",
+            }
+        ),
+        name="workspace-integrations",
+    ),
+    path(
+        "workspaces/<str:slug>/workspace-integrations/<str:provider>/",
+        WorkspaceIntegrationViewSet.as_view(
+            {
+                "post": "create",
+            }
+        ),
+        name="workspace-integrations",
+    ),
+    path(
+        "workspaces/<str:slug>/workspace-integrations/<uuid:pk>/provider/",
+        WorkspaceIntegrationViewSet.as_view(
+            {
+                "get": "retrieve",
+                "delete": "destroy",
+            }
+        ),
+        name="workspace-integrations",
+    ),
+    # Github Integrations
+    path(
+        "workspaces/<str:slug>/workspace-integrations/<uuid:workspace_integration_id>/github-repositories/",
+        GithubRepositoriesEndpoint.as_view(),
+    ),
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/workspace-integrations/<uuid:workspace_integration_id>/github-repository-sync/",
+        GithubRepositorySyncViewSet.as_view(
+            {
+                "get": "list",
+                "post": "create",
+            }
+        ),
+    ),
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/workspace-integrations/<uuid:workspace_integration_id>/github-repository-sync/<uuid:pk>/",
+        GithubRepositorySyncViewSet.as_view(
+            {
+                "get": "retrieve",
+                "delete": "destroy",
+            }
+        ),
+    ),
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/github-repository-sync/<uuid:repo_sync_id>/github-issue-sync/",
+        GithubIssueSyncViewSet.as_view(
+            {
+                "post": "create",
+                "get": "list",
+            }
+        ),
+    ),
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/github-repository-sync/<uuid:repo_sync_id>/bulk-create-github-issue-sync/",
+        BulkCreateGithubIssueSyncEndpoint.as_view(),
+    ),
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/github-repository-sync/<uuid:repo_sync_id>/github-issue-sync/<uuid:pk>/",
+        GithubIssueSyncViewSet.as_view(
+            {
+                "get": "retrieve",
+                "delete": "destroy",
+            }
+        ),
+    ),
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/github-repository-sync/<uuid:repo_sync_id>/github-issue-sync/<uuid:issue_sync_id>/github-comment-sync/",
+        GithubCommentSyncViewSet.as_view(
+            {
+                "post": "create",
+                "get": "list",
+            }
+        ),
+    ),
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/github-repository-sync/<uuid:repo_sync_id>/github-issue-sync/<uuid:issue_sync_id>/github-comment-sync/<uuid:pk>/",
+        GithubCommentSyncViewSet.as_view(
+            {
+                "get": "retrieve",
+                "delete": "destroy",
+            }
+        ),
+    ),
+    ## End Github Integrations
+    # Slack Integration
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/workspace-integrations/<uuid:workspace_integration_id>/project-slack-sync/",
+        SlackProjectSyncViewSet.as_view(
+            {
+                "post": "create",
+                "get": "list",
+            }
+        ),
+    ),
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/workspace-integrations/<uuid:workspace_integration_id>/project-slack-sync/<uuid:pk>/",
+        SlackProjectSyncViewSet.as_view(
+            {
+                "delete": "destroy",
+                "get": "retrieve",
+            }
+        ),
+    ),
+    ## End Slack Integration
+]


@@ -1,32 +1,30 @@
 from django.urls import path
 from plane.app.views import (
+    IssueViewSet,
+    LabelViewSet,
     BulkCreateIssueLabelsEndpoint,
     BulkDeleteIssuesEndpoint,
+    BulkImportIssuesEndpoint,
+    UserWorkSpaceIssues,
     SubIssuesEndpoint,
     IssueLinkViewSet,
     IssueAttachmentEndpoint,
-    CommentReactionViewSet,
     ExportIssuesEndpoint,
     IssueActivityEndpoint,
-    IssueArchiveViewSet,
     IssueCommentViewSet,
-    IssueDraftViewSet,
-    IssueListEndpoint,
-    IssueReactionViewSet,
-    IssueRelationViewSet,
     IssueSubscriberViewSet,
+    IssueReactionViewSet,
+    CommentReactionViewSet,
     IssueUserDisplayPropertyEndpoint,
-    IssueViewSet,
-    LabelViewSet,
+    IssueArchiveViewSet,
+    IssueRelationViewSet,
+    IssueDraftViewSet,
 )
 urlpatterns = [
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/issues/list/",
-        IssueListEndpoint.as_view(),
-        name="project-issue",
-    ),
     path(
         "workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
         IssueViewSet.as_view(
@@ -81,7 +79,16 @@ urlpatterns = [
         BulkDeleteIssuesEndpoint.as_view(),
         name="project-issues-bulk",
     ),
-    ##
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/bulk-import-issues/<str:service>/",
+        BulkImportIssuesEndpoint.as_view(),
+        name="project-issues-bulk",
+    ),
+    path(
+        "workspaces/<str:slug>/my-issues/",
+        UserWorkSpaceIssues.as_view(),
+        name="workspace-issues",
+    ),
     path(
         "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/sub-issues/",
         SubIssuesEndpoint.as_view(),
@@ -228,7 +235,7 @@ urlpatterns = [
     ## End Comment Reactions
     ## IssueProperty
     path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/user-properties/",
+        "workspaces/<str:slug>/projects/<uuid:project_id>/issue-display-properties/",
         IssueUserDisplayPropertyEndpoint.as_view(),
         name="project-issue-display-properties",
     ),
@@ -244,15 +251,23 @@ urlpatterns = [
         name="project-issue-archive",
     ),
     path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/archive/",
+        "workspaces/<str:slug>/projects/<uuid:project_id>/archived-issues/<uuid:pk>/",
         IssueArchiveViewSet.as_view(
             {
                 "get": "retrieve",
-                "post": "archive",
-                "delete": "unarchive",
+                "delete": "destroy",
             }
         ),
-        name="project-issue-archive-unarchive",
+        name="project-issue-archive",
+    ),
+    path(
+        "workspaces/<str:slug>/projects/<uuid:project_id>/unarchive/<uuid:pk>/",
+        IssueArchiveViewSet.as_view(
+            {
+                "post": "unarchive",
+            }
+        ),
+        name="project-issue-archive",
     ),
     ## End Issue Archives
     ## Issue Relation
@@ -260,17 +275,16 @@ urlpatterns = [
         "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-relation/",
         IssueRelationViewSet.as_view(
             {
-                "get": "list",
                 "post": "create",
             }
         ),
         name="issue-relation",
     ),
     path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/remove-relation/",
+        "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-relation/<uuid:pk>/",
         IssueRelationViewSet.as_view(
             {
-                "post": "remove_relation",
+                "delete": "destroy",
             }
         ),
         name="issue-relation",


@@ -6,8 +6,7 @@ from plane.app.views import (
     ModuleIssueViewSet,
     ModuleLinkViewSet,
     ModuleFavoriteViewSet,
-    ModuleUserPropertiesEndpoint,
-    ModuleArchiveUnarchiveEndpoint,
+    BulkImportModulesEndpoint,
 )
@@ -35,26 +34,17 @@ urlpatterns = [
         name="project-modules",
     ),
     path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/modules/",
+        "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/",
         ModuleIssueViewSet.as_view(
             {
-                "post": "create_issue_modules",
-            }
-        ),
-        name="issue-module",
-    ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/issues/",
-        ModuleIssueViewSet.as_view(
-            {
-                "post": "create_module_issues",
                 "get": "list",
+                "post": "create",
             }
         ),
         name="project-module-issues",
     ),
     path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/issues/<uuid:issue_id>/",
+        "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:pk>/",
         ModuleIssueViewSet.as_view(
             {
                 "get": "retrieve",
@@ -107,18 +97,8 @@ urlpatterns = [
         name="user-favorite-module",
     ),
     path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/user-properties/",
-        ModuleUserPropertiesEndpoint.as_view(),
-        name="cycle-user-filters",
-    ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/archive/",
-        ModuleArchiveUnarchiveEndpoint.as_view(),
-        name="module-archive-unarchive",
-    ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/archived-modules/",
-        ModuleArchiveUnarchiveEndpoint.as_view(),
-        name="module-archive-unarchive",
+        "workspaces/<str:slug>/projects/<uuid:project_id>/bulk-import-modules/<str:service>/",
+        BulkImportModulesEndpoint.as_view(),
+        name="bulk-modules-create",
     ),
 ]


@@ -5,7 +5,6 @@ from plane.app.views import (
     NotificationViewSet,
     UnreadNotificationEndpoint,
     MarkAllReadNotificationViewSet,
-    UserNotificationPreferenceEndpoint,
 )
@@ -64,9 +63,4 @@ urlpatterns = [
         ),
         name="mark-all-read-notifications",
     ),
-    path(
-        "users/me/notification-preferences/",
-        UserNotificationPreferenceEndpoint.as_view(),
-        name="user-notification-preferences",
-    ),
 ]


@@ -13,8 +13,6 @@ from plane.app.views import (
     UserProjectInvitationsViewset,
     ProjectPublicCoverImagesEndpoint,
     ProjectDeployBoardViewSet,
-    UserProjectRolesEndpoint,
-    ProjectArchiveUnarchiveEndpoint,
 )
@@ -76,11 +74,6 @@ urlpatterns = [
         ),
         name="user-project-invitations",
     ),
-    path(
-        "users/me/workspaces/<str:slug>/project-roles/",
-        UserProjectRolesEndpoint.as_view(),
-        name="user-project-roles",
-    ),
     path(
         "workspaces/<str:slug>/projects/<uuid:project_id>/join/<uuid:pk>/",
         ProjectJoinEndpoint.as_view(),
@@ -176,9 +169,4 @@ urlpatterns = [
         ),
         name="project-deploy-board",
     ),
-    path(
-        "workspaces/<str:slug>/projects/<uuid:project_id>/archive/",
-        ProjectArchiveUnarchiveEndpoint.as_view(),
-        name="project-archive-unarchive",
-    ),
 ]


@@ -5,7 +5,7 @@ from plane.app.views import (
     IssueViewViewSet,
     GlobalViewViewSet,
     GlobalViewIssuesViewSet,
     IssueViewFavoriteViewSet,
 )


@@ -18,13 +18,6 @@ from plane.app.views import (
     WorkspaceUserProfileEndpoint,
     WorkspaceUserProfileIssuesEndpoint,
     WorkspaceLabelsEndpoint,
-    WorkspaceProjectMemberEndpoint,
-    WorkspaceUserPropertiesEndpoint,
-    WorkspaceStatesEndpoint,
-    WorkspaceEstimatesEndpoint,
-    ExportWorkspaceUserActivityEndpoint,
-    WorkspaceModulesEndpoint,
-    WorkspaceCyclesEndpoint,
 )
@@ -72,7 +65,6 @@ urlpatterns = [
             {
                 "delete": "destroy",
                 "get": "retrieve",
-                "patch": "partial_update",
             }
         ),
         name="workspace-invitations",
@@ -99,11 +91,6 @@ urlpatterns = [
         WorkSpaceMemberViewSet.as_view({"get": "list"}),
         name="workspace-member",
     ),
-    path(
-        "workspaces/<str:slug>/project-members/",
-        WorkspaceProjectMemberEndpoint.as_view(),
-        name="workspace-member-roles",
-    ),
     path(
         "workspaces/<str:slug>/members/<uuid:pk>/",
         WorkSpaceMemberViewSet.as_view(
@@ -192,11 +179,6 @@ urlpatterns = [
         WorkspaceUserActivityEndpoint.as_view(),
         name="workspace-user-activity",
     ),
-    path(
-        "workspaces/<str:slug>/user-activity/<uuid:user_id>/export/",
-        ExportWorkspaceUserActivityEndpoint.as_view(),
-        name="export-workspace-user-activity",
-    ),
     path(
         "workspaces/<str:slug>/user-profile/<uuid:user_id>/",
         WorkspaceUserProfileEndpoint.as_view(),
@@ -212,29 +194,4 @@ urlpatterns = [
         WorkspaceLabelsEndpoint.as_view(),
         name="workspace-labels",
     ),
-    path(
-        "workspaces/<str:slug>/user-properties/",
-        WorkspaceUserPropertiesEndpoint.as_view(),
-        name="workspace-user-filters",
-    ),
-    path(
-        "workspaces/<str:slug>/states/",
-        WorkspaceStatesEndpoint.as_view(),
-        name="workspace-state",
-    ),
-    path(
-        "workspaces/<str:slug>/estimates/",
-        WorkspaceEstimatesEndpoint.as_view(),
-        name="workspace-estimate",
-    ),
-    path(
-        "workspaces/<str:slug>/modules/",
-        WorkspaceModulesEndpoint.as_view(),
-        name="workspace-modules",
-    ),
-    path(
-        "workspaces/<str:slug>/cycles/",
-        WorkspaceCyclesEndpoint.as_view(),
-        name="workspace-cycles",
-    ),
 ]
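Nearly every route in these files captures URL segments with Django path converters such as `<str:slug>` and `<uuid:pk>`. The idea can be sketched as a regex translation (illustrative only, not Django's actual implementation, which supports more converter types and typed conversion of the captured values):

```python
# Sketch of how Django-style path converters capture URL segments:
# <str:name> matches any non-slash run, <uuid:name> matches a canonical UUID,
# and each becomes a named regex group.
import re

def compile_route(route: str) -> re.Pattern:
    pattern = re.sub(r"<str:(\w+)>", r"(?P<\1>[^/]+)", route)
    pattern = re.sub(
        r"<uuid:(\w+)>",
        r"(?P<\1>[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})",
        pattern,
    )
    return re.compile("^" + pattern + "$")

rx = compile_route("workspaces/<str:slug>/projects/<uuid:project_id>/")
m = rx.match("workspaces/acme/projects/123e4567-e89b-12d3-a456-426614174000/")
```

The captured names (`slug`, `project_id`, `pk`, and so on) are what Django passes as keyword arguments to the view, which is why renaming `<uuid:issue_id>` to `<uuid:pk>` in these diffs also changes the view's expected signature.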

File diff suppressed because it is too large


@@ -1,27 +1,18 @@
-from .project.base import (
+from .project import (
     ProjectViewSet,
+    ProjectMemberViewSet,
+    UserProjectInvitationsViewset,
+    ProjectInvitationsViewset,
+    AddTeamToProjectEndpoint,
     ProjectIdentifierEndpoint,
+    ProjectJoinEndpoint,
     ProjectUserViewsEndpoint,
+    ProjectMemberUserEndpoint,
     ProjectFavoritesViewSet,
     ProjectPublicCoverImagesEndpoint,
     ProjectDeployBoardViewSet,
-    ProjectArchiveUnarchiveEndpoint,
 )
-from .project.invite import (
-    UserProjectInvitationsViewset,
-    ProjectInvitationsViewset,
-    ProjectJoinEndpoint,
-)
-from .project.member import (
-    ProjectMemberViewSet,
-    AddTeamToProjectEndpoint,
-    ProjectMemberUserEndpoint,
-    UserProjectRolesEndpoint,
-)
-from .user.base import (
+from .user import (
     UserEndpoint,
     UpdateUserOnBoardedEndpoint,
     UpdateUserTourCompletedEndpoint,
@@ -32,122 +23,62 @@ from .oauth import OauthEndpoint
 from .base import BaseAPIView, BaseViewSet, WebhookMixin
-from .workspace.base import (
+from .workspace import (
     WorkSpaceViewSet,
     UserWorkSpacesEndpoint,
     WorkSpaceAvailabilityCheckEndpoint,
-    UserWorkspaceDashboardEndpoint,
-    WorkspaceThemeViewSet,
-    ExportWorkspaceUserActivityEndpoint
-)
-from .workspace.member import (
+    WorkspaceJoinEndpoint,
     WorkSpaceMemberViewSet,
     TeamMemberViewSet,
-    WorkspaceMemberUserEndpoint,
-    WorkspaceProjectMemberEndpoint,
-    WorkspaceMemberUserViewsEndpoint,
-)
-from .workspace.invite import (
     WorkspaceInvitationsViewset,
-    WorkspaceJoinEndpoint,
     UserWorkspaceInvitationsViewSet,
-)
-from .workspace.label import (
-    WorkspaceLabelsEndpoint,
-)
-from .workspace.state import (
-    WorkspaceStatesEndpoint,
-)
-from .workspace.user import (
     UserLastProjectWithWorkspaceEndpoint,
-    WorkspaceUserProfileIssuesEndpoint,
-    WorkspaceUserPropertiesEndpoint,
-    WorkspaceUserProfileEndpoint,
-    WorkspaceUserActivityEndpoint,
-    WorkspaceUserProfileStatsEndpoint,
+    WorkspaceMemberUserEndpoint,
+    WorkspaceMemberUserViewsEndpoint,
     UserActivityGraphEndpoint,
     UserIssueCompletedGraphEndpoint,
+    UserWorkspaceDashboardEndpoint,
+    WorkspaceThemeViewSet,
+    WorkspaceUserProfileStatsEndpoint,
+    WorkspaceUserActivityEndpoint,
+    WorkspaceUserProfileEndpoint,
+    WorkspaceUserProfileIssuesEndpoint,
+    WorkspaceLabelsEndpoint,
 )
-from .workspace.estimate import (
-    WorkspaceEstimatesEndpoint,
-)
-from .workspace.module import (
-    WorkspaceModulesEndpoint,
-)
-from .workspace.cycle import (
-    WorkspaceCyclesEndpoint,
-)
-from .state.base import StateViewSet
-from .view.base import (
+from .state import StateViewSet
+from .view import (
     GlobalViewViewSet,
     GlobalViewIssuesViewSet,
     IssueViewViewSet,
     IssueViewFavoriteViewSet,
 )
-from .cycle.base import (
+from .cycle import (
     CycleViewSet,
+    CycleIssueViewSet,
     CycleDateCheckEndpoint,
     CycleFavoriteViewSet,
     TransferCycleIssueEndpoint,
-    CycleArchiveUnarchiveEndpoint,
-    CycleUserPropertiesEndpoint,
 )
-from .cycle.issue import (
-    CycleIssueViewSet,
-)
-from .asset.base import FileAssetEndpoint, UserAssetsEndpoint, FileAssetViewSet
-from .issue.base import (
-    IssueListEndpoint,
+from .asset import FileAssetEndpoint, UserAssetsEndpoint, FileAssetViewSet
+from .issue import (
     IssueViewSet,
-    IssueUserDisplayPropertyEndpoint,
-    BulkDeleteIssuesEndpoint,
-)
-from .issue.activity import (
+    WorkSpaceIssuesEndpoint,
     IssueActivityEndpoint,
-)
-from .issue.archive import (
-    IssueArchiveViewSet,
-)
-from .issue.attachment import (
-    IssueAttachmentEndpoint,
-)
-from .issue.comment import (
     IssueCommentViewSet,
-    CommentReactionViewSet,
-)
-from .issue.draft import IssueDraftViewSet
-from .issue.label import (
+    IssueUserDisplayPropertyEndpoint,
     LabelViewSet,
-    BulkCreateIssueLabelsEndpoint,
-)
-from .issue.link import (
-    IssueLinkViewSet,
-)
-from .issue.relation import (
-    IssueRelationViewSet,
-)
-from .issue.reaction import (
-    IssueReactionViewSet,
-)
-from .issue.sub_issue import (
+    BulkDeleteIssuesEndpoint,
+    UserWorkSpaceIssues,
     SubIssuesEndpoint,
-)
-from .issue.subscriber import (
+    IssueLinkViewSet,
+    BulkCreateIssueLabelsEndpoint,
+    IssueAttachmentEndpoint,
+    IssueArchiveViewSet,
     IssueSubscriberViewSet,
+    CommentReactionViewSet,
+    IssueReactionViewSet,
+    IssueRelationViewSet,
+    IssueDraftViewSet,
 )
 from .auth_extended import (
@@ -166,22 +97,35 @@ from .authentication import (
     MagicSignInEndpoint,
 )
-from .module.base import (
+from .module import (
     ModuleViewSet,
+    ModuleIssueViewSet,
     ModuleLinkViewSet,
     ModuleFavoriteViewSet,
-    ModuleArchiveUnarchiveEndpoint,
-    ModuleUserPropertiesEndpoint,
-)
-from .module.issue import (
-    ModuleIssueViewSet,
 )
 from .api import ApiTokenEndpoint
+from .integration import (
+    WorkspaceIntegrationViewSet,
+    IntegrationViewSet,
+    GithubIssueSyncViewSet,
+    GithubRepositorySyncViewSet,
+    GithubCommentSyncViewSet,
+    GithubRepositoriesEndpoint,
+    BulkCreateGithubIssueSyncEndpoint,
+    SlackProjectSyncViewSet,
+)
+from .importer import (
+    ServiceIssueImportSummaryEndpoint,
+    ImportServiceEndpoint,
+    UpdateServiceImportStatusEndpoint,
+    BulkImportIssuesEndpoint,
+    BulkImportModulesEndpoint,
+)
-from .page.base import (
+from .page import (
     PageViewSet,
     PageFavoriteViewSet,
     PageLogEndpoint,
@@ -191,19 +135,16 @@ from .page.base import (
 from .search import GlobalSearchEndpoint, IssueSearchEndpoint
-from .external.base import (
-    GPTIntegrationEndpoint,
-    UnsplashEndpoint,
-)
+from .external import GPTIntegrationEndpoint, ReleaseNotesEndpoint, UnsplashEndpoint
-from .estimate.base import (
+from .estimate import (
     ProjectEstimatePointEndpoint,
     BulkEstimatePointEndpoint,
 )
-from .inbox.base import InboxViewSet, InboxIssueViewSet
+from .inbox import InboxViewSet, InboxIssueViewSet
-from .analytic.base import (
+from .analytic import (
     AnalyticsEndpoint,
     AnalyticViewViewset,
     SavedAnalyticEndpoint,
@@ -211,23 +152,18 @@ from .analytic.base import (
     DefaultAnalyticsEndpoint,
 )
-from .notification.base import (
+from .notification import (
     NotificationViewSet,
     UnreadNotificationEndpoint,
     MarkAllReadNotificationViewSet,
-    UserNotificationPreferenceEndpoint,
 )
-from .exporter.base import ExportIssuesEndpoint
+from .exporter import ExportIssuesEndpoint
-from .config import ConfigurationEndpoint, MobileConfigurationEndpoint
+from .config import ConfigurationEndpoint
-from .webhook.base import (
+from .webhook import (
     WebhookEndpoint,
     WebhookLogsEndpoint,
     WebhookSecretRegenerateEndpoint,
 )
-from .dashboard.base import DashboardEndpoint, WidgetsEndpoint
-from .error_404 import custom_404_view

Some files were not shown because too many files have changed in this diff