Compare commits

...

169 Commits

Author SHA1 Message Date
NarayanBavisetti
9a3950230c Merge branch 'chore/settings_variable' of github.com:makeplane/plane into chore/settings_variable 2023-12-01 11:46:56 +05:30
NarayanBavisetti
37320aa8cf Merge branch 'develop' of github.com:makeplane/plane into chore/settings_variable 2023-12-01 11:46:23 +05:30
NarayanBavisetti
0ab63ecf39 chore: removed django settings module 2023-12-01 11:42:58 +05:30
NarayanBavisetti
14f9f65a33 Merge branch 'develop' of github.com:makeplane/plane into develop 2023-12-01 11:28:32 +05:30
Nikhil
452c9f0d5b
dev: update email templates (#2948)
* dev: update magic link email

* dev: forgot password mail

* dev: workspace invitation update

* dev: update email templates and task

* dev: remove email verification template

* dev: change all conversation links to issues
2023-11-30 13:30:13 +05:30
rahulramesha
139c0857eb
fix: quick add positioning (#2949)
* fix quick add positioning for kanban and spreadsheet

* fix kanban quick add project identifier
2023-11-30 12:22:43 +05:30
pablohashescobar
75d18adfd3 Merge branch 'develop' of github.com:makeplane/plane into dev/update_emails 2023-11-30 00:21:31 +05:30
pablohashescobar
05a9065b12 dev: update email templates and task 2023-11-30 00:16:16 +05:30
pablohashescobar
daf71f81f9 dev: workspace invitation update 2023-11-29 22:42:33 +05:30
pablohashescobar
cf0e89012c dev: forgot password mail 2023-11-29 22:42:03 +05:30
pablohashescobar
083bd18bc4 dev: update magic link email 2023-11-29 22:41:29 +05:30
Nikhil
2556d052de
chore: status code changed (#2947) 2023-11-29 21:46:54 +05:30
Nikhil
c65915665b
dev: transactional emails (#2946) 2023-11-29 21:46:18 +05:30
Nikhil
6ece5a57e2
dev: instance registration (#2912)
* dev: remove auto script for registration

* dev: make all of the instance admins owners when adding an instance admin

* dev: remove sign out endpoint

* dev: update takeoff script to register the instance

* dev:  reapply instance model

* dev: check none for instance configuration encryptions

* dev: encrypting secrets configuration

* dev: user workflow for registration in instances

* dev: add email automation configuration

* dev: remove unused imports

* dev: realign migrations

* dev: reconfigure license engine registrations

* dev: move email check to background worker

* dev: add sign up

* chore: signup error message

* dev: updated onboarding workflows and instance setting

* dev: updated template for magic login

* chore: page migration changed

* dev: updated migrations and authentication for license and update template for workspace invite

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-29 20:33:37 +05:30
Anmol Singh Bhatia
a779fa1497
dev: instance setup workflow (#2935)
* chore: instance type updated

* chore: instance not ready screen added

* chore: instance layout added

* chore: instance magic sign in endpoint and type added

* chore: instance admin password endpoint added

* chore: instance setup page added

* chore: instance setup form added

* chore: instance layout updated

* fix: instance admin workflow setup

* fix: admin workflow setup

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-29 20:33:08 +05:30
sriram veeraghanta
5eb7740833
fix: removed unused packages and upgraded to next 14 (#2944)
* fix: upgrading next package and removed unused deps

* chore: unused variable removed

* chore: next image icon fix

* chore: unused component removed

* chore: next image icon fix

* chore: replace use-debounce with lodash debounce

* chore: unused component removed

* resolved: fixed issue with next link component

* fix: updates in next config

* fix: updating types pages

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
2023-11-29 20:32:10 +05:30
guru_sainath
86408fcd46
chore: replaced v3 issues endpoints (#2945)
* chore: removed v3 endpoints

* chore: replace v3 issues to normal issues endpoints

* build-error: Build error in new issue structure

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-29 20:26:43 +05:30
Anmol Singh Bhatia
86de3188b1
chore: cycle and module status indicator improvement (#2942) 2023-11-29 20:02:41 +05:30
Anmol Singh Bhatia
c5996382bc
style: empty state improvement (#2943) 2023-11-29 20:02:13 +05:30
rahulramesha
2796754e5f
fix: v3 issues for the layouts (#2941)
* fix drag n drop exception error

* fix peek overlay close buttons

* fix project empty state view

* fix cycle and module empty state view

* add ai options to inbox issue creation

* fix inbox filters for viewers

* fix inbox filters for viewers for project

* disable editing permission for members and viewers

* define accurate types for drag and drop
2023-11-29 19:58:27 +05:30
Lakhan Baheti
3488dc3014
style: deactivate acount modal (#2940) 2023-11-29 19:12:12 +05:30
Bavisetti Narayan
c7a3d8f0a0
chore: v3 global issues (#2938) 2023-11-29 19:08:14 +05:30
Aaryan Khandelwal
6d97dd814a
chore: updated sign in workflow (#2939)
* chore: new sign in workflow

* chore: request new code button added

* chore: create new password form added

* fix: build errors

* chore: remove unused components

* chore: update submitting state texts

* fix: oauth sign in process
2023-11-29 19:07:33 +05:30
Lakhan Baheti
804c03bd15
chore: redirection to profile after workpace delete/leave (#2937) 2023-11-29 16:52:54 +05:30
Manish Gupta
59981e3e68
fix: Branch Build and Self hosting fixes (#2930)
* Branch build yml modified to create preview and latest tags

* self host install modified to handle public image only

* testing update-docker

* testing

* wip

* rolled back to original

* selfhosting readme updated
2023-11-29 14:58:31 +05:30
Aaryan Khandelwal
1be1b9f4a3
chore: issue peek overview (#2918)
* chore: autorun for the issue detail store

* fix: labels mutation

* chore: remove old peek overview code

* chore: move add to cycle and module logic to store

* fix: build errors

* chore: add peekProjectId query param for the peek overview

* chore: update profile layout

* fix: multiple workspaces

* style: Issue activity and link design improvements in Peek overview.
* fix issue with labels not occupying full width.
* fix links overflow issue.
* add tooltip in links to display entire link.
* add functionality to copy links to clipboard.

* chore: peek overview for all the layouts

* fix: build errors

---------

Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-29 14:25:57 +05:30
Lakhan Baheti
7cd02401c1
fix: spreadsheet layout issue properties (#2936)
* fix: spreadsheet layout state column state name & tooltip

* fix: label select dropdown first item auto active state

* fix: priority column padding & tooltip position
2023-11-29 14:20:58 +05:30
Bavisetti Narayan
6d175c0e56
chore: api rate limiting (#2933) 2023-11-29 14:14:26 +05:30
Bavisetti Narayan
6a4f521b47
chore: api webhooks validation (#2928)
* chore: api webhooks update

* chore: webhooks signature validation
2023-11-29 14:13:03 +05:30
rahulramesha
cdc4fd27a5
fix issue sorting and add sorting to other properties (#2931) 2023-11-29 14:02:39 +05:30
Aaryan Khandelwal
f4af3db7b6
chore: update the content of webhooks form (#2932) 2023-11-29 14:02:13 +05:30
Aaryan Khandelwal
4f64f431a7
chore: update profile settings layout (#2925)
* chore: update profile layout

* fix: multiple workspaces

* chore: removed breadcrumbs

* chore: fix sidebar collapsed state
2023-11-29 13:48:07 +05:30
Anmol Singh Bhatia
537901f046
style: workspace sidebar scroll fix and improvement (#2934) 2023-11-29 13:32:35 +05:30
Lakhan Baheti
404a61aee5
fix: google auth button content alignment (#2929) 2023-11-29 13:29:38 +05:30
Lakhan Baheti
1a2b1e648d
style: switch or delete account modal (#2926)
* style: switch or delete account modal

* fix: popover text color

* fix: typo
2023-11-29 13:28:24 +05:30
guru_sainath
3400c119bc
fix: drag and drop implementation in calendar layout and kanban layout (#2921)
* fix profile issue filters and kanban

* chore: calendar drag and drop

* chore: kanban drag and drop

* dev: remove issue from the kanban layout and resolved build errors

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-28 19:17:38 +05:30
sabith-tu
d5853405ca
style: new empty state ui (#2923) 2023-11-28 18:47:52 +05:30
rahulramesha
db510dcfcd
fix: all functionalities for profile, archived and draft issues (#2922)
* fix profile issue filters and kanban

* fix profile draft and archived issues
2023-11-28 18:15:46 +05:30
Lakhan Baheti
62c0615012
style: member role visibility (#2919)
* style: member role visibility

* fix: build errors
2023-11-28 17:11:04 +05:30
Bavisetti Narayan
3914a75334
chore: deactivated user workflow change (#2888)
* chore: deactivated user workflow change

* chore: removed archived and draft from v3 by default

* chore: draft and archive update

* chore: bool field

* chore: fall back workspace updated

* chore: workspace member active
2023-11-28 17:08:05 +05:30
Aaryan Khandelwal
0cbb201348
fix: workspace settings pages authorization (#2915)
* fix: workspace settings pages authorization

* chore: user cannot add a member with a higher role than theirs

* chore: update workspace general settings auth
2023-11-28 17:05:42 +05:30
rahulramesha
f7264364bd
add functionality for addition of existing issues to modules and cycles (#2913)
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-28 14:50:37 +05:30
Manish Gupta
35f8ffa5ab
migration script added (#2914) 2023-11-28 13:29:08 +05:30
M. Palanikannan
6607caade7
fix: Image restoration fixed (marks/unmarks an image to be deleted after a week) (#2859)
* image restoration fixed (marks an image to be deleted after a week)

* removed clgs

* added image constraints

* formatted editor-core package using yarn format

* lite-text-editor nothing to format

* rich-text-editor nothing to format

* formatted document-editor with prettier

* modified file service to follow api change

* fixed more formatting in document editor

* fixed all instances of types with that from the package

* fixed delete to work consistently (minor optimizations turned off)

* stop duplicate images inside editor

* restore image on editor creation

say user A deletes image number 2 while user B is in the same issue and still sees the image on their screen; if user B then makes changes that get saved to the backend, image 2 should exist according to user B, but since user A deleted it, it will not be restored and will be deleted in 7 days. Hence I've added a check such that whenever an issue loads we restore all images by default (see the sketch below this commit)

* added restore image function with types

* replaced all instances to have restore image logic

* fixed issue detail for peek view

* disabled option to insert table inside a table
2023-11-28 11:34:20 +05:30
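
A minimal sketch of the restore-on-load behaviour described in the commit above, assuming a TipTap-based editor; restoreAllImages, the restoreImage callback, and the traversal below are illustrative assumptions, not the actual Plane file-service API.

```ts
import { Editor } from "@tiptap/core";

// Hypothetical sketch: when an issue loads, re-mark every image found in the
// document as "kept" so the 7-day cleanup does not delete assets that another
// user removed locally but that are still referenced in the saved description.
export const restoreAllImages = async (
  editor: Editor,
  restoreImage: (assetSrc: string) => Promise<void> // assumed file-service call
): Promise<void> => {
  const restores: Promise<void>[] = [];
  // Walk the ProseMirror document and collect the src of every image node.
  editor.state.doc.descendants((node) => {
    if (node.type.name === "image" && node.attrs.src) {
      restores.push(restoreImage(node.attrs.src as string));
    }
  });
  await Promise.all(restores);
};
```
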
Manish Gupta
8ee8270697
removed container names for selfhosting (#2907) 2023-11-28 11:31:04 +05:30
Aaryan Khandelwal
41ab962dd7
chore: update get invitation details endpoint (#2902) 2023-11-28 11:30:03 +05:30
Prateek Shourya
c22c6bb9b2
Fix: bug fixes and UI / UX improvements (#2906)
* Fix: issue with project publish modal data not updating immediately.

* fix: issue with workspace list not scrollable in profile settings.

* fix: update redirect workspace slug logic to redirect to prev workspace instead of `/`.

* style: update API tokens and webhooks empty state designs.
2023-11-28 11:29:01 +05:30
sriram veeraghanta
67de6d0729
fix: adding ai assistance to pages (#2905)
* fix: adding ai modal to pages

* fix: pages overflow

* chore: update pages UI

* fix: updating page description while using ai assistance

* fix: gpt assistant modal height and position

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-27 20:39:18 +05:30
M. Palanikannan
f361cd045e
image can't be inserted inside table (#2904)
* image can't be inserted inside table

Now we've disabled the image icon from showing up if the cursor is inside a table node or if a table cell is selected (a short sketch follows this commit)

* added drag drop support for document editor

* fixed missing dependencies
2023-11-27 20:37:40 +05:30
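
A small sketch of the table check described in the commit above, assuming a TipTap-based editor; the helper name is an illustrative assumption rather than the actual toolbar code.

```ts
import { Editor } from "@tiptap/core";

// Hypothetical helper: the toolbar shows the image option only when the current
// selection is not inside a table node or a table cell.
export const canInsertImage = (editor: Editor): boolean =>
  !editor.isActive("table") && !editor.isActive("tableCell");
```
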
Prateek Shourya
06d3cd7e73
refactor: Instance admin setting and UI updates. (#2889)
* refactor: shift instance admin restriction content to separate component.
fix: instance components export logic.

* style: fix sidebar dropdown `God Mode` icon padding.

* style: update profile settings user dropdown menu width.

* fix: update input type to `password` for Client Secret and API/ Access Key fields.

* style: update loader design for all forms.

* fix: typo

* style: ui updates.

* chore: add show/ hide button for all password fields.
2023-11-27 19:41:47 +05:30
Aaryan Khandelwal
10c52bf89b
refactor: keyboard shortcuts modal (#2822)
* refactor: keyboard shortcuts modal

* chore: updated search logic

* refactor: divided the modal component into granular components
2023-11-27 18:41:59 +05:30
Anmol Singh Bhatia
b717518fbe
fix: workspace dropdown scroll (#2900) 2023-11-27 18:25:40 +05:30
Aaryan Khandelwal
f48cd6f50c
fix: profile layout flicker (#2898)
* fix: user profile layout flicker

* chore: update import statements
2023-11-27 18:25:16 +05:30
Lakhan Baheti
3203ae6549
fix: progress panel default open (#2894) 2023-11-27 17:21:14 +05:30
Lakhan Baheti
b8f603f920
chore: signup removed (#2890) 2023-11-27 17:17:55 +05:30
Anmol Singh Bhatia
96ff76af94
[FED-1018] chore: workspace dropdown improvement (#2891)
* fix: workspace dropdown improvement

* style: sidebar workspace icon alignment
2023-11-27 17:17:09 +05:30
Anmol Singh Bhatia
830675741f
[FED-1054] fix: join project mutation (#2892)
* fix: join project mutation

* chore: code refactor
2023-11-27 17:16:46 +05:30
srinivas pendem
e489ad50dc
Update README.md (#2893)
./setup.sh removed

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-27 17:16:13 +05:30
Aaryan Khandelwal
3dc18bc8fd
refactor: webhooks (#2896)
* refactor: webhooks workflow

* chore: update delete modal content
2023-11-27 17:15:48 +05:30
Anmol Singh Bhatia
2d04917951
chore: instance admins endpoint added and ui/ux improvement (#2895)
* style: sidebar improvement

* style: header height consistency

* chore: layout consistency and general page improvement

* chore: layout, email form and image form improvement

* chore: instance admins endpoint integrated and code refactor

* chore: code refactor

* chore: google client secret section removed
2023-11-27 17:15:11 +05:30
guru_sainath
2bf7e63625
issues rendering in all issue layouts for profile and project issues and global issues store implementation (#2886)
* dev: draft and archived issue store

* connect draft and archived issues

* kanban for draft issues

* fix filter store for calendar and kanban

* dev: profile issues store and draft issues filters in header

* disable issue creation for draft issues

* dev: profile issues store filters

* disable kanban properties in draft issues

* dev: profile issues store filters

* dev: separated adding issues to the cycle and module as separate methods in cycle and module store

* dev: workspace profile issues store

* dev: sub group issues in the swimlanes

* profile issues and create issue connection

* fix profile issues

* fix spreadsheet issues

* fix disappearing project from create issue modal

* page level modifications

* fix additional bugs

* dev: issues profile and global issues and filters update

* fix issue related bugs

* fix project views for list and kanban

* fix build errors

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-27 14:15:33 +05:30
Anmol Singh Bhatia
eb78fd6088
fix: resolve modal overlapping issue (#2885) 2023-11-27 12:16:59 +05:30
Lakhan Baheti
202ecd21df
fix: bug fixes & UI improvements (#2884)
* chore: access restriction for api tokens

* fix: on create module total issues undefined

* fix: cycle board card typo

* chore: fetch modules after creation

* fix: peek module on delete

* fix: peek cycle on delete

* fix: cycle detail sidebar copy link toast

* chore: router replace -> push
2023-11-27 12:15:10 +05:30
Aaryan Khandelwal
b2ac7b9ac6
chore: revamp the API tokens workflow (#2880)
* chore: added getLayout method to api tokens pages

* revamp: api tokens workflow

* chore: add title validation and update types

* chore: minor UI updates

* chore: update route
2023-11-27 12:14:06 +05:30
Lakhan Baheti
51dff31926
fix: user state after logout (#2849)
* fix: user state after logout

* chore: user state handle with mobx

* chore: signout update for profile setting

* fix: minor fixes

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-25 23:04:56 +05:30
sriram veeraghanta
e89f152779
fix: remove slack notification on build branch workflow (#2881) 2023-11-25 22:43:27 +05:30
Lakhan Baheti
3c9f57f8f4
fix: workspace & user avatar tooltip (#2851)
* fix: workspace & user avatar tooltip

* chore: user name update while typing on top right avatar

* chore: imports placement

* fix: rendering condition

* chore: component re-arrangement

* fix: imports
2023-11-25 21:31:09 +05:30
Bavisetti Narayan
1bc859c68c
chore: seperated delete endpoint for file upload (#2870) 2023-11-25 21:28:03 +05:30
Ramesh Kumar Chandra
11d57a5bf0
fix: track events updated, extra parameters added, added events for issues, pages, states, cycles (#2875)
* fix: event tracking method updated to store, chore: updated and added events for workspace, projects and create issue

* fix: posthog auth event tracking

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-25 21:26:26 +05:30
Prateek Shourya
2980c7b00d
Feat: God Mode UI Updates and More Config Settings (#2877)
* feat: Images in Plane config screen.
* feat: Enable/ Disable Magic Login config toggle.
* style: UX copy and design updates across all screens.
* style: SSO and OAuth Screen revamp.
* style: Enter God Mode button for Profile Settings sidebar.
* fix: update input type to password for password fields.
2023-11-25 21:23:50 +05:30
Anmol Singh Bhatia
5c6a59ba35
dev: badge component added in plane ui package (#2876) 2023-11-25 21:21:03 +05:30
Anmol Singh Bhatia
a3ea7c8f10
fix: issue peek overview state select dropdown overflow fix (#2873) 2023-11-25 21:18:54 +05:30
Anmol Singh Bhatia
cb922fb113
fix: module sidebar date select fix and code refactor (#2872) 2023-11-25 21:18:16 +05:30
sriram veeraghanta
06564ee856
fix: remove slack notify (#2871)
* fix: remove slack notifications on workflows

* fix: bugfix
2023-11-24 14:31:44 +05:30
Nikhil
c7e6118804
refactor: image upload modals, file size limit added to config (#2868)
* chore: add file size limit as config in the config api

* refactor: image upload modals

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-24 13:23:46 +05:30
Manish Gupta
069b8b3ed9
Updated the slack notification message to PR Title (#2869) 2023-11-24 13:11:21 +05:30
Lakhan Baheti
38a5b7bec0
chore: added error toast for invitation (#2853) 2023-11-24 12:47:02 +05:30
Bavisetti Narayan
236caaafe8
chore: user deactivation and login restriction (#2855)
* chore: user deactivation

* chore: deactivation and login disabled

* chore: added get configuration value

* chore: serializer message change

* chore: instance admin password change

* chore: removed triage

* chore: v3 endpoint for user profile

* chore: added enable signin

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-24 12:22:24 +05:30
Nikhil
a6d5eab634
chore: api and webhook refactor (#2861)
* chore: bug fix

* dev: changes in api endpoints for invitations and inbox

* chore: improvements

* dev: update webhook send

* dev: webhook validation and fix webhook flow for app

* dev: error messages for deactivation

* chore: api fixes

* dev: update webhook and workspace leave

* chore: issue comment

* dev: default values for environment variables

* dev: make the user active if they were already a project member

* chore: webhook cycle and module event

* dev: disable ssl for emails

* dev: webhooks restructuring

* dev: updated webhook configuration

* dev: webhooks

* dev: state get object

* dev: update workspace slug validation

* dev: remove deactivation flag if max retries exceeded

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-24 12:19:26 +05:30
sriram veeraghanta
8d76c96a6f
fix: adding slack notification when build is failed to upload to docker (#2862)
* fix: removing logs

* fix: adding slack notification when build is failed to upload to docker

* minor changes

---------

Co-authored-by: Manish Gupta <59428681+manishg3@users.noreply.github.com>
2023-11-24 12:17:31 +05:30
Aaryan Khandelwal
97be4b60ae
chore: update profile and God mode routes (#2860)
* chore: update profile and god mode routes

* fix: profile activity loader

* chore: update profile route in the change password page
2023-11-24 12:16:37 +05:30
Bavisetti Narayan
dece103873
chore: user activity in profile page (#2856)
* chore: user activity endpoint change

* chore: added workspace detail in activity serializer
2023-11-23 21:00:49 +05:30
Anmol Singh Bhatia
c6125876be
fix: view date filter select fix (#2858) 2023-11-23 20:40:41 +05:30
Anmol Singh Bhatia
1f85bf2302
style: module ui improvement (#2838) 2023-11-23 20:39:58 +05:30
Anmol Singh Bhatia
20baba3bb0
style: issue activity section improvement (#2836) 2023-11-23 20:39:18 +05:30
Ramesh Kumar Chandra
85907b32d1
feat: change password page (#2847) 2023-11-23 20:38:50 +05:30
sabith-tu
ef2bef83dc
style: removing extra options heading and drop down icon (#2852) 2023-11-23 20:38:05 +05:30
Aaryan Khandelwal
6e7a96394a
fix: page scroll area (#2850) 2023-11-23 18:22:25 +05:30
Aaryan Khandelwal
5726f6955c
dev: added tailwind merge helper function (#2844) 2023-11-23 17:21:47 +05:30
Aaryan Khandelwal
82665a35ee
fix: archived issues infinite call (#2848) 2023-11-23 16:58:08 +05:30
sriram veeraghanta
4efd225599
fix: updated document editor package in web and space apps (#2846) 2023-11-23 15:27:20 +05:30
sriram veeraghanta
2481706581
chore: optimizations and file name changes (#2845)
* fix: deepsource antipatterns

* fix: deepsource exclude file patterns

* chore: file name changes and removed unwanted variables

* fix: changing version number for editor
2023-11-23 15:09:46 +05:30
guru_sainath
a17b08dd15
chore: implemented new store and issue layouts for issues and updated new data structure for issues (#2843)
* fix: Implemented new workflow in the issue store and updated the quick add workflow in list layout

* fix: initial load and mutaion of issues in list layout

* dev: implemented the new project issues store with grouped, subGrouped and unGrouped issue computed functions

* dev: default display properties data made as a function

* conflict: merge conflict resolved

* dev: implemented quick add logic in kanban

* chore: implemented quick add logic in calendar and spreadsheet layout

* fix: spreadsheet layout quick add fix

* dev: optimised the issues workflow and handled the issues order_by filter

* dev: project issue CRUD operations in new issue store architecture

* dev: issues filtering in calendar layout

* fix: build error

* dev/issue_filters_store

* chore: updated filters computed structure

* conflict: merge conflicts resolved in project issues

* dev: implemented gantt chart for project issues using the new mobx store

* dev: initialized cycle and module issue filters store

* dev: issue store and list layout store updates

* dev: quick add and update, delete issue in the list

* refactor list root changes

* dev: store new structure

* refactor spreadsheet and gantt project roots

* fix errors for base gantt and spreadsheet roots

* connect Calendar project view

* minor house keeping

* connect Kanban View to the new store

* generalise base calendar issue actions

* dev: store project issues and issue filters

* dev: store project issues and filters

* dev: updated undefined with displayFilters in project issue store

* Add Quick add to all the layouts

* connect module views to store

* dev: Rendering list issues in project issues

* dev: removed console log

* dev: module filters store

* fix errors and connect modules list and quick add for list

* dev: module issue store

* dev: module filter store issue fixed and updated cycle issue filters

* minor house keeping changes

* dev: cycle issues and cycle filters

* connect cycles to the store

* dev: project view issues and issue filters

* connect project views

* dev: updated applied filters in layouts

* dev: replaced project id with view id in project views

* dev: in cycle and module store made cycleId and moduleId optional

* fix minor issues and build errors

* dev: project draft and archived issues store and filters

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-23 14:47:04 +05:30
Aaryan Khandelwal
a7d6b528bd
chore: deactivate user option added (#2841)
* dev: deactivate user option added

* chore: new layout for profile settings

* fix: build errors

* fix: user profile activity
2023-11-23 14:44:06 +05:30
Lakhan Baheti
9ba724b78d
fix: onboarding bugs & improvements (#2839)
* fix: terms & condition alignment

* fix: onboarding page scrolling

* fix: create workspace name clear

* fix: setup profile sidebar workspace name

* fix: invite team screen button text

* fix: inner div min height

* fix: allow single invite also in invite member

* fix: UI clipping in invite members

* fix: signin screen scroll

* fix: sidebar notification icon

* fix: sidebar project name & icon

* fix: user detail bottom image alignment

* fix: step indicator in invite member

* fix: try different account modal state

* fix: setup profile remove image

* fix: workspace slug clear

* fix: invite member UI & focus

* fix: step indicator size

* fix: inner div placement

* fix: invite member validation logic

* fix: current user data persistence

* fix: sidebar animation colors

* feat: signup & resend

* fix: sign out theme persist from popover

* fix: imports

* chore: signin responsiveness

* fix: sign-in, sign-up top padding
2023-11-23 13:45:00 +05:30
Bavisetti Narayan
c2da9783a3
chore: change password endpoint (#2842) 2023-11-23 13:44:50 +05:30
Anmol Singh Bhatia
784be47e91
[FED-888] fix: parent issue select modal improvement (#2837)
This PR includes improvements to the parent issue select modal.
2023-11-22 16:16:52 +05:30
Anmol Singh Bhatia
0fdd9c28bf
fix: project setting ui consistency (#2835) 2023-11-22 15:36:34 +05:30
Anmol Singh Bhatia
644b06749b
fix: profile setting overflow (#2834) 2023-11-22 15:35:51 +05:30
Anmol Singh Bhatia
dd8c7a7487
fix: cycle and module create/update modal fix (#2833) 2023-11-22 15:35:24 +05:30
Anmol Singh Bhatia
e6a1f34713
fix: module and cycle sidebar loading state (#2831) 2023-11-22 15:34:39 +05:30
sabith-tu
1dff6b63f8
style: new empty project screen (#2832) 2023-11-22 15:34:06 +05:30
Anmol Singh Bhatia
59dbbb29cd
fix: custom analytics project dropdown fix (#2828) 2023-11-22 14:55:18 +05:30
Anmol Singh Bhatia
6cb3939835
style: project card improvement (#2827) 2023-11-22 14:54:52 +05:30
Anmol Singh Bhatia
021c0675b7
fix: module sidebar link section (#2830) 2023-11-22 14:36:29 +05:30
Anmol Singh Bhatia
67000892e5
chore: dashboard redirection fix (#2826) 2023-11-22 13:47:59 +05:30
sriram veeraghanta
3df4794e77
fix: AI Assistance hide/unhide depending on the configuration (#2825)
* fix: gpt error handlijng

* fix: enabling ai assistance only when it is configured.
2023-11-22 13:20:59 +05:30
Prateek Shourya
42ccd1de58
Style: UI improvements (#2824)
* style: update notification Read status toast alert description.

* style: update issue subscribe button design.

* fix: remove group_by `none` display filter from the kanban view in profile and draft issues.

* style: design improvement in members settings.
* style: add display name for all user role.
* style: remove email for user roles other than admin.
* style: fix border color as per designs.
2023-11-22 12:58:55 +05:30
Aaryan Khandelwal
c8c89007c0
style: revamped page details UI (#2823)
* style: revamp page details UI

* chore: updated the info popover date format

* fix: page actions mutation

* style: made the page content responsive
2023-11-22 12:32:49 +05:30
Bavisetti Narayan
4cf3e69e22
chore: file asset update (#2816)
* chore: endpoint to update file asset

* chore: aws storage endpoint change
2023-11-21 17:52:19 +05:30
Lakhan Baheti
fb1f65c2c1
fix: sidebar project section hover (#2818)
* fix: sidebar project section hover

* fix: icons alignment
2023-11-21 17:37:17 +05:30
Lakhan Baheti
d91b4e6fa1
fix: bug fixes & UI improvements (#2819)
* fix: profile setting fields border

* fix: webhooks empty state UI

* fix: cycle delete redirection from cycle detail

* fix: integration access restriction
2023-11-21 17:35:29 +05:30
Aaryan Khandelwal
561223ea71
chore: update join project endpoint (#2821) 2023-11-21 17:35:15 +05:30
Aaryan Khandelwal
982eba0bd1
fix: complete pages editor not clickable, recent pages calculation logic (#2820)
* fix: whole editor not clickable

* fix: recent pages calculation

* chore: update older pages calculation logic in recent pages list

* fix: archived pages computed function

* chore: add type for older pages
2023-11-21 15:47:34 +05:30
Aaryan Khandelwal
7aaf840fb1
refactor: command k modal (#2803)
* refactor: command palette file structure

* fix: identifier search
2023-11-21 15:46:41 +05:30
Nikhil
15927c9cae
dev: change url for the license engine instance registration (#2810) 2023-11-20 21:32:45 +05:30
Bavisetti Narayan
d46d70fcd5
chore: removed DOCKERIZED value and changed REDIS_SSL (#2813)
* chore: removed DOCKERIZED value

* chore: changed redis ssl
2023-11-20 21:32:00 +05:30
Henit Chobisa
de581102e3
feat: New Pages with Enhanced Document Editor Packages made over Editor Core 📝 (#2784)
* fix: page transaction model

* fix: page transaction model

* feat: updated ui for page route

* chore: initialized `document-editor` package for plane

* fix: format persistence while pasting markdown in editor

* feat: Initialized Document-Editor and Editor with Ref

* feat: added tooltip component and slash command for editor

* feat: added `document-editor` extensions

* feat: added custom search component for embedding labels

* feat: added top bar menu component

* feat: created document-editor exposed components

* feat: integrated `document-editor` in `pages` route

* chore: updated dependencies

* feat: merge conflict resolution

* chore: modified configuration for document editor

* feat: added content browser menu for document editor summary

* feat: added fixed menu and editor instances

* feat: added document editor instances and summary table

* feat: implemented document-editor in PageDetail

* chore: css and export fixes

* fix: migration and optimisation

* fix: added `on_create` hook in the core editor

* feat: added conditional menu bar action in document-editor

* feat: added menu actions from single page view

* feat: added services for archiving, unarchiving and retrieving archived pages

* feat: added services for page archives

* feat: implemented page archives in page list view

* feat: implemented page archives in document-editor

* feat: added editor marking hook

* chore: separated editor header from the main content

* chore: separated editor summary utilities from the main editor

* chore: refactored necessary components from the document editor

* chore: removed summary sidebar component from the main content editor

* chore: removed scrollSummaryDependency from Header and Sidebar

* feat: separated page renderer as a separate component

* chore: separated page_renderer and sidebar as component from index

* feat: added locked property to IPage type

* feat: added lock/unlock services in page service

* chore: separated DocumentDetails as exported interface from index

* feat: separated document editor configs as separate interfaces

* chore: separated menu options from the editor header component

* fix: fixed page_lock performing lock/unlock operation on queryset instead of single instance

* fix: css positioning changes

* feat: added archive/lock alert labels

* feat: added boolean props in menu-actions/options

* feat: added lock/unlock & archive/unarchive services

* feat: added on update mutations for archived pages in page-view

* feat: added archive/lock on_update mutations in single page view

* feat: exported readonly editor for locked pages

* chore: separated kanban menu props and avoided passing redundant data

* fix: readonly editor not generating markings on first render

* fix: chevron overflowing from editor-header

* chore: removed unused utility actions

* fix: enabled sidebar view by default

* feat: removed locking on pages in archived state

* feat: added indentation in heading component

* fix: button classnames in vertical dropdowns

* feat: added `last_archived_at` and `last_edited_at` details in editor-header

* feat: changed types for archived updates and document last updates

* feat: updated editor and header props

* feat: updated queryset according to new page query format

* feat: added parameters in page view for shared / private pages

* feat: updated other-page-view to shared page view && same with private pages

* feat: added page-view as shared / private

* fix: replaced deleting with archiving for pages

* feat: handle restoring of page from archived section from list view

* feat: made privilege-based option render for pages

* feat: removed layout view for page list view

* feat: linting changes

* fix: adding mobx changes to pages

* fix: removed unnecessary migrations

* fix: mobx store changes

* fix: adding date-fns package

* fix: updating yarn lock

* fix: removing unnecessary method params

* chore: added access specifier to the create/update page modal

* fix: tab view layout changes

* chore: delete endpoint for page

* fix: page actions, including- archive, favorite, access control, delete

* chore: remove archive page modal

* fix: build errors

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: sriramveeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-20 21:31:12 +05:30
Prateek Shourya
b903126e5a
feat: Instance Admin Panel: Configuration Settings (#2800)
* feat: Instance Admin Panel: Configuration Settings

* refactor: separate Google and GitHub forms into independent components.

* feat: add admin auth wrapper and access denied page.

* style: design updates.
2023-11-20 20:46:49 +05:30
sabith-tu
f44f70168f
style: changing profile screen title (#2814) 2023-11-20 20:46:15 +05:30
sriram veeraghanta
3c10f00b04
fix: minor fix (#2815) 2023-11-20 20:24:35 +05:30
Lakhan Baheti
f1de05e4de
chore: onboarding (#2790)
* style: onboarding light version

* style: dark mode

* fix: onboarding gradient

* refactor: imports

* chore: add use case field in users api

* feat: delete account

* fix: delete modal points alignment

* feat: usecase in profile

* fix: build error

* fix: typos & hardcoded strings

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2023-11-20 19:31:19 +05:30
Prashant Indurkar
61d4e2e016
Fixed: while creating new labels the field should be auto-focused #2437 (#2438)
* bug:fix recent page hiding last item on scroll #1468

* bug:fix recent page hiding last item on scroll #1468 (#2411)

* fixed add label autofocus

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-20 19:00:55 +05:30
Lakhan Baheti
c1eb5055e5
fix: bug fixes & ui improvements. (#2772)
* fix: create project modal member select

* fix: overflow in workspace activity

* fix: member selected state
2023-11-20 16:36:50 +05:30
Bavisetti Narayan
8d942e28da
chore: ams url name changed (#2808) 2023-11-20 16:34:57 +05:30
Nikhil
f7461af3f5
dev: open ai configuration (#2807) 2023-11-20 16:03:31 +05:30
Aaryan Khandelwal
29f3e02adc
refactor: project estimates store (#2801)
* refactor: remove estimates from project store

* chore: update all the instances of the old store

* chore: update store declaration structure
2023-11-20 15:58:40 +05:30
Nikhil
9a704458b3
dev: external apis (#2806)
* dev: new proxy api setup

* dev: updated endpoints with serializers and structure

* dev: external apis for cycles, modules and inbox issue

* dev: order by for all the apis

* dev: enable webhooks for external apis

* dev: fields and expand for the apis

* dev: move authentication to proxy middleware

* dev: fix imports

* dev: api serializer updates and paginator

* dev: renamed api to app

* dev: renamed proxy to api

* dev: validation for project, issues, modules and cycles

* dev: remove favourites from project apis

* dev: states api

* dev: rewrite the url endpoints

* dev: exception handling for the apis

* dev: merge updated structure

* dev: remove attachment apis

* dev: issue activities endpoints
2023-11-20 15:58:17 +05:30
Aaryan Khandelwal
668dfd2e38
chore: update exception detected screen action button (#2805) 2023-11-20 15:00:36 +05:30
Bavisetti Narayan
3b3f94ed03
fix: file asset delete (#2804) 2023-11-20 14:53:06 +05:30
onFire(Abhi)
e945aa9b71
fix: newly added cycle doesn't appear unless the page is manually reloaded (#2673)
* fix: newly added cycle doesn't appear unless the page is manually reloaded

* Delete \

* Delete web/layouts/profile-layout/profile-sidebar.tsx

* Update cycles.store.ts

* fix: remove duplicate type declaration

---------

Co-authored-by: Aaryan Khandelwal <65252264+aaryan610@users.noreply.github.com>
2023-11-20 14:36:46 +05:30
sriram veeraghanta
6595a387d0
feat: event tracking using posthog and created application provider to render multiple wrappers (#2757)
* fix: event tracker changes

* fix: App provider implementation using wrappers

* fix: updating packages

* fix: handling warning

* fix: wrapper fixes and minor optimization changes

* fix: chore app-provider cleanup

* fix: cleanup

* fix: removing jitsu tracking

* fix: minor updates

* fix: adding event to posthog event tracker (#2802)

* dev: posthog event tracker update initiate

* fix: adding events for posthog integration

* fix: event payload

---------

Co-authored-by: Ramesh Kumar Chandra <31303617+rameshkumarchandra@users.noreply.github.com>
2023-11-20 13:29:54 +05:30
Dakshesh Jain
8839e42dc0
fix: archive issue bugs (#2712)
* fix: blur on side/modal peek view

* fix: delete archive not working on list layout with group by is none

* fix: show empty group has no effect

* fix: filter/display options same as production

* fix: disabling full-screen peek-overview for archive issues

* fix: truncate in calendar view
2023-11-20 12:48:30 +05:30
Nikhil
9db6312081
fix: self hosted instance (#2795)
* dev: update create bucket script

* dev: update patch endpoint for instance configuration

* dev: add google client secret and default values for ADMIN_EMAIL and LICENSE_ENGINE_BASE_URL
2023-11-20 12:36:48 +05:30
Dakshesh Jain
779ef2a4aa
fix: delete issues in spreadsheet doesn't work (#2718)
Co-authored-by: Aaryan Khandelwal <65252264+aaryan610@users.noreply.github.com>
2023-11-20 12:22:43 +05:30
Bavisetti Narayan
51e17643a2
fix: file structuring (#2797)
* fix: file structure changes

* fix: pages update

* fix: license imports changed
2023-11-20 11:59:20 +05:30
Nikhil
4c2074b6ff
dev: environment settings (#2794)
* dev: update environment configuration

* dev: update the takeoff script for instance registration
2023-11-19 01:48:05 +05:30
Lakhan Baheti
c9ffc9465f
fix: Labels delete & reordering (#2729)
* fix: Labels reordering inconsistency

* fix: Delete child labels

* feat: multi-select while grouping labels

* refactor: label sorting in mobx computed function

* feat: drag & drop label grouping, un-grouping

* chore: removed label select modal

* fix: moving labels from project store to project label store

* fix: typo changes and build tree function added

* labels feature

* disable dropping group into a group

* fix build errors

* fix more issues

* chore: added combining state UI, fixed scroll issue for label groups

* chore: group icon for label groups

* fix: group cannot be dropped in another group

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-19 01:46:11 +05:30
Bavisetti Narayan
2b6c489513
feat: v3 endpoint for module and cycle (#2786)
* feat: v3 endpoint for module and cycle

* fix: removed the str
2023-11-18 16:30:35 +05:30
M. Palanikannan
0c63f21718
fix: Task List Behaviour in Editor (#2789)
* better variable names and comments

* drag drop migrated

* custom horizontal rule created

* init transaction hijack

* fixed code block with better contrast, keyboard triple enter press disabled and syntax highlighting

* fixed link selector closing on open behaviour

* added better keymaps and syntax highlights

* made drag and drop working for code blocks

* fixed drag drop for code blocks

* moved drag drop only to rich text editor

* fixed drag and drop only for description

* enabled drag handles for peek overview and main issues

* got images to old state

* fixed task lists to be smaller

* removed validate image functions and unnecessary imports

* table icons svg attributes fixed

* custom list keymap extension added

* more unnecessary imports of validate image removed

* removed console logs

* fixed drag-handle styles

* space styles updated for the editor

* removed showing quotes from blockquotes

* removed validateImage for now

* added better comments and improved redundant renders

* removed unnecessary console logs

* created util for creating the drag handle element

* fixed file names
2023-11-18 16:20:35 +05:30
Nikhil
a987df38f4
chore: user onboarding workflow (#2791) 2023-11-18 16:18:06 +05:30
sriram veeraghanta
878707f444
feat: Instance Registration and Configuration (#2793)
* dev: remove default user

* dev: initiate licensing

* dev: remove migration file 0046

* feat: self hosted licensing initialize

* dev: instance licenses

* dev: change license response structure

* dev: add default properties and issue mention migration

* dev: reset migrations

* dev: instance configuration

* dev: instance configuration migration

* dev: update instance configuration model to take null and empty values

* dev: instance configuration variables

* dev: set default values

* dev: update instance configuration load

* dev: email configuration settings moved to database

* dev: instance configuration on instance bootup

* dev: auto instance registration script

* dev: instance admin

* dev: enable instance configuration and instance admin roles

* dev: instance owner fix

* dev: instance configuration values

* dev: fix instance permissions and serializer

* dev: fix email senders

* dev: remove deprecated variables

* dev: fix current site domain registration

* dev: update cors setup and local settings

* dev: migrate instance registration and configuration to manage commands

* dev: check email validity

* dev: update script to use manage command

* dev: default bucket creation script

* dev: instance admin routes and initial set of screens

* dev: admin api to check if the current user is admin

* dev: instance admin unique constraints

* dev: check magic link login

* dev: fix email sending for ssl

* dev: create instance activation route if the instance is not activated during startup

* dev: removed DJANGO_SETTINGS_MODULE from environment files and deleted auto bucket create script

* dev: environment configuration for backend

* dev: fix access token variable error

* feat: Instance Admin Panel: General Settings (#2792)

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
2023-11-18 16:17:01 +05:30
M. Palanikannan
9369ee5008
[feat]: Drag and Drop Handles for all Data Structures (#2745)
* better variable names and comments

* drag drop migrated

* custom horizontal rule created

* init transaction hijack

* fixed code block with better contrast, keyboard triple enter press disabled and syntax highlighting

* fixed link selector closing on open behaviour

* added better keymaps and syntax highlights

* made drag and drop working for code blocks

* fixed drag drop for code blocks

* moved drag drop only to rich text editor

* fixed drag and drop only for description

* enabled drag handles for peek overview and main issues

* got images to old state
2023-11-17 12:29:30 +05:30
Manish Gupta
0a88db975a
dev: Self Hosting with private repo fixes (#2787)
* fixes to self hosting

* self hosting fixes

* removed .temp

* wip

* wip

* self install private repo

* folder change

* fix

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-17 11:51:54 +05:30
Manish Gupta
dd60dec887
Dev/mg selfhosting fix (#2782)
* fixes to self hosting

* self hosting fixes

* removed .temp

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-16 14:38:55 +05:30
Bavisetti Narayan
0c1097592e
fix: pages revamping (#2760)
* fix: page transaction model

* fix: page transaction model

* fix: migration and optimisation

* fix: back migration of page blocks

* fix: added issue embed

* fix: migration fixes

* fix: resolved changes
2023-11-16 14:38:12 +05:30
Anmol Singh Bhatia
bed66235f2
style: workspace sidebar dropdown improvement (#2783) 2023-11-16 14:11:33 +05:30
Nikhil
26b1e9d5f1
dev: squashed migrations (#2779)
* dev: migration squash

* dev: migrations squashed for apis and webhooks

* dev: packages updated and  move dj-database-url for local settings

* dev: update package changes
2023-11-15 17:15:02 +05:30
Bavisetti Narayan
79347ec62b
feat: api webhooks (#2543)
* dev: initiate external apis

* dev: external api

* dev: external public api implementation

* dev: add prefix to all api tokens

* dev: flag to enable disable api token api access

* dev: webhook model create and apis

* dev: webhook settings

* fix: webhook logs

* chore: removed drf spectacular

* dev: remove retry_count and fix api logging for get requests

* dev: refactor webhook logic

* fix: celery retry mechanism

* chore: event and action change

* chore: migrations changes

* dev: proxy setup for apis

* chore: changed retry time and cleanup

* chore: added issue comment and inbox issue api endpoints

* fix: migration files

* fix: added env variables

* fix: removed issue attachment from proxy

* fix: added new migration file

* fix: restricted wehbook access

* chore: changed urls

* chore: fixed project serializer

* fix: set expire for api token

* fix: retrieve endpoint for api token

* feat: Api Token screens & api integration

* dev: webhook endpoint changes

* dev: add fields for webhook updates

* feat: Download Api secret key

* chore: removed BASE API URL

* feat: revoke token access

* dev: migration fixes

* feat: workspace webhooks (#2748)

* feat: workspace webhook store, services integeration and rendered webhook list and create

* chore: handled webhook update and rengenerate token in workspace webhooks

* feat: regenerate key and delete functionality

---------

Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>

* fix: url validation added

* fix: separated env for webhook and api

* Web hooks refactoring

* add show option for generated hook key

* Api token restructure

* webhook minor fixes

* fix build errors

* chore: improvements in file structuring

* dev: rate limiting the open apis

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: LAKHAN BAHETI <lakhanbaheti9@gmail.com>
Co-authored-by: rahulramesha <71900764+rahulramesha@users.noreply.github.com>
Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>
Co-authored-by: Nikhil <118773738+pablohashescobar@users.noreply.github.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-15 15:56:57 +05:30
Nikhil
7b965179d8
dev: update bucket script to make the bucket public (#2767)
* dev: update bucket script to make the bucket public

* dev: remove auto bucket script from docker compose
2023-11-15 15:56:08 +05:30
Nikhil
fc51ffc589
chore: user workflow (#2762)
* dev: workspace member deactivation and leave endpoints and filters

* dev: deactivated for project members

* dev: project members leave

* dev: project member check on workspace deactivation

* dev: project member queryset update and remove leave project endpoint

* dev: rename is_deactivated to is_active and user deactivation apis

* dev: check if the user is already part of workspace then make them active

* dev: workspace and project save

* dev: update project members to make them active

* dev: project invitation

* dev: automatic user workspace and project member create when user sign in/up

* dev: fix member invites

* dev: rename deactivation variable

* dev: update project member invitation

* dev: additional permission layer for workspace

* dev: update the url for  workspace invitations

* dev: remove invitation urls from users

* dev: cleanup workspace invitation workflow

* dev: workspace and project invitation
2023-11-15 15:53:16 +05:30
sabith-tu
96f6e37cc5
fix: Delete estimate popup is not closing automatically (#2777) 2023-11-15 14:08:52 +05:30
Nikhil
29774ce84a
dev: API settings (#2594)
* dev: update settings file structure and added extra settings for CORS

* dev: remove WEB_URL variable and add celery integration for sentry

* dev: aws and minio settings

* dev: add cors origins to env

* dev: update settings
2023-11-15 12:31:52 +05:30
Nikhil
8cbe9c26fc
enhancement: label sort order (#2763)
* chore: label sort ordering

* dev: ordering

* fix: sort order

* fix: save of labels

* dev: remove ordering by name

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-15 12:25:44 +05:30
Prateek Shourya
7f42566207
Fix: Custom menu item not automatically closing, affecting delete popup behavior. (#2771) 2023-11-14 23:05:30 +05:30
Ankush Deshmukh
b60237b676
Standardizing priority icons across the platform (#2776) 2023-11-14 20:52:43 +05:30
Prateek Shourya
1fe09d369f
style: text overflow fix and border color update (#2769)
* style: fix text overflow in:
* Issue activity
* Cycle and Module Select in Create Issue form
* Delete Module modal
* Join Project modal

* style: update assignee select border as per design.
2023-11-14 18:34:51 +05:30
Dakshesh Jain
b7757c6b1a
fix: bugs (#2761)
* fix: semicolon on estimate settings page

* refactor: project settings automations store implementation

* fix: active cycle stuck on infinite loading

* fix: removed delete project option from sidebar

* fix: disclosure not opening when navigating to project

* fix: clear filter not working & filter appearing even if nothing is selected

* refactor: select label store implementation

* refactor: select state store implementation
2023-11-14 18:33:01 +05:30
Anmol Singh Bhatia
1a25bacce1
style: create update view modal consistency (#2775) 2023-11-14 18:30:10 +05:30
Anmol Singh Bhatia
6797df239d
chore: no lead option added in lead select dropdown (#2774) 2023-11-14 18:29:39 +05:30
Anmol Singh Bhatia
43e7c10eb7
chore: spreadsheet layout column responsiveness (#2768) 2023-11-14 18:28:49 +05:30
Anmol Singh Bhatia
bdc9c9c2a8
chore: create update issue modal improvement (#2765) 2023-11-14 18:28:15 +05:30
Anmol Singh Bhatia
f0c72bf249
fix: breadcrumb project icon improvement (#2764) 2023-11-14 18:27:47 +05:30
sabith-tu
a8904bfc48
style: ui fixes for pages and views (#2770) 2023-11-14 18:26:50 +05:30
Nikhil
b31041726b
dev: create bucket through application (#2720) 2023-11-13 15:57:19 +05:30
Prateek Shourya
e6f947ad90
style: ui improvements and bug fixes (#2758)
* style: add transition to favorite projects dropdown.

* style: update project integration settings borders.

* style: fix text overflow issue in project views.

* fix: issue with non-functional cancel button in leave project modal.
2023-11-13 14:42:45 +05:30
Dakshesh Jain
7963993171
fix: workspace settings bugs (#2743)
* fix: double layout in exports

* fix: typo in jira email address section

* fix: workspace members not mutating

* fix: removed un-used variable

* fix: workspace members can't be filtered using email

* fix: autocomplete in workspace delete

* fix: autocomplete in project delete modal

* fix: update member function in store

* fix: sidebar link not active when in github/jira

* style: margin top & icon inconsistency

* fix: typo in create workspace

* fix: workspace leave flow

* fix: redirection to delete issue

* fix: autocomplete off in jira api token

* refactor: reduced api call, added optional chaining & removed variable with low scope
2023-11-13 13:34:05 +05:30
sriram veeraghanta
0ceb9974f6
dev: hub compose file update (#2376) (#2444) (#2445)
* docker-compose-hub modified for envs

* bug:fix recent page hiding last item on scroll #1468 (#2411)

* wip

* fixed the AMD build on ARM

---------

Co-authored-by: Manish Gupta <59428681+manishg3@users.noreply.github.com>
Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2023-10-16 13:00:58 +05:30
Prashant Indurkar
e0fcc0c876
bug:fix recent page hiding last item on scroll #1468 (#2411) 2023-10-12 16:35:00 +05:30
1051 changed files with 55072 additions and 26448 deletions

View File

@@ -1,5 +1,11 @@
 version = 1
+exclude_patterns = [
+    "bin/**",
+    "**/node_modules/",
+    "**/*.min.js"
+]
 [[analyzers]]
 name = "shell"

View File

@@ -21,15 +21,15 @@ AWS_S3_BUCKET_NAME="uploads"
 FILE_SIZE_LIMIT=5242880
 # GPT settings
-OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
-OPENAI_API_KEY="sk-" # add your openai key here
-GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
+OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
+OPENAI_API_KEY="sk-" # deprecated
+GPT_ENGINE="gpt-3.5-turbo" # deprecated
 # Settings related to Docker
-DOCKERIZED=1
+DOCKERIZED=1 # deprecated
 # set to 1 If using the pre-configured minio setup
 USE_MINIO=1
 # Nginx Configuration
 NGINX_PORT=80

View File

@ -1,4 +1,3 @@
name: Branch Build name: Branch Build
on: on:
@ -8,6 +7,7 @@ on:
branches: branches:
- master - master
- release - release
- preview
- qa - qa
- develop - develop
@ -24,39 +24,6 @@ jobs:
- name: Check out the repo - name: Check out the repo
uses: actions/checkout@v3.3.0 uses: actions/checkout@v3.3.0
# - name: Set Target Branch Name on PR close
# if: ${{ github.event_name == 'pull_request' && github.event.action =='closed' }}
# run: echo "TARGET_BRANCH=${{ github.event.pull_request.base.ref }}" >> $GITHUB_ENV
# - name: Set Target Branch Name on other than PR close
# if: ${{ github.event_name == 'push' }}
# run: echo "TARGET_BRANCH=${{ github.ref_name }}" >> $GITHUB_ENV
- uses: ASzc/change-string-case-action@v2
id: gh_branch_upper_lower
with:
string: ${{env.TARGET_BRANCH}}
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_replace_slash
with:
source: ${{ steps.gh_branch_upper_lower.outputs.lowercase }}
find: '/'
replace: '-'
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_replace_dot
with:
source: ${{ steps.gh_branch_replace_slash.outputs.value }}
find: '.'
replace: ''
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_clean
with:
source: ${{ steps.gh_branch_replace_dot.outputs.value }}
find: '_'
replace: ''
- name: Uploading Proxy Source - name: Uploading Proxy Source
uses: actions/upload-artifact@v3 uses: actions/upload-artifact@v3
with: with:
@ -77,7 +44,6 @@ jobs:
!./nginx !./nginx
!./deploy !./deploy
!./space !./space
- name: Uploading Space Source - name: Uploading Space Source
uses: actions/upload-artifact@v3 uses: actions/upload-artifact@v3
with: with:
@@ -89,12 +55,24 @@
           !./deploy
           !./web
     outputs:
-      gh_branch_name: ${{ steps.gh_branch_clean.outputs.value }}
+      gh_branch_name: ${{ env.TARGET_BRANCH }}

   branch_build_push_frontend:
     runs-on: ubuntu-20.04
-    needs: [ branch_build_and_push ]
+    needs: [branch_build_and_push]
+    env:
+      FRONTEND_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
     steps:
+      - name: Set Frontend Docker Tag
+        run: |
+          if [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "master" ]; then
+            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:latest"
+          elif [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "release" ] || [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "preview" ]; then
+            TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:preview,${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend:preview"
+          else
+            TAG=${{ env.FRONTEND_TAG }}"
+          fi
+          echo "FRONTEND_TAG=${TAG}" >> $GITHUB_ENV
       - name: Set up Docker Buildx
         uses: docker/setup-buildx-action@v2.5.0
@ -114,7 +92,7 @@ jobs:
context: . context: .
file: ./web/Dockerfile.web file: ./web/Dockerfile.web
platforms: linux/amd64 platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }} tags: ${{ env.FRONTEND_TAG }}
push: true push: true
env: env:
DOCKER_BUILDKIT: 1 DOCKER_BUILDKIT: 1
@ -123,8 +101,20 @@ jobs:
branch_build_push_space: branch_build_push_space:
runs-on: ubuntu-20.04 runs-on: ubuntu-20.04
needs: [ branch_build_and_push ] needs: [branch_build_and_push]
env:
SPACE_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
steps: steps:
- name: Set Space Docker Tag
run: |
if [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-space:latest"
elif [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "release" ] || [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "preview" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-space-private:preview,${{ secrets.DOCKERHUB_USERNAME }}/plane-space:preview"
else
TAG=${{ env.SPACE_TAG }}"
fi
echo "SPACE_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0 uses: docker/setup-buildx-action@v2.5.0
@ -144,7 +134,7 @@ jobs:
context: . context: .
file: ./space/Dockerfile.space file: ./space/Dockerfile.space
platforms: linux/amd64 platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }} tags: ${{ env.SPACE_TAG }}
push: true push: true
env: env:
DOCKER_BUILDKIT: 1 DOCKER_BUILDKIT: 1
@ -153,8 +143,20 @@ jobs:
branch_build_push_backend: branch_build_push_backend:
runs-on: ubuntu-20.04 runs-on: ubuntu-20.04
needs: [ branch_build_and_push ] needs: [branch_build_and_push]
env:
BACKEND_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
steps: steps:
- name: Set Backend Docker Tag
run: |
if [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:latest"
elif [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "release" ] || [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "preview" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:preview",${{ secrets.DOCKERHUB_USERNAME }}/plane-backend:preview"
else
TAG=${{ env.BACKEND_TAG }}
fi
echo "BACKEND_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0 uses: docker/setup-buildx-action@v2.5.0
@ -175,7 +177,7 @@ jobs:
file: ./Dockerfile.api file: ./Dockerfile.api
platforms: linux/amd64 platforms: linux/amd64
push: true push: true
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }} tags: ${{ env.BACKEND_TAG }}
env: env:
DOCKER_BUILDKIT: 1 DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }} DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
@ -183,8 +185,20 @@ jobs:
branch_build_push_proxy: branch_build_push_proxy:
runs-on: ubuntu-20.04 runs-on: ubuntu-20.04
needs: [ branch_build_and_push ] needs: [branch_build_and_push]
env:
PROXY_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
steps: steps:
- name: Set Proxy Docker Tag
run: |
if [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:latest"
elif [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "release" ] || [ "${{ needs.branch_build_and_push.outputs.gh_branch_name }}" = "preview" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:preview,${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy:preview"
else
TAG=${{ env.PROXY_TAG }}
fi
echo "PROXY_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0 uses: docker/setup-buildx-action@v2.5.0
@ -205,7 +219,7 @@ jobs:
context: . context: .
file: ./Dockerfile file: ./Dockerfile
platforms: linux/amd64 platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }} tags: ${{ env.PROXY_TAG }}
push: true push: true
env: env:
DOCKER_BUILDKIT: 1 DOCKER_BUILDKIT: 1
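
The frontend, space, backend and proxy jobs in this workflow each gained a "Set ... Docker Tag" step with broadly the same shape: master publishes a :latest image, release and preview publish :preview to both the private and public repositories, and any other branch keeps the branch-named private tag. A minimal Python sketch of that selection logic for the frontend image only; the docker_user value stands in for the DOCKERHUB_USERNAME secret and is purely illustrative:

```python
def frontend_tags(branch: str, docker_user: str = "example-user") -> list[str]:
    """Mirror the tag selection in the 'Set Frontend Docker Tag' step (illustrative sketch)."""
    if branch == "master":
        # master builds publish the private :latest image
        return [f"{docker_user}/plane-frontend-private:latest"]
    if branch in ("release", "preview"):
        # release/preview builds publish :preview to both the private and public repos
        return [
            f"{docker_user}/plane-frontend-private:preview",
            f"{docker_user}/plane-frontend:preview",
        ]
    # every other branch keeps the branch-named private tag
    return [f"{docker_user}/plane-frontend-private:{branch}"]


print(",".join(frontend_tags("develop")))  # example-user/plane-frontend-private:develop
```
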

1
.gitignore vendored
View File

@@ -79,3 +79,4 @@ pnpm-workspace.yaml
 tmp/
 ## packages
 dist
+.temp/

View File

@@ -43,8 +43,6 @@ FROM python:3.11.1-alpine3.17 AS backend
 ENV PYTHONDONTWRITEBYTECODE 1
 ENV PYTHONUNBUFFERED 1
 ENV PIP_DISABLE_PIP_VERSION_CHECK=1
-ENV DJANGO_SETTINGS_MODULE plane.settings.production
-ENV DOCKERIZED 1
 WORKDIR /code

View File

@@ -31,12 +31,10 @@ AWS_S3_BUCKET_NAME="uploads"
 FILE_SIZE_LIMIT=5242880
 # GPT settings
-OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
-OPENAI_API_KEY="sk-" # add your openai key here
-GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
+OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
+OPENAI_API_KEY="sk-" # deprecated
+GPT_ENGINE="gpt-3.5-turbo" # deprecated
-# Settings related to Docker
-DOCKERIZED=1
 # set to 1 If using the pre-configured minio setup
 USE_MINIO=1
@@ -78,7 +76,6 @@ NEXT_PUBLIC_ENABLE_OAUTH=0
 # Backend
 # Debug value for api server use it as 0 for production use
 DEBUG=0
-DJANGO_SETTINGS_MODULE="plane.settings.selfhosted"
 # Error logs
 SENTRY_DSN=""
@@ -115,24 +112,22 @@ AWS_S3_BUCKET_NAME="uploads"
 FILE_SIZE_LIMIT=5242880
 # GPT settings
-OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
-OPENAI_API_KEY="sk-" # add your openai key here
-GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
+OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
+OPENAI_API_KEY="sk-" # deprecated
+GPT_ENGINE="gpt-3.5-turbo" # deprecated
+# Settings related to Docker
+DOCKERIZED=1 # Deprecated
 # Github
 GITHUB_CLIENT_SECRET="" # For fetching release notes
-# Settings related to Docker
-DOCKERIZED=1
 # set to 1 If using the pre-configured minio setup
 USE_MINIO=1
 # Nginx Configuration
 NGINX_PORT=80
-# Default Creds
-DEFAULT_EMAIL="captain@plane.so"
-DEFAULT_PASSWORD="password123"
 # SignUps
 ENABLE_SIGNUP="1"

View File

@@ -57,10 +57,6 @@ Setting up local environment is extremely easy and straight forward. Follow the
 1. Review the `.env` files available in various folders. Visit [Environment Setup](./ENV_SETUP.md) to know about various environment variables used in system
 1. Run the docker command to initiate various services `docker compose -f docker-compose-local.yml up -d`
-   ```bash
-   ./setup.sh
-   ```
 You are ready to make changes to the code. Do not forget to refresh the browser (in case id does not auto-reload)
 Thats it!

View File

@@ -1,10 +1,11 @@
 # Backend
 # Debug value for api server use it as 0 for production use
 DEBUG=0
-DJANGO_SETTINGS_MODULE="plane.settings.production"
+CORS_ALLOWED_ORIGINS=""
 # Error logs
 SENTRY_DSN=""
+SENTRY_ENVIRONMENT="development"
 # Database Settings
 PGUSER="plane"
@@ -18,15 +19,6 @@ REDIS_HOST="plane-redis"
 REDIS_PORT="6379"
 REDIS_URL="redis://${REDIS_HOST}:6379/"
-# Email Settings
-EMAIL_HOST=""
-EMAIL_HOST_USER=""
-EMAIL_HOST_PASSWORD=""
-EMAIL_PORT=587
-EMAIL_FROM="Team Plane <team@mailer.plane.so>"
-EMAIL_USE_TLS="1"
-EMAIL_USE_SSL="0"
 # AWS Settings
 AWS_REGION=""
 AWS_ACCESS_KEY_ID="access-key"
@@ -38,24 +30,22 @@ AWS_S3_BUCKET_NAME="uploads"
 FILE_SIZE_LIMIT=5242880
 # GPT settings
-OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
-OPENAI_API_KEY="sk-" # add your openai key here
-GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
+OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
+OPENAI_API_KEY="sk-" # deprecated
+GPT_ENGINE="gpt-3.5-turbo" # deprecated
 # Github
 GITHUB_CLIENT_SECRET="" # For fetching release notes
 # Settings related to Docker
-DOCKERIZED=1
+DOCKERIZED=1 # deprecated
 # set to 1 If using the pre-configured minio setup
 USE_MINIO=1
 # Nginx Configuration
 NGINX_PORT=80
-# Default Creds
-DEFAULT_EMAIL="captain@plane.so"
-DEFAULT_PASSWORD="password123"
 # SignUps
 ENABLE_SIGNUP="1"
@@ -70,6 +60,6 @@ ENABLE_MAGIC_LINK_LOGIN="0"
 # Email redirections and minio domain settings
 WEB_URL="http://localhost"
 # Gunicorn Workers
 GUNICORN_WORKERS=2

View File

@@ -43,7 +43,7 @@ USER captain
 COPY manage.py manage.py
 COPY plane plane/
 COPY templates templates/
+COPY package.json package.json
 COPY gunicorn.config.py ./
 USER root
 RUN apk --no-cache add "bash~=5.2"

View File

@@ -3,7 +3,28 @@ set -e
 python manage.py wait_for_db
 python manage.py migrate
-# Create a Default User
-python bin/user_script.py
+# Create the default bucket
+#!/bin/bash
+# Collect system information
+HOSTNAME=$(hostname)
+MAC_ADDRESS=$(ip link show | awk '/ether/ {print $2}' | head -n 1)
+CPU_INFO=$(cat /proc/cpuinfo)
+MEMORY_INFO=$(free -h)
+DISK_INFO=$(df -h)
+# Concatenate information and compute SHA-256 hash
+SIGNATURE=$(echo "$HOSTNAME$MAC_ADDRESS$CPU_INFO$MEMORY_INFO$DISK_INFO" | sha256sum | awk '{print $1}')
+# Export the variables
+export MACHINE_SIGNATURE=$SIGNATURE
+# Register instance
+python manage.py register_instance $MACHINE_SIGNATURE
+# Load the configuration variable
+python manage.py configure_instance
+# Create the default bucket
+python manage.py create_bucket
 exec gunicorn -w $GUNICORN_WORKERS -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:8000 --max-requests 1200 --max-requests-jitter 1000 --access-logfile -
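
The rewritten entrypoint above replaces the default-user bootstrap with instance registration: it hashes host details into a machine signature, then runs register_instance, configure_instance and create_bucket. A rough Python equivalent of just the signature step, for illustration only; it skips the MAC-address lookup and assumes a Linux host, since the script reads /proc/cpuinfo:

```python
import hashlib
import subprocess


def machine_signature() -> str:
    # Collect roughly the same system information the shell script concatenates
    hostname = subprocess.run(["hostname"], capture_output=True, text=True).stdout
    cpu_info = open("/proc/cpuinfo").read()
    memory_info = subprocess.run(["free", "-h"], capture_output=True, text=True).stdout
    disk_info = subprocess.run(["df", "-h"], capture_output=True, text=True).stdout
    # SHA-256 over the concatenation, like `sha256sum` in the script
    blob = hostname + cpu_info + memory_info + disk_info
    return hashlib.sha256(blob.encode()).hexdigest()


if __name__ == "__main__":
    print(machine_signature())
```
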

View File

@ -1,28 +0,0 @@
import os, sys
import uuid
sys.path.append("/code")
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "plane.settings.production")
import django
django.setup()
from plane.db.models import User
def populate():
default_email = os.environ.get("DEFAULT_EMAIL", "captain@plane.so")
default_password = os.environ.get("DEFAULT_PASSWORD", "password123")
if not User.objects.filter(email=default_email).exists():
user = User.objects.create(email=default_email, username=uuid.uuid4().hex)
user.set_password(default_password)
user.save()
print(f"User created with an email: {default_email}")
else:
print(f"User already exists with the default email: {default_email}")
if __name__ == "__main__":
populate()

4
apiserver/package.json Normal file
View File

@ -0,0 +1,4 @@
{
"name": "plane-api",
"version": "0.13.2"
}

View File

@ -0,0 +1,47 @@
# Django imports
from django.utils import timezone
from django.db.models import Q
# Third party imports
from rest_framework import authentication
from rest_framework.exceptions import AuthenticationFailed
# Module imports
from plane.db.models import APIToken
class APIKeyAuthentication(authentication.BaseAuthentication):
"""
Authentication with an API Key
"""
www_authenticate_realm = "api"
media_type = "application/json"
auth_header_name = "X-Api-Key"
def get_api_token(self, request):
return request.headers.get(self.auth_header_name)
def validate_api_token(self, token):
try:
api_token = APIToken.objects.get(
Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
token=token,
is_active=True,
)
except APIToken.DoesNotExist:
raise AuthenticationFailed("Given API token is not valid")
# save api token last used
api_token.last_used = timezone.now()
api_token.save(update_fields=["last_used"])
return (api_token.user, api_token.token)
def authenticate(self, request):
token = self.get_api_token(request=request)
if not token:
return None
# Validate the API token
user, token = self.validate_api_token(token)
return user, token
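
The new APIKeyAuthentication class reads the token from the X-Api-Key header, checks that it is active and not expired, stamps last_used, and authenticates the request as the token's owner. A hedged client-side sketch using the requests library; the base URL and endpoint path below are placeholders, not taken from this changeset:

```python
import requests

BASE_URL = "http://localhost:8000"   # placeholder deployment URL
API_KEY = "your-api-token-here"      # placeholder value of an existing APIToken

response = requests.get(
    f"{BASE_URL}/api/v1/workspaces/",   # hypothetical endpoint path
    headers={"X-Api-Key": API_KEY},     # header name defined by APIKeyAuthentication
    timeout=30,
)
response.raise_for_status()
print(response.json())
```
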

View File

@ -1,2 +0,0 @@
from .workspace import WorkSpaceBasePermission, WorkSpaceAdminPermission, WorkspaceEntityPermission, WorkspaceViewerPermission
from .project import ProjectBasePermission, ProjectEntityPermission, ProjectMemberPermission, ProjectLitePermission

View File

@ -0,0 +1,41 @@
from rest_framework.throttling import SimpleRateThrottle
class ApiKeyRateThrottle(SimpleRateThrottle):
scope = 'api_key'
rate = '60/minute'
def get_cache_key(self, request, view):
# Retrieve the API key from the request header
api_key = request.headers.get('X-Api-Key')
if not api_key:
return None # Allow the request if there's no API key
# Use the API key as part of the cache key
return f'{self.scope}:{api_key}'
def allow_request(self, request, view):
allowed = super().allow_request(request, view)
if allowed:
now = self.timer()
# Calculate the remaining limit and reset time
history = self.cache.get(self.key, [])
# Remove old histories
while history and history[-1] <= now - self.duration:
history.pop()
# Calculate the requests
num_requests = len(history)
# Check available requests
available = self.num_requests - num_requests
# Unix timestamp for when the rate limit will reset
reset_time = int(now + self.duration)
# Add headers
request.META['X-RateLimit-Remaining'] = max(0, available)
request.META['X-RateLimit-Reset'] = reset_time
return allowed
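
ApiKeyRateThrottle caps requests at a hard-coded 60/minute per API key (requests without a key are not throttled here) and records X-RateLimit-Remaining and X-RateLimit-Reset on request.META. How it gets attached to views is not shown in this excerpt; a plausible wiring, assuming a standard DRF view and an import path that may differ in the actual code:

```python
from rest_framework.response import Response
from rest_framework.views import APIView

# Assumed import path; the real module name may differ.
from plane.api.rate_limit import ApiKeyRateThrottle


class ExampleEndpoint(APIView):
    # Every request to this view is counted against the caller's X-Api-Key
    throttle_classes = [ApiKeyRateThrottle]

    def get(self, request):
        return Response(
            {
                "remaining": request.META.get("X-RateLimit-Remaining"),
                "reset_at": request.META.get("X-RateLimit-Reset"),
            }
        )
```
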

View File

@ -1,102 +1,17 @@
from .base import BaseSerializer from .user import UserLiteSerializer
from .user import ( from .workspace import WorkspaceLiteSerializer
UserSerializer, from .project import ProjectSerializer, ProjectLiteSerializer
UserLiteSerializer,
ChangePasswordSerializer,
ResetPasswordSerializer,
UserAdminLiteSerializer,
UserMeSerializer,
UserMeSettingsSerializer,
)
from .workspace import (
WorkSpaceSerializer,
WorkSpaceMemberSerializer,
TeamSerializer,
WorkSpaceMemberInviteSerializer,
WorkspaceLiteSerializer,
WorkspaceThemeSerializer,
WorkspaceMemberAdminSerializer,
WorkspaceMemberMeSerializer,
)
from .project import (
ProjectSerializer,
ProjectListSerializer,
ProjectDetailSerializer,
ProjectMemberSerializer,
ProjectMemberInviteSerializer,
ProjectIdentifierSerializer,
ProjectFavoriteSerializer,
ProjectLiteSerializer,
ProjectMemberLiteSerializer,
ProjectDeployBoardSerializer,
ProjectMemberAdminSerializer,
ProjectPublicMemberSerializer,
)
from .state import StateSerializer, StateLiteSerializer
from .view import GlobalViewSerializer, IssueViewSerializer, IssueViewFavoriteSerializer
from .cycle import (
CycleSerializer,
CycleIssueSerializer,
CycleFavoriteSerializer,
CycleWriteSerializer,
)
from .asset import FileAssetSerializer
from .issue import ( from .issue import (
IssueCreateSerializer,
IssueActivitySerializer,
IssueCommentSerializer,
IssuePropertySerializer,
IssueAssigneeSerializer,
LabelSerializer,
IssueSerializer, IssueSerializer,
IssueFlatSerializer, LabelSerializer,
IssueStateSerializer,
IssueLinkSerializer, IssueLinkSerializer,
IssueLiteSerializer,
IssueAttachmentSerializer, IssueAttachmentSerializer,
IssueSubscriberSerializer, IssueCommentSerializer,
IssueReactionSerializer, IssueAttachmentSerializer,
CommentReactionSerializer, IssueActivitySerializer,
IssueVoteSerializer, IssueExpandSerializer,
IssueRelationSerializer,
RelatedIssueSerializer,
IssuePublicSerializer,
) )
from .state import StateLiteSerializer, StateSerializer
from .module import ( from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer
ModuleWriteSerializer, from .module import ModuleSerializer, ModuleIssueSerializer, ModuleLiteSerializer
ModuleSerializer, from .inbox import InboxIssueSerializer
ModuleIssueSerializer,
ModuleLinkSerializer,
ModuleFavoriteSerializer,
)
from .api_token import APITokenSerializer
from .integration import (
IntegrationSerializer,
WorkspaceIntegrationSerializer,
GithubIssueSyncSerializer,
GithubRepositorySerializer,
GithubRepositorySyncSerializer,
GithubCommentSyncSerializer,
SlackProjectSyncSerializer,
)
from .importer import ImporterSerializer
from .page import PageSerializer, PageBlockSerializer, PageFavoriteSerializer
from .estimate import (
EstimateSerializer,
EstimatePointSerializer,
EstimateReadSerializer,
)
from .inbox import InboxSerializer, InboxIssueSerializer, IssueStateInboxSerializer
from .analytic import AnalyticViewSerializer
from .notification import NotificationSerializer
from .exporter import ExporterHistorySerializer

View File

@ -1,14 +0,0 @@
from .base import BaseSerializer
from plane.db.models import APIToken
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = [
"label",
"user",
"user_type",
"workspace",
"created_at",
]

View File

@ -1,22 +1,22 @@
# Third party imports
from rest_framework import serializers from rest_framework import serializers
class BaseSerializer(serializers.ModelSerializer): class BaseSerializer(serializers.ModelSerializer):
id = serializers.PrimaryKeyRelatedField(read_only=True) id = serializers.PrimaryKeyRelatedField(read_only=True)
class DynamicBaseSerializer(BaseSerializer):
def __init__(self, *args, **kwargs): def __init__(self, *args, **kwargs):
# If 'fields' is provided in the arguments, remove it and store it separately. # If 'fields' is provided in the arguments, remove it and store it separately.
# This is done so as not to pass this custom argument up to the superclass. # This is done so as not to pass this custom argument up to the superclass.
fields = kwargs.pop("fields", None) fields = kwargs.pop("fields", [])
self.expand = kwargs.pop("expand", []) or []
# Call the initialization of the superclass. # Call the initialization of the superclass.
super().__init__(*args, **kwargs) super().__init__(*args, **kwargs)
# If 'fields' was provided, filter the fields of the serializer accordingly. # If 'fields' was provided, filter the fields of the serializer accordingly.
if fields is not None: if fields:
self.fields = self._filter_fields(fields) self.fields = self._filter_fields(fields=fields)
def _filter_fields(self, fields): def _filter_fields(self, fields):
""" """
@ -52,7 +52,54 @@ class DynamicBaseSerializer(BaseSerializer):
allowed = set(allowed) allowed = set(allowed)
# Remove fields from the serializer that aren't in the 'allowed' list. # Remove fields from the serializer that aren't in the 'allowed' list.
for field_name in (existing - allowed): for field_name in existing - allowed:
self.fields.pop(field_name) self.fields.pop(field_name)
return self.fields return self.fields
def to_representation(self, instance):
response = super().to_representation(instance)
# Ensure 'expand' is iterable before processing
if self.expand:
for expand in self.expand:
if expand in self.fields:
# Import all the expandable serializers
from . import (
WorkspaceLiteSerializer,
ProjectLiteSerializer,
UserLiteSerializer,
StateLiteSerializer,
IssueSerializer,
)
# Expansion mapper
expansion = {
"user": UserLiteSerializer,
"workspace": WorkspaceLiteSerializer,
"project": ProjectLiteSerializer,
"default_assignee": UserLiteSerializer,
"project_lead": UserLiteSerializer,
"state": StateLiteSerializer,
"created_by": UserLiteSerializer,
"issue": IssueSerializer,
"actor": UserLiteSerializer,
"owned_by": UserLiteSerializer,
"members": UserLiteSerializer,
}
# Check if field in expansion then expand the field
if expand in expansion:
if isinstance(response.get(expand), list):
exp_serializer = expansion[expand](
getattr(instance, expand), many=True
)
else:
exp_serializer = expansion[expand](
getattr(instance, expand)
)
response[expand] = exp_serializer.data
else:
# You might need to handle this case differently
response[expand] = getattr(instance, f"{expand}_id", None)
return response
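
With this change the base serializer accepts two extra keyword arguments: fields, which trims the output to the listed keys, and expand, which replaces a related id with a nested representation via the expansion mapper above (IssueSerializer later in this diff does the same for assignees and labels). A small illustrative sketch; the queryset and chosen field names are assumptions, not part of the diff:

```python
from plane.api.serializers import IssueSerializer
from plane.db.models import Issue

issue = Issue.objects.first()  # assumes at least one issue exists

# Trim the payload and expand assignees into nested user objects
serializer = IssueSerializer(
    issue,
    fields=["id", "name", "assignees"],
    expand=["assignees"],
)
print(serializer.data)  # "assignees" holds nested user payloads instead of bare ids
```
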

View File

@ -3,43 +3,19 @@ from rest_framework import serializers
# Module imports # Module imports
from .base import BaseSerializer from .base import BaseSerializer
from .user import UserLiteSerializer from plane.db.models import Cycle, CycleIssue
from .issue import IssueStateSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectLiteSerializer
from plane.db.models import Cycle, CycleIssue, CycleFavorite
class CycleWriteSerializer(BaseSerializer):
def validate(self, data):
if (
data.get("start_date", None) is not None
and data.get("end_date", None) is not None
and data.get("start_date", None) > data.get("end_date", None)
):
raise serializers.ValidationError("Start date cannot exceed end date")
return data
class Meta:
model = Cycle
fields = "__all__"
class CycleSerializer(BaseSerializer): class CycleSerializer(BaseSerializer):
owned_by = UserLiteSerializer(read_only=True)
is_favorite = serializers.BooleanField(read_only=True)
total_issues = serializers.IntegerField(read_only=True) total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True) cancelled_issues = serializers.IntegerField(read_only=True)
completed_issues = serializers.IntegerField(read_only=True) completed_issues = serializers.IntegerField(read_only=True)
started_issues = serializers.IntegerField(read_only=True) started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True) unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True) backlog_issues = serializers.IntegerField(read_only=True)
assignees = serializers.SerializerMethodField(read_only=True)
total_estimates = serializers.IntegerField(read_only=True) total_estimates = serializers.IntegerField(read_only=True)
completed_estimates = serializers.IntegerField(read_only=True) completed_estimates = serializers.IntegerField(read_only=True)
started_estimates = serializers.IntegerField(read_only=True) started_estimates = serializers.IntegerField(read_only=True)
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
def validate(self, data): def validate(self, data):
if ( if (
@ -50,26 +26,6 @@ class CycleSerializer(BaseSerializer):
raise serializers.ValidationError("Start date cannot exceed end date") raise serializers.ValidationError("Start date cannot exceed end date")
return data return data
def get_assignees(self, obj):
members = [
{
"avatar": assignee.avatar,
"display_name": assignee.display_name,
"id": assignee.id,
}
for issue_cycle in obj.issue_cycle.prefetch_related(
"issue__assignees"
).all()
for assignee in issue_cycle.issue.assignees.all()
]
# Use a set comprehension to return only the unique objects
unique_objects = {frozenset(item.items()) for item in members}
# Convert the set back to a list of dictionaries
unique_list = [dict(item) for item in unique_objects]
return unique_list
class Meta: class Meta:
model = Cycle model = Cycle
fields = "__all__" fields = "__all__"
@ -81,7 +37,6 @@ class CycleSerializer(BaseSerializer):
class CycleIssueSerializer(BaseSerializer): class CycleIssueSerializer(BaseSerializer):
issue_detail = IssueStateSerializer(read_only=True, source="issue")
sub_issues_count = serializers.IntegerField(read_only=True) sub_issues_count = serializers.IntegerField(read_only=True)
class Meta: class Meta:
@ -94,14 +49,8 @@ class CycleIssueSerializer(BaseSerializer):
] ]
class CycleFavoriteSerializer(BaseSerializer): class CycleLiteSerializer(BaseSerializer):
cycle_detail = CycleSerializer(source="cycle", read_only=True)
class Meta: class Meta:
model = CycleFavorite model = Cycle
fields = "__all__" fields = "__all__"
read_only_fields = [
"workspace",
"project",
"user",
]

View File

@ -1,57 +1,19 @@
# Third party frameworks # Module improts
from rest_framework import serializers
# Module imports
from .base import BaseSerializer from .base import BaseSerializer
from .issue import IssueFlatSerializer, LabelLiteSerializer from plane.db.models import InboxIssue
from .project import ProjectLiteSerializer
from .state import StateLiteSerializer
from .user import UserLiteSerializer
from plane.db.models import Inbox, InboxIssue, Issue
class InboxSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(source="project", read_only=True)
pending_issue_count = serializers.IntegerField(read_only=True)
class Meta:
model = Inbox
fields = "__all__"
read_only_fields = [
"project",
"workspace",
]
class InboxIssueSerializer(BaseSerializer): class InboxIssueSerializer(BaseSerializer):
issue_detail = IssueFlatSerializer(source="issue", read_only=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
class Meta: class Meta:
model = InboxIssue model = InboxIssue
fields = "__all__" fields = "__all__"
read_only_fields = [ read_only_fields = [
"project", "id",
"workspace", "workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
] ]
class InboxIssueLiteSerializer(BaseSerializer):
class Meta:
model = InboxIssue
fields = ["id", "status", "duplicate_to", "snoozed_till", "source"]
read_only_fields = fields
class IssueStateInboxSerializer(BaseSerializer):
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
sub_issues_count = serializers.IntegerField(read_only=True)
bridge_id = serializers.UUIDField(read_only=True)
issue_inbox = InboxIssueLiteSerializer(read_only=True, many=True)
class Meta:
model = Issue
fields = "__all__"

View File

@ -1,87 +1,41 @@
# Django imports # Django imports
from django.utils import timezone from django.utils import timezone
# Third Party imports # Third party imports
from rest_framework import serializers from rest_framework import serializers
# Module imports # Module imports
from .base import BaseSerializer, DynamicBaseSerializer
from .user import UserLiteSerializer
from .state import StateSerializer, StateLiteSerializer
from .project import ProjectLiteSerializer
from .workspace import WorkspaceLiteSerializer
from plane.db.models import ( from plane.db.models import (
User, User,
Issue, Issue,
IssueActivity, State,
IssueComment,
IssueProperty,
IssueAssignee, IssueAssignee,
IssueSubscriber,
IssueLabel,
Label, Label,
CycleIssue, IssueLabel,
Cycle,
Module,
ModuleIssue,
IssueLink, IssueLink,
IssueComment,
IssueAttachment, IssueAttachment,
IssueReaction, IssueActivity,
CommentReaction, ProjectMember,
IssueVote,
IssueRelation,
) )
from .base import BaseSerializer
from .cycle import CycleSerializer, CycleLiteSerializer
from .module import ModuleSerializer, ModuleLiteSerializer
class IssueFlatSerializer(BaseSerializer): class IssueSerializer(BaseSerializer):
## Contain only flat fields
class Meta:
model = Issue
fields = [
"id",
"name",
"description",
"description_html",
"priority",
"start_date",
"target_date",
"sequence_id",
"sort_order",
"is_draft",
]
class IssueProjectLiteSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(source="project", read_only=True)
class Meta:
model = Issue
fields = [
"id",
"project_detail",
"name",
"sequence_id",
]
read_only_fields = fields
##TODO: Find a better way to write this serializer
## Find a better approach to save manytomany?
class IssueCreateSerializer(BaseSerializer):
state_detail = StateSerializer(read_only=True, source="state")
created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
assignees = serializers.ListField( assignees = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()), child=serializers.PrimaryKeyRelatedField(
queryset=User.objects.values_list("id", flat=True)
),
write_only=True, write_only=True,
required=False, required=False,
) )
labels = serializers.ListField( labels = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()), child=serializers.PrimaryKeyRelatedField(
queryset=Label.objects.values_list("id", flat=True)
),
write_only=True, write_only=True,
required=False, required=False,
) )
@ -90,6 +44,7 @@ class IssueCreateSerializer(BaseSerializer):
model = Issue model = Issue
fields = "__all__" fields = "__all__"
read_only_fields = [ read_only_fields = [
"id",
"workspace", "workspace",
"project", "project",
"created_by", "created_by",
@ -98,12 +53,6 @@ class IssueCreateSerializer(BaseSerializer):
"updated_at", "updated_at",
] ]
def to_representation(self, instance):
data = super().to_representation(instance)
data['assignees'] = [str(assignee.id) for assignee in instance.assignees.all()]
data['labels'] = [str(label.id) for label in instance.labels.all()]
return data
def validate(self, data): def validate(self, data):
if ( if (
data.get("start_date", None) is not None data.get("start_date", None) is not None
@ -111,6 +60,44 @@ class IssueCreateSerializer(BaseSerializer):
and data.get("start_date", None) > data.get("target_date", None) and data.get("start_date", None) > data.get("target_date", None)
): ):
raise serializers.ValidationError("Start date cannot exceed target date") raise serializers.ValidationError("Start date cannot exceed target date")
# Validate assignees are from project
if data.get("assignees", []):
data["assignees"] = ProjectMember.objects.filter(
project_id=self.context.get("project_id"),
is_active=True,
member_id__in=data["assignees"],
).values_list("member_id", flat=True)
# Validate labels are from project
if data.get("labels", []):
data["labels"] = Label.objects.filter(
project_id=self.context.get("project_id"),
id__in=data["labels"],
).values_list("id", flat=True)
# Check state is from the project only else raise validation error
if (
data.get("state")
and not State.objects.filter(
project_id=self.context.get("project_id"), pk=data.get("state")
).exists()
):
raise serializers.ValidationError(
"State is not valid please pass a valid state_id"
)
# Check parent issue is from workspace as it can be cross workspace
if (
data.get("parent")
and not Issue.objects.filter(
workspace_id=self.context.get("workspace_id"), pk=data.get("parent")
).exists()
):
raise serializers.ValidationError(
"Parent is not valid issue_id please pass a valid issue_id"
)
return data return data
def create(self, validated_data): def create(self, validated_data):
@ -131,14 +118,14 @@ class IssueCreateSerializer(BaseSerializer):
IssueAssignee.objects.bulk_create( IssueAssignee.objects.bulk_create(
[ [
IssueAssignee( IssueAssignee(
assignee=user, assignee_id=assignee_id,
issue=issue, issue=issue,
project_id=project_id, project_id=project_id,
workspace_id=workspace_id, workspace_id=workspace_id,
created_by_id=created_by_id, created_by_id=created_by_id,
updated_by_id=updated_by_id, updated_by_id=updated_by_id,
) )
for user in assignees for assignee_id in assignees
], ],
batch_size=10, batch_size=10,
) )
@ -158,14 +145,14 @@ class IssueCreateSerializer(BaseSerializer):
IssueLabel.objects.bulk_create( IssueLabel.objects.bulk_create(
[ [
IssueLabel( IssueLabel(
label=label, label_id=label_id,
issue=issue, issue=issue,
project_id=project_id, project_id=project_id,
workspace_id=workspace_id, workspace_id=workspace_id,
created_by_id=created_by_id, created_by_id=created_by_id,
updated_by_id=updated_by_id, updated_by_id=updated_by_id,
) )
for label in labels for label_id in labels
], ],
batch_size=10, batch_size=10,
) )
@ -187,14 +174,14 @@ class IssueCreateSerializer(BaseSerializer):
IssueAssignee.objects.bulk_create( IssueAssignee.objects.bulk_create(
[ [
IssueAssignee( IssueAssignee(
assignee=user, assignee_id=assignee_id,
issue=instance, issue=instance,
project_id=project_id, project_id=project_id,
workspace_id=workspace_id, workspace_id=workspace_id,
created_by_id=created_by_id, created_by_id=created_by_id,
updated_by_id=updated_by_id, updated_by_id=updated_by_id,
) )
for user in assignees for assignee_id in assignees
], ],
batch_size=10, batch_size=10,
) )
@ -204,14 +191,14 @@ class IssueCreateSerializer(BaseSerializer):
IssueLabel.objects.bulk_create( IssueLabel.objects.bulk_create(
[ [
IssueLabel( IssueLabel(
label=label, label_id=label_id,
issue=instance, issue=instance,
project_id=project_id, project_id=project_id,
workspace_id=workspace_id, workspace_id=workspace_id,
created_by_id=created_by_id, created_by_id=created_by_id,
updated_by_id=updated_by_id, updated_by_id=updated_by_id,
) )
for label in labels for label_id in labels
], ],
batch_size=10, batch_size=10,
) )
@ -220,157 +207,34 @@ class IssueCreateSerializer(BaseSerializer):
instance.updated_at = timezone.now() instance.updated_at = timezone.now()
return super().update(instance, validated_data) return super().update(instance, validated_data)
def to_representation(self, instance):
data = super().to_representation(instance)
if "assignees" in self.fields:
if "assignees" in self.expand:
from .user import UserLiteSerializer
class IssueActivitySerializer(BaseSerializer): data["assignees"] = UserLiteSerializer(
actor_detail = UserLiteSerializer(read_only=True, source="actor") instance.assignees.all(), many=True
issue_detail = IssueFlatSerializer(read_only=True, source="issue") ).data
project_detail = ProjectLiteSerializer(read_only=True, source="project") else:
data["assignees"] = [
str(assignee.id) for assignee in instance.assignees.all()
]
if "labels" in self.fields:
if "labels" in self.expand:
data["labels"] = LabelSerializer(instance.labels.all(), many=True).data
else:
data["labels"] = [str(label.id) for label in instance.labels.all()]
class Meta: return data
model = IssueActivity
fields = "__all__"
class IssuePropertySerializer(BaseSerializer):
class Meta:
model = IssueProperty
fields = "__all__"
read_only_fields = [
"user",
"workspace",
"project",
]
class LabelSerializer(BaseSerializer): class LabelSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
class Meta: class Meta:
model = Label model = Label
fields = "__all__" fields = "__all__"
read_only_fields = [ read_only_fields = [
"workspace",
"project",
]
class LabelLiteSerializer(BaseSerializer):
class Meta:
model = Label
fields = [
"id", "id",
"name",
"color",
]
class IssueLabelSerializer(BaseSerializer):
class Meta:
model = IssueLabel
fields = "__all__"
read_only_fields = [
"workspace",
"project",
]
class IssueRelationSerializer(BaseSerializer):
issue_detail = IssueProjectLiteSerializer(read_only=True, source="related_issue")
class Meta:
model = IssueRelation
fields = [
"issue_detail",
"relation_type",
"related_issue",
"issue",
"id"
]
read_only_fields = [
"workspace",
"project",
]
class RelatedIssueSerializer(BaseSerializer):
issue_detail = IssueProjectLiteSerializer(read_only=True, source="issue")
class Meta:
model = IssueRelation
fields = [
"issue_detail",
"relation_type",
"related_issue",
"issue",
"id"
]
read_only_fields = [
"workspace",
"project",
]
class IssueAssigneeSerializer(BaseSerializer):
assignee_details = UserLiteSerializer(read_only=True, source="assignee")
class Meta:
model = IssueAssignee
fields = "__all__"
class CycleBaseSerializer(BaseSerializer):
class Meta:
model = Cycle
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueCycleDetailSerializer(BaseSerializer):
cycle_detail = CycleBaseSerializer(read_only=True, source="cycle")
class Meta:
model = CycleIssue
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class ModuleBaseSerializer(BaseSerializer):
class Meta:
model = Module
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueModuleDetailSerializer(BaseSerializer):
module_detail = ModuleBaseSerializer(read_only=True, source="module")
class Meta:
model = ModuleIssue
fields = "__all__"
read_only_fields = [
"workspace", "workspace",
"project", "project",
"created_by", "created_by",
@ -381,19 +245,18 @@ class IssueModuleDetailSerializer(BaseSerializer):
class IssueLinkSerializer(BaseSerializer): class IssueLinkSerializer(BaseSerializer):
created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
class Meta: class Meta:
model = IssueLink model = IssueLink
fields = "__all__" fields = "__all__"
read_only_fields = [ read_only_fields = [
"id",
"workspace", "workspace",
"project", "project",
"issue",
"created_by", "created_by",
"updated_by", "updated_by",
"created_at", "created_at",
"updated_at", "updated_at",
"issue",
] ]
# Validation if url already exists # Validation if url already exists
@ -412,73 +275,25 @@ class IssueAttachmentSerializer(BaseSerializer):
model = IssueAttachment model = IssueAttachment
fields = "__all__" fields = "__all__"
read_only_fields = [ read_only_fields = [
"id",
"workspace",
"project",
"issue",
"created_by", "created_by",
"updated_by", "updated_by",
"created_at", "created_at",
"updated_at", "updated_at",
"workspace",
"project",
"issue",
] ]
class IssueReactionSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = IssueReaction
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
"actor",
]
class CommentReactionLiteSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = CommentReaction
fields = [
"id",
"reaction",
"comment",
"actor_detail",
]
class CommentReactionSerializer(BaseSerializer):
class Meta:
model = CommentReaction
fields = "__all__"
read_only_fields = ["workspace", "project", "comment", "actor"]
class IssueVoteSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = IssueVote
fields = ["issue", "vote", "workspace", "project", "actor", "actor_detail"]
read_only_fields = fields
class IssueCommentSerializer(BaseSerializer): class IssueCommentSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
comment_reactions = CommentReactionLiteSerializer(read_only=True, many=True)
is_member = serializers.BooleanField(read_only=True) is_member = serializers.BooleanField(read_only=True)
class Meta: class Meta:
model = IssueComment model = IssueComment
fields = "__all__" fields = "__all__"
read_only_fields = [ read_only_fields = [
"id",
"workspace", "workspace",
"project", "project",
"issue", "issue",
@ -489,56 +304,45 @@ class IssueCommentSerializer(BaseSerializer):
] ]
class IssueStateFlatSerializer(BaseSerializer): class IssueActivitySerializer(BaseSerializer):
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
class Meta: class Meta:
model = Issue model = IssueActivity
fields = [ exclude = [
"id", "created_by",
"sequence_id", "updated_by",
"name",
"state_detail",
"project_detail",
] ]
# Issue Serializer with state details class CycleIssueSerializer(BaseSerializer):
class IssueStateSerializer(BaseSerializer): cycle = CycleSerializer(read_only=True)
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
sub_issues_count = serializers.IntegerField(read_only=True)
bridge_id = serializers.UUIDField(read_only=True)
attachment_count = serializers.IntegerField(read_only=True)
link_count = serializers.IntegerField(read_only=True)
class Meta: class Meta:
model = Issue fields = [
fields = "__all__" "cycle",
]
class IssueSerializer(BaseSerializer): class ModuleIssueSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project") module = ModuleSerializer(read_only=True)
state_detail = StateSerializer(read_only=True, source="state")
parent_detail = IssueStateFlatSerializer(read_only=True, source="parent") class Meta:
label_details = LabelSerializer(read_only=True, source="labels", many=True) fields = [
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True) "module",
related_issues = IssueRelationSerializer(read_only=True, source="issue_relation", many=True) ]
issue_relations = RelatedIssueSerializer(read_only=True, source="issue_related", many=True)
issue_cycle = IssueCycleDetailSerializer(read_only=True)
issue_module = IssueModuleDetailSerializer(read_only=True) class IssueExpandSerializer(BaseSerializer):
issue_link = IssueLinkSerializer(read_only=True, many=True) # Serialize the related cycle. It's a OneToOne relation.
issue_attachment = IssueAttachmentSerializer(read_only=True, many=True) cycle = CycleLiteSerializer(source="issue_cycle.cycle", read_only=True)
sub_issues_count = serializers.IntegerField(read_only=True)
issue_reactions = IssueReactionSerializer(read_only=True, many=True) # Serialize the related module. It's a OneToOne relation.
module = ModuleLiteSerializer(source="issue_module.module", read_only=True)
class Meta: class Meta:
model = Issue model = Issue
fields = "__all__" fields = "__all__"
read_only_fields = [ read_only_fields = [
"id",
"workspace", "workspace",
"project", "project",
"created_by", "created_by",
@ -546,70 +350,3 @@ class IssueSerializer(BaseSerializer):
"created_at", "created_at",
"updated_at", "updated_at",
] ]
class IssueLiteSerializer(DynamicBaseSerializer):
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateLiteSerializer(read_only=True, source="state")
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
sub_issues_count = serializers.IntegerField(read_only=True)
cycle_id = serializers.UUIDField(read_only=True)
module_id = serializers.UUIDField(read_only=True)
attachment_count = serializers.IntegerField(read_only=True)
link_count = serializers.IntegerField(read_only=True)
issue_reactions = IssueReactionSerializer(read_only=True, many=True)
class Meta:
model = Issue
fields = "__all__"
read_only_fields = [
"start_date",
"target_date",
"completed_at",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssuePublicSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateLiteSerializer(read_only=True, source="state")
reactions = IssueReactionSerializer(read_only=True, many=True, source="issue_reactions")
votes = IssueVoteSerializer(read_only=True, many=True)
class Meta:
model = Issue
fields = [
"id",
"name",
"description_html",
"sequence_id",
"state",
"state_detail",
"project",
"project_detail",
"workspace",
"priority",
"target_date",
"reactions",
"votes",
]
read_only_fields = fields
class IssueSubscriberSerializer(BaseSerializer):
class Meta:
model = IssueSubscriber
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
]

View File

@ -1,36 +1,38 @@
# Third Party imports # Third party imports
from rest_framework import serializers from rest_framework import serializers
# Module imports # Module imports
from .base import BaseSerializer from .base import BaseSerializer
from .user import UserLiteSerializer
from .project import ProjectLiteSerializer
from .workspace import WorkspaceLiteSerializer
from plane.db.models import ( from plane.db.models import (
User, User,
Module, Module,
ModuleLink,
ModuleMember, ModuleMember,
ModuleIssue, ModuleIssue,
ModuleLink, ProjectMember,
ModuleFavorite,
) )
class ModuleWriteSerializer(BaseSerializer): class ModuleSerializer(BaseSerializer):
members = serializers.ListField( members = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()), child=serializers.PrimaryKeyRelatedField(
queryset=User.objects.values_list("id", flat=True)
),
write_only=True, write_only=True,
required=False, required=False,
) )
total_issues = serializers.IntegerField(read_only=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True) cancelled_issues = serializers.IntegerField(read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True) completed_issues = serializers.IntegerField(read_only=True)
started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True)
class Meta: class Meta:
model = Module model = Module
fields = "__all__" fields = "__all__"
read_only_fields = [ read_only_fields = [
"id",
"workspace", "workspace",
"project", "project",
"created_by", "created_by",
@ -41,12 +43,23 @@ class ModuleWriteSerializer(BaseSerializer):
def to_representation(self, instance): def to_representation(self, instance):
data = super().to_representation(instance) data = super().to_representation(instance)
data['members'] = [str(member.id) for member in instance.members.all()] data["members"] = [str(member.id) for member in instance.members.all()]
return data return data
def validate(self, data): def validate(self, data):
if data.get("start_date", None) is not None and data.get("target_date", None) is not None and data.get("start_date", None) > data.get("target_date", None): if (
data.get("start_date", None) is not None
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError("Start date cannot exceed target date") raise serializers.ValidationError("Start date cannot exceed target date")
if data.get("members", []):
data["members"] = ProjectMember.objects.filter(
project_id=self.context.get("project_id"),
member_id__in=data["members"],
).values_list("member_id", flat=True)
return data return data
def create(self, validated_data): def create(self, validated_data):
@ -99,23 +112,7 @@ class ModuleWriteSerializer(BaseSerializer):
return super().update(instance, validated_data) return super().update(instance, validated_data)
class ModuleFlatSerializer(BaseSerializer):
class Meta:
model = Module
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class ModuleIssueSerializer(BaseSerializer): class ModuleIssueSerializer(BaseSerializer):
module_detail = ModuleFlatSerializer(read_only=True, source="module")
issue_detail = ProjectLiteSerializer(read_only=True, source="issue")
sub_issues_count = serializers.IntegerField(read_only=True) sub_issues_count = serializers.IntegerField(read_only=True)
class Meta: class Meta:
@ -133,8 +130,6 @@ class ModuleIssueSerializer(BaseSerializer):
class ModuleLinkSerializer(BaseSerializer): class ModuleLinkSerializer(BaseSerializer):
created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
class Meta: class Meta:
model = ModuleLink model = ModuleLink
fields = "__all__" fields = "__all__"
@ -159,40 +154,8 @@ class ModuleLinkSerializer(BaseSerializer):
return ModuleLink.objects.create(**validated_data) return ModuleLink.objects.create(**validated_data)
class ModuleSerializer(BaseSerializer): class ModuleLiteSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project")
lead_detail = UserLiteSerializer(read_only=True, source="lead")
members_detail = UserLiteSerializer(read_only=True, many=True, source="members")
link_module = ModuleLinkSerializer(read_only=True, many=True)
is_favorite = serializers.BooleanField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True)
completed_issues = serializers.IntegerField(read_only=True)
started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True)
class Meta: class Meta:
model = Module model = Module
fields = "__all__" fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class ModuleFavoriteSerializer(BaseSerializer):
module_detail = ModuleFlatSerializer(source="module", read_only=True)
class Meta:
model = ModuleFavorite
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"user",
]

View File

@ -2,30 +2,59 @@
from rest_framework import serializers from rest_framework import serializers
# Module imports # Module imports
from .base import BaseSerializer, DynamicBaseSerializer from plane.db.models import Project, ProjectIdentifier, WorkspaceMember, State, Estimate
from plane.api.serializers.workspace import WorkSpaceSerializer, WorkspaceLiteSerializer from .base import BaseSerializer
from plane.api.serializers.user import UserLiteSerializer, UserAdminLiteSerializer
from plane.db.models import (
Project,
ProjectMember,
ProjectMemberInvite,
ProjectIdentifier,
ProjectFavorite,
ProjectDeployBoard,
ProjectPublicMember,
)
class ProjectSerializer(BaseSerializer): class ProjectSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_cycles = serializers.IntegerField(read_only=True)
total_modules = serializers.IntegerField(read_only=True)
is_member = serializers.BooleanField(read_only=True)
sort_order = serializers.FloatField(read_only=True)
member_role = serializers.IntegerField(read_only=True)
is_deployed = serializers.BooleanField(read_only=True)
class Meta: class Meta:
model = Project model = Project
fields = "__all__" fields = "__all__"
read_only_fields = [ read_only_fields = [
"id",
"workspace", "workspace",
"created_at",
"updated_at",
"created_by",
"updated_by",
] ]
def validate(self, data):
# Check project lead should be a member of the workspace
if (
data.get("project_lead", None) is not None
and not WorkspaceMember.objects.filter(
workspace_id=self.context["workspace_id"],
member_id=data.get("project_lead"),
).exists()
):
raise serializers.ValidationError(
"Project lead should be a user in the workspace"
)
# Check default assignee should be a member of the workspace
if (
data.get("default_assignee", None) is not None
and not WorkspaceMember.objects.filter(
workspace_id=self.context["workspace_id"],
member_id=data.get("default_assignee"),
).exists()
):
raise serializers.ValidationError(
"Default assignee should be a user in the workspace"
)
return data
def create(self, validated_data): def create(self, validated_data):
identifier = validated_data.get("identifier", "").strip().upper() identifier = validated_data.get("identifier", "").strip().upper()
if identifier == "": if identifier == "":
@ -35,6 +64,7 @@ class ProjectSerializer(BaseSerializer):
name=identifier, workspace_id=self.context["workspace_id"] name=identifier, workspace_id=self.context["workspace_id"]
).exists(): ).exists():
raise serializers.ValidationError(detail="Project Identifier is taken") raise serializers.ValidationError(detail="Project Identifier is taken")
project = Project.objects.create( project = Project.objects.create(
**validated_data, workspace_id=self.context["workspace_id"] **validated_data, workspace_id=self.context["workspace_id"]
) )
@ -45,36 +75,6 @@ class ProjectSerializer(BaseSerializer):
) )
return project return project
def update(self, instance, validated_data):
identifier = validated_data.get("identifier", "").strip().upper()
# If identifier is not passed update the project and return
if identifier == "":
project = super().update(instance, validated_data)
return project
# If no Project Identifier is found create it
project_identifier = ProjectIdentifier.objects.filter(
name=identifier, workspace_id=instance.workspace_id
).first()
if project_identifier is None:
project = super().update(instance, validated_data)
project_identifier = ProjectIdentifier.objects.filter(
project=project
).first()
if project_identifier is not None:
project_identifier.name = identifier
project_identifier.save()
return project
# If found check if the project_id to be updated and identifier project id is same
if project_identifier.project_id == instance.id:
# If same pass update
project = super().update(instance, validated_data)
return project
# If not same fail update
raise serializers.ValidationError(detail="Project Identifier is already taken")
class ProjectLiteSerializer(BaseSerializer): class ProjectLiteSerializer(BaseSerializer):
class Meta: class Meta:
@ -89,126 +89,3 @@ class ProjectLiteSerializer(BaseSerializer):
"description", "description",
] ]
read_only_fields = fields read_only_fields = fields
class ProjectListSerializer(DynamicBaseSerializer):
is_favorite = serializers.BooleanField(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_cycles = serializers.IntegerField(read_only=True)
total_modules = serializers.IntegerField(read_only=True)
is_member = serializers.BooleanField(read_only=True)
sort_order = serializers.FloatField(read_only=True)
member_role = serializers.IntegerField(read_only=True)
is_deployed = serializers.BooleanField(read_only=True)
members = serializers.SerializerMethodField()
def get_members(self, obj):
project_members = ProjectMember.objects.filter(project_id=obj.id).values(
"id",
"member_id",
"member__display_name",
"member__avatar",
)
return project_members
class Meta:
model = Project
fields = "__all__"
class ProjectDetailSerializer(BaseSerializer):
# workspace = WorkSpaceSerializer(read_only=True)
default_assignee = UserLiteSerializer(read_only=True)
project_lead = UserLiteSerializer(read_only=True)
is_favorite = serializers.BooleanField(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_cycles = serializers.IntegerField(read_only=True)
total_modules = serializers.IntegerField(read_only=True)
is_member = serializers.BooleanField(read_only=True)
sort_order = serializers.FloatField(read_only=True)
member_role = serializers.IntegerField(read_only=True)
is_deployed = serializers.BooleanField(read_only=True)
class Meta:
model = Project
fields = "__all__"
class ProjectMemberSerializer(BaseSerializer):
workspace = WorkspaceLiteSerializer(read_only=True)
project = ProjectLiteSerializer(read_only=True)
member = UserLiteSerializer(read_only=True)
class Meta:
model = ProjectMember
fields = "__all__"
class ProjectMemberAdminSerializer(BaseSerializer):
workspace = WorkspaceLiteSerializer(read_only=True)
project = ProjectLiteSerializer(read_only=True)
member = UserAdminLiteSerializer(read_only=True)
class Meta:
model = ProjectMember
fields = "__all__"
class ProjectMemberInviteSerializer(BaseSerializer):
project = ProjectLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
class Meta:
model = ProjectMemberInvite
fields = "__all__"
class ProjectIdentifierSerializer(BaseSerializer):
class Meta:
model = ProjectIdentifier
fields = "__all__"
class ProjectFavoriteSerializer(BaseSerializer):
class Meta:
model = ProjectFavorite
fields = "__all__"
read_only_fields = [
"workspace",
"user",
]
class ProjectMemberLiteSerializer(BaseSerializer):
member = UserLiteSerializer(read_only=True)
is_subscribed = serializers.BooleanField(read_only=True)
class Meta:
model = ProjectMember
fields = ["member", "id", "is_subscribed"]
read_only_fields = fields
class ProjectDeployBoardSerializer(BaseSerializer):
project_details = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
class Meta:
model = ProjectDeployBoard
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"anchor",
]
class ProjectPublicMemberSerializer(BaseSerializer):
class Meta:
model = ProjectPublicMember
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"member",
]
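A minimal usage sketch of the create-time identifier check kept in ProjectSerializer above (assumes a configured Django/DRF environment and an existing workspace; the helper name and variables are illustrative, not part of this diff):

from plane.api.serializers import ProjectSerializer

def create_project(workspace_id, name, identifier):
    # The calling view passes the target workspace through the serializer context.
    serializer = ProjectSerializer(
        data={"name": name, "identifier": identifier},
        context={"workspace_id": workspace_id},
    )
    serializer.is_valid(raise_exception=True)
    # create() checks ProjectIdentifier for this workspace and raises
    # ValidationError("Project Identifier is taken") on a duplicate.
    return serializer.save()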

View File

@@ -1,12 +1,16 @@
# Module imports
from .base import BaseSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectLiteSerializer
from plane.db.models import State
class StateSerializer(BaseSerializer):
def validate(self, data):
# If the default is being provided then make all other states default False
if data.get("default", False):
State.objects.filter(project_id=self.context.get("project_id")).update(
default=False
)
return data
class Meta:
model = State
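One behavioural note on the new StateSerializer.validate above: the demotion of sibling states happens inside validate() itself, so it runs as soon as is_valid() is called with default=True, before save(). A short sketch (the context key mirrors what the calling view supplies; the field values and IDs are illustrative):

from plane.api.serializers import StateSerializer

project_id = "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"  # placeholder UUID
serializer = StateSerializer(
    data={"name": "Triage", "color": "#ff5555", "default": True},
    context={"project_id": project_id},
)
# The project's other states are set to default=False at this point.
serializer.is_valid(raise_exception=True)
state = serializer.save(project_id=project_id)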

View File

@@ -1,111 +1,6 @@
# Third party imports
from rest_framework import serializers
# Module import
from .base import BaseSerializer
from plane.db.models import User, Workspace, WorkspaceMemberInvite
# Module imports
from plane.db.models import User
from .base import BaseSerializer
class UserSerializer(BaseSerializer):
class Meta:
model = User
fields = "__all__"
read_only_fields = [
"id",
"created_at",
"updated_at",
"is_superuser",
"is_staff",
"last_active",
"last_login_time",
"last_logout_time",
"last_login_ip",
"last_logout_ip",
"last_login_uagent",
"token_updated_at",
"is_onboarded",
"is_bot",
]
extra_kwargs = {"password": {"write_only": True}}
# If the user has already filled first name or last name then he is onboarded
def get_is_onboarded(self, obj):
return bool(obj.first_name) or bool(obj.last_name)
class UserMeSerializer(BaseSerializer):
class Meta:
model = User
fields = [
"id",
"avatar",
"cover_image",
"date_joined",
"display_name",
"email",
"first_name",
"last_name",
"is_active",
"is_bot",
"is_email_verified",
"is_managed",
"is_onboarded",
"is_tour_completed",
"mobile_number",
"role",
"onboarding_step",
"user_timezone",
"username",
"theme",
"last_workspace_id",
]
read_only_fields = fields
class UserMeSettingsSerializer(BaseSerializer):
workspace = serializers.SerializerMethodField()
class Meta:
model = User
fields = [
"id",
"email",
"workspace",
]
read_only_fields = fields
def get_workspace(self, obj):
workspace_invites = WorkspaceMemberInvite.objects.filter(
email=obj.email
).count()
if obj.last_workspace_id is not None:
workspace = Workspace.objects.filter(
pk=obj.last_workspace_id, workspace_member__member=obj.id
).first()
return {
"last_workspace_id": obj.last_workspace_id,
"last_workspace_slug": workspace.slug if workspace is not None else "",
"fallback_workspace_id": obj.last_workspace_id,
"fallback_workspace_slug": workspace.slug if workspace is not None else "",
"invites": workspace_invites,
}
else:
fallback_workspace = (
Workspace.objects.filter(workspace_member__member_id=obj.id)
.order_by("created_at")
.first()
)
return {
"last_workspace_id": None,
"last_workspace_slug": None,
"fallback_workspace_id": fallback_workspace.id
if fallback_workspace is not None
else None,
"fallback_workspace_slug": fallback_workspace.slug
if fallback_workspace is not None
else None,
"invites": workspace_invites,
}
class UserLiteSerializer(BaseSerializer):
@@ -123,41 +18,3 @@ class UserLiteSerializer(BaseSerializer):
"id",
"is_bot",
]
class UserAdminLiteSerializer(BaseSerializer):
class Meta:
model = User
fields = [
"id",
"first_name",
"last_name",
"avatar",
"is_bot",
"display_name",
"email",
]
read_only_fields = [
"id",
"is_bot",
]
class ChangePasswordSerializer(serializers.Serializer):
model = User
"""
Serializer for password change endpoint.
"""
old_password = serializers.CharField(required=True)
new_password = serializers.CharField(required=True)
class ResetPasswordSerializer(serializers.Serializer):
model = User
"""
Serializer for password change endpoint.
"""
new_password = serializers.CharField(required=True)
confirm_password = serializers.CharField(required=True)

View File

@@ -1,39 +1,10 @@
# Third party imports
from rest_framework import serializers
# Module imports
from plane.db.models import Workspace
from .base import BaseSerializer
from .user import UserLiteSerializer, UserAdminLiteSerializer
from plane.db.models import (
User,
Workspace,
WorkspaceMember,
Team,
TeamMember,
WorkspaceMemberInvite,
WorkspaceTheme,
)
class WorkSpaceSerializer(BaseSerializer):
owner = UserLiteSerializer(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
class Meta:
model = Workspace
fields = "__all__"
read_only_fields = [
"id",
"created_by",
"updated_by",
"created_at",
"updated_at",
"owner",
]
class WorkspaceLiteSerializer(BaseSerializer):
"""Lite serializer with only required fields"""
class Meta:
model = Workspace
fields = [
@@ -42,95 +13,3 @@ class WorkspaceLiteSerializer(BaseSerializer):
"id",
]
read_only_fields = fields
class WorkSpaceMemberSerializer(BaseSerializer):
member = UserLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
class Meta:
model = WorkspaceMember
fields = "__all__"
class WorkspaceMemberMeSerializer(BaseSerializer):
class Meta:
model = WorkspaceMember
fields = "__all__"
class WorkspaceMemberAdminSerializer(BaseSerializer):
member = UserAdminLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
class Meta:
model = WorkspaceMember
fields = "__all__"
class WorkSpaceMemberInviteSerializer(BaseSerializer):
workspace = WorkSpaceSerializer(read_only=True)
total_members = serializers.IntegerField(read_only=True)
created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
class Meta:
model = WorkspaceMemberInvite
fields = "__all__"
class TeamSerializer(BaseSerializer):
members_detail = UserLiteSerializer(read_only=True, source="members", many=True)
members = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
class Meta:
model = Team
fields = "__all__"
read_only_fields = [
"workspace",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
def create(self, validated_data, **kwargs):
if "members" in validated_data:
members = validated_data.pop("members")
workspace = self.context["workspace"]
team = Team.objects.create(**validated_data, workspace=workspace)
team_members = [
TeamMember(member=member, team=team, workspace=workspace)
for member in members
]
TeamMember.objects.bulk_create(team_members, batch_size=10)
return team
team = Team.objects.create(**validated_data)
return team
def update(self, instance, validated_data):
if "members" in validated_data:
members = validated_data.pop("members")
TeamMember.objects.filter(team=instance).delete()
team_members = [
TeamMember(member=member, team=instance, workspace=instance.workspace)
for member in members
]
TeamMember.objects.bulk_create(team_members, batch_size=10)
return super().update(instance, validated_data)
return super().update(instance, validated_data)
class WorkspaceThemeSerializer(BaseSerializer):
class Meta:
model = WorkspaceTheme
fields = "__all__"
read_only_fields = [
"workspace",
"actor",
]

View File

@@ -1,46 +1,15 @@
from .analytic import urlpatterns as analytic_urls
from .asset import urlpatterns as asset_urls
from .authentication import urlpatterns as authentication_urls
from .config import urlpatterns as configuration_urls
from .cycle import urlpatterns as cycle_urls
from .estimate import urlpatterns as estimate_urls
from .external import urlpatterns as external_urls
from .importer import urlpatterns as importer_urls
from .inbox import urlpatterns as inbox_urls
from .integration import urlpatterns as integration_urls
from .issue import urlpatterns as issue_urls
from .module import urlpatterns as module_urls
from .notification import urlpatterns as notification_urls
from .page import urlpatterns as page_urls
from .project import urlpatterns as project_urls
from .public_board import urlpatterns as public_board_urls
from .search import urlpatterns as search_urls
from .state import urlpatterns as state_urls
from .user import urlpatterns as user_urls
from .views import urlpatterns as view_urls
from .workspace import urlpatterns as workspace_urls
urlpatterns = [
*analytic_urls,
*asset_urls,
*authentication_urls,
*configuration_urls,
*cycle_urls,
*estimate_urls,
*external_urls,
*importer_urls,
*inbox_urls,
*integration_urls,
*issue_urls,
*module_urls,
*notification_urls,
*page_urls,
*project_urls,
*public_board_urls,
*search_urls,
*state_urls,
*user_urls,
*view_urls,
*workspace_urls,
]
from .project import urlpatterns as project_patterns
from .state import urlpatterns as state_patterns
from .issue import urlpatterns as issue_patterns
from .cycle import urlpatterns as cycle_patterns
from .module import urlpatterns as module_patterns
from .inbox import urlpatterns as inbox_patterns
urlpatterns = [
*project_patterns,
*state_patterns,
*issue_patterns,
*cycle_patterns,
*module_patterns,
*inbox_patterns,
]
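For orientation, this consolidated module is what the project-level URL configuration includes; a hedged sketch of the mount (the "api/v1/" prefix is an assumption, the real prefix lives in plane/urls.py and is not part of this hunk):

from django.urls import path, include

urlpatterns = [
    path("api/v1/", include("plane.api.urls")),
]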

View File

@@ -1,87 +1,35 @@
from django.urls import path
from plane.api.views import (
CycleViewSet,
CycleIssueViewSet,
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
TransferCycleIssueEndpoint,
)
from plane.api.views.cycle import (
CycleAPIEndpoint,
CycleIssueAPIEndpoint,
TransferCycleIssueAPIEndpoint,
)
urlpatterns = [ urlpatterns = [
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/", "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/",
CycleViewSet.as_view( CycleAPIEndpoint.as_view(),
{ name="cycles",
"get": "list",
"post": "create",
}
),
name="project-cycle",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:pk>/",
CycleViewSet.as_view( CycleAPIEndpoint.as_view(),
{ name="cycles",
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-cycle",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/", "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/",
CycleIssueViewSet.as_view( CycleIssueAPIEndpoint.as_view(),
{ name="cycle-issues",
"get": "list",
"post": "create",
}
),
name="project-issue-cycle",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:issue_id>/",
CycleIssueViewSet.as_view( CycleIssueAPIEndpoint.as_view(),
{ name="cycle-issues",
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/date-check/",
CycleDateCheckEndpoint.as_view(),
name="project-cycle-date",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-cycles/",
CycleFavoriteViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="user-favorite-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-cycles/<uuid:cycle_id>/",
CycleFavoriteViewSet.as_view(
{
"delete": "destroy",
}
),
name="user-favorite-cycle",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/transfer-issues/", "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/transfer-issues/",
TransferCycleIssueEndpoint.as_view(), TransferCycleIssueAPIEndpoint.as_view(),
name="transfer-issues", name="transfer-issues",
), ),
] ]

View File

@@ -1,53 +1,17 @@
from django.urls import path
from plane.api.views import (
InboxViewSet,
InboxIssueViewSet,
)
from plane.api.views import InboxIssueAPIEndpoint
urlpatterns = [
urlpatterns = [ urlpatterns = [
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/", "workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
InboxViewSet.as_view( InboxIssueAPIEndpoint.as_view(),
{
"get": "list",
"post": "create",
}
),
name="inbox",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:pk>/",
InboxViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="inbox",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/",
InboxIssueViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="inbox-issue", name="inbox-issue",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:issue_id>/",
InboxIssueViewSet.as_view( InboxIssueAPIEndpoint.as_view(),
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="inbox-issue", name="inbox-issue",
), ),
] ]

View File

@@ -1,327 +1,62 @@
from django.urls import path
from plane.api.views import (
IssueViewSet,
IssueListEndpoint,
IssueListGroupedEndpoint,
LabelViewSet,
BulkCreateIssueLabelsEndpoint,
BulkDeleteIssuesEndpoint,
BulkImportIssuesEndpoint,
UserWorkSpaceIssues,
SubIssuesEndpoint,
IssueLinkViewSet,
IssueAttachmentEndpoint,
ExportIssuesEndpoint,
IssueActivityEndpoint,
IssueCommentViewSet,
IssueSubscriberViewSet,
IssueReactionViewSet,
CommentReactionViewSet,
IssueUserDisplayPropertyEndpoint,
IssueArchiveViewSet,
IssueRelationViewSet,
IssueDraftViewSet,
)
from plane.api.views import (
IssueAPIEndpoint,
LabelAPIEndpoint,
IssueLinkAPIEndpoint,
IssueCommentAPIEndpoint,
IssueActivityAPIEndpoint,
)
urlpatterns = [ urlpatterns = [
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/", "workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueViewSet.as_view( IssueAPIEndpoint.as_view(),
{ name="issue",
"get": "list",
"post": "create",
}
),
name="project-issue",
),
path(
"v2/workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueListEndpoint.as_view(),
name="project-issue",
),
path(
"v3/workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueListGroupedEndpoint.as_view(),
name="project-issue",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/",
IssueViewSet.as_view( IssueAPIEndpoint.as_view(),
{ name="issue",
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/", "workspaces/<str:slug>/projects/<uuid:project_id>/labels/",
LabelViewSet.as_view( LabelAPIEndpoint.as_view(),
{ name="label",
"get": "list",
"post": "create",
}
),
name="project-issue-labels",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/labels/<uuid:pk>/",
LabelViewSet.as_view( LabelAPIEndpoint.as_view(),
{ name="label",
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-labels",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-create-labels/", "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/links/",
BulkCreateIssueLabelsEndpoint.as_view(), IssueLinkAPIEndpoint.as_view(),
name="project-bulk-labels", name="link",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-delete-issues/", "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/links/<uuid:pk>/",
BulkDeleteIssuesEndpoint.as_view(), IssueLinkAPIEndpoint.as_view(),
name="project-issues-bulk", name="link",
), ),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-import-issues/<str:service>/",
BulkImportIssuesEndpoint.as_view(),
name="project-issues-bulk",
),
path(
"workspaces/<str:slug>/my-issues/",
UserWorkSpaceIssues.as_view(),
name="workspace-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/sub-issues/",
SubIssuesEndpoint.as_view(),
name="sub-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/",
IssueLinkViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/<uuid:pk>/",
IssueLinkViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-attachments/",
IssueAttachmentEndpoint.as_view(),
name="project-issue-attachments",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-attachments/<uuid:pk>/",
IssueAttachmentEndpoint.as_view(),
name="project-issue-attachments",
),
path(
"workspaces/<str:slug>/export-issues/",
ExportIssuesEndpoint.as_view(),
name="export-issues",
),
## End Issues
## Issue Activity
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/history/",
IssueActivityEndpoint.as_view(),
name="project-issue-history",
),
## Issue Activity
## IssueComments
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/", "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/",
IssueCommentViewSet.as_view( IssueCommentAPIEndpoint.as_view(),
{ name="comment",
"get": "list",
"post": "create",
}
),
name="project-issue-comment",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/<uuid:pk>/",
IssueCommentViewSet.as_view( IssueCommentAPIEndpoint.as_view(),
{ name="comment",
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-comment",
),
## End IssueComments
# Issue Subscribers
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-subscribers/",
IssueSubscriberViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-subscribers",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-subscribers/<uuid:subscriber_id>/", "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activities/",
IssueSubscriberViewSet.as_view({"delete": "destroy"}), IssueActivityAPIEndpoint.as_view(),
name="project-issue-subscribers", name="activity",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/subscribe/", "workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activities/<uuid:pk>/",
IssueSubscriberViewSet.as_view( IssueActivityAPIEndpoint.as_view(),
{ name="activity",
"get": "subscription_status",
"post": "subscribe",
"delete": "unsubscribe",
}
),
name="project-issue-subscribers",
),
## End Issue Subscribers
# Issue Reactions
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/reactions/",
IssueReactionViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-reactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/reactions/<str:reaction_code>/",
IssueReactionViewSet.as_view(
{
"delete": "destroy",
}
),
name="project-issue-reactions",
),
## End Issue Reactions
# Comment Reactions
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/comments/<uuid:comment_id>/reactions/",
CommentReactionViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-comment-reactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/comments/<uuid:comment_id>/reactions/<str:reaction_code>/",
CommentReactionViewSet.as_view(
{
"delete": "destroy",
}
),
name="project-issue-comment-reactions",
),
## End Comment Reactions
## IssueProperty
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-display-properties/",
IssueUserDisplayPropertyEndpoint.as_view(),
name="project-issue-display-properties",
),
## IssueProperty End
## Issue Archives
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-issues/",
IssueArchiveViewSet.as_view(
{
"get": "list",
}
),
name="project-issue-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-issues/<uuid:pk>/",
IssueArchiveViewSet.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-issue-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/unarchive/<uuid:pk>/",
IssueArchiveViewSet.as_view(
{
"post": "unarchive",
}
),
name="project-issue-archive",
),
## End Issue Archives
## Issue Relation
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-relation/",
IssueRelationViewSet.as_view(
{
"post": "create",
}
),
name="issue-relation",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-relation/<uuid:pk>/",
IssueRelationViewSet.as_view(
{
"delete": "destroy",
}
),
name="issue-relation",
),
## End Issue Relation
## Issue Drafts
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-drafts/",
IssueDraftViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-draft",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-drafts/<uuid:pk>/",
IssueDraftViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-draft",
), ),
] ]

View File

@@ -1,104 +1,26 @@
from django.urls import path
from plane.api.views import ModuleAPIEndpoint, ModuleIssueAPIEndpoint
from plane.api.views import (
ModuleViewSet,
ModuleIssueViewSet,
ModuleLinkViewSet,
ModuleFavoriteViewSet,
BulkImportModulesEndpoint,
)
urlpatterns = [
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/", "workspaces/<str:slug>/projects/<uuid:project_id>/modules/",
ModuleViewSet.as_view( ModuleAPIEndpoint.as_view(),
{ name="modules",
"get": "list",
"post": "create",
}
),
name="project-modules",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:pk>/",
ModuleViewSet.as_view( ModuleAPIEndpoint.as_view(),
{ name="modules",
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-modules",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/", "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/",
ModuleIssueViewSet.as_view( ModuleIssueAPIEndpoint.as_view(),
{ name="module-issues",
"get": "list",
"post": "create",
}
),
name="project-module-issues",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:issue_id>/",
ModuleIssueViewSet.as_view( ModuleIssueAPIEndpoint.as_view(),
{ name="module-issues",
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-module-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-links/",
ModuleLinkViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-module-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-links/<uuid:pk>/",
ModuleLinkViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-module-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-modules/",
ModuleFavoriteViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="user-favorite-module",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-modules/<uuid:module_id>/",
ModuleFavoriteViewSet.as_view(
{
"delete": "destroy",
}
),
name="user-favorite-module",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-import-modules/<str:service>/",
BulkImportModulesEndpoint.as_view(),
name="bulk-modules-create",
), ),
] ]

View File

@@ -1,79 +0,0 @@
from django.urls import path
from plane.api.views import (
PageViewSet,
PageBlockViewSet,
PageFavoriteViewSet,
CreateIssueFromPageBlockEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/",
PageViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:pk>/",
PageViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/",
PageBlockViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-page-blocks",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:pk>/",
PageBlockViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-page-blocks",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/",
PageFavoriteViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="user-favorite-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/<uuid:page_id>/",
PageFavoriteViewSet.as_view(
{
"delete": "destroy",
}
),
name="user-favorite-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:page_block_id>/issues/",
CreateIssueFromPageBlockEndpoint.as_view(),
name="page-block-issues",
),
]

View File

@@ -1,132 +1,16 @@
from django.urls import path
from plane.api.views import (
ProjectViewSet,
InviteProjectEndpoint,
ProjectMemberViewSet,
ProjectMemberInvitationsViewset,
ProjectMemberUserEndpoint,
ProjectJoinEndpoint,
AddTeamToProjectEndpoint,
ProjectUserViewsEndpoint,
ProjectIdentifierEndpoint,
ProjectFavoritesViewSet,
LeaveProjectEndpoint,
ProjectPublicCoverImagesEndpoint,
)
from plane.api.views import ProjectAPIEndpoint
urlpatterns = [
path( path(
"workspaces/<str:slug>/projects/", "workspaces/<str:slug>/projects/",
ProjectViewSet.as_view( ProjectAPIEndpoint.as_view(),
{
"get": "list",
"post": "create",
}
),
name="project", name="project",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/",
ProjectViewSet.as_view( ProjectAPIEndpoint.as_view(),
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project", name="project",
), ),
path(
"workspaces/<str:slug>/project-identifiers/",
ProjectIdentifierEndpoint.as_view(),
name="project-identifiers",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invite/",
InviteProjectEndpoint.as_view(),
name="invite-project",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/",
ProjectMemberViewSet.as_view({"get": "list", "post": "create"}),
name="project-member",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/<uuid:pk>/",
ProjectMemberViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-member",
),
path(
"workspaces/<str:slug>/projects/join/",
ProjectJoinEndpoint.as_view(),
name="project-join",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/team-invite/",
AddTeamToProjectEndpoint.as_view(),
name="projects",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/",
ProjectMemberInvitationsViewset.as_view({"get": "list"}),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/<uuid:pk>/",
ProjectMemberInvitationsViewset.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-views/",
ProjectUserViewsEndpoint.as_view(),
name="project-view",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-members/me/",
ProjectMemberUserEndpoint.as_view(),
name="project-member-view",
),
path(
"workspaces/<str:slug>/user-favorite-projects/",
ProjectFavoritesViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-favorite",
),
path(
"workspaces/<str:slug>/user-favorite-projects/<uuid:project_id>/",
ProjectFavoritesViewSet.as_view(
{
"delete": "destroy",
}
),
name="project-favorite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/leave/",
LeaveProjectEndpoint.as_view(),
name="leave-project",
),
path(
"project-covers/",
ProjectPublicCoverImagesEndpoint.as_view(),
name="project-covers",
),
] ]

View File

@@ -1,151 +0,0 @@
from django.urls import path
from plane.api.views import (
ProjectDeployBoardViewSet,
ProjectDeployBoardPublicSettingsEndpoint,
ProjectIssuesPublicEndpoint,
IssueRetrievePublicEndpoint,
IssueCommentPublicViewSet,
IssueReactionPublicViewSet,
CommentReactionPublicViewSet,
InboxIssuePublicViewSet,
IssueVotePublicViewSet,
WorkspaceProjectDeployBoardEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-deploy-boards/",
ProjectDeployBoardViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-deploy-board",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-deploy-boards/<uuid:pk>/",
ProjectDeployBoardViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-deploy-board",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/settings/",
ProjectDeployBoardPublicSettingsEndpoint.as_view(),
name="project-deploy-board-settings",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/issues/",
ProjectIssuesPublicEndpoint.as_view(),
name="project-deploy-board",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/issues/<uuid:issue_id>/",
IssueRetrievePublicEndpoint.as_view(),
name="workspace-project-boards",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/issues/<uuid:issue_id>/comments/",
IssueCommentPublicViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="issue-comments-project-board",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/issues/<uuid:issue_id>/comments/<uuid:pk>/",
IssueCommentPublicViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="issue-comments-project-board",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/issues/<uuid:issue_id>/reactions/",
IssueReactionPublicViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="issue-reactions-project-board",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/issues/<uuid:issue_id>/reactions/<str:reaction_code>/",
IssueReactionPublicViewSet.as_view(
{
"delete": "destroy",
}
),
name="issue-reactions-project-board",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/comments/<uuid:comment_id>/reactions/",
CommentReactionPublicViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="comment-reactions-project-board",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/comments/<uuid:comment_id>/reactions/<str:reaction_code>/",
CommentReactionPublicViewSet.as_view(
{
"delete": "destroy",
}
),
name="comment-reactions-project-board",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/",
InboxIssuePublicViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="inbox-issue",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/<uuid:pk>/",
InboxIssuePublicViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="inbox-issue",
),
path(
"public/workspaces/<str:slug>/project-boards/<uuid:project_id>/issues/<uuid:issue_id>/votes/",
IssueVotePublicViewSet.as_view(
{
"get": "list",
"post": "create",
"delete": "destroy",
}
),
name="issue-vote-project-board",
),
path(
"public/workspaces/<str:slug>/project-boards/",
WorkspaceProjectDeployBoardEndpoint.as_view(),
name="workspace-project-boards",
),
]

View File

@@ -1,38 +1,16 @@
from django.urls import path
from plane.api.views import StateAPIEndpoint
from plane.api.views import StateViewSet
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/", "workspaces/<str:slug>/projects/<uuid:project_id>/states/",
StateViewSet.as_view( StateAPIEndpoint.as_view(),
{ name="states",
"get": "list",
"post": "create",
}
),
name="project-states",
), ),
path( path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/<uuid:pk>/", "workspaces/<str:slug>/projects/<uuid:project_id>/states/<uuid:state_id>/",
StateViewSet.as_view( StateAPIEndpoint.as_view(),
{ name="states",
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-state",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/<uuid:pk>/mark-default/",
StateViewSet.as_view(
{
"post": "mark_as_default",
}
),
name="project-state",
), ),
] ]
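To make the shape of the new single-class endpoints concrete, a hedged request sketch (the host, the "api/v1/" prefix and the X-API-Key header name are assumptions tied to the APIKeyAuthentication class further down; the identifiers are placeholders):

import requests

BASE = "https://plane.example.com/api/v1"      # deployment-specific prefix, assumed
HEADERS = {"X-API-Key": "<your-api-token>"}    # header name assumed for APIKeyAuthentication

project_id = "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"  # placeholder UUID
states = requests.get(
    f"{BASE}/workspaces/acme/projects/{project_id}/states/",
    headers=HEADERS,
).json()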

View File

@@ -1,180 +1,21 @@
from .project import ProjectAPIEndpoint
from .project import (
ProjectViewSet,
ProjectMemberViewSet,
UserProjectInvitationsViewset,
InviteProjectEndpoint,
AddTeamToProjectEndpoint,
ProjectMemberInvitationsViewset,
ProjectMemberInviteDetailViewSet,
ProjectIdentifierEndpoint,
ProjectJoinEndpoint,
ProjectUserViewsEndpoint,
ProjectMemberUserEndpoint,
ProjectFavoritesViewSet,
ProjectDeployBoardViewSet,
ProjectDeployBoardPublicSettingsEndpoint,
WorkspaceProjectDeployBoardEndpoint,
LeaveProjectEndpoint,
ProjectPublicCoverImagesEndpoint,
)
from .user import (
UserEndpoint,
UpdateUserOnBoardedEndpoint,
UpdateUserTourCompletedEndpoint,
UserActivityEndpoint,
)
from .oauth import OauthEndpoint
from .state import StateAPIEndpoint
from .base import BaseAPIView, BaseViewSet
from .workspace import (
WorkSpaceViewSet,
UserWorkSpacesEndpoint,
WorkSpaceAvailabilityCheckEndpoint,
InviteWorkspaceEndpoint,
JoinWorkspaceEndpoint,
WorkSpaceMemberViewSet,
TeamMemberViewSet,
WorkspaceInvitationsViewset,
UserWorkspaceInvitationsEndpoint,
UserWorkspaceInvitationEndpoint,
UserLastProjectWithWorkspaceEndpoint,
WorkspaceMemberUserEndpoint,
WorkspaceMemberUserViewsEndpoint,
UserActivityGraphEndpoint,
UserIssueCompletedGraphEndpoint,
UserWorkspaceDashboardEndpoint,
WorkspaceThemeViewSet,
WorkspaceUserProfileStatsEndpoint,
WorkspaceUserActivityEndpoint,
WorkspaceUserProfileEndpoint,
WorkspaceUserProfileIssuesEndpoint,
WorkspaceLabelsEndpoint,
LeaveWorkspaceEndpoint,
)
from .state import StateViewSet
from .view import (
GlobalViewViewSet,
GlobalViewIssuesViewSet,
IssueViewViewSet,
IssueViewFavoriteViewSet,
)
from .cycle import (
CycleViewSet,
CycleIssueViewSet,
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
TransferCycleIssueEndpoint,
)
from .asset import FileAssetEndpoint, UserAssetsEndpoint
from .issue import (
IssueViewSet,
IssueListEndpoint,
IssueListGroupedEndpoint,
WorkSpaceIssuesEndpoint,
IssueActivityEndpoint,
IssueCommentViewSet,
IssueUserDisplayPropertyEndpoint,
LabelViewSet,
BulkDeleteIssuesEndpoint,
UserWorkSpaceIssues,
SubIssuesEndpoint,
IssueLinkViewSet,
BulkCreateIssueLabelsEndpoint,
IssueAttachmentEndpoint,
IssueArchiveViewSet,
IssueSubscriberViewSet,
IssueCommentPublicViewSet,
CommentReactionViewSet,
IssueReactionViewSet,
IssueReactionPublicViewSet,
CommentReactionPublicViewSet,
IssueVotePublicViewSet,
IssueRelationViewSet,
IssueRetrievePublicEndpoint,
ProjectIssuesPublicEndpoint,
IssueDraftViewSet,
)
from .issue import (
IssueAPIEndpoint,
LabelAPIEndpoint,
IssueLinkAPIEndpoint,
IssueCommentAPIEndpoint,
IssueActivityAPIEndpoint,
)
from .auth_extended import (
VerifyEmailEndpoint,
RequestEmailVerificationEndpoint,
ForgotPasswordEndpoint,
ResetPasswordEndpoint,
ChangePasswordEndpoint,
)
from .cycle import (
CycleAPIEndpoint,
CycleIssueAPIEndpoint,
TransferCycleIssueAPIEndpoint,
)
from .module import ModuleAPIEndpoint, ModuleIssueAPIEndpoint
from .inbox import InboxIssueAPIEndpoint
from .authentication import (
SignUpEndpoint,
SignInEndpoint,
SignOutEndpoint,
MagicSignInEndpoint,
MagicSignInGenerateEndpoint,
)
from .module import (
ModuleViewSet,
ModuleIssueViewSet,
ModuleLinkViewSet,
ModuleFavoriteViewSet,
)
from .api_token import ApiTokenEndpoint
from .integration import (
WorkspaceIntegrationViewSet,
IntegrationViewSet,
GithubIssueSyncViewSet,
GithubRepositorySyncViewSet,
GithubCommentSyncViewSet,
GithubRepositoriesEndpoint,
BulkCreateGithubIssueSyncEndpoint,
SlackProjectSyncViewSet,
)
from .importer import (
ServiceIssueImportSummaryEndpoint,
ImportServiceEndpoint,
UpdateServiceImportStatusEndpoint,
BulkImportIssuesEndpoint,
BulkImportModulesEndpoint,
)
from .page import (
PageViewSet,
PageBlockViewSet,
PageFavoriteViewSet,
CreateIssueFromPageBlockEndpoint,
)
from .search import GlobalSearchEndpoint, IssueSearchEndpoint
from .external import GPTIntegrationEndpoint, ReleaseNotesEndpoint, UnsplashEndpoint
from .estimate import (
ProjectEstimatePointEndpoint,
BulkEstimatePointEndpoint,
)
from .inbox import InboxViewSet, InboxIssueViewSet, InboxIssuePublicViewSet
from .analytic import (
AnalyticsEndpoint,
AnalyticViewViewset,
SavedAnalyticEndpoint,
ExportAnalyticsEndpoint,
DefaultAnalyticsEndpoint,
)
from .notification import (
NotificationViewSet,
UnreadNotificationEndpoint,
MarkAllReadNotificationViewSet,
)
from .exporter import ExportIssuesEndpoint
from .config import ConfigurationEndpoint

View File

@@ -1,47 +0,0 @@
# Python import
from uuid import uuid4
# Third party
from rest_framework.response import Response
from rest_framework import status
from sentry_sdk import capture_exception
# Module import
from .base import BaseAPIView
from plane.db.models import APIToken
from plane.api.serializers import APITokenSerializer
class ApiTokenEndpoint(BaseAPIView):
def post(self, request):
label = request.data.get("label", str(uuid4().hex))
workspace = request.data.get("workspace", False)
if not workspace:
return Response(
{"error": "Workspace is required"}, status=status.HTTP_200_OK
)
api_token = APIToken.objects.create(
label=label, user=request.user, workspace_id=workspace
)
serializer = APITokenSerializer(api_token)
# Token will be only vissible while creating
return Response(
{"api_token": serializer.data, "token": api_token.token},
status=status.HTTP_201_CREATED,
)
def get(self, request):
api_tokens = APIToken.objects.filter(user=request.user)
serializer = APITokenSerializer(api_tokens, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, pk):
api_token = APIToken.objects.get(pk=pk)
api_token.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -1,151 +0,0 @@
## Python imports
import jwt
## Django imports
from django.contrib.auth.tokens import PasswordResetTokenGenerator
from django.utils.encoding import (
smart_str,
smart_bytes,
DjangoUnicodeDecodeError,
)
from django.utils.http import urlsafe_base64_decode, urlsafe_base64_encode
from django.conf import settings
## Third Party Imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework import permissions
from rest_framework_simplejwt.tokens import RefreshToken
from sentry_sdk import capture_exception
## Module imports
from . import BaseAPIView
from plane.api.serializers import (
ChangePasswordSerializer,
ResetPasswordSerializer,
)
from plane.db.models import User
from plane.bgtasks.email_verification_task import email_verification
from plane.bgtasks.forgot_password_task import forgot_password
class RequestEmailVerificationEndpoint(BaseAPIView):
def get(self, request):
token = RefreshToken.for_user(request.user).access_token
current_site = settings.WEB_URL
email_verification.delay(
request.user.first_name, request.user.email, token, current_site
)
return Response(
{"message": "Email sent successfully"}, status=status.HTTP_200_OK
)
class VerifyEmailEndpoint(BaseAPIView):
def get(self, request):
token = request.GET.get("token")
try:
payload = jwt.decode(token, settings.SECRET_KEY, algorithms="HS256")
user = User.objects.get(id=payload["user_id"])
if not user.is_email_verified:
user.is_email_verified = True
user.save()
return Response(
{"email": "Successfully activated"}, status=status.HTTP_200_OK
)
except jwt.ExpiredSignatureError as _indentifier:
return Response(
{"email": "Activation expired"}, status=status.HTTP_400_BAD_REQUEST
)
except jwt.exceptions.DecodeError as _indentifier:
return Response(
{"email": "Invalid token"}, status=status.HTTP_400_BAD_REQUEST
)
class ForgotPasswordEndpoint(BaseAPIView):
permission_classes = [permissions.AllowAny]
def post(self, request):
email = request.data.get("email")
if User.objects.filter(email=email).exists():
user = User.objects.get(email=email)
uidb64 = urlsafe_base64_encode(smart_bytes(user.id))
token = PasswordResetTokenGenerator().make_token(user)
current_site = settings.WEB_URL
forgot_password.delay(
user.first_name, user.email, uidb64, token, current_site
)
return Response(
{"message": "Check your email to reset your password"},
status=status.HTTP_200_OK,
)
return Response(
{"error": "Please check the email"}, status=status.HTTP_400_BAD_REQUEST
)
class ResetPasswordEndpoint(BaseAPIView):
permission_classes = [permissions.AllowAny]
def post(self, request, uidb64, token):
try:
id = smart_str(urlsafe_base64_decode(uidb64))
user = User.objects.get(id=id)
if not PasswordResetTokenGenerator().check_token(user, token):
return Response(
{"error": "token is not valid, please check the new one"},
status=status.HTTP_401_UNAUTHORIZED,
)
serializer = ResetPasswordSerializer(data=request.data)
if serializer.is_valid():
# set_password also hashes the password that the user will get
user.set_password(serializer.data.get("new_password"))
user.save()
response = {
"status": "success",
"code": status.HTTP_200_OK,
"message": "Password updated successfully",
}
return Response(response)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except DjangoUnicodeDecodeError as indentifier:
return Response(
{"error": "token is not valid, please check the new one"},
status=status.HTTP_401_UNAUTHORIZED,
)
class ChangePasswordEndpoint(BaseAPIView):
def post(self, request):
serializer = ChangePasswordSerializer(data=request.data)
user = User.objects.get(pk=request.user.id)
if serializer.is_valid():
# Check old password
if not user.object.check_password(serializer.data.get("old_password")):
return Response(
{"old_password": ["Wrong password."]},
status=status.HTTP_400_BAD_REQUEST,
)
# set_password also hashes the password that the user will get
self.object.set_password(serializer.data.get("new_password"))
self.object.save()
response = {
"status": "success",
"code": status.HTTP_200_OK,
"message": "Password updated successfully",
}
return Response(response)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

View File

@@ -1,27 +1,25 @@
# Python imports
import zoneinfo
import json
# Django imports
from django.urls import resolve
from django.conf import settings
from django.utils import timezone
from django.db import IntegrityError
from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.utils import timezone
# Third part imports
# Third party imports
from rest_framework import status
from rest_framework import status
from rest_framework.viewsets import ModelViewSet
from rest_framework.response import Response
from rest_framework.exceptions import APIException
from rest_framework.views import APIView
from rest_framework.filters import SearchFilter
from rest_framework.response import Response
from rest_framework.permissions import IsAuthenticated
from rest_framework import status
from sentry_sdk import capture_exception
from django_filters.rest_framework import DjangoFilterBackend
# Module imports
from plane.api.middleware.api_authentication import APIKeyAuthentication
from plane.api.rate_limit import ApiKeyRateThrottle
from plane.utils.paginator import BasePaginator
from plane.bgtasks.webhook_task import send_webhook
class TimezoneMixin:
@@ -29,6 +27,7 @@ class TimezoneMixin:
This enables timezone conversion according
to the user set timezone
"""
def initial(self, request, *args, **kwargs):
super().initial(request, *args, **kwargs)
if request.user.is_authenticated:
@@ -37,109 +36,50 @@ class TimezoneMixin:
timezone.deactivate()
class WebhookMixin:
webhook_event = None
bulk = False
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
# Check for the case should webhook be sent
if (
self.webhook_event
and self.request.method in ["POST", "PATCH", "DELETE"]
and response.status_code in [200, 201, 204]
):
# Push the object to delay
send_webhook.delay(
event=self.webhook_event,
payload=response.data,
kw=self.kwargs,
action=self.request.method,
slug=self.workspace_slug,
bulk=self.bulk,
)
return response
class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
model = None
permission_classes = [
IsAuthenticated,
]
filter_backends = (
DjangoFilterBackend,
SearchFilter,
)
filterset_fields = []
search_fields = []
def get_queryset(self):
try:
return self.model.objects.all()
except Exception as e:
capture_exception(e)
raise APIException("Please check the view", status.HTTP_400_BAD_REQUEST)
def handle_exception(self, exc):
"""
Handle any exception that occurs, by returning an appropriate response,
or re-raising the error.
"""
try:
response = super().handle_exception(exc)
return response
except Exception as e:
if isinstance(e, IntegrityError):
return Response({"error": "The payload is not valid"}, status=status.HTTP_400_BAD_REQUEST)
if isinstance(e, ValidationError):
return Response({"error": "Please provide valid detail"}, status=status.HTTP_400_BAD_REQUEST)
if isinstance(e, ObjectDoesNotExist):
model_name = str(exc).split(" matching query does not exist.")[0]
return Response({"error": f"{model_name} does not exist."}, status=status.HTTP_404_NOT_FOUND)
if isinstance(e, KeyError):
capture_exception(e)
return Response({"error": f"key {e} does not exist"}, status=status.HTTP_400_BAD_REQUEST)
print(e) if settings.DEBUG else print("Server Error")
capture_exception(e)
return Response({"error": "Something went wrong please try again later"}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
def dispatch(self, request, *args, **kwargs):
try:
response = super().dispatch(request, *args, **kwargs)
if settings.DEBUG:
from django.db import connection
print(
f"{request.method} - {request.get_full_path()} of Queries: {len(connection.queries)}"
)
return response
except Exception as exc:
response = self.handle_exception(exc)
return exc
@property
def workspace_slug(self):
return self.kwargs.get("slug", None)
@property
def project_id(self):
project_id = self.kwargs.get("project_id", None)
if project_id:
return project_id
if resolve(self.request.path_info).url_name == "project":
return self.kwargs.get("pk", None)
class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
authentication_classes = [
APIKeyAuthentication,
]
permission_classes = [
IsAuthenticated,
]
filter_backends = (
DjangoFilterBackend,
SearchFilter,
)
filterset_fields = []
search_fields = []
throttle_classes = [
ApiKeyRateThrottle,
]
def filter_queryset(self, queryset):
for backend in list(self.filter_backends):
queryset = backend().filter_queryset(self.request, queryset, self)
return queryset
def handle_exception(self, exc):
"""
Handle any exception that occurs, by returning an appropriate response,
@@ -150,27 +90,43 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
return response
except Exception as e:
if isinstance(e, IntegrityError):
return Response({"error": "The payload is not valid"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "The payload is not valid"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ValidationError):
return Response({"error": "Please provide valid detail"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{
"error": "The provided payload is not valid please try with a valid payload"
},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ObjectDoesNotExist):
model_name = str(exc).split(" matching query does not exist.")[0]
return Response({"error": f"{model_name} does not exist."}, status=status.HTTP_404_NOT_FOUND)
return Response(
{"error": f"{model_name} does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
if isinstance(e, KeyError):
return Response({"error": f"key {e} does not exist"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": f"key {e} does not exist"},
status=status.HTTP_400_BAD_REQUEST,
)
print(e) if settings.DEBUG else print("Server Error")
if settings.DEBUG:
print(e)
capture_exception(e)
return Response({"error": "Something went wrong please try again later"}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
return Response(
{"error": "Something went wrong please try again later"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
def dispatch(self, request, *args, **kwargs):
try:
response = super().dispatch(request, *args, **kwargs)
if settings.DEBUG:
from django.db import connection
@@ -178,11 +134,25 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
f"{request.method} - {request.get_full_path()} of Queries: {len(connection.queries)}"
)
return response
except Exception as exc:
response = self.handle_exception(exc)
return exc
def finalize_response(self, request, response, *args, **kwargs):
# Call super to get the default response
response = super().finalize_response(request, response, *args, **kwargs)
# Add custom headers if they exist in the request META
ratelimit_remaining = request.META.get("X-RateLimit-Remaining")
if ratelimit_remaining is not None:
response["X-RateLimit-Remaining"] = ratelimit_remaining
ratelimit_reset = request.META.get("X-RateLimit-Reset")
if ratelimit_reset is not None:
response["X-RateLimit-Reset"] = ratelimit_reset
return response
@property
def workspace_slug(self):
return self.kwargs.get("slug", None)
@@ -190,3 +160,17 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
@property
def project_id(self):
return self.kwargs.get("project_id", None)
@property
def fields(self):
fields = [
field for field in self.request.GET.get("fields", "").split(",") if field
]
return fields if fields else None
@property
def expand(self):
expand = [
expand for expand in self.request.GET.get("expand", "").split(",") if expand
]
return expand if expand else None
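A tiny, pure-Python illustration of what the new fields and expand properties return for a given query string (the helper mirrors their comma-splitting logic and is not part of the diff):

def parse_csv_param(raw):
    # Same rule as the fields/expand properties: split on commas,
    # drop empty pieces, fall back to None when nothing remains.
    items = [item for item in raw.split(",") if item]
    return items if items else None

print(parse_csv_param("id,name,owned_by"))  # ['id', 'name', 'owned_by']
print(parse_csv_param(""))                  # None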

View File

@@ -1,37 +0,0 @@
# Python imports
import os
# Django imports
from django.conf import settings
# Third party imports
from rest_framework.permissions import AllowAny
from rest_framework import status
from rest_framework.response import Response
from sentry_sdk import capture_exception
# Module imports
from .base import BaseAPIView
class ConfigurationEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request):
data = {}
data["google_client_id"] = os.environ.get("GOOGLE_CLIENT_ID", None)
data["github_client_id"] = os.environ.get("GITHUB_CLIENT_ID", None)
data["github_app_name"] = os.environ.get("GITHUB_APP_NAME", None)
data["magic_login"] = (
bool(settings.EMAIL_HOST_USER) and bool(settings.EMAIL_HOST_PASSWORD)
) and os.environ.get("ENABLE_MAGIC_LINK_LOGIN", "0") == "1"
data["email_password_login"] = (
os.environ.get("ENABLE_EMAIL_PASSWORD", "0") == "1"
)
data["slack_client_id"] = os.environ.get("SLACK_CLIENT_ID", None)
data["posthog_api_key"] = os.environ.get("POSTHOG_API_KEY", None)
data["posthog_host"] = os.environ.get("POSTHOG_HOST", None)
data["has_unsplash_configured"] = bool(settings.UNSPLASH_ACCESS_KEY)
return Response(data, status=status.HTTP_200_OK)

View File

@@ -2,81 +2,47 @@
import json
# Django imports
from django.db.models import Q, Count, Sum, Prefetch, F, OuterRef, Func
from django.db.models import (
Func,
F,
Q,
Exists,
OuterRef,
Count,
Prefetch,
Sum,
)
from django.core import serializers
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.core import serializers
from django.views.decorators.gzip import gzip_page
# Third party imports
from rest_framework.response import Response
from rest_framework import status
from sentry_sdk import capture_exception
# Module imports
from . import BaseViewSet, BaseAPIView
from .base import BaseAPIView, WebhookMixin
from plane.db.models import Cycle, Issue, CycleIssue, IssueLink, IssueAttachment
from plane.app.permissions import ProjectEntityPermission
from plane.api.serializers import (
CycleSerializer,
CycleIssueSerializer,
CycleFavoriteSerializer,
IssueStateSerializer,
CycleWriteSerializer,
)
from plane.api.permissions import ProjectEntityPermission
from plane.db.models import (
User,
Cycle,
CycleIssue,
Issue,
CycleFavorite,
IssueLink,
IssueAttachment,
Label,
) )
from plane.bgtasks.issue_activites_task import issue_activity from plane.bgtasks.issue_activites_task import issue_activity
from plane.utils.grouper import group_results
from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot
class CycleViewSet(BaseViewSet): class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to cycle.
"""
serializer_class = CycleSerializer serializer_class = CycleSerializer
model = Cycle model = Cycle
webhook_event = "cycle"
permission_classes = [ permission_classes = [
ProjectEntityPermission, ProjectEntityPermission,
] ]
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"), owned_by=self.request.user
)
def get_queryset(self): def get_queryset(self):
subquery = CycleFavorite.objects.filter( return (
user=self.request.user, Cycle.objects.filter(workspace__slug=self.kwargs.get("slug"))
cycle_id=OuterRef("pk"),
project_id=self.kwargs.get("project_id"),
workspace__slug=self.kwargs.get("slug"),
)
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id")) .filter(project_id=self.kwargs.get("project_id"))
.filter(project__project_projectmember__member=self.request.user) .filter(project__project_projectmember__member=self.request.user)
.select_related("project") .select_related("project")
.select_related("workspace") .select_related("workspace")
.select_related("owned_by") .select_related("owned_by")
.annotate(is_favorite=Exists(subquery))
.annotate( .annotate(
total_issues=Count( total_issues=Count(
"issue_cycle", "issue_cycle",
@@ -157,142 +123,62 @@ class CycleViewSet(BaseViewSet):
), ),
) )
) )
.prefetch_related( .order_by(self.kwargs.get("order_by", "-created_at"))
Prefetch(
"issue_cycle__issue__assignees",
queryset=User.objects.only("avatar", "first_name", "id").distinct(),
)
)
.prefetch_related(
Prefetch(
"issue_cycle__issue__labels",
queryset=Label.objects.only("name", "color", "id").distinct(),
)
)
.order_by("-is_favorite", "name")
.distinct() .distinct()
) )
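The annotations above rely on Django's conditional aggregation, Count(<relation>, filter=Q(...)), so totals such as total_issues are computed by the database in the same query that fetches the cycles. A minimal sketch of the idiom; it assumes a configured Django project with the plane models importable, and the filter condition (excluding archived and draft issues) is an assumption modelled on the distribution queries elsewhere in this diff, since the hunk is cut off before the real condition:

from django.db.models import Count, Q
from plane.db.models import Cycle  # assumes Django settings are configured

queryset = Cycle.objects.annotate(
    total_issues=Count(
        "issue_cycle",
        # Assumed condition; the actual filter is not visible in this hunk.
        filter=Q(
            issue_cycle__issue__archived_at__isnull=True,
            issue_cycle__issue__is_draft=False,
        ),
    )
)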
def list(self, request, slug, project_id): def get(self, request, slug, project_id, pk=None):
if pk:
queryset = self.get_queryset().get(pk=pk)
data = CycleSerializer(
queryset,
fields=self.fields,
expand=self.expand,
).data
return Response(
data,
status=status.HTTP_200_OK,
)
queryset = self.get_queryset() queryset = self.get_queryset()
cycle_view = request.GET.get("cycle_view", "all") cycle_view = request.GET.get("cycle_view", "all")
queryset = queryset.order_by("-is_favorite","-created_at")
# Current Cycle # Current Cycle
if cycle_view == "current": if cycle_view == "current":
queryset = queryset.filter( queryset = queryset.filter(
start_date__lte=timezone.now(), start_date__lte=timezone.now(),
end_date__gte=timezone.now(), end_date__gte=timezone.now(),
) )
data = CycleSerializer(
data = CycleSerializer(queryset, many=True).data queryset, many=True, fields=self.fields, expand=self.expand
).data
if len(data):
assignee_distribution = (
Issue.objects.filter(
issue_cycle__cycle_id=data[0]["id"],
workspace__slug=slug,
project_id=project_id,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
total_issues=Count(
"assignee_id",
filter=Q(archived_at__isnull=True, is_draft=False),
),
)
.annotate(
completed_issues=Count(
"assignee_id",
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
is_draft=False,
),
)
)
.annotate(
pending_issues=Count(
"assignee_id",
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
is_draft=False,
),
)
)
.order_by("display_name")
)
label_distribution = (
Issue.objects.filter(
issue_cycle__cycle_id=data[0]["id"],
workspace__slug=slug,
project_id=project_id,
)
.annotate(label_name=F("labels__name"))
.annotate(color=F("labels__color"))
.annotate(label_id=F("labels__id"))
.values("label_name", "color", "label_id")
.annotate(
total_issues=Count(
"label_id",
filter=Q(archived_at__isnull=True, is_draft=False),
)
)
.annotate(
completed_issues=Count(
"label_id",
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
is_draft=False,
),
)
)
.annotate(
pending_issues=Count(
"label_id",
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
is_draft=False,
),
)
)
.order_by("label_name")
)
data[0]["distribution"] = {
"assignees": assignee_distribution,
"labels": label_distribution,
"completion_chart": {},
}
if data[0]["start_date"] and data[0]["end_date"]:
data[0]["distribution"]["completion_chart"] = burndown_plot(
queryset=queryset.first(),
slug=slug,
project_id=project_id,
cycle_id=data[0]["id"],
)
return Response(data, status=status.HTTP_200_OK) return Response(data, status=status.HTTP_200_OK)
# Upcoming Cycles # Upcoming Cycles
if cycle_view == "upcoming": if cycle_view == "upcoming":
queryset = queryset.filter(start_date__gt=timezone.now()) queryset = queryset.filter(start_date__gt=timezone.now())
return Response( return self.paginate(
CycleSerializer(queryset, many=True).data, status=status.HTTP_200_OK request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
).data,
) )
# Completed Cycles # Completed Cycles
if cycle_view == "completed": if cycle_view == "completed":
queryset = queryset.filter(end_date__lt=timezone.now()) queryset = queryset.filter(end_date__lt=timezone.now())
return Response( return self.paginate(
CycleSerializer(queryset, many=True).data, status=status.HTTP_200_OK request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
).data,
) )
# Draft Cycles # Draft Cycles
@@ -301,9 +187,15 @@ class CycleViewSet(BaseViewSet):
end_date=None, end_date=None,
start_date=None, start_date=None,
) )
return self.paginate(
return Response( request=request,
CycleSerializer(queryset, many=True).data, status=status.HTTP_200_OK queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
).data,
) )
# Incomplete Cycles # Incomplete Cycles
@@ -311,16 +203,28 @@ class CycleViewSet(BaseViewSet):
queryset = queryset.filter( queryset = queryset.filter(
Q(end_date__gte=timezone.now().date()) | Q(end_date__isnull=True), Q(end_date__gte=timezone.now().date()) | Q(end_date__isnull=True),
) )
return Response( return self.paginate(
CycleSerializer(queryset, many=True).data, status=status.HTTP_200_OK request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
).data,
) )
return self.paginate(
# If no matching view is found return all cycles request=request,
return Response( queryset=(queryset),
CycleSerializer(queryset, many=True).data, status=status.HTTP_200_OK on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
).data,
) )
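Every branch of the new get handler returns through self.paginate(request=..., queryset=..., on_results=...), which the endpoint inherits from its paginator base class. That helper is not part of this diff, so the following is only an assumed sketch of a limit/offset variant with the same call signature; the parameter names per_page and cursor, the defaults, and the response keys are all placeholders:

from rest_framework import status
from rest_framework.response import Response

def paginate(request, queryset, on_results, default_per_page=100):
    # Assumed behaviour: slice the queryset from query parameters and hand the
    # page to on_results (a serializer callback) before wrapping it in a Response.
    try:
        per_page = int(request.GET.get("per_page", default_per_page))
        offset = int(request.GET.get("cursor", 0))
    except (TypeError, ValueError):
        return Response(
            {"error": "Invalid pagination parameters"},
            status=status.HTTP_400_BAD_REQUEST,
        )
    total_count = queryset.count()
    page = queryset[offset : offset + per_page]
    next_cursor = offset + per_page if offset + per_page < total_count else None
    return Response(
        {"total_count": total_count, "next_cursor": next_cursor, "results": on_results(page)},
        status=status.HTTP_200_OK,
    )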
def create(self, request, slug, project_id): def post(self, request, slug, project_id):
if ( if (
request.data.get("start_date", None) is None request.data.get("start_date", None) is None
and request.data.get("end_date", None) is None and request.data.get("end_date", None) is None
@@ -344,7 +248,7 @@ class CycleViewSet(BaseViewSet):
status=status.HTTP_400_BAD_REQUEST, status=status.HTTP_400_BAD_REQUEST,
) )
def partial_update(self, request, slug, project_id, pk): def patch(self, request, slug, project_id, pk):
cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk) cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
request_data = request.data request_data = request.data
@@ -363,115 +267,13 @@ class CycleViewSet(BaseViewSet):
status=status.HTTP_400_BAD_REQUEST, status=status.HTTP_400_BAD_REQUEST,
) )
serializer = CycleWriteSerializer(cycle, data=request.data, partial=True) serializer = CycleSerializer(cycle, data=request.data, partial=True)
if serializer.is_valid(): if serializer.is_valid():
serializer.save() serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK) return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST) return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def retrieve(self, request, slug, project_id, pk): def delete(self, request, slug, project_id, pk):
queryset = self.get_queryset().get(pk=pk)
# Assignee Distribution
assignee_distribution = (
Issue.objects.filter(
issue_cycle__cycle_id=pk,
workspace__slug=slug,
project_id=project_id,
)
.annotate(first_name=F("assignees__first_name"))
.annotate(last_name=F("assignees__last_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.annotate(display_name=F("assignees__display_name"))
.values("first_name", "last_name", "assignee_id", "avatar", "display_name")
.annotate(
total_issues=Count(
"assignee_id",
filter=Q(archived_at__isnull=True, is_draft=False),
),
)
.annotate(
completed_issues=Count(
"assignee_id",
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
is_draft=False,
),
)
)
.annotate(
pending_issues=Count(
"assignee_id",
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
is_draft=False,
),
)
)
.order_by("first_name", "last_name")
)
# Label Distribution
label_distribution = (
Issue.objects.filter(
issue_cycle__cycle_id=pk,
workspace__slug=slug,
project_id=project_id,
)
.annotate(label_name=F("labels__name"))
.annotate(color=F("labels__color"))
.annotate(label_id=F("labels__id"))
.values("label_name", "color", "label_id")
.annotate(
total_issues=Count(
"label_id",
filter=Q(archived_at__isnull=True, is_draft=False),
),
)
.annotate(
completed_issues=Count(
"label_id",
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
is_draft=False,
),
)
)
.annotate(
pending_issues=Count(
"label_id",
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
is_draft=False,
),
)
)
.order_by("label_name")
)
data = CycleSerializer(queryset).data
data["distribution"] = {
"assignees": assignee_distribution,
"labels": label_distribution,
"completion_chart": {},
}
if queryset.start_date and queryset.end_date:
data["distribution"]["completion_chart"] = burndown_plot(
queryset=queryset, slug=slug, project_id=project_id, cycle_id=pk
)
return Response(
data,
status=status.HTTP_200_OK,
)
def destroy(self, request, slug, project_id, pk):
cycle_issues = list( cycle_issues = list(
CycleIssue.objects.filter(cycle_id=self.kwargs.get("pk")).values_list( CycleIssue.objects.filter(cycle_id=self.kwargs.get("pk")).values_list(
"issue", flat=True "issue", flat=True
@@ -489,7 +291,7 @@ class CycleViewSet(BaseViewSet):
} }
), ),
actor_id=str(request.user.id), actor_id=str(request.user.id),
issue_id=str(pk), issue_id=None,
project_id=str(project_id), project_id=str(project_id),
current_instance=None, current_instance=None,
epoch=int(timezone.now().timestamp()), epoch=int(timezone.now().timestamp()),
@@ -499,24 +301,24 @@ class CycleViewSet(BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT) return Response(status=status.HTTP_204_NO_CONTENT)
class CycleIssueViewSet(BaseViewSet): class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
"""
This viewset automatically provides `list`, `create`,
and `destroy` actions related to cycle issues.
"""
serializer_class = CycleIssueSerializer serializer_class = CycleIssueSerializer
model = CycleIssue model = CycleIssue
webhook_event = "cycle_issue"
bulk = True
permission_classes = [ permission_classes = [
ProjectEntityPermission, ProjectEntityPermission,
] ]
filterset_fields = [
"issue__labels__id",
"issue__assignees__id",
]
def get_queryset(self): def get_queryset(self):
return self.filter_queryset( return (
super() CycleIssue.objects.annotate(
.get_queryset()
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue_id")) sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue_id"))
.order_by() .order_by()
.annotate(count=Func(F("id"), function="Count")) .annotate(count=Func(F("id"), function="Count"))
@@ -531,15 +333,12 @@ class CycleIssueViewSet(BaseViewSet):
.select_related("cycle") .select_related("cycle")
.select_related("issue", "issue__state", "issue__project") .select_related("issue", "issue__state", "issue__project")
.prefetch_related("issue__assignees", "issue__labels") .prefetch_related("issue__assignees", "issue__labels")
.order_by(self.kwargs.get("order_by", "-created_at"))
.distinct() .distinct()
) )
@method_decorator(gzip_page) def get(self, request, slug, project_id, cycle_id):
def list(self, request, slug, project_id, cycle_id):
order_by = request.GET.get("order_by", "created_at") order_by = request.GET.get("order_by", "created_at")
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
filters = issue_filters(request.query_params, "GET")
issues = ( issues = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id) Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
.annotate( .annotate(
@@ -558,7 +357,6 @@ class CycleIssueViewSet(BaseViewSet):
.prefetch_related("assignees") .prefetch_related("assignees")
.prefetch_related("labels") .prefetch_related("labels")
.order_by(order_by) .order_by(order_by)
.filter(**filters)
.annotate( .annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id")) link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by() .order_by()
@@ -573,29 +371,21 @@ class CycleIssueViewSet(BaseViewSet):
) )
) )
issues_data = IssueStateSerializer(issues, many=True).data return self.paginate(
request=request,
if sub_group_by and sub_group_by == group_by: queryset=(issues),
return Response( on_results=lambda issues: CycleSerializer(
{"error": "Group by and sub group by cannot be same"}, issues,
status=status.HTTP_400_BAD_REQUEST, many=True,
) fields=self.fields,
expand=self.expand,
if group_by: ).data,
grouped_results = group_results(issues_data, group_by, sub_group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(
issues_data, status=status.HTTP_200_OK
) )
def create(self, request, slug, project_id, cycle_id): def post(self, request, slug, project_id, cycle_id):
issues = request.data.get("issues", []) issues = request.data.get("issues", [])
if not len(issues): if not issues:
return Response( return Response(
{"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST {"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
) )
@@ -612,6 +402,10 @@ class CycleIssueViewSet(BaseViewSet):
status=status.HTTP_400_BAD_REQUEST, status=status.HTTP_400_BAD_REQUEST,
) )
issues = Issue.objects.filter(
pk__in=issues, workspace__slug=slug, project_id=project_id
).values_list("id", flat=True)
# Get all CycleIssues already created # Get all CycleIssues already created
cycle_issues = list(CycleIssue.objects.filter(issue_id__in=issues)) cycle_issues = list(CycleIssue.objects.filter(issue_id__in=issues))
update_cycle_issue_activity = [] update_cycle_issue_activity = []
@@ -662,7 +456,7 @@ class CycleIssueViewSet(BaseViewSet):
# Capture Issue Activity # Capture Issue Activity
issue_activity.delay( issue_activity.delay(
type="cycle.activity.created", type="cycle.activity.created",
requested_data=json.dumps({"cycles_list": issues}), requested_data=json.dumps({"cycles_list": str(issues)}),
actor_id=str(self.request.user.id), actor_id=str(self.request.user.id),
issue_id=None, issue_id=None,
project_id=str(self.kwargs.get("project_id", None)), project_id=str(self.kwargs.get("project_id", None)),
@@ -683,9 +477,9 @@ class CycleIssueViewSet(BaseViewSet):
status=status.HTTP_200_OK, status=status.HTTP_200_OK,
) )
def destroy(self, request, slug, project_id, cycle_id, pk): def delete(self, request, slug, project_id, cycle_id, issue_id):
cycle_issue = CycleIssue.objects.get( cycle_issue = CycleIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id issue_id=issue_id, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
) )
issue_id = cycle_issue.issue_id issue_id = cycle_issue.issue_id
cycle_issue.delete() cycle_issue.delete()
@@ -698,7 +492,7 @@ class CycleIssueViewSet(BaseViewSet):
} }
), ),
actor_id=str(self.request.user.id), actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)), issue_id=str(issue_id),
project_id=str(self.kwargs.get("project_id", None)), project_id=str(self.kwargs.get("project_id", None)),
current_instance=None, current_instance=None,
epoch=int(timezone.now().timestamp()), epoch=int(timezone.now().timestamp()),
@@ -706,74 +500,12 @@ class CycleIssueViewSet(BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT) return Response(status=status.HTTP_204_NO_CONTENT)
class CycleDateCheckEndpoint(BaseAPIView): class TransferCycleIssueAPIEndpoint(BaseAPIView):
permission_classes = [ """
ProjectEntityPermission, This viewset provides `create` actions for transfering the issues into a particular cycle.
]
def post(self, request, slug, project_id): """
start_date = request.data.get("start_date", False)
end_date = request.data.get("end_date", False)
cycle_id = request.data.get("cycle_id")
if not start_date or not end_date:
return Response(
{"error": "Start date and end date both are required"},
status=status.HTTP_400_BAD_REQUEST,
)
cycles = Cycle.objects.filter(
Q(workspace__slug=slug)
& Q(project_id=project_id)
& (
Q(start_date__lte=start_date, end_date__gte=start_date)
| Q(start_date__lte=end_date, end_date__gte=end_date)
| Q(start_date__gte=start_date, end_date__lte=end_date)
)
).exclude(pk=cycle_id)
if cycles.exists():
return Response(
{
"error": "You have a cycle already on the given dates, if you want to create a draft cycle you can do that by removing dates",
"status": False,
}
)
else:
return Response({"status": True}, status=status.HTTP_200_OK)
class CycleFavoriteViewSet(BaseViewSet):
serializer_class = CycleFavoriteSerializer
model = CycleFavorite
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(user=self.request.user)
.select_related("cycle", "cycle__owned_by")
)
def create(self, request, slug, project_id):
serializer = CycleFavoriteSerializer(data=request.data)
if serializer.is_valid():
serializer.save(user=request.user, project_id=project_id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def destroy(self, request, slug, project_id, cycle_id):
cycle_favorite = CycleFavorite.objects.get(
project=project_id,
user=request.user,
workspace__slug=slug,
cycle_id=cycle_id,
)
cycle_favorite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class TransferCycleIssueEndpoint(BaseAPIView):
permission_classes = [ permission_classes = [
ProjectEntityPermission, ProjectEntityPermission,
] ]


@@ -1,83 +1,30 @@
# Python imports
import json
-# Django import
+# Django improts
from django.utils import timezone
-from django.db.models import Q, Count, OuterRef, Func, F, Prefetch
+from django.db.models import Q
from django.core.serializers.json import DjangoJSONEncoder
# Third party imports
from rest_framework import status
from rest_framework.response import Response
-from sentry_sdk import capture_exception
# Module imports
-from .base import BaseViewSet
-from plane.api.permissions import ProjectBasePermission, ProjectLitePermission
-from plane.db.models import (
-Inbox,
-InboxIssue,
-Issue,
-State,
-IssueLink,
-IssueAttachment,
-ProjectMember,
-ProjectDeployBoard,
-)
-from plane.api.serializers import (
-IssueSerializer,
-InboxSerializer,
-InboxIssueSerializer,
-IssueCreateSerializer,
-IssueStateInboxSerializer,
-)
-from plane.utils.issue_filters import issue_filters
+from .base import BaseAPIView
+from plane.app.permissions import ProjectLitePermission
+from plane.api.serializers import InboxIssueSerializer, IssueSerializer
+from plane.db.models import InboxIssue, Issue, State, ProjectMember, Project, Inbox
from plane.bgtasks.issue_activites_task import issue_activity
class InboxViewSet(BaseViewSet): class InboxIssueAPIEndpoint(BaseAPIView):
permission_classes = [ """
ProjectBasePermission, This viewset automatically provides `list`, `create`, `retrieve`,
] `update` and `destroy` actions related to inbox issues.
serializer_class = InboxSerializer """
model = Inbox
def get_queryset(self):
return (
super()
.get_queryset()
.filter(
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
)
.annotate(
pending_issue_count=Count(
"issue_inbox",
filter=Q(issue_inbox__status=-2),
)
)
.select_related("workspace", "project")
)
def perform_create(self, serializer):
serializer.save(project_id=self.kwargs.get("project_id"))
def destroy(self, request, slug, project_id, pk):
inbox = Inbox.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
# Handle default inbox delete
if inbox.is_default:
return Response(
{"error": "You cannot delete the default inbox"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class InboxIssueViewSet(BaseViewSet):
permission_classes = [ permission_classes = [
ProjectLitePermission, ProjectLitePermission,
] ]
@@ -90,73 +37,77 @@ class InboxIssueViewSet(BaseViewSet):
] ]
def get_queryset(self): def get_queryset(self):
return self.filter_queryset( inbox = Inbox.objects.filter(
super() workspace__slug=self.kwargs.get("slug"),
.get_queryset() project_id=self.kwargs.get("project_id"),
.filter( ).first()
project = Project.objects.get(
workspace__slug=self.kwargs.get("slug"), pk=self.kwargs.get("project_id")
)
if inbox is None and not project.inbox_view:
return InboxIssue.objects.none()
return (
InboxIssue.objects.filter(
Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True), Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
workspace__slug=self.kwargs.get("slug"), workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"), project_id=self.kwargs.get("project_id"),
inbox_id=self.kwargs.get("inbox_id"), inbox_id=inbox.id,
) )
.select_related("issue", "workspace", "project") .select_related("issue", "workspace", "project")
.order_by(self.kwargs.get("order_by", "-created_at"))
) )
def list(self, request, slug, project_id, inbox_id): def get(self, request, slug, project_id, issue_id=None):
filters = issue_filters(request.query_params, "GET") if issue_id:
issues = ( inbox_issue_queryset = self.get_queryset().get(issue_id=issue_id)
Issue.objects.filter( inbox_issue_data = InboxIssueSerializer(
issue_inbox__inbox_id=inbox_id, inbox_issue_queryset,
workspace__slug=slug, fields=self.fields,
project_id=project_id, expand=self.expand,
).data
return Response(
inbox_issue_data,
status=status.HTTP_200_OK,
) )
.filter(**filters) issue_queryset = self.get_queryset()
.annotate(bridge_id=F("issue_inbox__id")) return self.paginate(
.select_related("workspace", "project", "state", "parent") request=request,
.prefetch_related("assignees", "labels") queryset=(issue_queryset),
.order_by("issue_inbox__snoozed_till", "issue_inbox__status") on_results=lambda inbox_issues: InboxIssueSerializer(
.annotate( inbox_issues,
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id")) many=True,
.order_by() fields=self.fields,
.annotate(count=Func(F("id"), function="Count")) expand=self.expand,
.values("count") ).data,
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.prefetch_related(
Prefetch(
"issue_inbox",
queryset=InboxIssue.objects.only(
"status", "duplicate_to", "snoozed_till", "source"
),
)
)
)
issues_data = IssueStateInboxSerializer(issues, many=True).data
return Response(
issues_data,
status=status.HTTP_200_OK,
) )
def post(self, request, slug, project_id):
def create(self, request, slug, project_id, inbox_id):
if not request.data.get("issue", {}).get("name", False): if not request.data.get("issue", {}).get("name", False):
return Response( return Response(
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST {"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
) )
inbox = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
# Inbox view
if inbox is None and not project.inbox_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project settings"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check for valid priority # Check for valid priority
if not request.data.get("issue", {}).get("priority", "none") in [ if not request.data.get("issue", {}).get("priority", "none") in [
"low", "low",
@@ -198,48 +149,83 @@ class InboxIssueViewSet(BaseViewSet):
issue_id=str(issue.id), issue_id=str(issue.id),
project_id=str(project_id), project_id=str(project_id),
current_instance=None, current_instance=None,
epoch=int(timezone.now().timestamp()) epoch=int(timezone.now().timestamp()),
) )
# create an inbox issue # create an inbox issue
InboxIssue.objects.create( inbox_issue = InboxIssue.objects.create(
inbox_id=inbox_id, inbox_id=inbox.id,
project_id=project_id, project_id=project_id,
issue=issue, issue=issue,
source=request.data.get("source", "in-app"), source=request.data.get("source", "in-app"),
) )
serializer = IssueStateInboxSerializer(issue) serializer = InboxIssueSerializer(inbox_issue)
return Response(serializer.data, status=status.HTTP_200_OK) return Response(serializer.data, status=status.HTTP_200_OK)
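For reference, the request body that the new post handler validates above has roughly the following shape; the field names come straight from the handler, while the concrete values are only illustrative:

# Illustrative payload for creating an inbox issue through this endpoint.
payload = {
    "issue": {
        "name": "Crash when opening settings",       # required
        "description_html": "<p>Steps to reproduce</p>",
        "priority": "high",                           # one of: low, medium, high, urgent, none
    },
    "source": "in-app",                               # defaults to "in-app" when omitted
}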
def partial_update(self, request, slug, project_id, inbox_id, pk): def patch(self, request, slug, project_id, issue_id):
inbox_issue = InboxIssue.objects.get( inbox = Inbox.objects.filter(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
) )
# Inbox view
if inbox is None and not project.inbox_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project settings"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the inbox issue
inbox_issue = InboxIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox.id,
)
# Get the project member # Get the project member
project_member = ProjectMember.objects.get(workspace__slug=slug, project_id=project_id, member=request.user) project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=request.user,
is_active=True,
)
# Only project members admins and created_by users can access this endpoint # Only project members admins and created_by users can access this endpoint
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(request.user.id): if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
return Response({"error": "You cannot edit inbox issues"}, status=status.HTTP_400_BAD_REQUEST) request.user.id
):
return Response(
{"error": "You cannot edit inbox issues"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get issue data # Get issue data
issue_data = request.data.pop("issue", False) issue_data = request.data.pop("issue", False)
if bool(issue_data): if bool(issue_data):
issue = Issue.objects.get( issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id pk=issue_id, workspace__slug=slug, project_id=project_id
) )
# Only allow guests and viewers to edit name and description # Only allow guests and viewers to edit name and description
if project_member.role <= 10: if project_member.role <= 10:
# viewers and guests since only viewers and guests # viewers and guests since only viewers and guests
issue_data = { issue_data = {
"name": issue_data.get("name", issue.name), "name": issue_data.get("name", issue.name),
"description_html": issue_data.get("description_html", issue.description_html), "description_html": issue_data.get(
"description": issue_data.get("description", issue.description) "description_html", issue.description_html
),
"description": issue_data.get("description", issue.description),
} }
issue_serializer = IssueCreateSerializer( issue_serializer = IssueSerializer(issue, data=issue_data, partial=True)
issue, data=issue_data, partial=True
)
if issue_serializer.is_valid(): if issue_serializer.is_valid():
current_instance = issue current_instance = issue
@@ -250,13 +236,13 @@ class InboxIssueViewSet(BaseViewSet):
type="issue.activity.updated", type="issue.activity.updated",
requested_data=requested_data, requested_data=requested_data,
actor_id=str(request.user.id), actor_id=str(request.user.id),
issue_id=str(issue.id), issue_id=str(issue_id),
project_id=str(project_id), project_id=str(project_id),
current_instance=json.dumps( current_instance=json.dumps(
IssueSerializer(current_instance).data, IssueSerializer(current_instance).data,
cls=DjangoJSONEncoder, cls=DjangoJSONEncoder,
), ),
epoch=int(timezone.now().timestamp()) epoch=int(timezone.now().timestamp()),
) )
issue_serializer.save() issue_serializer.save()
else: else:
@@ -275,7 +261,7 @@ class InboxIssueViewSet(BaseViewSet):
# Update the issue state if the issue is rejected or marked as duplicate # Update the issue state if the issue is rejected or marked as duplicate
if serializer.data["status"] in [-1, 2]: if serializer.data["status"] in [-1, 2]:
issue = Issue.objects.get( issue = Issue.objects.get(
pk=inbox_issue.issue_id, pk=issue_id,
workspace__slug=slug, workspace__slug=slug,
project_id=project_id, project_id=project_id,
) )
@@ -289,7 +275,7 @@ class InboxIssueViewSet(BaseViewSet):
# Update the issue state if it is accepted # Update the issue state if it is accepted
if serializer.data["status"] in [1]: if serializer.data["status"] in [1]:
issue = Issue.objects.get( issue = Issue.objects.get(
pk=inbox_issue.issue_id, pk=issue_id,
workspace__slug=slug, workspace__slug=slug,
project_id=project_id, project_id=project_id,
) )
@@ -307,253 +293,60 @@ class InboxIssueViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK) return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST) return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
else: else:
return Response(InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK) return Response(
InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
)
def retrieve(self, request, slug, project_id, inbox_id, pk): def delete(self, request, slug, project_id, issue_id):
inbox_issue = InboxIssue.objects.get( inbox = Inbox.objects.filter(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id workspace__slug=slug, project_id=project_id
) ).first()
issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
)
serializer = IssueStateInboxSerializer(issue)
return Response(serializer.data, status=status.HTTP_200_OK)
def destroy(self, request, slug, project_id, inbox_id, pk): project = Project.objects.get(
inbox_issue = InboxIssue.objects.get( workspace__slug=slug,
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id pk=project_id,
) )
# Inbox view
if inbox is None and not project.inbox_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project settings"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the inbox issue
inbox_issue = InboxIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox.id,
)
# Get the project member # Get the project member
project_member = ProjectMember.objects.get(workspace__slug=slug, project_id=project_id, member=request.user) project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=request.user,
is_active=True,
)
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(request.user.id): # Check the inbox issue created
return Response({"error": "You cannot delete inbox issue"}, status=status.HTTP_400_BAD_REQUEST) if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
):
return Response(
{"error": "You cannot delete inbox issue"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check the issue status # Check the issue status
if inbox_issue.status in [-2, -1, 0, 2]: if inbox_issue.status in [-2, -1, 0, 2]:
# Delete the issue also # Delete the issue also
Issue.objects.filter(workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id).delete()
inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class InboxIssuePublicViewSet(BaseViewSet):
serializer_class = InboxIssueSerializer
model = InboxIssue
filterset_fields = [
"status",
]
def get_queryset(self):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=self.kwargs.get("slug"), project_id=self.kwargs.get("project_id"))
if project_deploy_board is not None:
return self.filter_queryset(
super()
.get_queryset()
.filter(
Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
project_id=self.kwargs.get("project_id"),
workspace__slug=self.kwargs.get("slug"),
inbox_id=self.kwargs.get("inbox_id"),
)
.select_related("issue", "workspace", "project")
)
return InboxIssue.objects.none()
def list(self, request, slug, project_id, inbox_id):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
filters = issue_filters(request.query_params, "GET")
issues = (
Issue.objects.filter( Issue.objects.filter(
issue_inbox__inbox_id=inbox_id, workspace__slug=slug, project_id=project_id, pk=issue_id
workspace__slug=slug, ).delete()
project_id=project_id,
)
.filter(**filters)
.annotate(bridge_id=F("issue_inbox__id"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels")
.order_by("issue_inbox__snoozed_till", "issue_inbox__status")
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.prefetch_related(
Prefetch(
"issue_inbox",
queryset=InboxIssue.objects.only(
"status", "duplicate_to", "snoozed_till", "source"
),
)
)
)
issues_data = IssueStateInboxSerializer(issues, many=True).data
return Response(
issues_data,
status=status.HTTP_200_OK,
)
def create(self, request, slug, project_id, inbox_id):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
if not request.data.get("issue", {}).get("name", False):
return Response(
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
)
# Check for valid priority
if not request.data.get("issue", {}).get("priority", "none") in [
"low",
"medium",
"high",
"urgent",
"none",
]:
return Response(
{"error": "Invalid priority"}, status=status.HTTP_400_BAD_REQUEST
)
# Create or get state
state, _ = State.objects.get_or_create(
name="Triage",
group="backlog",
description="Default state for managing all Inbox Issues",
project_id=project_id,
color="#ff7700",
)
# create an issue
issue = Issue.objects.create(
name=request.data.get("issue", {}).get("name"),
description=request.data.get("issue", {}).get("description", {}),
description_html=request.data.get("issue", {}).get(
"description_html", "<p></p>"
),
priority=request.data.get("issue", {}).get("priority", "low"),
project_id=project_id,
state=state,
)
# Create an Issue Activity
issue_activity.delay(
type="issue.activity.created",
requested_data=json.dumps(request.data, cls=DjangoJSONEncoder),
actor_id=str(request.user.id),
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp())
)
# create an inbox issue
InboxIssue.objects.create(
inbox_id=inbox_id,
project_id=project_id,
issue=issue,
source=request.data.get("source", "in-app"),
)
serializer = IssueStateInboxSerializer(issue)
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
# Get the project member
if str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot edit inbox issues"}, status=status.HTTP_400_BAD_REQUEST)
# Get issue data
issue_data = request.data.pop("issue", False)
issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
)
# viewers and guests since only viewers and guests
issue_data = {
"name": issue_data.get("name", issue.name),
"description_html": issue_data.get("description_html", issue.description_html),
"description": issue_data.get("description", issue.description)
}
issue_serializer = IssueCreateSerializer(
issue, data=issue_data, partial=True
)
if issue_serializer.is_valid():
current_instance = issue
# Log all the updates
requested_data = json.dumps(issue_data, cls=DjangoJSONEncoder)
if issue is not None:
issue_activity.delay(
type="issue.activity.updated",
requested_data=requested_data,
actor_id=str(request.user.id),
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=json.dumps(
IssueSerializer(current_instance).data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp())
)
issue_serializer.save()
return Response(issue_serializer.data, status=status.HTTP_200_OK)
return Response(issue_serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def retrieve(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
)
serializer = IssueStateInboxSerializer(issue)
return Response(serializer.data, status=status.HTTP_200_OK)
def destroy(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
if str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot delete inbox issue"}, status=status.HTTP_400_BAD_REQUEST)
inbox_issue.delete() inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT) return Response(status=status.HTTP_204_NO_CONTENT)

File diff suppressed because it is too large


@@ -1,73 +1,53 @@
# Python imports # Python imports
import json import json
# Django Imports # Django imports
from django.db.models import Count, Prefetch, Q, F, Func, OuterRef
from django.utils import timezone from django.utils import timezone
from django.db import IntegrityError
from django.db.models import Prefetch, F, OuterRef, Func, Exists, Count, Q
from django.core import serializers from django.core import serializers
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
# Third party imports # Third party imports
from rest_framework.response import Response
from rest_framework import status from rest_framework import status
from sentry_sdk import capture_exception from rest_framework.response import Response
# Module imports # Module imports
from . import BaseViewSet from .base import BaseAPIView, WebhookMixin
from plane.app.permissions import ProjectEntityPermission
from plane.db.models import (
Project,
Module,
ModuleLink,
Issue,
ModuleIssue,
IssueAttachment,
IssueLink,
)
from plane.api.serializers import ( from plane.api.serializers import (
ModuleWriteSerializer,
ModuleSerializer, ModuleSerializer,
ModuleIssueSerializer, ModuleIssueSerializer,
ModuleLinkSerializer, IssueSerializer,
ModuleFavoriteSerializer,
IssueStateSerializer,
)
from plane.api.permissions import ProjectEntityPermission
from plane.db.models import (
Module,
ModuleIssue,
Project,
Issue,
ModuleLink,
ModuleFavorite,
IssueLink,
IssueAttachment,
) )
from plane.bgtasks.issue_activites_task import issue_activity from plane.bgtasks.issue_activites_task import issue_activity
from plane.utils.grouper import group_results
from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot
class ModuleViewSet(BaseViewSet): class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to module.
"""
model = Module model = Module
permission_classes = [ permission_classes = [
ProjectEntityPermission, ProjectEntityPermission,
] ]
serializer_class = ModuleSerializer
def get_serializer_class(self): webhook_event = "module"
return (
ModuleWriteSerializer
if self.action in ["create", "update", "partial_update"]
else ModuleSerializer
)
def get_queryset(self): def get_queryset(self):
subquery = ModuleFavorite.objects.filter(
user=self.request.user,
module_id=OuterRef("pk"),
project_id=self.kwargs.get("project_id"),
workspace__slug=self.kwargs.get("slug"),
)
return ( return (
super() Module.objects.filter(project_id=self.kwargs.get("project_id"))
.get_queryset()
.filter(project_id=self.kwargs.get("project_id"))
.filter(workspace__slug=self.kwargs.get("slug")) .filter(workspace__slug=self.kwargs.get("slug"))
.annotate(is_favorite=Exists(subquery))
.select_related("project") .select_related("project")
.select_related("workspace") .select_related("workspace")
.select_related("lead") .select_related("lead")
@@ -137,130 +117,51 @@ class ModuleViewSet(BaseViewSet):
), ),
) )
) )
.order_by("-is_favorite","-created_at") .order_by(self.kwargs.get("order_by", "-created_at"))
) )
def create(self, request, slug, project_id): def post(self, request, slug, project_id):
project = Project.objects.get(workspace__slug=slug, pk=project_id) project = Project.objects.get(workspace__slug=slug, pk=project_id)
serializer = ModuleWriteSerializer( serializer = ModuleSerializer(data=request.data, context={"project": project})
data=request.data, context={"project": project}
)
if serializer.is_valid(): if serializer.is_valid():
serializer.save() serializer.save()
module = Module.objects.get(pk=serializer.data["id"]) module = Module.objects.get(pk=serializer.data["id"])
serializer = ModuleSerializer(module) serializer = ModuleSerializer(module)
return Response(serializer.data, status=status.HTTP_201_CREATED) return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST) return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def retrieve(self, request, slug, project_id, pk): def patch(self, request, slug, project_id, pk):
queryset = self.get_queryset().get(pk=pk) module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
serializer = ModuleSerializer(module, data=request.data)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
assignee_distribution = ( def get(self, request, slug, project_id, pk=None):
Issue.objects.filter( if pk:
issue_module__module_id=pk, queryset = self.get_queryset().get(pk=pk)
workspace__slug=slug, data = ModuleSerializer(
project_id=project_id, queryset,
fields=self.fields,
expand=self.expand,
).data
return Response(
data,
status=status.HTTP_200_OK,
) )
.annotate(first_name=F("assignees__first_name")) return self.paginate(
.annotate(last_name=F("assignees__last_name")) request=request,
.annotate(assignee_id=F("assignees__id")) queryset=(self.get_queryset()),
.annotate(display_name=F("assignees__display_name")) on_results=lambda modules: ModuleSerializer(
.annotate(avatar=F("assignees__avatar")) modules,
.values("first_name", "last_name", "assignee_id", "avatar", "display_name") many=True,
.annotate( fields=self.fields,
total_issues=Count( expand=self.expand,
"assignee_id", ).data,
filter=Q(
archived_at__isnull=True,
is_draft=False,
),
)
)
.annotate(
completed_issues=Count(
"assignee_id",
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
is_draft=False,
),
)
)
.annotate(
pending_issues=Count(
"assignee_id",
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
is_draft=False,
),
)
)
.order_by("first_name", "last_name")
) )
label_distribution = ( def delete(self, request, slug, project_id, pk):
Issue.objects.filter(
issue_module__module_id=pk,
workspace__slug=slug,
project_id=project_id,
)
.annotate(label_name=F("labels__name"))
.annotate(color=F("labels__color"))
.annotate(label_id=F("labels__id"))
.values("label_name", "color", "label_id")
.annotate(
total_issues=Count(
"label_id",
filter=Q(
archived_at__isnull=True,
is_draft=False,
),
),
)
.annotate(
completed_issues=Count(
"label_id",
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
is_draft=False,
),
)
)
.annotate(
pending_issues=Count(
"label_id",
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
is_draft=False,
),
)
)
.order_by("label_name")
)
data = ModuleSerializer(queryset).data
data["distribution"] = {
"assignees": assignee_distribution,
"labels": label_distribution,
"completion_chart": {},
}
if queryset.start_date and queryset.target_date:
data["distribution"]["completion_chart"] = burndown_plot(
queryset=queryset, slug=slug, project_id=project_id, module_id=pk
)
return Response(
data,
status=status.HTTP_200_OK,
)
def destroy(self, request, slug, project_id, pk):
module = Module.objects.get(workspace__slug=slug, project_id=project_id, pk=pk) module = Module.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
module_issues = list( module_issues = list(
ModuleIssue.objects.filter(module_id=pk).values_list("issue", flat=True) ModuleIssue.objects.filter(module_id=pk).values_list("issue", flat=True)
@@ -275,7 +176,7 @@ class ModuleViewSet(BaseViewSet):
} }
), ),
actor_id=str(request.user.id), actor_id=str(request.user.id),
issue_id=str(pk), issue_id=None,
project_id=str(project_id), project_id=str(project_id),
current_instance=None, current_instance=None,
epoch=int(timezone.now().timestamp()), epoch=int(timezone.now().timestamp()),
@@ -284,24 +185,25 @@ class ModuleViewSet(BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT) return Response(status=status.HTTP_204_NO_CONTENT)
class ModuleIssueViewSet(BaseViewSet): class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to module issues.
"""
serializer_class = ModuleIssueSerializer serializer_class = ModuleIssueSerializer
model = ModuleIssue model = ModuleIssue
webhook_event = "module_issue"
filterset_fields = [ bulk = True
"issue__labels__id",
"issue__assignees__id",
]
permission_classes = [ permission_classes = [
ProjectEntityPermission, ProjectEntityPermission,
] ]
def get_queryset(self): def get_queryset(self):
return self.filter_queryset( return (
super() ModuleIssue.objects.annotate(
.get_queryset()
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue")) sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue"))
.order_by() .order_by()
.annotate(count=Func(F("id"), function="Count")) .annotate(count=Func(F("id"), function="Count"))
@@ -317,15 +219,12 @@ class ModuleIssueViewSet(BaseViewSet):
.select_related("issue", "issue__state", "issue__project") .select_related("issue", "issue__state", "issue__project")
.prefetch_related("issue__assignees", "issue__labels") .prefetch_related("issue__assignees", "issue__labels")
.prefetch_related("module__members") .prefetch_related("module__members")
.order_by(self.kwargs.get("order_by", "-created_at"))
.distinct() .distinct()
) )
@method_decorator(gzip_page) def get(self, request, slug, project_id, module_id):
def list(self, request, slug, project_id, module_id):
order_by = request.GET.get("order_by", "created_at") order_by = request.GET.get("order_by", "created_at")
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
filters = issue_filters(request.query_params, "GET")
issues = ( issues = (
Issue.issue_objects.filter(issue_module__module_id=module_id) Issue.issue_objects.filter(issue_module__module_id=module_id)
.annotate( .annotate(
@@ -344,7 +243,6 @@ class ModuleIssueViewSet(BaseViewSet):
.prefetch_related("assignees") .prefetch_related("assignees")
.prefetch_related("labels") .prefetch_related("labels")
.order_by(order_by) .order_by(order_by)
.filter(**filters)
.annotate( .annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id")) link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by() .order_by()
@@ -358,26 +256,18 @@ class ModuleIssueViewSet(BaseViewSet):
.values("count") .values("count")
) )
) )
issues_data = IssueStateSerializer(issues, many=True).data return self.paginate(
request=request,
if sub_group_by and sub_group_by == group_by: queryset=(issues),
return Response( on_results=lambda issues: IssueSerializer(
{"error": "Group by and sub group by cannot be same"}, issues,
status=status.HTTP_400_BAD_REQUEST, many=True,
) fields=self.fields,
expand=self.expand,
if group_by: ).data,
grouped_results = group_results(issues_data, group_by, sub_group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(
issues_data, status=status.HTTP_200_OK
) )
def create(self, request, slug, project_id, module_id): def post(self, request, slug, project_id, module_id):
issues = request.data.get("issues", []) issues = request.data.get("issues", [])
if not len(issues): if not len(issues):
return Response( return Response(
@@ -387,6 +277,10 @@ class ModuleIssueViewSet(BaseViewSet):
workspace__slug=slug, project_id=project_id, pk=module_id workspace__slug=slug, project_id=project_id, pk=module_id
) )
issues = Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk__in=issues
).values_list("id", flat=True)
module_issues = list(ModuleIssue.objects.filter(issue_id__in=issues)) module_issues = list(ModuleIssue.objects.filter(issue_id__in=issues))
update_module_issue_activity = [] update_module_issue_activity = []
@@ -438,7 +332,7 @@ class ModuleIssueViewSet(BaseViewSet):
# Capture Issue Activity # Capture Issue Activity
issue_activity.delay( issue_activity.delay(
type="module.activity.created", type="module.activity.created",
requested_data=json.dumps({"modules_list": issues}), requested_data=json.dumps({"modules_list": str(issues)}),
actor_id=str(self.request.user.id), actor_id=str(self.request.user.id),
issue_id=None, issue_id=None,
project_id=str(self.kwargs.get("project_id", None)), project_id=str(self.kwargs.get("project_id", None)),
@@ -458,9 +352,9 @@ class ModuleIssueViewSet(BaseViewSet):
status=status.HTTP_200_OK, status=status.HTTP_200_OK,
) )
def destroy(self, request, slug, project_id, module_id, pk): def delete(self, request, slug, project_id, module_id, issue_id):
module_issue = ModuleIssue.objects.get( module_issue = ModuleIssue.objects.get(
workspace__slug=slug, project_id=project_id, module_id=module_id, pk=pk workspace__slug=slug, project_id=project_id, module_id=module_id, issue_id=issue_id
) )
module_issue.delete() module_issue.delete()
issue_activity.delay( issue_activity.delay(
@@ -472,67 +366,9 @@ class ModuleIssueViewSet(BaseViewSet):
} }
), ),
actor_id=str(request.user.id), actor_id=str(request.user.id),
issue_id=str(pk), issue_id=str(issue_id),
project_id=str(project_id), project_id=str(project_id),
current_instance=None, current_instance=None,
epoch=int(timezone.now().timestamp()), epoch=int(timezone.now().timestamp()),
) )
return Response(status=status.HTTP_204_NO_CONTENT) return Response(status=status.HTTP_204_NO_CONTENT)
class ModuleLinkViewSet(BaseViewSet):
permission_classes = [
ProjectEntityPermission,
]
model = ModuleLink
serializer_class = ModuleLinkSerializer
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"),
module_id=self.kwargs.get("module_id"),
)
def get_queryset(self):
return (
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(module_id=self.kwargs.get("module_id"))
.filter(project__project_projectmember__member=self.request.user)
.order_by("-created_at")
.distinct()
)
class ModuleFavoriteViewSet(BaseViewSet):
serializer_class = ModuleFavoriteSerializer
model = ModuleFavorite
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(user=self.request.user)
.select_related("module")
)
def create(self, request, slug, project_id):
serializer = ModuleFavoriteSerializer(data=request.data)
if serializer.is_valid():
serializer.save(user=request.user, project_id=project_id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def destroy(self, request, slug, project_id, module_id):
module_favorite = ModuleFavorite.objects.get(
project=project_id,
user=request.user,
workspace__slug=slug,
module_id=module_id,
)
module_favorite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)


@@ -1,298 +0,0 @@
# Python imports
import uuid
import requests
import os
# Django imports
from django.utils import timezone
from django.conf import settings
# Third Party modules
from rest_framework.response import Response
from rest_framework import exceptions
from rest_framework.permissions import AllowAny
from rest_framework_simplejwt.tokens import RefreshToken
from rest_framework import status
from sentry_sdk import capture_exception
# sso authentication
from google.oauth2 import id_token
from google.auth.transport import requests as google_auth_request
# Module imports
from plane.db.models import SocialLoginConnection, User
from plane.api.serializers import UserSerializer
from .base import BaseAPIView
def get_tokens_for_user(user):
refresh = RefreshToken.for_user(user)
return (
str(refresh.access_token),
str(refresh),
)
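A short usage sketch for the helper above: given an existing user, simplejwt's RefreshToken yields the access/refresh pair that the endpoint later returns to the client. The user lookup below is illustrative and assumes a configured Django project:

from plane.db.models import User

user = User.objects.get(email="someone@example.com")  # hypothetical lookup
access_token, refresh_token = get_tokens_for_user(user)
# access_token: short-lived JWT sent in Authorization headers
# refresh_token: longer-lived token used to mint new access tokens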
def validate_google_token(token, client_id):
try:
id_info = id_token.verify_oauth2_token(
token, google_auth_request.Request(), client_id
)
email = id_info.get("email")
first_name = id_info.get("given_name")
last_name = id_info.get("family_name", "")
data = {
"email": email,
"first_name": first_name,
"last_name": last_name,
}
return data
except Exception as e:
capture_exception(e)
raise exceptions.AuthenticationFailed("Error with Google connection.")
def get_access_token(request_token: str, client_id: str) -> str:
"""Obtain the request token from github.
Given the client id, client secret and request issued out by GitHub, this method
should give back an access token
Parameters
----------
CLIENT_ID: str
A string representing the client id issued out by github
CLIENT_SECRET: str
A string representing the client secret issued out by github
request_token: str
A string representing the request token issued out by github
Throws
------
ValueError:
if CLIENT_ID or CLIENT_SECRET or request_token is empty or not a string
Returns
-------
access_token: str
A string representing the access token issued out by github
"""
if not request_token:
raise ValueError("The request token has to be supplied!")
CLIENT_SECRET = os.environ.get("GITHUB_CLIENT_SECRET")
url = f"https://github.com/login/oauth/access_token?client_id={client_id}&client_secret={CLIENT_SECRET}&code={request_token}"
headers = {"accept": "application/json"}
res = requests.post(url, headers=headers)
data = res.json()
access_token = data["access_token"]
return access_token
def get_user_data(access_token: str) -> dict:
"""
Obtain the user data from github.
Given the access token, this method should give back the user data
"""
if not access_token:
raise ValueError("The access token has to be supplied!")
if not isinstance(access_token, str):
raise ValueError("The access token has to be a string!")
access_token = "token " + access_token
url = "https://api.github.com/user"
headers = {"Authorization": access_token}
resp = requests.get(url=url, headers=headers)
user_data = resp.json()
response = requests.get(
url="https://api.github.com/user/emails", headers=headers
).json()
_ = [
user_data.update({"email": item.get("email")})
for item in response
if item.get("primary") is True
]
return user_data
class OauthEndpoint(BaseAPIView):
permission_classes = [AllowAny]
def post(self, request):
try:
medium = request.data.get("medium", False)
id_token = request.data.get("credential", False)
client_id = request.data.get("clientId", False)
if not medium or not id_token:
return Response(
{
"error": "Something went wrong. Please try again later or contact the support team."
},
status=status.HTTP_400_BAD_REQUEST,
)
if medium == "google":
data = validate_google_token(id_token, client_id)
if medium == "github":
access_token = get_access_token(id_token, client_id)
data = get_user_data(access_token)
email = data.get("email", None)
if email is None:
return Response(
{
"error": "Something went wrong. Please try again later or contact the support team."
},
status=status.HTTP_400_BAD_REQUEST,
)
if "@" in email:
user = User.objects.get(email=email)
email = data["email"]
mobile_number = uuid.uuid4().hex
email_verified = True
else:
return Response(
{
"error": "Something went wrong. Please try again later or contact the support team."
},
status=status.HTTP_400_BAD_REQUEST,
)
## Login Case
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
user.last_login_medium = "oauth"
user.last_login_uagent = request.META.get("HTTP_USER_AGENT")
user.is_email_verified = email_verified
user.save()
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
SocialLoginConnection.objects.update_or_create(
medium=medium,
extra_data={},
user=user,
defaults={
"token_data": {"id_token": id_token},
"last_login_at": timezone.now(),
},
)
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
return Response(data, status=status.HTTP_200_OK)
except User.DoesNotExist:
## Signup Case
username = uuid.uuid4().hex
if "@" in email:
email = data["email"]
mobile_number = uuid.uuid4().hex
email_verified = True
else:
return Response(
{
"error": "Something went wrong. Please try again later or contact the support team."
},
status=status.HTTP_400_BAD_REQUEST,
)
user = User(
username=username,
email=email,
mobile_number=mobile_number,
first_name=data.get("first_name", ""),
last_name=data.get("last_name", ""),
is_email_verified=email_verified,
is_password_autoset=True,
)
user.set_password(uuid.uuid4().hex)
user.is_password_autoset = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
user.last_login_medium = "oauth"
user.last_login_uagent = request.META.get("HTTP_USER_AGENT")
user.token_updated_at = timezone.now()
user.save()
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
SocialLoginConnection.objects.update_or_create(
medium=medium,
extra_data={},
user=user,
defaults={
"token_data": {"id_token": id_token},
"last_login_at": timezone.now(),
},
)
return Response(data, status=status.HTTP_201_CREATED)
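
A rough sketch of how a client would exercise this endpoint; the URL path and token values are placeholders, not taken from this diff. For Google, credential is the ID token verified by validate_google_token(); for GitHub it is the temporary OAuth code that get_access_token() exchanges server-side before get_user_data() fetches the profile and primary email.
import requests
OAUTH_URL = "https://app.example.com/api/social-auth/"  # hypothetical route
# Google sign-in
resp = requests.post(OAUTH_URL, json={
    "medium": "google",
    "credential": "<google_id_token>",
    "clientId": "<google_client_id>",
})
# GitHub sign-in
resp = requests.post(OAUTH_URL, json={
    "medium": "github",
    "credential": "<github_oauth_code>",
    "clientId": "<github_client_id>",
})
tokens = resp.json()  # {"access_token": ..., "refresh_token": ...} on success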

View File

@ -1,255 +0,0 @@
# Python imports
from datetime import timedelta, date
# Django imports
from django.db.models import Exists, OuterRef, Q, Prefetch
from django.utils import timezone
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from sentry_sdk import capture_exception
# Module imports
from .base import BaseViewSet, BaseAPIView
from plane.api.permissions import ProjectEntityPermission
from plane.db.models import (
Page,
PageBlock,
PageFavorite,
Issue,
IssueAssignee,
IssueActivity,
)
from plane.api.serializers import (
PageSerializer,
PageBlockSerializer,
PageFavoriteSerializer,
IssueLiteSerializer,
)
class PageViewSet(BaseViewSet):
serializer_class = PageSerializer
model = Page
permission_classes = [
ProjectEntityPermission,
]
search_fields = [
"name",
]
def get_queryset(self):
subquery = PageFavorite.objects.filter(
user=self.request.user,
page_id=OuterRef("pk"),
project_id=self.kwargs.get("project_id"),
workspace__slug=self.kwargs.get("slug"),
)
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(project__project_projectmember__member=self.request.user)
.filter(Q(owned_by=self.request.user) | Q(access=0))
.select_related("project")
.select_related("workspace")
.select_related("owned_by")
.annotate(is_favorite=Exists(subquery))
.order_by(self.request.GET.get("order_by", "-created_at"))
.prefetch_related("labels")
.order_by("name", "-is_favorite")
.prefetch_related(
Prefetch(
"blocks",
queryset=PageBlock.objects.select_related(
"page", "issue", "workspace", "project"
),
)
)
.distinct()
)
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"), owned_by=self.request.user
)
def create(self, request, slug, project_id):
serializer = PageSerializer(
data=request.data,
context={"project_id": project_id, "owned_by_id": request.user.id},
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def partial_update(self, request, slug, project_id, pk):
page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)
# Only update access if the page owner is the requesting user
if (
page.access != request.data.get("access", page.access)
and page.owned_by_id != request.user.id
):
return Response(
{
"error": "Access cannot be updated since this page is owned by someone else"
},
status=status.HTTP_400_BAD_REQUEST,
)
serializer = PageSerializer(page, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def list(self, request, slug, project_id):
queryset = self.get_queryset()
page_view = request.GET.get("page_view", False)
if not page_view:
return Response({"error": "Page View parameter is required"}, status=status.HTTP_400_BAD_REQUEST)
# All Pages
if page_view == "all":
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
# Recent pages
if page_view == "recent":
current_time = date.today()
day_before = current_time - timedelta(days=1)
todays_pages = queryset.filter(updated_at__date=date.today())
yesterdays_pages = queryset.filter(updated_at__date=day_before)
earlier_this_week = queryset.filter( updated_at__date__range=(
(timezone.now() - timedelta(days=7)),
(timezone.now() - timedelta(days=2)),
))
return Response(
{
"today": PageSerializer(todays_pages, many=True).data,
"yesterday": PageSerializer(yesterdays_pages, many=True).data,
"earlier_this_week": PageSerializer(earlier_this_week, many=True).data,
},
status=status.HTTP_200_OK,
)
# Favorite Pages
if page_view == "favorite":
queryset = queryset.filter(is_favorite=True)
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
# My pages
if page_view == "created_by_me":
queryset = queryset.filter(owned_by=request.user)
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
# Created by other Pages
if page_view == "created_by_other":
queryset = queryset.filter(~Q(owned_by=request.user), access=0)
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
return Response({"error": "No matching view found"}, status=status.HTTP_400_BAD_REQUEST)
class PageBlockViewSet(BaseViewSet):
serializer_class = PageBlockSerializer
model = PageBlock
permission_classes = [
ProjectEntityPermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(page_id=self.kwargs.get("page_id"))
.filter(project__project_projectmember__member=self.request.user)
.select_related("project")
.select_related("workspace")
.select_related("page")
.select_related("issue")
.order_by("sort_order")
.distinct()
)
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"),
page_id=self.kwargs.get("page_id"),
)
class PageFavoriteViewSet(BaseViewSet):
permission_classes = [
ProjectEntityPermission,
]
serializer_class = PageFavoriteSerializer
model = PageFavorite
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(user=self.request.user)
.select_related("page", "page__owned_by")
)
def create(self, request, slug, project_id):
serializer = PageFavoriteSerializer(data=request.data)
if serializer.is_valid():
serializer.save(user=request.user, project_id=project_id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def destroy(self, request, slug, project_id, page_id):
page_favorite = PageFavorite.objects.get(
project=project_id,
user=request.user,
workspace__slug=slug,
page_id=page_id,
)
page_favorite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class CreateIssueFromPageBlockEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def post(self, request, slug, project_id, page_id, page_block_id):
page_block = PageBlock.objects.get(
pk=page_block_id,
workspace__slug=slug,
project_id=project_id,
page_id=page_id,
)
issue = Issue.objects.create(
name=page_block.name,
project_id=project_id,
description=page_block.description,
description_html=page_block.description_html,
description_stripped=page_block.description_stripped,
)
_ = IssueAssignee.objects.create(
issue=issue, assignee=request.user, project_id=project_id
)
_ = IssueActivity.objects.create(
issue=issue,
actor=request.user,
project_id=project_id,
comment=f"created the issue from {page_block.name} block",
verb="created",
)
page_block.issue = issue
page_block.save()
return Response(IssueLiteSerializer(issue).data, status=status.HTTP_200_OK)
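
The list() handler above branches entirely on the page_view query parameter, so a short consumer sketch helps; the URL below is a placeholder and the auth header is assumed.
import requests
PAGES_URL = "https://app.example.com/api/workspaces/<slug>/projects/<project_id>/pages/"  # hypothetical
headers = {"Authorization": "Bearer <access_token>"}
# page_view is mandatory; one of: all, recent, favorite, created_by_me, created_by_other.
all_pages = requests.get(PAGES_URL, params={"page_view": "all"}, headers=headers).json()
# "recent" returns three buckets keyed today / yesterday / earlier_this_week.
recent = requests.get(PAGES_URL, params={"page_view": "recent"}, headers=headers).json()
today, yesterday, earlier = recent["today"], recent["yesterday"], recent["earlier_this_week"]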

View File

@ -1,121 +1,63 @@
- # Python imports
- import jwt
- import boto3
- from datetime import datetime
# Django imports
- from django.core.exceptions import ValidationError
from django.db import IntegrityError
- from django.db.models import (
- Prefetch,
- Q,
- Exists,
- OuterRef,
- F,
- Func,
- Subquery,
- )
- from django.core.validators import validate_email
- from django.conf import settings
+ from django.db.models import Exists, OuterRef, Q, F, Func, Subquery, Prefetch
- # Third Party imports
+ # Third party imports
- from rest_framework.response import Response
- from rest_framework import status
- from rest_framework import serializers
- from rest_framework.permissions import AllowAny
- from sentry_sdk import capture_exception
+ from rest_framework import status
+ from rest_framework.response import Response
+ from rest_framework.serializers import ValidationError
# Module imports
- from .base import BaseViewSet, BaseAPIView
- from plane.api.serializers import (
- ProjectSerializer,
- ProjectListSerializer,
- ProjectMemberSerializer,
- ProjectDetailSerializer,
- ProjectMemberInviteSerializer,
- ProjectFavoriteSerializer,
- ProjectDeployBoardSerializer,
- ProjectMemberAdminSerializer,
- )
- from plane.api.permissions import (
- ProjectBasePermission,
- ProjectEntityPermission,
- ProjectMemberPermission,
- ProjectLitePermission,
- )
from plane.db.models import (
- Project,
- ProjectMember,
Workspace,
- ProjectMemberInvite,
- User,
- WorkspaceMember,
- State,
- TeamMember,
+ Project,
ProjectFavorite,
- ProjectIdentifier,
- Module,
- Cycle,
- CycleFavorite,
- ModuleFavorite,
- PageFavorite,
- IssueViewFavorite,
- Page,
- IssueAssignee,
- ModuleMember,
- Inbox,
+ ProjectMember,
ProjectDeployBoard,
+ State,
+ Cycle,
+ Module,
IssueProperty,
+ Inbox,
)
+ from plane.app.permissions import ProjectBasePermission
- from plane.bgtasks.project_invitation_task import project_invitation
+ from plane.api.serializers import ProjectSerializer
+ from .base import BaseAPIView, WebhookMixin
- class ProjectViewSet(BaseViewSet):
+ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
+ """Project Endpoints to create, update, list, retrieve and delete endpoint"""
serializer_class = ProjectSerializer
model = Project
+ webhook_event = "project"
permission_classes = [
ProjectBasePermission,
]
- def get_serializer_class(self, *args, **kwargs):
- if self.action in ["update", "partial_update"]:
- return ProjectSerializer
- return ProjectDetailSerializer
def get_queryset(self):
- return self.filter_queryset(
- super()
- .get_queryset()
- .filter(workspace__slug=self.kwargs.get("slug"))
+ return (
+ Project.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(Q(project_projectmember__member=self.request.user) | Q(network=2))
.select_related(
"workspace", "workspace__owner", "default_assignee", "project_lead"
)
- .annotate(
- is_favorite=Exists(
- ProjectFavorite.objects.filter(
- user=self.request.user,
- project_id=OuterRef("pk"),
- workspace__slug=self.kwargs.get("slug"),
- )
- )
- )
.annotate(
is_member=Exists(
ProjectMember.objects.filter(
member=self.request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
+ is_active=True,
)
)
)
.annotate(
total_members=ProjectMember.objects.filter(
- project_id=OuterRef("id"), member__is_bot=False
+ project_id=OuterRef("id"),
+ member__is_bot=False,
+ is_active=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@ -137,6 +79,7 @@ class ProjectViewSet(BaseViewSet):
member_role=ProjectMember.objects.filter(
project_id=OuterRef("pk"),
member_id=self.request.user.id,
+ is_active=True,
).values("role")
)
.annotate(
@ -147,49 +90,46 @@ class ProjectViewSet(BaseViewSet):
)
)
)
+ .order_by(self.kwargs.get("order_by", "-created_at"))
.distinct()
)
- def list(self, request, slug):
+ def get(self, request, slug, project_id=None):
- fields = [field for field in request.GET.get("fields", "").split(",") if field]
+ if project_id is None:
sort_order_query = ProjectMember.objects.filter(
member=request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
+ is_active=True,
).values("sort_order")
projects = (
self.get_queryset()
.annotate(sort_order=Subquery(sort_order_query))
.prefetch_related(
Prefetch(
"project_projectmember",
queryset=ProjectMember.objects.filter(
workspace__slug=slug,
+ is_active=True,
).select_related("member"),
)
)
- .order_by(request.GET.get("order_by", "sort_order"))
+ .order_by("sort_order", "name")
)
- if request.GET.get("per_page", False) and request.GET.get("cursor", False):
return self.paginate(
request=request,
queryset=(projects),
- on_results=lambda projects: ProjectListSerializer(
- projects, many=True
+ on_results=lambda projects: ProjectSerializer(
+ projects, many=True, fields=self.fields, expand=self.expand,
).data,
)
- return Response(
- ProjectListSerializer(
- projects, many=True, fields=fields if fields else None
- ).data
- )
- def create(self, request, slug):
+ project = self.get_queryset().get(workspace__slug=slug, pk=project_id)
+ serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand,)
+ return Response(serializer.data, status=status.HTTP_200_OK)
+ def post(self, request, slug):
try:
workspace = Workspace.objects.get(slug=slug)
serializer = ProjectSerializer(
data={**request.data}, context={"workspace_id": workspace.id}
)
@ -272,7 +212,7 @@ class ProjectViewSet(BaseViewSet):
)
project = self.get_queryset().filter(pk=serializer.data["id"]).first()
- serializer = ProjectListSerializer(project)
+ serializer = ProjectSerializer(project)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(
serializer.errors,
@ -288,17 +228,16 @@ class ProjectViewSet(BaseViewSet):
return Response(
{"error": "Workspace does not exist"}, status=status.HTTP_404_NOT_FOUND
)
- except serializers.ValidationError as e:
+ except ValidationError as e:
return Response(
{"identifier": "The project identifier is already taken"},
status=status.HTTP_410_GONE,
)
- def partial_update(self, request, slug, pk=None):
+ def patch(self, request, slug, project_id=None):
try:
workspace = Workspace.objects.get(slug=slug)
- project = Project.objects.get(pk=pk)
+ project = Project.objects.get(pk=project_id)
serializer = ProjectSerializer(
project,
@ -319,15 +258,14 @@ class ProjectViewSet(BaseViewSet):
name="Triage",
group="backlog",
description="Default state for managing all Inbox Issues",
- project_id=pk,
+ project_id=project_id,
color="#ff7700",
)
project = self.get_queryset().filter(pk=serializer.data["id"]).first()
- serializer = ProjectListSerializer(project)
+ serializer = ProjectSerializer(project)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError as e:
if "already exists" in str(e):
return Response(
@ -338,710 +276,13 @@ class ProjectViewSet(BaseViewSet):
return Response(
{"error": "Project does not exist"}, status=status.HTTP_404_NOT_FOUND
)
- except serializers.ValidationError as e:
+ except ValidationError as e:
return Response(
{"identifier": "The project identifier is already taken"},
status=status.HTTP_410_GONE,
)
class InviteProjectEndpoint(BaseAPIView):
permission_classes = [
ProjectBasePermission,
]
def post(self, request, slug, project_id):
email = request.data.get("email", False)
role = request.data.get("role", False)
# Check if email is provided
if not email:
return Response(
{"error": "Email is required"}, status=status.HTTP_400_BAD_REQUEST
)
validate_email(email)
# Check if the user is already a member of the project
if ProjectMember.objects.filter(
project_id=project_id,
member__email=email,
member__is_bot=False,
).exists():
return Response(
{"error": "User is already member of workspace"},
status=status.HTTP_400_BAD_REQUEST,
)
user = User.objects.filter(email=email).first()
if user is None:
token = jwt.encode(
{"email": email, "timestamp": datetime.now().timestamp()},
settings.SECRET_KEY,
algorithm="HS256",
)
project_invitation_obj = ProjectMemberInvite.objects.create(
email=email.strip().lower(),
project_id=project_id,
token=token,
role=role,
)
domain = settings.WEB_URL
project_invitation.delay(email, project_id, token, domain)
return Response(
{
"message": "Email sent successfully",
"id": project_invitation_obj.id,
},
status=status.HTTP_200_OK,
)
project_member = ProjectMember.objects.create(
member=user, project_id=project_id, role=role
)
_ = IssueProperty.objects.create(user=user, project_id=project_id)
return Response(
ProjectMemberSerializer(project_member).data, status=status.HTTP_200_OK
)
class UserProjectInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(email=self.request.user.email)
.select_related("workspace", "workspace__owner", "project")
)
def create(self, request):
invitations = request.data.get("invitations")
project_invitations = ProjectMemberInvite.objects.filter(
pk__in=invitations, accepted=True
)
ProjectMember.objects.bulk_create(
[
ProjectMember(
project=invitation.project,
workspace=invitation.project.workspace,
member=request.user,
role=invitation.role,
created_by=request.user,
)
for invitation in project_invitations
]
)
IssueProperty.objects.bulk_create(
[
IssueProperty(
project=invitation.project,
workspace=invitation.project.workspace,
user=request.user,
created_by=request.user,
)
for invitation in project_invitations
]
)
# Delete joined project invites
project_invitations.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectMemberViewSet(BaseViewSet):
serializer_class = ProjectMemberAdminSerializer
model = ProjectMember
permission_classes = [
ProjectMemberPermission,
]
search_fields = [
"member__display_name",
"member__first_name",
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(member__is_bot=False)
.select_related("project")
.select_related("member")
.select_related("workspace", "workspace__owner")
)
def create(self, request, slug, project_id):
members = request.data.get("members", [])
# get the project
project = Project.objects.get(pk=project_id, workspace__slug=slug)
if not len(members):
return Response(
{"error": "Atleast one member is required"},
status=status.HTTP_400_BAD_REQUEST,
)
bulk_project_members = []
bulk_issue_props = []
project_members = (
ProjectMember.objects.filter(
workspace__slug=slug,
member_id__in=[member.get("member_id") for member in members],
)
.values("member_id", "sort_order")
.order_by("sort_order")
)
for member in members:
sort_order = [
project_member.get("sort_order")
for project_member in project_members
if str(project_member.get("member_id")) == str(member.get("member_id"))
]
bulk_project_members.append(
ProjectMember(
member_id=member.get("member_id"),
role=member.get("role", 10),
project_id=project_id,
workspace_id=project.workspace_id,
sort_order=sort_order[0] - 10000 if len(sort_order) else 65535,
)
)
bulk_issue_props.append(
IssueProperty(
user_id=member.get("member_id"),
project_id=project_id,
workspace_id=project.workspace_id,
)
)
project_members = ProjectMember.objects.bulk_create(
bulk_project_members,
batch_size=10,
ignore_conflicts=True,
)
_ = IssueProperty.objects.bulk_create(
bulk_issue_props, batch_size=10, ignore_conflicts=True
)
serializer = ProjectMemberSerializer(project_members, many=True)
return Response(serializer.data, status=status.HTTP_201_CREATED)
def list(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
member=request.user, workspace__slug=slug, project_id=project_id
)
project_members = ProjectMember.objects.filter(
project_id=project_id,
workspace__slug=slug,
member__is_bot=False,
).select_related("project", "member", "workspace")
if project_member.role > 10:
serializer = ProjectMemberAdminSerializer(project_members, many=True)
else:
serializer = ProjectMemberSerializer(project_members, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
)
if request.user.id == project_member.member_id:
return Response(
{"error": "You cannot update your own role"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check while updating user roles
requested_project_member = ProjectMember.objects.get(
project_id=project_id, workspace__slug=slug, member=request.user
)
if (
"role" in request.data
and int(request.data.get("role", project_member.role))
> requested_project_member.role
):
return Response(
{"error": "You cannot update a role that is higher than your own role"},
status=status.HTTP_400_BAD_REQUEST,
)
serializer = ProjectMemberSerializer(
project_member, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def destroy(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
# check requesting user role
requesting_project_member = ProjectMember.objects.get(
workspace__slug=slug, member=request.user, project_id=project_id
)
if requesting_project_member.role < project_member.role:
return Response(
{"error": "You cannot remove a user having role higher than yourself"},
status=status.HTTP_400_BAD_REQUEST,
)
# Remove all favorites
ProjectFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
CycleFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
ModuleFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
PageFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
IssueViewFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
# Also remove issue from issue assigned
IssueAssignee.objects.filter(
workspace__slug=slug,
project_id=project_id,
assignee=project_member.member,
).delete()
# Remove if module member
ModuleMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=project_member.member,
).delete()
# Delete owned Pages
Page.objects.filter(
workspace__slug=slug,
project_id=project_id,
owned_by=project_member.member,
).delete()
project_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class AddTeamToProjectEndpoint(BaseAPIView):
permission_classes = [
ProjectBasePermission,
]
def post(self, request, slug, project_id):
team_members = TeamMember.objects.filter(
workspace__slug=slug, team__in=request.data.get("teams", [])
).values_list("member", flat=True)
if len(team_members) == 0:
return Response(
{"error": "No such team exists"}, status=status.HTTP_400_BAD_REQUEST
)
workspace = Workspace.objects.get(slug=slug)
project_members = []
issue_props = []
for member in team_members:
project_members.append(
ProjectMember(
project_id=project_id,
member_id=member,
workspace=workspace,
created_by=request.user,
)
)
issue_props.append(
IssueProperty(
project_id=project_id,
user_id=member,
workspace=workspace,
created_by=request.user,
)
)
ProjectMember.objects.bulk_create(
project_members, batch_size=10, ignore_conflicts=True
)
_ = IssueProperty.objects.bulk_create(
issue_props, batch_size=10, ignore_conflicts=True
)
serializer = ProjectMemberSerializer(project_members, many=True)
return Response(serializer.data, status=status.HTTP_201_CREATED)
class ProjectMemberInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.select_related("project")
.select_related("workspace", "workspace__owner")
)
class ProjectMemberInviteDetailViewSet(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.select_related("project")
.select_related("workspace", "workspace__owner")
)
class ProjectIdentifierEndpoint(BaseAPIView):
permission_classes = [
ProjectBasePermission,
]
def get(self, request, slug):
name = request.GET.get("name", "").strip().upper()
if name == "":
return Response(
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
)
exists = ProjectIdentifier.objects.filter(
name=name, workspace__slug=slug
).values("id", "name", "project")
return Response(
{"exists": len(exists), "identifiers": exists},
status=status.HTTP_200_OK,
)
def delete(self, request, slug):
name = request.data.get("name", "").strip().upper()
if name == "":
return Response(
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
)
if Project.objects.filter(identifier=name, workspace__slug=slug).exists():
return Response(
{"error": "Cannot delete an identifier of an existing project"},
status=status.HTTP_400_BAD_REQUEST,
)
ProjectIdentifier.objects.filter(name=name, workspace__slug=slug).delete()
return Response(
status=status.HTTP_204_NO_CONTENT,
)
class ProjectJoinEndpoint(BaseAPIView):
def post(self, request, slug):
project_ids = request.data.get("project_ids", [])
# Get the workspace user role
workspace_member = WorkspaceMember.objects.get(
member=request.user, workspace__slug=slug
)
workspace_role = workspace_member.role
workspace = workspace_member.workspace
ProjectMember.objects.bulk_create(
[
ProjectMember(
project_id=project_id,
member=request.user,
role=20
if workspace_role >= 15
else (15 if workspace_role == 10 else workspace_role),
workspace=workspace,
created_by=request.user,
)
for project_id in project_ids
],
ignore_conflicts=True,
)
IssueProperty.objects.bulk_create(
[
IssueProperty(
project_id=project_id,
user=request.user,
workspace=workspace,
created_by=request.user,
)
for project_id in project_ids
],
ignore_conflicts=True,
)
return Response(
{"message": "Projects joined successfully"},
status=status.HTTP_201_CREATED,
)
class ProjectUserViewsEndpoint(BaseAPIView):
def post(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project_member = ProjectMember.objects.filter(
member=request.user, project=project
).first()
if project_member is None:
return Response({"error": "Forbidden"}, status=status.HTTP_403_FORBIDDEN)
view_props = project_member.view_props
default_props = project_member.default_props
preferences = project_member.preferences
sort_order = project_member.sort_order
project_member.view_props = request.data.get("view_props", view_props)
project_member.default_props = request.data.get("default_props", default_props)
project_member.preferences = request.data.get("preferences", preferences)
project_member.sort_order = request.data.get("sort_order", sort_order)
project_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectMemberUserEndpoint(BaseAPIView):
def get(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
project_id=project_id, workspace__slug=slug, member=request.user
)
serializer = ProjectMemberSerializer(project_member)
return Response(serializer.data, status=status.HTTP_200_OK)
class ProjectFavoritesViewSet(BaseViewSet):
serializer_class = ProjectFavoriteSerializer
model = ProjectFavorite
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(user=self.request.user)
.select_related(
"project", "project__project_lead", "project__default_assignee"
)
.select_related("workspace", "workspace__owner")
)
def perform_create(self, serializer):
serializer.save(user=self.request.user)
def create(self, request, slug):
serializer = ProjectFavoriteSerializer(data=request.data)
if serializer.is_valid():
serializer.save(user=request.user)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def destroy(self, request, slug, project_id):
project_favorite = ProjectFavorite.objects.get(
project=project_id, user=request.user, workspace__slug=slug
)
project_favorite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectDeployBoardViewSet(BaseViewSet):
permission_classes = [
ProjectMemberPermission,
]
serializer_class = ProjectDeployBoardSerializer
model = ProjectDeployBoard
def get_queryset(self):
return (
super()
.get_queryset()
.filter(
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
)
.select_related("project")
)
def create(self, request, slug, project_id):
comments = request.data.get("comments", False)
reactions = request.data.get("reactions", False)
inbox = request.data.get("inbox", None)
votes = request.data.get("votes", False)
views = request.data.get(
"views",
{
"list": True,
"kanban": True,
"calendar": True,
"gantt": True,
"spreadsheet": True,
},
)
project_deploy_board, _ = ProjectDeployBoard.objects.get_or_create(
anchor=f"{slug}/{project_id}",
project_id=project_id,
)
project_deploy_board.comments = comments
project_deploy_board.reactions = reactions
project_deploy_board.inbox = inbox
project_deploy_board.votes = votes
project_deploy_board.views = views
project_deploy_board.save()
serializer = ProjectDeployBoardSerializer(project_deploy_board)
return Response(serializer.data, status=status.HTTP_200_OK)
class ProjectDeployBoardPublicSettingsEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request, slug, project_id):
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
serializer = ProjectDeployBoardSerializer(project_deploy_board)
return Response(serializer.data, status=status.HTTP_200_OK)
class WorkspaceProjectDeployBoardEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request, slug):
projects = (
Project.objects.filter(workspace__slug=slug)
.annotate(
is_public=Exists(
ProjectDeployBoard.objects.filter(
workspace__slug=slug, project_id=OuterRef("pk")
)
)
)
.filter(is_public=True)
).values(
"id",
"identifier",
"name",
"description",
"emoji",
"icon_prop",
"cover_image",
)
return Response(projects, status=status.HTTP_200_OK)
class LeaveProjectEndpoint(BaseAPIView):
permission_classes = [
ProjectLitePermission,
]
def delete(self, request, slug, project_id):
- project_member = ProjectMember.objects.get(
- workspace__slug=slug,
+ project = Project.objects.get(pk=project_id, workspace__slug=slug)
+ project.delete()
member=request.user,
project_id=project_id,
)
# Only Admin case
if (
project_member.role == 20
and ProjectMember.objects.filter(
workspace__slug=slug,
role=20,
project_id=project_id,
).count()
== 1
):
return Response(
{
"error": "You cannot leave the project since you are the only admin of the project you should delete the project"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the member from workspace
project_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectPublicCoverImagesEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request):
files = []
s3 = boto3.client(
"s3",
aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
)
params = {
"Bucket": settings.AWS_S3_BUCKET_NAME,
"Prefix": "static/project-cover/",
}
response = s3.list_objects_v2(**params)
# Extracting file keys from the response
if "Contents" in response:
for content in response["Contents"]:
if not content["Key"].endswith(
"/"
): # This line ensures we're only getting files, not "sub-folders"
files.append(
f"https://{settings.AWS_S3_BUCKET_NAME}.s3.{settings.AWS_REGION}.amazonaws.com/{content['Key']}"
)
return Response(files, status=status.HTTP_200_OK)
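
The rewritten ProjectAPIEndpoint earlier in this file's diff folds list and retrieve into a single get() and relies on cursor pagination plus the fields/expand query parameters. A hedged consumer sketch, assuming the public API is mounted under /api/v1/, is authenticated with the X-Api-Key header introduced later in this changeset, and uses the cursor format shown (all three are assumptions):
import requests
API = "https://app.example.com/api/v1"  # hypothetical base URL
headers = {"X-Api-Key": "<api_token>"}
# List projects, paginated and trimmed to two fields.
page = requests.get(
    f"{API}/workspaces/my-workspace/projects/",
    params={"per_page": 20, "cursor": "20:0:0", "fields": "id,name"},
    headers=headers,
).json()
# Retrieve a single project by id.
project = requests.get(
    f"{API}/workspaces/my-workspace/projects/<project_id>/",
    headers=headers,
).json()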

View File

@ -7,30 +7,24 @@ from django.db.models import Q
# Third party imports
from rest_framework.response import Response
from rest_framework import status
- from sentry_sdk import capture_exception
# Module imports
- from . import BaseViewSet, BaseAPIView
+ from .base import BaseAPIView
from plane.api.serializers import StateSerializer
- from plane.api.permissions import ProjectEntityPermission
+ from plane.app.permissions import ProjectEntityPermission
from plane.db.models import State, Issue
- class StateViewSet(BaseViewSet):
+ class StateAPIEndpoint(BaseAPIView):
serializer_class = StateSerializer
model = State
permission_classes = [
ProjectEntityPermission,
]
- def perform_create(self, serializer):
- serializer.save(project_id=self.kwargs.get("project_id"))
def get_queryset(self):
- return self.filter_queryset(
- super()
- .get_queryset()
- .filter(workspace__slug=self.kwargs.get("slug"))
+ return (
+ State.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(project__project_projectmember__member=self.request.user)
.filter(~Q(name="Triage"))
@ -39,40 +33,32 @@ class StateViewSet(BaseViewSet):
.distinct()
)
- def create(self, request, slug, project_id):
- serializer = StateSerializer(data=request.data)
+ def post(self, request, slug, project_id):
+ serializer = StateSerializer(data=request.data, context={"project_id": project_id})
if serializer.is_valid():
serializer.save(project_id=project_id)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
- def list(self, request, slug, project_id):
- states = StateSerializer(self.get_queryset(), many=True).data
- grouped = request.GET.get("grouped", False)
- if grouped == "true":
- state_dict = {}
- for key, value in groupby(
- sorted(states, key=lambda state: state["group"]),
- lambda state: state.get("group"),
- ):
- state_dict[str(key)] = list(value)
- return Response(state_dict, status=status.HTTP_200_OK)
- return Response(states, status=status.HTTP_200_OK)
+ def get(self, request, slug, project_id, state_id=None):
+ if state_id:
+ serializer = StateSerializer(self.get_queryset().get(pk=state_id))
+ return Response(serializer.data, status=status.HTTP_200_OK)
+ return self.paginate(
+ request=request,
+ queryset=(self.get_queryset()),
+ on_results=lambda states: StateSerializer(
+ states,
+ many=True,
+ fields=self.fields,
+ expand=self.expand,
+ ).data,
+ )
- def mark_as_default(self, request, slug, project_id, pk):
- # Select all the states which are marked as default
- _ = State.objects.filter(
- workspace__slug=slug, project_id=project_id, default=True
- ).update(default=False)
- _ = State.objects.filter(
- workspace__slug=slug, project_id=project_id, pk=pk
- ).update(default=True)
- return Response(status=status.HTTP_204_NO_CONTENT)
- def destroy(self, request, slug, project_id, pk):
+ def delete(self, request, slug, project_id, state_id):
state = State.objects.get(
~Q(name="Triage"),
- pk=pk,
+ pk=state_id,
project_id=project_id,
workspace__slug=slug,
)
@ -81,7 +67,7 @@ class StateViewSet(BaseViewSet):
return Response({"error": "Default state cannot be deleted"}, status=False)
# Check for any issues in the state
- issue_exist = Issue.issue_objects.filter(state=pk).exists()
+ issue_exist = Issue.issue_objects.filter(state=state_id).exists()
if issue_exist:
return Response(
@ -91,3 +77,11 @@ class StateViewSet(BaseViewSet):
state.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
+ def patch(self, request, slug, project_id, state_id=None):
+ state = State.objects.get(workspace__slug=slug, project_id=project_id, pk=state_id)
+ serializer = StateSerializer(state, data=request.data, partial=True)
+ if serializer.is_valid():
+ serializer.save()
+ return Response(serializer.data, status=status.HTTP_200_OK)
+ return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
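
A short usage sketch for the rewritten StateAPIEndpoint; URLs and auth are hypothetical. Note that get() serves both the paginated list and the single-state detail, depending on whether state_id appears in the path.
import requests
API = "https://app.example.com/api/v1"  # hypothetical base URL
headers = {"X-Api-Key": "<api_token>"}
states_url = f"{API}/workspaces/my-workspace/projects/<project_id>/states/"
# Create a state (post), rename it (patch), then delete it (delete).
state = requests.post(states_url, json={"name": "In Review", "group": "started"}, headers=headers).json()
requests.patch(f"{states_url}{state['id']}/", json={"name": "Review"}, headers=headers)
requests.delete(f"{states_url}{state['id']}/", headers=headers)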

View File

@ -1,73 +0,0 @@
# Third party imports
from rest_framework.response import Response
from rest_framework import status
from sentry_sdk import capture_exception
# Module imports
from plane.api.serializers import (
UserSerializer,
IssueActivitySerializer,
UserMeSerializer,
UserMeSettingsSerializer,
)
from plane.api.views.base import BaseViewSet, BaseAPIView
from plane.db.models import (
User,
Workspace,
WorkspaceMemberInvite,
Issue,
IssueActivity,
)
from plane.utils.paginator import BasePaginator
class UserEndpoint(BaseViewSet):
serializer_class = UserSerializer
model = User
def get_object(self):
return self.request.user
def retrieve(self, request):
serialized_data = UserMeSerializer(request.user).data
return Response(
serialized_data,
status=status.HTTP_200_OK,
)
def retrieve_user_settings(self, request):
serialized_data = UserMeSettingsSerializer(request.user).data
return Response(serialized_data, status=status.HTTP_200_OK)
class UpdateUserOnBoardedEndpoint(BaseAPIView):
def patch(self, request):
user = User.objects.get(pk=request.user.id)
user.is_onboarded = request.data.get("is_onboarded", False)
user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)
class UpdateUserTourCompletedEndpoint(BaseAPIView):
def patch(self, request):
user = User.objects.get(pk=request.user.id)
user.is_tour_completed = request.data.get("is_tour_completed", False)
user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)
class UserActivityEndpoint(BaseAPIView, BasePaginator):
def get(self, request, slug):
queryset = IssueActivity.objects.filter(
actor=request.user, workspace__slug=slug
).select_related("actor", "workspace", "issue", "project")
return self.paginate(
request=request,
queryset=queryset,
on_results=lambda issue_activities: IssueActivitySerializer(
issue_activities, many=True
).data,
)

View File

View File

@ -0,0 +1,5 @@
from django.apps import AppConfig
class AppApiConfig(AppConfig):
name = "plane.app"

View File

@ -0,0 +1,47 @@
# Django imports
from django.utils import timezone
from django.db.models import Q
# Third party imports
from rest_framework import authentication
from rest_framework.exceptions import AuthenticationFailed
# Module imports
from plane.db.models import APIToken
class APIKeyAuthentication(authentication.BaseAuthentication):
"""
Authentication with an API Key
"""
www_authenticate_realm = "api"
media_type = "application/json"
auth_header_name = "X-Api-Key"
def get_api_token(self, request):
return request.headers.get(self.auth_header_name)
def validate_api_token(self, token):
try:
api_token = APIToken.objects.get(
Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
token=token,
is_active=True,
)
except APIToken.DoesNotExist:
raise AuthenticationFailed("Given API token is not valid")
# save api token last used
api_token.last_used = timezone.now()
api_token.save(update_fields=["last_used"])
return (api_token.user, api_token.token)
def authenticate(self, request):
token = self.get_api_token(request=request)
if not token:
return None
# Validate the API token
user, token = self.validate_api_token(token)
return user, token
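
A sketch of how this authentication class would be wired up and exercised; the dotted import path and the REST_FRAMEWORK override are assumptions based on where the class sits in this changeset, not something the diff states.
# settings.py (sketch): let DRF try API-key auth for the public API.
REST_FRAMEWORK = {
    "DEFAULT_AUTHENTICATION_CLASSES": [
        "plane.api.middleware.api_authentication.APIKeyAuthentication",  # assumed module path
    ],
}
# Client side: authenticate() reads the X-Api-Key header, rejects expired or
# deactivated tokens, and updates last_used on success.
import requests
requests.get(
    "https://app.example.com/api/v1/workspaces/my-workspace/projects/",  # hypothetical URL
    headers={"X-Api-Key": "<api_token>"},
)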

View File

@ -0,0 +1,17 @@
from .workspace import (
WorkSpaceBasePermission,
WorkspaceOwnerPermission,
WorkSpaceAdminPermission,
WorkspaceEntityPermission,
WorkspaceViewerPermission,
WorkspaceUserPermission,
)
from .project import (
ProjectBasePermission,
ProjectEntityPermission,
ProjectMemberPermission,
ProjectLitePermission,
)

View File

@ -13,14 +13,15 @@ Guest = 5
class ProjectBasePermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return WorkspaceMember.objects.filter(
- workspace__slug=view.workspace_slug, member=request.user
+ workspace__slug=view.workspace_slug,
+ member=request.user,
+ is_active=True,
).exists()
## Only workspace owners or admins can create the projects
@ -29,6 +30,7 @@ class ProjectBasePermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
role__in=[Admin, Member],
+ is_active=True,
).exists()
## Only Project Admins can update project attributes
@ -37,19 +39,21 @@ class ProjectBasePermission(BasePermission):
member=request.user,
role=Admin,
project_id=view.project_id,
+ is_active=True,
).exists()
class ProjectMemberPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return ProjectMember.objects.filter(
- workspace__slug=view.workspace_slug, member=request.user
+ workspace__slug=view.workspace_slug,
+ member=request.user,
+ is_active=True,
).exists()
## Only workspace owners or admins can create the projects
if request.method == "POST":
@ -57,6 +61,7 @@ class ProjectMemberPermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
role__in=[Admin, Member],
+ is_active=True,
).exists()
## Only Project Admins can update project attributes
@ -65,12 +70,12 @@ class ProjectMemberPermission(BasePermission):
member=request.user,
role__in=[Admin, Member],
project_id=view.project_id,
+ is_active=True,
).exists()
class ProjectEntityPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
@ -80,6 +85,7 @@ class ProjectEntityPermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
project_id=view.project_id,
+ is_active=True,
).exists()
## Only project members or admins can create and edit the project attributes
@ -88,11 +94,11 @@ class ProjectEntityPermission(BasePermission):
member=request.user,
role__in=[Admin, Member],
project_id=view.project_id,
+ is_active=True,
).exists()
class ProjectLitePermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
@ -101,4 +107,5 @@ class ProjectLitePermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
project_id=view.project_id,
+ is_active=True,
).exists()
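
These permission classes read workspace_slug and project_id off the view rather than the request, so any view using them has to expose those attributes; in this codebase they presumably come from the URL kwargs on the shared base view. A minimal, purely illustrative sketch:
from rest_framework.views import APIView
from rest_framework.response import Response
class ExampleProjectView(APIView):
    """Illustrative only: shows the attributes ProjectEntityPermission expects."""
    permission_classes = [ProjectEntityPermission]
    def get(self, request, slug, project_id):
        # has_permission() checks ProjectMember rows for this slug/project and,
        # after this change, only active memberships (is_active=True).
        return Response({"ok": True})
    @property
    def workspace_slug(self):
        return self.kwargs.get("slug")
    @property
    def project_id(self):
        return self.kwargs.get("project_id")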

View File

@ -32,15 +32,31 @@ class WorkSpaceBasePermission(BasePermission):
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
+ is_active=True,
).exists()
# allow only owner to delete the workspace
if request.method == "DELETE":
return WorkspaceMember.objects.filter(
- member=request.user, workspace__slug=view.workspace_slug, role=Owner
+ member=request.user,
+ workspace__slug=view.workspace_slug,
+ role=Owner,
+ is_active=True,
).exists()
+ class WorkspaceOwnerPermission(BasePermission):
+ def has_permission(self, request, view):
+ if request.user.is_anonymous:
+ return False
+ return WorkspaceMember.objects.filter(
+ workspace__slug=view.workspace_slug,
+ member=request.user,
+ role=Owner,
+ ).exists()
class WorkSpaceAdminPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
@ -50,6 +66,7 @@ class WorkSpaceAdminPermission(BasePermission):
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
+ is_active=True,
).exists()
@ -63,12 +80,14 @@ class WorkspaceEntityPermission(BasePermission):
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
+ is_active=True,
).exists()
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
+ is_active=True,
).exists()
@ -78,5 +97,19 @@ class WorkspaceViewerPermission(BasePermission):
return False
return WorkspaceMember.objects.filter(
- member=request.user, workspace__slug=view.workspace_slug, role__gte=10
+ member=request.user,
+ workspace__slug=view.workspace_slug,
+ is_active=True,
+ ).exists()
+ class WorkspaceUserPermission(BasePermission):
+ def has_permission(self, request, view):
+ if request.user.is_anonymous:
+ return False
+ return WorkspaceMember.objects.filter(
+ member=request.user,
+ workspace__slug=view.workspace_slug,
+ is_active=True,
).exists()

View File

@ -0,0 +1,104 @@
from .base import BaseSerializer
from .user import (
UserSerializer,
UserLiteSerializer,
ChangePasswordSerializer,
ResetPasswordSerializer,
UserAdminLiteSerializer,
UserMeSerializer,
UserMeSettingsSerializer,
)
from .workspace import (
WorkSpaceSerializer,
WorkSpaceMemberSerializer,
TeamSerializer,
WorkSpaceMemberInviteSerializer,
WorkspaceLiteSerializer,
WorkspaceThemeSerializer,
WorkspaceMemberAdminSerializer,
WorkspaceMemberMeSerializer,
)
from .project import (
ProjectSerializer,
ProjectListSerializer,
ProjectDetailSerializer,
ProjectMemberSerializer,
ProjectMemberInviteSerializer,
ProjectIdentifierSerializer,
ProjectFavoriteSerializer,
ProjectLiteSerializer,
ProjectMemberLiteSerializer,
ProjectDeployBoardSerializer,
ProjectMemberAdminSerializer,
ProjectPublicMemberSerializer,
)
from .state import StateSerializer, StateLiteSerializer
from .view import GlobalViewSerializer, IssueViewSerializer, IssueViewFavoriteSerializer
from .cycle import (
CycleSerializer,
CycleIssueSerializer,
CycleFavoriteSerializer,
CycleWriteSerializer,
)
from .asset import FileAssetSerializer
from .issue import (
IssueCreateSerializer,
IssueActivitySerializer,
IssueCommentSerializer,
IssuePropertySerializer,
IssueAssigneeSerializer,
LabelSerializer,
IssueSerializer,
IssueFlatSerializer,
IssueStateSerializer,
IssueLinkSerializer,
IssueLiteSerializer,
IssueAttachmentSerializer,
IssueSubscriberSerializer,
IssueReactionSerializer,
CommentReactionSerializer,
IssueVoteSerializer,
IssueRelationSerializer,
RelatedIssueSerializer,
IssuePublicSerializer,
)
from .module import (
ModuleWriteSerializer,
ModuleSerializer,
ModuleIssueSerializer,
ModuleLinkSerializer,
ModuleFavoriteSerializer,
)
from .api import APITokenSerializer, APITokenReadSerializer
from .integration import (
IntegrationSerializer,
WorkspaceIntegrationSerializer,
GithubIssueSyncSerializer,
GithubRepositorySerializer,
GithubRepositorySyncSerializer,
GithubCommentSyncSerializer,
SlackProjectSyncSerializer,
)
from .importer import ImporterSerializer
from .page import PageSerializer, PageLogSerializer, SubPageSerializer, PageFavoriteSerializer
from .estimate import (
EstimateSerializer,
EstimatePointSerializer,
EstimateReadSerializer,
)
from .inbox import InboxSerializer, InboxIssueSerializer, IssueStateInboxSerializer
from .analytic import AnalyticViewSerializer
from .notification import NotificationSerializer
from .exporter import ExporterHistorySerializer
from .webhook import WebhookSerializer, WebhookLogSerializer

View File

@ -0,0 +1,31 @@
from .base import BaseSerializer
from plane.db.models import APIToken, APIActivityLog
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = "__all__"
read_only_fields = [
"token",
"expired_at",
"created_at",
"updated_at",
"workspace",
"user",
]
class APITokenReadSerializer(BaseSerializer):
class Meta:
model = APIToken
exclude = ('token',)
class APIActivityLogSerializer(BaseSerializer):
class Meta:
model = APIActivityLog
fields = "__all__"

View File

@ -0,0 +1,58 @@
from rest_framework import serializers
class BaseSerializer(serializers.ModelSerializer):
id = serializers.PrimaryKeyRelatedField(read_only=True)
class DynamicBaseSerializer(BaseSerializer):
def __init__(self, *args, **kwargs):
# If 'fields' is provided in the arguments, remove it and store it separately.
# This is done so as not to pass this custom argument up to the superclass.
fields = kwargs.pop("fields", None)
# Call the initialization of the superclass.
super().__init__(*args, **kwargs)
# If 'fields' was provided, filter the fields of the serializer accordingly.
if fields is not None:
self.fields = self._filter_fields(fields)
def _filter_fields(self, fields):
"""
Adjust the serializer's fields based on the provided 'fields' list.
:param fields: List or dictionary specifying which fields to include in the serializer.
:return: The updated fields for the serializer.
"""
# Check each field_name in the provided fields.
for field_name in fields:
# If the field is a dictionary (indicating nested fields),
# loop through its keys and values.
if isinstance(field_name, dict):
for key, value in field_name.items():
# If the value of this nested field is a list,
# perform a recursive filter on it.
if isinstance(value, list):
self._filter_fields(self.fields[key], value)
# Create a list to store allowed fields.
allowed = []
for item in fields:
# If the item is a string, it directly represents a field's name.
if isinstance(item, str):
allowed.append(item)
# If the item is a dictionary, it represents a nested field.
# Add the key of this dictionary to the allowed list.
elif isinstance(item, dict):
allowed.append(list(item.keys())[0])
# Convert the current serializer's fields and the allowed fields to sets.
existing = set(self.fields)
allowed = set(allowed)
# Remove fields from the serializer that aren't in the 'allowed' list.
for field_name in (existing - allowed):
self.fields.pop(field_name)
return self.fields
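
The net effect of DynamicBaseSerializer is that callers can pass a fields kwarg and get back only those keys in the serialized output. A minimal usage sketch follows; it assumes a configured Django/DRF environment where a DynamicBaseSerializer subclass such as IssueLiteSerializer is importable and `issues` is an existing queryset, and it sticks to the flat string form, since the nested-dict branch above calls the one-argument _filter_fields with two arguments and would need a signature change before it could recurse.

# Illustrative usage only: assumes a working Django project where
# IssueLiteSerializer (a DynamicBaseSerializer subclass) is importable
# and `issues` is an Issue queryset.
from plane.app.serializers import IssueLiteSerializer

serializer = IssueLiteSerializer(
    issues,
    many=True,
    fields=["id", "name", "priority", "state_detail"],
)
data = serializer.data  # each item now contains only the four requested keys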

View File

@ -0,0 +1,107 @@
# Third party imports
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .user import UserLiteSerializer
from .issue import IssueStateSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectLiteSerializer
from plane.db.models import Cycle, CycleIssue, CycleFavorite
class CycleWriteSerializer(BaseSerializer):
def validate(self, data):
if (
data.get("start_date", None) is not None
and data.get("end_date", None) is not None
and data.get("start_date", None) > data.get("end_date", None)
):
raise serializers.ValidationError("Start date cannot exceed end date")
return data
class Meta:
model = Cycle
fields = "__all__"
class CycleSerializer(BaseSerializer):
owned_by = UserLiteSerializer(read_only=True)
is_favorite = serializers.BooleanField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True)
completed_issues = serializers.IntegerField(read_only=True)
started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True)
assignees = serializers.SerializerMethodField(read_only=True)
total_estimates = serializers.IntegerField(read_only=True)
completed_estimates = serializers.IntegerField(read_only=True)
started_estimates = serializers.IntegerField(read_only=True)
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
def validate(self, data):
if (
data.get("start_date", None) is not None
and data.get("end_date", None) is not None
and data.get("start_date", None) > data.get("end_date", None)
):
raise serializers.ValidationError("Start date cannot exceed end date")
return data
def get_assignees(self, obj):
members = [
{
"avatar": assignee.avatar,
"display_name": assignee.display_name,
"id": assignee.id,
}
for issue_cycle in obj.issue_cycle.prefetch_related(
"issue__assignees"
).all()
for assignee in issue_cycle.issue.assignees.all()
]
# Use a set comprehension to return only the unique objects
unique_objects = {frozenset(item.items()) for item in members}
# Convert the set back to a list of dictionaries
unique_list = [dict(item) for item in unique_objects]
return unique_list
class Meta:
model = Cycle
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"owned_by",
]
class CycleIssueSerializer(BaseSerializer):
issue_detail = IssueStateSerializer(read_only=True, source="issue")
sub_issues_count = serializers.IntegerField(read_only=True)
class Meta:
model = CycleIssue
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"cycle",
]
class CycleFavoriteSerializer(BaseSerializer):
cycle_detail = CycleSerializer(source="cycle", read_only=True)
class Meta:
model = CycleFavorite
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"user",
]
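
One detail worth noting in get_assignees above: plain dicts are not hashable, so each assignee dict is frozen into a frozenset of its items before set-based deduplication and then converted back. A standalone, plain-Python sketch of the same idea with made-up values:

# Plain-Python illustration of the frozenset deduplication used in get_assignees.
members = [
    {"avatar": "", "display_name": "alice", "id": 1},
    {"avatar": "", "display_name": "alice", "id": 1},  # duplicate assignee
    {"avatar": "", "display_name": "bob", "id": 2},
]
unique_objects = {frozenset(item.items()) for item in members}
unique_list = [dict(item) for item in unique_objects]
print(unique_list)  # two dicts remain; set ordering is not guaranteed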

View File

@ -2,7 +2,7 @@
from .base import BaseSerializer
from plane.db.models import Estimate, EstimatePoint
-from plane.api.serializers import WorkspaceLiteSerializer, ProjectLiteSerializer
from plane.app.serializers import WorkspaceLiteSerializer, ProjectLiteSerializer
class EstimateSerializer(BaseSerializer):

View File

@ -0,0 +1,57 @@
# Third party frameworks
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .issue import IssueFlatSerializer, LabelLiteSerializer
from .project import ProjectLiteSerializer
from .state import StateLiteSerializer
from .user import UserLiteSerializer
from plane.db.models import Inbox, InboxIssue, Issue
class InboxSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(source="project", read_only=True)
pending_issue_count = serializers.IntegerField(read_only=True)
class Meta:
model = Inbox
fields = "__all__"
read_only_fields = [
"project",
"workspace",
]
class InboxIssueSerializer(BaseSerializer):
issue_detail = IssueFlatSerializer(source="issue", read_only=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
class Meta:
model = InboxIssue
fields = "__all__"
read_only_fields = [
"project",
"workspace",
]
class InboxIssueLiteSerializer(BaseSerializer):
class Meta:
model = InboxIssue
fields = ["id", "status", "duplicate_to", "snoozed_till", "source"]
read_only_fields = fields
class IssueStateInboxSerializer(BaseSerializer):
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
sub_issues_count = serializers.IntegerField(read_only=True)
bridge_id = serializers.UUIDField(read_only=True)
issue_inbox = InboxIssueLiteSerializer(read_only=True, many=True)
class Meta:
model = Issue
fields = "__all__"

View File

@ -1,5 +1,5 @@
# Module imports
-from plane.api.serializers import BaseSerializer
from plane.app.serializers import BaseSerializer
from plane.db.models import Integration, WorkspaceIntegration

View File

@ -1,5 +1,5 @@
# Module imports
-from plane.api.serializers import BaseSerializer
from plane.app.serializers import BaseSerializer
from plane.db.models import (
GithubIssueSync,
GithubRepository,

View File

@ -1,5 +1,5 @@
# Module imports
-from plane.api.serializers import BaseSerializer
from plane.app.serializers import BaseSerializer
from plane.db.models import SlackProjectSync

View File

@ -0,0 +1,616 @@
# Django imports
from django.utils import timezone
# Third Party imports
from rest_framework import serializers
# Module imports
from .base import BaseSerializer, DynamicBaseSerializer
from .user import UserLiteSerializer
from .state import StateSerializer, StateLiteSerializer
from .project import ProjectLiteSerializer
from .workspace import WorkspaceLiteSerializer
from plane.db.models import (
User,
Issue,
IssueActivity,
IssueComment,
IssueProperty,
IssueAssignee,
IssueSubscriber,
IssueLabel,
Label,
CycleIssue,
Cycle,
Module,
ModuleIssue,
IssueLink,
IssueAttachment,
IssueReaction,
CommentReaction,
IssueVote,
IssueRelation,
)
class IssueFlatSerializer(BaseSerializer):
## Contains only flat fields
class Meta:
model = Issue
fields = [
"id",
"name",
"description",
"description_html",
"priority",
"start_date",
"target_date",
"sequence_id",
"sort_order",
"is_draft",
]
class IssueProjectLiteSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(source="project", read_only=True)
class Meta:
model = Issue
fields = [
"id",
"project_detail",
"name",
"sequence_id",
]
read_only_fields = fields
##TODO: Find a better way to write this serializer
## Find a better approach to save manytomany?
class IssueCreateSerializer(BaseSerializer):
state_detail = StateSerializer(read_only=True, source="state")
created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
assignees = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
labels = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
write_only=True,
required=False,
)
class Meta:
model = Issue
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
def to_representation(self, instance):
data = super().to_representation(instance)
data['assignees'] = [str(assignee.id) for assignee in instance.assignees.all()]
data['labels'] = [str(label.id) for label in instance.labels.all()]
return data
def validate(self, data):
if (
data.get("start_date", None) is not None
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError("Start date cannot exceed target date")
return data
def create(self, validated_data):
assignees = validated_data.pop("assignees", None)
labels = validated_data.pop("labels", None)
project_id = self.context["project_id"]
workspace_id = self.context["workspace_id"]
default_assignee_id = self.context["default_assignee_id"]
issue = Issue.objects.create(**validated_data, project_id=project_id)
# Issue Audit Users
created_by_id = issue.created_by_id
updated_by_id = issue.updated_by_id
if assignees is not None and len(assignees):
IssueAssignee.objects.bulk_create(
[
IssueAssignee(
assignee=user,
issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for user in assignees
],
batch_size=10,
)
else:
# Then assign it to default assignee
if default_assignee_id is not None:
IssueAssignee.objects.create(
assignee_id=default_assignee_id,
issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
if labels is not None and len(labels):
IssueLabel.objects.bulk_create(
[
IssueLabel(
label=label,
issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
],
batch_size=10,
)
return issue
def update(self, instance, validated_data):
assignees = validated_data.pop("assignees", None)
labels = validated_data.pop("labels", None)
# Related models
project_id = instance.project_id
workspace_id = instance.workspace_id
created_by_id = instance.created_by_id
updated_by_id = instance.updated_by_id
if assignees is not None:
IssueAssignee.objects.filter(issue=instance).delete()
IssueAssignee.objects.bulk_create(
[
IssueAssignee(
assignee=user,
issue=instance,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for user in assignees
],
batch_size=10,
)
if labels is not None:
IssueLabel.objects.filter(issue=instance).delete()
IssueLabel.objects.bulk_create(
[
IssueLabel(
label=label,
issue=instance,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
],
batch_size=10,
)
# Update the updated_at timestamp even when only related models change
instance.updated_at = timezone.now()
return super().update(instance, validated_data)
class IssueActivitySerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
class Meta:
model = IssueActivity
fields = "__all__"
class IssuePropertySerializer(BaseSerializer):
class Meta:
model = IssueProperty
fields = "__all__"
read_only_fields = [
"user",
"workspace",
"project",
]
class LabelSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
class Meta:
model = Label
fields = "__all__"
read_only_fields = [
"workspace",
"project",
]
class LabelLiteSerializer(BaseSerializer):
class Meta:
model = Label
fields = [
"id",
"name",
"color",
]
class IssueLabelSerializer(BaseSerializer):
class Meta:
model = IssueLabel
fields = "__all__"
read_only_fields = [
"workspace",
"project",
]
class IssueRelationSerializer(BaseSerializer):
issue_detail = IssueProjectLiteSerializer(read_only=True, source="related_issue")
class Meta:
model = IssueRelation
fields = [
"issue_detail",
"relation_type",
"related_issue",
"issue",
"id"
]
read_only_fields = [
"workspace",
"project",
]
class RelatedIssueSerializer(BaseSerializer):
issue_detail = IssueProjectLiteSerializer(read_only=True, source="issue")
class Meta:
model = IssueRelation
fields = [
"issue_detail",
"relation_type",
"related_issue",
"issue",
"id"
]
read_only_fields = [
"workspace",
"project",
]
class IssueAssigneeSerializer(BaseSerializer):
assignee_details = UserLiteSerializer(read_only=True, source="assignee")
class Meta:
model = IssueAssignee
fields = "__all__"
class CycleBaseSerializer(BaseSerializer):
class Meta:
model = Cycle
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueCycleDetailSerializer(BaseSerializer):
cycle_detail = CycleBaseSerializer(read_only=True, source="cycle")
class Meta:
model = CycleIssue
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class ModuleBaseSerializer(BaseSerializer):
class Meta:
model = Module
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueModuleDetailSerializer(BaseSerializer):
module_detail = ModuleBaseSerializer(read_only=True, source="module")
class Meta:
model = ModuleIssue
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueLinkSerializer(BaseSerializer):
created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
class Meta:
model = IssueLink
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
"issue",
]
# Validate that the URL does not already exist for this issue
def create(self, validated_data):
if IssueLink.objects.filter(
url=validated_data.get("url"), issue_id=validated_data.get("issue_id")
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
)
return IssueLink.objects.create(**validated_data)
class IssueAttachmentSerializer(BaseSerializer):
class Meta:
model = IssueAttachment
fields = "__all__"
read_only_fields = [
"created_by",
"updated_by",
"created_at",
"updated_at",
"workspace",
"project",
"issue",
]
class IssueReactionSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = IssueReaction
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
"actor",
]
class CommentReactionLiteSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = CommentReaction
fields = [
"id",
"reaction",
"comment",
"actor_detail",
]
class CommentReactionSerializer(BaseSerializer):
class Meta:
model = CommentReaction
fields = "__all__"
read_only_fields = ["workspace", "project", "comment", "actor"]
class IssueVoteSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = IssueVote
fields = ["issue", "vote", "workspace", "project", "actor", "actor_detail"]
read_only_fields = fields
class IssueCommentSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
comment_reactions = CommentReactionLiteSerializer(read_only=True, many=True)
is_member = serializers.BooleanField(read_only=True)
class Meta:
model = IssueComment
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueStateFlatSerializer(BaseSerializer):
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
class Meta:
model = Issue
fields = [
"id",
"sequence_id",
"name",
"state_detail",
"project_detail",
]
# Issue Serializer with state details
class IssueStateSerializer(DynamicBaseSerializer):
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
sub_issues_count = serializers.IntegerField(read_only=True)
bridge_id = serializers.UUIDField(read_only=True)
attachment_count = serializers.IntegerField(read_only=True)
link_count = serializers.IntegerField(read_only=True)
class Meta:
model = Issue
fields = "__all__"
class IssueSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateSerializer(read_only=True, source="state")
parent_detail = IssueStateFlatSerializer(read_only=True, source="parent")
label_details = LabelSerializer(read_only=True, source="labels", many=True)
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
related_issues = IssueRelationSerializer(read_only=True, source="issue_relation", many=True)
issue_relations = RelatedIssueSerializer(read_only=True, source="issue_related", many=True)
issue_cycle = IssueCycleDetailSerializer(read_only=True)
issue_module = IssueModuleDetailSerializer(read_only=True)
issue_link = IssueLinkSerializer(read_only=True, many=True)
issue_attachment = IssueAttachmentSerializer(read_only=True, many=True)
sub_issues_count = serializers.IntegerField(read_only=True)
issue_reactions = IssueReactionSerializer(read_only=True, many=True)
class Meta:
model = Issue
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueLiteSerializer(DynamicBaseSerializer):
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateLiteSerializer(read_only=True, source="state")
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
sub_issues_count = serializers.IntegerField(read_only=True)
cycle_id = serializers.UUIDField(read_only=True)
module_id = serializers.UUIDField(read_only=True)
attachment_count = serializers.IntegerField(read_only=True)
link_count = serializers.IntegerField(read_only=True)
issue_reactions = IssueReactionSerializer(read_only=True, many=True)
class Meta:
model = Issue
fields = "__all__"
read_only_fields = [
"start_date",
"target_date",
"completed_at",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssuePublicSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateLiteSerializer(read_only=True, source="state")
reactions = IssueReactionSerializer(read_only=True, many=True, source="issue_reactions")
votes = IssueVoteSerializer(read_only=True, many=True)
class Meta:
model = Issue
fields = [
"id",
"name",
"description_html",
"sequence_id",
"state",
"state_detail",
"project",
"project_detail",
"workspace",
"priority",
"target_date",
"reactions",
"votes",
]
read_only_fields = fields
class IssueSubscriberSerializer(BaseSerializer):
class Meta:
model = IssueSubscriber
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
]
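
For context, IssueCreateSerializer above takes its scoping from the serializer context (project_id, workspace_id, default_assignee_id) rather than from the request body. A hedged sketch of how a view might drive it; `request` and `project` are placeholders for objects the view already has, and a configured Django project is assumed.

# Illustrative only: the context keys mirror the ones read in IssueCreateSerializer.create().
serializer = IssueCreateSerializer(
    data=request.data,
    context={
        "project_id": project.id,
        "workspace_id": project.workspace_id,
        "default_assignee_id": project.default_assignee_id,
    },
)
if serializer.is_valid():
    issue = serializer.save()  # assignees and labels are bulk-created in batches of 10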

View File

@ -0,0 +1,198 @@
# Third Party imports
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .user import UserLiteSerializer
from .project import ProjectLiteSerializer
from .workspace import WorkspaceLiteSerializer
from plane.db.models import (
User,
Module,
ModuleMember,
ModuleIssue,
ModuleLink,
ModuleFavorite,
)
class ModuleWriteSerializer(BaseSerializer):
members = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
class Meta:
model = Module
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
def to_representation(self, instance):
data = super().to_representation(instance)
data['members'] = [str(member.id) for member in instance.members.all()]
return data
def validate(self, data):
if data.get("start_date", None) is not None and data.get("target_date", None) is not None and data.get("start_date", None) > data.get("target_date", None):
raise serializers.ValidationError("Start date cannot exceed target date")
return data
def create(self, validated_data):
members = validated_data.pop("members", None)
project = self.context["project"]
module = Module.objects.create(**validated_data, project=project)
if members is not None:
ModuleMember.objects.bulk_create(
[
ModuleMember(
module=module,
member=member,
project=project,
workspace=project.workspace,
created_by=module.created_by,
updated_by=module.updated_by,
)
for member in members
],
batch_size=10,
ignore_conflicts=True,
)
return module
def update(self, instance, validated_data):
members = validated_data.pop("members", None)
if members is not None:
ModuleMember.objects.filter(module=instance).delete()
ModuleMember.objects.bulk_create(
[
ModuleMember(
module=instance,
member=member,
project=instance.project,
workspace=instance.project.workspace,
created_by=instance.created_by,
updated_by=instance.updated_by,
)
for member in members
],
batch_size=10,
ignore_conflicts=True,
)
return super().update(instance, validated_data)
class ModuleFlatSerializer(BaseSerializer):
class Meta:
model = Module
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class ModuleIssueSerializer(BaseSerializer):
module_detail = ModuleFlatSerializer(read_only=True, source="module")
issue_detail = ProjectLiteSerializer(read_only=True, source="issue")
sub_issues_count = serializers.IntegerField(read_only=True)
class Meta:
model = ModuleIssue
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
"module",
]
class ModuleLinkSerializer(BaseSerializer):
created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
class Meta:
model = ModuleLink
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
"module",
]
# Validate that the URL does not already exist for this module
def create(self, validated_data):
if ModuleLink.objects.filter(
url=validated_data.get("url"), module_id=validated_data.get("module_id")
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
)
return ModuleLink.objects.create(**validated_data)
class ModuleSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project")
lead_detail = UserLiteSerializer(read_only=True, source="lead")
members_detail = UserLiteSerializer(read_only=True, many=True, source="members")
link_module = ModuleLinkSerializer(read_only=True, many=True)
is_favorite = serializers.BooleanField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True)
completed_issues = serializers.IntegerField(read_only=True)
started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True)
class Meta:
model = Module
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class ModuleFavoriteSerializer(BaseSerializer):
module_detail = ModuleFlatSerializer(source="module", read_only=True)
class Meta:
model = ModuleFavorite
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"user",
]

View File

@ -6,28 +6,7 @@ from .base import BaseSerializer
from .issue import IssueFlatSerializer, LabelLiteSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectLiteSerializer
-from plane.db.models import Page, PageBlock, PageFavorite, PageLabel, Label
from plane.db.models import Page, PageLog, PageFavorite, PageLabel, Label, Issue, Module
-class PageBlockSerializer(BaseSerializer):
-issue_detail = IssueFlatSerializer(source="issue", read_only=True)
-project_detail = ProjectLiteSerializer(source="project", read_only=True)
-workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
-class Meta:
-model = PageBlock
-fields = "__all__"
-read_only_fields = [
-"workspace",
-"project",
-"page",
-]
-class PageBlockLiteSerializer(BaseSerializer):
-class Meta:
-model = PageBlock
-fields = "__all__"
class PageSerializer(BaseSerializer):
@ -38,7 +17,6 @@ class PageSerializer(BaseSerializer):
write_only=True,
required=False,
)
-blocks = PageBlockLiteSerializer(read_only=True, many=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
@ -102,6 +80,41 @@ class PageSerializer(BaseSerializer):
return super().update(instance, validated_data)
class SubPageSerializer(BaseSerializer):
entity_details = serializers.SerializerMethodField()
class Meta:
model = PageLog
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"page",
]
def get_entity_details(self, obj):
entity_name = obj.entity_name
if entity_name == 'forward_link' or entity_name == 'back_link':
try:
page = Page.objects.get(pk=obj.entity_identifier)
return PageSerializer(page).data
except Page.DoesNotExist:
return None
return None
class PageLogSerializer(BaseSerializer):
class Meta:
model = PageLog
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"page",
]
class PageFavoriteSerializer(BaseSerializer):
page_detail = PageSerializer(source="page", read_only=True)

View File

@ -0,0 +1,217 @@
# Third party imports
from rest_framework import serializers
# Module imports
from .base import BaseSerializer, DynamicBaseSerializer
from plane.app.serializers.workspace import WorkspaceLiteSerializer
from plane.app.serializers.user import UserLiteSerializer, UserAdminLiteSerializer
from plane.db.models import (
Project,
ProjectMember,
ProjectMemberInvite,
ProjectIdentifier,
ProjectFavorite,
ProjectDeployBoard,
ProjectPublicMember,
)
class ProjectSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
class Meta:
model = Project
fields = "__all__"
read_only_fields = [
"workspace",
]
def create(self, validated_data):
identifier = validated_data.get("identifier", "").strip().upper()
if identifier == "":
raise serializers.ValidationError(detail="Project Identifier is required")
if ProjectIdentifier.objects.filter(
name=identifier, workspace_id=self.context["workspace_id"]
).exists():
raise serializers.ValidationError(detail="Project Identifier is taken")
project = Project.objects.create(
**validated_data, workspace_id=self.context["workspace_id"]
)
_ = ProjectIdentifier.objects.create(
name=project.identifier,
project=project,
workspace_id=self.context["workspace_id"],
)
return project
def update(self, instance, validated_data):
identifier = validated_data.get("identifier", "").strip().upper()
# If identifier is not passed, update the project and return
if identifier == "":
project = super().update(instance, validated_data)
return project
# If no identifier with this name exists, update the project and rename its current identifier record
project_identifier = ProjectIdentifier.objects.filter(
name=identifier, workspace_id=instance.workspace_id
).first()
if project_identifier is None:
project = super().update(instance, validated_data)
project_identifier = ProjectIdentifier.objects.filter(
project=project
).first()
if project_identifier is not None:
project_identifier.name = identifier
project_identifier.save()
return project
# If found, check whether the identifier already belongs to the project being updated
if project_identifier.project_id == instance.id:
# If it is the same project, proceed with the update
project = super().update(instance, validated_data)
return project
# Otherwise the identifier belongs to another project, so fail the update
raise serializers.ValidationError(detail="Project Identifier is already taken")
class ProjectLiteSerializer(BaseSerializer):
class Meta:
model = Project
fields = [
"id",
"identifier",
"name",
"cover_image",
"icon_prop",
"emoji",
"description",
]
read_only_fields = fields
class ProjectListSerializer(DynamicBaseSerializer):
is_favorite = serializers.BooleanField(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_cycles = serializers.IntegerField(read_only=True)
total_modules = serializers.IntegerField(read_only=True)
is_member = serializers.BooleanField(read_only=True)
sort_order = serializers.FloatField(read_only=True)
member_role = serializers.IntegerField(read_only=True)
is_deployed = serializers.BooleanField(read_only=True)
members = serializers.SerializerMethodField()
def get_members(self, obj):
project_members = ProjectMember.objects.filter(
project_id=obj.id,
is_active=True,
).values(
"id",
"member_id",
"member__display_name",
"member__avatar",
)
return list(project_members)
class Meta:
model = Project
fields = "__all__"
class ProjectDetailSerializer(BaseSerializer):
# workspace = WorkSpaceSerializer(read_only=True)
default_assignee = UserLiteSerializer(read_only=True)
project_lead = UserLiteSerializer(read_only=True)
is_favorite = serializers.BooleanField(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_cycles = serializers.IntegerField(read_only=True)
total_modules = serializers.IntegerField(read_only=True)
is_member = serializers.BooleanField(read_only=True)
sort_order = serializers.FloatField(read_only=True)
member_role = serializers.IntegerField(read_only=True)
is_deployed = serializers.BooleanField(read_only=True)
class Meta:
model = Project
fields = "__all__"
class ProjectMemberSerializer(BaseSerializer):
workspace = WorkspaceLiteSerializer(read_only=True)
project = ProjectLiteSerializer(read_only=True)
member = UserLiteSerializer(read_only=True)
class Meta:
model = ProjectMember
fields = "__all__"
class ProjectMemberAdminSerializer(BaseSerializer):
workspace = WorkspaceLiteSerializer(read_only=True)
project = ProjectLiteSerializer(read_only=True)
member = UserAdminLiteSerializer(read_only=True)
class Meta:
model = ProjectMember
fields = "__all__"
class ProjectMemberInviteSerializer(BaseSerializer):
project = ProjectLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
class Meta:
model = ProjectMemberInvite
fields = "__all__"
class ProjectIdentifierSerializer(BaseSerializer):
class Meta:
model = ProjectIdentifier
fields = "__all__"
class ProjectFavoriteSerializer(BaseSerializer):
class Meta:
model = ProjectFavorite
fields = "__all__"
read_only_fields = [
"workspace",
"user",
]
class ProjectMemberLiteSerializer(BaseSerializer):
member = UserLiteSerializer(read_only=True)
is_subscribed = serializers.BooleanField(read_only=True)
class Meta:
model = ProjectMember
fields = ["member", "id", "is_subscribed"]
read_only_fields = fields
class ProjectDeployBoardSerializer(BaseSerializer):
project_details = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
class Meta:
model = ProjectDeployBoard
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"anchor",
]
class ProjectPublicMemberSerializer(BaseSerializer):
class Meta:
model = ProjectPublicMember
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"member",
]
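
ProjectSerializer.create above expects the workspace id in the serializer context, upper-cases the identifier, and creates the matching ProjectIdentifier row alongside the project. A minimal usage sketch, assuming a configured Django project; `workspace` is a placeholder for an already-fetched Workspace instance.

# Illustrative only: the identifier "docs" is normalised to "DOCS" before the uniqueness check.
serializer = ProjectSerializer(
    data={"name": "Docs", "identifier": "docs"},
    context={"workspace_id": workspace.id},
)
if serializer.is_valid():
    project = serializer.save()  # also creates the ProjectIdentifier record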

View File

@ -0,0 +1,28 @@
# Module imports
from .base import BaseSerializer
from plane.db.models import State
class StateSerializer(BaseSerializer):
class Meta:
model = State
fields = "__all__"
read_only_fields = [
"workspace",
"project",
]
class StateLiteSerializer(BaseSerializer):
class Meta:
model = State
fields = [
"id",
"name",
"color",
"group",
]
read_only_fields = fields

View File

@ -0,0 +1,195 @@
# Third party imports
from rest_framework import serializers
# Module import
from .base import BaseSerializer
from plane.db.models import User, Workspace, WorkspaceMemberInvite
from plane.license.models import InstanceAdmin, Instance
class UserSerializer(BaseSerializer):
class Meta:
model = User
fields = "__all__"
read_only_fields = [
"id",
"created_at",
"updated_at",
"is_superuser",
"is_staff",
"last_active",
"last_login_time",
"last_logout_time",
"last_login_ip",
"last_logout_ip",
"last_login_uagent",
"token_updated_at",
"is_onboarded",
"is_bot",
"is_password_autoset",
"is_email_verified",
]
extra_kwargs = {"password": {"write_only": True}}
# If the user has already filled in a first or last name, they are considered onboarded
def get_is_onboarded(self, obj):
return bool(obj.first_name) or bool(obj.last_name)
class UserMeSerializer(BaseSerializer):
class Meta:
model = User
fields = [
"id",
"avatar",
"cover_image",
"date_joined",
"display_name",
"email",
"first_name",
"last_name",
"is_active",
"is_bot",
"is_email_verified",
"is_managed",
"is_onboarded",
"is_tour_completed",
"mobile_number",
"role",
"onboarding_step",
"user_timezone",
"username",
"theme",
"last_workspace_id",
"use_case",
"is_password_autoset",
"is_email_verified",
]
read_only_fields = fields
class UserMeSettingsSerializer(BaseSerializer):
workspace = serializers.SerializerMethodField()
class Meta:
model = User
fields = [
"id",
"email",
"workspace",
]
read_only_fields = fields
def get_workspace(self, obj):
workspace_invites = WorkspaceMemberInvite.objects.filter(
email=obj.email
).count()
if (
obj.last_workspace_id is not None
and Workspace.objects.filter(
pk=obj.last_workspace_id,
workspace_member__member=obj.id,
workspace_member__is_active=True,
).exists()
):
workspace = Workspace.objects.filter(
pk=obj.last_workspace_id,
workspace_member__member=obj.id,
workspace_member__is_active=True,
).first()
return {
"last_workspace_id": obj.last_workspace_id,
"last_workspace_slug": workspace.slug if workspace is not None else "",
"fallback_workspace_id": obj.last_workspace_id,
"fallback_workspace_slug": workspace.slug
if workspace is not None
else "",
"invites": workspace_invites,
}
else:
fallback_workspace = (
Workspace.objects.filter(
workspace_member__member_id=obj.id, workspace_member__is_active=True
)
.order_by("created_at")
.first()
)
return {
"last_workspace_id": None,
"last_workspace_slug": None,
"fallback_workspace_id": fallback_workspace.id
if fallback_workspace is not None
else None,
"fallback_workspace_slug": fallback_workspace.slug
if fallback_workspace is not None
else None,
"invites": workspace_invites,
}
class UserLiteSerializer(BaseSerializer):
class Meta:
model = User
fields = [
"id",
"first_name",
"last_name",
"avatar",
"is_bot",
"display_name",
]
read_only_fields = [
"id",
"is_bot",
]
class UserAdminLiteSerializer(BaseSerializer):
class Meta:
model = User
fields = [
"id",
"first_name",
"last_name",
"avatar",
"is_bot",
"display_name",
"email",
]
read_only_fields = [
"id",
"is_bot",
]
class ChangePasswordSerializer(serializers.Serializer):
model = User
"""
Serializer for password change endpoint.
"""
old_password = serializers.CharField(required=True)
new_password = serializers.CharField(required=True)
confirm_password = serializers.CharField(required=True)
def validate(self, data):
if data.get("old_password") == data.get("new_password"):
raise serializers.ValidationError(
{"error": "New password cannot be same as old password."}
)
if data.get("new_password") != data.get("confirm_password"):
raise serializers.ValidationError(
{"error": "Confirm password should be same as the new password."}
)
return data
class ResetPasswordSerializer(serializers.Serializer):
model = User
"""
Serializer for the password reset endpoint.
"""
new_password = serializers.CharField(required=True)

View File

@ -0,0 +1,106 @@
# Python imports
import urllib
import socket
import ipaddress
from urllib.parse import urlparse
# Third party imports
from rest_framework import serializers
# Module imports
from .base import DynamicBaseSerializer
from plane.db.models import Webhook, WebhookLog
from plane.db.models.webhook import validate_domain, validate_schema
class WebhookSerializer(DynamicBaseSerializer):
url = serializers.URLField(validators=[validate_schema, validate_domain])
def create(self, validated_data):
url = validated_data.get("url", None)
# Extract the hostname from the URL
hostname = urlparse(url).hostname
if not hostname:
raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})
# Resolve the hostname to IP addresses
try:
ip_addresses = socket.getaddrinfo(hostname, None)
except socket.gaierror:
raise serializers.ValidationError({"url": "Hostname could not be resolved."})
if not ip_addresses:
raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})
# Additional validation for multiple request domains and their subdomains
request = self.context.get('request')
disallowed_domains = ['plane.so',] # Add your disallowed domains here
if request:
request_host = request.get_host().split(':')[0] # Remove port if present
disallowed_domains.append(request_host)
# Check if hostname is a subdomain or exact match of any disallowed domain
if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})
return Webhook.objects.create(**validated_data)
def update(self, instance, validated_data):
url = validated_data.get("url", None)
if url:
# Extract the hostname from the URL
hostname = urlparse(url).hostname
if not hostname:
raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})
# Resolve the hostname to IP addresses
try:
ip_addresses = socket.getaddrinfo(hostname, None)
except socket.gaierror:
raise serializers.ValidationError({"url": "Hostname could not be resolved."})
if not ip_addresses:
raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})
# Additional validation for multiple request domains and their subdomains
request = self.context.get('request')
disallowed_domains = ['plane.so',] # Add your disallowed domains here
if request:
request_host = request.get_host().split(':')[0] # Remove port if present
disallowed_domains.append(request_host)
# Check if hostname is a subdomain or exact match of any disallowed domain
if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})
return super().update(instance, validated_data)
class Meta:
model = Webhook
fields = "__all__"
read_only_fields = [
"workspace",
"secret_key",
]
class WebhookLogSerializer(DynamicBaseSerializer):
class Meta:
model = WebhookLog
fields = "__all__"
read_only_fields = [
"workspace",
"webhook"
]
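
The create and update methods above both resolve the webhook hostname and reject URLs that point at private or loopback addresses (a basic SSRF guard) before checking the disallowed-domain list. Below is the same check pulled out as a standalone sketch that runs without Django; the function name is made up for illustration.

# Standalone illustration of the hostname/IP screening used by WebhookSerializer.
import ipaddress
import socket
from urllib.parse import urlparse

def url_is_blocked(url: str) -> bool:
    hostname = urlparse(url).hostname
    if not hostname:
        return True  # no hostname at all
    try:
        ip_addresses = socket.getaddrinfo(hostname, None)
    except socket.gaierror:
        return True  # unresolvable hostname
    return any(
        ipaddress.ip_address(addr[4][0]).is_private
        or ipaddress.ip_address(addr[4][0]).is_loopback
        for addr in ip_addresses
    )

print(url_is_blocked("http://127.0.0.1/hook"))     # True: loopback address
print(url_is_blocked("https://example.com/hook"))  # False for a public address (needs DNS)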

View File

@ -0,0 +1,153 @@
# Third party imports
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .user import UserLiteSerializer, UserAdminLiteSerializer
from plane.db.models import (
User,
Workspace,
WorkspaceMember,
Team,
TeamMember,
WorkspaceMemberInvite,
WorkspaceTheme,
)
class WorkSpaceSerializer(BaseSerializer):
owner = UserLiteSerializer(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
def validate(self, data):
if data.get("slug") in [
"404",
"accounts",
"api",
"create-workspace",
"god-mode",
"installations",
"invitations",
"onboarding",
"profile",
"spaces",
"workspace-invitations",
"password",
]:
raise serializers.ValidationError({"slug": "Slug is not valid"})
class Meta:
model = Workspace
fields = "__all__"
read_only_fields = [
"id",
"created_by",
"updated_by",
"created_at",
"updated_at",
"owner",
]
class WorkspaceLiteSerializer(BaseSerializer):
class Meta:
model = Workspace
fields = [
"name",
"slug",
"id",
]
read_only_fields = fields
class WorkSpaceMemberSerializer(BaseSerializer):
member = UserLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
class Meta:
model = WorkspaceMember
fields = "__all__"
class WorkspaceMemberMeSerializer(BaseSerializer):
class Meta:
model = WorkspaceMember
fields = "__all__"
class WorkspaceMemberAdminSerializer(BaseSerializer):
member = UserAdminLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
class Meta:
model = WorkspaceMember
fields = "__all__"
class WorkSpaceMemberInviteSerializer(BaseSerializer):
workspace = WorkSpaceSerializer(read_only=True)
total_members = serializers.IntegerField(read_only=True)
created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
class Meta:
model = WorkspaceMemberInvite
fields = "__all__"
class TeamSerializer(BaseSerializer):
members_detail = UserLiteSerializer(read_only=True, source="members", many=True)
members = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
class Meta:
model = Team
fields = "__all__"
read_only_fields = [
"workspace",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
def create(self, validated_data, **kwargs):
if "members" in validated_data:
members = validated_data.pop("members")
workspace = self.context["workspace"]
team = Team.objects.create(**validated_data, workspace=workspace)
team_members = [
TeamMember(member=member, team=team, workspace=workspace)
for member in members
]
TeamMember.objects.bulk_create(team_members, batch_size=10)
return team
team = Team.objects.create(**validated_data)
return team
def update(self, instance, validated_data):
if "members" in validated_data:
members = validated_data.pop("members")
TeamMember.objects.filter(team=instance).delete()
team_members = [
TeamMember(member=member, team=instance, workspace=instance.workspace)
for member in members
]
TeamMember.objects.bulk_create(team_members, batch_size=10)
return super().update(instance, validated_data)
return super().update(instance, validated_data)
class WorkspaceThemeSerializer(BaseSerializer):
class Meta:
model = WorkspaceTheme
fields = "__all__"
read_only_fields = [
"workspace",
"actor",
]

View File

@ -0,0 +1,48 @@
from .analytic import urlpatterns as analytic_urls
from .asset import urlpatterns as asset_urls
from .authentication import urlpatterns as authentication_urls
from .config import urlpatterns as configuration_urls
from .cycle import urlpatterns as cycle_urls
from .estimate import urlpatterns as estimate_urls
from .external import urlpatterns as external_urls
from .importer import urlpatterns as importer_urls
from .inbox import urlpatterns as inbox_urls
from .integration import urlpatterns as integration_urls
from .issue import urlpatterns as issue_urls
from .module import urlpatterns as module_urls
from .notification import urlpatterns as notification_urls
from .page import urlpatterns as page_urls
from .project import urlpatterns as project_urls
from .search import urlpatterns as search_urls
from .state import urlpatterns as state_urls
from .user import urlpatterns as user_urls
from .views import urlpatterns as view_urls
from .workspace import urlpatterns as workspace_urls
from .api import urlpatterns as api_urls
from .webhook import urlpatterns as webhook_urls
urlpatterns = [
*analytic_urls,
*asset_urls,
*authentication_urls,
*configuration_urls,
*cycle_urls,
*estimate_urls,
*external_urls,
*importer_urls,
*inbox_urls,
*integration_urls,
*issue_urls,
*module_urls,
*notification_urls,
*page_urls,
*project_urls,
*search_urls,
*state_urls,
*user_urls,
*view_urls,
*workspace_urls,
*api_urls,
*webhook_urls,
]

View File

@ -1,7 +1,7 @@
from django.urls import path
-from plane.api.views import (
from plane.app.views import (
AnalyticsEndpoint,
AnalyticViewViewset,
SavedAnalyticEndpoint,

View File

@ -0,0 +1,17 @@
from django.urls import path
from plane.app.views import ApiTokenEndpoint
urlpatterns = [
# API Tokens
path(
"workspaces/<str:slug>/api-tokens/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
path(
"workspaces/<str:slug>/api-tokens/<uuid:pk>/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
## End API Tokens
]

View File

@ -1,9 +1,10 @@
from django.urls import path
-from plane.api.views import (
from plane.app.views import (
FileAssetEndpoint,
UserAssetsEndpoint,
FileAssetViewSet,
)
@ -28,4 +29,13 @@ urlpatterns = [
UserAssetsEndpoint.as_view(),
name="user-file-assets",
),
path(
"workspaces/file-assets/<uuid:workspace_id>/<str:asset_key>/restore/",
FileAssetViewSet.as_view(
{
"post": "restore",
}
),
name="file-assets-restore",
),
]

View File

@ -3,20 +3,17 @@ from django.urls import path
from rest_framework_simplejwt.views import TokenRefreshView
-from plane.api.views import (
from plane.app.views import (
# Authentication
-SignUpEndpoint,
SignInEndpoint,
SignOutEndpoint,
MagicSignInEndpoint,
-MagicSignInGenerateEndpoint,
OauthEndpoint,
EmailCheckEndpoint,
## End Authentication
# Auth Extended
ForgotPasswordEndpoint,
-VerifyEmailEndpoint,
ResetPasswordEndpoint,
-RequestEmailVerificationEndpoint,
ChangePasswordEndpoint,
## End Auth Extender
# API Tokens
@ -27,24 +24,14 @@ from plane.api.views import (
urlpatterns = [
# Social Auth
path("email-check/", EmailCheckEndpoint.as_view(), name="email"),
path("social-auth/", OauthEndpoint.as_view(), name="oauth"),
# Auth
-path("sign-up/", SignUpEndpoint.as_view(), name="sign-up"),
path("sign-in/", SignInEndpoint.as_view(), name="sign-in"),
path("sign-out/", SignOutEndpoint.as_view(), name="sign-out"),
-# Magic Sign In/Up
# magic sign in
-path(
-"magic-generate/", MagicSignInGenerateEndpoint.as_view(), name="magic-generate"
-),
path("magic-sign-in/", MagicSignInEndpoint.as_view(), name="magic-sign-in"),
path("token/refresh/", TokenRefreshView.as_view(), name="token_refresh"),
-# Email verification
-path("email-verify/", VerifyEmailEndpoint.as_view(), name="email-verify"),
-path(
-"request-email-verify/",
-RequestEmailVerificationEndpoint.as_view(),
-name="request-reset-email",
-),
# Password Manipulation
path(
"users/me/change-password/",

View File

@ -1,7 +1,7 @@
from django.urls import path
-from plane.api.views import ConfigurationEndpoint
from plane.app.views import ConfigurationEndpoint
urlpatterns = [
path(

View File

@ -0,0 +1,87 @@
from django.urls import path
from plane.app.views import (
CycleViewSet,
CycleIssueViewSet,
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
TransferCycleIssueEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/",
CycleViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:pk>/",
CycleViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/",
CycleIssueViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:pk>/",
CycleIssueViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/date-check/",
CycleDateCheckEndpoint.as_view(),
name="project-cycle-date",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-cycles/",
CycleFavoriteViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="user-favorite-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-cycles/<uuid:cycle_id>/",
CycleFavoriteViewSet.as_view(
{
"delete": "destroy",
}
),
name="user-favorite-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/transfer-issues/",
TransferCycleIssueEndpoint.as_view(),
name="transfer-issues",
),
]

View File

@ -1,7 +1,7 @@
from django.urls import path
-from plane.api.views import (
from plane.app.views import (
ProjectEstimatePointEndpoint,
BulkEstimatePointEndpoint,
)

View File

@ -1,9 +1,9 @@
from django.urls import path
-from plane.api.views import UnsplashEndpoint
-from plane.api.views import ReleaseNotesEndpoint
-from plane.api.views import GPTIntegrationEndpoint
from plane.app.views import UnsplashEndpoint
from plane.app.views import ReleaseNotesEndpoint
from plane.app.views import GPTIntegrationEndpoint
urlpatterns = [

View File

@ -1,7 +1,7 @@
from django.urls import path
-from plane.api.views import (
from plane.app.views import (
ServiceIssueImportSummaryEndpoint,
ImportServiceEndpoint,
UpdateServiceImportStatusEndpoint,

View File

@ -0,0 +1,53 @@
from django.urls import path
from plane.app.views import (
InboxViewSet,
InboxIssueViewSet,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/",
InboxViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="inbox",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:pk>/",
InboxViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="inbox",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/",
InboxIssueViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/<uuid:pk>/",
InboxIssueViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="inbox-issue",
),
]

View File

@ -1,7 +1,7 @@
from django.urls import path
-from plane.api.views import (
from plane.app.views import (
IntegrationViewSet,
WorkspaceIntegrationViewSet,
GithubRepositoriesEndpoint,

View File

@ -0,0 +1,315 @@
from django.urls import path
from plane.app.views import (
IssueViewSet,
LabelViewSet,
BulkCreateIssueLabelsEndpoint,
BulkDeleteIssuesEndpoint,
BulkImportIssuesEndpoint,
UserWorkSpaceIssues,
SubIssuesEndpoint,
IssueLinkViewSet,
IssueAttachmentEndpoint,
ExportIssuesEndpoint,
IssueActivityEndpoint,
IssueCommentViewSet,
IssueSubscriberViewSet,
IssueReactionViewSet,
CommentReactionViewSet,
IssueUserDisplayPropertyEndpoint,
IssueArchiveViewSet,
IssueRelationViewSet,
IssueDraftViewSet,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/",
IssueViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/",
LabelViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-labels",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/<uuid:pk>/",
LabelViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-labels",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-create-labels/",
BulkCreateIssueLabelsEndpoint.as_view(),
name="project-bulk-labels",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-delete-issues/",
BulkDeleteIssuesEndpoint.as_view(),
name="project-issues-bulk",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-import-issues/<str:service>/",
BulkImportIssuesEndpoint.as_view(),
name="project-issues-bulk",
),
path(
"workspaces/<str:slug>/my-issues/",
UserWorkSpaceIssues.as_view(),
name="workspace-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/sub-issues/",
SubIssuesEndpoint.as_view(),
name="sub-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/",
IssueLinkViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/<uuid:pk>/",
IssueLinkViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-attachments/",
IssueAttachmentEndpoint.as_view(),
name="project-issue-attachments",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-attachments/<uuid:pk>/",
IssueAttachmentEndpoint.as_view(),
name="project-issue-attachments",
),
path(
"workspaces/<str:slug>/export-issues/",
ExportIssuesEndpoint.as_view(),
name="export-issues",
),
## End Issues
## Issue Activity
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/history/",
IssueActivityEndpoint.as_view(),
name="project-issue-history",
),
## Issue Activity
## IssueComments
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/",
IssueCommentViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-comment",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/<uuid:pk>/",
IssueCommentViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-comment",
),
## End IssueComments
# Issue Subscribers
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-subscribers/",
IssueSubscriberViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-subscribers",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-subscribers/<uuid:subscriber_id>/",
IssueSubscriberViewSet.as_view({"delete": "destroy"}),
name="project-issue-subscribers",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/subscribe/",
IssueSubscriberViewSet.as_view(
{
"get": "subscription_status",
"post": "subscribe",
"delete": "unsubscribe",
}
),
name="project-issue-subscribers",
),
## End Issue Subscribers
# Issue Reactions
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/reactions/",
IssueReactionViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-reactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/reactions/<str:reaction_code>/",
IssueReactionViewSet.as_view(
{
"delete": "destroy",
}
),
name="project-issue-reactions",
),
## End Issue Reactions
# Comment Reactions
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/comments/<uuid:comment_id>/reactions/",
CommentReactionViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-comment-reactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/comments/<uuid:comment_id>/reactions/<str:reaction_code>/",
CommentReactionViewSet.as_view(
{
"delete": "destroy",
}
),
name="project-issue-comment-reactions",
),
## End Comment Reactions
## IssueProperty
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-display-properties/",
IssueUserDisplayPropertyEndpoint.as_view(),
name="project-issue-display-properties",
),
## IssueProperty End
## Issue Archives
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-issues/",
IssueArchiveViewSet.as_view(
{
"get": "list",
}
),
name="project-issue-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-issues/<uuid:pk>/",
IssueArchiveViewSet.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-issue-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/unarchive/<uuid:pk>/",
IssueArchiveViewSet.as_view(
{
"post": "unarchive",
}
),
name="project-issue-archive",
),
## End Issue Archives
## Issue Relation
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-relation/",
IssueRelationViewSet.as_view(
{
"post": "create",
}
),
name="issue-relation",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-relation/<uuid:pk>/",
IssueRelationViewSet.as_view(
{
"delete": "destroy",
}
),
name="issue-relation",
),
## End Issue Relation
## Issue Drafts
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-drafts/",
IssueDraftViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-draft",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-drafts/<uuid:pk>/",
IssueDraftViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-draft",
),
]
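
All of the routes above follow the same DRF-style mapping: an explicit dict passed to as_view() binds each HTTP method on a URL to a viewset action. The snippet below is a minimal, self-contained sketch of that pattern; ExampleIssueViewSet and the example-issues paths are illustrative stand-ins, not part of this changeset.

from django.urls import path
from rest_framework import viewsets
from rest_framework.response import Response


class ExampleIssueViewSet(viewsets.ViewSet):
    """Hypothetical stand-in for a viewset such as IssueViewSet."""

    def list(self, request, slug=None, project_id=None):
        # GET on the collection route resolves here via the as_view() dict.
        return Response([])

    def create(self, request, slug=None, project_id=None):
        # POST on the collection route resolves here.
        return Response(status=201)

    def retrieve(self, request, slug=None, project_id=None, pk=None):
        # GET on the detail route resolves here.
        return Response({"id": str(pk)})


urlpatterns = [
    path(
        "workspaces/<str:slug>/projects/<uuid:project_id>/example-issues/",
        ExampleIssueViewSet.as_view({"get": "list", "post": "create"}),
        name="example-issue",
    ),
    path(
        "workspaces/<str:slug>/projects/<uuid:project_id>/example-issues/<uuid:pk>/",
        ExampleIssueViewSet.as_view({"get": "retrieve"}),
        name="example-issue",
    ),
]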

@@ -0,0 +1,104 @@
from django.urls import path
from plane.app.views import (
ModuleViewSet,
ModuleIssueViewSet,
ModuleLinkViewSet,
ModuleFavoriteViewSet,
BulkImportModulesEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/",
ModuleViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-modules",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:pk>/",
ModuleViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-modules",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/",
ModuleIssueViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-module-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:pk>/",
ModuleIssueViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-module-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-links/",
ModuleLinkViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-module-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-links/<uuid:pk>/",
ModuleLinkViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-module-links",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-modules/",
ModuleFavoriteViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="user-favorite-module",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-modules/<uuid:module_id>/",
ModuleFavoriteViewSet.as_view(
{
"delete": "destroy",
}
),
name="user-favorite-module",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-import-modules/<str:service>/",
BulkImportModulesEndpoint.as_view(),
name="bulk-modules-create",
),
]

@@ -1,7 +1,7 @@
 from django.urls import path
-from plane.api.views import (
+from plane.app.views import (
     NotificationViewSet,
     UnreadNotificationEndpoint,
     MarkAllReadNotificationViewSet,

@@ -0,0 +1,133 @@
from django.urls import path
from plane.app.views import (
PageViewSet,
PageFavoriteViewSet,
PageLogEndpoint,
SubPagesEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/",
PageViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:pk>/",
PageViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/",
PageFavoriteViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="user-favorite-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/<uuid:page_id>/",
PageFavoriteViewSet.as_view(
{
"delete": "destroy",
}
),
name="user-favorite-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/",
PageViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:pk>/",
PageViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/archive/",
PageViewSet.as_view(
{
"post": "archive",
}
),
name="project-page-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unarchive/",
PageViewSet.as_view(
{
"post": "unarchive",
}
),
name="project-page-unarchive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-pages/",
PageViewSet.as_view(
{
"get": "archive_list",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/lock/",
PageViewSet.as_view(
{
"post": "lock",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unlock/",
PageViewSet.as_view(
{
"post": "unlock",
}
),
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/",
PageLogEndpoint.as_view(),
name="page-transactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/<uuid:transaction>/",
PageLogEndpoint.as_view(),
name="page-transactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/sub-pages/",
SubPagesEndpoint.as_view(),
name="sub-page",
),
]
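
The archive, unarchive, lock and unlock routes above bind a POST directly to a like-named method on the viewset, rather than introducing separate endpoint classes. A hedged sketch of that style, with a hypothetical ExamplePageViewSet standing in for the real PageViewSet:

from rest_framework import viewsets
from rest_framework.response import Response


class ExamplePageViewSet(viewsets.ViewSet):
    """Hypothetical stand-in for PageViewSet; only the custom actions are shown."""

    def archive(self, request, slug=None, project_id=None, page_id=None):
        # Bound via as_view({"post": "archive"}) on the .../archive/ route.
        return Response({"archived": True})

    def unarchive(self, request, slug=None, project_id=None, page_id=None):
        return Response({"archived": False})

    def lock(self, request, slug=None, project_id=None, page_id=None):
        # Bound via as_view({"post": "lock"}) on the .../lock/ route.
        return Response({"locked": True})

    def unlock(self, request, slug=None, project_id=None, page_id=None):
        return Response({"locked": False})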

@@ -0,0 +1,172 @@
from django.urls import path
from plane.app.views import (
ProjectViewSet,
ProjectInvitationsViewset,
ProjectMemberViewSet,
ProjectMemberUserEndpoint,
ProjectJoinEndpoint,
AddTeamToProjectEndpoint,
ProjectUserViewsEndpoint,
ProjectIdentifierEndpoint,
ProjectFavoritesViewSet,
UserProjectInvitationsViewset,
ProjectPublicCoverImagesEndpoint,
ProjectDeployBoardViewSet,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/",
ProjectViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project",
),
path(
"workspaces/<str:slug>/projects/<uuid:pk>/",
ProjectViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project",
),
path(
"workspaces/<str:slug>/project-identifiers/",
ProjectIdentifierEndpoint.as_view(),
name="project-identifiers",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/",
ProjectInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/<uuid:pk>/",
ProjectInvitationsViewset.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-member-invite",
),
path(
"users/me/workspaces/<str:slug>/projects/invitations/",
UserProjectInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="user-project-invitations",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/join/<uuid:pk>/",
ProjectJoinEndpoint.as_view(),
name="project-join",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/",
ProjectMemberViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-member",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/<uuid:pk>/",
ProjectMemberViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-member",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/leave/",
ProjectMemberViewSet.as_view(
{
"post": "leave",
}
),
name="project-member",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/team-invite/",
AddTeamToProjectEndpoint.as_view(),
name="projects",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-views/",
ProjectUserViewsEndpoint.as_view(),
name="project-view",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-members/me/",
ProjectMemberUserEndpoint.as_view(),
name="project-member-view",
),
path(
"workspaces/<str:slug>/user-favorite-projects/",
ProjectFavoritesViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-favorite",
),
path(
"workspaces/<str:slug>/user-favorite-projects/<uuid:project_id>/",
ProjectFavoritesViewSet.as_view(
{
"delete": "destroy",
}
),
name="project-favorite",
),
path(
"project-covers/",
ProjectPublicCoverImagesEndpoint.as_view(),
name="project-covers",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-deploy-boards/",
ProjectDeployBoardViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-deploy-board",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-deploy-boards/<uuid:pk>/",
ProjectDeployBoardViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-deploy-board",
),
]

@@ -1,7 +1,7 @@
 from django.urls import path
-from plane.api.views import (
+from plane.app.views import (
     GlobalSearchEndpoint,
     IssueSearchEndpoint,
 )

@@ -0,0 +1,38 @@
from django.urls import path
from plane.app.views import StateViewSet
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/",
StateViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-states",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/<uuid:pk>/",
StateViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-state",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/<uuid:pk>/mark-default/",
StateViewSet.as_view(
{
"post": "mark_as_default",
}
),
name="project-state",
),
]
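
These per-feature URL modules are presumably aggregated into a single URLConf elsewhere under plane.app.urls; the sketch below shows one conventional way to wire such modules together with include(). The module paths are assumptions for illustration, not taken from this diff.

from django.urls import include, path

urlpatterns = [
    # Each include() pulls in the urlpatterns list defined by one feature module.
    path("api/", include("plane.app.urls.issue")),
    path("api/", include("plane.app.urls.module")),
    path("api/", include("plane.app.urls.page")),
    path("api/", include("plane.app.urls.project")),
    path("api/", include("plane.app.urls.state")),
]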
