@@ -109,7 +149,7 @@ For self hosting environment setup, visit the [Self Hosting](https://docs.plane.
@@ -118,7 +158,7 @@ For self hosting environment setup, visit the [Self Hosting](https://docs.plane.
@@ -128,7 +168,7 @@ For self hosting environment setup, visit the [Self Hosting](https://docs.plane.
@@ -136,20 +176,23 @@ For self hosting environment setup, visit the [Self Hosting](https://docs.plane.
-## 📚Documentation
-
-For full documentation, visit [docs.plane.so](https://docs.plane.so/)
-
-To see how to Contribute, visit [here](https://github.com/makeplane/plane/blob/master/CONTRIBUTING.md).
-
-## ❤️ Community
-
-The Plane community can be found on GitHub Discussions, where you can ask questions, voice ideas, and share your projects.
-
-To chat with other community members you can join the [Plane Discord](https://discord.com/invite/A92xrEGCge).
-
-Our [Code of Conduct](https://github.com/makeplane/plane/blob/master/CODE_OF_CONDUCT.md) applies to all Plane community channels.
-
## ⛓️ Security
-If you believe you have found a security vulnerability in Plane, we encourage you to responsibly disclose this and not open a public issue. We will investigate all legitimate reports. Email engineering@plane.so to disclose any security vulnerabilities.
+If you believe you have found a security vulnerability in Plane, we encourage you to responsibly disclose this and not open a public issue. We will investigate all legitimate reports.
+
+Email squawk@plane.so to disclose any security vulnerabilities.
+
+## ❤️ Contribute
+
+There are many ways to contribute to Plane, including:
+
+- Submitting [bugs](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%F0%9F%90%9Bbug&projects=&template=--bug-report.yaml&title=%5Bbug%5D%3A+) and [feature requests](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%E2%9C%A8feature&projects=&template=--feature-request.yaml&title=%5Bfeature%5D%3A+) for various components.
+- Reviewing [the documentation](https://docs.plane.so/) and submitting [pull requests](https://github.com/makeplane/plane), from fixing typos to adding new features.
+- Speaking or writing about Plane or any other ecosystem integration and [letting us know](https://discord.com/invite/A92xrEGCge)!
+- Upvoting [popular feature requests](https://github.com/makeplane/plane/issues) to show your support.
+
+### We couldn't have done this without you.
+
+
+
+
diff --git a/SECURITY.md b/SECURITY.md
new file mode 100644
index 000000000..36cdb982c
--- /dev/null
+++ b/SECURITY.md
@@ -0,0 +1,44 @@
+# Security Policy
+
+This document outlines security procedures and vulnerability reporting for the Plane project.
+
+At Plane, safeguarding the security of our systems is a top priority. Despite our efforts, vulnerabilities may still exist. We greatly appreciate your assistance in identifying and reporting any such vulnerabilities to help us maintain the integrity of our systems and protect our clients.
+
+To report a security vulnerability, please email us directly at security@plane.so with a detailed description of the vulnerability and steps to reproduce it. Please refrain from disclosing the vulnerability publicly until we have had an opportunity to review and address it.
+
+## Out of Scope Vulnerabilities
+
+We appreciate your help in identifying vulnerabilities. However, please note that the following types of vulnerabilities are considered out of scope:
+
+- Attacks requiring MITM or physical access to a user's device.
+- Content spoofing and text injection issues without demonstrating an attack vector or ability to modify HTML/CSS.
+- Email spoofing.
+- Missing DNSSEC, CAA, CSP headers.
+- Lack of Secure or HttpOnly flags on non-sensitive cookies.
+
+## Reporting Process
+
+If you discover a vulnerability, please adhere to the following reporting process:
+
+1. Email your findings to security@plane.so.
+2. Refrain from running automated scanners on our infrastructure or dashboard without prior consent. Contact us to set up a sandbox environment if necessary.
+3. Do not exploit the vulnerability for malicious purposes, such as downloading excessive data or altering user data.
+4. Maintain confidentiality and refrain from disclosing the vulnerability until it has been resolved.
+5. Avoid using physical security attacks, social engineering, distributed denial of service, spam, or third-party applications.
+
+When reporting a vulnerability, please provide sufficient information to allow us to reproduce and address the issue promptly. Include the IP address or URL of the affected system, along with a detailed description of the vulnerability.
+
+## Our Commitment
+
+We are committed to promptly addressing reported vulnerabilities and maintaining open communication throughout the resolution process. Here's what you can expect from us:
+
+- **Response Time:** We will acknowledge receipt of your report within three business days and provide an expected resolution date.
+- **Legal Protection:** We will not pursue legal action against you for reporting vulnerabilities, provided you adhere to the reporting guidelines.
+- **Confidentiality:** Your report will be treated with strict confidentiality. We will not disclose your personal information to third parties without your consent.
+- **Progress Updates:** We will keep you informed of our progress in resolving the reported vulnerability.
+- **Recognition:** With your permission, we will publicly acknowledge you as the discoverer of the vulnerability.
+- **Timely Resolution:** We strive to resolve all reported vulnerabilities promptly and will actively participate in the publication process once the issue is resolved.
+
+We appreciate your cooperation in helping us maintain the security of our systems and protecting our clients. Thank you for your contributions to our security efforts.
+
+reference: https://supabase.com/.well-known/security.txt
diff --git a/admin/.env.example b/admin/.env.example
new file mode 100644
index 000000000..fdeb05c4d
--- /dev/null
+++ b/admin/.env.example
@@ -0,0 +1,3 @@
+NEXT_PUBLIC_API_BASE_URL=""
+NEXT_PUBLIC_ADMIN_BASE_PATH="/god-mode"
+NEXT_PUBLIC_WEB_BASE_URL=""
\ No newline at end of file
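
> Note: for context on how these `NEXT_PUBLIC_*` values are typically consumed, here is a minimal sketch of an environment helper in the spirit of the `@/helpers/common.helper` module that the forms below import `API_BASE_URL` from. The exact exports and defaults are assumptions, not the verbatim source.

```ts
// helpers/common.helper.ts (illustrative sketch, not the verbatim file)
// Next.js inlines NEXT_PUBLIC_* variables at build time, so they can be
// read directly from process.env in both server and client components.
export const API_BASE_URL = process.env.NEXT_PUBLIC_API_BASE_URL ?? "";
export const ADMIN_BASE_PATH = process.env.NEXT_PUBLIC_ADMIN_BASE_PATH ?? "/god-mode";
export const WEB_BASE_URL = process.env.NEXT_PUBLIC_WEB_BASE_URL ?? "";
```
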
diff --git a/admin/.eslintrc.js b/admin/.eslintrc.js
new file mode 100644
index 000000000..a82c768a0
--- /dev/null
+++ b/admin/.eslintrc.js
@@ -0,0 +1,52 @@
+module.exports = {
+ root: true,
+ extends: ["custom"],
+ parser: "@typescript-eslint/parser",
+ settings: {
+ "import/resolver": {
+ typescript: {},
+ node: {
+ moduleDirectory: ["node_modules", "."],
+ },
+ },
+ },
+ rules: {
+ "import/order": [
+ "error",
+ {
+ groups: ["builtin", "external", "internal", "parent", "sibling",],
+ pathGroups: [
+ {
+ pattern: "react",
+ group: "external",
+ position: "before",
+ },
+ {
+ pattern: "lucide-react",
+ group: "external",
+ position: "after",
+ },
+ {
+ pattern: "@headlessui/**",
+ group: "external",
+ position: "after",
+ },
+ {
+ pattern: "@plane/**",
+ group: "external",
+ position: "after",
+ },
+ {
+ pattern: "@/**",
+ group: "internal",
+ }
+ ],
+ pathGroupsExcludedImportTypes: ["builtin", "internal", "react"],
+ alphabetize: {
+ order: "asc",
+ caseInsensitive: true,
+ },
+ },
+ ],
+ },
+}
\ No newline at end of file
diff --git a/admin/.prettierignore b/admin/.prettierignore
new file mode 100644
index 000000000..43e8a7b8f
--- /dev/null
+++ b/admin/.prettierignore
@@ -0,0 +1,6 @@
+.next
+.vercel
+.turbo
+out/
+dist/
+build/
\ No newline at end of file
diff --git a/admin/.prettierrc b/admin/.prettierrc
new file mode 100644
index 000000000..87d988f1b
--- /dev/null
+++ b/admin/.prettierrc
@@ -0,0 +1,5 @@
+{
+ "printWidth": 120,
+ "tabWidth": 2,
+ "trailingComma": "es5"
+}
diff --git a/admin/Dockerfile.admin b/admin/Dockerfile.admin
new file mode 100644
index 000000000..ad9469110
--- /dev/null
+++ b/admin/Dockerfile.admin
@@ -0,0 +1,86 @@
+# *****************************************************************************
+# STAGE 1: Build the project
+# *****************************************************************************
+FROM node:18-alpine AS builder
+RUN apk add --no-cache libc6-compat
+WORKDIR /app
+
+RUN yarn global add turbo
+COPY . .
+
+RUN turbo prune --scope=admin --docker
+
+# *****************************************************************************
+# STAGE 2: Install dependencies & build the project
+# *****************************************************************************
+FROM node:18-alpine AS installer
+
+RUN apk add --no-cache libc6-compat
+WORKDIR /app
+
+COPY .gitignore .gitignore
+COPY --from=builder /app/out/json/ .
+COPY --from=builder /app/out/yarn.lock ./yarn.lock
+RUN yarn install --network-timeout 500000
+
+COPY --from=builder /app/out/full/ .
+COPY turbo.json turbo.json
+
+ARG NEXT_PUBLIC_API_BASE_URL=""
+ENV NEXT_PUBLIC_API_BASE_URL=$NEXT_PUBLIC_API_BASE_URL
+
+ARG NEXT_PUBLIC_ADMIN_BASE_URL=""
+ENV NEXT_PUBLIC_ADMIN_BASE_URL=$NEXT_PUBLIC_ADMIN_BASE_URL
+
+ARG NEXT_PUBLIC_ADMIN_BASE_PATH="/god-mode"
+ENV NEXT_PUBLIC_ADMIN_BASE_PATH=$NEXT_PUBLIC_ADMIN_BASE_PATH
+
+ARG NEXT_PUBLIC_SPACE_BASE_URL=""
+ENV NEXT_PUBLIC_SPACE_BASE_URL=$NEXT_PUBLIC_SPACE_BASE_URL
+
+ARG NEXT_PUBLIC_SPACE_BASE_PATH="/spaces"
+ENV NEXT_PUBLIC_SPACE_BASE_PATH=$NEXT_PUBLIC_SPACE_BASE_PATH
+
+ARG NEXT_PUBLIC_WEB_BASE_URL=""
+ENV NEXT_PUBLIC_WEB_BASE_URL=$NEXT_PUBLIC_WEB_BASE_URL
+
+ENV NEXT_TELEMETRY_DISABLED 1
+ENV TURBO_TELEMETRY_DISABLED 1
+
+RUN yarn turbo run build --filter=admin
+
+# *****************************************************************************
+# STAGE 3: Copy the project and start it
+# *****************************************************************************
+FROM node:18-alpine AS runner
+WORKDIR /app
+
+COPY --from=installer /app/admin/next.config.js .
+COPY --from=installer /app/admin/package.json .
+
+COPY --from=installer /app/admin/.next/standalone ./
+COPY --from=installer /app/admin/.next/static ./admin/.next/static
+COPY --from=installer /app/admin/public ./admin/public
+
+ARG NEXT_PUBLIC_API_BASE_URL=""
+ENV NEXT_PUBLIC_API_BASE_URL=$NEXT_PUBLIC_API_BASE_URL
+
+ARG NEXT_PUBLIC_ADMIN_BASE_URL=""
+ENV NEXT_PUBLIC_ADMIN_BASE_URL=$NEXT_PUBLIC_ADMIN_BASE_URL
+
+ARG NEXT_PUBLIC_ADMIN_BASE_PATH="/god-mode"
+ENV NEXT_PUBLIC_ADMIN_BASE_PATH=$NEXT_PUBLIC_ADMIN_BASE_PATH
+
+ARG NEXT_PUBLIC_SPACE_BASE_URL=""
+ENV NEXT_PUBLIC_SPACE_BASE_URL=$NEXT_PUBLIC_SPACE_BASE_URL
+
+ARG NEXT_PUBLIC_SPACE_BASE_PATH="/spaces"
+ENV NEXT_PUBLIC_SPACE_BASE_PATH=$NEXT_PUBLIC_SPACE_BASE_PATH
+
+ARG NEXT_PUBLIC_WEB_BASE_URL=""
+ENV NEXT_PUBLIC_WEB_BASE_URL=$NEXT_PUBLIC_WEB_BASE_URL
+
+ENV NEXT_TELEMETRY_DISABLED 1
+ENV TURBO_TELEMETRY_DISABLED 1
+
+EXPOSE 3000
\ No newline at end of file
diff --git a/admin/Dockerfile.dev b/admin/Dockerfile.dev
new file mode 100644
index 000000000..1ed84e78e
--- /dev/null
+++ b/admin/Dockerfile.dev
@@ -0,0 +1,17 @@
+FROM node:18-alpine
+RUN apk add --no-cache libc6-compat
+# Set working directory
+WORKDIR /app
+
+COPY . .
+
+RUN yarn global add turbo
+RUN yarn install
+
+ENV NEXT_PUBLIC_ADMIN_BASE_PATH="/god-mode"
+
+EXPOSE 3000
+
+VOLUME [ "/app/node_modules", "/app/admin/node_modules" ]
+
+CMD ["yarn", "dev", "--filter=admin"]
diff --git a/admin/app/ai/form.tsx b/admin/app/ai/form.tsx
new file mode 100644
index 000000000..cec5c0748
--- /dev/null
+++ b/admin/app/ai/form.tsx
@@ -0,0 +1,128 @@
+import { FC } from "react";
+import { useForm } from "react-hook-form";
+import { Lightbulb } from "lucide-react";
+import { IFormattedInstanceConfiguration, TInstanceAIConfigurationKeys } from "@plane/types";
+import { Button, TOAST_TYPE, setToast } from "@plane/ui";
+// components
+import { ControllerInput, TControllerInputFormField } from "@/components/common";
+// hooks
+import { useInstance } from "@/hooks/store";
+
+type IInstanceAIForm = {
+ config: IFormattedInstanceConfiguration;
+};
+
+type AIFormValues = Record<TInstanceAIConfigurationKeys, string>;
+
+export const InstanceAIForm: FC<IInstanceAIForm> = (props) => {
+ const { config } = props;
+ // store
+ const { updateInstanceConfigurations } = useInstance();
+ // form data
+ const {
+ handleSubmit,
+ control,
+ formState: { errors, isSubmitting },
+ } = useForm({
+ defaultValues: {
+ OPENAI_API_KEY: config["OPENAI_API_KEY"],
+ GPT_ENGINE: config["GPT_ENGINE"],
+ },
+ });
+
+ const aiFormFields: TControllerInputFormField[] = [
+ {
+ key: "GPT_ENGINE",
+ type: "text",
+ label: "GPT_ENGINE",
+ description: (
+ <>
+ Choose an OpenAI engine.{" "}
+
+ Learn more
+
+ >
+ ),
+ placeholder: "gpt-3.5-turbo",
+ error: Boolean(errors.GPT_ENGINE),
+ required: false,
+ },
+ {
+ key: "OPENAI_API_KEY",
+ type: "password",
+ label: "API key",
+ description: (
+ <>
+ You will find your API key{" "}
+
+ here.
+
+ >
+ ),
+ placeholder: "sk-asddassdfasdefqsdfasd23das3dasdcasd",
+ error: Boolean(errors.OPENAI_API_KEY),
+ required: false,
+ },
+ ];
+
+ const onSubmit = async (formData: AIFormValues) => {
+ const payload: Partial<AIFormValues> = { ...formData };
+
+ await updateInstanceConfigurations(payload)
+ .then(() =>
+ setToast({
+ type: TOAST_TYPE.SUCCESS,
+ title: "Success",
+ message: "AI Settings updated successfully",
+ })
+ )
+ .catch((err) => console.error(err));
+ };
+
+ return (
+
+
+
+
OpenAI
+
If you use ChatGPT, this is for you.
+
+
+ {aiFormFields.map((field) => (
+
+ ))}
+
+
+
+
+
+ {isSubmitting ? "Saving..." : "Save changes"}
+
+
+
+
+
If you have a preferred AI models vendor, please get in touch with us.
+
+
+
+ );
+};
diff --git a/admin/app/ai/layout.tsx b/admin/app/ai/layout.tsx
new file mode 100644
index 000000000..0a0bacac1
--- /dev/null
+++ b/admin/app/ai/layout.tsx
@@ -0,0 +1,11 @@
+import { ReactNode } from "react";
+import { Metadata } from "next";
+import { AdminLayout } from "@/layouts/admin-layout";
+
+export const metadata: Metadata = {
+ title: "AI Settings - God Mode",
+};
+
+export default function AILayout({ children }: { children: ReactNode }) {
+ return <AdminLayout>{children}</AdminLayout>;
+}
diff --git a/admin/app/ai/page.tsx b/admin/app/ai/page.tsx
new file mode 100644
index 000000000..a54ce6d8c
--- /dev/null
+++ b/admin/app/ai/page.tsx
@@ -0,0 +1,48 @@
+"use client";
+
+import { observer } from "mobx-react-lite";
+import useSWR from "swr";
+import { Loader } from "@plane/ui";
+// components
+import { PageHeader } from "@/components/core";
+// hooks
+import { useInstance } from "@/hooks/store";
+// components
+import { InstanceAIForm } from "./form";
+
+const InstanceAIPage = observer(() => {
+ // store
+ const { fetchInstanceConfigurations, formattedConfig } = useInstance();
+
+ useSWR("INSTANCE_CONFIGURATIONS", () => fetchInstanceConfigurations());
+
+ return (
+ <>
+
+
+
+
AI features for all your workspaces
+
+ Configure your AI API credentials so Plane AI features are turned on for all your workspaces.
+
+
+
+ {formattedConfig ? (
+
+ ) : (
+
+
+
+
+
+
+
+
+ )}
+
+
+ >
+ );
+});
+
+export default InstanceAIPage;
diff --git a/admin/app/authentication/components/authentication-method-card.tsx b/admin/app/authentication/components/authentication-method-card.tsx
new file mode 100644
index 000000000..1346a730e
--- /dev/null
+++ b/admin/app/authentication/components/authentication-method-card.tsx
@@ -0,0 +1,51 @@
+"use client";
+
+import { FC } from "react";
+// helpers
+import { cn } from "helpers/common.helper";
+
+type Props = {
+ name: string;
+ description: string;
+ icon: JSX.Element;
+ config: JSX.Element;
+ disabled?: boolean;
+ withBorder?: boolean;
+};
+
+export const AuthenticationMethodCard: FC<Props> = (props) => {
+ const { name, description, icon, config, disabled = false, withBorder = true } = props;
+
+ return (
+
+
+
+
+
+ {name}
+
+
+ {description}
+
+
+
+
{config}
+
+ );
+};
diff --git a/admin/app/authentication/components/email-config-switch.tsx b/admin/app/authentication/components/email-config-switch.tsx
new file mode 100644
index 000000000..0f09cf82c
--- /dev/null
+++ b/admin/app/authentication/components/email-config-switch.tsx
@@ -0,0 +1,36 @@
+"use client";
+
+import React from "react";
+import { observer } from "mobx-react-lite";
+// hooks
+import { TInstanceAuthenticationMethodKeys } from "@plane/types";
+import { ToggleSwitch } from "@plane/ui";
+import { useInstance } from "@/hooks/store";
+// ui
+// types
+
+type Props = {
+ disabled: boolean;
+ updateConfig: (key: TInstanceAuthenticationMethodKeys, value: string) => void;
+};
+
+export const EmailCodesConfiguration: React.FC<Props> = observer((props) => {
+ const { disabled, updateConfig } = props;
+ // store
+ const { formattedConfig } = useInstance();
+ // derived values
+ const enableMagicLogin = formattedConfig?.ENABLE_MAGIC_LINK_LOGIN ?? "";
+
+  return (
+    <ToggleSwitch
+      value={Boolean(parseInt(enableMagicLogin))}
+      onChange={() => {
+        Boolean(parseInt(enableMagicLogin)) === true
+          ? updateConfig("ENABLE_MAGIC_LINK_LOGIN", "0")
+          : updateConfig("ENABLE_MAGIC_LINK_LOGIN", "1");
+      }}
+      size="sm"
+      disabled={disabled}
+    />
+  );
+});
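
> Note: instance configuration flags in these switch components are stored as the strings `"0"`/`"1"` rather than booleans. The small helper below (purely illustrative, not part of the diff) captures the toggle pattern the components repeat, assuming the same string-flag convention.

```ts
// Illustrative helpers for the "0"/"1" string flags used by instance configuration.
// An empty or missing value parses to NaN and is treated as disabled.
const isFlagEnabled = (value: string | undefined): boolean => Boolean(parseInt(value ?? "0"));

// Flip a flag and return the value to persist via updateInstanceConfigurations.
const toggledFlagValue = (value: string | undefined): string => (isFlagEnabled(value) ? "0" : "1");
```
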
diff --git a/admin/app/authentication/components/github-config.tsx b/admin/app/authentication/components/github-config.tsx
new file mode 100644
index 000000000..27264d460
--- /dev/null
+++ b/admin/app/authentication/components/github-config.tsx
@@ -0,0 +1,59 @@
+"use client";
+
+import React from "react";
+import { observer } from "mobx-react-lite";
+import Link from "next/link";
+// icons
+import { Settings2 } from "lucide-react";
+// types
+import { TInstanceAuthenticationMethodKeys } from "@plane/types";
+// ui
+import { ToggleSwitch, getButtonStyling } from "@plane/ui";
+// helpers
+import { cn } from "@/helpers/common.helper";
+// hooks
+import { useInstance } from "@/hooks/store";
+
+type Props = {
+ disabled: boolean;
+ updateConfig: (key: TInstanceAuthenticationMethodKeys, value: string) => void;
+};
+
+export const GithubConfiguration: React.FC<Props> = observer((props) => {
+ const { disabled, updateConfig } = props;
+ // store
+ const { formattedConfig } = useInstance();
+ // derived values
+ const enableGithubConfig = formattedConfig?.IS_GITHUB_ENABLED ?? "";
+ const isGithubConfigured = !!formattedConfig?.GITHUB_CLIENT_ID && !!formattedConfig?.GITHUB_CLIENT_SECRET;
+
+ return (
+ <>
+ {isGithubConfigured ? (
+
+
+ Edit
+
+ {
+ Boolean(parseInt(enableGithubConfig)) === true
+ ? updateConfig("IS_GITHUB_ENABLED", "0")
+ : updateConfig("IS_GITHUB_ENABLED", "1");
+ }}
+ size="sm"
+ disabled={disabled}
+ />
+
+ ) : (
+
+
+ Configure
+
+ )}
+ >
+ );
+});
diff --git a/admin/app/authentication/components/google-config.tsx b/admin/app/authentication/components/google-config.tsx
new file mode 100644
index 000000000..9fde70dac
--- /dev/null
+++ b/admin/app/authentication/components/google-config.tsx
@@ -0,0 +1,59 @@
+"use client";
+
+import React from "react";
+import { observer } from "mobx-react-lite";
+import Link from "next/link";
+// icons
+import { Settings2 } from "lucide-react";
+// types
+import { TInstanceAuthenticationMethodKeys } from "@plane/types";
+// ui
+import { ToggleSwitch, getButtonStyling } from "@plane/ui";
+// helpers
+import { cn } from "@/helpers/common.helper";
+// hooks
+import { useInstance } from "@/hooks/store";
+
+type Props = {
+ disabled: boolean;
+ updateConfig: (key: TInstanceAuthenticationMethodKeys, value: string) => void;
+};
+
+export const GoogleConfiguration: React.FC<Props> = observer((props) => {
+ const { disabled, updateConfig } = props;
+ // store
+ const { formattedConfig } = useInstance();
+ // derived values
+ const enableGoogleConfig = formattedConfig?.IS_GOOGLE_ENABLED ?? "";
+ const isGoogleConfigured = !!formattedConfig?.GOOGLE_CLIENT_ID && !!formattedConfig?.GOOGLE_CLIENT_SECRET;
+
+ return (
+ <>
+ {isGoogleConfigured ? (
+
+
+ Edit
+
+ {
+ Boolean(parseInt(enableGoogleConfig)) === true
+ ? updateConfig("IS_GOOGLE_ENABLED", "0")
+ : updateConfig("IS_GOOGLE_ENABLED", "1");
+ }}
+ size="sm"
+ disabled={disabled}
+ />
+
+ ) : (
+
+
+ Configure
+
+ )}
+ >
+ );
+});
diff --git a/admin/app/authentication/components/index.ts b/admin/app/authentication/components/index.ts
new file mode 100644
index 000000000..d76d61f57
--- /dev/null
+++ b/admin/app/authentication/components/index.ts
@@ -0,0 +1,5 @@
+export * from "./email-config-switch";
+export * from "./password-config-switch";
+export * from "./authentication-method-card";
+export * from "./github-config";
+export * from "./google-config";
diff --git a/admin/app/authentication/components/password-config-switch.tsx b/admin/app/authentication/components/password-config-switch.tsx
new file mode 100644
index 000000000..901cce862
--- /dev/null
+++ b/admin/app/authentication/components/password-config-switch.tsx
@@ -0,0 +1,36 @@
+"use client";
+
+import React from "react";
+import { observer } from "mobx-react-lite";
+// hooks
+import { TInstanceAuthenticationMethodKeys } from "@plane/types";
+import { ToggleSwitch } from "@plane/ui";
+import { useInstance } from "@/hooks/store";
+// ui
+// types
+
+type Props = {
+ disabled: boolean;
+ updateConfig: (key: TInstanceAuthenticationMethodKeys, value: string) => void;
+};
+
+export const PasswordLoginConfiguration: React.FC<Props> = observer((props) => {
+ const { disabled, updateConfig } = props;
+ // store
+ const { formattedConfig } = useInstance();
+ // derived values
+ const enableEmailPassword = formattedConfig?.ENABLE_EMAIL_PASSWORD ?? "";
+
+  return (
+    <ToggleSwitch
+      value={Boolean(parseInt(enableEmailPassword))}
+      onChange={() => {
+        Boolean(parseInt(enableEmailPassword)) === true
+          ? updateConfig("ENABLE_EMAIL_PASSWORD", "0")
+          : updateConfig("ENABLE_EMAIL_PASSWORD", "1");
+      }}
+      size="sm"
+      disabled={disabled}
+    />
+  );
+});
diff --git a/admin/app/authentication/github/form.tsx b/admin/app/authentication/github/form.tsx
new file mode 100644
index 000000000..75c76e7a5
--- /dev/null
+++ b/admin/app/authentication/github/form.tsx
@@ -0,0 +1,213 @@
+import { FC, useState } from "react";
+import isEmpty from "lodash/isEmpty";
+import Link from "next/link";
+import { useForm } from "react-hook-form";
+// types
+import { IFormattedInstanceConfiguration, TInstanceGithubAuthenticationConfigurationKeys } from "@plane/types";
+// ui
+import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
+// components
+import {
+ ConfirmDiscardModal,
+ ControllerInput,
+ CopyField,
+ TControllerInputFormField,
+ TCopyField,
+} from "@/components/common";
+// helpers
+import { API_BASE_URL, cn } from "@/helpers/common.helper";
+// hooks
+import { useInstance } from "@/hooks/store";
+
+type Props = {
+ config: IFormattedInstanceConfiguration;
+};
+
+type GithubConfigFormValues = Record<TInstanceGithubAuthenticationConfigurationKeys, string>;
+
+export const InstanceGithubConfigForm: FC<Props> = (props) => {
+ const { config } = props;
+ // states
+ const [isDiscardChangesModalOpen, setIsDiscardChangesModalOpen] = useState(false);
+ // store hooks
+ const { updateInstanceConfigurations } = useInstance();
+ // form data
+ const {
+ handleSubmit,
+ control,
+ reset,
+ formState: { errors, isDirty, isSubmitting },
+ } = useForm({
+ defaultValues: {
+ GITHUB_CLIENT_ID: config["GITHUB_CLIENT_ID"],
+ GITHUB_CLIENT_SECRET: config["GITHUB_CLIENT_SECRET"],
+ },
+ });
+
+ const originURL = !isEmpty(API_BASE_URL) ? API_BASE_URL : typeof window !== "undefined" ? window.location.origin : "";
+
+ const GITHUB_FORM_FIELDS: TControllerInputFormField[] = [
+ {
+ key: "GITHUB_CLIENT_ID",
+ type: "text",
+ label: "Client ID",
+ description: (
+ <>
+ You will get this from your{" "}
+
+ GitHub OAuth application settings.
+
+ >
+ ),
+ placeholder: "70a44354520df8bd9bcd",
+ error: Boolean(errors.GITHUB_CLIENT_ID),
+ required: true,
+ },
+ {
+ key: "GITHUB_CLIENT_SECRET",
+ type: "password",
+ label: "Client secret",
+ description: (
+ <>
+ Your client secret is also found in your{" "}
+
+ GitHub OAuth application settings.
+
+ >
+ ),
+ placeholder: "9b0050f94ec1b744e32ce79ea4ffacd40d4119cb",
+ error: Boolean(errors.GITHUB_CLIENT_SECRET),
+ required: true,
+ },
+ ];
+
+ const GITHUB_SERVICE_FIELD: TCopyField[] = [
+ {
+ key: "Origin_URL",
+ label: "Origin URL",
+ url: originURL,
+ description: (
+ <>
+ We will auto-generate this. Paste this into the Authorized origin URL field{" "}
+
+ here.
+
+ >
+ ),
+ },
+ {
+ key: "Callback_URI",
+ label: "Callback URI",
+ url: `${originURL}/auth/github/callback/`,
+ description: (
+ <>
+ We will auto-generate this. Paste this into your Authorized Callback URI field{" "}
+
+ here.
+
+ >
+ ),
+ },
+ ];
+
+ const onSubmit = async (formData: GithubConfigFormValues) => {
+ const payload: Partial<GithubConfigFormValues> = { ...formData };
+
+ await updateInstanceConfigurations(payload)
+ .then((response = []) => {
+ setToast({
+ type: TOAST_TYPE.SUCCESS,
+ title: "Success",
+ message: "Github Configuration Settings updated successfully",
+ });
+ reset({
+ GITHUB_CLIENT_ID: response.find((item) => item.key === "GITHUB_CLIENT_ID")?.value,
+ GITHUB_CLIENT_SECRET: response.find((item) => item.key === "GITHUB_CLIENT_SECRET")?.value,
+ });
+ })
+ .catch((err) => console.error(err));
+ };
+
+ const handleGoBack = (e: React.MouseEvent) => {
+ if (isDirty) {
+ e.preventDefault();
+ setIsDiscardChangesModalOpen(true);
+ }
+ };
+
+ return (
+ <>
+ setIsDiscardChangesModalOpen(false)}
+ />
+
+
+
+
Configuration
+ {GITHUB_FORM_FIELDS.map((field) => (
+
+ ))}
+
+
+
+ {isSubmitting ? "Saving..." : "Save changes"}
+
+
+ Go back
+
+
+
+
+
+
+
Service provider details
+ {GITHUB_SERVICE_FIELD.map((field) => (
+
+ ))}
+
+
+
+
+ >
+ );
+};
diff --git a/admin/app/authentication/github/page.tsx b/admin/app/authentication/github/page.tsx
new file mode 100644
index 000000000..8532910f7
--- /dev/null
+++ b/admin/app/authentication/github/page.tsx
@@ -0,0 +1,114 @@
+"use client";
+
+import { useState } from "react";
+import { observer } from "mobx-react-lite";
+import Image from "next/image";
+import { useTheme } from "next-themes";
+import useSWR from "swr";
+import { Loader, ToggleSwitch, setPromiseToast } from "@plane/ui";
+// components
+import { PageHeader } from "@/components/core";
+// helpers
+import { resolveGeneralTheme } from "@/helpers/common.helper";
+// hooks
+import { useInstance } from "@/hooks/store";
+// icons
+import githubLightModeImage from "@/public/logos/github-black.png";
+import githubDarkModeImage from "@/public/logos/github-white.png";
+// local components
+import { AuthenticationMethodCard } from "../components";
+import { InstanceGithubConfigForm } from "./form";
+
+const InstanceGithubAuthenticationPage = observer(() => {
+ // store
+ const { fetchInstanceConfigurations, formattedConfig, updateInstanceConfigurations } = useInstance();
+ // state
+ const [isSubmitting, setIsSubmitting] = useState(false);
+ // theme
+ const { resolvedTheme } = useTheme();
+ // config
+ const enableGithubConfig = formattedConfig?.IS_GITHUB_ENABLED ?? "";
+
+ useSWR("INSTANCE_CONFIGURATIONS", () => fetchInstanceConfigurations());
+
+ const updateConfig = async (key: "IS_GITHUB_ENABLED", value: string) => {
+ setIsSubmitting(true);
+
+ const payload = {
+ [key]: value,
+ };
+
+ const updateConfigPromise = updateInstanceConfigurations(payload);
+
+ setPromiseToast(updateConfigPromise, {
+ loading: "Saving Configuration...",
+ success: {
+ title: "Configuration saved",
+ message: () => `Github authentication is now ${value ? "active" : "disabled"}.`,
+ },
+ error: {
+ title: "Error",
+ message: () => "Failed to save configuration",
+ },
+ });
+
+ await updateConfigPromise
+ .then(() => {
+ setIsSubmitting(false);
+ })
+ .catch((err) => {
+ console.error(err);
+ setIsSubmitting(false);
+ });
+ };
+ return (
+ <>
+
+
+
+
+ }
+ config={
+
{
+ Boolean(parseInt(enableGithubConfig)) === true
+ ? updateConfig("IS_GITHUB_ENABLED", "0")
+ : updateConfig("IS_GITHUB_ENABLED", "1");
+ }}
+ size="sm"
+ disabled={isSubmitting || !formattedConfig}
+ />
+ }
+ disabled={isSubmitting || !formattedConfig}
+ withBorder={false}
+ />
+
+
+ {formattedConfig ? (
+
+ ) : (
+
+
+
+
+
+
+
+ )}
+
+
+ >
+ );
+});
+
+export default InstanceGithubAuthenticationPage;
diff --git a/admin/app/authentication/google/form.tsx b/admin/app/authentication/google/form.tsx
new file mode 100644
index 000000000..fd2e7c73c
--- /dev/null
+++ b/admin/app/authentication/google/form.tsx
@@ -0,0 +1,211 @@
+import { FC, useState } from "react";
+import isEmpty from "lodash/isEmpty";
+import Link from "next/link";
+import { useForm } from "react-hook-form";
+// types
+import { IFormattedInstanceConfiguration, TInstanceGoogleAuthenticationConfigurationKeys } from "@plane/types";
+// ui
+import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
+// components
+import {
+ ConfirmDiscardModal,
+ ControllerInput,
+ CopyField,
+ TControllerInputFormField,
+ TCopyField,
+} from "@/components/common";
+// helpers
+import { API_BASE_URL, cn } from "@/helpers/common.helper";
+// hooks
+import { useInstance } from "@/hooks/store";
+
+type Props = {
+ config: IFormattedInstanceConfiguration;
+};
+
+type GoogleConfigFormValues = Record<TInstanceGoogleAuthenticationConfigurationKeys, string>;
+
+export const InstanceGoogleConfigForm: FC<Props> = (props) => {
+ const { config } = props;
+ // states
+ const [isDiscardChangesModalOpen, setIsDiscardChangesModalOpen] = useState(false);
+ // store hooks
+ const { updateInstanceConfigurations } = useInstance();
+ // form data
+ const {
+ handleSubmit,
+ control,
+ reset,
+ formState: { errors, isDirty, isSubmitting },
+ } = useForm({
+ defaultValues: {
+ GOOGLE_CLIENT_ID: config["GOOGLE_CLIENT_ID"],
+ GOOGLE_CLIENT_SECRET: config["GOOGLE_CLIENT_SECRET"],
+ },
+ });
+
+ const originURL = !isEmpty(API_BASE_URL) ? API_BASE_URL : typeof window !== "undefined" ? window.location.origin : "";
+
+ const GOOGLE_FORM_FIELDS: TControllerInputFormField[] = [
+ {
+ key: "GOOGLE_CLIENT_ID",
+ type: "text",
+ label: "Client ID",
+ description: (
+ <>
+ Your client ID lives in your Google API Console.{" "}
+
+ Learn more
+
+ >
+ ),
+ placeholder: "840195096245-0p2tstej9j5nc4l8o1ah2dqondscqc1g.apps.googleusercontent.com",
+ error: Boolean(errors.GOOGLE_CLIENT_ID),
+ required: true,
+ },
+ {
+ key: "GOOGLE_CLIENT_SECRET",
+ type: "password",
+ label: "Client secret",
+ description: (
+ <>
+ Your client secret should also be in your Google API Console.{" "}
+
+ Learn more
+
+ >
+ ),
+ placeholder: "GOCShX-ADp4cI0kPqav1gGCBg5bE02E",
+ error: Boolean(errors.GOOGLE_CLIENT_SECRET),
+ required: true,
+ },
+ ];
+
+ const GOOGLE_SERVICE_DETAILS: TCopyField[] = [
+ {
+ key: "Origin_URL",
+ label: "Origin URL",
+ url: originURL,
+ description: (
+
+ We will auto-generate this. Paste this into your Authorized JavaScript origins field. For this OAuth client{" "}
+
+ here.
+
+
+ ),
+ },
+ {
+ key: "Callback_URI",
+ label: "Callback URI",
+ url: `${originURL}/auth/google/callback/`,
+ description: (
+
+ We will auto-generate this. Paste this into your Authorized Redirect URI field. For this OAuth client{" "}
+
+ here.
+
+
+ ),
+ },
+ ];
+
+ const onSubmit = async (formData: GoogleConfigFormValues) => {
+ const payload: Partial<GoogleConfigFormValues> = { ...formData };
+
+ await updateInstanceConfigurations(payload)
+ .then((response = []) => {
+ setToast({
+ type: TOAST_TYPE.SUCCESS,
+ title: "Success",
+ message: "Google Configuration Settings updated successfully",
+ });
+ reset({
+ GOOGLE_CLIENT_ID: response.find((item) => item.key === "GOOGLE_CLIENT_ID")?.value,
+ GOOGLE_CLIENT_SECRET: response.find((item) => item.key === "GOOGLE_CLIENT_SECRET")?.value,
+ });
+ })
+ .catch((err) => console.error(err));
+ };
+
+ const handleGoBack = (e: React.MouseEvent) => {
+ if (isDirty) {
+ e.preventDefault();
+ setIsDiscardChangesModalOpen(true);
+ }
+ };
+
+ return (
+ <>
+ setIsDiscardChangesModalOpen(false)}
+ />
+
+
+
+
Configuration
+ {GOOGLE_FORM_FIELDS.map((field) => (
+
+ ))}
+
+
+
+ {isSubmitting ? "Saving..." : "Save changes"}
+
+
+ Go back
+
+
+
+
+
+
+
Service provider details
+ {GOOGLE_SERVICE_DETAILS.map((field) => (
+
+ ))}
+
+
+
+
+ >
+ );
+};
diff --git a/admin/app/authentication/google/page.tsx b/admin/app/authentication/google/page.tsx
new file mode 100644
index 000000000..fcdcd47ad
--- /dev/null
+++ b/admin/app/authentication/google/page.tsx
@@ -0,0 +1,102 @@
+"use client";
+
+import { useState } from "react";
+import { observer } from "mobx-react-lite";
+import Image from "next/image";
+import useSWR from "swr";
+import { Loader, ToggleSwitch, setPromiseToast } from "@plane/ui";
+// components
+import { PageHeader } from "@/components/core";
+// hooks
+import { useInstance } from "@/hooks/store";
+// icons
+import GoogleLogo from "@/public/logos/google-logo.svg";
+// local components
+import { AuthenticationMethodCard } from "../components";
+import { InstanceGoogleConfigForm } from "./form";
+
+const InstanceGoogleAuthenticationPage = observer(() => {
+ // store
+ const { fetchInstanceConfigurations, formattedConfig, updateInstanceConfigurations } = useInstance();
+ // state
+ const [isSubmitting, setIsSubmitting] = useState(false);
+ // config
+ const enableGoogleConfig = formattedConfig?.IS_GOOGLE_ENABLED ?? "";
+
+ useSWR("INSTANCE_CONFIGURATIONS", () => fetchInstanceConfigurations());
+
+ const updateConfig = async (key: "IS_GOOGLE_ENABLED", value: string) => {
+ setIsSubmitting(true);
+
+ const payload = {
+ [key]: value,
+ };
+
+ const updateConfigPromise = updateInstanceConfigurations(payload);
+
+ setPromiseToast(updateConfigPromise, {
+ loading: "Saving Configuration...",
+ success: {
+ title: "Configuration saved",
+ message: () => `Google authentication is now ${value ? "active" : "disabled"}.`,
+ },
+ error: {
+ title: "Error",
+ message: () => "Failed to save configuration",
+ },
+ });
+
+ await updateConfigPromise
+ .then(() => {
+ setIsSubmitting(false);
+ })
+ .catch((err) => {
+ console.error(err);
+ setIsSubmitting(false);
+ });
+ };
+ return (
+ <>
+
+
+
+
}
+ config={
+
{
+ Boolean(parseInt(enableGoogleConfig)) === true
+ ? updateConfig("IS_GOOGLE_ENABLED", "0")
+ : updateConfig("IS_GOOGLE_ENABLED", "1");
+ }}
+ size="sm"
+ disabled={isSubmitting || !formattedConfig}
+ />
+ }
+ disabled={isSubmitting || !formattedConfig}
+ withBorder={false}
+ />
+
+
+ {formattedConfig ? (
+
+ ) : (
+
+
+
+
+
+
+
+ )}
+
+
+ >
+ );
+});
+
+export default InstanceGoogleAuthenticationPage;
diff --git a/admin/app/authentication/layout.tsx b/admin/app/authentication/layout.tsx
new file mode 100644
index 000000000..64506ddb4
--- /dev/null
+++ b/admin/app/authentication/layout.tsx
@@ -0,0 +1,11 @@
+import { ReactNode } from "react";
+import { Metadata } from "next";
+import { AdminLayout } from "@/layouts/admin-layout";
+
+export const metadata: Metadata = {
+ title: "Authentication Settings - God Mode",
+};
+
+export default function AuthenticationLayout({ children }: { children: ReactNode }) {
+ return <AdminLayout>{children}</AdminLayout>;
+}
diff --git a/admin/app/authentication/page.tsx b/admin/app/authentication/page.tsx
new file mode 100644
index 000000000..c44b74b49
--- /dev/null
+++ b/admin/app/authentication/page.tsx
@@ -0,0 +1,188 @@
+"use client";
+
+import { useState } from "react";
+import { observer } from "mobx-react-lite";
+import Image from "next/image";
+import { useTheme } from "next-themes";
+import useSWR from "swr";
+import { Mails, KeyRound } from "lucide-react";
+import { TInstanceConfigurationKeys } from "@plane/types";
+import { Loader, ToggleSwitch, setPromiseToast } from "@plane/ui";
+// components
+import { PageHeader } from "@/components/core";
+// hooks
+// helpers
+import { cn, resolveGeneralTheme } from "@/helpers/common.helper";
+import { useInstance } from "@/hooks/store";
+// images
+import githubLightModeImage from "@/public/logos/github-black.png";
+import githubDarkModeImage from "@/public/logos/github-white.png";
+import GoogleLogo from "@/public/logos/google-logo.svg";
+// local components
+import {
+ AuthenticationMethodCard,
+ EmailCodesConfiguration,
+ PasswordLoginConfiguration,
+ GithubConfiguration,
+ GoogleConfiguration,
+} from "./components";
+
+type TInstanceAuthenticationMethodCard = {
+ key: string;
+ name: string;
+ description: string;
+ icon: JSX.Element;
+ config: JSX.Element;
+};
+
+const InstanceAuthenticationPage = observer(() => {
+ // store
+ const { fetchInstanceConfigurations, formattedConfig, updateInstanceConfigurations } = useInstance();
+
+ useSWR("INSTANCE_CONFIGURATIONS", () => fetchInstanceConfigurations());
+
+ // state
+ const [isSubmitting, setIsSubmitting] = useState(false);
+ // theme
+ const { resolvedTheme } = useTheme();
+ // derived values
+ const enableSignUpConfig = formattedConfig?.ENABLE_SIGNUP ?? "";
+
+ const updateConfig = async (key: TInstanceConfigurationKeys, value: string) => {
+ setIsSubmitting(true);
+
+ const payload = {
+ [key]: value,
+ };
+
+ const updateConfigPromise = updateInstanceConfigurations(payload);
+
+ setPromiseToast(updateConfigPromise, {
+ loading: "Saving Configuration...",
+ success: {
+ title: "Success",
+ message: () => "Configuration saved successfully",
+ },
+ error: {
+ title: "Error",
+ message: () => "Failed to save configuration",
+ },
+ });
+
+ await updateConfigPromise
+ .then(() => {
+ setIsSubmitting(false);
+ })
+ .catch((err) => {
+ console.error(err);
+ setIsSubmitting(false);
+ });
+ };
+
+ // Authentication methods
+ const authenticationMethodsCard: TInstanceAuthenticationMethodCard[] = [
+ {
+ key: "email-codes",
+ name: "Email codes",
+ description: "Login or sign up using codes sent via emails. You need to have email setup here and enabled.",
+ icon: ,
+ config: ,
+ },
+ {
+ key: "password-login",
+ name: "Password based login",
+ description: "Allow members to create accounts with passwords for emails to sign in.",
+ icon: ,
+ config: ,
+ },
+ {
+ key: "google",
+ name: "Google",
+ description: "Allow members to login or sign up to plane with their Google accounts.",
+ icon: ,
+ config: ,
+ },
+ {
+ key: "github",
+ name: "Github",
+ description: "Allow members to login or sign up to plane with their Github accounts.",
+ icon: (
+
+ ),
+ config: ,
+ },
+ ];
+
+ return (
+ <>
+
+
+
+
Manage authentication for your instance
+
+ Configure authentication modes for your team and restrict sign ups to be invite only.
+
+
+
+ {formattedConfig ? (
+
+
Sign-up configuration
+
+
+
+
+ Allow anyone to sign up without invite
+
+
+ Toggling this off will disable self sign ups.
+
+
+
+
+
+ {
+ Boolean(parseInt(enableSignUpConfig)) === true
+ ? updateConfig("ENABLE_SIGNUP", "0")
+ : updateConfig("ENABLE_SIGNUP", "1");
+ }}
+ size="sm"
+ disabled={isSubmitting}
+ />
+
+
+
+
Authentication modes
+ {authenticationMethodsCard.map((method) => (
+
+ ))}
+
+ ) : (
+
+
+
+
+
+
+
+ )}
+
+
+ >
+ );
+});
+
+export default InstanceAuthenticationPage;
diff --git a/admin/app/email/email-config-form.tsx b/admin/app/email/email-config-form.tsx
new file mode 100644
index 000000000..8a18b481d
--- /dev/null
+++ b/admin/app/email/email-config-form.tsx
@@ -0,0 +1,222 @@
+import React, { FC, useMemo, useState } from "react";
+import { useForm } from "react-hook-form";
+// types
+import { IFormattedInstanceConfiguration, TInstanceEmailConfigurationKeys } from "@plane/types";
+// ui
+import { Button, CustomSelect, TOAST_TYPE, setToast } from "@plane/ui";
+// components
+import { ControllerInput, TControllerInputFormField } from "@/components/common";
+// hooks
+import { useInstance } from "@/hooks/store";
+// local components
+import { SendTestEmailModal } from "./test-email-modal";
+
+type IInstanceEmailForm = {
+ config: IFormattedInstanceConfiguration;
+};
+
+type EmailFormValues = Record<TInstanceEmailConfigurationKeys, string>;
+
+type TEmailSecurityKeys = "EMAIL_USE_TLS" | "EMAIL_USE_SSL" | "NONE";
+
+const EMAIL_SECURITY_OPTIONS: { [key in TEmailSecurityKeys]: string } = {
+ EMAIL_USE_TLS: "TLS",
+ EMAIL_USE_SSL: "SSL",
+ NONE: "No email security",
+};
+
+export const InstanceEmailForm: FC<IInstanceEmailForm> = (props) => {
+ const { config } = props;
+ // states
+ const [isSendTestEmailModalOpen, setIsSendTestEmailModalOpen] = useState(false);
+ // store hooks
+ const { updateInstanceConfigurations } = useInstance();
+ // form data
+ const {
+ handleSubmit,
+ watch,
+ setValue,
+ control,
+ formState: { errors, isValid, isDirty, isSubmitting },
+ } = useForm({
+ defaultValues: {
+ EMAIL_HOST: config["EMAIL_HOST"],
+ EMAIL_PORT: config["EMAIL_PORT"],
+ EMAIL_HOST_USER: config["EMAIL_HOST_USER"],
+ EMAIL_HOST_PASSWORD: config["EMAIL_HOST_PASSWORD"],
+ EMAIL_USE_TLS: config["EMAIL_USE_TLS"],
+ EMAIL_USE_SSL: config["EMAIL_USE_SSL"],
+ EMAIL_FROM: config["EMAIL_FROM"],
+ },
+ });
+
+ const emailFormFields: TControllerInputFormField[] = [
+ {
+ key: "EMAIL_HOST",
+ type: "text",
+ label: "Host",
+ placeholder: "email.google.com",
+ error: Boolean(errors.EMAIL_HOST),
+ required: true,
+ },
+ {
+ key: "EMAIL_PORT",
+ type: "text",
+ label: "Port",
+ placeholder: "8080",
+ error: Boolean(errors.EMAIL_PORT),
+ required: true,
+ },
+ {
+ key: "EMAIL_FROM",
+ type: "text",
+ label: "Sender email address",
+ description:
+ "This is the email address your users will see when getting emails from this instance. You will need to verify this address.",
+ placeholder: "no-reply@projectplane.so",
+ error: Boolean(errors.EMAIL_FROM),
+ required: true,
+ },
+ ];
+
+ const OptionalEmailFormFields: TControllerInputFormField[] = [
+ {
+ key: "EMAIL_HOST_USER",
+ type: "text",
+ label: "Username",
+ placeholder: "getitdone@projectplane.so",
+ error: Boolean(errors.EMAIL_HOST_USER),
+ required: false,
+ },
+ {
+ key: "EMAIL_HOST_PASSWORD",
+ type: "password",
+ label: "Password",
+ placeholder: "Password",
+ error: Boolean(errors.EMAIL_HOST_PASSWORD),
+ required: false,
+ },
+ ];
+
+ const onSubmit = async (formData: EmailFormValues) => {
+ const payload: Partial<EmailFormValues> = { ...formData };
+
+ await updateInstanceConfigurations(payload)
+ .then(() =>
+ setToast({
+ type: TOAST_TYPE.SUCCESS,
+ title: "Success",
+ message: "Email Settings updated successfully",
+ })
+ )
+ .catch((err) => console.error(err));
+ };
+
+ const useTLSValue = watch("EMAIL_USE_TLS");
+ const useSSLValue = watch("EMAIL_USE_SSL");
+ const emailSecurityKey: TEmailSecurityKeys = useMemo(() => {
+ if (useTLSValue === "1") return "EMAIL_USE_TLS";
+ if (useSSLValue === "1") return "EMAIL_USE_SSL";
+ return "NONE";
+ }, [useTLSValue, useSSLValue]);
+
+ const handleEmailSecurityChange = (key: TEmailSecurityKeys) => {
+ if (key === "EMAIL_USE_SSL") {
+ setValue("EMAIL_USE_TLS", "0");
+ setValue("EMAIL_USE_SSL", "1");
+ }
+ if (key === "EMAIL_USE_TLS") {
+ setValue("EMAIL_USE_TLS", "1");
+ setValue("EMAIL_USE_SSL", "0");
+ }
+ if (key === "NONE") {
+ setValue("EMAIL_USE_TLS", "0");
+ setValue("EMAIL_USE_SSL", "0");
+ }
+ };
+
+ return (
+
+
+
setIsSendTestEmailModalOpen(false)} />
+
+ {emailFormFields.map((field) => (
+
+ ))}
+
+
Email security
+
+ {Object.entries(EMAIL_SECURITY_OPTIONS).map(([key, value]) => (
+
+ {value}
+
+ ))}
+
+
+
+
+
+
+
+
Authentication (optional)
+
+ We recommend setting up a username password for your SMTP server
+
+
+
+
+
+ {OptionalEmailFormFields.map((field) => (
+
+ ))}
+
+
+
+
+
+ {isSubmitting ? "Saving..." : "Save changes"}
+
+ setIsSendTestEmailModalOpen(true)}
+ loading={isSubmitting}
+ disabled={!isValid}
+ >
+ Send test email
+
+
+
+ );
+};
diff --git a/admin/app/email/layout.tsx b/admin/app/email/layout.tsx
new file mode 100644
index 000000000..64f019ec9
--- /dev/null
+++ b/admin/app/email/layout.tsx
@@ -0,0 +1,15 @@
+import { ReactNode } from "react";
+import { Metadata } from "next";
+import { AdminLayout } from "@/layouts/admin-layout";
+
+interface EmailLayoutProps {
+ children: ReactNode;
+}
+
+export const metadata: Metadata = {
+ title: "Email Settings - God Mode",
+};
+
+const EmailLayout = ({ children }: EmailLayoutProps) => <AdminLayout>{children}</AdminLayout>;
+
+export default EmailLayout;
diff --git a/admin/app/email/page.tsx b/admin/app/email/page.tsx
new file mode 100644
index 000000000..198020d4d
--- /dev/null
+++ b/admin/app/email/page.tsx
@@ -0,0 +1,51 @@
+"use client";
+
+import { observer } from "mobx-react-lite";
+import useSWR from "swr";
+import { Loader } from "@plane/ui";
+// components
+import { PageHeader } from "@/components/core";
+// hooks
+import { useInstance } from "@/hooks/store";
+// components
+import { InstanceEmailForm } from "./email-config-form";
+
+const InstanceEmailPage = observer(() => {
+ // store
+ const { fetchInstanceConfigurations, formattedConfig } = useInstance();
+
+ useSWR("INSTANCE_CONFIGURATIONS", () => fetchInstanceConfigurations());
+
+ return (
+ <>
+
+
+
+
Secure emails from your own instance
+
+ Plane can send useful emails to you and your users from your own instance without talking to the Internet.
+
+ Set it up below and please test your settings before you save them.
+ Misconfigs can lead to email bounces and errors.
+
+
+
+
+ {formattedConfig ? (
+
+ ) : (
+
+
+
+
+
+
+
+ )}
+
+
+ >
+ );
+});
+
+export default InstanceEmailPage;
diff --git a/admin/app/email/test-email-modal.tsx b/admin/app/email/test-email-modal.tsx
new file mode 100644
index 000000000..6d5cb8032
--- /dev/null
+++ b/admin/app/email/test-email-modal.tsx
@@ -0,0 +1,135 @@
+import React, { FC, useEffect, useState } from "react";
+import { Dialog, Transition } from "@headlessui/react";
+// ui
+import { Button, Input } from "@plane/ui";
+// services
+import { InstanceService } from "@/services/instance.service";
+
+type Props = {
+ isOpen: boolean;
+ handleClose: () => void;
+};
+
+enum ESendEmailSteps {
+ SEND_EMAIL = "SEND_EMAIL",
+ SUCCESS = "SUCCESS",
+ FAILED = "FAILED",
+}
+
+const instanceService = new InstanceService();
+
+export const SendTestEmailModal: FC<Props> = (props) => {
+ const { isOpen, handleClose } = props;
+
+ // state
+ const [receiverEmail, setReceiverEmail] = useState("");
+ const [sendEmailStep, setSendEmailStep] = useState(ESendEmailSteps.SEND_EMAIL);
+ const [isLoading, setIsLoading] = useState(false);
+ const [error, setError] = useState("");
+
+ // reset state
+ const resetState = () => {
+ setReceiverEmail("");
+ setSendEmailStep(ESendEmailSteps.SEND_EMAIL);
+ setIsLoading(false);
+ setError("");
+ };
+
+ useEffect(() => {
+ if (!isOpen) {
+ resetState();
+ }
+ }, [isOpen]);
+
+ const handleSubmit = async (e: React.MouseEvent) => {
+ e.preventDefault();
+
+ setIsLoading(true);
+ await instanceService
+ .sendTestEmail(receiverEmail)
+ .then(() => {
+ setSendEmailStep(ESendEmailSteps.SUCCESS);
+ })
+ .catch((error) => {
+ setError(error?.error || "Failed to send email");
+ setSendEmailStep(ESendEmailSteps.FAILED);
+ })
+ .finally(() => {
+ setIsLoading(false);
+ });
+ };
+
+ return (
+
+
+
+
+
+
+
+
+
+
+ {sendEmailStep === ESendEmailSteps.SEND_EMAIL
+ ? "Send test email"
+ : sendEmailStep === ESendEmailSteps.SUCCESS
+ ? "Email send"
+ : "Failed"}{" "}
+
+
+ {sendEmailStep === ESendEmailSteps.SEND_EMAIL && (
+
setReceiverEmail(e.target.value)}
+ placeholder="Receiver email"
+ className="w-full resize-none text-lg"
+ tabIndex={1}
+ />
+ )}
+ {sendEmailStep === ESendEmailSteps.SUCCESS && (
+
+
+ We have sent the test email to {receiverEmail}. Please check your spam folder if you cannot find
+ it.
+
+
If you still cannot find it, recheck your SMTP configuration and trigger a new test email.
+
+ )}
+ {sendEmailStep === ESendEmailSteps.FAILED &&
{error}
}
+
+
+ {sendEmailStep === ESendEmailSteps.SEND_EMAIL ? "Cancel" : "Close"}
+
+ {sendEmailStep === ESendEmailSteps.SEND_EMAIL && (
+
+ {isLoading ? "Sending email..." : "Send email"}
+
+ )}
+
+
+
+
+
+
+
+
+ );
+};
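
> Note: the modal above depends on `InstanceService.sendTestEmail` from `@/services/instance.service`, which is not part of this hunk. As a rough sketch of what such a method might look like (the endpoint path, request body shape, and error payload here are assumptions, not taken from the diff):

```ts
// services/instance.service.ts (illustrative sketch only)
export class InstanceService {
  // Sends a test email to the given address via the backend.
  // The endpoint path below is hypothetical; adjust to the actual API route.
  async sendTestEmail(receiverEmail: string): Promise<void> {
    const res = await fetch("/api/instances/email-credentials-check/", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ receiver_email: receiverEmail }),
    });
    if (!res.ok) {
      // Surface the backend error payload so the modal can display it.
      throw await res.json();
    }
  }
}
```
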
diff --git a/admin/app/error.tsx b/admin/app/error.tsx
new file mode 100644
index 000000000..76794e04a
--- /dev/null
+++ b/admin/app/error.tsx
@@ -0,0 +1,9 @@
+"use client";
+
+export default function RootErrorPage() {
+ return (
+
+
Something went wrong.
+
+ );
+}
diff --git a/admin/app/general/form.tsx b/admin/app/general/form.tsx
new file mode 100644
index 000000000..5646084e2
--- /dev/null
+++ b/admin/app/general/form.tsx
@@ -0,0 +1,140 @@
+"use client";
+import { FC } from "react";
+import { observer } from "mobx-react-lite";
+import { Controller, useForm } from "react-hook-form";
+import { Telescope } from "lucide-react";
+// types
+import { IInstance, IInstanceAdmin } from "@plane/types";
+// ui
+import { Button, Input, TOAST_TYPE, ToggleSwitch, setToast } from "@plane/ui";
+// components
+import { ControllerInput } from "@/components/common";
+// hooks
+import { useInstance } from "@/hooks/store";
+
+export interface IGeneralConfigurationForm {
+ instance: IInstance;
+ instanceAdmins: IInstanceAdmin[];
+}
+
+export const GeneralConfigurationForm: FC<IGeneralConfigurationForm> = observer((props) => {
+ const { instance, instanceAdmins } = props;
+ // hooks
+ const { updateInstanceInfo } = useInstance();
+ // form data
+ const {
+ handleSubmit,
+ control,
+ formState: { errors, isSubmitting },
+ } = useForm<Partial<IInstance>>({
+ defaultValues: {
+ instance_name: instance?.instance_name,
+ is_telemetry_enabled: instance?.is_telemetry_enabled,
+ },
+ });
+
+ const onSubmit = async (formData: Partial<IInstance>) => {
+ const payload: Partial<IInstance> = { ...formData };
+
+ console.log("payload", payload);
+
+ await updateInstanceInfo(payload)
+ .then(() =>
+ setToast({
+ type: TOAST_TYPE.SUCCESS,
+ title: "Success",
+ message: "Settings updated successfully",
+ })
+ )
+ .catch((err) => console.error(err));
+ };
+
+ return (
+
+
+
Instance details
+
+
+
+
+
Email
+
+
+
+
+
Instance ID
+
+
+
+
+
+
+
Telemetry
+
+
+
+
+
+ Allow Plane to collect anonymous usage events
+
+
+ We collect usage events without any PII to analyse and improve Plane.{" "}
+
+ Know more.
+
+
+
+
+
+ (
+
+ )}
+ />
+
+
+
+
+
+
+ {isSubmitting ? "Saving..." : "Save changes"}
+
+
+
+ );
+});
diff --git a/admin/app/general/layout.tsx b/admin/app/general/layout.tsx
new file mode 100644
index 000000000..fabbe3640
--- /dev/null
+++ b/admin/app/general/layout.tsx
@@ -0,0 +1,11 @@
+import { ReactNode } from "react";
+import { Metadata } from "next";
+import { AdminLayout } from "@/layouts/admin-layout";
+
+export const metadata: Metadata = {
+ title: "General Settings - God Mode",
+};
+
+export default function GeneralLayout({ children }: { children: ReactNode }) {
+ return <AdminLayout>{children}</AdminLayout>;
+}
diff --git a/admin/app/general/page.tsx b/admin/app/general/page.tsx
new file mode 100644
index 000000000..5aaea9f8e
--- /dev/null
+++ b/admin/app/general/page.tsx
@@ -0,0 +1,31 @@
+"use client";
+import { observer } from "mobx-react-lite";
+// hooks
+import { useInstance } from "@/hooks/store";
+// components
+import { GeneralConfigurationForm } from "./form";
+
+function GeneralPage() {
+ const { instance, instanceAdmins } = useInstance();
+ console.log("instance", instance);
+ return (
+ <>
+
+
+
General settings
+
+ Change the name of your instance and instance admin e-mail addresses. Enable or disable telemetry in your
+ instance.
+
+
+
+ {instance && instanceAdmins && (
+
+ )}
+
+
+ >
+ );
+}
+
+export default observer(GeneralPage);
diff --git a/admin/app/globals.css b/admin/app/globals.css
new file mode 100644
index 000000000..0a2218c21
--- /dev/null
+++ b/admin/app/globals.css
@@ -0,0 +1,432 @@
+@import url("https://fonts.googleapis.com/css2?family=Inter:wght@200;300;400;500;600;700;800&display=swap");
+@import url("https://fonts.googleapis.com/css2?family=Material+Symbols+Rounded:opsz,wght,FILL,GRAD@48,400,0,0&display=swap");
+
+@tailwind base;
+@tailwind components;
+@tailwind utilities;
+
+@layer components {
+ .text-1\.5xl {
+ font-size: 1.375rem;
+ line-height: 1.875rem;
+ }
+
+ .text-2\.5xl {
+ font-size: 1.75rem;
+ line-height: 2.25rem;
+ }
+}
+
+@layer base {
+ html {
+ font-family: "Inter", sans-serif;
+ }
+
+ :root {
+ color-scheme: light !important;
+
+ --color-primary-10: 236, 241, 255;
+ --color-primary-20: 217, 228, 255;
+ --color-primary-30: 197, 214, 255;
+ --color-primary-40: 178, 200, 255;
+ --color-primary-50: 159, 187, 255;
+ --color-primary-60: 140, 173, 255;
+ --color-primary-70: 121, 159, 255;
+ --color-primary-80: 101, 145, 255;
+ --color-primary-90: 82, 132, 255;
+ --color-primary-100: 63, 118, 255;
+ --color-primary-200: 57, 106, 230;
+ --color-primary-300: 50, 94, 204;
+ --color-primary-400: 44, 83, 179;
+ --color-primary-500: 38, 71, 153;
+ --color-primary-600: 32, 59, 128;
+ --color-primary-700: 25, 47, 102;
+ --color-primary-800: 19, 35, 76;
+ --color-primary-900: 13, 24, 51;
+
+ --color-background-100: 255, 255, 255; /* primary bg */
+ --color-background-90: 247, 247, 247; /* secondary bg */
+ --color-background-80: 232, 232, 232; /* tertiary bg */
+
+ --color-text-100: 23, 23, 23; /* primary text */
+ --color-text-200: 58, 58, 58; /* secondary text */
+ --color-text-300: 82, 82, 82; /* tertiary text */
+ --color-text-400: 163, 163, 163; /* placeholder text */
+
+ --color-scrollbar: 163, 163, 163; /* scrollbar thumb */
+
+ --color-border-100: 245, 245, 245; /* subtle border= 1 */
+ --color-border-200: 229, 229, 229; /* subtle border- 2 */
+ --color-border-300: 212, 212, 212; /* strong border- 1 */
+ --color-border-400: 185, 185, 185; /* strong border- 2 */
+
+ --color-shadow-2xs: 0px 0px 1px 0px rgba(23, 23, 23, 0.06), 0px 1px 2px 0px rgba(23, 23, 23, 0.06),
+ 0px 1px 2px 0px rgba(23, 23, 23, 0.14);
+ --color-shadow-xs: 0px 1px 2px 0px rgba(0, 0, 0, 0.16), 0px 2px 4px 0px rgba(16, 24, 40, 0.12),
+ 0px 1px 8px -1px rgba(16, 24, 40, 0.1);
+ --color-shadow-sm: 0px 1px 4px 0px rgba(0, 0, 0, 0.01), 0px 4px 8px 0px rgba(0, 0, 0, 0.02),
+ 0px 1px 12px 0px rgba(0, 0, 0, 0.12);
+ --color-shadow-rg: 0px 3px 6px 0px rgba(0, 0, 0, 0.1), 0px 4px 4px 0px rgba(16, 24, 40, 0.08),
+ 0px 1px 12px 0px rgba(16, 24, 40, 0.04);
+ --color-shadow-md: 0px 4px 8px 0px rgba(0, 0, 0, 0.12), 0px 6px 12px 0px rgba(16, 24, 40, 0.12),
+ 0px 1px 16px 0px rgba(16, 24, 40, 0.12);
+ --color-shadow-lg: 0px 6px 12px 0px rgba(0, 0, 0, 0.12), 0px 8px 16px 0px rgba(0, 0, 0, 0.12),
+ 0px 1px 24px 0px rgba(16, 24, 40, 0.12);
+ --color-shadow-xl: 0px 0px 18px 0px rgba(0, 0, 0, 0.16), 0px 0px 24px 0px rgba(16, 24, 40, 0.16),
+ 0px 0px 52px 0px rgba(16, 24, 40, 0.16);
+ --color-shadow-2xl: 0px 8px 16px 0px rgba(0, 0, 0, 0.12), 0px 12px 24px 0px rgba(16, 24, 40, 0.12),
+ 0px 1px 32px 0px rgba(16, 24, 40, 0.12);
+ --color-shadow-3xl: 0px 12px 24px 0px rgba(0, 0, 0, 0.12), 0px 16px 32px 0px rgba(0, 0, 0, 0.12),
+ 0px 1px 48px 0px rgba(16, 24, 40, 0.12);
+ --color-shadow-4xl: 0px 8px 40px 0px rgba(0, 0, 61, 0.05), 0px 12px 32px -16px rgba(0, 0, 0, 0.05);
+
+ --color-sidebar-background-100: var(--color-background-100); /* primary sidebar bg */
+ --color-sidebar-background-90: var(--color-background-90); /* secondary sidebar bg */
+ --color-sidebar-background-80: var(--color-background-80); /* tertiary sidebar bg */
+
+ --color-sidebar-text-100: var(--color-text-100); /* primary sidebar text */
+ --color-sidebar-text-200: var(--color-text-200); /* secondary sidebar text */
+ --color-sidebar-text-300: var(--color-text-300); /* tertiary sidebar text */
+ --color-sidebar-text-400: var(--color-text-400); /* sidebar placeholder text */
+
+ --color-sidebar-border-100: var(--color-border-100); /* subtle sidebar border= 1 */
+ --color-sidebar-border-200: var(--color-border-100); /* subtle sidebar border- 2 */
+ --color-sidebar-border-300: var(--color-border-100); /* strong sidebar border- 1 */
+ --color-sidebar-border-400: var(--color-border-100); /* strong sidebar border- 2 */
+
+ --color-sidebar-shadow-2xs: var(--color-shadow-2xs);
+ --color-sidebar-shadow-xs: var(--color-shadow-xs);
+ --color-sidebar-shadow-sm: var(--color-shadow-sm);
+ --color-sidebar-shadow-rg: var(--color-shadow-rg);
+ --color-sidebar-shadow-md: var(--color-shadow-md);
+ --color-sidebar-shadow-lg: var(--color-shadow-lg);
+ --color-sidebar-shadow-xl: var(--color-shadow-xl);
+ --color-sidebar-shadow-2xl: var(--color-shadow-2xl);
+ --color-sidebar-shadow-3xl: var(--color-shadow-3xl);
+ --color-sidebar-shadow-4xl: var(--color-shadow-4xl);
+ }
+
+ [data-theme="light"],
+ [data-theme="light-contrast"] {
+ color-scheme: light !important;
+
+ --color-background-100: 255, 255, 255; /* primary bg */
+ --color-background-90: 247, 247, 247; /* secondary bg */
+ --color-background-80: 232, 232, 232; /* tertiary bg */
+ }
+
+ [data-theme="light"] {
+ --color-text-100: 23, 23, 23; /* primary text */
+ --color-text-200: 58, 58, 58; /* secondary text */
+ --color-text-300: 82, 82, 82; /* tertiary text */
+ --color-text-400: 163, 163, 163; /* placeholder text */
+
+ --color-scrollbar: 163, 163, 163; /* scrollbar thumb */
+
+ --color-border-100: 245, 245, 245; /* subtle border= 1 */
+ --color-border-200: 229, 229, 229; /* subtle border- 2 */
+ --color-border-300: 212, 212, 212; /* strong border- 1 */
+ --color-border-400: 185, 185, 185; /* strong border- 2 */
+
+ /* onboarding colors */
+ --gradient-onboarding-100: linear-gradient(106deg, #f2f6ff 29.8%, #e1eaff 99.34%);
+ --gradient-onboarding-200: linear-gradient(129deg, rgba(255, 255, 255, 0) -22.23%, rgba(255, 255, 255, 0.8) 62.98%);
+ --gradient-onboarding-300: linear-gradient(164deg, #fff 4.25%, rgba(255, 255, 255, 0.06) 93.5%);
+ --gradient-onboarding-400: linear-gradient(129deg, rgba(255, 255, 255, 0) -22.23%, rgba(255, 255, 255, 0.8) 62.98%);
+
+ --color-onboarding-text-100: 23, 23, 23;
+ --color-onboarding-text-200: 58, 58, 58;
+ --color-onboarding-text-300: 82, 82, 82;
+ --color-onboarding-text-400: 163, 163, 163;
+
+ --color-onboarding-background-100: 236, 241, 255;
+ --color-onboarding-background-200: 255, 255, 255;
+ --color-onboarding-background-300: 236, 241, 255;
+ --color-onboarding-background-400: 177, 206, 250;
+
+ --color-onboarding-border-100: 229, 229, 229;
+ --color-onboarding-border-200: 217, 228, 255;
+ --color-onboarding-border-300: 229, 229, 229, 0.5;
+
+ --color-onboarding-shadow-sm: 0px 4px 20px 0px rgba(126, 139, 171, 0.1);
+
+ /* toast theme */
+ --color-toast-success-text: 62, 155, 79;
+ --color-toast-error-text: 220, 62, 66;
+ --color-toast-warning-text: 255, 186, 24;
+ --color-toast-info-text: 51, 88, 212;
+ --color-toast-loading-text: 28, 32, 36;
+ --color-toast-secondary-text: 128, 131, 141;
+ --color-toast-tertiary-text: 96, 100, 108;
+
+ --color-toast-success-background: 253, 253, 254;
+ --color-toast-error-background: 255, 252, 252;
+ --color-toast-warning-background: 254, 253, 251;
+ --color-toast-info-background: 253, 253, 254;
+ --color-toast-loading-background: 253, 253, 254;
+
+ --color-toast-success-border: 218, 241, 219;
+ --color-toast-error-border: 255, 219, 220;
+ --color-toast-warning-border: 255, 247, 194;
+ --color-toast-info-border: 210, 222, 255;
+ --color-toast-loading-border: 224, 225, 230;
+ }
+
+ [data-theme="light-contrast"] {
+ --color-text-100: 11, 11, 11; /* primary text */
+ --color-text-200: 38, 38, 38; /* secondary text */
+ --color-text-300: 58, 58, 58; /* tertiary text */
+ --color-text-400: 115, 115, 115; /* placeholder text */
+
+ --color-scrollbar: 115, 115, 115; /* scrollbar thumb */
+
+  --color-border-100: 34, 34, 34; /* subtle border- 1 */
+ --color-border-200: 38, 38, 38; /* subtle border- 2 */
+ --color-border-300: 46, 46, 46; /* strong border- 1 */
+ --color-border-400: 58, 58, 58; /* strong border- 2 */
+ }
+
+ [data-theme="dark"],
+ [data-theme="dark-contrast"] {
+ color-scheme: dark !important;
+
+ --color-background-100: 25, 25, 25; /* primary bg */
+ --color-background-90: 32, 32, 32; /* secondary bg */
+ --color-background-80: 44, 44, 44; /* tertiary bg */
+
+ --color-shadow-2xs: 0px 0px 1px 0px rgba(0, 0, 0, 0.15), 0px 1px 3px 0px rgba(0, 0, 0, 0.5);
+ --color-shadow-xs: 0px 0px 2px 0px rgba(0, 0, 0, 0.2), 0px 2px 4px 0px rgba(0, 0, 0, 0.5);
+ --color-shadow-sm: 0px 0px 4px 0px rgba(0, 0, 0, 0.2), 0px 2px 6px 0px rgba(0, 0, 0, 0.5);
+ --color-shadow-rg: 0px 0px 6px 0px rgba(0, 0, 0, 0.2), 0px 4px 6px 0px rgba(0, 0, 0, 0.5);
+ --color-shadow-md: 0px 2px 8px 0px rgba(0, 0, 0, 0.2), 0px 4px 8px 0px rgba(0, 0, 0, 0.5);
+ --color-shadow-lg: 0px 4px 12px 0px rgba(0, 0, 0, 0.25), 0px 4px 10px 0px rgba(0, 0, 0, 0.55);
+ --color-shadow-xl: 0px 0px 14px 0px rgba(0, 0, 0, 0.25), 0px 6px 10px 0px rgba(0, 0, 0, 0.55);
+ --color-shadow-2xl: 0px 0px 18px 0px rgba(0, 0, 0, 0.25), 0px 8px 12px 0px rgba(0, 0, 0, 0.6);
+ --color-shadow-3xl: 0px 4px 24px 0px rgba(0, 0, 0, 0.3), 0px 12px 40px 0px rgba(0, 0, 0, 0.65);
+ }
+
+ [data-theme="dark"] {
+ --color-text-100: 229, 229, 229; /* primary text */
+ --color-text-200: 163, 163, 163; /* secondary text */
+ --color-text-300: 115, 115, 115; /* tertiary text */
+ --color-text-400: 82, 82, 82; /* placeholder text */
+
+ --color-scrollbar: 82, 82, 82; /* scrollbar thumb */
+
+  --color-border-100: 34, 34, 34; /* subtle border- 1 */
+ --color-border-200: 38, 38, 38; /* subtle border- 2 */
+ --color-border-300: 46, 46, 46; /* strong border- 1 */
+ --color-border-400: 58, 58, 58; /* strong border- 2 */
+
+ /* onboarding colors */
+ --gradient-onboarding-100: linear-gradient(106deg, #18191b 25.17%, #18191b 99.34%);
+ --gradient-onboarding-200: linear-gradient(129deg, rgba(47, 49, 53, 0.8) -22.23%, rgba(33, 34, 37, 0.8) 62.98%);
+ --gradient-onboarding-300: linear-gradient(167deg, rgba(47, 49, 53, 0.45) 19.22%, #212225 98.48%);
+
+ --color-onboarding-text-100: 237, 238, 240;
+ --color-onboarding-text-200: 176, 180, 187;
+ --color-onboarding-text-300: 118, 123, 132;
+ --color-onboarding-text-400: 105, 110, 119;
+
+ --color-onboarding-background-100: 54, 58, 64;
+ --color-onboarding-background-200: 40, 42, 45;
+ --color-onboarding-background-300: 40, 42, 45;
+ --color-onboarding-background-400: 67, 72, 79;
+
+ --color-onboarding-border-100: 54, 58, 64;
+ --color-onboarding-border-200: 54, 58, 64;
+ --color-onboarding-border-300: 34, 35, 38, 0.5;
+
+ --color-onboarding-shadow-sm: 0px 4px 20px 0px rgba(39, 44, 56, 0.1);
+
+ /* toast theme */
+ --color-toast-success-text: 178, 221, 181;
+ --color-toast-error-text: 206, 44, 49;
+ --color-toast-warning-text: 255, 186, 24;
+ --color-toast-info-text: 141, 164, 239;
+ --color-toast-loading-text: 255, 255, 255;
+ --color-toast-secondary-text: 185, 187, 198;
+ --color-toast-tertiary-text: 139, 141, 152;
+
+ --color-toast-success-background: 46, 46, 46;
+ --color-toast-error-background: 46, 46, 46;
+ --color-toast-warning-background: 46, 46, 46;
+ --color-toast-info-background: 46, 46, 46;
+ --color-toast-loading-background: 46, 46, 46;
+
+ --color-toast-success-border: 42, 126, 59;
+ --color-toast-error-border: 100, 23, 35;
+ --color-toast-warning-border: 79, 52, 34;
+ --color-toast-info-border: 58, 91, 199;
+ --color-toast-loading-border: 96, 100, 108;
+ }
+
+ [data-theme="dark-contrast"] {
+ --color-text-100: 250, 250, 250; /* primary text */
+ --color-text-200: 241, 241, 241; /* secondary text */
+ --color-text-300: 212, 212, 212; /* tertiary text */
+ --color-text-400: 115, 115, 115; /* placeholder text */
+
+ --color-scrollbar: 115, 115, 115; /* scrollbar thumb */
+
+  --color-border-100: 245, 245, 245; /* subtle border- 1 */
+ --color-border-200: 229, 229, 229; /* subtle border- 2 */
+ --color-border-300: 212, 212, 212; /* strong border- 1 */
+ --color-border-400: 185, 185, 185; /* strong border- 2 */
+ }
+
+ [data-theme="light"],
+ [data-theme="dark"],
+ [data-theme="light-contrast"],
+ [data-theme="dark-contrast"] {
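+    /* primary palette and sidebar aliases below are shared by all four built-in themes */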
+ --color-primary-10: 236, 241, 255;
+ --color-primary-20: 217, 228, 255;
+ --color-primary-30: 197, 214, 255;
+ --color-primary-40: 178, 200, 255;
+ --color-primary-50: 159, 187, 255;
+ --color-primary-60: 140, 173, 255;
+ --color-primary-70: 121, 159, 255;
+ --color-primary-80: 101, 145, 255;
+ --color-primary-90: 82, 132, 255;
+ --color-primary-100: 63, 118, 255;
+ --color-primary-200: 57, 106, 230;
+ --color-primary-300: 50, 94, 204;
+ --color-primary-400: 44, 83, 179;
+ --color-primary-500: 38, 71, 153;
+ --color-primary-600: 32, 59, 128;
+ --color-primary-700: 25, 47, 102;
+ --color-primary-800: 19, 35, 76;
+ --color-primary-900: 13, 24, 51;
+
+ --color-sidebar-background-100: var(--color-background-100); /* primary sidebar bg */
+ --color-sidebar-background-90: var(--color-background-90); /* secondary sidebar bg */
+ --color-sidebar-background-80: var(--color-background-80); /* tertiary sidebar bg */
+
+ --color-sidebar-text-100: var(--color-text-100); /* primary sidebar text */
+ --color-sidebar-text-200: var(--color-text-200); /* secondary sidebar text */
+ --color-sidebar-text-300: var(--color-text-300); /* tertiary sidebar text */
+ --color-sidebar-text-400: var(--color-text-400); /* sidebar placeholder text */
+
+    --color-sidebar-border-100: var(--color-border-100); /* subtle sidebar border- 1 */
+ --color-sidebar-border-200: var(--color-border-200); /* subtle sidebar border- 2 */
+ --color-sidebar-border-300: var(--color-border-300); /* strong sidebar border- 1 */
+ --color-sidebar-border-400: var(--color-border-400); /* strong sidebar border- 2 */
+ }
+}
+
+* {
+ margin: 0;
+ padding: 0;
+ box-sizing: border-box;
+ -webkit-text-size-adjust: 100%;
+ -ms-text-size-adjust: 100%;
+ font-variant-ligatures: none;
+ -webkit-font-variant-ligatures: none;
+ text-rendering: optimizeLegibility;
+ -moz-osx-font-smoothing: grayscale;
+ -webkit-font-smoothing: antialiased;
+}
+
+body {
+ color: rgba(var(--color-text-100));
+}
+
+/* scrollbar style */
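+/* the @-moz-document url-prefix() block below targets Firefox only (scrollbar-width / scrollbar-color); WebKit/Blink browsers are styled through the ::-webkit-scrollbar rules that follow */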
+@-moz-document url-prefix() {
+ * {
+ scrollbar-width: none;
+ }
+ .vertical-scrollbar,
+ .horizontal-scrollbar {
+ scrollbar-width: initial;
+ scrollbar-color: rgba(96, 100, 108, 0.1) transparent;
+ }
+ .vertical-scrollbar:hover,
+ .horizontal-scrollbar:hover {
+ scrollbar-color: rgba(96, 100, 108, 0.25) transparent;
+ }
+ .vertical-scrollbar:active,
+ .horizontal-scrollbar:active {
+ scrollbar-color: rgba(96, 100, 108, 0.7) transparent;
+ }
+}
+
+.vertical-scrollbar {
+ overflow-y: auto;
+}
+.horizontal-scrollbar {
+ overflow-x: auto;
+}
+.vertical-scrollbar::-webkit-scrollbar,
+.horizontal-scrollbar::-webkit-scrollbar {
+ display: block;
+}
+.vertical-scrollbar::-webkit-scrollbar-track,
+.horizontal-scrollbar::-webkit-scrollbar-track {
+ background-color: transparent;
+ border-radius: 9999px;
+}
+.vertical-scrollbar::-webkit-scrollbar-thumb,
+.horizontal-scrollbar::-webkit-scrollbar-thumb {
+ background-clip: padding-box;
+ background-color: rgba(96, 100, 108, 0.1);
+ border-radius: 9999px;
+}
+.vertical-scrollbar:hover::-webkit-scrollbar-thumb,
+.horizontal-scrollbar:hover::-webkit-scrollbar-thumb {
+ background-color: rgba(96, 100, 108, 0.25);
+}
+.vertical-scrollbar::-webkit-scrollbar-thumb:hover,
+.horizontal-scrollbar::-webkit-scrollbar-thumb:hover {
+ background-color: rgba(96, 100, 108, 0.5);
+}
+.vertical-scrollbar::-webkit-scrollbar-thumb:active,
+.horizontal-scrollbar::-webkit-scrollbar-thumb:active {
+ background-color: rgba(96, 100, 108, 0.7);
+}
+.vertical-scrollbar::-webkit-scrollbar-corner,
+.horizontal-scrollbar::-webkit-scrollbar-corner {
+ background-color: transparent;
+}
+.vertical-scrollbar-margin-top-md::-webkit-scrollbar-track {
+ margin-top: 44px;
+}
+
+/* scrollbar sm size */
+.scrollbar-sm::-webkit-scrollbar {
+ height: 12px;
+ width: 12px;
+}
+.scrollbar-sm::-webkit-scrollbar-thumb {
+ border: 3px solid rgba(0, 0, 0, 0);
+}
+/* scrollbar md size */
+.scrollbar-md::-webkit-scrollbar {
+ height: 14px;
+ width: 14px;
+}
+.scrollbar-md::-webkit-scrollbar-thumb {
+ border: 3px solid rgba(0, 0, 0, 0);
+}
+/* scrollbar lg size */
+
+.scrollbar-lg::-webkit-scrollbar {
+ height: 16px;
+ width: 16px;
+}
+.scrollbar-lg::-webkit-scrollbar-thumb {
+ border: 4px solid rgba(0, 0, 0, 0);
+}
+/* end scrollbar style */
+
+/* progress bar */
+.progress-bar {
+ fill: currentColor;
+ color: rgba(var(--color-sidebar-background-100));
+}
+
+::-webkit-input-placeholder,
+::placeholder,
+:-ms-input-placeholder {
+ color: rgb(var(--color-text-400));
+}
diff --git a/admin/app/image/form.tsx b/admin/app/image/form.tsx
new file mode 100644
index 000000000..a6fe2945b
--- /dev/null
+++ b/admin/app/image/form.tsx
@@ -0,0 +1,79 @@
+import { FC } from "react";
+import { useForm } from "react-hook-form";
+import { IFormattedInstanceConfiguration, TInstanceImageConfigurationKeys } from "@plane/types";
+import { Button, TOAST_TYPE, setToast } from "@plane/ui";
+// components
+import { ControllerInput } from "@/components/common";
+// hooks
+import { useInstance } from "@/hooks/store";
+
+type IInstanceImageConfigForm = {
+ config: IFormattedInstanceConfiguration;
+};
+
+type ImageConfigFormValues = Record<TInstanceImageConfigurationKeys, string>;
+
+export const InstanceImageConfigForm: FC<IInstanceImageConfigForm> = (props) => {
+ const { config } = props;
+ // store hooks
+ const { updateInstanceConfigurations } = useInstance();
+ // form data
+ const {
+ handleSubmit,
+ control,
+ formState: { errors, isSubmitting },
+ } = useForm({
+ defaultValues: {
+ UNSPLASH_ACCESS_KEY: config["UNSPLASH_ACCESS_KEY"],
+ },
+ });
+
+ const onSubmit = async (formData: ImageConfigFormValues) => {
+    const payload: Partial<IFormattedInstanceConfiguration> = { ...formData };
+
+ await updateInstanceConfigurations(payload)
+ .then(() =>
+ setToast({
+ type: TOAST_TYPE.SUCCESS,
+ title: "Success",
+ message: "Image Configuration Settings updated successfully",
+ })
+ )
+ .catch((err) => console.error(err));
+ };
+
+ return (
+
+
+
+ You will find your access key in your Unsplash developer console.
+
+ Learn more.
+
+ >
+ }
+ placeholder="oXgq-sdfadsaeweqasdfasdf3234234rassd"
+ error={Boolean(errors.UNSPLASH_ACCESS_KEY)}
+ required
+ />
+
+
+
+
+ {isSubmitting ? "Saving..." : "Save changes"}
+
+
+
+ );
+};
diff --git a/admin/app/image/layout.tsx b/admin/app/image/layout.tsx
new file mode 100644
index 000000000..18e9343b5
--- /dev/null
+++ b/admin/app/image/layout.tsx
@@ -0,0 +1,15 @@
+import { ReactNode } from "react";
+import { Metadata } from "next";
+import { AdminLayout } from "@/layouts/admin-layout";
+
+interface ImageLayoutProps {
+ children: ReactNode;
+}
+
+export const metadata: Metadata = {
+ title: "Images Settings - God Mode",
+};
+
+const ImageLayout = ({ children }: ImageLayoutProps) => <AdminLayout>{children}</AdminLayout>;
+
+export default ImageLayout;
diff --git a/admin/app/image/page.tsx b/admin/app/image/page.tsx
new file mode 100644
index 000000000..ceaad61f2
--- /dev/null
+++ b/admin/app/image/page.tsx
@@ -0,0 +1,44 @@
+"use client";
+
+import { observer } from "mobx-react-lite";
+import useSWR from "swr";
+import { Loader } from "@plane/ui";
+// components
+import { PageHeader } from "@/components/core";
+// hooks
+import { useInstance } from "@/hooks/store";
+// local
+import { InstanceImageConfigForm } from "./form";
+
+const InstanceImagePage = observer(() => {
+ // store
+ const { formattedConfig, fetchInstanceConfigurations } = useInstance();
+
+ useSWR("INSTANCE_CONFIGURATIONS", () => fetchInstanceConfigurations());
+
+ return (
+ <>
+
+
+
+
Third-party image libraries
+
+ Let your users search and choose images from third-party libraries
+
+
+
+ {formattedConfig ? (
+
+ ) : (
+
+
+
+
+ )}
+
+
+ >
+ );
+});
+
+export default InstanceImagePage;
diff --git a/admin/app/layout.tsx b/admin/app/layout.tsx
new file mode 100644
index 000000000..e79d0bac8
--- /dev/null
+++ b/admin/app/layout.tsx
@@ -0,0 +1,48 @@
+"use client";
+
+import { ReactNode } from "react";
+import { ThemeProvider, useTheme } from "next-themes";
+import { SWRConfig } from "swr";
+// ui
+import { Toast } from "@plane/ui";
+// constants
+import { SWR_CONFIG } from "@/constants/swr-config";
+// helpers
+import { ASSET_PREFIX, resolveGeneralTheme } from "@/helpers/common.helper";
+// lib
+import { InstanceProvider } from "@/lib/instance-provider";
+import { StoreProvider } from "@/lib/store-provider";
+import { UserProvider } from "@/lib/user-provider";
+// styles
+import "./globals.css";
+
+function RootLayout({ children }: { children: ReactNode }) {
+ // themes
+ const { resolvedTheme } = useTheme();
+
+ return (
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ {children}
+
+
+
+
+
+
+ );
+}
+
+export default RootLayout;
diff --git a/admin/app/page.tsx b/admin/app/page.tsx
new file mode 100644
index 000000000..b402fc44d
--- /dev/null
+++ b/admin/app/page.tsx
@@ -0,0 +1,30 @@
+import { Metadata } from "next";
+// components
+import { InstanceSignInForm } from "@/components/login";
+// layouts
+import { DefaultLayout } from "@/layouts/default-layout";
+
+export const metadata: Metadata = {
+ title: "Plane | Simple, extensible, open-source project management tool.",
+ description:
+ "Open-source project management tool to manage issues, sprints, and product roadmaps with peace of mind.",
+ openGraph: {
+ title: "Plane | Simple, extensible, open-source project management tool.",
+ description:
+ "Open-source project management tool to manage issues, sprints, and product roadmaps with peace of mind.",
+ url: "https://plane.so/",
+ },
+ keywords:
+ "software development, customer feedback, software, accelerate, code management, release management, project management, issue tracking, agile, scrum, kanban, collaboration",
+ twitter: {
+ site: "@planepowers",
+ },
+};
+
+export default async function LoginPage() {
+ return (
+
+
+
+ );
+}
diff --git a/web/components/instance/help-section.tsx b/admin/components/admin-sidebar/help-section.tsx
similarity index 52%
rename from web/components/instance/help-section.tsx
rename to admin/components/admin-sidebar/help-section.tsx
index 635dc8264..d2b3cc492 100644
--- a/web/components/instance/help-section.tsx
+++ b/admin/components/admin-sidebar/help-section.tsx
@@ -1,11 +1,16 @@
+"use client";
+
import { FC, useState, useRef } from "react";
-import { Transition } from "@headlessui/react";
+import { observer } from "mobx-react-lite";
import Link from "next/link";
-// mobx store
-import { useMobxStore } from "lib/mobx/store-provider";
-// icons
-import { FileText, HelpCircle, MessagesSquare, MoveLeft } from "lucide-react";
-import { DiscordIcon, GithubIcon } from "@plane/ui";
+import { ExternalLink, FileText, HelpCircle, MoveLeft } from "lucide-react";
+import { Transition } from "@headlessui/react";
+// ui
+import { DiscordIcon, GithubIcon, Tooltip } from "@plane/ui";
+// helpers
+import { WEB_BASE_URL, cn } from "@/helpers/common.helper";
+// hooks
+import { useTheme } from "@/hooks/store";
// assets
import packageJson from "package.json";
@@ -25,56 +30,59 @@ const helpOptions = [
href: "https://github.com/makeplane/plane/issues/new/choose",
Icon: GithubIcon,
},
- {
- name: "Chat with us",
- href: null,
- onClick: () => (window as any).$crisp.push(["do", "chat:show"]),
- Icon: MessagesSquare,
- },
];
-export const InstanceHelpSection: FC = () => {
+export const HelpSection: FC = observer(() => {
// states
const [isNeedHelpOpen, setIsNeedHelpOpen] = useState(false);
// store
- const {
- theme: { sidebarCollapsed, toggleSidebar },
- } = useMobxStore();
+ const { isSidebarCollapsed, toggleSidebar } = useTheme();
// refs
const helpOptionsRef = useRef(null);
+ const redirectionLink = encodeURI(WEB_BASE_URL + "/");
+
return (
-
-
setIsNeedHelpOpen((prev) => !prev)}
- >
-
-
-
toggleSidebar()}
- >
-
-
-
toggleSidebar()}
- >
-
-
+
@@ -89,12 +97,12 @@ export const InstanceHelpSection: FC = () => {
>
- {helpOptions.map(({ name, Icon, href, onClick }) => {
+ {helpOptions.map(({ name, Icon, href }) => {
if (href)
return (
@@ -111,7 +119,6 @@ export const InstanceHelpSection: FC = () => {
@@ -128,4 +135,4 @@ export const InstanceHelpSection: FC = () => {
);
-};
+});
diff --git a/admin/components/admin-sidebar/index.ts b/admin/components/admin-sidebar/index.ts
new file mode 100644
index 000000000..e800fe3c5
--- /dev/null
+++ b/admin/components/admin-sidebar/index.ts
@@ -0,0 +1,5 @@
+export * from "./root";
+export * from "./help-section";
+export * from "./sidebar-menu";
+export * from "./sidebar-dropdown";
+export * from "./sidebar-menu-hamburger-toogle";
diff --git a/admin/components/admin-sidebar/root.tsx b/admin/components/admin-sidebar/root.tsx
new file mode 100644
index 000000000..ff94bf228
--- /dev/null
+++ b/admin/components/admin-sidebar/root.tsx
@@ -0,0 +1,57 @@
+"use client";
+
+import { FC, useEffect, useRef } from "react";
+import { observer } from "mobx-react-lite";
+// hooks
+import { HelpSection, SidebarMenu, SidebarDropdown } from "@/components/admin-sidebar";
+import { useTheme } from "@/hooks/store";
+import useOutsideClickDetector from "hooks/use-outside-click-detector";
+// components
+
+export interface IInstanceSidebar {}
+
+export const InstanceSidebar: FC<IInstanceSidebar> = observer(() => {
+ // store
+ const { isSidebarCollapsed, toggleSidebar } = useTheme();
+
+ const ref = useRef(null);
+
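+  // collapse the sidebar when a click lands outside of it on small screens (< 768px)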
+ useOutsideClickDetector(ref, () => {
+ if (isSidebarCollapsed === false) {
+ if (window.innerWidth < 768) {
+ toggleSidebar(!isSidebarCollapsed);
+ }
+ }
+ });
+
+ useEffect(() => {
+ const handleResize = () => {
+ if (window.innerWidth <= 768) {
+ toggleSidebar(true);
+ }
+ };
+ handleResize();
+ window.addEventListener("resize", handleResize);
+ return () => {
+ window.removeEventListener("resize", handleResize);
+ };
+ }, [toggleSidebar]);
+
+ return (
+
+ );
+});
diff --git a/admin/components/admin-sidebar/sidebar-dropdown.tsx b/admin/components/admin-sidebar/sidebar-dropdown.tsx
new file mode 100644
index 000000000..84583e24b
--- /dev/null
+++ b/admin/components/admin-sidebar/sidebar-dropdown.tsx
@@ -0,0 +1,147 @@
+"use client";
+
+import { Fragment, useEffect, useState } from "react";
+import { observer } from "mobx-react-lite";
+import { useTheme as useNextTheme } from "next-themes";
+import { LogOut, UserCog2, Palette } from "lucide-react";
+import { Menu, Transition } from "@headlessui/react";
+import { Avatar } from "@plane/ui";
+// hooks
+import { API_BASE_URL, cn } from "@/helpers/common.helper";
+import { useTheme, useUser } from "@/hooks/store";
+// helpers
+// services
+import { AuthService } from "@/services/auth.service";
+
+// service initialization
+const authService = new AuthService();
+
+export const SidebarDropdown = observer(() => {
+ // store hooks
+ const { isSidebarCollapsed } = useTheme();
+ const { currentUser, signOut } = useUser();
+ // hooks
+ const { resolvedTheme, setTheme } = useNextTheme();
+ // state
+  const [csrfToken, setCsrfToken] = useState<string | undefined>(undefined);
+
+ const handleThemeSwitch = () => {
+ const newTheme = resolvedTheme === "dark" ? "light" : "dark";
+ setTheme(newTheme);
+ };
+
+ const handleSignOut = () => signOut();
+
+ const getSidebarMenuItems = () => (
+
+
+ {currentUser?.email}
+
+
+
+
+ Switch to {resolvedTheme === "dark" ? "light" : "dark"} mode
+
+
+
+
+
+
+ );
+
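+  // request a CSRF token lazily; the undefined check prevents refetching once it has been set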
+ useEffect(() => {
+ if (csrfToken === undefined)
+ authService.requestCSRFToken().then((data) => data?.csrf_token && setCsrfToken(data.csrf_token));
+ }, [csrfToken]);
+
+ return (
+
+
+
+
+
+
+
+
+
+ {isSidebarCollapsed && (
+
+ {getSidebarMenuItems()}
+
+ )}
+
+
+ {!isSidebarCollapsed && (
+
+
Instance admin
+
+ )}
+
+
+
+ {!isSidebarCollapsed && currentUser && (
+
+
+
+
+
+
+ {getSidebarMenuItems()}
+
+
+ )}
+
+ );
+});
diff --git a/admin/components/admin-sidebar/sidebar-menu-hamburger-toogle.tsx b/admin/components/admin-sidebar/sidebar-menu-hamburger-toogle.tsx
new file mode 100644
index 000000000..2e8539488
--- /dev/null
+++ b/admin/components/admin-sidebar/sidebar-menu-hamburger-toogle.tsx
@@ -0,0 +1,20 @@
+"use client";
+
+import { FC } from "react";
+import { observer } from "mobx-react-lite";
+// hooks
+import { Menu } from "lucide-react";
+import { useTheme } from "@/hooks/store";
+// icons
+
+export const SidebarHamburgerToggle: FC = observer(() => {
+ const { isSidebarCollapsed, toggleSidebar } = useTheme();
+ return (
+ toggleSidebar(!isSidebarCollapsed)}
+ >
+
+
+ );
+});
diff --git a/admin/components/admin-sidebar/sidebar-menu.tsx b/admin/components/admin-sidebar/sidebar-menu.tsx
new file mode 100644
index 000000000..a821243b8
--- /dev/null
+++ b/admin/components/admin-sidebar/sidebar-menu.tsx
@@ -0,0 +1,104 @@
+"use client";
+
+import { observer } from "mobx-react-lite";
+import Link from "next/link";
+import { usePathname } from "next/navigation";
+import { Image, BrainCog, Cog, Lock, Mail } from "lucide-react";
+import { Tooltip } from "@plane/ui";
+// hooks
+import { cn } from "@/helpers/common.helper";
+import { useTheme } from "@/hooks/store";
+// helpers
+
+const INSTANCE_ADMIN_LINKS = [
+ {
+ Icon: Cog,
+ name: "General",
+ description: "Identify your instances and get key details",
+ href: `/general/`,
+ },
+ {
+ Icon: Mail,
+ name: "Email",
+ description: "Set up emails to your users",
+ href: `/email/`,
+ },
+ {
+ Icon: Lock,
+ name: "Authentication",
+ description: "Configure authentication modes",
+ href: `/authentication/`,
+ },
+ {
+ Icon: BrainCog,
+ name: "Artificial intelligence",
+ description: "Configure your OpenAI creds",
+ href: `/ai/`,
+ },
+ {
+ Icon: Image,
+ name: "Images in Plane",
+ description: "Allow third-party image libraries",
+ href: `/image/`,
+ },
+];
+
+export const SidebarMenu = observer(() => {
+ // store hooks
+ const { isSidebarCollapsed, toggleSidebar } = useTheme();
+ // router
+ const pathName = usePathname();
+
+ const handleItemClick = () => {
+ if (window.innerWidth < 768) {
+ toggleSidebar(!isSidebarCollapsed);
+ }
+ };
+
+ return (
+
+ {INSTANCE_ADMIN_LINKS.map((item, index) => {
+ const isActive = item.href === pathName || pathName.includes(item.href);
+ return (
+
+
+
+
+ {
}
+ {!isSidebarCollapsed && (
+
+
+ {item.name}
+
+
+ {item.description}
+
+
+ )}
+
+
+
+
+ );
+ })}
+
+ );
+});
diff --git a/admin/components/auth-header.tsx b/admin/components/auth-header.tsx
new file mode 100644
index 000000000..4becf928f
--- /dev/null
+++ b/admin/components/auth-header.tsx
@@ -0,0 +1,90 @@
+"use client";
+
+import { FC } from "react";
+import { observer } from "mobx-react-lite";
+import { usePathname } from "next/navigation";
+// mobx
+// ui
+import { Settings } from "lucide-react";
+// icons
+import { Breadcrumbs } from "@plane/ui";
+// components
+import { SidebarHamburgerToggle } from "@/components/admin-sidebar";
+import { BreadcrumbLink } from "components/common";
+
+export const InstanceHeader: FC = observer(() => {
+ const pathName = usePathname();
+
+ const getHeaderTitle = (pathName: string) => {
+ switch (pathName) {
+ case "general":
+ return "General";
+ case "ai":
+ return "Artificial Intelligence";
+ case "email":
+ return "Email";
+ case "authentication":
+ return "Authentication";
+ case "image":
+ return "Image";
+ case "google":
+ return "Google";
+ case "github":
+ return "Github";
+ default:
+ return pathName.toUpperCase();
+ }
+ };
+
+ // Function to dynamically generate breadcrumb items based on pathname
+ const generateBreadcrumbItems = (pathname: string) => {
+ const pathSegments = pathname.split("/").slice(1); // removing the first empty string.
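+    // drop the last segment as well (an empty string when the route ends with a trailing slash)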
+ pathSegments.pop();
+
+ let currentUrl = "";
+ const breadcrumbItems = pathSegments.map((segment) => {
+ currentUrl += "/" + segment;
+ return {
+ title: getHeaderTitle(segment),
+ href: currentUrl,
+ };
+ });
+ return breadcrumbItems;
+ };
+
+ const breadcrumbItems = generateBreadcrumbItems(pathName);
+
+ return (
+
+
+
+ {breadcrumbItems.length >= 0 && (
+
+
+ }
+ />
+ }
+ />
+ {breadcrumbItems.map(
+ (item) =>
+ item.title && (
+ }
+ />
+ )
+ )}
+
+
+ )}
+
+
+ );
+});
diff --git a/admin/components/common/banner.tsx b/admin/components/common/banner.tsx
new file mode 100644
index 000000000..932a0c629
--- /dev/null
+++ b/admin/components/common/banner.tsx
@@ -0,0 +1,32 @@
+import { FC } from "react";
+import { AlertCircle, CheckCircle2 } from "lucide-react";
+
+type TBanner = {
+ type: "success" | "error";
+ message: string;
+};
+
+export const Banner: FC<TBanner> = (props) => {
+ const { type, message } = props;
+
+ return (
+
+
+
+ {type === "error" ? (
+
+
+
+ ) : (
+
+ )}
+
+
+
+
+ );
+};
diff --git a/admin/components/common/breadcrumb-link.tsx b/admin/components/common/breadcrumb-link.tsx
new file mode 100644
index 000000000..dfa437231
--- /dev/null
+++ b/admin/components/common/breadcrumb-link.tsx
@@ -0,0 +1,36 @@
+import Link from "next/link";
+import { Tooltip } from "@plane/ui";
+
+type Props = {
+ label?: string;
+ href?: string;
+ icon?: React.ReactNode | undefined;
+};
+
+export const BreadcrumbLink: React.FC<Props> = (props) => {
+ const { href, label, icon } = props;
+ return (
+
+
+
+ {href ? (
+
+ {icon && (
+
{icon}
+ )}
+
{label}
+
+ ) : (
+
+ {icon &&
{icon}
}
+
{label}
+
+ )}
+
+
+
+ );
+};
diff --git a/admin/components/common/confirm-discard-modal.tsx b/admin/components/common/confirm-discard-modal.tsx
new file mode 100644
index 000000000..64e4d7a08
--- /dev/null
+++ b/admin/components/common/confirm-discard-modal.tsx
@@ -0,0 +1,83 @@
+import React from "react";
+import Link from "next/link";
+// headless ui
+import { Dialog, Transition } from "@headlessui/react";
+// ui
+import { Button, getButtonStyling } from "@plane/ui";
+
+type Props = {
+ isOpen: boolean;
+ handleClose: () => void;
+ onDiscardHref: string;
+};
+
+export const ConfirmDiscardModal: React.FC<Props> = (props) => {
+ const { isOpen, handleClose, onDiscardHref } = props;
+
+ return (
+
+
+
+
+
+
+
+
+
+
+
+
+
+ You have unsaved changes
+
+
+
+ Changes you made will be lost if you go back. Do you
+ wish to go back?
+
+
+
+
+
+
+
+ Keep editing
+
+
+ Go back
+
+
+
+
+
+
+
+
+ );
+};
diff --git a/admin/components/common/controller-input.tsx b/admin/components/common/controller-input.tsx
new file mode 100644
index 000000000..0eb215095
--- /dev/null
+++ b/admin/components/common/controller-input.tsx
@@ -0,0 +1,86 @@
+"use client";
+
+import React, { useState } from "react";
+import { Controller, Control } from "react-hook-form";
+// icons
+import { Eye, EyeOff } from "lucide-react";
+// ui
+import { Input } from "@plane/ui";
+// helpers
+import { cn } from "@/helpers/common.helper";
+
+type Props = {
+ control: Control;
+ type: "text" | "password";
+ name: string;
+ label: string;
+ description?: string | JSX.Element;
+ placeholder: string;
+ error: boolean;
+ required: boolean;
+};
+
+export type TControllerInputFormField = {
+ key: string;
+ type: "text" | "password";
+ label: string;
+ description?: string | JSX.Element;
+ placeholder: string;
+ error: boolean;
+ required: boolean;
+};
+
+export const ControllerInput: React.FC<Props> = (props) => {
+ const { name, control, type, label, description, placeholder, error, required } = props;
+ // states
+ const [showPassword, setShowPassword] = useState(false);
+
+ return (
+
+
+ {label} {!required && "(optional)"}
+
+
+ (
+
+ )}
+ />
+ {type === "password" &&
+ (showPassword ? (
+ setShowPassword(false)}
+ >
+
+
+ ) : (
+ setShowPassword(true)}
+ >
+
+
+ ))}
+
+ {description &&
{description}
}
+
+ );
+};
diff --git a/admin/components/common/copy-field.tsx b/admin/components/common/copy-field.tsx
new file mode 100644
index 000000000..6322356b4
--- /dev/null
+++ b/admin/components/common/copy-field.tsx
@@ -0,0 +1,46 @@
+"use client";
+
+import React from "react";
+// ui
+import { Copy } from "lucide-react";
+import { Button, TOAST_TYPE, setToast } from "@plane/ui";
+// icons
+
+type Props = {
+ label: string;
+ url: string;
+ description: string | JSX.Element;
+};
+
+export type TCopyField = {
+ key: string;
+ label: string;
+ url: string;
+ description: string | JSX.Element;
+};
+
+export const CopyField: React.FC<Props> = (props) => {
+ const { label, url, description } = props;
+
+ return (
+
+
{label}
+
{
+ navigator.clipboard.writeText(url);
+ setToast({
+ type: TOAST_TYPE.INFO,
+ title: "Copied to clipboard",
+ message: `The ${label} has been successfully copied to your clipboard`,
+ });
+ }}
+ >
+ {url}
+
+
+
{description}
+
+ );
+};
diff --git a/admin/components/common/empty-state.tsx b/admin/components/common/empty-state.tsx
new file mode 100644
index 000000000..fbbe0bc0f
--- /dev/null
+++ b/admin/components/common/empty-state.tsx
@@ -0,0 +1,46 @@
+import React from "react";
+import Image from "next/image";
+import { Button } from "@plane/ui";
+
+type Props = {
+ title: string;
+ description?: React.ReactNode;
+ image?: any;
+ primaryButton?: {
+ icon?: any;
+ text: string;
+ onClick: () => void;
+ };
+ secondaryButton?: React.ReactNode;
+ disabled?: boolean;
+};
+
+export const EmptyState: React.FC<Props> = ({
+ title,
+ description,
+ image,
+ primaryButton,
+ secondaryButton,
+ disabled = false,
+}) => (
+
+
+ {image &&
}
+
{title}
+ {description &&
{description}
}
+
+ {primaryButton && (
+
+ {primaryButton.text}
+
+ )}
+ {secondaryButton}
+
+
+
+);
diff --git a/admin/components/common/index.ts b/admin/components/common/index.ts
new file mode 100644
index 000000000..c810cac69
--- /dev/null
+++ b/admin/components/common/index.ts
@@ -0,0 +1,9 @@
+export * from "./breadcrumb-link";
+export * from "./confirm-discard-modal";
+export * from "./controller-input";
+export * from "./copy-field";
+export * from "./password-strength-meter";
+export * from "./banner";
+export * from "./empty-state";
+export * from "./logo-spinner";
+export * from "./toast";
diff --git a/admin/components/common/logo-spinner.tsx b/admin/components/common/logo-spinner.tsx
new file mode 100644
index 000000000..621b685b8
--- /dev/null
+++ b/admin/components/common/logo-spinner.tsx
@@ -0,0 +1,17 @@
+import Image from "next/image";
+import { useTheme } from "next-themes";
+// assets
+import LogoSpinnerDark from "@/public/images/logo-spinner-dark.gif";
+import LogoSpinnerLight from "@/public/images/logo-spinner-light.gif";
+
+export const LogoSpinner = () => {
+ const { resolvedTheme } = useTheme();
+
+ const logoSrc = resolvedTheme === "dark" ? LogoSpinnerDark : LogoSpinnerLight;
+
+ return (
+
+
+
+ );
+};
diff --git a/admin/components/common/password-strength-meter.tsx b/admin/components/common/password-strength-meter.tsx
new file mode 100644
index 000000000..004a927b2
--- /dev/null
+++ b/admin/components/common/password-strength-meter.tsx
@@ -0,0 +1,69 @@
+"use client";
+
+// helpers
+import { CircleCheck } from "lucide-react";
+import { cn } from "@/helpers/common.helper";
+import { getPasswordStrength } from "@/helpers/password.helper";
+// icons
+
+type Props = {
+ password: string;
+};
+
+export const PasswordStrengthMeter: React.FC = (props: Props) => {
+ const { password } = props;
+
+ const strength = getPasswordStrength(password);
+ let bars = [];
+ let text = "";
+ let textColor = "";
+
+ if (password.length === 0) {
+ bars = [`bg-[#F0F0F3]`, `bg-[#F0F0F3]`, `bg-[#F0F0F3]`];
+ text = "Password requirements";
+ } else if (password.length < 8) {
+ bars = [`bg-[#DC3E42]`, `bg-[#F0F0F3]`, `bg-[#F0F0F3]`];
+ text = "Password is too short";
+ textColor = `text-[#DC3E42]`;
+ } else if (strength < 3) {
+ bars = [`bg-[#FFBA18]`, `bg-[#FFBA18]`, `bg-[#F0F0F3]`];
+ text = "Password is weak";
+ textColor = `text-[#FFBA18]`;
+ } else {
+ bars = [`bg-[#3E9B4F]`, `bg-[#3E9B4F]`, `bg-[#3E9B4F]`];
+ text = "Password is strong";
+ textColor = `text-[#3E9B4F]`;
+ }
+
+ const criteria = [
+ { label: "Min 8 characters", isValid: password.length >= 8 },
+ { label: "Min 1 upper-case letter", isValid: /[A-Z]/.test(password) },
+ { label: "Min 1 number", isValid: /\d/.test(password) },
+ { label: "Min 1 special character", isValid: /[!@#$%^&*]/.test(password) },
+ ];
+
+ return (
+
+
+ {bars.map((color, index) => (
+
+ ))}
+
+
{text}
+
+ {criteria.map((criterion, index) => (
+
+
+ {criterion.label}
+
+ ))}
+
+
+ );
+};
diff --git a/admin/components/common/toast.tsx b/admin/components/common/toast.tsx
new file mode 100644
index 000000000..fe4983db6
--- /dev/null
+++ b/admin/components/common/toast.tsx
@@ -0,0 +1,11 @@
+import { useTheme } from "next-themes";
+// ui
+import { Toast as ToastComponent } from "@plane/ui";
+// helpers
+import { resolveGeneralTheme } from "@/helpers/common.helper";
+
+export const Toast = () => {
+ const { theme } = useTheme();
+
+  return <ToastComponent theme={resolveGeneralTheme(theme)} />;
+};
diff --git a/admin/components/core/index.ts b/admin/components/core/index.ts
new file mode 100644
index 000000000..d32aafe96
--- /dev/null
+++ b/admin/components/core/index.ts
@@ -0,0 +1 @@
+export * from "./page-header";
diff --git a/admin/components/core/page-header.tsx b/admin/components/core/page-header.tsx
new file mode 100644
index 000000000..a4b27b92f
--- /dev/null
+++ b/admin/components/core/page-header.tsx
@@ -0,0 +1,17 @@
+"use client";
+
+type TPageHeader = {
+ title?: string;
+ description?: string;
+};
+
+export const PageHeader: React.FC<TPageHeader> = (props) => {
+ const { title = "God Mode - Plane", description = "Plane god mode" } = props;
+
+ return (
+ <>
+ {title}
+
+ >
+ );
+};
diff --git a/admin/components/instance/index.ts b/admin/components/instance/index.ts
new file mode 100644
index 000000000..56d1933f4
--- /dev/null
+++ b/admin/components/instance/index.ts
@@ -0,0 +1,3 @@
+export * from "./instance-not-ready";
+export * from "./instance-failure-view";
+export * from "./setup-form";
diff --git a/admin/components/instance/instance-failure-view.tsx b/admin/components/instance/instance-failure-view.tsx
new file mode 100644
index 000000000..8722929b5
--- /dev/null
+++ b/admin/components/instance/instance-failure-view.tsx
@@ -0,0 +1,42 @@
+"use client";
+import { FC } from "react";
+import Image from "next/image";
+import { useTheme } from "next-themes";
+import { Button } from "@plane/ui";
+// assets
+import InstanceFailureDarkImage from "@/public/instance/instance-failure-dark.svg";
+import InstanceFailureImage from "@/public/instance/instance-failure.svg";
+
+type InstanceFailureViewProps = {
+ // mutate: () => void;
+};
+
+export const InstanceFailureView: FC = () => {
+ const { resolvedTheme } = useTheme();
+
+ const instanceImage = resolvedTheme === "dark" ? InstanceFailureDarkImage : InstanceFailureImage;
+
+ const handleRetry = () => {
+ window.location.reload();
+ };
+
+ return (
+
+
+
+
+
Unable to fetch instance details.
+
+ We were unable to fetch the details of the instance.
+ Fret not, it might just be a connectivity issue.
+
+
+
+
+ Retry
+
+
+
+
+ );
+};
diff --git a/admin/components/instance/instance-not-ready.tsx b/admin/components/instance/instance-not-ready.tsx
new file mode 100644
index 000000000..874013f52
--- /dev/null
+++ b/admin/components/instance/instance-not-ready.tsx
@@ -0,0 +1,30 @@
+"use client";
+
+import { FC } from "react";
+import Image from "next/image";
+import Link from "next/link";
+import { Button } from "@plane/ui";
+// assets
+import PlaneTakeOffImage from "@/public/images/plane-takeoff.png";
+
+export const InstanceNotReady: FC = () => (
+
+
+
+
Welcome aboard Plane!
+
+
+ Get started by setting up your instance and workspace
+
+
+
+
+
+
+ Get started
+
+
+
+
+
+);
diff --git a/admin/components/instance/setup-form.tsx b/admin/components/instance/setup-form.tsx
new file mode 100644
index 000000000..77bf8e562
--- /dev/null
+++ b/admin/components/instance/setup-form.tsx
@@ -0,0 +1,353 @@
+"use client";
+
+import { FC, useEffect, useMemo, useState } from "react";
+import { useSearchParams } from "next/navigation";
+// icons
+import { Eye, EyeOff } from "lucide-react";
+// ui
+import { Button, Checkbox, Input, Spinner } from "@plane/ui";
+// components
+import { Banner, PasswordStrengthMeter } from "@/components/common";
+// helpers
+import { API_BASE_URL } from "@/helpers/common.helper";
+import { getPasswordStrength } from "@/helpers/password.helper";
+// services
+import { AuthService } from "@/services/auth.service";
+
+// service initialization
+const authService = new AuthService();
+
+// error codes
+enum EErrorCodes {
+ INSTANCE_NOT_CONFIGURED = "INSTANCE_NOT_CONFIGURED",
+ ADMIN_ALREADY_EXIST = "ADMIN_ALREADY_EXIST",
+ REQUIRED_EMAIL_PASSWORD_FIRST_NAME = "REQUIRED_EMAIL_PASSWORD_FIRST_NAME",
+ INVALID_EMAIL = "INVALID_EMAIL",
+ INVALID_PASSWORD = "INVALID_PASSWORD",
+ USER_ALREADY_EXISTS = "USER_ALREADY_EXISTS",
+}
+
+type TError = {
+ type: EErrorCodes | undefined;
+ message: string | undefined;
+};
+
+// form data
+type TFormData = {
+ first_name: string;
+ last_name: string;
+ email: string;
+ company_name: string;
+ password: string;
+ confirm_password?: string;
+ is_telemetry_enabled: boolean;
+};
+
+const defaultFormData: TFormData = {
+ first_name: "",
+ last_name: "",
+ email: "",
+ company_name: "",
+ password: "",
+ is_telemetry_enabled: true,
+};
+
+export const InstanceSetupForm: FC = (props) => {
+ const {} = props;
+ // search params
+ const searchParams = useSearchParams();
+ const firstNameParam = searchParams.get("first_name") || undefined;
+ const lastNameParam = searchParams.get("last_name") || undefined;
+ const companyParam = searchParams.get("company") || undefined;
+ const emailParam = searchParams.get("email") || undefined;
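+  // note: the trailing "|| true" below means telemetry is treated as enabled regardless of the query param value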
+ const isTelemetryEnabledParam = (searchParams.get("is_telemetry_enabled") === "True" ? true : false) || true;
+ const errorCode = searchParams.get("error_code") || undefined;
+ const errorMessage = searchParams.get("error_message") || undefined;
+ // state
+ const [showPassword, setShowPassword] = useState({
+ password: false,
+ retypePassword: false,
+ });
+  const [csrfToken, setCsrfToken] = useState<string | undefined>(undefined);
+  const [formData, setFormData] = useState(defaultFormData);
+ const [isPasswordInputFocused, setIsPasswordInputFocused] = useState(false);
+ const [isSubmitting, setIsSubmitting] = useState(false);
+ const [isRetryPasswordInputFocused, setIsRetryPasswordInputFocused] = useState(false);
+
+ const handleShowPassword = (key: keyof typeof showPassword) =>
+ setShowPassword((prev) => ({ ...prev, [key]: !prev[key] }));
+
+ const handleFormChange = (key: keyof TFormData, value: string | boolean) =>
+ setFormData((prev) => ({ ...prev, [key]: value }));
+
+ useEffect(() => {
+ if (csrfToken === undefined)
+ authService.requestCSRFToken().then((data) => data?.csrf_token && setCsrfToken(data.csrf_token));
+ }, [csrfToken]);
+
+ useEffect(() => {
+ if (firstNameParam) setFormData((prev) => ({ ...prev, first_name: firstNameParam }));
+ if (lastNameParam) setFormData((prev) => ({ ...prev, last_name: lastNameParam }));
+ if (companyParam) setFormData((prev) => ({ ...prev, company_name: companyParam }));
+ if (emailParam) setFormData((prev) => ({ ...prev, email: emailParam }));
+ if (isTelemetryEnabledParam) setFormData((prev) => ({ ...prev, is_telemetry_enabled: isTelemetryEnabledParam }));
+ }, [firstNameParam, lastNameParam, companyParam, emailParam, isTelemetryEnabledParam]);
+
+ // derived values
+ const errorData: TError = useMemo(() => {
+ if (errorCode && errorMessage) {
+ switch (errorCode) {
+ case EErrorCodes.INSTANCE_NOT_CONFIGURED:
+ return { type: EErrorCodes.INSTANCE_NOT_CONFIGURED, message: errorMessage };
+ case EErrorCodes.ADMIN_ALREADY_EXIST:
+ return { type: EErrorCodes.ADMIN_ALREADY_EXIST, message: errorMessage };
+ case EErrorCodes.REQUIRED_EMAIL_PASSWORD_FIRST_NAME:
+ return { type: EErrorCodes.REQUIRED_EMAIL_PASSWORD_FIRST_NAME, message: errorMessage };
+ case EErrorCodes.INVALID_EMAIL:
+ return { type: EErrorCodes.INVALID_EMAIL, message: errorMessage };
+ case EErrorCodes.INVALID_PASSWORD:
+ return { type: EErrorCodes.INVALID_PASSWORD, message: errorMessage };
+ case EErrorCodes.USER_ALREADY_EXISTS:
+ return { type: EErrorCodes.USER_ALREADY_EXISTS, message: errorMessage };
+ default:
+ return { type: undefined, message: undefined };
+ }
+ } else return { type: undefined, message: undefined };
+ }, [errorCode, errorMessage]);
+
+ const isButtonDisabled = useMemo(
+ () =>
+ !isSubmitting &&
+ formData.first_name &&
+ formData.email &&
+ formData.password &&
+ getPasswordStrength(formData.password) >= 3 &&
+ formData.password === formData.confirm_password
+ ? false
+ : true,
+ [formData.confirm_password, formData.email, formData.first_name, formData.password, isSubmitting]
+ );
+
+ const password = formData?.password ?? "";
+ const confirmPassword = formData?.confirm_password ?? "";
+ const renderPasswordMatchError = !isRetryPasswordInputFocused || confirmPassword.length >= password.length;
+
+ return (
+
+
+
+
+            Set up your Plane instance
+
+
+ Post setup you will be able to manage this Plane instance.
+
+
+
+ {errorData.type &&
+ errorData?.message &&
+ ![EErrorCodes.INVALID_EMAIL, EErrorCodes.INVALID_PASSWORD].includes(errorData.type) && (
+
+ )}
+
+
+
+
+ );
+};
diff --git a/web/components/instance/setup-form/index.ts b/admin/components/login/index.ts
similarity index 57%
rename from web/components/instance/setup-form/index.ts
rename to admin/components/login/index.ts
index e9a965d6d..bdeb387f3 100644
--- a/web/components/instance/setup-form/index.ts
+++ b/admin/components/login/index.ts
@@ -1,2 +1 @@
-export * from "./root";
export * from "./sign-in-form";
diff --git a/admin/components/login/sign-in-form.tsx b/admin/components/login/sign-in-form.tsx
new file mode 100644
index 000000000..45d448d12
--- /dev/null
+++ b/admin/components/login/sign-in-form.tsx
@@ -0,0 +1,179 @@
+"use client";
+
+import { FC, useEffect, useMemo, useState } from "react";
+import { useSearchParams } from "next/navigation";
+// services
+import { Eye, EyeOff } from "lucide-react";
+import { Button, Input, Spinner } from "@plane/ui";
+// components
+import { Banner } from "@/components/common";
+// helpers
+import { API_BASE_URL } from "@/helpers/common.helper";
+import { AuthService } from "@/services/auth.service";
+// ui
+// icons
+
+// service initialization
+const authService = new AuthService();
+
+// error codes
+enum EErrorCodes {
+ INSTANCE_NOT_CONFIGURED = "INSTANCE_NOT_CONFIGURED",
+ REQUIRED_EMAIL_PASSWORD = "REQUIRED_EMAIL_PASSWORD",
+ INVALID_EMAIL = "INVALID_EMAIL",
+ USER_DOES_NOT_EXIST = "USER_DOES_NOT_EXIST",
+ AUTHENTICATION_FAILED = "AUTHENTICATION_FAILED",
+}
+
+type TError = {
+ type: EErrorCodes | undefined;
+ message: string | undefined;
+};
+
+// form data
+type TFormData = {
+ email: string;
+ password: string;
+};
+
+const defaultFormData: TFormData = {
+ email: "",
+ password: "",
+};
+
+export const InstanceSignInForm: FC = (props) => {
+ const {} = props;
+ // search params
+ const searchParams = useSearchParams();
+ const emailParam = searchParams.get("email") || undefined;
+ const errorCode = searchParams.get("error_code") || undefined;
+ const errorMessage = searchParams.get("error_message") || undefined;
+ // state
+ const [showPassword, setShowPassword] = useState(false);
+  const [csrfToken, setCsrfToken] = useState<string | undefined>(undefined);
+  const [formData, setFormData] = useState(defaultFormData);
+ const [isSubmitting, setIsSubmitting] = useState(false);
+
+ const handleFormChange = (key: keyof TFormData, value: string | boolean) =>
+ setFormData((prev) => ({ ...prev, [key]: value }));
+
+
+ useEffect(() => {
+ if (csrfToken === undefined)
+ authService.requestCSRFToken().then((data) => data?.csrf_token && setCsrfToken(data.csrf_token));
+ }, [csrfToken]);
+
+ useEffect(() => {
+ if (emailParam) setFormData((prev) => ({ ...prev, email: emailParam }));
+ }, [emailParam]);
+
+ // derived values
+ const errorData: TError = useMemo(() => {
+ if (errorCode && errorMessage) {
+ switch (errorCode) {
+ case EErrorCodes.INSTANCE_NOT_CONFIGURED:
+        return { type: EErrorCodes.INSTANCE_NOT_CONFIGURED, message: errorMessage };
+ case EErrorCodes.REQUIRED_EMAIL_PASSWORD:
+ return { type: EErrorCodes.REQUIRED_EMAIL_PASSWORD, message: errorMessage };
+ case EErrorCodes.INVALID_EMAIL:
+ return { type: EErrorCodes.INVALID_EMAIL, message: errorMessage };
+ case EErrorCodes.USER_DOES_NOT_EXIST:
+ return { type: EErrorCodes.USER_DOES_NOT_EXIST, message: errorMessage };
+ case EErrorCodes.AUTHENTICATION_FAILED:
+ return { type: EErrorCodes.AUTHENTICATION_FAILED, message: errorMessage };
+ default:
+ return { type: undefined, message: undefined };
+ }
+ } else return { type: undefined, message: undefined };
+ }, [errorCode, errorMessage]);
+
+ const isButtonDisabled = useMemo(
+ () => (!isSubmitting && formData.email && formData.password ? false : true),
+ [formData.email, formData.password, isSubmitting]
+ );
+
+ return (
+
+
+
+
+ Manage your Plane instance
+
+
+ Configure instance-wide settings to secure your instance
+
+
+
+ {errorData.type && errorData?.message &&
}
+
+
+
+
+ );
+};
diff --git a/admin/components/new-user-popup.tsx b/admin/components/new-user-popup.tsx
new file mode 100644
index 000000000..840de0c3a
--- /dev/null
+++ b/admin/components/new-user-popup.tsx
@@ -0,0 +1,55 @@
+"use client";
+
+import React from "react";
+import { observer } from "mobx-react-lite";
+import Image from "next/image";
+import { useTheme as nextUseTheme } from "next-themes";
+// ui
+import { Button, getButtonStyling } from "@plane/ui";
+// helpers
+import { WEB_BASE_URL, resolveGeneralTheme } from "helpers/common.helper";
+// hooks
+import { useTheme } from "@/hooks/store";
+// icons
+import TakeoffIconLight from "/public/logos/takeoff-icon-light.svg";
+import TakeoffIconDark from "/public/logos/takeoff-icon-dark.svg";
+
+export const NewUserPopup: React.FC = observer(() => {
+ // hooks
+ const { isNewUserPopup, toggleNewUserPopup } = useTheme();
+ // theme
+ const { resolvedTheme } = nextUseTheme();
+
+ const redirectionLink = encodeURI(WEB_BASE_URL + "/create-workspace");
+
+  if (!isNewUserPopup) return <></>;
+ return (
+
+
+
+
Create workspace
+
+            Instance setup done! Welcome to the Plane instance portal. Start your journey by creating your first
+            workspace; you will need to log in again.
+
+
+
+
+
+
+
+
+ );
+});
diff --git a/admin/constants/seo.ts b/admin/constants/seo.ts
new file mode 100644
index 000000000..aafd5f7a3
--- /dev/null
+++ b/admin/constants/seo.ts
@@ -0,0 +1,8 @@
+export const SITE_NAME = "Plane | Simple, extensible, open-source project management tool.";
+export const SITE_TITLE = "Plane | Simple, extensible, open-source project management tool.";
+export const SITE_DESCRIPTION =
+ "Open-source project management tool to manage issues, sprints, and product roadmaps with peace of mind.";
+export const SITE_KEYWORDS =
+ "software development, plan, ship, software, accelerate, code management, release management, project management, issue tracking, agile, scrum, kanban, collaboration";
+export const SITE_URL = "https://app.plane.so/";
+export const TWITTER_USER_NAME = "Plane | Simple, extensible, open-source project management tool.";
diff --git a/admin/constants/swr-config.ts b/admin/constants/swr-config.ts
new file mode 100644
index 000000000..38478fcea
--- /dev/null
+++ b/admin/constants/swr-config.ts
@@ -0,0 +1,8 @@
+export const SWR_CONFIG = {
+ refreshWhenHidden: false,
+ revalidateIfStale: false,
+ revalidateOnFocus: false,
+ revalidateOnMount: true,
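+  // refreshInterval is in milliseconds (600000 ms = 10 minutes)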
+ refreshInterval: 600000,
+ errorRetryCount: 3,
+};
diff --git a/admin/helpers/authentication.helper.tsx b/admin/helpers/authentication.helper.tsx
new file mode 100644
index 000000000..cc9058611
--- /dev/null
+++ b/admin/helpers/authentication.helper.tsx
@@ -0,0 +1,136 @@
+import { ReactNode } from "react";
+import Link from "next/link";
+// helpers
+import { SUPPORT_EMAIL } from "./common.helper";
+
+export enum EPageTypes {
+ PUBLIC = "PUBLIC",
+ NON_AUTHENTICATED = "NON_AUTHENTICATED",
+ SET_PASSWORD = "SET_PASSWORD",
+ ONBOARDING = "ONBOARDING",
+ AUTHENTICATED = "AUTHENTICATED",
+}
+
+export enum EAuthModes {
+ SIGN_IN = "SIGN_IN",
+ SIGN_UP = "SIGN_UP",
+}
+
+export enum EAuthSteps {
+ EMAIL = "EMAIL",
+ PASSWORD = "PASSWORD",
+ UNIQUE_CODE = "UNIQUE_CODE",
+}
+
+export enum EErrorAlertType {
+ BANNER_ALERT = "BANNER_ALERT",
+ INLINE_FIRST_NAME = "INLINE_FIRST_NAME",
+ INLINE_EMAIL = "INLINE_EMAIL",
+ INLINE_PASSWORD = "INLINE_PASSWORD",
+ INLINE_EMAIL_CODE = "INLINE_EMAIL_CODE",
+}
+
+export enum EAuthenticationErrorCodes {
+ // Admin
+ ADMIN_ALREADY_EXIST = "5150",
+ REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME = "5155",
+ INVALID_ADMIN_EMAIL = "5160",
+ INVALID_ADMIN_PASSWORD = "5165",
+ REQUIRED_ADMIN_EMAIL_PASSWORD = "5170",
+ ADMIN_AUTHENTICATION_FAILED = "5175",
+ ADMIN_USER_ALREADY_EXIST = "5180",
+ ADMIN_USER_DOES_NOT_EXIST = "5185",
+ ADMIN_USER_DEACTIVATED = "5190",
+}
+
+export type TAuthErrorInfo = {
+ type: EErrorAlertType;
+ code: EAuthenticationErrorCodes;
+ title: string;
+ message: ReactNode;
+};
+
+const errorCodeMessages: {
+ [key in EAuthenticationErrorCodes]: { title: string; message: (email?: string | undefined) => ReactNode };
+} = {
+ // admin
+ [EAuthenticationErrorCodes.ADMIN_ALREADY_EXIST]: {
+ title: `Admin already exists`,
+ message: () => `Admin already exists. Please try again.`,
+ },
+ [EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME]: {
+ title: `Email, password and first name required`,
+ message: () => `Email, password and first name required. Please try again.`,
+ },
+ [EAuthenticationErrorCodes.INVALID_ADMIN_EMAIL]: {
+ title: `Invalid admin email`,
+ message: () => `Invalid admin email. Please try again.`,
+ },
+ [EAuthenticationErrorCodes.INVALID_ADMIN_PASSWORD]: {
+ title: `Invalid admin password`,
+ message: () => `Invalid admin password. Please try again.`,
+ },
+ [EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD]: {
+ title: `Email and password required`,
+ message: () => `Email and password required. Please try again.`,
+ },
+ [EAuthenticationErrorCodes.ADMIN_AUTHENTICATION_FAILED]: {
+ title: `Authentication failed`,
+ message: () => `Authentication failed. Please try again.`,
+ },
+ [EAuthenticationErrorCodes.ADMIN_USER_ALREADY_EXIST]: {
+ title: `Admin user already exists`,
+ message: () => (
+
+ Admin user already exists.
+
+ Sign In
+
+ now.
+
+ ),
+ },
+ [EAuthenticationErrorCodes.ADMIN_USER_DOES_NOT_EXIST]: {
+ title: `Admin user does not exist`,
+ message: () => (
+
+ Admin user does not exist.
+
+ Sign In
+
+ now.
+
+ ),
+ },
+ [EAuthenticationErrorCodes.ADMIN_USER_DEACTIVATED]: {
+ title: `User account deactivated`,
+ message: () => `User account deactivated. Please contact ${!!SUPPORT_EMAIL ? SUPPORT_EMAIL : "administrator"}.`,
+ },
+};
+
+export const authErrorHandler = (
+ errorCode: EAuthenticationErrorCodes,
+ email?: string | undefined
+): TAuthErrorInfo | undefined => {
+ const bannerAlertErrorCodes = [
+ EAuthenticationErrorCodes.ADMIN_ALREADY_EXIST,
+ EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME,
+ EAuthenticationErrorCodes.INVALID_ADMIN_EMAIL,
+ EAuthenticationErrorCodes.INVALID_ADMIN_PASSWORD,
+ EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD,
+ EAuthenticationErrorCodes.ADMIN_AUTHENTICATION_FAILED,
+ EAuthenticationErrorCodes.ADMIN_USER_ALREADY_EXIST,
+ EAuthenticationErrorCodes.ADMIN_USER_DOES_NOT_EXIST,
+ EAuthenticationErrorCodes.ADMIN_USER_DEACTIVATED,
+ ];
+
+ if (bannerAlertErrorCodes.includes(errorCode))
+ return {
+ type: EErrorAlertType.BANNER_ALERT,
+ code: errorCode,
+ title: errorCodeMessages[errorCode]?.title || "Error",
+ message: errorCodeMessages[errorCode]?.message(email) || "Something went wrong. Please try again.",
+ };
+
+ return undefined;
+};
diff --git a/admin/helpers/common.helper.ts b/admin/helpers/common.helper.ts
new file mode 100644
index 000000000..e282e5792
--- /dev/null
+++ b/admin/helpers/common.helper.ts
@@ -0,0 +1,20 @@
+import { clsx, type ClassValue } from "clsx";
+import { twMerge } from "tailwind-merge";
+
+export const API_BASE_URL = process.env.NEXT_PUBLIC_API_BASE_URL || "";
+
+export const ADMIN_BASE_PATH = process.env.NEXT_PUBLIC_ADMIN_BASE_PATH || "";
+
+export const SPACE_BASE_URL = process.env.NEXT_PUBLIC_SPACE_BASE_URL || "";
+export const SPACE_BASE_PATH = process.env.NEXT_PUBLIC_SPACE_BASE_PATH || "";
+
+export const WEB_BASE_URL = process.env.NEXT_PUBLIC_WEB_BASE_URL || "";
+
+export const SUPPORT_EMAIL = process.env.NEXT_PUBLIC_SUPPORT_EMAIL || "";
+
+export const ASSET_PREFIX = ADMIN_BASE_PATH;
+
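+// cn() merges conditional class names with clsx and resolves conflicting Tailwind utilities with twMerge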
+export const cn = (...inputs: ClassValue[]) => twMerge(clsx(inputs));
+
+export const resolveGeneralTheme = (resolvedTheme: string | undefined) =>
+ resolvedTheme?.includes("light") ? "light" : resolvedTheme?.includes("dark") ? "dark" : "system";
diff --git a/admin/helpers/index.ts b/admin/helpers/index.ts
new file mode 100644
index 000000000..ae6aab829
--- /dev/null
+++ b/admin/helpers/index.ts
@@ -0,0 +1,2 @@
+export * from "./instance.helper";
+export * from "./user.helper";
diff --git a/admin/helpers/instance.helper.ts b/admin/helpers/instance.helper.ts
new file mode 100644
index 000000000..f929b2211
--- /dev/null
+++ b/admin/helpers/instance.helper.ts
@@ -0,0 +1,9 @@
+export enum EInstanceStatus {
+ ERROR = "ERROR",
+ NOT_YET_READY = "NOT_YET_READY",
+}
+
+export type TInstanceStatus = {
+ status: EInstanceStatus | undefined;
+ data?: object;
+};
diff --git a/admin/helpers/password.helper.ts b/admin/helpers/password.helper.ts
new file mode 100644
index 000000000..8d80b3402
--- /dev/null
+++ b/admin/helpers/password.helper.ts
@@ -0,0 +1,16 @@
+import zxcvbn from "zxcvbn";
+
+export const isPasswordCriteriaMet = (password: string) => {
+ const criteria = [password.length >= 8, /[A-Z]/.test(password), /\d/.test(password), /[!@#$%^&*]/.test(password)];
+
+ return criteria.every((criterion) => criterion);
+};
+
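+// strength scale: 0 = empty, 1 = shorter than 8 characters, 2 = length ok but criteria not met, otherwise the zxcvbn score (0-4)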
+export const getPasswordStrength = (password: string) => {
+ if (password.length === 0) return 0;
+ if (password.length < 8) return 1;
+ if (!isPasswordCriteriaMet(password)) return 2;
+
+ const result = zxcvbn(password);
+ return result.score;
+};
diff --git a/admin/helpers/user.helper.ts b/admin/helpers/user.helper.ts
new file mode 100644
index 000000000..5c6a89a17
--- /dev/null
+++ b/admin/helpers/user.helper.ts
@@ -0,0 +1,21 @@
+export enum EAuthenticationPageType {
+ STATIC = "STATIC",
+ NOT_AUTHENTICATED = "NOT_AUTHENTICATED",
+ AUTHENTICATED = "AUTHENTICATED",
+}
+
+export enum EInstancePageType {
+ PRE_SETUP = "PRE_SETUP",
+ POST_SETUP = "POST_SETUP",
+}
+
+export enum EUserStatus {
+ ERROR = "ERROR",
+ AUTHENTICATION_NOT_DONE = "AUTHENTICATION_NOT_DONE",
+ NOT_YET_READY = "NOT_YET_READY",
+}
+
+export type TUserStatus = {
+ status: EUserStatus | undefined;
+ message?: string;
+};
diff --git a/admin/hooks/store/index.ts b/admin/hooks/store/index.ts
new file mode 100644
index 000000000..7447064da
--- /dev/null
+++ b/admin/hooks/store/index.ts
@@ -0,0 +1,3 @@
+export * from "./use-theme";
+export * from "./use-instance";
+export * from "./use-user";
diff --git a/admin/hooks/store/use-instance.tsx b/admin/hooks/store/use-instance.tsx
new file mode 100644
index 000000000..cf2edc39f
--- /dev/null
+++ b/admin/hooks/store/use-instance.tsx
@@ -0,0 +1,10 @@
+import { useContext } from "react";
+// store
+import { StoreContext } from "@/lib/store-provider";
+import { IInstanceStore } from "@/store/instance.store";
+
+export const useInstance = (): IInstanceStore => {
+ const context = useContext(StoreContext);
+ if (context === undefined) throw new Error("useInstance must be used within StoreProvider");
+ return context.instance;
+};
diff --git a/admin/hooks/store/use-theme.tsx b/admin/hooks/store/use-theme.tsx
new file mode 100644
index 000000000..bad89cfee
--- /dev/null
+++ b/admin/hooks/store/use-theme.tsx
@@ -0,0 +1,10 @@
+import { useContext } from "react";
+// store
+import { StoreContext } from "@/lib/store-provider";
+import { IThemeStore } from "@/store/theme.store";
+
+export const useTheme = (): IThemeStore => {
+ const context = useContext(StoreContext);
+ if (context === undefined) throw new Error("useTheme must be used within StoreProvider");
+ return context.theme;
+};
diff --git a/admin/hooks/store/use-user.tsx b/admin/hooks/store/use-user.tsx
new file mode 100644
index 000000000..823003144
--- /dev/null
+++ b/admin/hooks/store/use-user.tsx
@@ -0,0 +1,10 @@
+import { useContext } from "react";
+// store
+import { StoreContext } from "@/lib/store-provider";
+import { IUserStore } from "@/store/user.store";
+
+export const useUser = (): IUserStore => {
+ const context = useContext(StoreContext);
+ if (context === undefined) throw new Error("useUser must be used within StoreProvider");
+ return context.user;
+};
diff --git a/admin/hooks/use-outside-click-detector.tsx b/admin/hooks/use-outside-click-detector.tsx
new file mode 100644
index 000000000..b7b48c857
--- /dev/null
+++ b/admin/hooks/use-outside-click-detector.tsx
@@ -0,0 +1,21 @@
+"use client";
+
+import React, { useEffect } from "react";
+
+const useOutsideClickDetector = (ref: React.RefObject<HTMLElement>, callback: () => void) => {
+ const handleClick = (event: MouseEvent) => {
+ if (ref.current && !ref.current.contains(event.target as Node)) {
+ callback();
+ }
+ };
+
+ useEffect(() => {
+ document.addEventListener("mousedown", handleClick);
+
+ return () => {
+ document.removeEventListener("mousedown", handleClick);
+ };
+ });
+};
+
+export default useOutsideClickDetector;
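
A minimal usage sketch for this hook, assuming a hypothetical dropdown component. Note that the effect above has no dependency array, so the `mousedown` listener is re-registered on every render:

```tsx
import { useRef, useState } from "react";
import useOutsideClickDetector from "@/hooks/use-outside-click-detector";

// Hypothetical dropdown that closes when a click lands outside of it.
const Dropdown = () => {
  const [isOpen, setIsOpen] = useState(false);
  const containerRef = useRef<HTMLDivElement>(null);

  // Close the menu on any mousedown outside the container.
  useOutsideClickDetector(containerRef, () => setIsOpen(false));

  return (
    <div ref={containerRef}>
      <button onClick={() => setIsOpen((prev) => !prev)}>Toggle</button>
      {isOpen && <ul>{/* menu items */}</ul>}
    </div>
  );
};

export default Dropdown;
```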
diff --git a/admin/layouts/admin-layout.tsx b/admin/layouts/admin-layout.tsx
new file mode 100644
index 000000000..bcc103217
--- /dev/null
+++ b/admin/layouts/admin-layout.tsx
@@ -0,0 +1,47 @@
+"use client";
+import { FC, ReactNode, useEffect } from "react";
+import { observer } from "mobx-react-lite";
+import { useRouter } from "next/navigation";
+// components
+import { InstanceSidebar } from "@/components/admin-sidebar";
+import { InstanceHeader } from "@/components/auth-header";
+import { LogoSpinner } from "@/components/common";
+import { NewUserPopup } from "@/components/new-user-popup";
+// hooks
+import { useUser } from "@/hooks/store";
+
+type TAdminLayout = {
+ children: ReactNode;
+};
+
+export const AdminLayout: FC<TAdminLayout> = observer((props) => {
+ const { children } = props;
+ // router
+ const router = useRouter();
+ const { isUserLoggedIn } = useUser();
+
+ useEffect(() => {
+ if (isUserLoggedIn === false) {
+ router.push("/");
+ }
+ }, [router, isUserLoggedIn]);
+
+ if (isUserLoggedIn === undefined) {
+ return (
+
+
+
+ );
+ }
+
+ return (
+
+ );
+});
diff --git a/admin/layouts/default-layout.tsx b/admin/layouts/default-layout.tsx
new file mode 100644
index 000000000..1be40ea12
--- /dev/null
+++ b/admin/layouts/default-layout.tsx
@@ -0,0 +1,45 @@
+"use client";
+
+import { FC, ReactNode } from "react";
+import Image from "next/image";
+import Link from "next/link";
+import { useTheme } from "next-themes";
+// logo/ images
+import PlaneBackgroundPatternDark from "public/auth/background-pattern-dark.svg";
+import PlaneBackgroundPattern from "public/auth/background-pattern.svg";
+import BlackHorizontalLogo from "public/plane-logos/black-horizontal-with-blue-logo.png";
+import WhiteHorizontalLogo from "public/plane-logos/white-horizontal-with-blue-logo.png";
+
+type TDefaultLayout = {
+ children: ReactNode;
+ withoutBackground?: boolean;
+};
+
+export const DefaultLayout: FC<TDefaultLayout> = (props) => {
+ const { children, withoutBackground = false } = props;
+ // hooks
+ const { resolvedTheme } = useTheme();
+ const patternBackground = resolvedTheme === "dark" ? PlaneBackgroundPatternDark : PlaneBackgroundPattern;
+
+ const logo = resolvedTheme === "light" ? BlackHorizontalLogo : WhiteHorizontalLogo;
+
+ return (
+
+
+
+ {!withoutBackground && (
+
+
+
+ )}
+
{children}
+
+
+ );
+};
diff --git a/admin/lib/instance-provider.tsx b/admin/lib/instance-provider.tsx
new file mode 100644
index 000000000..fbcf27d82
--- /dev/null
+++ b/admin/lib/instance-provider.tsx
@@ -0,0 +1,55 @@
+import { FC, ReactNode } from "react";
+import { observer } from "mobx-react-lite";
+import useSWR from "swr";
+// components
+import { LogoSpinner } from "@/components/common";
+import { InstanceSetupForm, InstanceFailureView } from "@/components/instance";
+// hooks
+import { useInstance } from "@/hooks/store";
+// layout
+import { DefaultLayout } from "@/layouts/default-layout";
+
+type InstanceProviderProps = {
+ children: ReactNode;
+};
+
+export const InstanceProvider: FC<InstanceProviderProps> = observer((props) => {
+ const { children } = props;
+ // store hooks
+ const { instance, error, fetchInstanceInfo } = useInstance();
+ // fetching instance details
+ useSWR("INSTANCE_DETAILS", () => fetchInstanceInfo(), {
+ revalidateOnFocus: false,
+ revalidateIfStale: false,
+ errorRetryCount: 0,
+ });
+
+ if (!instance && !error)
+ return (
+
+
+
+ );
+
+ if (error) {
+ return (
+
+
+
+
+
+ );
+ }
+
+ if (!instance?.is_setup_done) {
+ return (
+
+
+
+
+
+ );
+ }
+
+  return <>{children}</>;
+});
diff --git a/admin/lib/store-provider.tsx b/admin/lib/store-provider.tsx
new file mode 100644
index 000000000..842513860
--- /dev/null
+++ b/admin/lib/store-provider.tsx
@@ -0,0 +1,34 @@
+"use client";
+
+import { ReactNode, createContext } from "react";
+// store
+import { RootStore } from "@/store/root.store";
+
+let rootStore = new RootStore();
+
+export const StoreContext = createContext(rootStore);
+
+function initializeStore(initialData = {}) {
+ const singletonRootStore = rootStore ?? new RootStore();
+ // If your page has Next.js data fetching methods that use a Mobx store, it will
+ // get hydrated here, check `pages/ssg.js` and `pages/ssr.js` for more details
+ if (initialData) {
+ singletonRootStore.hydrate(initialData);
+ }
+ // For SSG and SSR always create a new store
+ if (typeof window === "undefined") return singletonRootStore;
+ // Create the store once in the client
+ if (!rootStore) rootStore = singletonRootStore;
+ return singletonRootStore;
+}
+
+export type StoreProviderProps = {
+ children: ReactNode;
+ // eslint-disable-next-line @typescript-eslint/no-explicit-any
+ initialState?: any;
+};
+
+export const StoreProvider = ({ children, initialState = {} }: StoreProviderProps) => {
+ const store = initializeStore(initialState);
+  return <StoreContext.Provider value={store}>{children}</StoreContext.Provider>;
+};
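
A brief sketch of how this provider would typically wrap the admin app so that `useInstance`, `useUser`, and `useTheme` can resolve the context; the component name below is illustrative:

```tsx
import { ReactNode } from "react";
import { StoreProvider } from "@/lib/store-provider";

// Hypothetical root-level composition: anything rendered under StoreProvider
// can call the MobX store hooks defined in admin/hooks/store.
export const AppProviders = ({ children }: { children: ReactNode }) => (
  <StoreProvider>{children}</StoreProvider>
);
```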
diff --git a/admin/lib/user-provider.tsx b/admin/lib/user-provider.tsx
new file mode 100644
index 000000000..d8448d13e
--- /dev/null
+++ b/admin/lib/user-provider.tsx
@@ -0,0 +1,31 @@
+"use client";
+
+import { FC, ReactNode, useEffect } from "react";
+import { observer } from "mobx-react-lite";
+import useSWR from "swr";
+// hooks
+import { useInstance, useTheme, useUser } from "@/hooks/store";
+
+interface IUserProvider {
+ children: ReactNode;
+}
+
+export const UserProvider: FC<IUserProvider> = observer(({ children }) => {
+ // hooks
+ const { isSidebarCollapsed, toggleSidebar } = useTheme();
+ const { currentUser, fetchCurrentUser } = useUser();
+ const { fetchInstanceAdmins } = useInstance();
+
+ useSWR("CURRENT_USER", () => fetchCurrentUser(), {
+ shouldRetryOnError: false,
+ });
+ useSWR("INSTANCE_ADMINS", () => fetchInstanceAdmins());
+
+ useEffect(() => {
+ const localValue = localStorage && localStorage.getItem("god_mode_sidebar_collapsed");
+ const localBoolValue = localValue ? (localValue === "true" ? true : false) : false;
+ if (isSidebarCollapsed === undefined && localBoolValue != isSidebarCollapsed) toggleSidebar(localBoolValue);
+ }, [isSidebarCollapsed, currentUser, toggleSidebar]);
+
+  return <>{children}</>;
+});
diff --git a/admin/next-env.d.ts b/admin/next-env.d.ts
new file mode 100644
index 000000000..4f11a03dc
--- /dev/null
+++ b/admin/next-env.d.ts
@@ -0,0 +1,5 @@
+/// <reference types="next" />
+/// <reference types="next/image-types/global" />
+
+// NOTE: This file should not be edited
+// see https://nextjs.org/docs/basic-features/typescript for more information.
diff --git a/admin/next.config.js b/admin/next.config.js
new file mode 100644
index 000000000..2109cec69
--- /dev/null
+++ b/admin/next.config.js
@@ -0,0 +1,14 @@
+/** @type {import('next').NextConfig} */
+
+const nextConfig = {
+ trailingSlash: true,
+ reactStrictMode: false,
+ swcMinify: true,
+ output: "standalone",
+ images: {
+ unoptimized: true,
+ },
+ basePath: process.env.NEXT_PUBLIC_ADMIN_BASE_PATH || "",
+};
+
+module.exports = nextConfig;
diff --git a/admin/package.json b/admin/package.json
new file mode 100644
index 000000000..9c4567070
--- /dev/null
+++ b/admin/package.json
@@ -0,0 +1,50 @@
+{
+ "name": "admin",
+ "version": "0.21.0",
+ "private": true,
+ "scripts": {
+ "dev": "turbo run develop",
+ "develop": "next dev --port 3001",
+ "build": "next build",
+ "preview": "next build && next start",
+ "start": "next start",
+ "lint": "next lint"
+ },
+ "dependencies": {
+ "@headlessui/react": "^1.7.19",
+ "@plane/types": "*",
+ "@plane/ui": "*",
+ "@plane/constants": "*",
+ "@tailwindcss/typography": "^0.5.9",
+ "@types/lodash": "^4.17.0",
+ "autoprefixer": "10.4.14",
+ "axios": "^1.6.7",
+ "js-cookie": "^3.0.5",
+ "lodash": "^4.17.21",
+ "lucide-react": "^0.356.0",
+ "mobx": "^6.12.0",
+ "mobx-react-lite": "^4.0.5",
+ "next": "^14.2.3",
+ "next-themes": "^0.2.1",
+ "postcss": "^8.4.38",
+ "react": "^18.3.1",
+ "react-dom": "^18.3.1",
+ "react-hook-form": "^7.51.0",
+ "swr": "^2.2.4",
+ "tailwindcss": "3.3.2",
+ "uuid": "^9.0.1",
+ "zxcvbn": "^4.4.2"
+ },
+ "devDependencies": {
+ "@types/js-cookie": "^3.0.6",
+ "@types/node": "18.16.1",
+ "@types/react": "^18.2.48",
+ "@types/react-dom": "^18.2.18",
+ "@types/uuid": "^9.0.8",
+ "@types/zxcvbn": "^4.4.4",
+ "eslint-config-custom": "*",
+ "tailwind-config-custom": "*",
+ "tsconfig": "*",
+ "typescript": "^5.4.2"
+ }
+}
diff --git a/admin/postcss.config.js b/admin/postcss.config.js
new file mode 100644
index 000000000..6887c8262
--- /dev/null
+++ b/admin/postcss.config.js
@@ -0,0 +1,8 @@
+module.exports = {
+ plugins: {
+ "postcss-import": {},
+ "tailwindcss/nesting": {},
+ tailwindcss: {},
+ autoprefixer: {},
+ },
+};
diff --git a/admin/public/auth/background-pattern-dark.svg b/admin/public/auth/background-pattern-dark.svg
new file mode 100644
index 000000000..c258cbabf
--- /dev/null
+++ b/admin/public/auth/background-pattern-dark.svg
@@ -0,0 +1,68 @@
diff --git a/admin/public/auth/background-pattern.svg b/admin/public/auth/background-pattern.svg
new file mode 100644
index 000000000..5fcbeec27
--- /dev/null
+++ b/admin/public/auth/background-pattern.svg
@@ -0,0 +1,68 @@
diff --git a/admin/public/favicon/android-chrome-192x192.png b/admin/public/favicon/android-chrome-192x192.png
new file mode 100644
index 000000000..62e95acfc
Binary files /dev/null and b/admin/public/favicon/android-chrome-192x192.png differ
diff --git a/admin/public/favicon/android-chrome-512x512.png b/admin/public/favicon/android-chrome-512x512.png
new file mode 100644
index 000000000..41400832b
Binary files /dev/null and b/admin/public/favicon/android-chrome-512x512.png differ
diff --git a/admin/public/favicon/apple-touch-icon.png b/admin/public/favicon/apple-touch-icon.png
new file mode 100644
index 000000000..5273d4951
Binary files /dev/null and b/admin/public/favicon/apple-touch-icon.png differ
diff --git a/admin/public/favicon/favicon-16x16.png b/admin/public/favicon/favicon-16x16.png
new file mode 100644
index 000000000..8ddbd49c0
Binary files /dev/null and b/admin/public/favicon/favicon-16x16.png differ
diff --git a/admin/public/favicon/favicon-32x32.png b/admin/public/favicon/favicon-32x32.png
new file mode 100644
index 000000000..80cbe7a68
Binary files /dev/null and b/admin/public/favicon/favicon-32x32.png differ
diff --git a/admin/public/favicon/favicon.ico b/admin/public/favicon/favicon.ico
new file mode 100644
index 000000000..9094a07c7
Binary files /dev/null and b/admin/public/favicon/favicon.ico differ
diff --git a/admin/public/favicon/site.webmanifest b/admin/public/favicon/site.webmanifest
new file mode 100644
index 000000000..0b08af126
--- /dev/null
+++ b/admin/public/favicon/site.webmanifest
@@ -0,0 +1,11 @@
+{
+ "name": "",
+ "short_name": "",
+ "icons": [
+ { "src": "/android-chrome-192x192.png", "sizes": "192x192", "type": "image/png" },
+ { "src": "/android-chrome-512x512.png", "sizes": "512x512", "type": "image/png" }
+ ],
+ "theme_color": "#ffffff",
+ "background_color": "#ffffff",
+ "display": "standalone"
+}
diff --git a/admin/public/images/logo-spinner-dark.gif b/admin/public/images/logo-spinner-dark.gif
new file mode 100644
index 000000000..4e0a1deb7
Binary files /dev/null and b/admin/public/images/logo-spinner-dark.gif differ
diff --git a/admin/public/images/logo-spinner-light.gif b/admin/public/images/logo-spinner-light.gif
new file mode 100644
index 000000000..7c9bfbe0e
Binary files /dev/null and b/admin/public/images/logo-spinner-light.gif differ
diff --git a/admin/public/images/plane-takeoff.png b/admin/public/images/plane-takeoff.png
new file mode 100644
index 000000000..417ff8299
Binary files /dev/null and b/admin/public/images/plane-takeoff.png differ
diff --git a/admin/public/instance/instance-failure-dark.svg b/admin/public/instance/instance-failure-dark.svg
new file mode 100644
index 000000000..58d691705
--- /dev/null
+++ b/admin/public/instance/instance-failure-dark.svg
@@ -0,0 +1,40 @@
diff --git a/admin/public/instance/instance-failure.svg b/admin/public/instance/instance-failure.svg
new file mode 100644
index 000000000..a59862283
--- /dev/null
+++ b/admin/public/instance/instance-failure.svg
@@ -0,0 +1,40 @@
diff --git a/admin/public/instance/plane-takeoff.png b/admin/public/instance/plane-takeoff.png
new file mode 100644
index 000000000..417ff8299
Binary files /dev/null and b/admin/public/instance/plane-takeoff.png differ
diff --git a/admin/public/logos/github-black.png b/admin/public/logos/github-black.png
new file mode 100644
index 000000000..7a7a82474
Binary files /dev/null and b/admin/public/logos/github-black.png differ
diff --git a/admin/public/logos/github-white.png b/admin/public/logos/github-white.png
new file mode 100644
index 000000000..dbb2b578c
Binary files /dev/null and b/admin/public/logos/github-white.png differ
diff --git a/admin/public/logos/google-logo.svg b/admin/public/logos/google-logo.svg
new file mode 100644
index 000000000..088288fa3
--- /dev/null
+++ b/admin/public/logos/google-logo.svg
@@ -0,0 +1 @@
+
\ No newline at end of file
diff --git a/admin/public/logos/takeoff-icon-dark.svg b/admin/public/logos/takeoff-icon-dark.svg
new file mode 100644
index 000000000..d3ef19119
--- /dev/null
+++ b/admin/public/logos/takeoff-icon-dark.svg
@@ -0,0 +1,35 @@
diff --git a/admin/public/logos/takeoff-icon-light.svg b/admin/public/logos/takeoff-icon-light.svg
new file mode 100644
index 000000000..97cf43fe7
--- /dev/null
+++ b/admin/public/logos/takeoff-icon-light.svg
@@ -0,0 +1,40 @@
diff --git a/admin/public/plane-logos/black-horizontal-with-blue-logo.png b/admin/public/plane-logos/black-horizontal-with-blue-logo.png
new file mode 100644
index 000000000..c14505a6f
Binary files /dev/null and b/admin/public/plane-logos/black-horizontal-with-blue-logo.png differ
diff --git a/admin/public/plane-logos/blue-without-text.png b/admin/public/plane-logos/blue-without-text.png
new file mode 100644
index 000000000..ea94aec79
Binary files /dev/null and b/admin/public/plane-logos/blue-without-text.png differ
diff --git a/admin/public/plane-logos/white-horizontal-with-blue-logo.png b/admin/public/plane-logos/white-horizontal-with-blue-logo.png
new file mode 100644
index 000000000..97560fb9f
Binary files /dev/null and b/admin/public/plane-logos/white-horizontal-with-blue-logo.png differ
diff --git a/admin/public/site.webmanifest.json b/admin/public/site.webmanifest.json
new file mode 100644
index 000000000..6e5e438f8
--- /dev/null
+++ b/admin/public/site.webmanifest.json
@@ -0,0 +1,13 @@
+{
+ "name": "Plane God Mode",
+ "short_name": "Plane God Mode",
+ "description": "Plane helps you plan your issues, cycles, and product modules.",
+ "start_url": ".",
+ "display": "standalone",
+ "background_color": "#f9fafb",
+ "theme_color": "#3f76ff",
+ "icons": [
+ { "src": "/favicon/android-chrome-192x192.png", "sizes": "192x192", "type": "image/png" },
+ { "src": "/favicon/android-chrome-512x512.png", "sizes": "512x512", "type": "image/png" }
+ ]
+}
diff --git a/admin/services/api.service.ts b/admin/services/api.service.ts
new file mode 100644
index 000000000..fa45c10b7
--- /dev/null
+++ b/admin/services/api.service.ts
@@ -0,0 +1,53 @@
+import axios, { AxiosInstance, AxiosRequestConfig, AxiosResponse } from "axios";
+// store
+// import { rootStore } from "@/lib/store-context";
+
+export abstract class APIService {
+ protected baseURL: string;
+ private axiosInstance: AxiosInstance;
+
+ constructor(baseURL: string) {
+ this.baseURL = baseURL;
+ this.axiosInstance = axios.create({
+ baseURL,
+ withCredentials: true,
+ });
+
+ this.setupInterceptors();
+ }
+
+ private setupInterceptors() {
+ // this.axiosInstance.interceptors.response.use(
+ // (response) => response,
+ // (error) => {
+ // const store = rootStore;
+ // if (error.response && error.response.status === 401 && store.user.currentUser) store.user.reset();
+ // return Promise.reject(error);
+ // }
+ // );
+ }
+
+  get<ResponseType>(url: string, params = {}): Promise<AxiosResponse<ResponseType>> {
+    return this.axiosInstance.get(url, { params });
+  }
+
+  post<RequestType, ResponseType>(url: string, data: RequestType, config = {}): Promise<AxiosResponse<ResponseType>> {
+    return this.axiosInstance.post(url, data, config);
+  }
+
+  put<RequestType, ResponseType>(url: string, data: RequestType, config = {}): Promise<AxiosResponse<ResponseType>> {
+    return this.axiosInstance.put(url, data, config);
+  }
+
+  patch<RequestType, ResponseType>(url: string, data: RequestType, config = {}): Promise<AxiosResponse<ResponseType>> {
+    return this.axiosInstance.patch(url, data, config);
+  }
+
+  delete<RequestType>(url: string, data?: RequestType, config = {}) {
+    return this.axiosInstance.delete(url, { data, ...config });
+  }
+
+  request<ResponseType>(config: AxiosRequestConfig = {}): Promise<AxiosResponse<ResponseType>> {
+    return this.axiosInstance(config);
+  }
+}
diff --git a/admin/services/auth.service.ts b/admin/services/auth.service.ts
new file mode 100644
index 000000000..ef7b7b151
--- /dev/null
+++ b/admin/services/auth.service.ts
@@ -0,0 +1,22 @@
+// helpers
+import { API_BASE_URL } from "helpers/common.helper";
+// services
+import { APIService } from "services/api.service";
+
+type TCsrfTokenResponse = {
+ csrf_token: string;
+};
+
+export class AuthService extends APIService {
+ constructor() {
+ super(API_BASE_URL);
+ }
+
+  async requestCSRFToken(): Promise<TCsrfTokenResponse> {
+ return this.get("/auth/get-csrf-token/")
+ .then((response) => response.data)
+ .catch((error) => {
+ throw error;
+ });
+ }
+}
diff --git a/admin/services/instance.service.ts b/admin/services/instance.service.ts
new file mode 100644
index 000000000..feb94ceea
--- /dev/null
+++ b/admin/services/instance.service.ts
@@ -0,0 +1,72 @@
+// types
+import type {
+ IFormattedInstanceConfiguration,
+ IInstance,
+ IInstanceAdmin,
+ IInstanceConfiguration,
+ IInstanceInfo,
+} from "@plane/types";
+// helpers
+import { API_BASE_URL } from "@/helpers/common.helper";
+import { APIService } from "@/services/api.service";
+
+export class InstanceService extends APIService {
+ constructor() {
+ super(API_BASE_URL);
+ }
+
+  async getInstanceInfo(): Promise<IInstanceInfo> {
+ return this.get("/api/instances/")
+ .then((response) => response.data)
+ .catch((error) => {
+ throw error?.response?.data;
+ });
+ }
+
+  async getInstanceAdmins(): Promise<IInstanceAdmin[]> {
+ return this.get("/api/instances/admins/")
+ .then((response) => response.data)
+ .catch((error) => {
+ throw error;
+ });
+ }
+
+  async updateInstanceInfo(data: Partial<IInstance>): Promise<IInstance> {
+    return this.patch<Partial<IInstance>, IInstance>("/api/instances/", data)
+ .then((response) => response?.data)
+ .catch((error) => {
+ throw error?.response?.data;
+ });
+ }
+
+ async getInstanceConfigurations() {
+ return this.get("/api/instances/configurations/")
+ .then((response) => response.data)
+ .catch((error) => {
+ throw error;
+ });
+ }
+
+  async updateInstanceConfigurations(
+    data: Partial<IFormattedInstanceConfiguration>
+  ): Promise<IInstanceConfiguration[]> {
+    return this.patch<Partial<IFormattedInstanceConfiguration>, IInstanceConfiguration[]>(
+ "/api/instances/configurations/",
+ data
+ )
+ .then((response) => response?.data)
+ .catch((error) => {
+ throw error?.response?.data;
+ });
+ }
+
+  async sendTestEmail(receiverEmail: string): Promise<undefined> {
+ return this.post<{ receiver_email: string }, undefined>("/api/instances/email-credentials-check/", {
+ receiver_email: receiverEmail,
+ })
+ .then((response) => response?.data)
+ .catch((error) => {
+ throw error?.response?.data;
+ });
+ }
+}
diff --git a/admin/services/user.service.ts b/admin/services/user.service.ts
new file mode 100644
index 000000000..bef384daf
--- /dev/null
+++ b/admin/services/user.service.ts
@@ -0,0 +1,30 @@
+// helpers
+import { API_BASE_URL } from "helpers/common.helper";
+// services
+import { APIService } from "services/api.service";
+// types
+import type { IUser } from "@plane/types";
+
+interface IUserSession extends IUser {
+ isAuthenticated: boolean;
+}
+
+export class UserService extends APIService {
+ constructor() {
+ super(API_BASE_URL);
+ }
+
+  async authCheck(): Promise<IUserSession> {
+ return this.get("/api/instances/admins/me/")
+ .then((response) => ({ ...response?.data, isAuthenticated: true }))
+ .catch(() => ({ isAuthenticated: false }));
+ }
+
+  async currentUser(): Promise<IUser> {
+ return this.get("/api/instances/admins/me/")
+ .then((response) => response?.data)
+ .catch((error) => {
+ throw error?.response;
+ });
+ }
+}
diff --git a/admin/store/instance.store.ts b/admin/store/instance.store.ts
new file mode 100644
index 000000000..a99cd808c
--- /dev/null
+++ b/admin/store/instance.store.ts
@@ -0,0 +1,191 @@
+import set from "lodash/set";
+import { observable, action, computed, makeObservable, runInAction } from "mobx";
+import {
+ IInstance,
+ IInstanceAdmin,
+ IInstanceConfiguration,
+ IFormattedInstanceConfiguration,
+ IInstanceInfo,
+ IInstanceConfig,
+} from "@plane/types";
+// helpers
+import { EInstanceStatus, TInstanceStatus } from "@/helpers";
+// services
+import { InstanceService } from "@/services/instance.service";
+// root store
+import { RootStore } from "@/store/root.store";
+
+export interface IInstanceStore {
+ // issues
+ isLoading: boolean;
+ error: any;
+ instanceStatus: TInstanceStatus | undefined;
+ instance: IInstance | undefined;
+ config: IInstanceConfig | undefined;
+ instanceAdmins: IInstanceAdmin[] | undefined;
+ instanceConfigurations: IInstanceConfiguration[] | undefined;
+ // computed
+ formattedConfig: IFormattedInstanceConfiguration | undefined;
+ // action
+ hydrate: (data: IInstanceInfo) => void;
+  fetchInstanceInfo: () => Promise<IInstanceInfo>;
+  updateInstanceInfo: (data: Partial<IInstance>) => Promise<IInstance>;
+  fetchInstanceAdmins: () => Promise<IInstanceAdmin[]>;
+  fetchInstanceConfigurations: () => Promise<IInstanceConfiguration[]>;
+  updateInstanceConfigurations: (data: Partial<IFormattedInstanceConfiguration>) => Promise<IInstanceConfiguration[]>;
+}
+
+export class InstanceStore implements IInstanceStore {
+ isLoading: boolean = true;
+ error: any = undefined;
+ instanceStatus: TInstanceStatus | undefined = undefined;
+ instance: IInstance | undefined = undefined;
+ config: IInstanceConfig | undefined = undefined;
+ instanceAdmins: IInstanceAdmin[] | undefined = undefined;
+ instanceConfigurations: IInstanceConfiguration[] | undefined = undefined;
+ // service
+ instanceService;
+
+ constructor(private store: RootStore) {
+ makeObservable(this, {
+ // observable
+ isLoading: observable.ref,
+ error: observable.ref,
+ instanceStatus: observable,
+ instance: observable,
+ instanceAdmins: observable,
+ instanceConfigurations: observable,
+ // computed
+ formattedConfig: computed,
+ // actions
+ hydrate: action,
+ fetchInstanceInfo: action,
+ fetchInstanceAdmins: action,
+ updateInstanceInfo: action,
+ fetchInstanceConfigurations: action,
+ updateInstanceConfigurations: action,
+ });
+
+ this.instanceService = new InstanceService();
+ }
+
+ hydrate = (data: IInstanceInfo) => {
+ if (data) {
+ this.instance = data.instance;
+ this.config = data.config;
+ }
+ };
+
+ /**
+ * computed value for instance configurations data for forms.
+ * @returns configurations in the form of {key, value} pair.
+ */
+ get formattedConfig() {
+ if (!this.instanceConfigurations) return undefined;
+ return this.instanceConfigurations?.reduce((formData: IFormattedInstanceConfiguration, config) => {
+ formData[config.key] = config.value;
+ return formData;
+ }, {} as IFormattedInstanceConfiguration);
+ }
+
+ /**
+ * @description fetching instance configuration
+   * @returns {IInstanceInfo} instanceInfo
+ */
+ fetchInstanceInfo = async () => {
+ try {
+ if (this.instance === undefined) this.isLoading = true;
+ this.error = undefined;
+ const instanceInfo = await this.instanceService.getInstanceInfo();
+ // handling the new user popup toggle
+ if (this.instance === undefined && !instanceInfo?.instance?.workspaces_exist)
+ this.store.theme.toggleNewUserPopup();
+ runInAction(() => {
+ console.log("instanceInfo: ", instanceInfo);
+ this.isLoading = false;
+ this.instance = instanceInfo.instance;
+ this.config = instanceInfo.config;
+ });
+ return instanceInfo;
+ } catch (error) {
+ console.error("Error fetching the instance info");
+ this.isLoading = false;
+ this.error = { message: "Failed to fetch the instance info" };
+ this.instanceStatus = {
+ status: EInstanceStatus.ERROR,
+ };
+ throw error;
+ }
+ };
+
+ /**
+ * @description updating instance information
+   * @param {Partial<IInstance>} data
+ * @returns void
+ */
+ updateInstanceInfo = async (data: Partial) => {
+ try {
+ const instanceResponse = await this.instanceService.updateInstanceInfo(data);
+ if (instanceResponse) {
+ runInAction(() => {
+ if (this.instance) set(this.instance, "instance", instanceResponse);
+ });
+ }
+ return instanceResponse;
+ } catch (error) {
+ console.error("Error updating the instance info");
+ throw error;
+ }
+ };
+
+ /**
+ * @description fetching instance admins
+ * @return {IInstanceAdmin[]} instanceAdmins
+ */
+ fetchInstanceAdmins = async () => {
+ try {
+ const instanceAdmins = await this.instanceService.getInstanceAdmins();
+ if (instanceAdmins) runInAction(() => (this.instanceAdmins = instanceAdmins));
+ return instanceAdmins;
+ } catch (error) {
+ console.error("Error fetching the instance admins");
+ throw error;
+ }
+ };
+
+ /**
+ * @description fetching instance configurations
+   * @return {IInstanceConfiguration[]} instanceConfigurations
+ */
+ fetchInstanceConfigurations = async () => {
+ try {
+ const instanceConfigurations = await this.instanceService.getInstanceConfigurations();
+ if (instanceConfigurations) runInAction(() => (this.instanceConfigurations = instanceConfigurations));
+ return instanceConfigurations;
+ } catch (error) {
+ console.error("Error fetching the instance configurations");
+ throw error;
+ }
+ };
+
+ /**
+ * @description updating instance configurations
+ * @param data
+ */
+  updateInstanceConfigurations = async (data: Partial<IFormattedInstanceConfiguration>) => {
+ try {
+ const response = await this.instanceService.updateInstanceConfigurations(data);
+ runInAction(() => {
+ this.instanceConfigurations = this.instanceConfigurations?.map((config) => {
+ const item = response.find((item) => item.key === config.key);
+ if (item) return item;
+ return config;
+ });
+ });
+ return response;
+ } catch (error) {
+ console.error("Error updating the instance configurations");
+ throw error;
+ }
+ };
+}
diff --git a/admin/store/root.store.ts b/admin/store/root.store.ts
new file mode 100644
index 000000000..553a22200
--- /dev/null
+++ b/admin/store/root.store.ts
@@ -0,0 +1,32 @@
+import { enableStaticRendering } from "mobx-react-lite";
+// stores
+import { IInstanceStore, InstanceStore } from "./instance.store";
+import { IThemeStore, ThemeStore } from "./theme.store";
+import { IUserStore, UserStore } from "./user.store";
+
+enableStaticRendering(typeof window === "undefined");
+
+export class RootStore {
+ theme: IThemeStore;
+ instance: IInstanceStore;
+ user: IUserStore;
+
+ constructor() {
+ this.theme = new ThemeStore(this);
+ this.instance = new InstanceStore(this);
+ this.user = new UserStore(this);
+ }
+
+ hydrate(initialData: any) {
+ this.theme.hydrate(initialData.theme);
+ this.instance.hydrate(initialData.instance);
+ this.user.hydrate(initialData.user);
+ }
+
+ resetOnSignOut() {
+ localStorage.setItem("theme", "system");
+ this.instance = new InstanceStore(this);
+ this.user = new UserStore(this);
+ this.theme = new ThemeStore(this);
+ }
+}
diff --git a/admin/store/theme.store.ts b/admin/store/theme.store.ts
new file mode 100644
index 000000000..a3f3b3d5a
--- /dev/null
+++ b/admin/store/theme.store.ts
@@ -0,0 +1,68 @@
+import { action, observable, makeObservable } from "mobx";
+// root store
+import { RootStore } from "@/store/root.store";
+
+type TTheme = "dark" | "light";
+export interface IThemeStore {
+ // observables
+ isNewUserPopup: boolean;
+ theme: string | undefined;
+ isSidebarCollapsed: boolean | undefined;
+ // actions
+ hydrate: (data: any) => void;
+ toggleNewUserPopup: () => void;
+ toggleSidebar: (collapsed: boolean) => void;
+ setTheme: (currentTheme: TTheme) => void;
+}
+
+export class ThemeStore implements IThemeStore {
+ // observables
+ isNewUserPopup: boolean = false;
+ isSidebarCollapsed: boolean | undefined = undefined;
+ theme: string | undefined = undefined;
+
+ constructor(private store: RootStore) {
+ makeObservable(this, {
+ // observables
+ isNewUserPopup: observable.ref,
+ isSidebarCollapsed: observable.ref,
+ theme: observable.ref,
+ // action
+ toggleNewUserPopup: action,
+ toggleSidebar: action,
+ setTheme: action,
+ });
+ }
+
+ hydrate = (data: any) => {
+ if (data) this.theme = data;
+ };
+
+ /**
+ * @description Toggle the new user popup modal
+ */
+ toggleNewUserPopup = () => (this.isNewUserPopup = !this.isNewUserPopup);
+
+ /**
+ * @description Toggle the sidebar collapsed state
+ * @param isCollapsed
+ */
+ toggleSidebar = (isCollapsed: boolean) => {
+ if (isCollapsed === undefined) this.isSidebarCollapsed = !this.isSidebarCollapsed;
+ else this.isSidebarCollapsed = isCollapsed;
+ localStorage.setItem("god_mode_sidebar_collapsed", isCollapsed.toString());
+ };
+
+ /**
+ * @description Sets the user theme and applies it to the platform
+ * @param currentTheme
+ */
+ setTheme = async (currentTheme: TTheme) => {
+ try {
+ localStorage.setItem("theme", currentTheme);
+ this.theme = currentTheme;
+ } catch (error) {
+ console.error("setting user theme error", error);
+ }
+ };
+}
diff --git a/admin/store/user.store.ts b/admin/store/user.store.ts
new file mode 100644
index 000000000..60638f0cd
--- /dev/null
+++ b/admin/store/user.store.ts
@@ -0,0 +1,104 @@
+import { action, observable, runInAction, makeObservable } from "mobx";
+import { IUser } from "@plane/types";
+// helpers
+import { EUserStatus, TUserStatus } from "@/helpers";
+// services
+import { AuthService } from "@/services/auth.service";
+import { UserService } from "@/services/user.service";
+// root store
+import { RootStore } from "@/store/root.store";
+
+export interface IUserStore {
+ // observables
+ isLoading: boolean;
+ userStatus: TUserStatus | undefined;
+ isUserLoggedIn: boolean | undefined;
+ currentUser: IUser | undefined;
+ // fetch actions
+ hydrate: (data: any) => void;
+  fetchCurrentUser: () => Promise<IUser>;
+ reset: () => void;
+ signOut: () => void;
+}
+
+export class UserStore implements IUserStore {
+ // observables
+ isLoading: boolean = true;
+ userStatus: TUserStatus | undefined = undefined;
+ isUserLoggedIn: boolean | undefined = undefined;
+ currentUser: IUser | undefined = undefined;
+ // services
+ userService;
+ authService;
+
+ constructor(private store: RootStore) {
+ makeObservable(this, {
+ // observables
+ isLoading: observable.ref,
+ userStatus: observable,
+ isUserLoggedIn: observable.ref,
+ currentUser: observable,
+ // action
+ fetchCurrentUser: action,
+ reset: action,
+ signOut: action,
+ });
+ this.userService = new UserService();
+ this.authService = new AuthService();
+ }
+
+ hydrate = (data: any) => {
+ if (data) this.currentUser = data;
+ };
+
+ /**
+ * @description Fetches the current user
+   * @returns Promise<IUser>
+ */
+ fetchCurrentUser = async () => {
+ try {
+ if (this.currentUser === undefined) this.isLoading = true;
+ const currentUser = await this.userService.currentUser();
+ if (currentUser) {
+ await this.store.instance.fetchInstanceAdmins();
+ runInAction(() => {
+ this.isUserLoggedIn = true;
+ this.currentUser = currentUser;
+ this.isLoading = false;
+ });
+ } else {
+ runInAction(() => {
+ this.isUserLoggedIn = false;
+ this.currentUser = undefined;
+ this.isLoading = false;
+ });
+ }
+ return currentUser;
+ } catch (error: any) {
+ this.isLoading = false;
+ this.isUserLoggedIn = false;
+ if (error.status === 403)
+ this.userStatus = {
+ status: EUserStatus.AUTHENTICATION_NOT_DONE,
+ message: error?.message || "",
+ };
+ else
+ this.userStatus = {
+ status: EUserStatus.ERROR,
+ message: error?.message || "",
+ };
+ throw error;
+ }
+ };
+
+ reset = async () => {
+ this.isUserLoggedIn = false;
+ this.currentUser = undefined;
+ this.isLoading = false;
+ this.userStatus = undefined;
+ };
+
+ signOut = async () => {
+ this.store.resetOnSignOut();
+ };
+}
diff --git a/admin/tailwind.config.js b/admin/tailwind.config.js
new file mode 100644
index 000000000..05bc93bdc
--- /dev/null
+++ b/admin/tailwind.config.js
@@ -0,0 +1,5 @@
+const sharedConfig = require("tailwind-config-custom/tailwind.config.js");
+
+module.exports = {
+ presets: [sharedConfig],
+};
diff --git a/admin/tsconfig.json b/admin/tsconfig.json
new file mode 100644
index 000000000..5bc5a5684
--- /dev/null
+++ b/admin/tsconfig.json
@@ -0,0 +1,18 @@
+{
+ "extends": "tsconfig/nextjs.json",
+ "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
+ "exclude": ["node_modules"],
+ "compilerOptions": {
+ "baseUrl": ".",
+ "jsx": "preserve",
+ "esModuleInterop": true,
+ "paths": {
+ "@/*": ["*"]
+ },
+ "plugins": [
+ {
+ "name": "next"
+ }
+ ]
+ }
+}
diff --git a/aio/Dockerfile b/aio/Dockerfile
new file mode 100644
index 000000000..94d61b866
--- /dev/null
+++ b/aio/Dockerfile
@@ -0,0 +1,149 @@
+# *****************************************************************************
+# STAGE 1: Build the project
+# *****************************************************************************
+FROM node:18-alpine AS builder
+RUN apk add --no-cache libc6-compat
+# Set working directory
+WORKDIR /app
+ENV NEXT_PUBLIC_API_BASE_URL=http://NEXT_PUBLIC_API_BASE_URL_PLACEHOLDER
+
+RUN yarn global add turbo
+COPY . .
+
+RUN turbo prune --scope=web --scope=space --scope=admin --docker
+
+# *****************************************************************************
+# STAGE 2: Install dependencies & build the project
+# *****************************************************************************
+# Add lockfile and package.json's of isolated subworkspace
+FROM node:18-alpine AS installer
+
+RUN apk add --no-cache libc6-compat
+WORKDIR /app
+
+# First install the dependencies (as they change less often)
+COPY .gitignore .gitignore
+COPY --from=builder /app/out/json/ .
+COPY --from=builder /app/out/yarn.lock ./yarn.lock
+RUN yarn install
+
+# # Build the project
+COPY --from=builder /app/out/full/ .
+COPY turbo.json turbo.json
+
+ARG NEXT_PUBLIC_API_BASE_URL=""
+ENV NEXT_PUBLIC_API_BASE_URL=$NEXT_PUBLIC_API_BASE_URL
+
+ARG NEXT_PUBLIC_ADMIN_BASE_URL=""
+ENV NEXT_PUBLIC_ADMIN_BASE_URL=$NEXT_PUBLIC_ADMIN_BASE_URL
+
+ARG NEXT_PUBLIC_ADMIN_BASE_PATH="/god-mode"
+ENV NEXT_PUBLIC_ADMIN_BASE_PATH=$NEXT_PUBLIC_ADMIN_BASE_PATH
+
+ARG NEXT_PUBLIC_SPACE_BASE_URL=""
+ENV NEXT_PUBLIC_SPACE_BASE_URL=$NEXT_PUBLIC_SPACE_BASE_URL
+
+ARG NEXT_PUBLIC_SPACE_BASE_PATH="/spaces"
+ENV NEXT_PUBLIC_SPACE_BASE_PATH=$NEXT_PUBLIC_SPACE_BASE_PATH
+
+ENV NEXT_TELEMETRY_DISABLED 1
+ENV TURBO_TELEMETRY_DISABLED 1
+
+RUN yarn turbo run build
+
+# *****************************************************************************
+# STAGE 3: Copy the project and start it
+# *****************************************************************************
+# FROM makeplane/plane-aio-base AS runner
+FROM makeplane/plane-aio-base:develop AS runner
+
+WORKDIR /app
+
+SHELL [ "/bin/bash", "-c" ]
+
+# PYTHON APPLICATION SETUP
+
+ENV PYTHONDONTWRITEBYTECODE 1
+ENV PYTHONUNBUFFERED 1
+ENV PIP_DISABLE_PIP_VERSION_CHECK=1
+
+COPY apiserver/requirements.txt ./api/
+COPY apiserver/requirements ./api/requirements
+
+RUN python3.12 -m venv /app/venv && \
+ source /app/venv/bin/activate && \
+ /app/venv/bin/pip install --upgrade pip && \
+ /app/venv/bin/pip install -r ./api/requirements.txt --compile --no-cache-dir
+
+# Add in Django deps and generate Django's static files
+COPY apiserver/manage.py ./api/manage.py
+COPY apiserver/plane ./api/plane/
+COPY apiserver/templates ./api/templates/
+COPY package.json ./api/package.json
+
+COPY apiserver/bin ./api/bin/
+
+RUN chmod +x ./api/bin/*
+RUN chmod -R 777 ./api/
+
+# NEXTJS BUILDS
+
+COPY --from=installer /app/web/next.config.js ./web/
+COPY --from=installer /app/web/package.json ./web/
+COPY --from=installer /app/web/.next/standalone ./web
+COPY --from=installer /app/web/.next/static ./web/web/.next/static
+COPY --from=installer /app/web/public ./web/web/public
+
+COPY --from=installer /app/space/next.config.js ./space/
+COPY --from=installer /app/space/package.json ./space/
+COPY --from=installer /app/space/.next/standalone ./space
+COPY --from=installer /app/space/.next/static ./space/space/.next/static
+COPY --from=installer /app/space/public ./space/space/public
+
+COPY --from=installer /app/admin/next.config.js ./admin/
+COPY --from=installer /app/admin/package.json ./admin/
+COPY --from=installer /app/admin/.next/standalone ./admin
+COPY --from=installer /app/admin/.next/static ./admin/admin/.next/static
+COPY --from=installer /app/admin/public ./admin/admin/public
+
+ARG NEXT_PUBLIC_API_BASE_URL=""
+ENV NEXT_PUBLIC_API_BASE_URL=$NEXT_PUBLIC_API_BASE_URL
+
+ARG NEXT_PUBLIC_ADMIN_BASE_URL=""
+ENV NEXT_PUBLIC_ADMIN_BASE_URL=$NEXT_PUBLIC_ADMIN_BASE_URL
+
+ARG NEXT_PUBLIC_ADMIN_BASE_PATH="/god-mode"
+ENV NEXT_PUBLIC_ADMIN_BASE_PATH=$NEXT_PUBLIC_ADMIN_BASE_PATH
+
+ARG NEXT_PUBLIC_SPACE_BASE_URL=""
+ENV NEXT_PUBLIC_SPACE_BASE_URL=$NEXT_PUBLIC_SPACE_BASE_URL
+
+ARG NEXT_PUBLIC_SPACE_BASE_PATH="/spaces"
+ENV NEXT_PUBLIC_SPACE_BASE_PATH=$NEXT_PUBLIC_SPACE_BASE_PATH
+
+ARG NEXT_PUBLIC_WEB_BASE_URL=""
+ENV NEXT_PUBLIC_WEB_BASE_URL=$NEXT_PUBLIC_WEB_BASE_URL
+
+ENV NEXT_TELEMETRY_DISABLED 1
+ENV TURBO_TELEMETRY_DISABLED 1
+
+COPY aio/supervisord.conf /app/supervisord.conf
+
+COPY aio/aio.sh /app/aio.sh
+RUN chmod +x /app/aio.sh
+
+COPY aio/pg-setup.sh /app/pg-setup.sh
+RUN chmod +x /app/pg-setup.sh
+
+COPY deploy/selfhost/variables.env /app/plane.env
+
+# NGINX Conf Copy
+COPY ./aio/nginx.conf.aio /etc/nginx/nginx.conf.template
+COPY ./nginx/env.sh /app/nginx-start.sh
+RUN chmod +x /app/nginx-start.sh
+
+RUN ./pg-setup.sh
+
+VOLUME [ "/app/data/minio/uploads", "/var/lib/postgresql/data" ]
+
+CMD ["/usr/bin/supervisord", "-c", "/app/supervisord.conf"]
diff --git a/aio/Dockerfile.base b/aio/Dockerfile.base
new file mode 100644
index 000000000..092deb797
--- /dev/null
+++ b/aio/Dockerfile.base
@@ -0,0 +1,92 @@
+FROM --platform=$BUILDPLATFORM tonistiigi/binfmt as binfmt
+
+FROM debian:12-slim
+
+# Set environment variables to non-interactive for apt
+ENV DEBIAN_FRONTEND=noninteractive
+
+SHELL [ "/bin/bash", "-c" ]
+
+# Update the package list and install prerequisites
+RUN apt-get update && \
+ apt-get install -y \
+ gnupg2 curl ca-certificates lsb-release software-properties-common \
+ build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev \
+ libsqlite3-dev wget llvm libncurses5-dev libncursesw5-dev xz-utils \
+ tk-dev libffi-dev liblzma-dev supervisor nginx nano vim ncdu
+
+# Install Redis 7.2
+RUN echo "deb http://deb.debian.org/debian $(lsb_release -cs)-backports main" > /etc/apt/sources.list.d/backports.list && \
+ curl -fsSL https://packages.redis.io/gpg | gpg --dearmor -o /usr/share/keyrings/redis-archive-keyring.gpg && \
+ echo "deb [signed-by=/usr/share/keyrings/redis-archive-keyring.gpg] https://packages.redis.io/deb $(lsb_release -cs) main" > /etc/apt/sources.list.d/redis.list && \
+ apt-get update && \
+ apt-get install -y redis-server
+
+# Install PostgreSQL 15
+ENV POSTGRES_VERSION 15
+RUN curl -fsSL https://www.postgresql.org/media/keys/ACCC4CF8.asc | gpg --dearmor -o /usr/share/keyrings/pgdg-archive-keyring.gpg && \
+ echo "deb [signed-by=/usr/share/keyrings/pgdg-archive-keyring.gpg] http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list && \
+ apt-get update && \
+ apt-get install -y postgresql-$POSTGRES_VERSION postgresql-client-$POSTGRES_VERSION && \
+ mkdir -p /var/lib/postgresql/data && \
+ chown -R postgres:postgres /var/lib/postgresql
+
+# Install MinIO
+ARG TARGETARCH
+RUN if [ "$TARGETARCH" = "amd64" ]; then \
+ curl -fSl https://dl.min.io/server/minio/release/linux-amd64/minio -o /usr/local/bin/minio; \
+ elif [ "$TARGETARCH" = "arm64" ]; then \
+ curl -fSl https://dl.min.io/server/minio/release/linux-arm64/minio -o /usr/local/bin/minio; \
+ else \
+ echo "Unsupported architecture: $TARGETARCH"; exit 1; \
+ fi && \
+ chmod +x /usr/local/bin/minio
+
+
+# Install Node.js 18
+RUN curl -fsSL https://deb.nodesource.com/setup_18.x | bash - && \
+ apt-get install -y nodejs
+
+# Install Python 3.12 from source
+RUN cd /usr/src && \
+ wget https://www.python.org/ftp/python/3.12.0/Python-3.12.0.tgz && \
+ tar xzf Python-3.12.0.tgz && \
+ cd Python-3.12.0 && \
+ ./configure --enable-optimizations && \
+ make altinstall && \
+ rm -f /usr/src/Python-3.12.0.tgz
+
+RUN python3.12 -m pip install --upgrade pip
+
+RUN echo "alias python=/usr/local/bin/python3.12" >> ~/.bashrc && \
+ echo "alias pip=/usr/local/bin/pip3.12" >> ~/.bashrc
+
+# Clean up
+RUN apt-get clean && \
+ rm -rf /var/lib/apt/lists/* /usr/src/Python-3.12.0
+
+WORKDIR /app
+
+RUN mkdir -p /app/{data,logs} && \
+ mkdir -p /app/data/{redis,pg,minio,nginx} && \
+ mkdir -p /app/logs/{access,error} && \
+ mkdir -p /etc/supervisor/conf.d
+
+# Create Supervisor configuration file
+COPY supervisord.base /app/supervisord.conf
+
+RUN apt-get update && \
+ apt-get install -y sudo lsof net-tools libpq-dev procps gettext && \
+ apt-get clean
+
+RUN sudo -u postgres /usr/lib/postgresql/$POSTGRES_VERSION/bin/initdb -D /var/lib/postgresql/data
+COPY postgresql.conf /etc/postgresql/postgresql.conf
+
+RUN echo "alias python=/usr/local/bin/python3.12" >> ~/.bashrc && \
+ echo "alias pip=/usr/local/bin/pip3.12" >> ~/.bashrc
+
+# Expose ports for Redis, PostgreSQL, and MinIO
+EXPOSE 6379 5432 9000 80
+
+# Start Supervisor
+CMD ["/usr/bin/supervisord", "-c", "/app/supervisord.conf"]
diff --git a/aio/aio.sh b/aio/aio.sh
new file mode 100644
index 000000000..53adbf42b
--- /dev/null
+++ b/aio/aio.sh
@@ -0,0 +1,30 @@
+#!/bin/bash
+set -e
+
+
+if [ "$1" = 'api' ]; then
+ source /app/venv/bin/activate
+ cd /app/api
+ exec ./bin/docker-entrypoint-api.sh
+elif [ "$1" = 'worker' ]; then
+ source /app/venv/bin/activate
+ cd /app/api
+ exec ./bin/docker-entrypoint-worker.sh
+elif [ "$1" = 'beat' ]; then
+ source /app/venv/bin/activate
+ cd /app/api
+ exec ./bin/docker-entrypoint-beat.sh
+elif [ "$1" = 'migrator' ]; then
+ source /app/venv/bin/activate
+ cd /app/api
+ exec ./bin/docker-entrypoint-migrator.sh
+elif [ "$1" = 'web' ]; then
+ node /app/web/web/server.js
+elif [ "$1" = 'space' ]; then
+ node /app/space/space/server.js
+elif [ "$1" = 'admin' ]; then
+ node /app/admin/admin/server.js
+else
+ echo "Command not found"
+ exit 1
+fi
\ No newline at end of file
diff --git a/aio/nginx.conf.aio b/aio/nginx.conf.aio
new file mode 100644
index 000000000..1a1f3c0b8
--- /dev/null
+++ b/aio/nginx.conf.aio
@@ -0,0 +1,73 @@
+events {
+}
+
+http {
+ sendfile on;
+
+ server {
+ listen 80;
+ root /www/data/;
+ access_log /var/log/nginx/access.log;
+
+ client_max_body_size ${FILE_SIZE_LIMIT};
+
+ add_header X-Content-Type-Options "nosniff" always;
+ add_header Referrer-Policy "no-referrer-when-downgrade" always;
+ add_header Permissions-Policy "interest-cohort=()" always;
+ add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
+ add_header X-Forwarded-Proto "${dollar}scheme";
+ add_header X-Forwarded-Host "${dollar}host";
+ add_header X-Forwarded-For "${dollar}proxy_add_x_forwarded_for";
+ add_header X-Real-IP "${dollar}remote_addr";
+
+ location / {
+ proxy_http_version 1.1;
+ proxy_set_header Upgrade ${dollar}http_upgrade;
+ proxy_set_header Connection "upgrade";
+ proxy_set_header Host ${dollar}http_host;
+ proxy_pass http://localhost:3001/;
+ }
+
+ location /spaces/ {
+ rewrite ^/spaces/?$ /spaces/login break;
+ proxy_http_version 1.1;
+ proxy_set_header Upgrade ${dollar}http_upgrade;
+ proxy_set_header Connection "upgrade";
+ proxy_set_header Host ${dollar}http_host;
+ proxy_pass http://localhost:3002/spaces/;
+ }
+
+
+ location /god-mode/ {
+ proxy_http_version 1.1;
+ proxy_set_header Upgrade ${dollar}http_upgrade;
+ proxy_set_header Connection "upgrade";
+ proxy_set_header Host ${dollar}http_host;
+ proxy_pass http://localhost:3003/god-mode/;
+ }
+
+ location /api/ {
+ proxy_http_version 1.1;
+ proxy_set_header Upgrade ${dollar}http_upgrade;
+ proxy_set_header Connection "upgrade";
+ proxy_set_header Host ${dollar}http_host;
+ proxy_pass http://localhost:8000/api/;
+ }
+
+ location /auth/ {
+ proxy_http_version 1.1;
+ proxy_set_header Upgrade ${dollar}http_upgrade;
+ proxy_set_header Connection "upgrade";
+ proxy_set_header Host ${dollar}http_host;
+ proxy_pass http://localhost:8000/auth/;
+ }
+
+ location /${BUCKET_NAME}/ {
+ proxy_http_version 1.1;
+ proxy_set_header Upgrade ${dollar}http_upgrade;
+ proxy_set_header Connection "upgrade";
+ proxy_set_header Host ${dollar}http_host;
+ proxy_pass http://localhost:9000/uploads/;
+ }
+ }
+}
diff --git a/aio/pg-setup.sh b/aio/pg-setup.sh
new file mode 100644
index 000000000..6f6ea88e6
--- /dev/null
+++ b/aio/pg-setup.sh
@@ -0,0 +1,14 @@
+#!/bin/bash
+
+
+# Variables
+set -o allexport
+source plane.env set
+set +o allexport
+
+export PGHOST=localhost
+
+sudo -u postgres "/usr/lib/postgresql/${POSTGRES_VERSION}/bin/pg_ctl" -D /var/lib/postgresql/data start
+sudo -u postgres "/usr/lib/postgresql/${POSTGRES_VERSION}/bin/psql" --command "CREATE USER $POSTGRES_USER WITH SUPERUSER PASSWORD '$POSTGRES_PASSWORD';" && \
+sudo -u postgres "/usr/lib/postgresql/${POSTGRES_VERSION}/bin/createdb" -O "$POSTGRES_USER" "$POSTGRES_DB" && \
+sudo -u postgres "/usr/lib/postgresql/${POSTGRES_VERSION}/bin/pg_ctl" -D /var/lib/postgresql/data stop
diff --git a/aio/postgresql.conf b/aio/postgresql.conf
new file mode 100644
index 000000000..8c6223fc4
--- /dev/null
+++ b/aio/postgresql.conf
@@ -0,0 +1,12 @@
+# PostgreSQL configuration file
+
+# Allow connections from any IP address
+listen_addresses = '*'
+
+# Set the maximum number of connections
+max_connections = 100
+
+# Set the shared buffers size
+shared_buffers = 128MB
+
+# Other custom configurations can be added here
diff --git a/aio/supervisord.base b/aio/supervisord.base
new file mode 100644
index 000000000..fe6a76e41
--- /dev/null
+++ b/aio/supervisord.base
@@ -0,0 +1,37 @@
+[supervisord]
+user=root
+nodaemon=true
+stderr_logfile=/app/logs/error/supervisor.err.log
+stdout_logfile=/app/logs/access/supervisor.out.log
+
+[program:redis]
+directory=/app/data/redis
+command=redis-server
+autostart=true
+autorestart=true
+stderr_logfile=/app/logs/error/redis.err.log
+stdout_logfile=/app/logs/access/redis.out.log
+
+[program:postgresql]
+user=postgres
+command=/usr/lib/postgresql/15/bin/postgres --config-file=/etc/postgresql/15/main/postgresql.conf
+autostart=true
+autorestart=true
+stderr_logfile=/app/logs/error/postgresql.err.log
+stdout_logfile=/app/logs/access/postgresql.out.log
+
+[program:minio]
+directory=/app/data/minio
+command=minio server /app/data/minio
+autostart=true
+autorestart=true
+stderr_logfile=/app/logs/error/minio.err.log
+stdout_logfile=/app/logs/access/minio.out.log
+
+[program:nginx]
+directory=/app/data/nginx
+command=/usr/sbin/nginx -g 'daemon off;'
+autostart=true
+autorestart=true
+stderr_logfile=/app/logs/error/nginx.err.log
+stdout_logfile=/app/logs/access/nginx.out.log
diff --git a/aio/supervisord.conf b/aio/supervisord.conf
new file mode 100644
index 000000000..46ef1b4fa
--- /dev/null
+++ b/aio/supervisord.conf
@@ -0,0 +1,115 @@
+[supervisord]
+user=root
+nodaemon=true
+priority=1
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stdout
+stderr_logfile_maxbytes=0
+
+[program:redis]
+directory=/app/data/redis
+command=redis-server
+autostart=true
+autorestart=true
+priority=1
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stdout
+stderr_logfile_maxbytes=0
+
+[program:postgresql]
+user=postgres
+command=/usr/lib/postgresql/15/bin/postgres -D /var/lib/postgresql/data --config-file=/etc/postgresql/postgresql.conf
+autostart=true
+autorestart=true
+priority=1
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stdout
+stderr_logfile_maxbytes=0
+
+[program:minio]
+directory=/app/data/minio
+command=minio server /app/data/minio
+autostart=true
+autorestart=true
+priority=1
+stdout_logfile=/app/logs/access/minio.log
+stderr_logfile=/app/logs/error/minio.err.log
+
+[program:nginx]
+command=/app/nginx-start.sh
+autostart=true
+autorestart=true
+priority=1
+stdout_logfile=/app/logs/access/nginx.log
+stderr_logfile=/app/logs/error/nginx.err.log
+
+
+[program:web]
+command=/app/aio.sh web
+autostart=true
+autorestart=true
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stdout
+stderr_logfile_maxbytes=0
+environment=PORT=3001,HOSTNAME=0.0.0.0
+
+[program:space]
+command=/app/aio.sh space
+autostart=true
+autorestart=true
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stdout
+stderr_logfile_maxbytes=0
+environment=PORT=3002,HOSTNAME=0.0.0.0
+
+[program:admin]
+command=/app/aio.sh admin
+autostart=true
+autorestart=true
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stdout
+stderr_logfile_maxbytes=0
+environment=PORT=3003,HOSTNAME=0.0.0.0
+
+[program:migrator]
+command=/app/aio.sh migrator
+autostart=true
+autorestart=false
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stdout
+stderr_logfile_maxbytes=0
+
+[program:api]
+command=/app/aio.sh api
+autostart=true
+autorestart=true
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stdout
+stderr_logfile_maxbytes=0
+
+[program:worker]
+command=/app/aio.sh worker
+autostart=true
+autorestart=true
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stdout
+stderr_logfile_maxbytes=0
+
+[program:beat]
+command=/app/aio.sh beat
+autostart=true
+autorestart=true
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stdout
+stderr_logfile_maxbytes=0
+
diff --git a/apiserver/.env.example b/apiserver/.env.example
index 37178b398..38944f79c 100644
--- a/apiserver/.env.example
+++ b/apiserver/.env.example
@@ -1,23 +1,20 @@
# Backend
# Debug value for api server use it as 0 for production use
DEBUG=0
-CORS_ALLOWED_ORIGINS=""
+CORS_ALLOWED_ORIGINS="http://localhost"
# Error logs
SENTRY_DSN=""
SENTRY_ENVIRONMENT="development"
# Database Settings
-PGUSER="plane"
-PGPASSWORD="plane"
-PGHOST="plane-db"
-PGDATABASE="plane"
-DATABASE_URL=postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}/${PGDATABASE}
+POSTGRES_USER="plane"
+POSTGRES_PASSWORD="plane"
+POSTGRES_HOST="plane-db"
+POSTGRES_DB="plane"
+POSTGRES_PORT=5432
+DATABASE_URL=postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
-# Oauth variables
-GOOGLE_CLIENT_ID=""
-GITHUB_CLIENT_ID=""
-GITHUB_CLIENT_SECRET=""
# Redis Settings
REDIS_HOST="plane-redis"
@@ -34,14 +31,6 @@ AWS_S3_BUCKET_NAME="uploads"
# Maximum file upload limit
FILE_SIZE_LIMIT=5242880
-# GPT settings
-OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
-OPENAI_API_KEY="sk-" # deprecated
-GPT_ENGINE="gpt-3.5-turbo" # deprecated
-
-# Github
-GITHUB_CLIENT_SECRET="" # For fetching release notes
-
# Settings related to Docker
DOCKERIZED=1 # deprecated
@@ -51,19 +40,13 @@ USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80
-
-# SignUps
-ENABLE_SIGNUP="1"
-
-# Enable Email/Password Signup
-ENABLE_EMAIL_PASSWORD="1"
-
-# Enable Magic link Login
-ENABLE_MAGIC_LINK_LOGIN="0"
-
# Email redirections and minio domain settings
WEB_URL="http://localhost"
# Gunicorn Workers
GUNICORN_WORKERS=2
+# Base URLs
+ADMIN_BASE_URL=
+SPACE_BASE_URL=
+APP_BASE_URL=
diff --git a/apiserver/Dockerfile.api b/apiserver/Dockerfile.api
index 0e4e0ac50..6447e9f97 100644
--- a/apiserver/Dockerfile.api
+++ b/apiserver/Dockerfile.api
@@ -32,29 +32,20 @@ RUN apk add --no-cache --virtual .build-deps \
apk del .build-deps
-RUN addgroup -S plane && \
- adduser -S captain -G plane
-
-RUN chown captain.plane /code
-
-USER captain
-
# Add in Django deps and generate Django's static files
COPY manage.py manage.py
COPY plane plane/
COPY templates templates/
COPY package.json package.json
-USER root
+
RUN apk --no-cache add "bash~=5.2"
COPY ./bin ./bin/
-RUN chmod +x ./bin/takeoff ./bin/worker ./bin/beat
+RUN mkdir -p /code/plane/logs
+RUN chmod +x ./bin/*
RUN chmod -R 777 /code
-USER captain
-
# Expose container port and run entry point script
EXPOSE 8000
-# CMD [ "./bin/takeoff" ]
diff --git a/apiserver/Dockerfile.dev b/apiserver/Dockerfile.dev
index cb2d1ca28..3de300db7 100644
--- a/apiserver/Dockerfile.dev
+++ b/apiserver/Dockerfile.dev
@@ -30,24 +30,16 @@ ADD requirements ./requirements
# Install the local development settings
RUN pip install -r requirements/local.txt --compile --no-cache-dir
-RUN addgroup -S plane && \
- adduser -S captain -G plane
-RUN chown captain.plane /code
+COPY . .
-USER captain
-
-# Add in Django deps and generate Django's static files
-
-USER root
-
-# RUN chmod +x ./bin/takeoff ./bin/worker ./bin/beat
+RUN mkdir -p /code/plane/logs
+RUN chmod -R +x /code/bin
RUN chmod -R 777 /code
-USER captain
# Expose container port and run entry point script
EXPOSE 8000
-CMD [ "./bin/takeoff.local" ]
+CMD [ "./bin/docker-entrypoint-api-local.sh" ]
diff --git a/apiserver/back_migration.py b/apiserver/back_migration.py
index c04ee7771..328b9db2b 100644
--- a/apiserver/back_migration.py
+++ b/apiserver/back_migration.py
@@ -26,7 +26,9 @@ def update_description():
updated_issues.append(issue)
Issue.objects.bulk_update(
- updated_issues, ["description_html", "description_stripped"], batch_size=100
+ updated_issues,
+ ["description_html", "description_stripped"],
+ batch_size=100,
)
print("Success")
except Exception as e:
@@ -40,7 +42,9 @@ def update_comments():
updated_issue_comments = []
for issue_comment in issue_comments:
-        issue_comment.comment_html = f"<p>{issue_comment.comment_stripped}</p>"
+        issue_comment.comment_html = (
+            f"<p>{issue_comment.comment_stripped}</p>"
+        )
updated_issue_comments.append(issue_comment)
IssueComment.objects.bulk_update(
@@ -99,7 +103,9 @@ def updated_issue_sort_order():
issue.sort_order = issue.sequence_id * random.randint(100, 500)
updated_issues.append(issue)
- Issue.objects.bulk_update(updated_issues, ["sort_order"], batch_size=100)
+ Issue.objects.bulk_update(
+ updated_issues, ["sort_order"], batch_size=100
+ )
print("Success")
except Exception as e:
print(e)
@@ -137,7 +143,9 @@ def update_project_cover_images():
project.cover_image = project_cover_images[random.randint(0, 19)]
updated_projects.append(project)
- Project.objects.bulk_update(updated_projects, ["cover_image"], batch_size=100)
+ Project.objects.bulk_update(
+ updated_projects, ["cover_image"], batch_size=100
+ )
print("Success")
except Exception as e:
print(e)
@@ -174,7 +182,7 @@ def update_label_color():
labels = Label.objects.filter(color="")
updated_labels = []
for label in labels:
- label.color = "#" + "%06x" % random.randint(0, 0xFFFFFF)
+ label.color = f"#{random.randint(0, 0xFFFFFF+1):06X}"
updated_labels.append(label)
Label.objects.bulk_update(updated_labels, ["color"], batch_size=100)
@@ -186,7 +194,9 @@ def update_label_color():
def create_slack_integration():
try:
- _ = Integration.objects.create(provider="slack", network=2, title="Slack")
+ _ = Integration.objects.create(
+ provider="slack", network=2, title="Slack"
+ )
print("Success")
except Exception as e:
print(e)
@@ -212,12 +222,16 @@ def update_integration_verified():
def update_start_date():
try:
- issues = Issue.objects.filter(state__group__in=["started", "completed"])
+ issues = Issue.objects.filter(
+ state__group__in=["started", "completed"]
+ )
updated_issues = []
for issue in issues:
issue.start_date = issue.created_at.date()
updated_issues.append(issue)
- Issue.objects.bulk_update(updated_issues, ["start_date"], batch_size=500)
+ Issue.objects.bulk_update(
+ updated_issues, ["start_date"], batch_size=500
+ )
print("Success")
except Exception as e:
print(e)
diff --git a/apiserver/bin/beat b/apiserver/bin/beat
deleted file mode 100644
index 45d357442..000000000
--- a/apiserver/bin/beat
+++ /dev/null
@@ -1,5 +0,0 @@
-#!/bin/bash
-set -e
-
-python manage.py wait_for_db
-celery -A plane beat -l info
\ No newline at end of file
diff --git a/apiserver/bin/takeoff.local b/apiserver/bin/docker-entrypoint-api-local.sh
similarity index 78%
rename from apiserver/bin/takeoff.local
rename to apiserver/bin/docker-entrypoint-api-local.sh
index b89c20874..3194009b2 100755
--- a/apiserver/bin/takeoff.local
+++ b/apiserver/bin/docker-entrypoint-api-local.sh
@@ -1,7 +1,8 @@
#!/bin/bash
set -e
python manage.py wait_for_db
-python manage.py migrate
+# Wait for migrations
+python manage.py wait_for_migrations
# Create the default bucket
#!/bin/bash
@@ -20,12 +21,15 @@ SIGNATURE=$(echo "$HOSTNAME$MAC_ADDRESS$CPU_INFO$MEMORY_INFO$DISK_INFO" | sha256
export MACHINE_SIGNATURE=$SIGNATURE
# Register instance
-python manage.py register_instance $MACHINE_SIGNATURE
+python manage.py register_instance "$MACHINE_SIGNATURE"
# Load the configuration variable
python manage.py configure_instance
# Create the default bucket
python manage.py create_bucket
+# Clear Cache before starting to remove stale values
+python manage.py clear_cache
+
python manage.py runserver 0.0.0.0:8000 --settings=plane.settings.local
diff --git a/apiserver/bin/takeoff b/apiserver/bin/docker-entrypoint-api.sh
similarity index 62%
rename from apiserver/bin/takeoff
rename to apiserver/bin/docker-entrypoint-api.sh
index 0ec2e495c..5a1da1570 100755
--- a/apiserver/bin/takeoff
+++ b/apiserver/bin/docker-entrypoint-api.sh
@@ -1,7 +1,8 @@
#!/bin/bash
set -e
python manage.py wait_for_db
-python manage.py migrate
+# Wait for migrations
+python manage.py wait_for_migrations
# Create the default bucket
#!/bin/bash
@@ -20,11 +21,15 @@ SIGNATURE=$(echo "$HOSTNAME$MAC_ADDRESS$CPU_INFO$MEMORY_INFO$DISK_INFO" | sha256
export MACHINE_SIGNATURE=$SIGNATURE
# Register instance
-python manage.py register_instance $MACHINE_SIGNATURE
+python manage.py register_instance "$MACHINE_SIGNATURE"
+
# Load the configuration variable
python manage.py configure_instance
# Create the default bucket
python manage.py create_bucket
-exec gunicorn -w $GUNICORN_WORKERS -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:${PORT:-8000} --max-requests 1200 --max-requests-jitter 1000 --access-logfile -
+# Clear Cache before starting to remove stale values
+python manage.py clear_cache
+
+exec gunicorn -w "$GUNICORN_WORKERS" -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:"${PORT:-8000}" --max-requests 1200 --max-requests-jitter 1000 --access-logfile -
diff --git a/apiserver/bin/docker-entrypoint-beat.sh b/apiserver/bin/docker-entrypoint-beat.sh
new file mode 100644
index 000000000..3a9602a9e
--- /dev/null
+++ b/apiserver/bin/docker-entrypoint-beat.sh
@@ -0,0 +1,8 @@
+#!/bin/bash
+set -e
+
+python manage.py wait_for_db
+# Wait for migrations
+python manage.py wait_for_migrations
+# Run the processes
+celery -A plane beat -l info
\ No newline at end of file
diff --git a/apiserver/bin/docker-entrypoint-migrator.sh b/apiserver/bin/docker-entrypoint-migrator.sh
new file mode 100644
index 000000000..104b39024
--- /dev/null
+++ b/apiserver/bin/docker-entrypoint-migrator.sh
@@ -0,0 +1,6 @@
+#!/bin/bash
+set -e
+
+python manage.py wait_for_db $1
+
+python manage.py migrate $1
\ No newline at end of file
diff --git a/apiserver/bin/worker b/apiserver/bin/docker-entrypoint-worker.sh
similarity index 50%
rename from apiserver/bin/worker
rename to apiserver/bin/docker-entrypoint-worker.sh
index 9d2da1254..a70b5f77c 100755
--- a/apiserver/bin/worker
+++ b/apiserver/bin/docker-entrypoint-worker.sh
@@ -2,4 +2,7 @@
set -e
python manage.py wait_for_db
+# Wait for migrations
+python manage.py wait_for_migrations
+# Run the processes
celery -A plane worker -l info
\ No newline at end of file
diff --git a/apiserver/manage.py b/apiserver/manage.py
index 837297219..744086783 100644
--- a/apiserver/manage.py
+++ b/apiserver/manage.py
@@ -2,10 +2,10 @@
import os
import sys
-if __name__ == '__main__':
+if __name__ == "__main__":
os.environ.setdefault(
- 'DJANGO_SETTINGS_MODULE',
- 'plane.settings.production')
+ "DJANGO_SETTINGS_MODULE", "plane.settings.production"
+ )
try:
from django.core.management import execute_from_command_line
except ImportError as exc:
diff --git a/apiserver/package.json b/apiserver/package.json
index a317b4776..ecaf1194a 100644
--- a/apiserver/package.json
+++ b/apiserver/package.json
@@ -1,4 +1,4 @@
{
"name": "plane-api",
- "version": "0.14.0"
+ "version": "0.21.0"
}
diff --git a/apiserver/plane/__init__.py b/apiserver/plane/__init__.py
index fb989c4e6..53f4ccb1d 100644
--- a/apiserver/plane/__init__.py
+++ b/apiserver/plane/__init__.py
@@ -1,3 +1,3 @@
from .celery import app as celery_app
-__all__ = ('celery_app',)
+__all__ = ("celery_app",)
diff --git a/apiserver/plane/analytics/apps.py b/apiserver/plane/analytics/apps.py
index 353779983..52a59f313 100644
--- a/apiserver/plane/analytics/apps.py
+++ b/apiserver/plane/analytics/apps.py
@@ -2,4 +2,4 @@ from django.apps import AppConfig
class AnalyticsConfig(AppConfig):
- name = 'plane.analytics'
+ name = "plane.analytics"
diff --git a/apiserver/plane/api/apps.py b/apiserver/plane/api/apps.py
index 292ad9344..6ba36e7e5 100644
--- a/apiserver/plane/api/apps.py
+++ b/apiserver/plane/api/apps.py
@@ -2,4 +2,4 @@ from django.apps import AppConfig
class ApiConfig(AppConfig):
- name = "plane.api"
\ No newline at end of file
+ name = "plane.api"
diff --git a/apiserver/plane/api/middleware/api_authentication.py b/apiserver/plane/api/middleware/api_authentication.py
index 1b2c03318..893df7f84 100644
--- a/apiserver/plane/api/middleware/api_authentication.py
+++ b/apiserver/plane/api/middleware/api_authentication.py
@@ -25,7 +25,10 @@ class APIKeyAuthentication(authentication.BaseAuthentication):
def validate_api_token(self, token):
try:
api_token = APIToken.objects.get(
- Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
+ Q(
+ Q(expired_at__gt=timezone.now())
+ | Q(expired_at__isnull=True)
+ ),
token=token,
is_active=True,
)
@@ -44,4 +47,4 @@ class APIKeyAuthentication(authentication.BaseAuthentication):
# Validate the API token
user, token = self.validate_api_token(token)
- return user, token
\ No newline at end of file
+ return user, token
diff --git a/apiserver/plane/api/rate_limit.py b/apiserver/plane/api/rate_limit.py
index f91e2d65d..b62936d8e 100644
--- a/apiserver/plane/api/rate_limit.py
+++ b/apiserver/plane/api/rate_limit.py
@@ -1,17 +1,18 @@
from rest_framework.throttling import SimpleRateThrottle
+
class ApiKeyRateThrottle(SimpleRateThrottle):
- scope = 'api_key'
- rate = '60/minute'
+ scope = "api_key"
+ rate = "60/minute"
def get_cache_key(self, request, view):
# Retrieve the API key from the request header
- api_key = request.headers.get('X-Api-Key')
+ api_key = request.headers.get("X-Api-Key")
if not api_key:
return None # Allow the request if there's no API key
# Use the API key as part of the cache key
- return f'{self.scope}:{api_key}'
+ return f"{self.scope}:{api_key}"
def allow_request(self, request, view):
allowed = super().allow_request(request, view)
@@ -24,7 +25,7 @@ class ApiKeyRateThrottle(SimpleRateThrottle):
# Remove old histories
while history and history[-1] <= now - self.duration:
history.pop()
-
+
# Calculate the requests
num_requests = len(history)
@@ -35,7 +36,7 @@ class ApiKeyRateThrottle(SimpleRateThrottle):
reset_time = int(now + self.duration)
# Add headers
- request.META['X-RateLimit-Remaining'] = max(0, available)
- request.META['X-RateLimit-Reset'] = reset_time
+ request.META["X-RateLimit-Remaining"] = max(0, available)
+ request.META["X-RateLimit-Reset"] = reset_time
- return allowed
\ No newline at end of file
+ return allowed
diff --git a/apiserver/plane/api/serializers/__init__.py b/apiserver/plane/api/serializers/__init__.py
index 1fd1bce78..10b0182d6 100644
--- a/apiserver/plane/api/serializers/__init__.py
+++ b/apiserver/plane/api/serializers/__init__.py
@@ -13,5 +13,9 @@ from .issue import (
)
from .state import StateLiteSerializer, StateSerializer
from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer
-from .module import ModuleSerializer, ModuleIssueSerializer, ModuleLiteSerializer
-from .inbox import InboxIssueSerializer
\ No newline at end of file
+from .module import (
+ ModuleSerializer,
+ ModuleIssueSerializer,
+ ModuleLiteSerializer,
+)
+from .inbox import InboxIssueSerializer
diff --git a/apiserver/plane/api/serializers/base.py b/apiserver/plane/api/serializers/base.py
index b96422501..5b68a7113 100644
--- a/apiserver/plane/api/serializers/base.py
+++ b/apiserver/plane/api/serializers/base.py
@@ -66,11 +66,11 @@ class BaseSerializer(serializers.ModelSerializer):
if expand in self.fields:
# Import all the expandable serializers
from . import (
- WorkspaceLiteSerializer,
- ProjectLiteSerializer,
- UserLiteSerializer,
- StateLiteSerializer,
IssueSerializer,
+ ProjectLiteSerializer,
+ StateLiteSerializer,
+ UserLiteSerializer,
+ WorkspaceLiteSerializer,
)
# Expansion mapper
@@ -97,9 +97,11 @@ class BaseSerializer(serializers.ModelSerializer):
exp_serializer = expansion[expand](
getattr(instance, expand)
)
- response[expand] = exp_serializer.data
+ response[expand] = exp_serializer.data
else:
# You might need to handle this case differently
- response[expand] = getattr(instance, f"{expand}_id", None)
+ response[expand] = getattr(
+ instance, f"{expand}_id", None
+ )
- return response
\ No newline at end of file
+ return response
diff --git a/apiserver/plane/api/serializers/cycle.py b/apiserver/plane/api/serializers/cycle.py
index eaff8181a..6fc73a4bc 100644
--- a/apiserver/plane/api/serializers/cycle.py
+++ b/apiserver/plane/api/serializers/cycle.py
@@ -23,7 +23,9 @@ class CycleSerializer(BaseSerializer):
and data.get("end_date", None) is not None
and data.get("start_date", None) > data.get("end_date", None)
):
- raise serializers.ValidationError("Start date cannot exceed end date")
+ raise serializers.ValidationError(
+ "Start date cannot exceed end date"
+ )
return data
class Meta:
@@ -55,7 +57,6 @@ class CycleIssueSerializer(BaseSerializer):
class CycleLiteSerializer(BaseSerializer):
-
class Meta:
model = Cycle
- fields = "__all__"
\ No newline at end of file
+ fields = "__all__"
diff --git a/apiserver/plane/api/serializers/inbox.py b/apiserver/plane/api/serializers/inbox.py
index 17ae8c1ed..a0c79235d 100644
--- a/apiserver/plane/api/serializers/inbox.py
+++ b/apiserver/plane/api/serializers/inbox.py
@@ -1,9 +1,13 @@
# Module imports
from .base import BaseSerializer
+from .issue import IssueExpandSerializer
from plane.db.models import InboxIssue
+
class InboxIssueSerializer(BaseSerializer):
+ issue_detail = IssueExpandSerializer(read_only=True, source="issue")
+
class Meta:
model = InboxIssue
fields = "__all__"
@@ -16,4 +20,4 @@ class InboxIssueSerializer(BaseSerializer):
"updated_by",
"created_at",
"updated_at",
- ]
\ No newline at end of file
+ ]
diff --git a/apiserver/plane/api/serializers/issue.py b/apiserver/plane/api/serializers/issue.py
index 75396e9bb..020917ee5 100644
--- a/apiserver/plane/api/serializers/issue.py
+++ b/apiserver/plane/api/serializers/issue.py
@@ -1,31 +1,34 @@
-from lxml import html
-
+from django.core.exceptions import ValidationError
+from django.core.validators import URLValidator
# Django imports
from django.utils import timezone
+from lxml import html
# Third party imports
from rest_framework import serializers
# Module imports
from plane.db.models import (
- User,
Issue,
- State,
+ IssueActivity,
IssueAssignee,
- Label,
+ IssueAttachment,
+ IssueComment,
IssueLabel,
IssueLink,
- IssueComment,
- IssueAttachment,
- IssueActivity,
+ Label,
ProjectMember,
+ State,
+ User,
)
+
from .base import BaseSerializer
-from .cycle import CycleSerializer, CycleLiteSerializer
-from .module import ModuleSerializer, ModuleLiteSerializer
-from .user import UserLiteSerializer
+from .cycle import CycleLiteSerializer, CycleSerializer
+from .module import ModuleLiteSerializer, ModuleSerializer
from .state import StateLiteSerializer
+from .user import UserLiteSerializer
+
class IssueSerializer(BaseSerializer):
assignees = serializers.ListField(
@@ -66,16 +69,18 @@ class IssueSerializer(BaseSerializer):
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
- raise serializers.ValidationError("Start date cannot exceed target date")
-
+ raise serializers.ValidationError(
+ "Start date cannot exceed target date"
+ )
+
try:
- if(data.get("description_html", None) is not None):
+ if data.get("description_html", None) is not None:
parsed = html.fromstring(data["description_html"])
- parsed_str = html.tostring(parsed, encoding='unicode')
+ parsed_str = html.tostring(parsed, encoding="unicode")
data["description_html"] = parsed_str
-
- except Exception as e:
- raise serializers.ValidationError(f"Invalid HTML: {str(e)}")
+
+ except Exception:
+ raise serializers.ValidationError("Invalid HTML passed")
# Validate assignees are from project
if data.get("assignees", []):
@@ -96,7 +101,8 @@ class IssueSerializer(BaseSerializer):
if (
data.get("state")
and not State.objects.filter(
- project_id=self.context.get("project_id"), pk=data.get("state").id
+ project_id=self.context.get("project_id"),
+ pk=data.get("state").id,
).exists()
):
raise serializers.ValidationError(
@@ -107,7 +113,8 @@ class IssueSerializer(BaseSerializer):
if (
data.get("parent")
and not Issue.objects.filter(
- workspace_id=self.context.get("workspace_id"), pk=data.get("parent").id
+ workspace_id=self.context.get("workspace_id"),
+ pk=data.get("parent").id,
).exists()
):
raise serializers.ValidationError(
@@ -238,9 +245,13 @@ class IssueSerializer(BaseSerializer):
]
if "labels" in self.fields:
if "labels" in self.expand:
- data["labels"] = LabelSerializer(instance.labels.all(), many=True).data
+ data["labels"] = LabelSerializer(
+ instance.labels.all(), many=True
+ ).data
else:
- data["labels"] = [str(label.id) for label in instance.labels.all()]
+ data["labels"] = [
+ str(label.id) for label in instance.labels.all()
+ ]
return data
@@ -275,16 +286,42 @@ class IssueLinkSerializer(BaseSerializer):
"updated_at",
]
+ def validate_url(self, value):
+ # Check URL format
+ validate_url = URLValidator()
+ try:
+ validate_url(value)
+ except ValidationError:
+ raise serializers.ValidationError("Invalid URL format.")
+
+ # Check URL scheme
+ if not value.startswith(("http://", "https://")):
+ raise serializers.ValidationError("Invalid URL scheme.")
+
+ return value
+
# Validation if url already exists
def create(self, validated_data):
if IssueLink.objects.filter(
- url=validated_data.get("url"), issue_id=validated_data.get("issue_id")
+ url=validated_data.get("url"),
+ issue_id=validated_data.get("issue_id"),
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
)
return IssueLink.objects.create(**validated_data)
+ def update(self, instance, validated_data):
+ if IssueLink.objects.filter(
+ url=validated_data.get("url"),
+ issue_id=instance.issue_id,
+ ).exclude(pk=instance.id).exists():
+ raise serializers.ValidationError(
+ {"error": "URL already exists for this Issue"}
+ )
+
+ return super().update(instance, validated_data)
+
class IssueAttachmentSerializer(BaseSerializer):
class Meta:
@@ -324,13 +361,13 @@ class IssueCommentSerializer(BaseSerializer):
def validate(self, data):
try:
- if(data.get("comment_html", None) is not None):
+ if data.get("comment_html", None) is not None:
parsed = html.fromstring(data["comment_html"])
- parsed_str = html.tostring(parsed, encoding='unicode')
+ parsed_str = html.tostring(parsed, encoding="unicode")
data["comment_html"] = parsed_str
-
- except Exception as e:
- raise serializers.ValidationError(f"Invalid HTML: {str(e)}")
+
+ except Exception:
+ raise serializers.ValidationError("Invalid HTML passed")
return data
@@ -362,7 +399,6 @@ class ModuleIssueSerializer(BaseSerializer):
class LabelLiteSerializer(BaseSerializer):
-
class Meta:
model = Label
fields = [
diff --git a/apiserver/plane/api/serializers/module.py b/apiserver/plane/api/serializers/module.py
index a96a9b54d..01a201064 100644
--- a/apiserver/plane/api/serializers/module.py
+++ b/apiserver/plane/api/serializers/module.py
@@ -52,7 +52,9 @@ class ModuleSerializer(BaseSerializer):
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
- raise serializers.ValidationError("Start date cannot exceed target date")
+ raise serializers.ValidationError(
+ "Start date cannot exceed target date"
+ )
if data.get("members", []):
data["members"] = ProjectMember.objects.filter(
@@ -146,16 +148,16 @@ class ModuleLinkSerializer(BaseSerializer):
# Validation if url already exists
def create(self, validated_data):
if ModuleLink.objects.filter(
- url=validated_data.get("url"), module_id=validated_data.get("module_id")
+ url=validated_data.get("url"),
+ module_id=validated_data.get("module_id"),
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
)
return ModuleLink.objects.create(**validated_data)
-
+
class ModuleLiteSerializer(BaseSerializer):
-
class Meta:
model = Module
- fields = "__all__"
\ No newline at end of file
+ fields = "__all__"
diff --git a/apiserver/plane/api/serializers/project.py b/apiserver/plane/api/serializers/project.py
index c394a080d..ce354ba5f 100644
--- a/apiserver/plane/api/serializers/project.py
+++ b/apiserver/plane/api/serializers/project.py
@@ -2,12 +2,16 @@
from rest_framework import serializers
# Module imports
-from plane.db.models import Project, ProjectIdentifier, WorkspaceMember, State, Estimate
+from plane.db.models import (
+ Project,
+ ProjectIdentifier,
+ WorkspaceMember,
+)
+
from .base import BaseSerializer
class ProjectSerializer(BaseSerializer):
-
total_members = serializers.IntegerField(read_only=True)
total_cycles = serializers.IntegerField(read_only=True)
total_modules = serializers.IntegerField(read_only=True)
@@ -21,7 +25,7 @@ class ProjectSerializer(BaseSerializer):
fields = "__all__"
read_only_fields = [
"id",
- 'emoji',
+ "emoji",
"workspace",
"created_at",
"updated_at",
@@ -59,12 +63,16 @@ class ProjectSerializer(BaseSerializer):
def create(self, validated_data):
identifier = validated_data.get("identifier", "").strip().upper()
if identifier == "":
- raise serializers.ValidationError(detail="Project Identifier is required")
+ raise serializers.ValidationError(
+ detail="Project Identifier is required"
+ )
if ProjectIdentifier.objects.filter(
name=identifier, workspace_id=self.context["workspace_id"]
).exists():
- raise serializers.ValidationError(detail="Project Identifier is taken")
+ raise serializers.ValidationError(
+ detail="Project Identifier is taken"
+ )
project = Project.objects.create(
**validated_data, workspace_id=self.context["workspace_id"]
@@ -89,4 +97,4 @@ class ProjectLiteSerializer(BaseSerializer):
"emoji",
"description",
]
- read_only_fields = fields
\ No newline at end of file
+ read_only_fields = fields
diff --git a/apiserver/plane/api/serializers/state.py b/apiserver/plane/api/serializers/state.py
index 9d08193d8..1649a7bcf 100644
--- a/apiserver/plane/api/serializers/state.py
+++ b/apiserver/plane/api/serializers/state.py
@@ -7,9 +7,9 @@ class StateSerializer(BaseSerializer):
def validate(self, data):
# If the default is being provided then make all other states default False
if data.get("default", False):
- State.objects.filter(project_id=self.context.get("project_id")).update(
- default=False
- )
+ State.objects.filter(
+ project_id=self.context.get("project_id")
+ ).update(default=False)
return data
class Meta:
@@ -35,4 +35,4 @@ class StateLiteSerializer(BaseSerializer):
"color",
"group",
]
- read_only_fields = fields
\ No newline at end of file
+ read_only_fields = fields
diff --git a/apiserver/plane/api/serializers/user.py b/apiserver/plane/api/serializers/user.py
index 42b6c3967..e853b90c2 100644
--- a/apiserver/plane/api/serializers/user.py
+++ b/apiserver/plane/api/serializers/user.py
@@ -1,5 +1,6 @@
# Module imports
from plane.db.models import User
+
from .base import BaseSerializer
@@ -10,7 +11,8 @@ class UserLiteSerializer(BaseSerializer):
"id",
"first_name",
"last_name",
+ "email",
"avatar",
"display_name",
]
- read_only_fields = fields
\ No newline at end of file
+ read_only_fields = fields
diff --git a/apiserver/plane/api/serializers/workspace.py b/apiserver/plane/api/serializers/workspace.py
index c4c5caceb..a47de3d31 100644
--- a/apiserver/plane/api/serializers/workspace.py
+++ b/apiserver/plane/api/serializers/workspace.py
@@ -5,6 +5,7 @@ from .base import BaseSerializer
class WorkspaceLiteSerializer(BaseSerializer):
"""Lite serializer with only required fields"""
+
class Meta:
model = Workspace
fields = [
@@ -12,4 +13,4 @@ class WorkspaceLiteSerializer(BaseSerializer):
"slug",
"id",
]
- read_only_fields = fields
\ No newline at end of file
+ read_only_fields = fields
diff --git a/apiserver/plane/api/urls/__init__.py b/apiserver/plane/api/urls/__init__.py
index a5ef0f5f1..84927439e 100644
--- a/apiserver/plane/api/urls/__init__.py
+++ b/apiserver/plane/api/urls/__init__.py
@@ -12,4 +12,4 @@ urlpatterns = [
*cycle_patterns,
*module_patterns,
*inbox_patterns,
-]
\ No newline at end of file
+]
diff --git a/apiserver/plane/api/urls/cycle.py b/apiserver/plane/api/urls/cycle.py
index f557f8af0..b0ae21174 100644
--- a/apiserver/plane/api/urls/cycle.py
+++ b/apiserver/plane/api/urls/cycle.py
@@ -4,6 +4,7 @@ from plane.api.views.cycle import (
CycleAPIEndpoint,
CycleIssueAPIEndpoint,
TransferCycleIssueAPIEndpoint,
+ CycleArchiveUnarchiveAPIEndpoint,
)
urlpatterns = [
@@ -32,4 +33,14 @@ urlpatterns = [
TransferCycleIssueAPIEndpoint.as_view(),
name="transfer-issues",
),
-]
\ No newline at end of file
+ path(
+ "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/archive/",
+ CycleArchiveUnarchiveAPIEndpoint.as_view(),
+ name="cycle-archive-unarchive",
+ ),
+ path(
+ "workspaces/<str:slug>/projects/<uuid:project_id>/archived-cycles/",
+ CycleArchiveUnarchiveAPIEndpoint.as_view(),
+ name="cycle-archive-unarchive",
+ ),
+]
diff --git a/apiserver/plane/api/urls/inbox.py b/apiserver/plane/api/urls/inbox.py
index 3a2a57786..95eb68f3f 100644
--- a/apiserver/plane/api/urls/inbox.py
+++ b/apiserver/plane/api/urls/inbox.py
@@ -14,4 +14,4 @@ urlpatterns = [
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
-]
\ No newline at end of file
+]
diff --git a/apiserver/plane/api/urls/issue.py b/apiserver/plane/api/urls/issue.py
index 070ea8bd9..5ce9db85c 100644
--- a/apiserver/plane/api/urls/issue.py
+++ b/apiserver/plane/api/urls/issue.py
@@ -6,9 +6,15 @@ from plane.api.views import (
IssueLinkAPIEndpoint,
IssueCommentAPIEndpoint,
IssueActivityAPIEndpoint,
+ WorkspaceIssueAPIEndpoint,
)
urlpatterns = [
+ path(
+ "workspaces/<str:slug>/issues/<str:project__identifier>-<str:issue__identifier>/",
+ WorkspaceIssueAPIEndpoint.as_view(),
+ name="issue-by-identifier",
+ ),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueAPIEndpoint.as_view(),
diff --git a/apiserver/plane/api/urls/module.py b/apiserver/plane/api/urls/module.py
index 7117a9e8b..a131f4d4f 100644
--- a/apiserver/plane/api/urls/module.py
+++ b/apiserver/plane/api/urls/module.py
@@ -1,6 +1,10 @@
from django.urls import path
-from plane.api.views import ModuleAPIEndpoint, ModuleIssueAPIEndpoint
+from plane.api.views import (
+ ModuleAPIEndpoint,
+ ModuleIssueAPIEndpoint,
+ ModuleArchiveUnarchiveAPIEndpoint,
+)
urlpatterns = [
path(
@@ -23,4 +27,14 @@ urlpatterns = [
ModuleIssueAPIEndpoint.as_view(),
name="module-issues",
),
-]
\ No newline at end of file
+ path(
+ "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:pk>/archive/",
+ ModuleArchiveUnarchiveAPIEndpoint.as_view(),
+ name="module-archive-unarchive",
+ ),
+ path(
+ "workspaces/<str:slug>/projects/<uuid:project_id>/archived-modules/",
+ ModuleArchiveUnarchiveAPIEndpoint.as_view(),
+ name="module-archive-unarchive",
+ ),
+]
diff --git a/apiserver/plane/api/urls/project.py b/apiserver/plane/api/urls/project.py
index c73e84c89..5efb85bb0 100644
--- a/apiserver/plane/api/urls/project.py
+++ b/apiserver/plane/api/urls/project.py
@@ -1,16 +1,24 @@
from django.urls import path
-from plane.api.views import ProjectAPIEndpoint
+from plane.api.views import (
+ ProjectAPIEndpoint,
+ ProjectArchiveUnarchiveAPIEndpoint,
+)
urlpatterns = [
- path(
+ path(
"workspaces/<str:slug>/projects/",
ProjectAPIEndpoint.as_view(),
name="project",
),
path(
- "workspaces/<str:slug>/projects/<uuid:project_id>/",
+ "workspaces/<str:slug>/projects/<uuid:pk>/",
ProjectAPIEndpoint.as_view(),
name="project",
),
-]
\ No newline at end of file
+ path(
+ "workspaces/<str:slug>/projects/<uuid:project_id>/archive/",
+ ProjectArchiveUnarchiveAPIEndpoint.as_view(),
+ name="project-archive-unarchive",
+ ),
+]
diff --git a/apiserver/plane/api/urls/state.py b/apiserver/plane/api/urls/state.py
index 0676ac5ad..b03f386e6 100644
--- a/apiserver/plane/api/urls/state.py
+++ b/apiserver/plane/api/urls/state.py
@@ -13,4 +13,4 @@ urlpatterns = [
StateAPIEndpoint.as_view(),
name="states",
),
-]
\ No newline at end of file
+]
diff --git a/apiserver/plane/api/views/__init__.py b/apiserver/plane/api/views/__init__.py
index 84d8dcabb..d59b40fc5 100644
--- a/apiserver/plane/api/views/__init__.py
+++ b/apiserver/plane/api/views/__init__.py
@@ -1,8 +1,9 @@
-from .project import ProjectAPIEndpoint
+from .project import ProjectAPIEndpoint, ProjectArchiveUnarchiveAPIEndpoint
from .state import StateAPIEndpoint
from .issue import (
+ WorkspaceIssueAPIEndpoint,
IssueAPIEndpoint,
LabelAPIEndpoint,
IssueLinkAPIEndpoint,
@@ -14,8 +15,13 @@ from .cycle import (
CycleAPIEndpoint,
CycleIssueAPIEndpoint,
TransferCycleIssueAPIEndpoint,
+ CycleArchiveUnarchiveAPIEndpoint,
)
-from .module import ModuleAPIEndpoint, ModuleIssueAPIEndpoint
+from .module import (
+ ModuleAPIEndpoint,
+ ModuleIssueAPIEndpoint,
+ ModuleArchiveUnarchiveAPIEndpoint,
+)
-from .inbox import InboxIssueAPIEndpoint
\ No newline at end of file
+from .inbox import InboxIssueAPIEndpoint
diff --git a/apiserver/plane/api/views/base.py b/apiserver/plane/api/views/base.py
index abde4e8b0..fee508a30 100644
--- a/apiserver/plane/api/views/base.py
+++ b/apiserver/plane/api/views/base.py
@@ -1,25 +1,24 @@
# Python imports
import zoneinfo
-import json
# Django imports
from django.conf import settings
-from django.db import IntegrityError
from django.core.exceptions import ObjectDoesNotExist, ValidationError
+from django.db import IntegrityError
+from django.urls import resolve
from django.utils import timezone
+from rest_framework import status
+from rest_framework.permissions import IsAuthenticated
+from rest_framework.response import Response
# Third party imports
from rest_framework.views import APIView
-from rest_framework.response import Response
-from rest_framework.permissions import IsAuthenticated
-from rest_framework import status
-from sentry_sdk import capture_exception
# Module imports
from plane.api.middleware.api_authentication import APIKeyAuthentication
from plane.api.rate_limit import ApiKeyRateThrottle
+from plane.utils.exception_logger import log_exception
from plane.utils.paginator import BasePaginator
-from plane.bgtasks.webhook_task import send_webhook
class TimezoneMixin:
@@ -36,32 +35,6 @@ class TimezoneMixin:
timezone.deactivate()
-class WebhookMixin:
- webhook_event = None
- bulk = False
-
- def finalize_response(self, request, response, *args, **kwargs):
- response = super().finalize_response(request, response, *args, **kwargs)
-
- # Check for the case should webhook be sent
- if (
- self.webhook_event
- and self.request.method in ["POST", "PATCH", "DELETE"]
- and response.status_code in [200, 201, 204]
- ):
- # Push the object to delay
- send_webhook.delay(
- event=self.webhook_event,
- payload=response.data,
- kw=self.kwargs,
- action=self.request.method,
- slug=self.workspace_slug,
- bulk=self.bulk,
- )
-
- return response
-
-
class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
authentication_classes = [
APIKeyAuthentication,
@@ -97,28 +70,23 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
if isinstance(e, ValidationError):
return Response(
- {
- "error": "The provided payload is not valid please try with a valid payload"
- },
+ {"error": "Please provide valid detail"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ObjectDoesNotExist):
- model_name = str(exc).split(" matching query does not exist.")[0]
return Response(
- {"error": f"{model_name} does not exist."},
+ {"error": "The requested resource does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
if isinstance(e, KeyError):
return Response(
- {"error": f"key {e} does not exist"},
+ {"error": "The required key does not exist."},
status=status.HTTP_400_BAD_REQUEST,
)
- if settings.DEBUG:
- print(e)
- capture_exception(e)
+ log_exception(e)
return Response(
{"error": "Something went wrong please try again later"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
@@ -140,7 +108,9 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
def finalize_response(self, request, response, *args, **kwargs):
# Call super to get the default response
- response = super().finalize_response(request, response, *args, **kwargs)
+ response = super().finalize_response(
+ request, response, *args, **kwargs
+ )
# Add custom headers if they exist in the request META
ratelimit_remaining = request.META.get("X-RateLimit-Remaining")
@@ -159,18 +129,27 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
@property
def project_id(self):
- return self.kwargs.get("project_id", None)
+ project_id = self.kwargs.get("project_id", None)
+ if project_id:
+ return project_id
+
+ if resolve(self.request.path_info).url_name == "project":
+ return self.kwargs.get("pk", None)
@property
def fields(self):
fields = [
- field for field in self.request.GET.get("fields", "").split(",") if field
+ field
+ for field in self.request.GET.get("fields", "").split(",")
+ if field
]
return fields if fields else None
@property
def expand(self):
expand = [
- expand for expand in self.request.GET.get("expand", "").split(",") if expand
+ expand
+ for expand in self.request.GET.get("expand", "").split(",")
+ if expand
]
return expand if expand else None
diff --git a/apiserver/plane/api/views/cycle.py b/apiserver/plane/api/views/cycle.py
index 310332333..6e1e5e057 100644
--- a/apiserver/plane/api/views/cycle.py
+++ b/apiserver/plane/api/views/cycle.py
@@ -2,26 +2,36 @@
import json
# Django imports
-from django.db.models import Q, Count, Sum, Prefetch, F, OuterRef, Func
-from django.utils import timezone
from django.core import serializers
+from django.db.models import Count, F, Func, OuterRef, Q, Sum
+from django.utils import timezone
+from django.core.serializers.json import DjangoJSONEncoder
# Third party imports
-from rest_framework.response import Response
from rest_framework import status
+from rest_framework.response import Response
# Module imports
-from .base import BaseAPIView, WebhookMixin
-from plane.db.models import Cycle, Issue, CycleIssue, IssueLink, IssueAttachment
-from plane.app.permissions import ProjectEntityPermission
from plane.api.serializers import (
- CycleSerializer,
CycleIssueSerializer,
+ CycleSerializer,
)
+from plane.app.permissions import ProjectEntityPermission
from plane.bgtasks.issue_activites_task import issue_activity
+from plane.db.models import (
+ Cycle,
+ CycleIssue,
+ Issue,
+ IssueAttachment,
+ IssueLink,
+)
+from plane.utils.analytics_plot import burndown_plot
+
+from .base import BaseAPIView
+from plane.bgtasks.webhook_task import model_activity
-class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
+class CycleAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to cycle.
@@ -39,7 +49,10 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
return (
Cycle.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
- .filter(project__project_projectmember__member=self.request.user)
+ .filter(
+ project__project_projectmember__member=self.request.user,
+ project__project_projectmember__is_active=True,
+ )
.select_related("project")
.select_related("workspace")
.select_related("owned_by")
@@ -102,7 +115,9 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
),
)
)
- .annotate(total_estimates=Sum("issue_cycle__issue__estimate_point"))
+ .annotate(
+ total_estimates=Sum("issue_cycle__issue__estimate_point")
+ )
.annotate(
completed_estimates=Sum(
"issue_cycle__issue__estimate_point",
@@ -129,7 +144,9 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
def get(self, request, slug, project_id, pk=None):
if pk:
- queryset = self.get_queryset().get(pk=pk)
+ queryset = (
+ self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
+ )
data = CycleSerializer(
queryset,
fields=self.fields,
@@ -139,7 +156,7 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
data,
status=status.HTTP_200_OK,
)
- queryset = self.get_queryset()
+ queryset = self.get_queryset().filter(archived_at__isnull=True)
cycle_view = request.GET.get("cycle_view", "all")
# Current Cycle
@@ -201,7 +218,8 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
# Incomplete Cycles
if cycle_view == "incomplete":
queryset = queryset.filter(
- Q(end_date__gte=timezone.now().date()) | Q(end_date__isnull=True),
+ Q(end_date__gte=timezone.now().date())
+ | Q(end_date__isnull=True),
)
return self.paginate(
request=request,
@@ -234,12 +252,49 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
):
serializer = CycleSerializer(data=request.data)
if serializer.is_valid():
+ if (
+ request.data.get("external_id")
+ and request.data.get("external_source")
+ and Cycle.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get("external_source"),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ cycle = Cycle.objects.filter(
+ workspace__slug=slug,
+ project_id=project_id,
+ external_source=request.data.get("external_source"),
+ external_id=request.data.get("external_id"),
+ ).first()
+ return Response(
+ {
+ "error": "Cycle with the same external id and external source already exists",
+ "id": str(cycle.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
serializer.save(
project_id=project_id,
owned_by=request.user,
)
- return Response(serializer.data, status=status.HTTP_201_CREATED)
- return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
+ # Send the model activity
+ model_activity.delay(
+ model_name="cycle",
+ model_id=str(serializer.data["id"]),
+ requested_data=request.data,
+ current_instance=None,
+ actor_id=request.user.id,
+ slug=slug,
+ origin=request.META.get("HTTP_ORIGIN"),
+ )
+ return Response(
+ serializer.data, status=status.HTTP_201_CREATED
+ )
+ return Response(
+ serializer.errors, status=status.HTTP_400_BAD_REQUEST
+ )
else:
return Response(
{
@@ -249,15 +304,32 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
)
def patch(self, request, slug, project_id, pk):
- cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
+ cycle = Cycle.objects.get(
+ workspace__slug=slug, project_id=project_id, pk=pk
+ )
+
+ current_instance = json.dumps(
+ CycleSerializer(cycle).data, cls=DjangoJSONEncoder
+ )
+
+ if cycle.archived_at:
+ return Response(
+ {"error": "Archived cycle cannot be edited"},
+ status=status.HTTP_400_BAD_REQUEST,
+ )
request_data = request.data
- if cycle.end_date is not None and cycle.end_date < timezone.now().date():
+ if (
+ cycle.end_date is not None
+ and cycle.end_date < timezone.now().date()
+ ):
if "sort_order" in request_data:
# Can only change sort order
request_data = {
- "sort_order": request_data.get("sort_order", cycle.sort_order)
+ "sort_order": request_data.get(
+ "sort_order", cycle.sort_order
+ )
}
else:
return Response(
@@ -269,17 +341,49 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
serializer = CycleSerializer(cycle, data=request.data, partial=True)
if serializer.is_valid():
+ if (
+ request.data.get("external_id")
+ and (cycle.external_id != request.data.get("external_id"))
+ and Cycle.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get(
+ "external_source", cycle.external_source
+ ),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ return Response(
+ {
+ "error": "Cycle with the same external id and external source already exists",
+ "id": str(cycle.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
serializer.save()
+
+ # Send the model activity
+ model_activity.delay(
+ model_name="cycle",
+ model_id=str(serializer.data["id"]),
+ requested_data=request.data,
+ current_instance=current_instance,
+ actor_id=request.user.id,
+ slug=slug,
+ origin=request.META.get("HTTP_ORIGIN"),
+ )
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, pk):
cycle_issues = list(
- CycleIssue.objects.filter(cycle_id=self.kwargs.get("pk")).values_list(
- "issue", flat=True
- )
+ CycleIssue.objects.filter(
+ cycle_id=self.kwargs.get("pk")
+ ).values_list("issue", flat=True)
+ )
+ cycle = Cycle.objects.get(
+ workspace__slug=slug, project_id=project_id, pk=pk
)
- cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
issue_activity.delay(
type="cycle.activity.deleted",
@@ -301,7 +405,145 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
-class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
+class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
+
+ permission_classes = [
+ ProjectEntityPermission,
+ ]
+
+ def get_queryset(self):
+ return (
+ Cycle.objects.filter(workspace__slug=self.kwargs.get("slug"))
+ .filter(project_id=self.kwargs.get("project_id"))
+ .filter(
+ project__project_projectmember__member=self.request.user,
+ project__project_projectmember__is_active=True,
+ )
+ .filter(archived_at__isnull=False)
+ .select_related("project")
+ .select_related("workspace")
+ .select_related("owned_by")
+ .annotate(
+ total_issues=Count(
+ "issue_cycle",
+ filter=Q(
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ completed_issues=Count(
+ "issue_cycle__issue__state__group",
+ filter=Q(
+ issue_cycle__issue__state__group="completed",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ cancelled_issues=Count(
+ "issue_cycle__issue__state__group",
+ filter=Q(
+ issue_cycle__issue__state__group="cancelled",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ started_issues=Count(
+ "issue_cycle__issue__state__group",
+ filter=Q(
+ issue_cycle__issue__state__group="started",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ unstarted_issues=Count(
+ "issue_cycle__issue__state__group",
+ filter=Q(
+ issue_cycle__issue__state__group="unstarted",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ backlog_issues=Count(
+ "issue_cycle__issue__state__group",
+ filter=Q(
+ issue_cycle__issue__state__group="backlog",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ total_estimates=Sum("issue_cycle__issue__estimate_point")
+ )
+ .annotate(
+ completed_estimates=Sum(
+ "issue_cycle__issue__estimate_point",
+ filter=Q(
+ issue_cycle__issue__state__group="completed",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ started_estimates=Sum(
+ "issue_cycle__issue__estimate_point",
+ filter=Q(
+ issue_cycle__issue__state__group="started",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .order_by(self.kwargs.get("order_by", "-created_at"))
+ .distinct()
+ )
+
+ def get(self, request, slug, project_id):
+ return self.paginate(
+ request=request,
+ queryset=(self.get_queryset()),
+ on_results=lambda cycles: CycleSerializer(
+ cycles,
+ many=True,
+ fields=self.fields,
+ expand=self.expand,
+ ).data,
+ )
+
+ def post(self, request, slug, project_id, cycle_id):
+ cycle = Cycle.objects.get(
+ pk=cycle_id, project_id=project_id, workspace__slug=slug
+ )
+ if cycle.end_date >= timezone.now().date():
+ return Response(
+ {"error": "Only completed cycles can be archived"},
+ status=status.HTTP_400_BAD_REQUEST,
+ )
+ cycle.archived_at = timezone.now()
+ cycle.save()
+ return Response(status=status.HTTP_204_NO_CONTENT)
+
+ def delete(self, request, slug, project_id, cycle_id):
+ cycle = Cycle.objects.get(
+ pk=cycle_id, project_id=project_id, workspace__slug=slug
+ )
+ cycle.archived_at = None
+ cycle.save()
+ return Response(status=status.HTTP_204_NO_CONTENT)
+
+
+class CycleIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`,
and `destroy` actions related to cycle issues.
@@ -319,14 +561,19 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
def get_queryset(self):
return (
CycleIssue.objects.annotate(
- sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue_id"))
+ sub_issues_count=Issue.issue_objects.filter(
+ parent=OuterRef("issue_id")
+ )
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
- .filter(project__project_projectmember__member=self.request.user)
+ .filter(
+ project__project_projectmember__member=self.request.user,
+ project__project_projectmember__is_active=True,
+ )
.filter(cycle_id=self.kwargs.get("cycle_id"))
.select_related("project")
.select_related("workspace")
@@ -337,12 +584,28 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
.distinct()
)
- def get(self, request, slug, project_id, cycle_id):
+ def get(self, request, slug, project_id, cycle_id, issue_id=None):
+ # Get
+ if issue_id:
+ cycle_issue = CycleIssue.objects.get(
+ workspace__slug=slug,
+ project_id=project_id,
+ cycle_id=cycle_id,
+ issue_id=issue_id,
+ )
+ serializer = CycleIssueSerializer(
+ cycle_issue, fields=self.fields, expand=self.expand
+ )
+ return Response(serializer.data, status=status.HTTP_200_OK)
+
+ # List
order_by = request.GET.get("order_by", "created_at")
issues = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
.annotate(
- sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
+ sub_issues_count=Issue.issue_objects.filter(
+ parent=OuterRef("id")
+ )
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -364,7 +627,9 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
.values("count")
)
.annotate(
- attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
+ attachment_count=IssueAttachment.objects.filter(
+ issue=OuterRef("id")
+ )
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -387,14 +652,18 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
if not issues:
return Response(
- {"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
+ {"error": "Issues are required"},
+ status=status.HTTP_400_BAD_REQUEST,
)
cycle = Cycle.objects.get(
workspace__slug=slug, project_id=project_id, pk=cycle_id
)
- if cycle.end_date is not None and cycle.end_date < timezone.now().date():
+ if (
+ cycle.end_date is not None
+ and cycle.end_date < timezone.now().date()
+ ):
return Response(
{
"error": "The Cycle has already been completed so no new issues can be added"
@@ -479,7 +748,10 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
def delete(self, request, slug, project_id, cycle_id, issue_id):
cycle_issue = CycleIssue.objects.get(
- issue_id=issue_id, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
+ issue_id=issue_id,
+ workspace__slug=slug,
+ project_id=project_id,
+ cycle_id=cycle_id,
)
issue_id = cycle_issue.issue_id
cycle_issue.delete()
@@ -502,7 +774,7 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
class TransferCycleIssueAPIEndpoint(BaseAPIView):
"""
- This viewset provides `create` actions for transfering the issues into a particular cycle.
+ This viewset provides `create` actions for transferring the issues into a particular cycle.
"""
@@ -523,6 +795,209 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
workspace__slug=slug, project_id=project_id, pk=new_cycle_id
)
+ old_cycle = (
+ Cycle.objects.filter(
+ workspace__slug=slug, project_id=project_id, pk=cycle_id
+ )
+ .annotate(
+ total_issues=Count(
+ "issue_cycle",
+ filter=Q(
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ completed_issues=Count(
+ "issue_cycle__issue__state__group",
+ filter=Q(
+ issue_cycle__issue__state__group="completed",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ cancelled_issues=Count(
+ "issue_cycle__issue__state__group",
+ filter=Q(
+ issue_cycle__issue__state__group="cancelled",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ started_issues=Count(
+ "issue_cycle__issue__state__group",
+ filter=Q(
+ issue_cycle__issue__state__group="started",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ unstarted_issues=Count(
+ "issue_cycle__issue__state__group",
+ filter=Q(
+ issue_cycle__issue__state__group="unstarted",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ backlog_issues=Count(
+ "issue_cycle__issue__state__group",
+ filter=Q(
+ issue_cycle__issue__state__group="backlog",
+ issue_cycle__issue__archived_at__isnull=True,
+ issue_cycle__issue__is_draft=False,
+ ),
+ )
+ )
+ )
+
+ # Pass the old_cycle queryset to burndown_plot
+ completion_chart = burndown_plot(
+ queryset=old_cycle.first(),
+ slug=slug,
+ project_id=project_id,
+ cycle_id=cycle_id,
+ )
+
+ # Get the assignee distribution
+ assignee_distribution = (
+ Issue.objects.filter(
+ issue_cycle__cycle_id=cycle_id,
+ workspace__slug=slug,
+ project_id=project_id,
+ )
+ .annotate(display_name=F("assignees__display_name"))
+ .annotate(assignee_id=F("assignees__id"))
+ .annotate(avatar=F("assignees__avatar"))
+ .values("display_name", "assignee_id", "avatar")
+ .annotate(
+ total_issues=Count(
+ "id",
+ filter=Q(archived_at__isnull=True, is_draft=False),
+ ),
+ )
+ .annotate(
+ completed_issues=Count(
+ "id",
+ filter=Q(
+ completed_at__isnull=False,
+ archived_at__isnull=True,
+ is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ pending_issues=Count(
+ "id",
+ filter=Q(
+ completed_at__isnull=True,
+ archived_at__isnull=True,
+ is_draft=False,
+ ),
+ )
+ )
+ .order_by("display_name")
+ )
+ # assignee distribution serialized
+ assignee_distribution_data = [
+ {
+ "display_name": item["display_name"],
+ "assignee_id": (
+ str(item["assignee_id"]) if item["assignee_id"] else None
+ ),
+ "avatar": item["avatar"],
+ "total_issues": item["total_issues"],
+ "completed_issues": item["completed_issues"],
+ "pending_issues": item["pending_issues"],
+ }
+ for item in assignee_distribution
+ ]
+
+ # Get the label distribution
+ label_distribution = (
+ Issue.objects.filter(
+ issue_cycle__cycle_id=cycle_id,
+ workspace__slug=slug,
+ project_id=project_id,
+ )
+ .annotate(label_name=F("labels__name"))
+ .annotate(color=F("labels__color"))
+ .annotate(label_id=F("labels__id"))
+ .values("label_name", "color", "label_id")
+ .annotate(
+ total_issues=Count(
+ "id",
+ filter=Q(archived_at__isnull=True, is_draft=False),
+ )
+ )
+ .annotate(
+ completed_issues=Count(
+ "id",
+ filter=Q(
+ completed_at__isnull=False,
+ archived_at__isnull=True,
+ is_draft=False,
+ ),
+ )
+ )
+ .annotate(
+ pending_issues=Count(
+ "id",
+ filter=Q(
+ completed_at__isnull=True,
+ archived_at__isnull=True,
+ is_draft=False,
+ ),
+ )
+ )
+ .order_by("label_name")
+ )
+
+ # Label distribution serialization
+ label_distribution_data = [
+ {
+ "label_name": item["label_name"],
+ "color": item["color"],
+ "label_id": (
+ str(item["label_id"]) if item["label_id"] else None
+ ),
+ "total_issues": item["total_issues"],
+ "completed_issues": item["completed_issues"],
+ "pending_issues": item["pending_issues"],
+ }
+ for item in label_distribution
+ ]
+
+ current_cycle = Cycle.objects.filter(
+ workspace__slug=slug, project_id=project_id, pk=cycle_id
+ ).first()
+
+ if current_cycle:
+ current_cycle.progress_snapshot = {
+ "total_issues": old_cycle.first().total_issues,
+ "completed_issues": old_cycle.first().completed_issues,
+ "cancelled_issues": old_cycle.first().cancelled_issues,
+ "started_issues": old_cycle.first().started_issues,
+ "unstarted_issues": old_cycle.first().unstarted_issues,
+ "backlog_issues": old_cycle.first().backlog_issues,
+ "distribution": {
+ "labels": label_distribution_data,
+ "assignees": assignee_distribution_data,
+ "completion_chart": completion_chart,
+ },
+ }
+ # Save the snapshot of the current cycle
+ current_cycle.save(update_fields=["progress_snapshot"])
+
if (
new_cycle.end_date is not None
and new_cycle.end_date < timezone.now().date()
@@ -550,4 +1025,4 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
updated_cycles, ["cycle_id"], batch_size=100
)
- return Response({"message": "Success"}, status=status.HTTP_200_OK)
\ No newline at end of file
+ return Response({"message": "Success"}, status=status.HTTP_200_OK)
diff --git a/apiserver/plane/api/views/inbox.py b/apiserver/plane/api/views/inbox.py
index 4f4cdc4ef..8987e4f63 100644
--- a/apiserver/plane/api/views/inbox.py
+++ b/apiserver/plane/api/views/inbox.py
@@ -2,20 +2,28 @@
import json
# Django imports
-from django.utils import timezone
-from django.db.models import Q
from django.core.serializers.json import DjangoJSONEncoder
+from django.db.models import Q
+from django.utils import timezone
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
-from .base import BaseAPIView
-from plane.app.permissions import ProjectLitePermission
from plane.api.serializers import InboxIssueSerializer, IssueSerializer
-from plane.db.models import InboxIssue, Issue, State, ProjectMember, Project, Inbox
+from plane.app.permissions import ProjectLitePermission
from plane.bgtasks.issue_activites_task import issue_activity
+from plane.db.models import (
+ Inbox,
+ InboxIssue,
+ Issue,
+ Project,
+ ProjectMember,
+ State,
+)
+
+from .base import BaseAPIView
class InboxIssueAPIEndpoint(BaseAPIView):
@@ -43,7 +51,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
).first()
project = Project.objects.get(
- workspace__slug=self.kwargs.get("slug"), pk=self.kwargs.get("project_id")
+ workspace__slug=self.kwargs.get("slug"),
+ pk=self.kwargs.get("project_id"),
)
if inbox is None and not project.inbox_view:
@@ -51,7 +60,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
return (
InboxIssue.objects.filter(
- Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
+ Q(snoozed_till__gte=timezone.now())
+ | Q(snoozed_till__isnull=True),
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
inbox_id=inbox.id,
@@ -87,7 +97,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
def post(self, request, slug, project_id):
if not request.data.get("issue", {}).get("name", False):
return Response(
- {"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
+ {"error": "Name is required"},
+ status=status.HTTP_400_BAD_REQUEST,
)
inbox = Inbox.objects.filter(
@@ -109,7 +120,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
)
# Check for valid priority
- if not request.data.get("issue", {}).get("priority", "none") in [
+ if request.data.get("issue", {}).get("priority", "none") not in [
"low",
"medium",
"high",
@@ -117,16 +128,18 @@ class InboxIssueAPIEndpoint(BaseAPIView):
"none",
]:
return Response(
- {"error": "Invalid priority"}, status=status.HTTP_400_BAD_REQUEST
+ {"error": "Invalid priority"},
+ status=status.HTTP_400_BAD_REQUEST,
)
# Create or get state
state, _ = State.objects.get_or_create(
name="Triage",
- group="backlog",
+ group="triage",
description="Default state for managing all Inbox Issues",
project_id=project_id,
color="#ff7700",
+ is_triage=True,
)
# create an issue
@@ -141,6 +154,13 @@ class InboxIssueAPIEndpoint(BaseAPIView):
state=state,
)
+ # create an inbox issue
+ inbox_issue = InboxIssue.objects.create(
+ inbox_id=inbox.id,
+ project_id=project_id,
+ issue=issue,
+ source=request.data.get("source", "in-app"),
+ )
# Create an Issue Activity
issue_activity.delay(
type="issue.activity.created",
@@ -150,14 +170,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
- )
-
- # create an inbox issue
- inbox_issue = InboxIssue.objects.create(
- inbox_id=inbox.id,
- project_id=project_id,
- issue=issue,
- source=request.data.get("source", "in-app"),
+ inbox=str(inbox_issue.id),
)
serializer = InboxIssueSerializer(inbox_issue)
@@ -222,10 +235,14 @@ class InboxIssueAPIEndpoint(BaseAPIView):
"description_html": issue_data.get(
"description_html", issue.description_html
),
- "description": issue_data.get("description", issue.description),
+ "description": issue_data.get(
+ "description", issue.description
+ ),
}
- issue_serializer = IssueSerializer(issue, data=issue_data, partial=True)
+ issue_serializer = IssueSerializer(
+ issue, data=issue_data, partial=True
+ )
if issue_serializer.is_valid():
current_instance = issue
@@ -243,6 +260,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp()),
+ inbox=(inbox_issue.id),
)
issue_serializer.save()
else:
@@ -255,6 +273,9 @@ class InboxIssueAPIEndpoint(BaseAPIView):
serializer = InboxIssueSerializer(
inbox_issue, data=request.data, partial=True
)
+ current_instance = json.dumps(
+ InboxIssueSerializer(inbox_issue).data, cls=DjangoJSONEncoder
+ )
if serializer.is_valid():
serializer.save()
@@ -266,7 +287,9 @@ class InboxIssueAPIEndpoint(BaseAPIView):
project_id=project_id,
)
state = State.objects.filter(
- group="cancelled", workspace__slug=slug, project_id=project_id
+ group="cancelled",
+ workspace__slug=slug,
+ project_id=project_id,
).first()
if state is not None:
issue.state = state
@@ -281,20 +304,41 @@ class InboxIssueAPIEndpoint(BaseAPIView):
)
# Update the issue state only if it is in triage state
- if issue.state.name == "Triage":
+ if issue.state.is_triage:
# Move to default state
state = State.objects.filter(
- workspace__slug=slug, project_id=project_id, default=True
+ workspace__slug=slug,
+ project_id=project_id,
+ default=True,
).first()
if state is not None:
issue.state = state
issue.save()
+ # create an activity for the status change
+ issue_activity.delay(
+ type="inbox.activity.created",
+ requested_data=json.dumps(
+ request.data, cls=DjangoJSONEncoder
+ ),
+ actor_id=str(request.user.id),
+ issue_id=str(issue_id),
+ project_id=str(project_id),
+ current_instance=current_instance,
+ epoch=int(timezone.now().timestamp()),
+ notification=False,
+ origin=request.META.get("HTTP_ORIGIN"),
+ inbox=str(inbox_issue.id),
+ )
+
return Response(serializer.data, status=status.HTTP_200_OK)
- return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
+ return Response(
+ serializer.errors, status=status.HTTP_400_BAD_REQUEST
+ )
else:
return Response(
- InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
+ InboxIssueSerializer(inbox_issue).data,
+ status=status.HTTP_200_OK,
)
def delete(self, request, slug, project_id, issue_id):
diff --git a/apiserver/plane/api/views/issue.py b/apiserver/plane/api/views/issue.py
index 1ac8ddcff..ce0501dd2 100644
--- a/apiserver/plane/api/views/issue.py
+++ b/apiserver/plane/api/views/issue.py
@@ -1,22 +1,22 @@
# Python imports
import json
-from itertools import chain
+
+from django.core.serializers.json import DjangoJSONEncoder
# Django imports
from django.db import IntegrityError
from django.db.models import (
- OuterRef,
- Func,
- Q,
- F,
Case,
- When,
- Value,
CharField,
- Max,
Exists,
+ F,
+ Func,
+ Max,
+ OuterRef,
+ Q,
+ Value,
+ When,
)
-from django.core.serializers.json import DjangoJSONEncoder
from django.utils import timezone
# Third party imports
@@ -24,33 +24,96 @@ from rest_framework import status
from rest_framework.response import Response
# Module imports
-from .base import BaseAPIView, WebhookMixin
-from plane.app.permissions import (
- ProjectEntityPermission,
- ProjectMemberPermission,
- ProjectLitePermission,
-)
-from plane.db.models import (
- Issue,
- IssueAttachment,
- IssueLink,
- Project,
- Label,
- ProjectMember,
- IssueComment,
- IssueActivity,
-)
-from plane.bgtasks.issue_activites_task import issue_activity
from plane.api.serializers import (
+ IssueActivitySerializer,
+ IssueCommentSerializer,
+ IssueLinkSerializer,
IssueSerializer,
LabelSerializer,
- IssueLinkSerializer,
- IssueCommentSerializer,
- IssueActivitySerializer,
+)
+from plane.app.permissions import (
+ ProjectEntityPermission,
+ ProjectLitePermission,
+ ProjectMemberPermission,
+)
+from plane.bgtasks.issue_activites_task import issue_activity
+from plane.db.models import (
+ Issue,
+ IssueActivity,
+ IssueAttachment,
+ IssueComment,
+ IssueLink,
+ Label,
+ Project,
+ ProjectMember,
)
+from .base import BaseAPIView
-class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
+
+class WorkspaceIssueAPIEndpoint(BaseAPIView):
+ """
+ This viewset provides `retrieveByIssueId` at the workspace level.
+
+ """
+
+ model = Issue
+ webhook_event = "issue"
+ permission_classes = [ProjectEntityPermission]
+ serializer_class = IssueSerializer
+
+ @property
+ def project__identifier(self):
+ return self.kwargs.get("project__identifier", None)
+
+ def get_queryset(self):
+ return (
+ Issue.issue_objects.annotate(
+ sub_issues_count=Issue.issue_objects.filter(
+ parent=OuterRef("id")
+ )
+ .order_by()
+ .annotate(count=Func(F("id"), function="Count"))
+ .values("count")
+ )
+ .filter(workspace__slug=self.kwargs.get("slug"))
+ .filter(project__identifier=self.kwargs.get("project__identifier"))
+ .select_related("project")
+ .select_related("workspace")
+ .select_related("state")
+ .select_related("parent")
+ .prefetch_related("assignees")
+ .prefetch_related("labels")
+ .order_by(self.kwargs.get("order_by", "-created_at"))
+ ).distinct()
+
+ def get(
+ self, request, slug, project__identifier=None, issue__identifier=None
+ ):
+ if issue__identifier and project__identifier:
+ issue = Issue.issue_objects.annotate(
+ sub_issues_count=Issue.issue_objects.filter(
+ parent=OuterRef("id")
+ )
+ .order_by()
+ .annotate(count=Func(F("id"), function="Count"))
+ .values("count")
+ ).get(
+ workspace__slug=slug,
+ project__identifier=project__identifier,
+ sequence_id=issue__identifier,
+ )
+ return Response(
+ IssueSerializer(
+ issue,
+ fields=self.fields,
+ expand=self.expand,
+ ).data,
+ status=status.HTTP_200_OK,
+ )
+
+
+class IssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to issue.
@@ -67,7 +130,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
def get_queryset(self):
return (
Issue.issue_objects.annotate(
- sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
+ sub_issues_count=Issue.issue_objects.filter(
+ parent=OuterRef("id")
+ )
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -86,7 +151,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
def get(self, request, slug, project_id, pk=None):
if pk:
issue = Issue.issue_objects.annotate(
- sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
+ sub_issues_count=Issue.issue_objects.filter(
+ parent=OuterRef("id")
+ )
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -102,14 +169,19 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
- state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
+ state_order = [
+ "backlog",
+ "unstarted",
+ "started",
+ "completed",
+ "cancelled",
+ ]
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = (
self.get_queryset()
.annotate(cycle_id=F("issue_cycle__cycle_id"))
- .annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -117,7 +189,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
.values("count")
)
.annotate(
- attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
+ attachment_count=IssueAttachment.objects.filter(
+ issue=OuterRef("id")
+ )
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -127,7 +201,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
- priority_order if order_by_param == "priority" else priority_order[::-1]
+ priority_order
+ if order_by_param == "priority"
+ else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
@@ -175,7 +251,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
else order_by_param
)
).order_by(
- "-max_values" if order_by_param.startswith("-") else "max_values"
+ "-max_values"
+ if order_by_param.startswith("-")
+ else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
@@ -204,12 +282,38 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
)
if serializer.is_valid():
+ if (
+ request.data.get("external_id")
+ and request.data.get("external_source")
+ and Issue.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get("external_source"),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ issue = Issue.objects.filter(
+ workspace__slug=slug,
+ project_id=project_id,
+ external_id=request.data.get("external_id"),
+ external_source=request.data.get("external_source"),
+ ).first()
+ return Response(
+ {
+ "error": "Issue with the same external id and external source already exists",
+ "id": str(issue.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
+
serializer.save()
# Track the issue
issue_activity.delay(
type="issue.activity.created",
- requested_data=json.dumps(self.request.data, cls=DjangoJSONEncoder),
+ requested_data=json.dumps(
+ self.request.data, cls=DjangoJSONEncoder
+ ),
actor_id=str(request.user.id),
issue_id=str(serializer.data.get("id", None)),
project_id=str(project_id),
@@ -220,7 +324,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, slug, project_id, pk=None):
- issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
+ issue = Issue.objects.get(
+ workspace__slug=slug, project_id=project_id, pk=pk
+ )
project = Project.objects.get(pk=project_id)
current_instance = json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
@@ -236,6 +342,26 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
partial=True,
)
if serializer.is_valid():
+ if (
+ request.data.get("external_id")
+ and (issue.external_id != str(request.data.get("external_id")))
+ and Issue.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get(
+ "external_source", issue.external_source
+ ),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ return Response(
+ {
+ "error": "Issue with the same external id and external source already exists",
+ "id": str(issue.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
+
serializer.save()
issue_activity.delay(
type="issue.activity.updated",
@@ -250,7 +376,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, pk=None):
- issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
+ issue = Issue.objects.get(
+ workspace__slug=slug, project_id=project_id, pk=pk
+ )
current_instance = json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
)
@@ -284,7 +412,11 @@ class LabelAPIEndpoint(BaseAPIView):
return (
Label.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
- .filter(project__project_projectmember__member=self.request.user)
+ .filter(
+ project__project_projectmember__member=self.request.user,
+ project__project_projectmember__is_active=True,
+ )
+ .filter(project__archived_at__isnull=True)
.select_related("project")
.select_related("workspace")
.select_related("parent")
@@ -296,13 +428,49 @@ class LabelAPIEndpoint(BaseAPIView):
try:
serializer = LabelSerializer(data=request.data)
if serializer.is_valid():
+ if (
+ request.data.get("external_id")
+ and request.data.get("external_source")
+ and Label.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get("external_source"),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ label = Label.objects.filter(
+ workspace__slug=slug,
+ project_id=project_id,
+ external_id=request.data.get("external_id"),
+ external_source=request.data.get("external_source"),
+ ).first()
+ return Response(
+ {
+ "error": "Label with the same external id and external source already exists",
+ "id": str(label.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
+
serializer.save(project_id=project_id)
- return Response(serializer.data, status=status.HTTP_201_CREATED)
- return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
- except IntegrityError:
+ return Response(
+ serializer.data, status=status.HTTP_201_CREATED
+ )
return Response(
- {"error": "Label with the same name already exists in the project"},
- status=status.HTTP_400_BAD_REQUEST,
+ serializer.errors, status=status.HTTP_400_BAD_REQUEST
+ )
+ except IntegrityError:
+ label = Label.objects.filter(
+ workspace__slug=slug,
+ project_id=project_id,
+ name=request.data.get("name"),
+ ).first()
+ return Response(
+ {
+ "error": "Label with the same name already exists in the project",
+ "id": str(label.id),
+ },
+ status=status.HTTP_409_CONFLICT,
)
def get(self, request, slug, project_id, pk=None):
@@ -318,17 +486,39 @@ class LabelAPIEndpoint(BaseAPIView):
).data,
)
label = self.get_queryset().get(pk=pk)
- serializer = LabelSerializer(label, fields=self.fields, expand=self.expand,)
+ serializer = LabelSerializer(
+ label,
+ fields=self.fields,
+ expand=self.expand,
+ )
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, project_id, pk=None):
label = self.get_queryset().get(pk=pk)
serializer = LabelSerializer(label, data=request.data, partial=True)
if serializer.is_valid():
+ if (
+ request.data.get("external_id")
+ and (label.external_id != str(request.data.get("external_id")))
+ and Label.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get(
+ "external_source", label.external_source
+ ),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ return Response(
+ {
+ "error": "Label with the same external id and external source already exists",
+ "id": str(label.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
-
def delete(self, request, slug, project_id, pk=None):
label = self.get_queryset().get(pk=pk)
@@ -355,7 +545,11 @@ class IssueLinkAPIEndpoint(BaseAPIView):
IssueLink.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(issue_id=self.kwargs.get("issue_id"))
- .filter(project__project_projectmember__member=self.request.user)
+ .filter(
+ project__project_projectmember__member=self.request.user,
+ project__project_projectmember__is_active=True,
+ )
+ .filter(project__archived_at__isnull=True)
.order_by(self.kwargs.get("order_by", "-created_at"))
.distinct()
)
@@ -395,7 +589,9 @@ class IssueLinkAPIEndpoint(BaseAPIView):
)
issue_activity.delay(
type="link.activity.created",
- requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
+ requested_data=json.dumps(
+ serializer.data, cls=DjangoJSONEncoder
+ ),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("issue_id")),
project_id=str(self.kwargs.get("project_id")),
@@ -407,14 +603,19 @@ class IssueLinkAPIEndpoint(BaseAPIView):
def patch(self, request, slug, project_id, issue_id, pk):
issue_link = IssueLink.objects.get(
- workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
+ workspace__slug=slug,
+ project_id=project_id,
+ issue_id=issue_id,
+ pk=pk,
)
requested_data = json.dumps(request.data, cls=DjangoJSONEncoder)
current_instance = json.dumps(
IssueLinkSerializer(issue_link).data,
cls=DjangoJSONEncoder,
)
- serializer = IssueLinkSerializer(issue_link, data=request.data, partial=True)
+ serializer = IssueLinkSerializer(
+ issue_link, data=request.data, partial=True
+ )
if serializer.is_valid():
serializer.save()
issue_activity.delay(
@@ -431,7 +632,10 @@ class IssueLinkAPIEndpoint(BaseAPIView):
def delete(self, request, slug, project_id, issue_id, pk):
issue_link = IssueLink.objects.get(
- workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
+ workspace__slug=slug,
+ project_id=project_id,
+ issue_id=issue_id,
+ pk=pk,
)
current_instance = json.dumps(
IssueLinkSerializer(issue_link).data,
@@ -450,7 +654,7 @@ class IssueLinkAPIEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
-class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
+class IssueCommentAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to comments of the particular issue.
@@ -466,14 +670,17 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
def get_queryset(self):
return (
- IssueComment.objects.filter(workspace__slug=self.kwargs.get("slug"))
+ IssueComment.objects.filter(
+ workspace__slug=self.kwargs.get("slug")
+ )
.filter(project_id=self.kwargs.get("project_id"))
.filter(issue_id=self.kwargs.get("issue_id"))
- .filter(project__project_projectmember__member=self.request.user)
- .select_related("project")
- .select_related("workspace")
- .select_related("issue")
- .select_related("actor")
+ .filter(
+ project__project_projectmember__member=self.request.user,
+ project__project_projectmember__is_active=True,
+ )
+ .filter(project__archived_at__isnull=True)
+ .select_related("workspace", "project", "issue", "actor")
.annotate(
is_member=Exists(
ProjectMember.objects.filter(
@@ -509,6 +716,31 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
)
def post(self, request, slug, project_id, issue_id):
+ # Check if a comment with the same external id and external source already exists
+ if (
+ request.data.get("external_id")
+ and request.data.get("external_source")
+ and IssueComment.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get("external_source"),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ issue_comment = IssueComment.objects.filter(
+ workspace__slug=slug,
+ project_id=project_id,
+ external_id=request.data.get("external_id"),
+ external_source=request.data.get("external_source"),
+ ).first()
+ return Response(
+ {
+ "error": "Issue Comment with the same external id and external source already exists",
+ "id": str(issue_comment.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
+
serializer = IssueCommentSerializer(data=request.data)
if serializer.is_valid():
serializer.save(
@@ -518,7 +750,9 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
)
issue_activity.delay(
type="comment.activity.created",
- requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
+ requested_data=json.dumps(
+ serializer.data, cls=DjangoJSONEncoder
+ ),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("issue_id")),
project_id=str(self.kwargs.get("project_id")),
@@ -530,13 +764,41 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
def patch(self, request, slug, project_id, issue_id, pk):
issue_comment = IssueComment.objects.get(
- workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
+ workspace__slug=slug,
+ project_id=project_id,
+ issue_id=issue_id,
+ pk=pk,
)
requested_data = json.dumps(self.request.data, cls=DjangoJSONEncoder)
current_instance = json.dumps(
IssueCommentSerializer(issue_comment).data,
cls=DjangoJSONEncoder,
)
+
+ # Check if a comment with the same external id and external source already exists
+ if (
+ request.data.get("external_id")
+ and (
+ issue_comment.external_id
+ != str(request.data.get("external_id"))
+ )
+ and IssueComment.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get(
+ "external_source", issue_comment.external_source
+ ),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ return Response(
+ {
+ "error": "Issue Comment with the same external id and external source already exists",
+ "id": str(issue_comment.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
+
serializer = IssueCommentSerializer(
issue_comment, data=request.data, partial=True
)
@@ -556,7 +818,10 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
def delete(self, request, slug, project_id, issue_id, pk):
issue_comment = IssueComment.objects.get(
- workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
+ workspace__slug=slug,
+ project_id=project_id,
+ issue_id=issue_id,
+ pk=pk,
)
current_instance = json.dumps(
IssueCommentSerializer(issue_comment).data,
@@ -588,10 +853,12 @@ class IssueActivityAPIEndpoint(BaseAPIView):
.filter(
~Q(field__in=["comment", "vote", "reaction", "draft"]),
project__project_projectmember__member=self.request.user,
+ project__project_projectmember__is_active=True,
)
+ .filter(project__archived_at__isnull=True)
.select_related("actor", "workspace", "issue", "project")
).order_by(request.GET.get("order_by", "created_at"))
-
+
if pk:
issue_activities = issue_activities.get(pk=pk)
serializer = IssueActivitySerializer(issue_activities)
diff --git a/apiserver/plane/api/views/module.py b/apiserver/plane/api/views/module.py
index 959b7ccc3..eeb29dad2 100644
--- a/apiserver/plane/api/views/module.py
+++ b/apiserver/plane/api/views/module.py
@@ -2,35 +2,38 @@
import json
# Django imports
-from django.db.models import Count, Prefetch, Q, F, Func, OuterRef
-from django.utils import timezone
from django.core import serializers
+from django.db.models import Count, F, Func, OuterRef, Prefetch, Q
+from django.utils import timezone
+from django.core.serializers.json import DjangoJSONEncoder
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
-from .base import BaseAPIView, WebhookMixin
+from plane.api.serializers import (
+ IssueSerializer,
+ ModuleIssueSerializer,
+ ModuleSerializer,
+)
from plane.app.permissions import ProjectEntityPermission
+from plane.bgtasks.issue_activites_task import issue_activity
from plane.db.models import (
- Project,
- Module,
- ModuleLink,
Issue,
- ModuleIssue,
IssueAttachment,
IssueLink,
+ Module,
+ ModuleIssue,
+ ModuleLink,
+ Project,
)
-from plane.api.serializers import (
- ModuleSerializer,
- ModuleIssueSerializer,
- IssueSerializer,
-)
-from plane.bgtasks.issue_activites_task import issue_activity
+
+from .base import BaseAPIView
+from plane.bgtasks.webhook_task import model_activity
-class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
+class ModuleAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to module.
@@ -55,7 +58,9 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
.prefetch_related(
Prefetch(
"link_module",
- queryset=ModuleLink.objects.select_related("module", "created_by"),
+ queryset=ModuleLink.objects.select_related(
+ "module", "created_by"
+ ),
)
)
.annotate(
@@ -65,6 +70,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
),
+ distinct=True,
),
)
.annotate(
@@ -75,6 +81,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
),
+ distinct=True,
)
)
.annotate(
@@ -85,6 +92,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
),
+ distinct=True,
)
)
.annotate(
@@ -95,6 +103,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
),
+ distinct=True,
)
)
.annotate(
@@ -105,6 +114,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
),
+ distinct=True,
)
)
.annotate(
@@ -115,6 +125,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
),
+ distinct=True,
)
)
.order_by(self.kwargs.get("order_by", "-created_at"))
@@ -122,25 +133,114 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
def post(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
- serializer = ModuleSerializer(data=request.data, context={"project_id": project_id, "workspace_id": project.workspace_id})
+ serializer = ModuleSerializer(
+ data=request.data,
+ context={
+ "project_id": project_id,
+ "workspace_id": project.workspace_id,
+ },
+ )
if serializer.is_valid():
+ if (
+ request.data.get("external_id")
+ and request.data.get("external_source")
+ and Module.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get("external_source"),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ module = Module.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get("external_source"),
+ external_id=request.data.get("external_id"),
+ ).first()
+ return Response(
+ {
+ "error": "Module with the same external id and external source already exists",
+ "id": str(module.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
serializer.save()
+ # Send the model activity
+ model_activity.delay(
+ model_name="module",
+ model_id=str(serializer.data["id"]),
+ requested_data=request.data,
+ current_instance=None,
+ actor_id=request.user.id,
+ slug=slug,
+ origin=request.META.get("HTTP_ORIGIN"),
+ )
module = Module.objects.get(pk=serializer.data["id"])
serializer = ModuleSerializer(module)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
-
+
def patch(self, request, slug, project_id, pk):
- module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
- serializer = ModuleSerializer(module, data=request.data, context={"project_id": project_id}, partial=True)
+ module = Module.objects.get(
+ pk=pk, project_id=project_id, workspace__slug=slug
+ )
+
+ current_instance = json.dumps(
+ ModuleSerializer(module).data, cls=DjangoJSONEncoder
+ )
+
+ if module.archived_at:
+ return Response(
+ {"error": "Archived module cannot be edited"},
+ status=status.HTTP_400_BAD_REQUEST,
+ )
+ serializer = ModuleSerializer(
+ module,
+ data=request.data,
+ context={"project_id": project_id},
+ partial=True,
+ )
if serializer.is_valid():
+ if (
+ request.data.get("external_id")
+ and (module.external_id != request.data.get("external_id"))
+ and Module.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get(
+ "external_source", module.external_source
+ ),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ return Response(
+ {
+ "error": "Module with the same external id and external source already exists",
+ "id": str(module.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
serializer.save()
- return Response(serializer.data, status=status.HTTP_201_CREATED)
+
+ # Send the model activity
+ model_activity.delay(
+ model_name="module",
+ model_id=str(serializer.data["id"]),
+ requested_data=request.data,
+ current_instance=current_instance,
+ actor_id=request.user.id,
+ slug=slug,
+ origin=request.META.get("HTTP_ORIGIN"),
+ )
+
+ return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def get(self, request, slug, project_id, pk=None):
if pk:
- queryset = self.get_queryset().get(pk=pk)
+ queryset = (
+ self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
+ )
data = ModuleSerializer(
queryset,
fields=self.fields,
@@ -152,7 +252,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
)
return self.paginate(
request=request,
- queryset=(self.get_queryset()),
+ queryset=(self.get_queryset().filter(archived_at__isnull=True)),
on_results=lambda modules: ModuleSerializer(
modules,
many=True,
@@ -162,9 +262,13 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
)
def delete(self, request, slug, project_id, pk):
- module = Module.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
+ module = Module.objects.get(
+ workspace__slug=slug, project_id=project_id, pk=pk
+ )
module_issues = list(
- ModuleIssue.objects.filter(module_id=pk).values_list("issue", flat=True)
+ ModuleIssue.objects.filter(module_id=pk).values_list(
+ "issue", flat=True
+ )
)
issue_activity.delay(
type="module.activity.deleted",
@@ -185,7 +289,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
-class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
+class ModuleIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to module issues.
@@ -204,7 +308,9 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
def get_queryset(self):
return (
ModuleIssue.objects.annotate(
- sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue"))
+ sub_issues_count=Issue.issue_objects.filter(
+ parent=OuterRef("issue")
+ )
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -212,7 +318,11 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(module_id=self.kwargs.get("module_id"))
- .filter(project__project_projectmember__member=self.request.user)
+ .filter(
+ project__project_projectmember__member=self.request.user,
+ project__project_projectmember__is_active=True,
+ )
+ .filter(project__archived_at__isnull=True)
.select_related("project")
.select_related("workspace")
.select_related("module")
@@ -228,7 +338,9 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
issues = (
Issue.issue_objects.filter(issue_module__module_id=module_id)
.annotate(
- sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
+ sub_issues_count=Issue.issue_objects.filter(
+ parent=OuterRef("id")
+ )
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -250,7 +362,9 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
.values("count")
)
.annotate(
- attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
+ attachment_count=IssueAttachment.objects.filter(
+ issue=OuterRef("id")
+ )
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -271,7 +385,8 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
issues = request.data.get("issues", [])
if not len(issues):
return Response(
- {"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
+ {"error": "Issues are required"},
+ status=status.HTTP_400_BAD_REQUEST,
)
module = Module.objects.get(
workspace__slug=slug, project_id=project_id, pk=module_id
@@ -354,7 +469,10 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
def delete(self, request, slug, project_id, module_id, issue_id):
module_issue = ModuleIssue.objects.get(
- workspace__slug=slug, project_id=project_id, module_id=module_id, issue_id=issue_id
+ workspace__slug=slug,
+ project_id=project_id,
+ module_id=module_id,
+ issue_id=issue_id,
)
module_issue.delete()
issue_activity.delay(
@@ -371,4 +489,131 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
- return Response(status=status.HTTP_204_NO_CONTENT)
\ No newline at end of file
+ return Response(status=status.HTTP_204_NO_CONTENT)
+
+
+class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
+
+ permission_classes = [
+ ProjectEntityPermission,
+ ]
+
+ def get_queryset(self):
+ return (
+ Module.objects.filter(project_id=self.kwargs.get("project_id"))
+ .filter(workspace__slug=self.kwargs.get("slug"))
+ .filter(archived_at__isnull=False)
+ .select_related("project")
+ .select_related("workspace")
+ .select_related("lead")
+ .prefetch_related("members")
+ .prefetch_related(
+ Prefetch(
+ "link_module",
+ queryset=ModuleLink.objects.select_related(
+ "module", "created_by"
+ ),
+ )
+ )
+ .annotate(
+ total_issues=Count(
+ "issue_module",
+ filter=Q(
+ issue_module__issue__archived_at__isnull=True,
+ issue_module__issue__is_draft=False,
+ ),
+ distinct=True,
+ ),
+ )
+ .annotate(
+ completed_issues=Count(
+ "issue_module__issue__state__group",
+ filter=Q(
+ issue_module__issue__state__group="completed",
+ issue_module__issue__archived_at__isnull=True,
+ issue_module__issue__is_draft=False,
+ ),
+ distinct=True,
+ )
+ )
+ .annotate(
+ cancelled_issues=Count(
+ "issue_module__issue__state__group",
+ filter=Q(
+ issue_module__issue__state__group="cancelled",
+ issue_module__issue__archived_at__isnull=True,
+ issue_module__issue__is_draft=False,
+ ),
+ distinct=True,
+ )
+ )
+ .annotate(
+ started_issues=Count(
+ "issue_module__issue__state__group",
+ filter=Q(
+ issue_module__issue__state__group="started",
+ issue_module__issue__archived_at__isnull=True,
+ issue_module__issue__is_draft=False,
+ ),
+ distinct=True,
+ )
+ )
+ .annotate(
+ unstarted_issues=Count(
+ "issue_module__issue__state__group",
+ filter=Q(
+ issue_module__issue__state__group="unstarted",
+ issue_module__issue__archived_at__isnull=True,
+ issue_module__issue__is_draft=False,
+ ),
+ distinct=True,
+ )
+ )
+ .annotate(
+ backlog_issues=Count(
+ "issue_module__issue__state__group",
+ filter=Q(
+ issue_module__issue__state__group="backlog",
+ issue_module__issue__archived_at__isnull=True,
+ issue_module__issue__is_draft=False,
+ ),
+ distinct=True,
+ )
+ )
+ .order_by(self.kwargs.get("order_by", "-created_at"))
+ )
+
+ def get(self, request, slug, project_id, pk):
+ return self.paginate(
+ request=request,
+ queryset=(self.get_queryset()),
+ on_results=lambda modules: ModuleSerializer(
+ modules,
+ many=True,
+ fields=self.fields,
+ expand=self.expand,
+ ).data,
+ )
+
+ def post(self, request, slug, project_id, pk):
+ module = Module.objects.get(
+ pk=pk, project_id=project_id, workspace__slug=slug
+ )
+ if module.status not in ["completed", "cancelled"]:
+ return Response(
+ {
+ "error": "Only completed or cancelled modules can be archived"
+ },
+ status=status.HTTP_400_BAD_REQUEST,
+ )
+ module.archived_at = timezone.now()
+ module.save()
+ return Response(status=status.HTTP_204_NO_CONTENT)
+
+ def delete(self, request, slug, project_id, pk):
+ module = Module.objects.get(
+ pk=pk, project_id=project_id, workspace__slug=slug
+ )
+ module.archived_at = None
+ module.save()
+ return Response(status=status.HTTP_204_NO_CONTENT)
diff --git a/apiserver/plane/api/views/project.py b/apiserver/plane/api/views/project.py
index e8dc9f5a9..019ab704e 100644
--- a/apiserver/plane/api/views/project.py
+++ b/apiserver/plane/api/views/project.py
@@ -1,31 +1,37 @@
+# Python imports
+import json
+
# Django imports
from django.db import IntegrityError
-from django.db.models import Exists, OuterRef, Q, F, Func, Subquery, Prefetch
+from django.db.models import Exists, F, Func, OuterRef, Prefetch, Q, Subquery
+from django.utils import timezone
+from django.core.serializers.json import DjangoJSONEncoder
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework.serializers import ValidationError
+from plane.api.serializers import ProjectSerializer
+from plane.app.permissions import ProjectBasePermission
+
# Module imports
from plane.db.models import (
- Workspace,
- Project,
- ProjectFavorite,
- ProjectMember,
- ProjectDeployBoard,
- State,
Cycle,
- Module,
- IssueProperty,
Inbox,
+ IssueProperty,
+ Module,
+ Project,
+ ProjectDeployBoard,
+ ProjectMember,
+ State,
+ Workspace,
)
-from plane.app.permissions import ProjectBasePermission
-from plane.api.serializers import ProjectSerializer
-from .base import BaseAPIView, WebhookMixin
+from plane.bgtasks.webhook_task import model_activity
+from .base import BaseAPIView
-class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
+class ProjectAPIEndpoint(BaseAPIView):
"""Project Endpoints to create, update, list, retrieve and delete endpoint"""
serializer_class = ProjectSerializer
@@ -39,9 +45,18 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
def get_queryset(self):
return (
Project.objects.filter(workspace__slug=self.kwargs.get("slug"))
- .filter(Q(project_projectmember__member=self.request.user) | Q(network=2))
+ .filter(
+ Q(
+ project_projectmember__member=self.request.user,
+ project_projectmember__is_active=True,
+ )
+ | Q(network=2)
+ )
.select_related(
- "workspace", "workspace__owner", "default_assignee", "project_lead"
+ "workspace",
+ "workspace__owner",
+ "default_assignee",
+ "project_lead",
)
.annotate(
is_member=Exists(
@@ -94,8 +109,8 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
.distinct()
)
- def get(self, request, slug, project_id=None):
- if project_id is None:
+ def get(self, request, slug, pk=None):
+ if pk is None:
sort_order_query = ProjectMember.objects.filter(
member=request.user,
project_id=OuterRef("pk"),
@@ -120,11 +135,18 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
request=request,
queryset=(projects),
on_results=lambda projects: ProjectSerializer(
- projects, many=True, fields=self.fields, expand=self.expand,
+ projects,
+ many=True,
+ fields=self.fields,
+ expand=self.expand,
).data,
)
- project = self.get_queryset().get(workspace__slug=slug, pk=project_id)
- serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand,)
+ project = self.get_queryset().get(workspace__slug=slug, pk=pk)
+ serializer = ProjectSerializer(
+ project,
+ fields=self.fields,
+ expand=self.expand,
+ )
return Response(serializer.data, status=status.HTTP_200_OK)
def post(self, request, slug):
@@ -137,8 +159,10 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
serializer.save()
# Add the user as Administrator to the project
- project_member = ProjectMember.objects.create(
- project_id=serializer.data["id"], member=request.user, role=20
+ _ = ProjectMember.objects.create(
+ project_id=serializer.data["id"],
+ member=request.user,
+ role=20,
)
# Also create the issue property for the user
_ = IssueProperty.objects.create(
@@ -211,9 +235,26 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
]
)
- project = self.get_queryset().filter(pk=serializer.data["id"]).first()
+ project = (
+ self.get_queryset()
+ .filter(pk=serializer.data["id"])
+ .first()
+ )
+ # Model activity
+ model_activity.delay(
+ model_name="project",
+ model_id=str(project.id),
+ requested_data=request.data,
+ current_instance=None,
+ actor_id=request.user.id,
+ slug=slug,
+ origin=request.META.get("HTTP_ORIGIN"),
+ )
+
serializer = ProjectSerializer(project)
- return Response(serializer.data, status=status.HTTP_201_CREATED)
+ return Response(
+ serializer.data, status=status.HTTP_201_CREATED
+ )
return Response(
serializer.errors,
status=status.HTTP_400_BAD_REQUEST,
@@ -224,20 +265,29 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
{"name": "The project name is already taken"},
status=status.HTTP_410_GONE,
)
- except Workspace.DoesNotExist as e:
+ except Workspace.DoesNotExist:
return Response(
- {"error": "Workspace does not exist"}, status=status.HTTP_404_NOT_FOUND
+ {"error": "Workspace does not exist"},
+ status=status.HTTP_404_NOT_FOUND,
)
- except ValidationError as e:
+ except ValidationError:
return Response(
{"identifier": "The project identifier is already taken"},
status=status.HTTP_410_GONE,
)
- def patch(self, request, slug, project_id=None):
+ def patch(self, request, slug, pk):
try:
workspace = Workspace.objects.get(slug=slug)
- project = Project.objects.get(pk=project_id)
+ project = Project.objects.get(pk=pk)
+ current_instance = json.dumps(
+ ProjectSerializer(project).data, cls=DjangoJSONEncoder
+ )
+ if project.archived_at:
+ return Response(
+ {"error": "Archived project cannot be updated"},
+ status=status.HTTP_400_BAD_REQUEST,
+ )
serializer = ProjectSerializer(
project,
@@ -250,22 +300,42 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
serializer.save()
if serializer.data["inbox_view"]:
Inbox.objects.get_or_create(
- name=f"{project.name} Inbox", project=project, is_default=True
+ name=f"{project.name} Inbox",
+ project=project,
+ is_default=True,
)
# Create the triage state in Backlog group
State.objects.get_or_create(
name="Triage",
- group="backlog",
+ group="triage",
description="Default state for managing all Inbox Issues",
- project_id=project_id,
+ project_id=pk,
color="#ff7700",
+ is_triage=True,
)
- project = self.get_queryset().filter(pk=serializer.data["id"]).first()
+ project = (
+ self.get_queryset()
+ .filter(pk=serializer.data["id"])
+ .first()
+ )
+
+ model_activity.delay(
+ model_name="project",
+ model_id=str(project.id),
+ requested_data=request.data,
+ current_instance=current_instance,
+ actor_id=request.user.id,
+ slug=slug,
+ origin=request.META.get("HTTP_ORIGIN"),
+ )
+
serializer = ProjectSerializer(project)
return Response(serializer.data, status=status.HTTP_200_OK)
- return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
+ return Response(
+ serializer.errors, status=status.HTTP_400_BAD_REQUEST
+ )
except IntegrityError as e:
if "already exists" in str(e):
return Response(
@@ -274,15 +344,35 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
)
except (Project.DoesNotExist, Workspace.DoesNotExist):
return Response(
- {"error": "Project does not exist"}, status=status.HTTP_404_NOT_FOUND
+ {"error": "Project does not exist"},
+ status=status.HTTP_404_NOT_FOUND,
)
- except ValidationError as e:
+ except ValidationError:
return Response(
{"identifier": "The project identifier is already taken"},
status=status.HTTP_410_GONE,
)
+ def delete(self, request, slug, pk):
+ project = Project.objects.get(pk=pk, workspace__slug=slug)
+ project.delete()
+ return Response(status=status.HTTP_204_NO_CONTENT)
+
+
+class ProjectArchiveUnarchiveAPIEndpoint(BaseAPIView):
+
+ permission_classes = [
+ ProjectBasePermission,
+ ]
+
+ def post(self, request, slug, project_id):
+ project = Project.objects.get(pk=project_id, workspace__slug=slug)
+ project.archived_at = timezone.now()
+ project.save()
+ return Response(status=status.HTTP_204_NO_CONTENT)
+
def delete(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
- project.delete()
- return Response(status=status.HTTP_204_NO_CONTENT)
\ No newline at end of file
+ project.archived_at = None
+ project.save()
+ return Response(status=status.HTTP_204_NO_CONTENT)
diff --git a/apiserver/plane/api/views/state.py b/apiserver/plane/api/views/state.py
index 3d2861778..dd239754c 100644
--- a/apiserver/plane/api/views/state.py
+++ b/apiserver/plane/api/views/state.py
@@ -1,18 +1,16 @@
-# Python imports
-from itertools import groupby
-
# Django imports
-from django.db.models import Q
+from django.db import IntegrityError
# Third party imports
-from rest_framework.response import Response
from rest_framework import status
+from rest_framework.response import Response
+
+from plane.api.serializers import StateSerializer
+from plane.app.permissions import ProjectEntityPermission
+from plane.db.models import Issue, State
# Module imports
from .base import BaseAPIView
-from plane.api.serializers import StateSerializer
-from plane.app.permissions import ProjectEntityPermission
-from plane.db.models import State, Issue
class StateAPIEndpoint(BaseAPIView):
@@ -26,23 +24,73 @@ class StateAPIEndpoint(BaseAPIView):
return (
State.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
- .filter(project__project_projectmember__member=self.request.user)
- .filter(~Q(name="Triage"))
+ .filter(
+ project__project_projectmember__member=self.request.user,
+ project__project_projectmember__is_active=True,
+ )
+ .filter(is_triage=False)
+ .filter(project__archived_at__isnull=True)
.select_related("project")
.select_related("workspace")
.distinct()
)
def post(self, request, slug, project_id):
- serializer = StateSerializer(data=request.data, context={"project_id": project_id})
- if serializer.is_valid():
- serializer.save(project_id=project_id)
- return Response(serializer.data, status=status.HTTP_200_OK)
- return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
+ try:
+ serializer = StateSerializer(
+ data=request.data, context={"project_id": project_id}
+ )
+ if serializer.is_valid():
+ if (
+ request.data.get("external_id")
+ and request.data.get("external_source")
+ and State.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get("external_source"),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ state = State.objects.filter(
+ workspace__slug=slug,
+ project_id=project_id,
+ external_id=request.data.get("external_id"),
+ external_source=request.data.get("external_source"),
+ ).first()
+ return Response(
+ {
+ "error": "State with the same external id and external source already exists",
+ "id": str(state.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
+
+ serializer.save(project_id=project_id)
+ return Response(serializer.data, status=status.HTTP_200_OK)
+ return Response(
+ serializer.errors, status=status.HTTP_400_BAD_REQUEST
+ )
+ except IntegrityError:
+ state = State.objects.filter(
+ workspace__slug=slug,
+ project_id=project_id,
+ name=request.data.get("name"),
+ ).first()
+ return Response(
+ {
+ "error": "State with the same name already exists in the project",
+ "id": str(state.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
def get(self, request, slug, project_id, state_id=None):
if state_id:
- serializer = StateSerializer(self.get_queryset().get(pk=state_id))
+ serializer = StateSerializer(
+ self.get_queryset().get(pk=state_id),
+ fields=self.fields,
+ expand=self.expand,
+ )
return Response(serializer.data, status=status.HTTP_200_OK)
return self.paginate(
request=request,
@@ -57,21 +105,26 @@ class StateAPIEndpoint(BaseAPIView):
def delete(self, request, slug, project_id, state_id):
state = State.objects.get(
- ~Q(name="Triage"),
+ is_triage=False,
pk=state_id,
project_id=project_id,
workspace__slug=slug,
)
if state.default:
- return Response({"error": "Default state cannot be deleted"}, status=status.HTTP_400_BAD_REQUEST)
+ return Response(
+ {"error": "Default state cannot be deleted"},
+ status=status.HTTP_400_BAD_REQUEST,
+ )
# Check for any issues in the state
issue_exist = Issue.issue_objects.filter(state=state_id).exists()
if issue_exist:
return Response(
- {"error": "The state is not empty, only empty states can be deleted"},
+ {
+ "error": "The state is not empty, only empty states can be deleted"
+ },
status=status.HTTP_400_BAD_REQUEST,
)
@@ -79,9 +132,30 @@ class StateAPIEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, project_id, state_id=None):
- state = State.objects.get(workspace__slug=slug, project_id=project_id, pk=state_id)
+ state = State.objects.get(
+ workspace__slug=slug, project_id=project_id, pk=state_id
+ )
serializer = StateSerializer(state, data=request.data, partial=True)
if serializer.is_valid():
+ if (
+ request.data.get("external_id")
+ and (state.external_id != str(request.data.get("external_id")))
+ and State.objects.filter(
+ project_id=project_id,
+ workspace__slug=slug,
+ external_source=request.data.get(
+ "external_source", state.external_source
+ ),
+ external_id=request.data.get("external_id"),
+ ).exists()
+ ):
+ return Response(
+ {
+ "error": "State with the same external id and external source already exists",
+ "id": str(state.id),
+ },
+ status=status.HTTP_409_CONFLICT,
+ )
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
- return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
\ No newline at end of file
+ return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
diff --git a/apiserver/plane/app/middleware/api_authentication.py b/apiserver/plane/app/middleware/api_authentication.py
index ddabb4132..893df7f84 100644
--- a/apiserver/plane/app/middleware/api_authentication.py
+++ b/apiserver/plane/app/middleware/api_authentication.py
@@ -25,7 +25,10 @@ class APIKeyAuthentication(authentication.BaseAuthentication):
def validate_api_token(self, token):
try:
api_token = APIToken.objects.get(
- Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
+ Q(
+ Q(expired_at__gt=timezone.now())
+ | Q(expired_at__isnull=True)
+ ),
token=token,
is_active=True,
)
diff --git a/apiserver/plane/app/permissions/__init__.py b/apiserver/plane/app/permissions/__init__.py
index 2298f3442..8e8793504 100644
--- a/apiserver/plane/app/permissions/__init__.py
+++ b/apiserver/plane/app/permissions/__init__.py
@@ -1,4 +1,3 @@
-
from .workspace import (
WorkSpaceBasePermission,
WorkspaceOwnerPermission,
@@ -13,5 +12,3 @@ from .project import (
ProjectMemberPermission,
ProjectLitePermission,
)
-
-
diff --git a/apiserver/plane/app/permissions/project.py b/apiserver/plane/app/permissions/project.py
index 80775cbf6..25e5aaeb0 100644
--- a/apiserver/plane/app/permissions/project.py
+++ b/apiserver/plane/app/permissions/project.py
@@ -1,8 +1,8 @@
# Third Party imports
-from rest_framework.permissions import BasePermission, SAFE_METHODS
+from rest_framework.permissions import SAFE_METHODS, BasePermission
# Module import
-from plane.db.models import WorkspaceMember, ProjectMember
+from plane.db.models import ProjectMember, WorkspaceMember
# Permission Mappings
Admin = 20
@@ -79,6 +79,16 @@ class ProjectEntityPermission(BasePermission):
if request.user.is_anonymous:
return False
+ # Handle requests based on project__identifier
+ if hasattr(view, "project__identifier") and view.project__identifier:
+ if request.method in SAFE_METHODS:
+ return ProjectMember.objects.filter(
+ workspace__slug=view.workspace_slug,
+ member=request.user,
+ project__identifier=view.project__identifier,
+ is_active=True,
+ ).exists()
+
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return ProjectMember.objects.filter(
diff --git a/apiserver/plane/app/serializers/__init__.py b/apiserver/plane/app/serializers/__init__.py
index c406453b7..bdcdf6c0d 100644
--- a/apiserver/plane/app/serializers/__init__.py
+++ b/apiserver/plane/app/serializers/__init__.py
@@ -7,6 +7,8 @@ from .user import (
UserAdminLiteSerializer,
UserMeSerializer,
UserMeSettingsSerializer,
+ ProfileSerializer,
+ AccountSerializer,
)
from .workspace import (
WorkSpaceSerializer,
@@ -17,6 +19,7 @@ from .workspace import (
WorkspaceThemeSerializer,
WorkspaceMemberAdminSerializer,
WorkspaceMemberMeSerializer,
+ WorkspaceUserPropertiesSerializer,
)
from .project import (
ProjectSerializer,
@@ -25,20 +28,23 @@ from .project import (
ProjectMemberSerializer,
ProjectMemberInviteSerializer,
ProjectIdentifierSerializer,
- ProjectFavoriteSerializer,
ProjectLiteSerializer,
ProjectMemberLiteSerializer,
ProjectDeployBoardSerializer,
ProjectMemberAdminSerializer,
ProjectPublicMemberSerializer,
+ ProjectMemberRoleSerializer,
)
from .state import StateSerializer, StateLiteSerializer
-from .view import GlobalViewSerializer, IssueViewSerializer, IssueViewFavoriteSerializer
+from .view import (
+ GlobalViewSerializer,
+ IssueViewSerializer,
+)
from .cycle import (
CycleSerializer,
CycleIssueSerializer,
- CycleFavoriteSerializer,
CycleWriteSerializer,
+ CycleUserPropertiesSerializer,
)
from .asset import FileAssetSerializer
from .issue import (
@@ -52,6 +58,7 @@ from .issue import (
IssueFlatSerializer,
IssueStateSerializer,
IssueLinkSerializer,
+ IssueInboxSerializer,
IssueLiteSerializer,
IssueAttachmentSerializer,
IssueSubscriberSerializer,
@@ -61,44 +68,56 @@ from .issue import (
IssueRelationSerializer,
RelatedIssueSerializer,
IssuePublicSerializer,
+ IssueDetailSerializer,
+ IssueReactionLiteSerializer,
+ IssueAttachmentLiteSerializer,
+ IssueLinkLiteSerializer,
)
from .module import (
+ ModuleDetailSerializer,
ModuleWriteSerializer,
ModuleSerializer,
ModuleIssueSerializer,
ModuleLinkSerializer,
- ModuleFavoriteSerializer,
+ ModuleUserPropertiesSerializer,
)
from .api import APITokenSerializer, APITokenReadSerializer
-from .integration import (
- IntegrationSerializer,
- WorkspaceIntegrationSerializer,
- GithubIssueSyncSerializer,
- GithubRepositorySerializer,
- GithubRepositorySyncSerializer,
- GithubCommentSyncSerializer,
- SlackProjectSyncSerializer,
-)
-
from .importer import ImporterSerializer
-from .page import PageSerializer, PageLogSerializer, SubPageSerializer, PageFavoriteSerializer
+from .page import (
+ PageSerializer,
+ PageLogSerializer,
+ SubPageSerializer,
+ PageDetailSerializer,
+)
from .estimate import (
EstimateSerializer,
EstimatePointSerializer,
EstimateReadSerializer,
+ WorkspaceEstimateSerializer,
)
-from .inbox import InboxSerializer, InboxIssueSerializer, IssueStateInboxSerializer
+from .inbox import (
+ InboxSerializer,
+ InboxIssueSerializer,
+ IssueStateInboxSerializer,
+ InboxIssueLiteSerializer,
+ InboxIssueDetailSerializer,
+)
from .analytic import AnalyticViewSerializer
-from .notification import NotificationSerializer
+from .notification import (
+ NotificationSerializer,
+ UserNotificationPreferenceSerializer,
+)
from .exporter import ExporterHistorySerializer
-from .webhook import WebhookSerializer, WebhookLogSerializer
\ No newline at end of file
+from .webhook import WebhookSerializer, WebhookLogSerializer
+
+from .dashboard import DashboardSerializer, WidgetSerializer
diff --git a/apiserver/plane/app/serializers/api.py b/apiserver/plane/app/serializers/api.py
index 08bb747d9..264a58f92 100644
--- a/apiserver/plane/app/serializers/api.py
+++ b/apiserver/plane/app/serializers/api.py
@@ -3,7 +3,6 @@ from plane.db.models import APIToken, APIActivityLog
class APITokenSerializer(BaseSerializer):
-
class Meta:
model = APIToken
fields = "__all__"
@@ -18,14 +17,12 @@ class APITokenSerializer(BaseSerializer):
class APITokenReadSerializer(BaseSerializer):
-
class Meta:
model = APIToken
- exclude = ('token',)
+ exclude = ("token",)
class APIActivityLogSerializer(BaseSerializer):
-
class Meta:
model = APIActivityLog
fields = "__all__"
diff --git a/apiserver/plane/app/serializers/base.py b/apiserver/plane/app/serializers/base.py
index 89c9725d9..6693ba931 100644
--- a/apiserver/plane/app/serializers/base.py
+++ b/apiserver/plane/app/serializers/base.py
@@ -4,16 +4,17 @@ from rest_framework import serializers
class BaseSerializer(serializers.ModelSerializer):
id = serializers.PrimaryKeyRelatedField(read_only=True)
-class DynamicBaseSerializer(BaseSerializer):
+class DynamicBaseSerializer(BaseSerializer):
def __init__(self, *args, **kwargs):
# If 'fields' is provided in the arguments, remove it and store it separately.
# This is done so as not to pass this custom argument up to the superclass.
- fields = kwargs.pop("fields", None)
+ fields = kwargs.pop("fields", [])
+ self.expand = kwargs.pop("expand", []) or []
+ fields = self.expand
# Call the initialization of the superclass.
super().__init__(*args, **kwargs)
-
# If 'fields' was provided, filter the fields of the serializer accordingly.
if fields is not None:
self.fields = self._filter_fields(fields)
@@ -31,7 +32,7 @@ class DynamicBaseSerializer(BaseSerializer):
# loop through its keys and values.
if isinstance(field_name, dict):
for key, value in field_name.items():
- # If the value of this nested field is a list,
+ # If the value of this nested field is a list,
# perform a recursive filter on it.
if isinstance(value, list):
self._filter_fields(self.fields[key], value)
@@ -47,12 +48,134 @@ class DynamicBaseSerializer(BaseSerializer):
elif isinstance(item, dict):
allowed.append(list(item.keys())[0])
- # Convert the current serializer's fields and the allowed fields to sets.
- existing = set(self.fields)
- allowed = set(allowed)
+ for field in allowed:
+ if field not in self.fields:
+ from . import (
+ WorkspaceLiteSerializer,
+ ProjectLiteSerializer,
+ UserLiteSerializer,
+ StateLiteSerializer,
+ IssueSerializer,
+ LabelSerializer,
+ CycleIssueSerializer,
+ IssueLiteSerializer,
+ IssueRelationSerializer,
+ InboxIssueLiteSerializer,
+ IssueReactionLiteSerializer,
+ IssueAttachmentLiteSerializer,
+ IssueLinkLiteSerializer,
+ )
- # Remove fields from the serializer that aren't in the 'allowed' list.
- for field_name in (existing - allowed):
- self.fields.pop(field_name)
+ # Expansion mapper
+ expansion = {
+ "user": UserLiteSerializer,
+ "workspace": WorkspaceLiteSerializer,
+ "project": ProjectLiteSerializer,
+ "default_assignee": UserLiteSerializer,
+ "project_lead": UserLiteSerializer,
+ "state": StateLiteSerializer,
+ "created_by": UserLiteSerializer,
+ "issue": IssueSerializer,
+ "actor": UserLiteSerializer,
+ "owned_by": UserLiteSerializer,
+ "members": UserLiteSerializer,
+ "assignees": UserLiteSerializer,
+ "labels": LabelSerializer,
+ "issue_cycle": CycleIssueSerializer,
+ "parent": IssueLiteSerializer,
+ "issue_relation": IssueRelationSerializer,
+ "issue_inbox": InboxIssueLiteSerializer,
+ "issue_reactions": IssueReactionLiteSerializer,
+ "issue_attachment": IssueAttachmentLiteSerializer,
+ "issue_link": IssueLinkLiteSerializer,
+ "sub_issues": IssueLiteSerializer,
+ }
+
+ self.fields[field] = expansion[field](
+ many=(
+ True
+ if field
+ in [
+ "members",
+ "assignees",
+ "labels",
+ "issue_cycle",
+ "issue_relation",
+ "issue_inbox",
+ "issue_reactions",
+ "issue_attachment",
+ "issue_link",
+ "sub_issues",
+ ]
+ else False
+ )
+ )
return self.fields
+
+ def to_representation(self, instance):
+ response = super().to_representation(instance)
+
+ # Ensure 'expand' is iterable before processing
+ if self.expand:
+ for expand in self.expand:
+ if expand in self.fields:
+ # Import all the expandable serializers
+ from . import (
+ WorkspaceLiteSerializer,
+ ProjectLiteSerializer,
+ UserLiteSerializer,
+ StateLiteSerializer,
+ IssueSerializer,
+ LabelSerializer,
+ CycleIssueSerializer,
+ IssueRelationSerializer,
+ InboxIssueLiteSerializer,
+ IssueLiteSerializer,
+ IssueReactionLiteSerializer,
+ IssueAttachmentLiteSerializer,
+ IssueLinkLiteSerializer,
+ )
+
+ # Expansion mapper
+ expansion = {
+ "user": UserLiteSerializer,
+ "workspace": WorkspaceLiteSerializer,
+ "project": ProjectLiteSerializer,
+ "default_assignee": UserLiteSerializer,
+ "project_lead": UserLiteSerializer,
+ "state": StateLiteSerializer,
+ "created_by": UserLiteSerializer,
+ "issue": IssueSerializer,
+ "actor": UserLiteSerializer,
+ "owned_by": UserLiteSerializer,
+ "members": UserLiteSerializer,
+ "assignees": UserLiteSerializer,
+ "labels": LabelSerializer,
+ "issue_cycle": CycleIssueSerializer,
+ "parent": IssueLiteSerializer,
+ "issue_relation": IssueRelationSerializer,
+ "issue_inbox": InboxIssueLiteSerializer,
+ "issue_reactions": IssueReactionLiteSerializer,
+ "issue_attachment": IssueAttachmentLiteSerializer,
+ "issue_link": IssueLinkLiteSerializer,
+ "sub_issues": IssueLiteSerializer,
+ }
+ # If the field is in the expansion mapper, expand it
+ if expand in expansion:
+ if isinstance(response.get(expand), list):
+ exp_serializer = expansion[expand](
+ getattr(instance, expand), many=True
+ )
+ else:
+ exp_serializer = expansion[expand](
+ getattr(instance, expand)
+ )
+ response[expand] = exp_serializer.data
+ else:
+ # You might need to handle this case differently
+ response[expand] = getattr(
+ instance, f"{expand}_id", None
+ )
+
+ return response
diff --git a/apiserver/plane/app/serializers/cycle.py b/apiserver/plane/app/serializers/cycle.py
index 104a3dd06..97fd47960 100644
--- a/apiserver/plane/app/serializers/cycle.py
+++ b/apiserver/plane/app/serializers/cycle.py
@@ -3,11 +3,12 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer
-from .user import UserLiteSerializer
from .issue import IssueStateSerializer
-from .workspace import WorkspaceLiteSerializer
-from .project import ProjectLiteSerializer
-from plane.db.models import Cycle, CycleIssue, CycleFavorite
+from plane.db.models import (
+ Cycle,
+ CycleIssue,
+ CycleUserProperties,
+)
class CycleWriteSerializer(BaseSerializer):
@@ -17,69 +18,68 @@ class CycleWriteSerializer(BaseSerializer):
and data.get("end_date", None) is not None
and data.get("start_date", None) > data.get("end_date", None)
):
- raise serializers.ValidationError("Start date cannot exceed end date")
+ raise serializers.ValidationError(
+ "Start date cannot exceed end date"
+ )
return data
class Meta:
model = Cycle
fields = "__all__"
-
-
-class CycleSerializer(BaseSerializer):
- owned_by = UserLiteSerializer(read_only=True)
- is_favorite = serializers.BooleanField(read_only=True)
- total_issues = serializers.IntegerField(read_only=True)
- cancelled_issues = serializers.IntegerField(read_only=True)
- completed_issues = serializers.IntegerField(read_only=True)
- started_issues = serializers.IntegerField(read_only=True)
- unstarted_issues = serializers.IntegerField(read_only=True)
- backlog_issues = serializers.IntegerField(read_only=True)
- assignees = serializers.SerializerMethodField(read_only=True)
- total_estimates = serializers.IntegerField(read_only=True)
- completed_estimates = serializers.IntegerField(read_only=True)
- started_estimates = serializers.IntegerField(read_only=True)
- workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
- project_detail = ProjectLiteSerializer(read_only=True, source="project")
-
- def validate(self, data):
- if (
- data.get("start_date", None) is not None
- and data.get("end_date", None) is not None
- and data.get("start_date", None) > data.get("end_date", None)
- ):
- raise serializers.ValidationError("Start date cannot exceed end date")
- return data
-
- def get_assignees(self, obj):
- members = [
- {
- "avatar": assignee.avatar,
- "display_name": assignee.display_name,
- "id": assignee.id,
- }
- for issue_cycle in obj.issue_cycle.prefetch_related(
- "issue__assignees"
- ).all()
- for assignee in issue_cycle.issue.assignees.all()
- ]
- # Use a set comprehension to return only the unique objects
- unique_objects = {frozenset(item.items()) for item in members}
-
- # Convert the set back to a list of dictionaries
- unique_list = [dict(item) for item in unique_objects]
-
- return unique_list
-
- class Meta:
- model = Cycle
- fields = "__all__"
read_only_fields = [
"workspace",
"project",
"owned_by",
+ "archived_at",
]
+class CycleSerializer(BaseSerializer):
+ # favorite
+ is_favorite = serializers.BooleanField(read_only=True)
+ total_issues = serializers.IntegerField(read_only=True)
+ # state group wise distribution
+ cancelled_issues = serializers.IntegerField(read_only=True)
+ completed_issues = serializers.IntegerField(read_only=True)
+ started_issues = serializers.IntegerField(read_only=True)
+ unstarted_issues = serializers.IntegerField(read_only=True)
+ backlog_issues = serializers.IntegerField(read_only=True)
+
+ # active | draft | upcoming | completed
+ status = serializers.CharField(read_only=True)
+
+ class Meta:
+ model = Cycle
+ fields = [
+ # necessary fields
+ "id",
+ "workspace_id",
+ "project_id",
+ # model fields
+ "name",
+ "description",
+ "start_date",
+ "end_date",
+ "owned_by_id",
+ "view_props",
+ "sort_order",
+ "external_source",
+ "external_id",
+ "progress_snapshot",
+ "logo_props",
+ # meta fields
+ "is_favorite",
+ "total_issues",
+ "cancelled_issues",
+ "completed_issues",
+ "started_issues",
+ "unstarted_issues",
+ "backlog_issues",
+ "status",
+ ]
+ read_only_fields = fields
+
+
class CycleIssueSerializer(BaseSerializer):
issue_detail = IssueStateSerializer(read_only=True, source="issue")
sub_issues_count = serializers.IntegerField(read_only=True)
@@ -93,15 +93,12 @@ class CycleIssueSerializer(BaseSerializer):
"cycle",
]
-
-class CycleFavoriteSerializer(BaseSerializer):
- cycle_detail = CycleSerializer(source="cycle", read_only=True)
-
+class CycleUserPropertiesSerializer(BaseSerializer):
class Meta:
- model = CycleFavorite
+ model = CycleUserProperties
fields = "__all__"
read_only_fields = [
"workspace",
"project",
- "user",
+            "cycle", "user",
]
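A small usage sketch of the split above: `CycleWriteSerializer` keeps the start/end date check for writes, while the new `CycleSerializer` is a fixed read-only projection whose counters (`is_favorite`, `total_issues`, and so on) are expected to be annotated by the view. Only the serializer name and error message come from this file; the instance and view wiring are assumed.

```python
from plane.app.serializers.cycle import CycleWriteSerializer


def update_cycle_dates(cycle, start_date, end_date):
    serializer = CycleWriteSerializer(
        cycle,
        data={"start_date": start_date, "end_date": end_date},
        partial=True,
    )
    # A start date later than the end date is rejected here with
    # "Start date cannot exceed end date".
    serializer.is_valid(raise_exception=True)
    return serializer.save()
```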
diff --git a/apiserver/plane/app/serializers/dashboard.py b/apiserver/plane/app/serializers/dashboard.py
new file mode 100644
index 000000000..b0ed8841b
--- /dev/null
+++ b/apiserver/plane/app/serializers/dashboard.py
@@ -0,0 +1,21 @@
+# Module imports
+from .base import BaseSerializer
+from plane.db.models import Dashboard, Widget
+
+# Third party frameworks
+from rest_framework import serializers
+
+
+class DashboardSerializer(BaseSerializer):
+ class Meta:
+ model = Dashboard
+ fields = "__all__"
+
+
+class WidgetSerializer(BaseSerializer):
+ is_visible = serializers.BooleanField(read_only=True)
+ widget_filters = serializers.JSONField(read_only=True)
+
+ class Meta:
+ model = Widget
+ fields = ["id", "key", "is_visible", "widget_filters"]
diff --git a/apiserver/plane/app/serializers/estimate.py b/apiserver/plane/app/serializers/estimate.py
index 4a1cda779..d28f38c75 100644
--- a/apiserver/plane/app/serializers/estimate.py
+++ b/apiserver/plane/app/serializers/estimate.py
@@ -2,11 +2,18 @@
from .base import BaseSerializer
from plane.db.models import Estimate, EstimatePoint
-from plane.app.serializers import WorkspaceLiteSerializer, ProjectLiteSerializer
+from plane.app.serializers import (
+ WorkspaceLiteSerializer,
+ ProjectLiteSerializer,
+)
+
+from rest_framework import serializers
class EstimateSerializer(BaseSerializer):
- workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
+ workspace_detail = WorkspaceLiteSerializer(
+ read_only=True, source="workspace"
+ )
project_detail = ProjectLiteSerializer(read_only=True, source="project")
class Meta:
@@ -19,6 +26,16 @@ class EstimateSerializer(BaseSerializer):
class EstimatePointSerializer(BaseSerializer):
+ def validate(self, data):
+ if not data:
+ raise serializers.ValidationError("Estimate points are required")
+ value = data.get("value")
+ if value and len(value) > 20:
+ raise serializers.ValidationError(
+ "Value can't be more than 20 characters"
+ )
+ return data
+
class Meta:
model = EstimatePoint
fields = "__all__"
@@ -31,7 +48,9 @@ class EstimatePointSerializer(BaseSerializer):
class EstimateReadSerializer(BaseSerializer):
points = EstimatePointSerializer(read_only=True, many=True)
- workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
+ workspace_detail = WorkspaceLiteSerializer(
+ read_only=True, source="workspace"
+ )
project_detail = ProjectLiteSerializer(read_only=True, source="project")
class Meta:
@@ -42,3 +61,16 @@ class EstimateReadSerializer(BaseSerializer):
"name",
"description",
]
+
+
+class WorkspaceEstimateSerializer(BaseSerializer):
+ points = EstimatePointSerializer(read_only=True, many=True)
+
+ class Meta:
+ model = Estimate
+ fields = "__all__"
+ read_only_fields = [
+ "points",
+ "name",
+ "description",
+ ]
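The new `EstimatePointSerializer.validate` above rejects empty payloads and values longer than 20 characters. A brief sketch, using partial validation so unrelated required fields do not get in the way:

```python
from plane.app.serializers.estimate import EstimatePointSerializer

too_long = EstimatePointSerializer(data={"value": "x" * 25}, partial=True)
too_long.is_valid()   # False: "Value can't be more than 20 characters"

empty = EstimatePointSerializer(data={}, partial=True)
empty.is_valid()      # False: "Estimate points are required"
```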
diff --git a/apiserver/plane/app/serializers/exporter.py b/apiserver/plane/app/serializers/exporter.py
index 5c78cfa69..2dd850fd3 100644
--- a/apiserver/plane/app/serializers/exporter.py
+++ b/apiserver/plane/app/serializers/exporter.py
@@ -5,7 +5,9 @@ from .user import UserLiteSerializer
class ExporterHistorySerializer(BaseSerializer):
- initiated_by_detail = UserLiteSerializer(source="initiated_by", read_only=True)
+ initiated_by_detail = UserLiteSerializer(
+ source="initiated_by", read_only=True
+ )
class Meta:
model = ExporterHistory
diff --git a/apiserver/plane/app/serializers/importer.py b/apiserver/plane/app/serializers/importer.py
index 8997f6392..c058994d6 100644
--- a/apiserver/plane/app/serializers/importer.py
+++ b/apiserver/plane/app/serializers/importer.py
@@ -7,9 +7,13 @@ from plane.db.models import Importer
class ImporterSerializer(BaseSerializer):
- initiated_by_detail = UserLiteSerializer(source="initiated_by", read_only=True)
+ initiated_by_detail = UserLiteSerializer(
+ source="initiated_by", read_only=True
+ )
project_detail = ProjectLiteSerializer(source="project", read_only=True)
- workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
+ workspace_detail = WorkspaceLiteSerializer(
+ source="workspace", read_only=True
+ )
class Meta:
model = Importer
diff --git a/apiserver/plane/app/serializers/inbox.py b/apiserver/plane/app/serializers/inbox.py
index f52a90660..e0c18b3d1 100644
--- a/apiserver/plane/app/serializers/inbox.py
+++ b/apiserver/plane/app/serializers/inbox.py
@@ -3,7 +3,11 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer
-from .issue import IssueFlatSerializer, LabelLiteSerializer
+from .issue import (
+ IssueInboxSerializer,
+ LabelLiteSerializer,
+ IssueDetailSerializer,
+)
from .project import ProjectLiteSerializer
from .state import StateLiteSerializer
from .user import UserLiteSerializer
@@ -24,17 +28,62 @@ class InboxSerializer(BaseSerializer):
class InboxIssueSerializer(BaseSerializer):
- issue_detail = IssueFlatSerializer(source="issue", read_only=True)
- project_detail = ProjectLiteSerializer(source="project", read_only=True)
+ issue = IssueInboxSerializer(read_only=True)
class Meta:
model = InboxIssue
- fields = "__all__"
+ fields = [
+ "id",
+ "status",
+ "duplicate_to",
+ "snoozed_till",
+ "source",
+ "issue",
+ "created_by",
+ ]
read_only_fields = [
"project",
"workspace",
]
+ def to_representation(self, instance):
+ # Pass the annotated fields to the Issue instance if they exist
+ if hasattr(instance, "label_ids"):
+ instance.issue.label_ids = instance.label_ids
+ return super().to_representation(instance)
+
+
+class InboxIssueDetailSerializer(BaseSerializer):
+ issue = IssueDetailSerializer(read_only=True)
+ duplicate_issue_detail = IssueInboxSerializer(
+ read_only=True, source="duplicate_to"
+ )
+
+ class Meta:
+ model = InboxIssue
+ fields = [
+ "id",
+ "status",
+ "duplicate_to",
+ "snoozed_till",
+ "duplicate_issue_detail",
+ "source",
+ "issue",
+ ]
+ read_only_fields = [
+ "project",
+ "workspace",
+ ]
+
+ def to_representation(self, instance):
+ # Pass the annotated fields to the Issue instance if they exist
+ if hasattr(instance, "assignee_ids"):
+ instance.issue.assignee_ids = instance.assignee_ids
+ if hasattr(instance, "label_ids"):
+ instance.issue.label_ids = instance.label_ids
+
+ return super().to_representation(instance)
+
class InboxIssueLiteSerializer(BaseSerializer):
class Meta:
@@ -46,10 +95,13 @@ class InboxIssueLiteSerializer(BaseSerializer):
class IssueStateInboxSerializer(BaseSerializer):
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
- label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
- assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
+ label_details = LabelLiteSerializer(
+ read_only=True, source="labels", many=True
+ )
+ assignee_details = UserLiteSerializer(
+ read_only=True, source="assignees", many=True
+ )
sub_issues_count = serializers.IntegerField(read_only=True)
- bridge_id = serializers.UUIDField(read_only=True)
issue_inbox = InboxIssueLiteSerializer(read_only=True, many=True)
class Meta:
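The `to_representation` overrides in `InboxIssueSerializer` and `InboxIssueDetailSerializer` above forward queryset annotations (`label_ids`, `assignee_ids`) from the `InboxIssue` row onto the nested `Issue` instance so the nested serializers can emit them. A hedged sketch of the kind of annotation the view is expected to supply; the `ArrayAgg` expression is an assumption, not part of this diff.

```python
from django.contrib.postgres.aggregates import ArrayAgg
from django.db.models import Q

from plane.app.serializers import InboxIssueSerializer
from plane.db.models import InboxIssue


def serialize_inbox(inbox_id):
    queryset = InboxIssue.objects.filter(inbox_id=inbox_id).annotate(
        # Forwarded to instance.issue.label_ids by to_representation above.
        label_ids=ArrayAgg(
            "issue__labels__id",
            distinct=True,
            filter=Q(issue__labels__id__isnull=False),
        )
    )
    return InboxIssueSerializer(queryset, many=True).data
```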
diff --git a/apiserver/plane/app/serializers/integration/__init__.py b/apiserver/plane/app/serializers/integration/__init__.py
deleted file mode 100644
index 112ff02d1..000000000
--- a/apiserver/plane/app/serializers/integration/__init__.py
+++ /dev/null
@@ -1,8 +0,0 @@
-from .base import IntegrationSerializer, WorkspaceIntegrationSerializer
-from .github import (
- GithubRepositorySerializer,
- GithubRepositorySyncSerializer,
- GithubIssueSyncSerializer,
- GithubCommentSyncSerializer,
-)
-from .slack import SlackProjectSyncSerializer
diff --git a/apiserver/plane/app/serializers/integration/base.py b/apiserver/plane/app/serializers/integration/base.py
deleted file mode 100644
index 6f6543b9e..000000000
--- a/apiserver/plane/app/serializers/integration/base.py
+++ /dev/null
@@ -1,20 +0,0 @@
-# Module imports
-from plane.app.serializers import BaseSerializer
-from plane.db.models import Integration, WorkspaceIntegration
-
-
-class IntegrationSerializer(BaseSerializer):
- class Meta:
- model = Integration
- fields = "__all__"
- read_only_fields = [
- "verified",
- ]
-
-
-class WorkspaceIntegrationSerializer(BaseSerializer):
- integration_detail = IntegrationSerializer(read_only=True, source="integration")
-
- class Meta:
- model = WorkspaceIntegration
- fields = "__all__"
diff --git a/apiserver/plane/app/serializers/integration/github.py b/apiserver/plane/app/serializers/integration/github.py
deleted file mode 100644
index 850bccf1b..000000000
--- a/apiserver/plane/app/serializers/integration/github.py
+++ /dev/null
@@ -1,45 +0,0 @@
-# Module imports
-from plane.app.serializers import BaseSerializer
-from plane.db.models import (
- GithubIssueSync,
- GithubRepository,
- GithubRepositorySync,
- GithubCommentSync,
-)
-
-
-class GithubRepositorySerializer(BaseSerializer):
- class Meta:
- model = GithubRepository
- fields = "__all__"
-
-
-class GithubRepositorySyncSerializer(BaseSerializer):
- repo_detail = GithubRepositorySerializer(source="repository")
-
- class Meta:
- model = GithubRepositorySync
- fields = "__all__"
-
-
-class GithubIssueSyncSerializer(BaseSerializer):
- class Meta:
- model = GithubIssueSync
- fields = "__all__"
- read_only_fields = [
- "project",
- "workspace",
- "repository_sync",
- ]
-
-
-class GithubCommentSyncSerializer(BaseSerializer):
- class Meta:
- model = GithubCommentSync
- fields = "__all__"
- read_only_fields = [
- "project",
- "workspace",
- "repository_sync",
- "issue_sync",
- ]
diff --git a/apiserver/plane/app/serializers/integration/slack.py b/apiserver/plane/app/serializers/integration/slack.py
deleted file mode 100644
index 9c461c5b9..000000000
--- a/apiserver/plane/app/serializers/integration/slack.py
+++ /dev/null
@@ -1,14 +0,0 @@
-# Module imports
-from plane.app.serializers import BaseSerializer
-from plane.db.models import SlackProjectSync
-
-
-class SlackProjectSyncSerializer(BaseSerializer):
- class Meta:
- model = SlackProjectSync
- fields = "__all__"
- read_only_fields = [
- "project",
- "workspace",
- "workspace_integration",
- ]
diff --git a/apiserver/plane/app/serializers/issue.py b/apiserver/plane/app/serializers/issue.py
index b13d03e35..e4a04fadf 100644
--- a/apiserver/plane/app/serializers/issue.py
+++ b/apiserver/plane/app/serializers/issue.py
@@ -1,5 +1,7 @@
# Django imports
from django.utils import timezone
+from django.core.validators import URLValidator
+from django.core.exceptions import ValidationError
# Third Party imports
from rest_framework import serializers
@@ -7,7 +9,7 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer, DynamicBaseSerializer
from .user import UserLiteSerializer
-from .state import StateSerializer, StateLiteSerializer
+from .state import StateLiteSerializer
from .project import ProjectLiteSerializer
from .workspace import WorkspaceLiteSerializer
from plane.db.models import (
@@ -30,6 +32,7 @@ from plane.db.models import (
CommentReaction,
IssueVote,
IssueRelation,
+ State,
)
@@ -69,19 +72,26 @@ class IssueProjectLiteSerializer(BaseSerializer):
##TODO: Find a better way to write this serializer
## Find a better approach to save manytomany?
class IssueCreateSerializer(BaseSerializer):
- state_detail = StateSerializer(read_only=True, source="state")
- created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
- project_detail = ProjectLiteSerializer(read_only=True, source="project")
- workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
-
- assignees = serializers.ListField(
- child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
+ # ids
+ state_id = serializers.PrimaryKeyRelatedField(
+ source="state",
+ queryset=State.objects.all(),
+ required=False,
+ allow_null=True,
+ )
+ parent_id = serializers.PrimaryKeyRelatedField(
+ source="parent",
+ queryset=Issue.objects.all(),
+ required=False,
+ allow_null=True,
+ )
+ label_ids = serializers.ListField(
+ child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
write_only=True,
required=False,
)
-
- labels = serializers.ListField(
- child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
+ assignee_ids = serializers.ListField(
+ child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
@@ -100,8 +110,10 @@ class IssueCreateSerializer(BaseSerializer):
def to_representation(self, instance):
data = super().to_representation(instance)
- data['assignees'] = [str(assignee.id) for assignee in instance.assignees.all()]
- data['labels'] = [str(label.id) for label in instance.labels.all()]
+ assignee_ids = self.initial_data.get("assignee_ids")
+ data["assignee_ids"] = assignee_ids if assignee_ids else []
+ label_ids = self.initial_data.get("label_ids")
+ data["label_ids"] = label_ids if label_ids else []
return data
def validate(self, data):
@@ -110,12 +122,14 @@ class IssueCreateSerializer(BaseSerializer):
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
- raise serializers.ValidationError("Start date cannot exceed target date")
+ raise serializers.ValidationError(
+ "Start date cannot exceed target date"
+ )
return data
def create(self, validated_data):
- assignees = validated_data.pop("assignees", None)
- labels = validated_data.pop("labels", None)
+ assignees = validated_data.pop("assignee_ids", None)
+ labels = validated_data.pop("label_ids", None)
project_id = self.context["project_id"]
workspace_id = self.context["workspace_id"]
@@ -173,8 +187,8 @@ class IssueCreateSerializer(BaseSerializer):
return issue
def update(self, instance, validated_data):
- assignees = validated_data.pop("assignees", None)
- labels = validated_data.pop("labels", None)
+ assignees = validated_data.pop("assignee_ids", None)
+ labels = validated_data.pop("label_ids", None)
# Related models
project_id = instance.project_id
@@ -225,14 +239,15 @@ class IssueActivitySerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
- workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
+ workspace_detail = WorkspaceLiteSerializer(
+ read_only=True, source="workspace"
+ )
class Meta:
model = IssueActivity
fields = "__all__"
-
class IssuePropertySerializer(BaseSerializer):
class Meta:
model = IssueProperty
@@ -245,12 +260,17 @@ class IssuePropertySerializer(BaseSerializer):
class LabelSerializer(BaseSerializer):
- workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
- project_detail = ProjectLiteSerializer(source="project", read_only=True)
-
class Meta:
model = Label
- fields = "__all__"
+ fields = [
+ "parent",
+ "name",
+ "color",
+ "id",
+ "project_id",
+ "workspace_id",
+ "sort_order",
+ ]
read_only_fields = [
"workspace",
"project",
@@ -268,7 +288,6 @@ class LabelLiteSerializer(BaseSerializer):
class IssueLabelSerializer(BaseSerializer):
-
class Meta:
model = IssueLabel
fields = "__all__"
@@ -279,33 +298,50 @@ class IssueLabelSerializer(BaseSerializer):
class IssueRelationSerializer(BaseSerializer):
- issue_detail = IssueProjectLiteSerializer(read_only=True, source="related_issue")
+ id = serializers.UUIDField(source="related_issue.id", read_only=True)
+ project_id = serializers.PrimaryKeyRelatedField(
+ source="related_issue.project_id", read_only=True
+ )
+ sequence_id = serializers.IntegerField(
+ source="related_issue.sequence_id", read_only=True
+ )
+ name = serializers.CharField(source="related_issue.name", read_only=True)
+ relation_type = serializers.CharField(read_only=True)
class Meta:
model = IssueRelation
fields = [
- "issue_detail",
+ "id",
+ "project_id",
+ "sequence_id",
"relation_type",
- "related_issue",
- "issue",
- "id"
+ "name",
]
read_only_fields = [
"workspace",
"project",
]
+
class RelatedIssueSerializer(BaseSerializer):
- issue_detail = IssueProjectLiteSerializer(read_only=True, source="issue")
+ id = serializers.UUIDField(source="issue.id", read_only=True)
+ project_id = serializers.PrimaryKeyRelatedField(
+ source="issue.project_id", read_only=True
+ )
+ sequence_id = serializers.IntegerField(
+ source="issue.sequence_id", read_only=True
+ )
+ name = serializers.CharField(source="issue.name", read_only=True)
+ relation_type = serializers.CharField(read_only=True)
class Meta:
model = IssueRelation
fields = [
- "issue_detail",
+ "id",
+ "project_id",
+ "sequence_id",
"relation_type",
- "related_issue",
- "issue",
- "id"
+ "name",
]
read_only_fields = [
"workspace",
@@ -397,16 +433,57 @@ class IssueLinkSerializer(BaseSerializer):
"issue",
]
+ def validate_url(self, value):
+ # Check URL format
+ validate_url = URLValidator()
+ try:
+ validate_url(value)
+ except ValidationError:
+ raise serializers.ValidationError("Invalid URL format.")
+
+ # Check URL scheme
+ if not value.startswith(("http://", "https://")):
+ raise serializers.ValidationError("Invalid URL scheme.")
+
+ return value
+
# Validation if url already exists
def create(self, validated_data):
if IssueLink.objects.filter(
- url=validated_data.get("url"), issue_id=validated_data.get("issue_id")
+ url=validated_data.get("url"),
+ issue_id=validated_data.get("issue_id"),
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
)
return IssueLink.objects.create(**validated_data)
+ def update(self, instance, validated_data):
+ if IssueLink.objects.filter(
+ url=validated_data.get("url"),
+ issue_id=instance.issue_id,
+ ).exclude(pk=instance.id).exists():
+ raise serializers.ValidationError(
+ {"error": "URL already exists for this Issue"}
+ )
+
+ return super().update(instance, validated_data)
+
+
+class IssueLinkLiteSerializer(BaseSerializer):
+ class Meta:
+ model = IssueLink
+ fields = [
+ "id",
+ "issue_id",
+ "title",
+ "url",
+ "metadata",
+ "created_by_id",
+ "created_at",
+ ]
+ read_only_fields = fields
+
class IssueAttachmentSerializer(BaseSerializer):
class Meta:
@@ -423,10 +500,23 @@ class IssueAttachmentSerializer(BaseSerializer):
]
+class IssueAttachmentLiteSerializer(DynamicBaseSerializer):
+ class Meta:
+ model = IssueAttachment
+ fields = [
+ "id",
+ "asset",
+ "attributes",
+ "issue_id",
+ "updated_at",
+ "updated_by_id",
+ ]
+ read_only_fields = fields
+
+
class IssueReactionSerializer(BaseSerializer):
-
actor_detail = UserLiteSerializer(read_only=True, source="actor")
-
+
class Meta:
model = IssueReaction
fields = "__all__"
@@ -438,16 +528,14 @@ class IssueReactionSerializer(BaseSerializer):
]
-class CommentReactionLiteSerializer(BaseSerializer):
- actor_detail = UserLiteSerializer(read_only=True, source="actor")
-
+class IssueReactionLiteSerializer(DynamicBaseSerializer):
class Meta:
- model = CommentReaction
+ model = IssueReaction
fields = [
"id",
+ "actor",
+ "issue",
"reaction",
- "comment",
- "actor_detail",
]
@@ -459,12 +547,18 @@ class CommentReactionSerializer(BaseSerializer):
class IssueVoteSerializer(BaseSerializer):
-
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = IssueVote
- fields = ["issue", "vote", "workspace", "project", "actor", "actor_detail"]
+ fields = [
+ "issue",
+ "vote",
+ "workspace",
+ "project",
+ "actor",
+ "actor_detail",
+ ]
read_only_fields = fields
@@ -472,8 +566,10 @@ class IssueCommentSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
- workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
- comment_reactions = CommentReactionLiteSerializer(read_only=True, many=True)
+ workspace_detail = WorkspaceLiteSerializer(
+ read_only=True, source="workspace"
+ )
+ comment_reactions = CommentReactionSerializer(read_only=True, many=True)
is_member = serializers.BooleanField(read_only=True)
class Meta:
@@ -507,12 +603,15 @@ class IssueStateFlatSerializer(BaseSerializer):
# Issue Serializer with state details
class IssueStateSerializer(DynamicBaseSerializer):
- label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
+ label_details = LabelLiteSerializer(
+ read_only=True, source="labels", many=True
+ )
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
- assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
+ assignee_details = UserLiteSerializer(
+ read_only=True, source="assignees", many=True
+ )
sub_issues_count = serializers.IntegerField(read_only=True)
- bridge_id = serializers.UUIDField(read_only=True)
attachment_count = serializers.IntegerField(read_only=True)
link_count = serializers.IntegerField(read_only=True)
@@ -521,67 +620,111 @@ class IssueStateSerializer(DynamicBaseSerializer):
fields = "__all__"
-class IssueSerializer(BaseSerializer):
- project_detail = ProjectLiteSerializer(read_only=True, source="project")
- state_detail = StateSerializer(read_only=True, source="state")
- parent_detail = IssueStateFlatSerializer(read_only=True, source="parent")
- label_details = LabelSerializer(read_only=True, source="labels", many=True)
- assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
- related_issues = IssueRelationSerializer(read_only=True, source="issue_relation", many=True)
- issue_relations = RelatedIssueSerializer(read_only=True, source="issue_related", many=True)
- issue_cycle = IssueCycleDetailSerializer(read_only=True)
- issue_module = IssueModuleDetailSerializer(read_only=True)
- issue_link = IssueLinkSerializer(read_only=True, many=True)
- issue_attachment = IssueAttachmentSerializer(read_only=True, many=True)
- sub_issues_count = serializers.IntegerField(read_only=True)
- issue_reactions = IssueReactionSerializer(read_only=True, many=True)
+class IssueInboxSerializer(DynamicBaseSerializer):
+ label_ids = serializers.ListField(
+ child=serializers.UUIDField(),
+ required=False,
+ )
class Meta:
model = Issue
- fields = "__all__"
- read_only_fields = [
- "workspace",
- "project",
+ fields = [
+ "id",
+ "name",
+ "priority",
+ "sequence_id",
+ "project_id",
+ "created_at",
+ "label_ids",
"created_by",
- "updated_by",
+ ]
+ read_only_fields = fields
+
+
+class IssueSerializer(DynamicBaseSerializer):
+ # ids
+ cycle_id = serializers.PrimaryKeyRelatedField(read_only=True)
+ module_ids = serializers.ListField(
+ child=serializers.UUIDField(),
+ required=False,
+ )
+
+ # Many to many
+ label_ids = serializers.ListField(
+ child=serializers.UUIDField(),
+ required=False,
+ )
+ assignee_ids = serializers.ListField(
+ child=serializers.UUIDField(),
+ required=False,
+ )
+
+ # Count items
+ sub_issues_count = serializers.IntegerField(read_only=True)
+ attachment_count = serializers.IntegerField(read_only=True)
+ link_count = serializers.IntegerField(read_only=True)
+
+ class Meta:
+ model = Issue
+ fields = [
+ "id",
+ "name",
+ "state_id",
+ "sort_order",
+ "completed_at",
+ "estimate_point",
+ "priority",
+ "start_date",
+ "target_date",
+ "sequence_id",
+ "project_id",
+ "parent_id",
+ "cycle_id",
+ "module_ids",
+ "label_ids",
+ "assignee_ids",
+ "sub_issues_count",
"created_at",
"updated_at",
+ "created_by",
+ "updated_by",
+ "attachment_count",
+ "link_count",
+ "is_draft",
+ "archived_at",
]
+ read_only_fields = fields
class IssueLiteSerializer(DynamicBaseSerializer):
- workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
- project_detail = ProjectLiteSerializer(read_only=True, source="project")
- state_detail = StateLiteSerializer(read_only=True, source="state")
- label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
- assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
- sub_issues_count = serializers.IntegerField(read_only=True)
- cycle_id = serializers.UUIDField(read_only=True)
- module_id = serializers.UUIDField(read_only=True)
- attachment_count = serializers.IntegerField(read_only=True)
- link_count = serializers.IntegerField(read_only=True)
- issue_reactions = IssueReactionSerializer(read_only=True, many=True)
-
class Meta:
model = Issue
- fields = "__all__"
- read_only_fields = [
- "start_date",
- "target_date",
- "completed_at",
- "workspace",
- "project",
- "created_by",
- "updated_by",
- "created_at",
- "updated_at",
+ fields = [
+ "id",
+ "sequence_id",
+ "project_id",
]
+ read_only_fields = fields
+
+
+class IssueDetailSerializer(IssueSerializer):
+ description_html = serializers.CharField()
+ is_subscribed = serializers.BooleanField(read_only=True)
+
+ class Meta(IssueSerializer.Meta):
+ fields = IssueSerializer.Meta.fields + [
+ "description_html",
+ "is_subscribed",
+ ]
+ read_only_fields = fields
class IssuePublicSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateLiteSerializer(read_only=True, source="state")
- reactions = IssueReactionSerializer(read_only=True, many=True, source="issue_reactions")
+ reactions = IssueReactionSerializer(
+ read_only=True, many=True, source="issue_reactions"
+ )
votes = IssueVoteSerializer(read_only=True, many=True)
class Meta:
@@ -604,7 +747,6 @@ class IssuePublicSerializer(BaseSerializer):
read_only_fields = fields
-
class IssueSubscriberSerializer(BaseSerializer):
class Meta:
model = IssueSubscriber
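To illustrate the write-side change above (assignees and labels are now submitted as `assignee_ids`/`label_ids`, with `state_id`/`parent_id` mapping onto their relations), a hedged sketch of issue creation. The context keys come from `IssueCreateSerializer.create`; the surrounding view and model instances are assumed.

```python
from plane.app.serializers import IssueCreateSerializer


def create_issue(project, state, assignees, labels):
    serializer = IssueCreateSerializer(
        data={
            "name": "Fix login redirect",
            "state_id": state.id,
            "assignee_ids": [user.id for user in assignees],
            "label_ids": [label.id for label in labels],
        },
        context={
            "project_id": project.id,
            "workspace_id": project.workspace_id,
        },
    )
    serializer.is_valid(raise_exception=True)
    issue = serializer.save()
    # to_representation echoes the submitted id lists back as
    # assignee_ids / label_ids rather than nested user/label objects.
    return issue
```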
diff --git a/apiserver/plane/app/serializers/module.py b/apiserver/plane/app/serializers/module.py
index 48f773b0f..28d28d7db 100644
--- a/apiserver/plane/app/serializers/module.py
+++ b/apiserver/plane/app/serializers/module.py
@@ -2,10 +2,8 @@
from rest_framework import serializers
# Module imports
-from .base import BaseSerializer
-from .user import UserLiteSerializer
+from .base import BaseSerializer, DynamicBaseSerializer
from .project import ProjectLiteSerializer
-from .workspace import WorkspaceLiteSerializer
from plane.db.models import (
User,
@@ -13,20 +11,23 @@ from plane.db.models import (
ModuleMember,
ModuleIssue,
ModuleLink,
- ModuleFavorite,
+ ModuleUserProperties,
)
class ModuleWriteSerializer(BaseSerializer):
- members = serializers.ListField(
+ lead_id = serializers.PrimaryKeyRelatedField(
+ source="lead",
+ queryset=User.objects.all(),
+ required=False,
+ allow_null=True,
+ )
+ member_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
- project_detail = ProjectLiteSerializer(source="project", read_only=True)
- workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
-
class Meta:
model = Module
fields = "__all__"
@@ -37,25 +38,32 @@ class ModuleWriteSerializer(BaseSerializer):
"updated_by",
"created_at",
"updated_at",
+ "archived_at",
]
-
+
def to_representation(self, instance):
data = super().to_representation(instance)
- data['members'] = [str(member.id) for member in instance.members.all()]
+ data["member_ids"] = [
+ str(member.id) for member in instance.members.all()
+ ]
return data
def validate(self, data):
- if data.get("start_date", None) is not None and data.get("target_date", None) is not None and data.get("start_date", None) > data.get("target_date", None):
- raise serializers.ValidationError("Start date cannot exceed target date")
- return data
+ if (
+ data.get("start_date", None) is not None
+ and data.get("target_date", None) is not None
+ and data.get("start_date", None) > data.get("target_date", None)
+ ):
+ raise serializers.ValidationError(
+ "Start date cannot exceed target date"
+ )
+ return data
def create(self, validated_data):
- members = validated_data.pop("members", None)
-
+ members = validated_data.pop("member_ids", None)
project = self.context["project"]
module = Module.objects.create(**validated_data, project=project)
-
if members is not None:
ModuleMember.objects.bulk_create(
[
@@ -76,7 +84,7 @@ class ModuleWriteSerializer(BaseSerializer):
return module
def update(self, instance, validated_data):
- members = validated_data.pop("members", None)
+ members = validated_data.pop("member_ids", None)
if members is not None:
ModuleMember.objects.filter(module=instance).delete()
@@ -133,8 +141,6 @@ class ModuleIssueSerializer(BaseSerializer):
class ModuleLinkSerializer(BaseSerializer):
- created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
-
class Meta:
model = ModuleLink
fields = "__all__"
@@ -151,7 +157,8 @@ class ModuleLinkSerializer(BaseSerializer):
# Validation if url already exists
def create(self, validated_data):
if ModuleLink.objects.filter(
- url=validated_data.get("url"), module_id=validated_data.get("module_id")
+ url=validated_data.get("url"),
+ module_id=validated_data.get("module_id"),
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
@@ -159,11 +166,10 @@ class ModuleLinkSerializer(BaseSerializer):
return ModuleLink.objects.create(**validated_data)
-class ModuleSerializer(BaseSerializer):
- project_detail = ProjectLiteSerializer(read_only=True, source="project")
- lead_detail = UserLiteSerializer(read_only=True, source="lead")
- members_detail = UserLiteSerializer(read_only=True, many=True, source="members")
- link_module = ModuleLinkSerializer(read_only=True, many=True)
+class ModuleSerializer(DynamicBaseSerializer):
+ member_ids = serializers.ListField(
+ child=serializers.UUIDField(), required=False, allow_null=True
+ )
is_favorite = serializers.BooleanField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True)
@@ -174,25 +180,51 @@ class ModuleSerializer(BaseSerializer):
class Meta:
model = Module
- fields = "__all__"
- read_only_fields = [
- "workspace",
- "project",
- "created_by",
- "updated_by",
+ fields = [
+ # Required fields
+ "id",
+ "workspace_id",
+ "project_id",
+ # Model fields
+ "name",
+ "description",
+ "description_text",
+ "description_html",
+ "start_date",
+ "target_date",
+ "status",
+ "lead_id",
+ "member_ids",
+ "view_props",
+ "sort_order",
+ "external_source",
+ "external_id",
+ "logo_props",
+ # computed fields
+ "is_favorite",
+ "total_issues",
+ "cancelled_issues",
+ "completed_issues",
+ "started_issues",
+ "unstarted_issues",
+ "backlog_issues",
"created_at",
"updated_at",
+ "archived_at",
]
+ read_only_fields = fields
-class ModuleFavoriteSerializer(BaseSerializer):
- module_detail = ModuleFlatSerializer(source="module", read_only=True)
+class ModuleDetailSerializer(ModuleSerializer):
+ link_module = ModuleLinkSerializer(read_only=True, many=True)
+ sub_issues = serializers.IntegerField(read_only=True)
+ class Meta(ModuleSerializer.Meta):
+ fields = ModuleSerializer.Meta.fields + ["link_module", "sub_issues"]
+
+
+class ModuleUserPropertiesSerializer(BaseSerializer):
class Meta:
- model = ModuleFavorite
+ model = ModuleUserProperties
fields = "__all__"
- read_only_fields = [
- "workspace",
- "project",
- "user",
- ]
+ read_only_fields = ["workspace", "project", "module", "user"]
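Module membership follows the same pattern: `lead_id` and `member_ids` replace the old nested and list fields, and `create()` still expects the project in the serializer context. A hedged sketch, with the view wiring assumed:

```python
from plane.app.serializers.module import ModuleWriteSerializer


def create_module(project, lead, members):
    serializer = ModuleWriteSerializer(
        data={
            "name": "Q2 hardening",
            "lead_id": lead.id,
            "member_ids": [member.id for member in members],
        },
        context={"project": project},
    )
    serializer.is_valid(raise_exception=True)
    # create() bulk-creates the ModuleMember rows for member_ids;
    # to_representation reports them back as member_ids.
    return serializer.save()
```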
diff --git a/apiserver/plane/app/serializers/notification.py b/apiserver/plane/app/serializers/notification.py
index b6a4f3e4a..c6713a354 100644
--- a/apiserver/plane/app/serializers/notification.py
+++ b/apiserver/plane/app/serializers/notification.py
@@ -1,12 +1,20 @@
# Module imports
from .base import BaseSerializer
from .user import UserLiteSerializer
-from plane.db.models import Notification
+from plane.db.models import Notification, UserNotificationPreference
+
class NotificationSerializer(BaseSerializer):
- triggered_by_details = UserLiteSerializer(read_only=True, source="triggered_by")
+ triggered_by_details = UserLiteSerializer(
+ read_only=True, source="triggered_by"
+ )
class Meta:
model = Notification
fields = "__all__"
+
+class UserNotificationPreferenceSerializer(BaseSerializer):
+ class Meta:
+ model = UserNotificationPreference
+ fields = "__all__"
diff --git a/apiserver/plane/app/serializers/page.py b/apiserver/plane/app/serializers/page.py
index ff152627a..f13923831 100644
--- a/apiserver/plane/app/serializers/page.py
+++ b/apiserver/plane/app/serializers/page.py
@@ -3,42 +3,65 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer
-from .issue import IssueFlatSerializer, LabelLiteSerializer
-from .workspace import WorkspaceLiteSerializer
-from .project import ProjectLiteSerializer
-from plane.db.models import Page, PageLog, PageFavorite, PageLabel, Label, Issue, Module
+from plane.db.models import (
+ Page,
+ PageLog,
+ PageLabel,
+ Label,
+)
class PageSerializer(BaseSerializer):
is_favorite = serializers.BooleanField(read_only=True)
- label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
labels = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
write_only=True,
required=False,
)
- project_detail = ProjectLiteSerializer(source="project", read_only=True)
- workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
class Meta:
model = Page
- fields = "__all__"
+ fields = [
+ "id",
+ "name",
+ "owned_by",
+ "access",
+ "color",
+ "labels",
+ "parent",
+ "is_favorite",
+ "is_locked",
+ "archived_at",
+ "workspace",
+ "project",
+ "created_at",
+ "updated_at",
+ "created_by",
+ "updated_by",
+ "view_props",
+ "logo_props",
+ ]
read_only_fields = [
"workspace",
"project",
"owned_by",
]
+
def to_representation(self, instance):
data = super().to_representation(instance)
- data['labels'] = [str(label.id) for label in instance.labels.all()]
+ data["labels"] = [str(label.id) for label in instance.labels.all()]
return data
def create(self, validated_data):
labels = validated_data.pop("labels", None)
project_id = self.context["project_id"]
owned_by_id = self.context["owned_by_id"]
+ description_html = self.context["description_html"]
page = Page.objects.create(
- **validated_data, project_id=project_id, owned_by_id=owned_by_id
+ **validated_data,
+ description_html=description_html,
+ project_id=project_id,
+ owned_by_id=owned_by_id,
)
if labels is not None:
@@ -80,6 +103,15 @@ class PageSerializer(BaseSerializer):
return super().update(instance, validated_data)
+class PageDetailSerializer(PageSerializer):
+ description_html = serializers.CharField()
+
+ class Meta(PageSerializer.Meta):
+ fields = PageSerializer.Meta.fields + [
+ "description_html",
+ ]
+
+
class SubPageSerializer(BaseSerializer):
entity_details = serializers.SerializerMethodField()
@@ -94,7 +126,7 @@ class SubPageSerializer(BaseSerializer):
def get_entity_details(self, obj):
entity_name = obj.entity_name
- if entity_name == 'forward_link' or entity_name == 'back_link':
+ if entity_name == "forward_link" or entity_name == "back_link":
try:
page = Page.objects.get(pk=obj.entity_identifier)
return PageSerializer(page).data
@@ -104,7 +136,6 @@ class SubPageSerializer(BaseSerializer):
class PageLogSerializer(BaseSerializer):
-
class Meta:
model = PageLog
fields = "__all__"
@@ -112,17 +143,4 @@ class PageLogSerializer(BaseSerializer):
"workspace",
"project",
"page",
- ]
-
-
-class PageFavoriteSerializer(BaseSerializer):
- page_detail = PageSerializer(source="page", read_only=True)
-
- class Meta:
- model = PageFavorite
- fields = "__all__"
- read_only_fields = [
- "workspace",
- "project",
- "user",
- ]
+ ]
\ No newline at end of file
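`PageSerializer` no longer accepts `description_html` on create; the view is expected to pass it through the serializer context alongside `project_id` and `owned_by_id`, and the new `PageDetailSerializer` exposes it on reads. A hedged sketch of that flow:

```python
from plane.app.serializers.page import PageSerializer


def create_page(project, user, name, description_html):
    serializer = PageSerializer(
        data={"name": name},
        context={
            "project_id": project.id,
            "owned_by_id": user.id,
            # Injected into Page.objects.create() by PageSerializer.create.
            "description_html": description_html,
        },
    )
    serializer.is_valid(raise_exception=True)
    # Detail endpoints would re-serialize the result with PageDetailSerializer
    # to include description_html.
    return serializer.save()
```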
diff --git a/apiserver/plane/app/serializers/project.py b/apiserver/plane/app/serializers/project.py
index aef715e33..96d92f340 100644
--- a/apiserver/plane/app/serializers/project.py
+++ b/apiserver/plane/app/serializers/project.py
@@ -4,20 +4,24 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer, DynamicBaseSerializer
from plane.app.serializers.workspace import WorkspaceLiteSerializer
-from plane.app.serializers.user import UserLiteSerializer, UserAdminLiteSerializer
+from plane.app.serializers.user import (
+ UserLiteSerializer,
+ UserAdminLiteSerializer,
+)
from plane.db.models import (
Project,
ProjectMember,
ProjectMemberInvite,
ProjectIdentifier,
- ProjectFavorite,
ProjectDeployBoard,
ProjectPublicMember,
)
class ProjectSerializer(BaseSerializer):
- workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
+ workspace_detail = WorkspaceLiteSerializer(
+ source="workspace", read_only=True
+ )
class Meta:
model = Project
@@ -29,12 +33,16 @@ class ProjectSerializer(BaseSerializer):
def create(self, validated_data):
identifier = validated_data.get("identifier", "").strip().upper()
if identifier == "":
- raise serializers.ValidationError(detail="Project Identifier is required")
+ raise serializers.ValidationError(
+ detail="Project Identifier is required"
+ )
if ProjectIdentifier.objects.filter(
name=identifier, workspace_id=self.context["workspace_id"]
).exists():
- raise serializers.ValidationError(detail="Project Identifier is taken")
+ raise serializers.ValidationError(
+ detail="Project Identifier is taken"
+ )
project = Project.objects.create(
**validated_data, workspace_id=self.context["workspace_id"]
)
@@ -73,7 +81,9 @@ class ProjectSerializer(BaseSerializer):
return project
# If not same fail update
- raise serializers.ValidationError(detail="Project Identifier is already taken")
+ raise serializers.ValidationError(
+ detail="Project Identifier is already taken"
+ )
class ProjectLiteSerializer(BaseSerializer):
@@ -84,14 +94,19 @@ class ProjectLiteSerializer(BaseSerializer):
"identifier",
"name",
"cover_image",
- "icon_prop",
- "emoji",
+ "logo_props",
"description",
]
read_only_fields = fields
class ProjectListSerializer(DynamicBaseSerializer):
+ total_issues = serializers.IntegerField(read_only=True)
+ archived_issues = serializers.IntegerField(read_only=True)
+ archived_sub_issues = serializers.IntegerField(read_only=True)
+ draft_issues = serializers.IntegerField(read_only=True)
+ draft_sub_issues = serializers.IntegerField(read_only=True)
+ sub_issues = serializers.IntegerField(read_only=True)
is_favorite = serializers.BooleanField(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_cycles = serializers.IntegerField(read_only=True)
@@ -160,6 +175,12 @@ class ProjectMemberAdminSerializer(BaseSerializer):
fields = "__all__"
+class ProjectMemberRoleSerializer(DynamicBaseSerializer):
+ class Meta:
+ model = ProjectMember
+ fields = ("id", "role", "member", "project")
+
+
class ProjectMemberInviteSerializer(BaseSerializer):
project = ProjectLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
@@ -175,16 +196,6 @@ class ProjectIdentifierSerializer(BaseSerializer):
fields = "__all__"
-class ProjectFavoriteSerializer(BaseSerializer):
- class Meta:
- model = ProjectFavorite
- fields = "__all__"
- read_only_fields = [
- "workspace",
- "user",
- ]
-
-
class ProjectMemberLiteSerializer(BaseSerializer):
member = UserLiteSerializer(read_only=True)
is_subscribed = serializers.BooleanField(read_only=True)
@@ -197,7 +208,9 @@ class ProjectMemberLiteSerializer(BaseSerializer):
class ProjectDeployBoardSerializer(BaseSerializer):
project_details = ProjectLiteSerializer(read_only=True, source="project")
- workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
+ workspace_detail = WorkspaceLiteSerializer(
+ read_only=True, source="workspace"
+ )
class Meta:
model = ProjectDeployBoard
@@ -217,4 +230,4 @@ class ProjectPublicMemberSerializer(BaseSerializer):
"workspace",
"project",
"member",
- ]
\ No newline at end of file
+ ]
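`ProjectSerializer.create` still normalises the identifier and checks uniqueness per workspace (the changes above are reformatting only). A hedged sketch of the call shape, with the view wiring assumed:

```python
from plane.app.serializers.project import ProjectSerializer


def create_project(workspace_id, name, identifier):
    serializer = ProjectSerializer(
        data={"name": name, "identifier": identifier},
        context={"workspace_id": workspace_id},
    )
    serializer.is_valid(raise_exception=True)
    # The identifier is stripped and upper-cased inside create(); an empty or
    # already-taken identifier raises a ValidationError when save() runs.
    return serializer.save()
```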
diff --git a/apiserver/plane/app/serializers/state.py b/apiserver/plane/app/serializers/state.py
index 323254f26..773d8e461 100644
--- a/apiserver/plane/app/serializers/state.py
+++ b/apiserver/plane/app/serializers/state.py
@@ -6,10 +6,19 @@ from plane.db.models import State
class StateSerializer(BaseSerializer):
-
class Meta:
model = State
- fields = "__all__"
+ fields = [
+ "id",
+ "project_id",
+ "workspace_id",
+ "name",
+ "color",
+ "group",
+ "default",
+ "description",
+ "sequence",
+ ]
read_only_fields = [
"workspace",
"project",
@@ -25,4 +34,4 @@ class StateLiteSerializer(BaseSerializer):
"color",
"group",
]
- read_only_fields = fields
\ No newline at end of file
+ read_only_fields = fields
diff --git a/apiserver/plane/app/serializers/user.py b/apiserver/plane/app/serializers/user.py
index 1b94758e8..05d8665b5 100644
--- a/apiserver/plane/app/serializers/user.py
+++ b/apiserver/plane/app/serializers/user.py
@@ -2,9 +2,15 @@
from rest_framework import serializers
# Module import
+from plane.db.models import (
+ Account,
+ Profile,
+ User,
+ Workspace,
+ WorkspaceMemberInvite,
+)
+
from .base import BaseSerializer
-from plane.db.models import User, Workspace, WorkspaceMemberInvite
-from plane.license.models import InstanceAdmin, Instance
class UserSerializer(BaseSerializer):
@@ -24,10 +30,10 @@ class UserSerializer(BaseSerializer):
"last_logout_ip",
"last_login_uagent",
"token_updated_at",
- "is_onboarded",
"is_bot",
"is_password_autoset",
"is_email_verified",
+ "is_active",
]
extra_kwargs = {"password": {"write_only": True}}
@@ -51,19 +57,11 @@ class UserMeSerializer(BaseSerializer):
"is_active",
"is_bot",
"is_email_verified",
- "is_managed",
- "is_onboarded",
- "is_tour_completed",
- "mobile_number",
- "role",
- "onboarding_step",
"user_timezone",
"username",
- "theme",
- "last_workspace_id",
- "use_case",
"is_password_autoset",
"is_email_verified",
+ "last_login_medium",
]
read_only_fields = fields
@@ -84,32 +82,38 @@ class UserMeSettingsSerializer(BaseSerializer):
workspace_invites = WorkspaceMemberInvite.objects.filter(
email=obj.email
).count()
+
+ # profile
+ profile = Profile.objects.get(user=obj)
if (
- obj.last_workspace_id is not None
+ profile.last_workspace_id is not None
and Workspace.objects.filter(
- pk=obj.last_workspace_id,
+ pk=profile.last_workspace_id,
workspace_member__member=obj.id,
workspace_member__is_active=True,
).exists()
):
workspace = Workspace.objects.filter(
- pk=obj.last_workspace_id,
+ pk=profile.last_workspace_id,
workspace_member__member=obj.id,
workspace_member__is_active=True,
).first()
return {
- "last_workspace_id": obj.last_workspace_id,
- "last_workspace_slug": workspace.slug if workspace is not None else "",
- "fallback_workspace_id": obj.last_workspace_id,
- "fallback_workspace_slug": workspace.slug
- if workspace is not None
- else "",
+ "last_workspace_id": profile.last_workspace_id,
+ "last_workspace_slug": (
+ workspace.slug if workspace is not None else ""
+ ),
+ "fallback_workspace_id": profile.last_workspace_id,
+ "fallback_workspace_slug": (
+ workspace.slug if workspace is not None else ""
+ ),
"invites": workspace_invites,
}
else:
fallback_workspace = (
Workspace.objects.filter(
- workspace_member__member_id=obj.id, workspace_member__is_active=True
+ workspace_member__member_id=obj.id,
+ workspace_member__is_active=True,
)
.order_by("created_at")
.first()
@@ -117,12 +121,16 @@ class UserMeSettingsSerializer(BaseSerializer):
return {
"last_workspace_id": None,
"last_workspace_slug": None,
- "fallback_workspace_id": fallback_workspace.id
- if fallback_workspace is not None
- else None,
- "fallback_workspace_slug": fallback_workspace.slug
- if fallback_workspace is not None
- else None,
+ "fallback_workspace_id": (
+ fallback_workspace.id
+ if fallback_workspace is not None
+ else None
+ ),
+ "fallback_workspace_slug": (
+ fallback_workspace.slug
+ if fallback_workspace is not None
+ else None
+ ),
"invites": workspace_invites,
}
@@ -180,7 +188,9 @@ class ChangePasswordSerializer(serializers.Serializer):
if data.get("new_password") != data.get("confirm_password"):
raise serializers.ValidationError(
- {"error": "Confirm password should be same as the new password."}
+ {
+ "error": "Confirm password should be same as the new password."
+ }
)
return data
@@ -190,4 +200,17 @@ class ResetPasswordSerializer(serializers.Serializer):
"""
Serializer for password change endpoint.
"""
+
new_password = serializers.CharField(required=True, min_length=8)
+
+
+class ProfileSerializer(BaseSerializer):
+ class Meta:
+ model = Profile
+ fields = "__all__"
+
+
+class AccountSerializer(BaseSerializer):
+ class Meta:
+ model = Account
+ fields = "__all__"
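Workspace fallback data in `UserMeSettingsSerializer` now comes from the user's `Profile` row instead of fields on `User`, and the new `ProfileSerializer`/`AccountSerializer` expose those models directly. A minimal hedged sketch of the lookup; the one-to-one `Profile.user` relation is assumed from the query above.

```python
from plane.app.serializers.user import ProfileSerializer
from plane.db.models import Profile


def profile_settings(user):
    # Mirrors the lookup in UserMeSettingsSerializer.get_workspace.
    profile = Profile.objects.get(user=user)
    return profile.last_workspace_id, ProfileSerializer(profile).data
```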
diff --git a/apiserver/plane/app/serializers/view.py b/apiserver/plane/app/serializers/view.py
index e7502609a..c46a545d0 100644
--- a/apiserver/plane/app/serializers/view.py
+++ b/apiserver/plane/app/serializers/view.py
@@ -2,15 +2,17 @@
from rest_framework import serializers
# Module imports
-from .base import BaseSerializer
+from .base import BaseSerializer, DynamicBaseSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectLiteSerializer
-from plane.db.models import GlobalView, IssueView, IssueViewFavorite
+from plane.db.models import GlobalView, IssueView
from plane.utils.issue_filters import issue_filters
class GlobalViewSerializer(BaseSerializer):
- workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
+ workspace_detail = WorkspaceLiteSerializer(
+ source="workspace", read_only=True
+ )
class Meta:
model = GlobalView
@@ -38,10 +40,12 @@ class GlobalViewSerializer(BaseSerializer):
return super().update(instance, validated_data)
-class IssueViewSerializer(BaseSerializer):
+class IssueViewSerializer(DynamicBaseSerializer):
is_favorite = serializers.BooleanField(read_only=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
- workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
+ workspace_detail = WorkspaceLiteSerializer(
+ source="workspace", read_only=True
+ )
class Meta:
model = IssueView
@@ -68,16 +72,3 @@ class IssueViewSerializer(BaseSerializer):
validated_data["query"] = {}
validated_data["query"] = issue_filters(query_params, "PATCH")
return super().update(instance, validated_data)
-
-
-class IssueViewFavoriteSerializer(BaseSerializer):
- view_detail = IssueViewSerializer(source="issue_view", read_only=True)
-
- class Meta:
- model = IssueViewFavorite
- fields = "__all__"
- read_only_fields = [
- "workspace",
- "project",
- "user",
- ]
diff --git a/apiserver/plane/app/serializers/webhook.py b/apiserver/plane/app/serializers/webhook.py
index 961466d28..175dea304 100644
--- a/apiserver/plane/app/serializers/webhook.py
+++ b/apiserver/plane/app/serializers/webhook.py
@@ -1,5 +1,4 @@
# Python imports
-import urllib
import socket
import ipaddress
from urllib.parse import urlparse
@@ -10,78 +9,113 @@ from rest_framework import serializers
# Module imports
from .base import DynamicBaseSerializer
from plane.db.models import Webhook, WebhookLog
-from plane.db.models.webhook import validate_domain, validate_schema
+from plane.db.models.webhook import validate_domain, validate_schema
+
class WebhookSerializer(DynamicBaseSerializer):
url = serializers.URLField(validators=[validate_schema, validate_domain])
-
+
def create(self, validated_data):
url = validated_data.get("url", None)
# Extract the hostname from the URL
hostname = urlparse(url).hostname
if not hostname:
- raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})
+ raise serializers.ValidationError(
+ {"url": "Invalid URL: No hostname found."}
+ )
# Resolve the hostname to IP addresses
try:
ip_addresses = socket.getaddrinfo(hostname, None)
except socket.gaierror:
- raise serializers.ValidationError({"url": "Hostname could not be resolved."})
+ raise serializers.ValidationError(
+ {"url": "Hostname could not be resolved."}
+ )
if not ip_addresses:
- raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})
+ raise serializers.ValidationError(
+ {"url": "No IP addresses found for the hostname."}
+ )
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
- raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})
+ raise serializers.ValidationError(
+ {"url": "URL resolves to a blocked IP address."}
+ )
# Additional validation for multiple request domains and their subdomains
- request = self.context.get('request')
- disallowed_domains = ['plane.so',] # Add your disallowed domains here
+ request = self.context.get("request")
+ disallowed_domains = [
+ "plane.so",
+ ] # Add your disallowed domains here
if request:
- request_host = request.get_host().split(':')[0] # Remove port if present
+ request_host = request.get_host().split(":")[
+ 0
+ ] # Remove port if present
disallowed_domains.append(request_host)
# Check if hostname is a subdomain or exact match of any disallowed domain
- if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
- raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})
+ if any(
+ hostname == domain or hostname.endswith("." + domain)
+ for domain in disallowed_domains
+ ):
+ raise serializers.ValidationError(
+ {"url": "URL domain or its subdomain is not allowed."}
+ )
return Webhook.objects.create(**validated_data)
-
+
def update(self, instance, validated_data):
url = validated_data.get("url", None)
if url:
# Extract the hostname from the URL
hostname = urlparse(url).hostname
if not hostname:
- raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})
+ raise serializers.ValidationError(
+ {"url": "Invalid URL: No hostname found."}
+ )
# Resolve the hostname to IP addresses
try:
ip_addresses = socket.getaddrinfo(hostname, None)
except socket.gaierror:
- raise serializers.ValidationError({"url": "Hostname could not be resolved."})
+ raise serializers.ValidationError(
+ {"url": "Hostname could not be resolved."}
+ )
if not ip_addresses:
- raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})
+ raise serializers.ValidationError(
+ {"url": "No IP addresses found for the hostname."}
+ )
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
- raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})
+ raise serializers.ValidationError(
+ {"url": "URL resolves to a blocked IP address."}
+ )
# Additional validation for multiple request domains and their subdomains
- request = self.context.get('request')
- disallowed_domains = ['plane.so',] # Add your disallowed domains here
+ request = self.context.get("request")
+ disallowed_domains = [
+ "plane.so",
+ ] # Add your disallowed domains here
if request:
- request_host = request.get_host().split(':')[0] # Remove port if present
+ request_host = request.get_host().split(":")[
+ 0
+ ] # Remove port if present
disallowed_domains.append(request_host)
# Check if hostname is a subdomain or exact match of any disallowed domain
- if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
- raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})
+ if any(
+ hostname == domain or hostname.endswith("." + domain)
+ for domain in disallowed_domains
+ ):
+ raise serializers.ValidationError(
+ {"url": "URL domain or its subdomain is not allowed."}
+ )
return super().update(instance, validated_data)
@@ -95,12 +129,7 @@ class WebhookSerializer(DynamicBaseSerializer):
class WebhookLogSerializer(DynamicBaseSerializer):
-
class Meta:
model = WebhookLog
fields = "__all__"
- read_only_fields = [
- "workspace",
- "webhook"
- ]
-
+ read_only_fields = ["workspace", "webhook"]
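The webhook URL checks above (hostname resolution, rejection of private and loopback IPs, and the disallowed-domain list) guard against SSRF-style targets. A standalone sketch of the same logic, independent of the serializer and of the request-host handling:

```python
import ipaddress
import socket
from urllib.parse import urlparse


def is_allowed_webhook_url(url, disallowed_domains=("plane.so",)):
    """Rough standalone version of the checks in WebhookSerializer."""
    hostname = urlparse(url).hostname
    if not hostname:
        return False
    try:
        ip_addresses = socket.getaddrinfo(hostname, None)
    except socket.gaierror:
        return False
    if not ip_addresses:
        return False
    for addr in ip_addresses:
        ip = ipaddress.ip_address(addr[4][0])
        if ip.is_private or ip.is_loopback:
            return False
    # Block exact matches and subdomains of any disallowed domain.
    return not any(
        hostname == domain or hostname.endswith("." + domain)
        for domain in disallowed_domains
    )
```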
diff --git a/apiserver/plane/app/serializers/workspace.py b/apiserver/plane/app/serializers/workspace.py
index f0ad4b4ab..69f827c24 100644
--- a/apiserver/plane/app/serializers/workspace.py
+++ b/apiserver/plane/app/serializers/workspace.py
@@ -2,7 +2,7 @@
from rest_framework import serializers
# Module imports
-from .base import BaseSerializer
+from .base import BaseSerializer, DynamicBaseSerializer
from .user import UserLiteSerializer, UserAdminLiteSerializer
from plane.db.models import (
@@ -13,10 +13,11 @@ from plane.db.models import (
TeamMember,
WorkspaceMemberInvite,
WorkspaceTheme,
+ WorkspaceUserProperties,
)
-class WorkSpaceSerializer(BaseSerializer):
+class WorkSpaceSerializer(DynamicBaseSerializer):
owner = UserLiteSerializer(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
@@ -50,6 +51,7 @@ class WorkSpaceSerializer(BaseSerializer):
"owner",
]
+
class WorkspaceLiteSerializer(BaseSerializer):
class Meta:
model = Workspace
@@ -61,8 +63,7 @@ class WorkspaceLiteSerializer(BaseSerializer):
read_only_fields = fields
-
-class WorkSpaceMemberSerializer(BaseSerializer):
+class WorkSpaceMemberSerializer(DynamicBaseSerializer):
member = UserLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
@@ -72,13 +73,12 @@ class WorkSpaceMemberSerializer(BaseSerializer):
class WorkspaceMemberMeSerializer(BaseSerializer):
-
class Meta:
model = WorkspaceMember
fields = "__all__"
-class WorkspaceMemberAdminSerializer(BaseSerializer):
+class WorkspaceMemberAdminSerializer(DynamicBaseSerializer):
member = UserAdminLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
@@ -108,7 +108,9 @@ class WorkSpaceMemberInviteSerializer(BaseSerializer):
class TeamSerializer(BaseSerializer):
- members_detail = UserLiteSerializer(read_only=True, source="members", many=True)
+ members_detail = UserLiteSerializer(
+ read_only=True, source="members", many=True
+ )
members = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
@@ -145,7 +147,9 @@ class TeamSerializer(BaseSerializer):
members = validated_data.pop("members")
TeamMember.objects.filter(team=instance).delete()
team_members = [
- TeamMember(member=member, team=instance, workspace=instance.workspace)
+ TeamMember(
+ member=member, team=instance, workspace=instance.workspace
+ )
for member in members
]
TeamMember.objects.bulk_create(team_members, batch_size=10)
@@ -161,3 +165,13 @@ class WorkspaceThemeSerializer(BaseSerializer):
"workspace",
"actor",
]
+
+
+class WorkspaceUserPropertiesSerializer(BaseSerializer):
+ class Meta:
+ model = WorkspaceUserProperties
+ fields = "__all__"
+ read_only_fields = [
+ "workspace",
+ "user",
+ ]
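# --- Editor's note (not part of the patch): TeamSerializer.update in the hunk above
# replaces a team's entire membership by clearing the TeamMember through-table rows
# and bulk-creating the new set. A minimal sketch of that pattern under the same
# assumptions; the helper name is illustrative, and batch_size only chunks the
# INSERT statements issued by bulk_create.
def replace_team_members(instance, members):
    # Drop the current membership rows for this team.
    TeamMember.objects.filter(team=instance).delete()
    # Recreate the membership set in one bulk insert.
    TeamMember.objects.bulk_create(
        [
            TeamMember(member=member, team=instance, workspace=instance.workspace)
            for member in members
        ],
        batch_size=10,
    )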
diff --git a/apiserver/plane/app/urls/__init__.py b/apiserver/plane/app/urls/__init__.py
index d8334ed57..cb5f0253a 100644
--- a/apiserver/plane/app/urls/__init__.py
+++ b/apiserver/plane/app/urls/__init__.py
@@ -1,13 +1,11 @@
from .analytic import urlpatterns as analytic_urls
+from .api import urlpatterns as api_urls
from .asset import urlpatterns as asset_urls
-from .authentication import urlpatterns as authentication_urls
-from .config import urlpatterns as configuration_urls
from .cycle import urlpatterns as cycle_urls
+from .dashboard import urlpatterns as dashboard_urls
from .estimate import urlpatterns as estimate_urls
from .external import urlpatterns as external_urls
-from .importer import urlpatterns as importer_urls
from .inbox import urlpatterns as inbox_urls
-from .integration import urlpatterns as integration_urls
from .issue import urlpatterns as issue_urls
from .module import urlpatterns as module_urls
from .notification import urlpatterns as notification_urls
@@ -17,22 +15,17 @@ from .search import urlpatterns as search_urls
from .state import urlpatterns as state_urls
from .user import urlpatterns as user_urls
from .views import urlpatterns as view_urls
-from .workspace import urlpatterns as workspace_urls
-from .api import urlpatterns as api_urls
from .webhook import urlpatterns as webhook_urls
-
+from .workspace import urlpatterns as workspace_urls
urlpatterns = [
*analytic_urls,
*asset_urls,
- *authentication_urls,
- *configuration_urls,
*cycle_urls,
+ *dashboard_urls,
*estimate_urls,
*external_urls,
- *importer_urls,
*inbox_urls,
- *integration_urls,
*issue_urls,
*module_urls,
*notification_urls,
@@ -45,4 +38,4 @@ urlpatterns = [
*workspace_urls,
*api_urls,
*webhook_urls,
-]
\ No newline at end of file
+]
diff --git a/apiserver/plane/app/urls/authentication.py b/apiserver/plane/app/urls/authentication.py
deleted file mode 100644
index 39986f791..000000000
--- a/apiserver/plane/app/urls/authentication.py
+++ /dev/null
@@ -1,57 +0,0 @@
-from django.urls import path
-
-from rest_framework_simplejwt.views import TokenRefreshView
-
-
-from plane.app.views import (
- # Authentication
- SignInEndpoint,
- SignOutEndpoint,
- MagicGenerateEndpoint,
- MagicSignInEndpoint,
- OauthEndpoint,
- EmailCheckEndpoint,
- ## End Authentication
- # Auth Extended
- ForgotPasswordEndpoint,
- ResetPasswordEndpoint,
- ChangePasswordEndpoint,
- ## End Auth Extender
- # API Tokens
- ApiTokenEndpoint,
- ## End API Tokens
-)
-
-
-urlpatterns = [
- # Social Auth
- path("email-check/", EmailCheckEndpoint.as_view(), name="email"),
- path("social-auth/", OauthEndpoint.as_view(), name="oauth"),
- # Auth
- path("sign-in/", SignInEndpoint.as_view(), name="sign-in"),
- path("sign-out/", SignOutEndpoint.as_view(), name="sign-out"),
- # magic sign in
- path("magic-generate/", MagicGenerateEndpoint.as_view(), name="magic-generate"),
- path("magic-sign-in/", MagicSignInEndpoint.as_view(), name="magic-sign-in"),
- path("token/refresh/", TokenRefreshView.as_view(), name="token_refresh"),
- # Password Manipulation
- path(
- "users/me/change-password/",
- ChangePasswordEndpoint.as_view(),
- name="change-password",
- ),
- path(
- "reset-password///",
- ResetPasswordEndpoint.as_view(),
- name="password-reset",
- ),
- path(
- "forgot-password/",
- ForgotPasswordEndpoint.as_view(),
- name="forgot-password",
- ),
- # API Tokens
- path("api-tokens/", ApiTokenEndpoint.as_view(), name="api-tokens"),
- path("api-tokens//", ApiTokenEndpoint.as_view(), name="api-tokens"),
- ## End API Tokens
-]
diff --git a/apiserver/plane/app/urls/config.py b/apiserver/plane/app/urls/config.py
deleted file mode 100644
index 12beb63aa..000000000
--- a/apiserver/plane/app/urls/config.py
+++ /dev/null
@@ -1,12 +0,0 @@
-from django.urls import path
-
-
-from plane.app.views import ConfigurationEndpoint
-
-urlpatterns = [
- path(
- "configs/",
- ConfigurationEndpoint.as_view(),
- name="configuration",
- ),
-]
\ No newline at end of file
diff --git a/apiserver/plane/app/urls/cycle.py b/apiserver/plane/app/urls/cycle.py
index 46e6a5e84..ce2e0f6dc 100644
--- a/apiserver/plane/app/urls/cycle.py
+++ b/apiserver/plane/app/urls/cycle.py
@@ -7,6 +7,8 @@ from plane.app.views import (
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
TransferCycleIssueEndpoint,
+ CycleUserPropertiesEndpoint,
+ CycleArchiveUnarchiveEndpoint,
)
@@ -44,7 +46,7 @@ urlpatterns = [
name="project-issue-cycle",
),
path(
- "workspaces//projects//cycles//cycle-issues//",
+ "workspaces//projects//cycles//cycle-issues//",
CycleIssueViewSet.as_view(
{
"get": "retrieve",
@@ -84,4 +86,24 @@ urlpatterns = [
TransferCycleIssueEndpoint.as_view(),
name="transfer-issues",
),
+ path(
+ "workspaces//projects//cycles//user-properties/",
+ CycleUserPropertiesEndpoint.as_view(),
+ name="cycle-user-filters",
+ ),
+ path(
+ "workspaces//projects//cycles//archive/",
+ CycleArchiveUnarchiveEndpoint.as_view(),
+ name="cycle-archive-unarchive",
+ ),
+ path(
+ "workspaces//projects//archived-cycles/",
+ CycleArchiveUnarchiveEndpoint.as_view(),
+ name="cycle-archive-unarchive",
+ ),
+ path(
+ "workspaces//projects//archived-cycles//",
+ CycleArchiveUnarchiveEndpoint.as_view(),
+ name="cycle-archive-unarchive",
+ ),
]
diff --git a/apiserver/plane/app/urls/dashboard.py b/apiserver/plane/app/urls/dashboard.py
new file mode 100644
index 000000000..0dc24a808
--- /dev/null
+++ b/apiserver/plane/app/urls/dashboard.py
@@ -0,0 +1,23 @@
+from django.urls import path
+
+
+from plane.app.views import DashboardEndpoint, WidgetsEndpoint
+
+
+urlpatterns = [
+ path(
+ "workspaces//dashboard/",
+ DashboardEndpoint.as_view(),
+ name="dashboard",
+ ),
+ path(
+ "workspaces//dashboard//",
+ DashboardEndpoint.as_view(),
+ name="dashboard",
+ ),
+ path(
+ "dashboard//widgets//",
+ WidgetsEndpoint.as_view(),
+ name="widgets",
+ ),
+]
diff --git a/apiserver/plane/app/urls/external.py b/apiserver/plane/app/urls/external.py
index 774e6fb7c..8db87a249 100644
--- a/apiserver/plane/app/urls/external.py
+++ b/apiserver/plane/app/urls/external.py
@@ -2,7 +2,6 @@ from django.urls import path
from plane.app.views import UnsplashEndpoint
-from plane.app.views import ReleaseNotesEndpoint
from plane.app.views import GPTIntegrationEndpoint
@@ -12,11 +11,6 @@ urlpatterns = [
UnsplashEndpoint.as_view(),
name="unsplash",
),
- path(
- "release-notes/",
- ReleaseNotesEndpoint.as_view(),
- name="release-notes",
- ),
path(
"workspaces//projects//ai-assistant/",
GPTIntegrationEndpoint.as_view(),
diff --git a/apiserver/plane/app/urls/importer.py b/apiserver/plane/app/urls/importer.py
deleted file mode 100644
index f3a018d78..000000000
--- a/apiserver/plane/app/urls/importer.py
+++ /dev/null
@@ -1,37 +0,0 @@
-from django.urls import path
-
-
-from plane.app.views import (
- ServiceIssueImportSummaryEndpoint,
- ImportServiceEndpoint,
- UpdateServiceImportStatusEndpoint,
-)
-
-
-urlpatterns = [
- path(
- "workspaces//importers//",
- ServiceIssueImportSummaryEndpoint.as_view(),
- name="importer-summary",
- ),
- path(
- "workspaces//projects/importers//",
- ImportServiceEndpoint.as_view(),
- name="importer",
- ),
- path(
- "workspaces//importers/",
- ImportServiceEndpoint.as_view(),
- name="importer",
- ),
- path(
- "workspaces//importers///",
- ImportServiceEndpoint.as_view(),
- name="importer",
- ),
- path(
- "workspaces//projects//service//importers//",
- UpdateServiceImportStatusEndpoint.as_view(),
- name="importer-status",
- ),
-]
diff --git a/apiserver/plane/app/urls/inbox.py b/apiserver/plane/app/urls/inbox.py
index 16ea40b21..b6848244b 100644
--- a/apiserver/plane/app/urls/inbox.py
+++ b/apiserver/plane/app/urls/inbox.py
@@ -30,7 +30,7 @@ urlpatterns = [
name="inbox",
),
path(
- "workspaces//projects//inboxes//inbox-issues/",
+ "workspaces//projects//inbox-issues/",
InboxIssueViewSet.as_view(
{
"get": "list",
@@ -40,7 +40,7 @@ urlpatterns = [
name="inbox-issue",
),
path(
- "workspaces//projects//inboxes//inbox-issues//",
+ "workspaces//projects//inbox-issues//",
InboxIssueViewSet.as_view(
{
"get": "retrieve",
diff --git a/apiserver/plane/app/urls/integration.py b/apiserver/plane/app/urls/integration.py
deleted file mode 100644
index cf3f82d5a..000000000
--- a/apiserver/plane/app/urls/integration.py
+++ /dev/null
@@ -1,150 +0,0 @@
-from django.urls import path
-
-
-from plane.app.views import (
- IntegrationViewSet,
- WorkspaceIntegrationViewSet,
- GithubRepositoriesEndpoint,
- GithubRepositorySyncViewSet,
- GithubIssueSyncViewSet,
- GithubCommentSyncViewSet,
- BulkCreateGithubIssueSyncEndpoint,
- SlackProjectSyncViewSet,
-)
-
-
-urlpatterns = [
- path(
- "integrations/",
- IntegrationViewSet.as_view(
- {
- "get": "list",
- "post": "create",
- }
- ),
- name="integrations",
- ),
- path(
- "integrations//",
- IntegrationViewSet.as_view(
- {
- "get": "retrieve",
- "patch": "partial_update",
- "delete": "destroy",
- }
- ),
- name="integrations",
- ),
- path(
- "workspaces//workspace-integrations/",
- WorkspaceIntegrationViewSet.as_view(
- {
- "get": "list",
- }
- ),
- name="workspace-integrations",
- ),
- path(
- "workspaces//workspace-integrations//",
- WorkspaceIntegrationViewSet.as_view(
- {
- "post": "create",
- }
- ),
- name="workspace-integrations",
- ),
- path(
- "workspaces//workspace-integrations//provider/",
- WorkspaceIntegrationViewSet.as_view(
- {
- "get": "retrieve",
- "delete": "destroy",
- }
- ),
- name="workspace-integrations",
- ),
- # Github Integrations
- path(
- "workspaces//workspace-integrations//github-repositories/",
- GithubRepositoriesEndpoint.as_view(),
- ),
- path(
- "workspaces//projects//workspace-integrations//github-repository-sync/",
- GithubRepositorySyncViewSet.as_view(
- {
- "get": "list",
- "post": "create",
- }
- ),
- ),
- path(
- "workspaces//projects//workspace-integrations//github-repository-sync//",
- GithubRepositorySyncViewSet.as_view(
- {
- "get": "retrieve",
- "delete": "destroy",
- }
- ),
- ),
- path(
- "workspaces//projects//github-repository-sync//github-issue-sync/",
- GithubIssueSyncViewSet.as_view(
- {
- "post": "create",
- "get": "list",
- }
- ),
- ),
- path(
- "workspaces//projects//github-repository-sync//bulk-create-github-issue-sync/",
- BulkCreateGithubIssueSyncEndpoint.as_view(),
- ),
- path(
- "workspaces//projects//github-repository-sync//github-issue-sync//",
- GithubIssueSyncViewSet.as_view(
- {
- "get": "retrieve",
- "delete": "destroy",
- }
- ),
- ),
- path(
- "workspaces//projects//github-repository-sync//github-issue-sync//github-comment-sync/",
- GithubCommentSyncViewSet.as_view(
- {
- "post": "create",
- "get": "list",
- }
- ),
- ),
- path(
- "workspaces//projects//github-repository-sync//github-issue-sync//github-comment-sync//",
- GithubCommentSyncViewSet.as_view(
- {
- "get": "retrieve",
- "delete": "destroy",
- }
- ),
- ),
- ## End Github Integrations
- # Slack Integration
- path(
- "workspaces//projects//workspace-integrations//project-slack-sync/",
- SlackProjectSyncViewSet.as_view(
- {
- "post": "create",
- "get": "list",
- }
- ),
- ),
- path(
- "workspaces//projects//workspace-integrations//project-slack-sync//",
- SlackProjectSyncViewSet.as_view(
- {
- "delete": "destroy",
- "get": "retrieve",
- }
- ),
- ),
- ## End Slack Integration
-]
diff --git a/apiserver/plane/app/urls/issue.py b/apiserver/plane/app/urls/issue.py
index 971fbc395..0d3b9e063 100644
--- a/apiserver/plane/app/urls/issue.py
+++ b/apiserver/plane/app/urls/issue.py
@@ -1,30 +1,32 @@
from django.urls import path
-
from plane.app.views import (
- IssueViewSet,
- LabelViewSet,
BulkCreateIssueLabelsEndpoint,
BulkDeleteIssuesEndpoint,
- BulkImportIssuesEndpoint,
- UserWorkSpaceIssues,
SubIssuesEndpoint,
IssueLinkViewSet,
IssueAttachmentEndpoint,
+ CommentReactionViewSet,
ExportIssuesEndpoint,
IssueActivityEndpoint,
- IssueCommentViewSet,
- IssueSubscriberViewSet,
- IssueReactionViewSet,
- CommentReactionViewSet,
- IssueUserDisplayPropertyEndpoint,
IssueArchiveViewSet,
- IssueRelationViewSet,
+ IssueCommentViewSet,
IssueDraftViewSet,
+ IssueListEndpoint,
+ IssueReactionViewSet,
+ IssueRelationViewSet,
+ IssueSubscriberViewSet,
+ IssueUserDisplayPropertyEndpoint,
+ IssueViewSet,
+ LabelViewSet,
)
-
urlpatterns = [
+ path(
+ "workspaces//projects//issues/list/",
+ IssueListEndpoint.as_view(),
+ name="project-issue",
+ ),
path(
"workspaces//projects//issues/",
IssueViewSet.as_view(
@@ -79,16 +81,7 @@ urlpatterns = [
BulkDeleteIssuesEndpoint.as_view(),
name="project-issues-bulk",
),
- path(
- "workspaces//projects//bulk-import-issues//",
- BulkImportIssuesEndpoint.as_view(),
- name="project-issues-bulk",
- ),
- path(
- "workspaces//my-issues/",
- UserWorkSpaceIssues.as_view(),
- name="workspace-issues",
- ),
+ ##
path(
"workspaces//projects//issues//sub-issues/",
SubIssuesEndpoint.as_view(),
@@ -235,7 +228,7 @@ urlpatterns = [
## End Comment Reactions
## IssueProperty
path(
- "workspaces//projects//issue-display-properties/",
+ "workspaces//projects//user-properties/",
IssueUserDisplayPropertyEndpoint.as_view(),
name="project-issue-display-properties",
),
@@ -251,23 +244,15 @@ urlpatterns = [
name="project-issue-archive",
),
path(
- "workspaces//projects//archived-issues//",
+ "workspaces//projects//issues//archive/",
IssueArchiveViewSet.as_view(
{
"get": "retrieve",
- "delete": "destroy",
+ "post": "archive",
+ "delete": "unarchive",
}
),
- name="project-issue-archive",
- ),
- path(
- "workspaces//projects//unarchive/