forked from github/plane
feat: openai host (#1447)
* support custom openai api endpoints (for azure and local deployments)
* update openai python package and use new api format
* fix: project member list endpoint n+1 (#1458)
* chore: workspace char name and slug maximum length (#1453)
* fix: user invitation workflow for self hosted version (#1441)
* chore: due date filter (#1460)
* refactor: standardized date format throughout the platform (#1461)
* chore: due date filter (#965)
* chore: due date filter
* fix: deployment error
* chore: optimized code
* chore: created constants for due date
* chore: create separate css file for react datepicker styling
* fix: due date filter
* chore: highlight selected option
* fix: merge conflicts
* fix: build error
* chore: date range selector validation
* fix: issue views overflow
* refactor: due date filter modal code
* refactor: multi level dropdown
* chore: due date filter select default value
---------
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
* fix: layout of tabs on Pages is not adaptable to mobile screens #1380 (#1400)
* fix: layout of tabs on Pages is not adaptable to mobile screens #1380
* fix: scrolling experience on page
* chore: update project members type (#1459)
* fix: state icon color on group titles (#1435)
* fix: workspace invitation delete for self hosted (#1475)
* chore: upgrade backend dependencies (#1479)
* chore: upgrade backend dependencies
* dev: update storage settings for self hosted version
* chore: project members endpoint to support bulk operations (#1464)
* chore: rename workspace company size to organization size (#1463)
* chore: rename workspace company size to organization size
* chore: make workspace organization size as required
* fix: static and media files storages (#1482)
* fix: emoji render function (#1484)
* fix: emoji render function
* fix: emoji render function
* feat: bulk invite for project (#1466)
* feat: bulk invite for project
* feat: members dropdown updated
* fix: error message added, style: ui improvement
* feat: added add members button for scenarios with multiple members
* chore: updated watch to fields
* feat: created on and updated on column added in spreadsheet view (#1454)
* feat: created on and updated on column added in spreadsheet view
* fix: build fix
* refactor: simplify logic
---------
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
* fix: resolved overflow issue with longer state names (#1444)
* chore: update theming structure (#1422)
* chore: store various shades of accent color
* refactor: custom theme selector
* refactor: custom theme selector
* chore: update custom theme input labels
* fix: color generator function logic
* fix: accent color preloaded data
* chore: new theming structure
* chore: update shades calculation logic
* refactor: variable names
* chore: update color scheming
* chore: new color scheming
* refactor: themes folder structure
* chore: update classnames according to new variables
* Revert "chore: update classnames according to new variables"
This reverts commit 60a87453b21768167e37889e709c12287ca07b08.
* chore: remove temp file
* chore: update classnames according to the new theming structure (#1494)
* chore: store various shades of accent color
* refactor: custom theme selector
* refactor: custom theme selector
* chore: update custom theme input labels
* fix: color generator function logic
* fix: accent color preloaded data
* chore: new theming structure
* chore: update shades calculation logic
* refactor: variable names
* chore: update color scheming
* chore: new color scheming
* refactor: themes folder structure
* chore: update classnames to the new ones
* chore: update static colors
* chore: sidebar link colors
* chore: placeholder color
* chore: update border classnames
* chore: environment variables for worker and api (#1492)
* feat: notifications (#1363)
* feat: added new issue subscriber table
* dev: notification model
* feat: added CRUD operation for issue subscriber
* Revert "feat: added CRUD operation for issue subscriber"
This reverts commit b22e062576.
* feat: added CRUD operation for issue subscriber
* dev: notification models and operations
* dev: remove delete endpoint response data
* dev: notification endpoints and fix bg worker for saving notifications
* feat: added list and unsubscribe function in issue subscriber
* dev: filter by snoozed and response update for list and permissions
* dev: update issue notifications
* dev: notification segregation
* dev: update notifications
* dev: notification filtering
* dev: add issue name in notifications
* dev: notification new endpoints
---------
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
* fix: docker inconsistencies (#1493)
Co-authored-by: pablohashescobar <118773738+pablohashescobar@users.noreply.github.com>
* feat: issue archival and close (#1474)
* chore: added issue archive using celery beat
* chore: changed the file name
* fix: created API and updated logic for archived issues
* chore: added issue activity message
* chore: added the beat scheduler command
* feat: added unarchive issue functionality
* feat: auto issue close
* dev: refactor endpoints and issue archive activity
* dev: update manager for global filtering
* fix: added id in issue unarchive url
* dev: update auto close to include default close state
* fix: updated the list and retrieve function
* fix: added the prefetch fields
* dev: update unarchive
---------
Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
* fix: updated text and background colors (#1496)
* fix: custom colors opacity
* chore: update text colors for dark mode
* fix: dropdown text colors, datepicker bg color
* chore: update text colors
* chore: updated primary bg color
* feat: web waitlist modal integration (#1487)
* dev: updating the limit of the issues in the sidebar and a waitlist modal
* dev: integrated supabase and implemented web waitlist api endpoint
* dev: updated web pro waitlist request
* dev: rename typo
* dev: web waitlist endpoint update
* update: ui fixes
* fix: removed supabase from env.example
* chore: replaced supabase npm package to cdn
* chore: updated supabase req
* fix: Handled error status and error message.
---------
Co-authored-by: srinivaspendem <you@example.comsrinivaspendem2612@gmail.com>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
* add openai host env in all places
---------
Co-authored-by: Ankur Singh <ankur.singh@epfl.ch>
Co-authored-by: pablohashescobar <118773738+pablohashescobar@users.noreply.github.com>
Co-authored-by: Anmol Singh Bhatia <121005188+anmolsinghbhatia@users.noreply.github.com>
Co-authored-by: Kunal Vishwakarma <116634168+kunalv17@users.noreply.github.com>
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
Co-authored-by: Chandan Jal <97095857+ChandanJal@users.noreply.github.com>
Co-authored-by: Aaryan Khandelwal <65252264+aaryan610@users.noreply.github.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: Quadrubo <71718414+Quadrubo@users.noreply.github.com>
Co-authored-by: Bavisetti Narayan <72156168+NarayanBavisetti@users.noreply.github.com>
Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: srinivas pendem <65014795+srinivaspendem@users.noreply.github.com>
Co-authored-by: srinivaspendem <you@example.comsrinivaspendem2612@gmail.com>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
This commit is contained in:
parent eba2f3820a
commit 1403a536c1

.env.example
@@ -9,11 +9,11 @@ NEXT_PUBLIC_GITHUB_ID=""
 NEXT_PUBLIC_GITHUB_APP_NAME=""
 # Sentry DSN for error monitoring
 NEXT_PUBLIC_SENTRY_DSN=""
 # Enable/Disable OAUTH - default 0 for selfhosted instance
 NEXT_PUBLIC_ENABLE_OAUTH=0
 # Enable/Disable sentry
 NEXT_PUBLIC_ENABLE_SENTRY=0
 # Enable/Disable session recording
 NEXT_PUBLIC_ENABLE_SESSION_RECORDER=0
 # Enable/Disable event tracking
 NEXT_PUBLIC_TRACK_EVENTS=0
@@ -59,15 +59,16 @@ AWS_S3_BUCKET_NAME="uploads"
 FILE_SIZE_LIMIT=5242880

 # GPT settings
-OPENAI_API_KEY=""
-GPT_ENGINE=""
+OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
+OPENAI_API_KEY="sk-" # add your openai key here
+GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access

 # Github
 GITHUB_CLIENT_SECRET="" # For fetching release notes

 # Settings related to Docker
 DOCKERIZED=1
 # set to 1 If using the pre-configured minio setup
 USE_MINIO=1

 # Nginx Configuration
@@ -79,4 +80,4 @@ DEFAULT_PASSWORD="password123"

 # SignUps
 ENABLE_SIGNUP="1"
 # Auto generated and Required that will be generated from setup.sh
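The new OPENAI_API_BASE entry is what lets the GPT integration target Azure OpenAI or a locally hosted, OpenAI-compatible server instead of api.openai.com. A quick way to sanity-check a custom endpoint before putting it in .env is to point the pre-1.0 openai Python package at it directly; the URL and key below are placeholders, not values from this commit:

import openai

# Illustrative values only; substitute whatever you plan to put in .env.
openai.api_base = "http://localhost:8080/v1"  # e.g. a local OpenAI-compatible server
openai.api_key = "sk-placeholder"

# If the endpoint implements the OpenAI API, this lists the models it serves.
print([model["id"] for model in openai.Model.list()["data"]])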
@@ -67,7 +67,7 @@ class GPTIntegrationEndpoint(BaseAPIView):

         openai.api_key = settings.OPENAI_API_KEY
         response = openai.Completion.create(
-            engine=settings.GPT_ENGINE,
+            model=settings.GPT_ENGINE,
             prompt=final_text,
             temperature=0.7,
             max_tokens=1024,
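The hunk above switches the completion call from the older engine= parameter to model=, matching the updated openai package. A minimal sketch of how the custom host fits into this kind of call (pre-1.0 openai API style) is shown below; the exact line where this commit applies settings.OPENAI_API_BASE is not visible in this hunk, so treat the placement as illustrative:

import openai
from django.conf import settings

# Route requests to the configured endpoint: api.openai.com by default,
# or an Azure / locally hosted OpenAI-compatible server.
openai.api_base = settings.OPENAI_API_BASE
openai.api_key = settings.OPENAI_API_KEY

response = openai.Completion.create(
    model=settings.GPT_ENGINE,  # defaults to "gpt-3.5-turbo" in these settings
    prompt="Summarise the issue description in one sentence.",
    temperature=0.7,
    max_tokens=1024,
)
text = response["choices"][0]["text"]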
@@ -10,9 +10,7 @@ from sentry_sdk.integrations.redis import RedisIntegration

 from .common import *  # noqa

-DEBUG = int(os.environ.get(
-    "DEBUG", 1
-)) == 1
+DEBUG = int(os.environ.get("DEBUG", 1)) == 1

 EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"

@@ -27,13 +25,11 @@ DATABASES = {
     }
 }

-DOCKERIZED = int(os.environ.get(
-    "DOCKERIZED", 0
-)) == 1
+DOCKERIZED = int(os.environ.get("DOCKERIZED", 0)) == 1

 USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1

 FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))

 if DOCKERIZED:
     DATABASES["default"] = dj_database_url.config()
@@ -65,6 +61,27 @@ if os.environ.get("SENTRY_DSN", False):
         traces_sample_rate=0.7,
         profiles_sample_rate=1.0,
     )
+else:
+    LOGGING = {
+        "version": 1,
+        "disable_existing_loggers": False,
+        "handlers": {
+            "console": {
+                "class": "logging.StreamHandler",
+            },
+        },
+        "root": {
+            "handlers": ["console"],
+            "level": "DEBUG",
+        },
+        "loggers": {
+            "*": {
+                "handlers": ["console"],
+                "level": "DEBUG",
+                "propagate": True,
+            },
+        },
+    }

 REDIS_HOST = "localhost"
 REDIS_PORT = 6379
@@ -83,8 +100,9 @@ PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False)
 ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
 ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)

+OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
 OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
-GPT_ENGINE = os.environ.get("GPT_ENGINE", "text-davinci-003")
+GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")

 SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)

@@ -95,4 +113,4 @@ CELERY_BROKER_URL = os.environ.get("REDIS_URL")

 GITHUB_ACCESS_TOKEN = os.environ.get("GITHUB_ACCESS_TOKEN", False)

 ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
@@ -246,8 +246,9 @@ PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False)
 ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
 ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)

+OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
 OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
-GPT_ENGINE = os.environ.get("GPT_ENGINE", "text-davinci-003")
+GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")

 SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)

@@ -11,10 +11,9 @@ from sentry_sdk.integrations.django import DjangoIntegration
 from sentry_sdk.integrations.redis import RedisIntegration

 from .common import *  # noqa

 # Database
-DEBUG = int(os.environ.get(
-    "DEBUG", 1
-)) == 1
+DEBUG = int(os.environ.get("DEBUG", 1)) == 1
 DATABASES = {
     "default": {
         "ENGINE": "django.db.backends.postgresql_psycopg2",
@@ -56,9 +55,7 @@ STORAGES = {


 # Make true if running in a docker environment
-DOCKERIZED = int(os.environ.get(
-    "DOCKERIZED", 0
-)) == 1
+DOCKERIZED = int(os.environ.get("DOCKERIZED", 0)) == 1
 FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
 USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1

@@ -201,15 +198,19 @@ PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False)
 ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
 ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)


+OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
 OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", False)
-GPT_ENGINE = os.environ.get("GPT_ENGINE", "text-davinci-003")
+GPT_ENGINE = os.environ.get("GPT_ENGINE", "gpt-3.5-turbo")

 SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)

 LOGGER_BASE_URL = os.environ.get("LOGGER_BASE_URL", False)

 redis_url = os.environ.get("REDIS_URL")
-broker_url = f"{redis_url}?ssl_cert_reqs={ssl.CERT_NONE.name}&ssl_ca_certs={certifi.where()}"
+broker_url = (
+    f"{redis_url}?ssl_cert_reqs={ssl.CERT_NONE.name}&ssl_ca_certs={certifi.where()}"
+)

 CELERY_RESULT_BACKEND = broker_url
 CELERY_BROKER_URL = broker_url
@@ -1,6 +1,7 @@
 version: "3.8"

-x-api-and-worker-env: &api-and-worker-env
+x-api-and-worker-env:
+  &api-and-worker-env
   DEBUG: ${DEBUG}
   SENTRY_DSN: ${SENTRY_DSN}
   DJANGO_SETTINGS_MODULE: plane.settings.production
@@ -23,6 +24,7 @@ x-api-and-worker-env: &api-and-worker-env
   GITHUB_CLIENT_SECRET: ${GITHUB_CLIENT_SECRET}
   DISABLE_COLLECTSTATIC: 1
   DOCKERIZED: 1
+  OPENAI_API_BASE: ${OPENAI_API_BASE}
   OPENAI_API_KEY: ${OPENAI_API_KEY}
   GPT_ENGINE: ${GPT_ENGINE}
   SECRET_KEY: ${SECRET_KEY}
@@ -118,9 +120,7 @@ services:
   createbuckets:
     image: minio/mc
     entrypoint: >
-      /bin/sh -c " /usr/bin/mc config host add plane-minio http://plane-minio:9000 \$AWS_ACCESS_KEY_ID \$AWS_SECRET_ACCESS_KEY;
-      /usr/bin/mc mb plane-minio/\$AWS_S3_BUCKET_NAME;
-      /usr/bin/mc anonymous set download plane-minio/\$AWS_S3_BUCKET_NAME; exit 0; "
+      /bin/sh -c " /usr/bin/mc config host add plane-minio http://plane-minio:9000 \$AWS_ACCESS_KEY_ID \$AWS_SECRET_ACCESS_KEY; /usr/bin/mc mb plane-minio/\$AWS_S3_BUCKET_NAME; /usr/bin/mc anonymous set download plane-minio/\$AWS_S3_BUCKET_NAME; exit 0; "
     env_file:
       - .env
     depends_on:
@@ -1,6 +1,7 @@
 version: "3.8"

-x-api-and-worker-env: &api-and-worker-env
+x-api-and-worker-env:
+  &api-and-worker-env
   DEBUG: ${DEBUG}
   SENTRY_DSN: ${SENTRY_DSN}
   DJANGO_SETTINGS_MODULE: plane.settings.production
@@ -23,6 +24,7 @@ x-api-and-worker-env: &api-and-worker-env
   GITHUB_CLIENT_SECRET: ${GITHUB_CLIENT_SECRET}
   DISABLE_COLLECTSTATIC: 1
   DOCKERIZED: 1
+  OPENAI_API_BASE: ${OPENAI_API_BASE}
   OPENAI_API_KEY: ${OPENAI_API_KEY}
   GPT_ENGINE: ${GPT_ENGINE}
   SECRET_KEY: ${SECRET_KEY}
@@ -126,9 +128,7 @@ services:
   createbuckets:
     image: minio/mc
     entrypoint: >
-      /bin/sh -c " /usr/bin/mc config host add plane-minio http://plane-minio:9000 \$AWS_ACCESS_KEY_ID \$AWS_SECRET_ACCESS_KEY;
-      /usr/bin/mc mb plane-minio/\$AWS_S3_BUCKET_NAME;
-      /usr/bin/mc anonymous set download plane-minio/\$AWS_S3_BUCKET_NAME; exit 0; "
+      /bin/sh -c " /usr/bin/mc config host add plane-minio http://plane-minio:9000 \$AWS_ACCESS_KEY_ID \$AWS_SECRET_ACCESS_KEY; /usr/bin/mc mb plane-minio/\$AWS_S3_BUCKET_NAME; /usr/bin/mc anonymous set download plane-minio/\$AWS_S3_BUCKET_NAME; exit 0; "
     env_file:
       - .env
     depends_on:
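In both compose files the new variable is added to the shared x-api-and-worker-env anchor, so the api and worker containers receive it from .env with no extra plumbing. Inside either container it is an ordinary process environment variable that the Django settings above read; a trivial check, run inside a container, would be:

import os

# Falls back to the public endpoint when the variable is unset,
# mirroring the default used in the settings files above.
print(os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1"))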