66 Commits

Author SHA1 Message Date
Stepan Vladovskiy
fb6e03c1a2 no force, deployd from gitea
Some checks failed
Deploy to discoursio-api / deploy (push) Failing after 46s
2025-04-07 11:14:58 -03:00
Stepan Vladovskiy
46c3345f45 debug: with force
Some checks failed
Deploy to discoursio-api / deploy (push) Failing after 1m42s
2025-04-07 11:11:01 -03:00
Stepan Vladovskiy
1156a32a88 feat: move map from nginx sigil to nginx global config
All checks were successful
Deploy to discoursio-api / deploy (push) Successful in 1m54s
2025-01-30 12:40:24 -03:00
d848af524f runtime-fix 2024-12-21 23:31:19 +03:00
c9f88c36cd trigdeploy 2024-12-21 23:00:30 +03:00
0ad44a944e Revert ".."
Some checks failed
Deploy to discoursio-api / deploy (push) Failing after 1m22s
This reverts commit fbd0e03a33.
2024-08-06 21:01:03 +03:00
fbd0e03a33 .. 2024-02-23 17:08:08 +03:00
Stepan Vladovskii
076828f003 feat: no force any more for CI deploy from Gitea
All checks were successful
Deploy to discoursio-api / deploy (push) Successful in 30s
2024-01-28 18:38:34 -03:00
Stepan Vladovskii
4f6c459532 feat: sigil with other architecture
All checks were successful
Deploy to discoursio-api / deploy (push) Successful in 1m41s
2024-01-24 22:53:16 -03:00
Stepan Vladovskii
11524c17ea feat: yess, it was deploy on staging
All checks were successful
Deploy to discoursio-api / deploy (push) Successful in 1m40s
2024-01-24 21:06:11 -03:00
168f845772 Merge branch 'main' of https://dev.discours.io/discours.io/core
Some checks failed
Deploy to discoursio-api / deploy (push) Failing after 1m31s
2024-01-25 02:59:43 +03:00
657146cdca trig-redeploy 2024-01-25 02:56:30 +03:00
Stepan Vladovskii
86111bc9f5 debug: simplify main.yml for actions
Some checks failed
Deploy to discoursio-api / deploy (push) Failing after 58s
2024-01-24 20:45:34 -03:00
Stepan Vladovskii
a8018a0b2f debug: simplify main.yml for actions
Some checks failed
Deploy to discoursio-api / deploy (push) Failing after 3s
2024-01-24 20:39:58 -03:00
Stepan Vladovskii
9d8bd629ab feat: add CI to main for deploy on discoursio-api 2024-01-24 19:09:54 -03:00
1eddf9cc0b topic-resolvers-fix 2024-01-24 18:21:34 +03:00
6415f86286 redis-log-fix 2024-01-24 15:32:53 +03:00
5d1c4f0084 launch-fix 2024-01-24 11:36:15 +03:00
1dce947db6 db-link-fix 2024-01-24 11:19:42 +03:00
4d9551a93c redis-fix 2024-01-24 11:06:46 +03:00
e6471280d5 dockerfile-fix2 2024-01-24 10:59:34 +03:00
3e062b4346 untransform-dockerfile 2024-01-24 10:55:14 +03:00
5b1a93c781 Merge branch 'main' of github.com:Discours/discours-backend 2024-01-24 10:47:07 +03:00
Igor Lobanov
c3a482614e robo migrate script 2024-01-08 09:20:29 +01:00
c30001547a Merge branch 'main' of github.com:Discours/discours-backend 2023-12-24 17:26:17 +03:00
Igor Lobanov
67576d0a5b only published in random topic shouts (#114) 2023-12-21 11:49:28 +01:00
Igor Lobanov
f395832d32 random topic shouts query, published date filter in random tops (#113)
Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-12-21 00:53:53 +01:00
Igor Lobanov
ff834987d4 unrated shouts query fix (#112)
Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-12-18 14:38:45 +01:00
Igor Lobanov
e23e379102 unrated shouts query update (#111)
Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-12-16 14:47:58 +01:00
Igor Lobanov
f5a3e273a6 unrated shouts query (#110)
Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-12-14 19:40:12 +01:00
Igor Lobanov
f9bc1d67ae random top articles query (#109)
* loadRandomTopShouts

* minor fixes

---------

Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-12-13 23:56:01 +01:00
025019b544 feat: add ACME location
Some checks failed
Deploy / push_to_target_repository (push) Failing after 4m25s
2023-11-28 14:20:19 -03:00
a862a11c91 Revert "some-fixes"
Some checks failed
Deploy / push_to_target_repository (push) Failing after 4m33s
This reverts commit f3d86daea7.
2023-11-24 12:54:01 +03:00
f3d86daea7 some-fixes 2023-11-24 05:19:25 +03:00
296716397e Merge branch 'main' of github.com:Discours/discours-backend 2023-11-24 00:38:20 +03:00
Ilya Y
b63b6e7ee7 timezones fixed once again (#107)
Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-11-14 14:56:41 +03:00
Igor Lobanov
34e18317a2 google oauth fix 2023-11-08 19:24:38 +01:00
Igor Lobanov
a2b47dab66 google oauth fix 2023-11-08 19:19:20 +01:00
Ilya Y
0e9f0b0682 Feature/google oauth (#106)
google oauth
---------

Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-11-08 21:12:55 +03:00
Ilya Y
2679b2c873 debug code removed (#105)
Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-11-06 12:03:04 +03:00
Ilya Y
0da4e110c1 test article (#104)
Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-11-04 19:44:58 +03:00
Ilya Y
21316187e0 Fix/deploy fix (#103)
* deploy fix
---------

Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-10-31 18:48:00 +03:00
Kosta
7f22966b41 Merge pull request #102 from Discours/feature/my_feed
my feed query fixed
2023-10-31 15:54:10 +02:00
Igor Lobanov
34d04e4240 my feed query fixed 2023-10-31 14:52:58 +01:00
Kosta
d7dd79336b Merge pull request #101 from Discours/feature/lint3
configured isort, black, flake8
2023-10-30 23:40:32 +02:00
Igor Lobanov
eaca3d613d build fix 2023-10-30 22:34:59 +01:00
Igor Lobanov
756a80151a build fix 2023-10-30 22:32:04 +01:00
Igor Lobanov
4395e3a72d build fix 2023-10-30 22:09:04 +01:00
Igor Lobanov
441bcc1e90 configured isort, black, flake8 2023-10-30 22:00:55 +01:00
Kosta
17c29c7f4f Merge pull request #100 from Discours/revert-99-feature/lint
Revert "Feature/lint"
2023-10-27 00:09:00 +03:00
Kosta
b142949805 Revert "Feature/lint" 2023-10-27 00:07:35 +03:00
Kosta
05136699ee Merge pull request #99 from Discours/feature/lint
Feature/lint
2023-10-26 23:48:25 +03:00
Igor Lobanov
c2cc428abe lint 2023-10-26 22:38:31 +02:00
Igor Lobanov
1c49780cd4 lint wip 2023-10-26 20:05:32 +02:00
Igor Lobanov
54457cb9c5 lint wip 2023-10-26 19:57:17 +02:00
Igor Lobanov
2c524279f6 lint wip 2023-10-26 19:56:42 +02:00
44bd146bdf Merge pull request #97 from Discours/feature/thumbor2
Feature/thumbor2
2023-10-26 00:55:36 +03:00
Igor Lobanov
9e3306fc3d Merge remote-tracking branch 'origin/main' into feature/thumbor2 2023-10-25 23:48:51 +02:00
Igor Lobanov
3389c5ce20 https for CDN, separate dir for non-image files 2023-10-25 23:48:16 +02:00
b71210a644 Merge pull request #96 from Discours/feature/thumbor
new thumbor (assets.discours.io -> images.discours.io), new visibilit…
2023-10-25 23:50:59 +03:00
Igor Lobanov
c8a951594c new thumbor (assets.discours.io -> images.discours.io), new visibility for non published articles (authors -> community) 2023-10-25 22:38:22 +02:00
Ilya Y
da8ee9b9c3 signIn/getSession optimization (#95)
Co-authored-by: Igor Lobanov <igor.lobanov@onetwotrip.com>
2023-10-19 17:54:38 +03:00
22c42839c1 new-sigil 2023-10-09 18:12:47 +03:00
4fd90e305f Merge branch 'main' of https://github.com/Discours/discours-backend 2023-10-09 10:32:33 +03:00
6dfec6714a Merge branch 'main' of https://github.com/Discours/discours-backend 2023-10-05 20:22:48 +03:00
2c72189055 lintbump 2023-09-28 15:51:28 +03:00
85 changed files with 3732 additions and 2092 deletions

View File

@@ -1,6 +0,0 @@
[flake8]
ignore = E203,W504,W191,W503
exclude = .git,__pycache__,orm/rbac.py
max-complexity = 10
max-line-length = 108
indent-string = ' '

31
.gitea/workflows/main.yml Normal file

@@ -0,0 +1,31 @@
name: 'Deploy to discoursio-api'
on:
push:
branches:
- main
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- name: Cloning repo
uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Get Repo Name
id: repo_name
run: echo "::set-output name=repo::$(echo ${GITHUB_REPOSITORY##*/})"
- name: Get Branch Name
id: branch_name
run: echo "::set-output name=branch::$(echo ${GITHUB_REF##*/})"
- name: Push to dokku
uses: dokku/github-action@master
with:
branch: 'main'
git_remote_url: 'ssh://dokku@v2.discours.io:22/discoursio-api'
ssh_private_key: ${{ secrets.SSH_PRIVATE_KEY }}

16
.github/workflows/checks.yml vendored Normal file

@@ -0,0 +1,16 @@
name: Checks
on: [pull_request]
jobs:
build:
runs-on: ubuntu-latest
name: Checks
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.10.6
- run: pip install --upgrade pip
- run: pip install -r requirements.txt
- run: pip install -r requirements-dev.txt
- run: ./checks.sh

8
.gitignore vendored

@@ -147,3 +147,11 @@ migration/content/**/*.md
*.csv
dev-server.pid
backups/
.ruff_cache
.venv
poetry.lock
.devcontainer/devcontainer.json
localhost-key.pem
.gitignore
discoursio.db
localhost.pem

View File

@@ -6,11 +6,11 @@ exclude: |
)
default_language_version:
python: python3.8
python: python3.10
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.2.0
rev: v4.5.0
hooks:
- id: check-added-large-files
- id: check-case-conflict
@@ -21,24 +21,24 @@ repos:
- id: check-yaml
- id: end-of-file-fixer
- id: trailing-whitespace
- id: requirements-txt-fixer
- repo: https://github.com/timothycrosley/isort
rev: 5.5.3
rev: 5.12.0
hooks:
- id: isort
- repo: https://github.com/ambv/black
rev: 20.8b1
rev: 23.10.1
hooks:
- id: black
args:
- --line-length=100
- --skip-string-normalization
- repo: https://gitlab.com/pycqa/flake8
rev: 3.8.3
- repo: https://github.com/PyCQA/flake8
rev: 6.1.0
hooks:
- id: flake8
args:
- --max-line-length=100
- --disable=protected-access
# - repo: https://github.com/python/mypy
# rev: v1.6.1
# hooks:
# - id: mypy

2
CHECKS

@@ -1,5 +1,5 @@
WAIT=10
TIMEOUT=10
ATTEMPTS=10
ATTEMPTS=3
/

View File

@@ -1,9 +1,11 @@
FROM python:3.10
FROM python:3.11-slim
WORKDIR /app
EXPOSE 8080
ADD nginx.conf.sigil ./
RUN /usr/local/bin/python -m pip install --upgrade pip
WORKDIR /usr/src/app
COPY requirements.txt ./
COPY requirements.txt .
RUN apt update && apt install -y git gcc curl postgresql
RUN pip install -r requirements.txt
COPY . .
CMD python server.py

View File

@@ -1,2 +0,0 @@
web: python server.py

View File

@@ -7,10 +7,6 @@
- starlette
- uvicorn
# Local development
Install deps first
on osx
```
brew install redis nginx postgres
@@ -22,16 +18,23 @@ on debian/ubuntu
apt install redis nginx
```
First, install Postgres. Then you'll need some data, so migrate it:
# Local development
Install deps first
```
createdb discoursio
python server.py migrate
pip install -r requirements.txt
pip install -r requirements-dev.txt
pre-commit install
```
Then run nginx, redis and API server
Create database from backup
```
./restdb.sh
```
Start local server
```
redis-server
pip install -r requirements.txt
python3 server.py dev
```
@@ -42,4 +45,3 @@ Put the header 'Authorization' with token from signIn query or registerUser muta
# How to debug Ackee
Set ACKEE_TOKEN var
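The hunk header above keeps the README's note about passing the session token ("Put the header 'Authorization' with token from signIn query or registerUser mutation"). A minimal client sketch of that, assuming the API listens on localhost:8080 and with an illustrative query name:

```
# Hypothetical client call; the endpoint, port and query are assumptions.
import requests

resp = requests.post(
    "http://localhost:8080",
    json={"query": "{ loadAuthorsBy(by: {}, limit: 1) { slug } }"},  # illustrative
    headers={"Authorization": "<token from signIn>"},
)
print(resp.json())
```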

View File

@@ -1,75 +0,0 @@
import re
import nltk
from bs4 import BeautifulSoup
from nltk.corpus import stopwords
from pymystem3 import Mystem
from string import punctuation
from transformers import BertTokenizer
nltk.download("stopwords")
def get_clear_text(text):
soup = BeautifulSoup(text, 'html.parser')
# extract the plain text from the HTML document without tags
clear_text = ''
for tag in soup.find_all():
clear_text += tag.string or ''
clear_text = re.sub(pattern='[\u202F\u00A0\n]+', repl=' ', string=clear_text)
# only words
clear_text = re.sub(pattern='[^A-ZА-ЯЁ -]', repl='', string=clear_text, flags=re.IGNORECASE)
clear_text = re.sub(pattern='\s+', repl=' ', string=clear_text)
clear_text = clear_text.lower()
mystem = Mystem()
russian_stopwords = stopwords.words("russian")
tokens = mystem.lemmatize(clear_text)
tokens = [token for token in tokens if token not in russian_stopwords \
and token != " " \
and token.strip() not in punctuation]
clear_text = " ".join(tokens)
return clear_text
# if __name__ == '__main__':
#
# # initialize the tokenizer with the pre-trained BERT model and vocabulary
# tokenizer = BertTokenizer.from_pretrained('bert-base-multilingual-cased')
#
# # split each text into smaller segments of maximum length 512
# max_length = 512
# segmented_texts = []
# for text in [clear_text1, clear_text2]:
# segmented_text = []
# for i in range(0, len(text), max_length):
# segment = text[i:i+max_length]
# segmented_text.append(segment)
# segmented_texts.append(segmented_text)
#
# # tokenize each segment using the BERT tokenizer
# tokenized_texts = []
# for segmented_text in segmented_texts:
# tokenized_text = []
# for segment in segmented_text:
# segment_tokens = tokenizer.tokenize(segment)
# segment_tokens = ['[CLS]'] + segment_tokens + ['[SEP]']
# tokenized_text.append(segment_tokens)
# tokenized_texts.append(tokenized_text)
#
# input_ids = []
# for tokenized_text in tokenized_texts:
# input_id = []
# for segment_tokens in tokenized_text:
# segment_id = tokenizer.convert_tokens_to_ids(segment_tokens)
# input_id.append(segment_id)
# input_ids.append(input_id)
#
# print(input_ids)

View File

@@ -1,10 +1,9 @@
from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from sqlalchemy import engine_from_config, pool
from alembic import context
from base.orm import Base
from settings import DB_URL
# this is the Alembic Config object, which provides
@@ -19,7 +18,6 @@ config.set_section_option(config.config_ini_section, "DB_URL", DB_URL)
if config.config_file_name is not None:
fileConfig(config.config_file_name)
from base.orm import Base
target_metadata = [Base.metadata]
# other values from the config, defined by the needs of env.py,
@@ -66,9 +64,7 @@ def run_migrations_online() -> None:
)
with connectable.connect() as connection:
context.configure(
connection=connection, target_metadata=target_metadata
)
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
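Migrations wired through this env.py are normally driven by the Alembic CLI; the same thing can be done from Python. A sketch, with the config path as an assumption:

```
# Sketch only: drives the online migrations configured above.
from alembic import command
from alembic.config import Config

cfg = Config("alembic.ini")   # assumed location of the Alembic config
command.upgrade(cfg, "head")  # apply all pending revisions
```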

View File

@@ -7,12 +7,12 @@ Create Date: 2023-08-19 01:37:57.031933
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# import sqlalchemy as sa
# from alembic import op
# revision identifiers, used by Alembic.
revision: str = 'fe943b098418'
revision: str = "fe943b098418"
down_revision: Union[str, None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None

View File

@@ -2,75 +2,71 @@ from functools import wraps
from typing import Optional, Tuple
from graphql.type import GraphQLResolveInfo
from sqlalchemy.orm import joinedload, exc
from sqlalchemy.orm import exc, joinedload
from starlette.authentication import AuthenticationBackend
from starlette.requests import HTTPConnection
from auth.credentials import AuthCredentials, AuthUser
from base.orm import local_session
from orm.user import User, Role
from settings import SESSION_TOKEN_HEADER
from auth.tokenstorage import SessionToken
from base.exceptions import OperationNotAllowed
from base.orm import local_session
from orm.user import Role, User
from settings import SESSION_TOKEN_HEADER
class JWTAuthenticate(AuthenticationBackend):
async def authenticate(
self, request: HTTPConnection
) -> Optional[Tuple[AuthCredentials, AuthUser]]:
if SESSION_TOKEN_HEADER not in request.headers:
return AuthCredentials(scopes={}), AuthUser(user_id=None, username='')
return AuthCredentials(scopes={}), AuthUser(user_id=None, username="")
token = request.headers.get(SESSION_TOKEN_HEADER)
if not token:
print("[auth.authenticate] no token in header %s" % SESSION_TOKEN_HEADER)
return AuthCredentials(scopes={}, error_message=str("no token")), AuthUser(
user_id=None, username=''
user_id=None, username=""
)
if len(token.split('.')) > 1:
if len(token.split(".")) > 1:
payload = await SessionToken.verify(token)
with local_session() as session:
try:
user = (
session.query(User).options(
session.query(User)
.options(
joinedload(User.roles).options(joinedload(Role.permissions)),
joinedload(User.ratings)
).filter(
User.id == payload.user_id
).one()
joinedload(User.ratings),
)
.filter(User.id == payload.user_id)
.one()
)
scopes = {} # TODO: integrate await user.get_permission()
return (
AuthCredentials(
user_id=payload.user_id,
scopes=scopes,
logged_in=True
),
AuthUser(user_id=user.id, username=''),
AuthCredentials(user_id=payload.user_id, scopes=scopes, logged_in=True),
AuthUser(user_id=user.id, username=""),
)
except exc.NoResultFound:
pass
return AuthCredentials(scopes={}, error_message=str('Invalid token')), AuthUser(user_id=None, username='')
return AuthCredentials(scopes={}, error_message=str("Invalid token")), AuthUser(
user_id=None, username=""
)
def login_required(func):
@wraps(func)
async def wrap(parent, info: GraphQLResolveInfo, *args, **kwargs):
# print('[auth.authenticate] login required for %r with info %r' % (func, info)) # debug only
# debug only
# print('[auth.authenticate] login required for %r with info %r' % (func, info))
auth: AuthCredentials = info.context["request"].auth
# print(auth)
if not auth or not auth.logged_in:
# raise Unauthorized(auth.error_message or "Please login")
return {
"error": "Please login first"
}
return {"error": "Please login first"}
return await func(parent, info, *args, **kwargs)
return wrap
@@ -79,7 +75,9 @@ def login_required(func):
def permission_required(resource, operation, func):
@wraps(func)
async def wrap(parent, info: GraphQLResolveInfo, *args, **kwargs):
print('[auth.authenticate] permission_required for %r with info %r' % (func, info)) # debug only
print(
"[auth.authenticate] permission_required for %r with info %r" % (func, info)
) # debug only
auth: AuthCredentials = info.context["request"].auth
if not auth.logged_in:
raise OperationNotAllowed(auth.error_message or "Please login")
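For context, a resolver guarded by the decorator above would look roughly like this; the resolver itself is illustrative, not from the diff:

```
# Illustrative usage of login_required on an async GraphQL resolver.
@login_required
async def my_profile(_, info):
    auth: AuthCredentials = info.context["request"].auth
    # unauthenticated calls never reach this point; the wrapper
    # already returned {"error": "Please login first"}
    return {"user_id": auth.user_id}
```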

View File

@@ -23,13 +23,11 @@ class AuthCredentials(BaseModel):
async def permissions(self) -> List[Permission]:
if self.user_id is None:
# raise Unauthorized("Please login first")
return {
"error": "Please login first"
}
return {"error": "Please login first"}
else:
# TODO: implement permissions logix
print(self.user_id)
return NotImplemented()
return NotImplemented
class AuthUser(BaseModel):
@@ -40,6 +38,6 @@ class AuthUser(BaseModel):
def is_authenticated(self) -> bool:
return self.user_id is not None
@property
def display_id(self) -> int:
return self.user_id
# @property
# def display_id(self) -> int:
# return self.user_id

View File

@@ -2,19 +2,16 @@ import requests
from settings import MAILGUN_API_KEY, MAILGUN_DOMAIN
api_url = "https://api.mailgun.net/v3/%s/messages" % (MAILGUN_DOMAIN or 'discours.io')
noreply = "discours.io <noreply@%s>" % (MAILGUN_DOMAIN or 'discours.io')
lang_subject = {
"ru": "Подтверждение почты",
"en": "Confirm email"
}
api_url = "https://api.mailgun.net/v3/%s/messages" % (MAILGUN_DOMAIN or "discours.io")
noreply = "discours.io <noreply@%s>" % (MAILGUN_DOMAIN or "discours.io")
lang_subject = {"ru": "Подтверждение почты", "en": "Confirm email"}
async def send_auth_email(user, token, lang="ru", template="email_confirmation"):
try:
to = "%s <%s>" % (user.name, user.email)
if lang not in ['ru', 'en']:
lang = 'ru'
if lang not in ["ru", "en"]:
lang = "ru"
subject = lang_subject.get(lang, lang_subject["en"])
template = template + "_" + lang
payload = {
@@ -22,16 +19,12 @@ async def send_auth_email(user, token, lang="ru", template="email_confirmation")
"to": to,
"subject": subject,
"template": template,
"h:X-Mailgun-Variables": "{ \"token\": \"%s\" }" % token
"h:X-Mailgun-Variables": '{ "token": "%s" }' % token,
}
print('[auth.email] payload: %r' % payload)
print("[auth.email] payload: %r" % payload)
# debug
# print('http://localhost:3000/?modal=auth&mode=confirm-email&token=%s' % token)
response = requests.post(
api_url,
auth=("api", MAILGUN_API_KEY),
data=payload
)
response = requests.post(api_url, auth=("api", MAILGUN_API_KEY), data=payload)
response.raise_for_status()
except Exception as e:
print(e)

View File

@@ -3,14 +3,13 @@ from hashlib import sha256
from jwt import DecodeError, ExpiredSignatureError
from passlib.hash import bcrypt
from sqlalchemy import or_
from auth.jwtcodec import JWTCodec
from auth.tokenstorage import TokenStorage
# from base.exceptions import InvalidPassword, InvalidToken
from base.orm import local_session
from orm import User
from validations.auth import AuthInput
class Password:
@@ -34,8 +33,8 @@ class Password:
Verify that password hash is equal to specified hash. Hash format:
$2a$10$Ro0CUfOqk6cXEKf3dyaM7OhSCvnwM9s4wIX9JeLapehKK5YdLxKcm
\__/\/ \____________________/\_____________________________/
 |  |            Salt                        Hash
 __ __ ____________________________________________________   # noqa: W605
|  |  |         Salt (22)      |             Hash
 |  Cost
 Version
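The diagram describes standard bcrypt output; a minimal sketch with passlib (imported at the top of this module) showing the same pieces:

```
from passlib.hash import bcrypt

hashed = bcrypt.using(rounds=10).hash("s3cret")
# e.g. "$2b$10$" + 22-char salt + 31-char hash: version, cost, salt, hash
assert bcrypt.verify("s3cret", hashed)
```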
@@ -57,60 +56,41 @@ class Identity:
user = User(**orm_user.dict())
if not user.password:
# raise InvalidPassword("User password is empty")
return {
"error": "User password is empty"
}
return {"error": "User password is empty"}
if not Password.verify(password, user.password):
# raise InvalidPassword("Wrong user password")
return {
"error": "Wrong user password"
}
return {"error": "Wrong user password"}
return user
@staticmethod
def oauth(inp: AuthInput) -> User:
def oauth(inp) -> User:
with local_session() as session:
user = (
session.query(User)
.filter(or_(User.oauth == inp["oauth"], User.email == inp["email"]))
.first()
)
user = session.query(User).filter(User.email == inp["email"]).first()
if not user:
user = User.create(**inp)
if not user.oauth:
user.oauth = inp["oauth"]
user = User.create(**inp, emailConfirmed=True)
session.commit()
user = User(**user.dict())
return user
@staticmethod
async def onetime(token: str) -> User:
try:
print('[auth.identity] using one time token')
print("[auth.identity] using one time token")
payload = JWTCodec.decode(token)
if not await TokenStorage.exist(f"{payload.user_id}-{payload.username}-{token}"):
# raise InvalidToken("Login token has expired, please login again")
return {
"error": "Token has expired"
}
return {"error": "Token has expired"}
except ExpiredSignatureError:
# raise InvalidToken("Login token has expired, please try again")
return {
"error": "Token has expired"
}
return {"error": "Token has expired"}
except DecodeError:
# raise InvalidToken("token format error") from e
return {
"error": "Token format error"
}
return {"error": "Token format error"}
with local_session() as session:
user = session.query(User).filter_by(id=payload.user_id).first()
if not user:
# raise Exception("user not exist")
return {
"error": "User does not exist"
}
return {"error": "User does not exist"}
if not user.emailConfirmed:
user.emailConfirmed = True
session.commit()

View File

@@ -1,8 +1,10 @@
from datetime import datetime, timezone
import jwt
from base.exceptions import ExpiredToken, InvalidToken
from validations.auth import TokenPayload, AuthInput
from settings import JWT_ALGORITHM, JWT_SECRET_KEY
from validations.auth import AuthInput, TokenPayload
class JWTCodec:
@@ -13,12 +15,12 @@ class JWTCodec:
"username": user.email or user.phone,
"exp": exp,
"iat": datetime.now(tz=timezone.utc),
"iss": "discours"
"iss": "discours",
}
try:
return jwt.encode(payload, JWT_SECRET_KEY, JWT_ALGORITHM)
except Exception as e:
print('[auth.jwtcodec] JWT encode error %r' % e)
print("[auth.jwtcodec] JWT encode error %r" % e)
@staticmethod
def decode(token: str, verify_exp: bool = True) -> TokenPayload:
@@ -33,18 +35,18 @@ class JWTCodec:
# "verify_signature": False
},
algorithms=[JWT_ALGORITHM],
issuer="discours"
issuer="discours",
)
r = TokenPayload(**payload)
# print('[auth.jwtcodec] debug token %r' % r)
return r
except jwt.InvalidIssuedAtError:
print('[auth.jwtcodec] invalid issued at: %r' % payload)
raise ExpiredToken('check token issued time')
print("[auth.jwtcodec] invalid issued at: %r" % payload)
raise ExpiredToken("check token issued time")
except jwt.ExpiredSignatureError:
print('[auth.jwtcodec] expired signature %r' % payload)
raise ExpiredToken('check token lifetime')
print("[auth.jwtcodec] expired signature %r" % payload)
raise ExpiredToken("check token lifetime")
except jwt.InvalidTokenError:
raise InvalidToken('token is not valid')
raise InvalidToken("token is not valid")
except jwt.InvalidSignatureError:
raise InvalidToken('token is not valid')
raise InvalidToken("token is not valid")
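Round-tripping a token through the codec; a sketch, noting that the full signature of encode sits outside this hunk:

```
# Illustrative round-trip; `user` is any object with .email/.phone/.id.
from datetime import datetime, timedelta, timezone

exp = datetime.now(tz=timezone.utc) + timedelta(days=30)
token = JWTCodec.encode(user, exp)   # exp passed positionally is an assumption
payload = JWTCodec.decode(token)     # raises ExpiredToken / InvalidToken on bad input
print(payload.username)
```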

View File

@@ -1,8 +1,9 @@
from authlib.integrations.starlette_client import OAuth
from starlette.responses import RedirectResponse
from auth.identity import Identity
from auth.tokenstorage import TokenStorage
from settings import OAUTH_CLIENTS, FRONTEND_URL
from settings import FRONTEND_URL, OAUTH_CLIENTS
oauth = OAuth()
@@ -36,12 +37,19 @@ oauth.register(
client_secret=OAUTH_CLIENTS["GOOGLE"]["key"],
server_metadata_url="https://accounts.google.com/.well-known/openid-configuration",
client_kwargs={"scope": "openid email profile"},
authorize_state="test",
)
async def google_profile(client, request, token):
profile = await client.parse_id_token(request, token)
profile["id"] = profile["sub"]
userinfo = token["userinfo"]
profile = {"name": userinfo["name"], "email": userinfo["email"], "id": userinfo["sub"]}
if userinfo["picture"]:
userpic = userinfo["picture"].replace("=s96", "=s600")
profile["userpic"] = userpic
return profile
@@ -81,6 +89,7 @@ async def oauth_authorize(request):
"oauth": user_oauth_info,
"email": profile["email"],
"username": profile["name"],
"userpic": profile["userpic"],
}
user = Identity.oauth(user_input)
session_token = await TokenStorage.create_session(user)

View File

@@ -1,9 +1,9 @@
from datetime import datetime, timedelta, timezone
from auth.jwtcodec import JWTCodec
from validations.auth import AuthInput
from base.redis import redis
from settings import SESSION_TOKEN_LIFE_SPAN, ONETIME_TOKEN_LIFE_SPAN
from settings import ONETIME_TOKEN_LIFE_SPAN, SESSION_TOKEN_LIFE_SPAN
from validations.auth import AuthInput
async def save(token_key, life_span, auto_delete=True):
@@ -35,7 +35,7 @@ class SessionToken:
class TokenStorage:
@staticmethod
async def get(token_key):
print('[tokenstorage.get] ' + token_key)
print("[tokenstorage.get] " + token_key)
# 2041-user@domain.zn-eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoyMDQxLCJ1c2VybmFtZSI6ImFudG9uLnJld2luK3Rlc3QtbG9hZGNoYXRAZ21haWwuY29tIiwiZXhwIjoxNjcxNzgwNjE2LCJpYXQiOjE2NjkxODg2MTYsImlzcyI6ImRpc2NvdXJzIn0.Nml4oV6iMjMmc6xwM7lTKEZJKBXvJFEIZ-Up1C1rITQ
return await redis.execute("GET", token_key)

View File

@@ -1,8 +1,8 @@
from graphql.error import GraphQLError
# TODO: remove traceback from logs for defined exceptions
class BaseHttpException(GraphQLError):
code = 500
message = "500 Server error"

View File

@@ -1,15 +1,13 @@
from typing import TypeVar, Any, Dict, Generic, Callable
from typing import Any, Callable, Dict, Generic, TypeVar
from sqlalchemy import create_engine, Column, Integer
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import Session
from sqlalchemy.sql.schema import Table
from settings import DB_URL
engine = create_engine(
DB_URL, echo=False, pool_size=10, max_overflow=20
)
engine = create_engine(DB_URL, echo=False, pool_size=10, max_overflow=20)
T = TypeVar("T")
@@ -20,7 +18,10 @@ def local_session():
return Session(bind=engine, expire_on_commit=False)
class Base(declarative_base()):
DeclarativeBase = declarative_base() # type: Any
class Base(DeclarativeBase):
__table__: Table
__tablename__: str
__new__: Callable
@@ -47,7 +48,7 @@ class Base(declarative_base()):
def update(self, input):
column_names = self.__table__.columns.keys()
for (name, value) in input.items():
for name, value in input.items():
if name in column_names:
setattr(self, name, value)
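Typical use of local_session() and the update() helper above; User stands in for any of the project's Base models, and the field names are illustrative:

```
from base.orm import local_session
from orm.user import User

with local_session() as session:
    user = session.query(User).filter(User.id == 1).first()
    if user:
        # keys that are not column names are silently skipped by update()
        user.update({"username": "new-name", "not_a_column": 1})
        session.commit()
```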

View File

@@ -1,43 +1,61 @@
from aioredis import from_url
from asyncio import sleep
import redis.asyncio as aredis
from settings import REDIS_URL
import logging
logger = logging.getLogger("[services.redis] ")
logger.setLevel(logging.DEBUG)
class RedisCache:
def __init__(self, uri=REDIS_URL):
self._uri: str = uri
self._instance = None
self.pubsub_channels = []
self._client = None
async def connect(self):
if self._instance is not None:
return
self._instance = await from_url(self._uri, encoding="utf-8")
# print(self._instance)
self._client = aredis.Redis.from_url(self._uri, decode_responses=True)
async def disconnect(self):
if self._instance is None:
return
await self._instance.close()
# await self._instance.wait_closed() # deprecated
self._instance = None
if self._client:
await self._client.close()
async def execute(self, command, *args, **kwargs):
while not self._instance:
await sleep(1)
try:
# print("[redis] " + command + ' ' + ' '.join(args))
return await self._instance.execute_command(command, *args, **kwargs)
except Exception:
pass
if self._client:
try:
logger.debug(f"{command} {args} {kwargs}")
r = await self._client.execute_command(command, *args, **kwargs)
logger.debug(type(r))
logger.debug(r)
return r
except Exception as e:
logger.error(e)
async def subscribe(self, *channels):
if self._client:
async with self._client.pubsub() as pubsub:
for channel in channels:
await pubsub.subscribe(channel)
self.pubsub_channels.append(channel)
async def unsubscribe(self, *channels):
if not self._client:
return
async with self._client.pubsub() as pubsub:
for channel in channels:
await pubsub.unsubscribe(channel)
self.pubsub_channels.remove(channel)
async def publish(self, channel, data):
if not self._client:
return
await self._client.publish(channel, data)
async def mget(self, *keys):
return await self.execute('MGET', *keys)
async def lrange(self, key, start, stop):
# print(f"[redis] LRANGE {key} {start} {stop}")
return await self._instance.lrange(key, start, stop)
async def mget(self, key, *keys):
# print(f"[redis] MGET {key} {keys}")
return await self._instance.mget(key, *keys)
return await self.execute('LRANGE', key, start, stop)
redis = RedisCache()
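Exercising the rewritten client; a minimal sketch, assuming a reachable Redis at REDIS_URL and an asyncio entrypoint:

```
import asyncio

async def demo():
    await redis.connect()
    await redis.execute("SET", "greeting", "hello")
    print(await redis.execute("GET", "greeting"))    # "hello" (decode_responses=True)
    print(await redis.mget("greeting", "missing"))   # ["hello", None]
    await redis.disconnect()

asyncio.run(demo())
```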

10
checks.sh Executable file

@@ -0,0 +1,10 @@
#!/usr/bin/env bash
echo "> isort"
isort .
echo "> black"
black .
echo "> flake8"
flake8 .
# echo "> mypy"
# mypy .

1
generate_gql_types.sh Executable file

@@ -0,0 +1 @@
python -m gql_schema_codegen -p ./schema.graphql -t ./schema_types.py

16
lint.sh

@@ -1,16 +0,0 @@
#!/usr/bin/env bash
set -e
find . -name "*.py[co]" -o -name __pycache__ -exec rm -rf {} +
#rm -rf .mypy_cache
echo "> isort"
isort --gitignore --settings-file=setup.cfg .
echo "> brunette"
brunette --config=setup.cfg .
echo "> flake8"
flake8 --config=setup.cfg .
echo "> mypy"
mypy --config-file=setup.cfg .
echo "> prettyjson"
python3 -m scripts.prettyjson

29
main.py

@@ -2,6 +2,7 @@ import asyncio
import os
from importlib import import_module
from os.path import exists
from ariadne import load_schema_from_path, make_executable_schema
from ariadne.asgi import GraphQL
from starlette.applications import Starlette
@@ -9,23 +10,23 @@ from starlette.middleware import Middleware
from starlette.middleware.authentication import AuthenticationMiddleware
from starlette.middleware.sessions import SessionMiddleware
from starlette.routing import Route
from orm import init_tables
from auth.authenticate import JWTAuthenticate
from auth.oauth import oauth_login, oauth_authorize
from auth.oauth import oauth_authorize, oauth_login
from base.redis import redis
from base.resolvers import resolvers
from resolvers.auth import confirm_email_handler
from orm import init_tables
from resolvers.upload import upload_handler
from services.main import storages_init
from services.notifications.notification_service import notification_service
from services.notifications.sse import sse_subscribe_handler
from services.stat.viewed import ViewedStorage
# from services.zine.gittask import GitTask
from settings import DEV_SERVER_PID_FILE_NAME, SENTRY_DSN, SESSION_SECRET_KEY
from services.notifications.sse import sse_subscribe_handler
import_module("resolvers")
schema = make_executable_schema(load_schema_from_path("schema.graphql"), resolvers) # type: ignore
schema = make_executable_schema(load_schema_from_path("schema.graphql"), resolvers)
middleware = [
Middleware(AuthenticationMiddleware, backend=JWTAuthenticate()),
@@ -46,9 +47,10 @@ async def start_up():
try:
import sentry_sdk
sentry_sdk.init(SENTRY_DSN)
except Exception as e:
print('[sentry] init error')
print("[sentry] init error")
print(e)
@@ -57,7 +59,7 @@ async def dev_start_up():
await redis.connect()
return
else:
with open(DEV_SERVER_PID_FILE_NAME, 'w', encoding='utf-8') as f:
with open(DEV_SERVER_PID_FILE_NAME, "w", encoding="utf-8") as f:
f.write(str(os.getpid()))
await start_up()
@@ -68,11 +70,9 @@ async def shutdown():
routes = [
# Route("/messages", endpoint=sse_messages),
Route("/oauth/{provider}", endpoint=oauth_login),
Route("/oauth-authorize", endpoint=oauth_authorize),
Route("/confirm/{token}", endpoint=confirm_email_handler),
Route("/upload", endpoint=upload_handler, methods=['POST']),
Route("/upload", endpoint=upload_handler, methods=["POST"]),
Route("/subscribe/{user_id}", endpoint=sse_subscribe_handler),
]
@@ -82,9 +82,7 @@ app = Starlette(
middleware=middleware,
routes=routes,
)
app.mount("/", GraphQL(
schema
))
app.mount("/", GraphQL(schema))
dev_app = Starlette(
debug=True,
@@ -93,7 +91,4 @@ dev_app = Starlette(
middleware=middleware,
routes=routes,
)
dev_app.mount("/", GraphQL(
schema,
debug=True
))
dev_app.mount("/", GraphQL(schema, debug=True))
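Either app can be served directly with an ASGI server; a sketch, assuming uvicorn (the project's server.py entrypoint is not part of this diff):

```
# Hypothetical runner; host and port are assumptions.
import uvicorn
from main import dev_app

uvicorn.run(dev_app, host="127.0.0.1", port=8080)
```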

View File

@@ -16,4 +16,3 @@ echo "Start migration"
python3 server.py migrate
if [ $? -ne 0 ]; then { echo "Migration failed, aborting." ; exit 1; } fi
echo 'Done!'

View File

@@ -12,10 +12,12 @@ from migration.tables.comments import migrate as migrateComment
from migration.tables.comments import migrate_2stage as migrateComment_2stage
from migration.tables.content_items import get_shout_slug
from migration.tables.content_items import migrate as migrateShout
from migration.tables.remarks import migrate as migrateRemark
# from migration.tables.remarks import migrate as migrateRemark
from migration.tables.topics import migrate as migrateTopic
from migration.tables.users import migrate as migrateUser, post_migrate as users_post_migrate
from migration.tables.users import migrate as migrateUser
from migration.tables.users import migrate_2stage as migrateUser_2stage
from migration.tables.users import post_migrate as users_post_migrate
from orm import init_tables
from orm.reaction import Reaction
@@ -63,16 +65,8 @@ async def topics_handle(storage):
del storage["topics"]["by_slug"][oldslug]
storage["topics"]["by_oid"][oid] = storage["topics"]["by_slug"][newslug]
print("[migration] " + str(counter) + " topics migrated")
print(
"[migration] "
+ str(len(storage["topics"]["by_oid"].values()))
+ " topics by oid"
)
print(
"[migration] "
+ str(len(storage["topics"]["by_slug"].values()))
+ " topics by slug"
)
print("[migration] " + str(len(storage["topics"]["by_oid"].values())) + " topics by oid")
print("[migration] " + str(len(storage["topics"]["by_slug"].values())) + " topics by slug")
async def shouts_handle(storage, args):
@@ -117,9 +111,10 @@ async def shouts_handle(storage, args):
# print main counter
counter += 1
print('[migration] shouts_handle %d: %s @%s' % (
(counter + 1), shout_dict["slug"], author["slug"]
))
print(
"[migration] shouts_handle %d: %s @%s"
% ((counter + 1), shout_dict["slug"], author["slug"])
)
b = bs4.BeautifulSoup(shout_dict["body"], "html.parser")
texts = [shout_dict["title"].lower().replace(r"[^а-яА-Яa-zA-Z]", "")]
@@ -138,13 +133,13 @@ async def shouts_handle(storage, args):
print("[migration] " + str(anonymous_author) + " authored by @anonymous")
async def remarks_handle(storage):
print("[migration] comments")
c = 0
for entry_remark in storage["remarks"]["data"]:
remark = await migrateRemark(entry_remark, storage)
c += 1
print("[migration] " + str(c) + " remarks migrated")
# async def remarks_handle(storage):
# print("[migration] comments")
# c = 0
# for entry_remark in storage["remarks"]["data"]:
# remark = await migrateRemark(entry_remark, storage)
# c += 1
# print("[migration] " + str(c) + " remarks migrated")
async def comments_handle(storage):
@@ -155,9 +150,9 @@ async def comments_handle(storage):
for oldcomment in storage["reactions"]["data"]:
if not oldcomment.get("deleted"):
reaction = await migrateComment(oldcomment, storage)
if type(reaction) == str:
if isinstance(reaction, str):
missed_shouts[reaction] = oldcomment
elif type(reaction) == Reaction:
elif isinstance(reaction, Reaction):
reaction = reaction.dict()
rid = reaction["id"]
oid = reaction["oid"]
@@ -214,9 +209,7 @@ def data_load():
tags_data = json.loads(open("migration/data/tags.json").read())
storage["topics"]["tags"] = tags_data
print("[migration.load] " + str(len(tags_data)) + " tags ")
cats_data = json.loads(
open("migration/data/content_item_categories.json").read()
)
cats_data = json.loads(open("migration/data/content_item_categories.json").read())
storage["topics"]["cats"] = cats_data
print("[migration.load] " + str(len(cats_data)) + " cats ")
comments_data = json.loads(open("migration/data/comments.json").read())
@@ -235,11 +228,7 @@ def data_load():
storage["users"]["by_oid"][x["_id"]] = x
# storage['users']['by_slug'][x['slug']] = x
# no user.slug yet
print(
"[migration.load] "
+ str(len(storage["users"]["by_oid"].keys()))
+ " users by oid"
)
print("[migration.load] " + str(len(storage["users"]["by_oid"].keys())) + " users by oid")
for x in tags_data:
storage["topics"]["by_oid"][x["_id"]] = x
storage["topics"]["by_slug"][x["slug"]] = x
@@ -247,9 +236,7 @@ def data_load():
storage["topics"]["by_oid"][x["_id"]] = x
storage["topics"]["by_slug"][x["slug"]] = x
print(
"[migration.load] "
+ str(len(storage["topics"]["by_slug"].keys()))
+ " topics by slug"
"[migration.load] " + str(len(storage["topics"]["by_slug"].keys())) + " topics by slug"
)
for item in content_data:
slug = get_shout_slug(item)

View File

@@ -1,8 +1,9 @@
import gc
import json
import os
import bson
import gc
from .utils import DateTimeEncoder
@@ -15,10 +16,10 @@ def json_tables():
"email_subscriptions": [],
"users": [],
"comments": [],
"remarks": []
"remarks": [],
}
for table in data.keys():
print('[migration] bson2json for ' + table)
print("[migration] bson2json for " + table)
gc.collect()
lc = []
bs = open("dump/discours/" + table + ".bson", "rb").read()

View File

@@ -71,47 +71,29 @@ def export_slug(slug, storage):
def export_email_subscriptions():
email_subscriptions_data = json.loads(
open("migration/data/email_subscriptions.json").read()
)
email_subscriptions_data = json.loads(open("migration/data/email_subscriptions.json").read())
for data in email_subscriptions_data:
# TODO: migrate to mailgun list manually
# migrate_email_subscription(data)
pass
print(
"[migration] "
+ str(len(email_subscriptions_data))
+ " email subscriptions exported"
)
print("[migration] " + str(len(email_subscriptions_data)) + " email subscriptions exported")
def export_shouts(storage):
# update what was just migrated or load json again
if len(storage["users"]["by_slugs"].keys()) == 0:
storage["users"]["by_slugs"] = json.loads(
open(EXPORT_DEST + "authors.json").read()
)
print(
"[migration] "
+ str(len(storage["users"]["by_slugs"].keys()))
+ " exported authors "
)
storage["users"]["by_slugs"] = json.loads(open(EXPORT_DEST + "authors.json").read())
print("[migration] " + str(len(storage["users"]["by_slugs"].keys())) + " exported authors ")
if len(storage["shouts"]["by_slugs"].keys()) == 0:
storage["shouts"]["by_slugs"] = json.loads(
open(EXPORT_DEST + "articles.json").read()
)
storage["shouts"]["by_slugs"] = json.loads(open(EXPORT_DEST + "articles.json").read())
print(
"[migration] "
+ str(len(storage["shouts"]["by_slugs"].keys()))
+ " exported articles "
"[migration] " + str(len(storage["shouts"]["by_slugs"].keys())) + " exported articles "
)
for slug in storage["shouts"]["by_slugs"].keys():
export_slug(slug, storage)
def export_json(
export_articles={}, export_authors={}, export_topics={}, export_comments={}
):
def export_json(export_articles={}, export_authors={}, export_topics={}, export_comments={}):
open(EXPORT_DEST + "authors.json", "w").write(
json.dumps(
export_authors,
@@ -152,8 +134,4 @@ def export_json(
ensure_ascii=False,
)
)
print(
"[migration] "
+ str(len(export_comments.items()))
+ " exported articles with comments"
)
print("[migration] " + str(len(export_comments.items())) + " exported articles with comments")

View File

@@ -1,17 +1,14 @@
import base64
import os
import re
import uuid
from bs4 import BeautifulSoup
TOOLTIP_REGEX = r"(\/\/\/(.+)\/\/\/)"
contentDir = os.path.join(
os.path.dirname(os.path.realpath(__file__)), "..", "..", "discoursio-web", "content"
)
s3 = "https://discours-io.s3.amazonaws.com/"
cdn = "https://assets.discours.io"
cdn = "https://images.discours.io"
def replace_tooltips(body):
@@ -27,76 +24,79 @@ def replace_tooltips(body):
return newbody
def extract_footnotes(body, shout_dict):
parts = body.split("&&&")
lll = len(parts)
newparts = list(parts)
placed = False
if lll & 1:
if lll > 1:
i = 1
print("[extract] found %d footnotes in body" % (lll - 1))
for part in parts[1:]:
if i & 1:
placed = True
if 'a class="footnote-url" href=' in part:
print("[extract] footnote: " + part)
fn = 'a class="footnote-url" href="'
exxtracted_link = part.split(fn, 1)[1].split('"', 1)[0]
extracted_body = part.split(fn, 1)[1].split('>', 1)[1].split('</a>', 1)[0]
print("[extract] footnote link: " + extracted_link)
with local_session() as session:
Reaction.create({
"shout": shout_dict['id'],
"kind": ReactionKind.FOOTNOTE,
"body": extracted_body,
"range": str(body.index(fn + link) - len('<')) + ':' + str(body.index(extracted_body) + len('</a>'))
})
newparts[i] = "<a href='#'></a>"
else:
newparts[i] = part
i += 1
return ("".join(newparts), placed)
# def extract_footnotes(body, shout_dict):
# parts = body.split("&&&")
# lll = len(parts)
# newparts = list(parts)
# placed = False
# if lll & 1:
# if lll > 1:
# i = 1
# print("[extract] found %d footnotes in body" % (lll - 1))
# for part in parts[1:]:
# if i & 1:
# placed = True
# if 'a class="footnote-url" href=' in part:
# print("[extract] footnote: " + part)
# fn = 'a class="footnote-url" href="'
# exxtracted_link = part.split(fn, 1)[1].split('"', 1)[0]
# extracted_body = part.split(fn, 1)[1].split(">", 1)[1].split("</a>", 1)[0]
# print("[extract] footnote link: " + extracted_link)
# with local_session() as session:
# Reaction.create(
# {
# "shout": shout_dict["id"],
# "kind": ReactionKind.FOOTNOTE,
# "body": extracted_body,
# "range": str(body.index(fn + link) - len("<"))
# + ":"
# + str(body.index(extracted_body) + len("</a>")),
# }
# )
# newparts[i] = "<a href='#'></a>"
# else:
# newparts[i] = part
# i += 1
# return ("".join(newparts), placed)
def place_tooltips(body):
parts = body.split("&&&")
lll = len(parts)
newparts = list(parts)
placed = False
if lll & 1:
if lll > 1:
i = 1
print("[extract] found %d tooltips" % (lll - 1))
for part in parts[1:]:
if i & 1:
placed = True
if 'a class="footnote-url" href=' in part:
print("[extract] footnote: " + part)
fn = 'a class="footnote-url" href="'
link = part.split(fn, 1)[1].split('"', 1)[0]
extracted_part = (
part.split(fn, 1)[0] + " " + part.split("/", 1)[-1]
)
newparts[i] = (
"<Tooltip"
+ (' link="' + link + '" ' if link else "")
+ ">"
+ extracted_part
+ "</Tooltip>"
)
else:
newparts[i] = "<Tooltip>%s</Tooltip>" % part
# print('[extract] ' + newparts[i])
else:
# print('[extract] ' + part[:10] + '..')
newparts[i] = part
i += 1
return ("".join(newparts), placed)
# def place_tooltips(body):
# parts = body.split("&&&")
# lll = len(parts)
# newparts = list(parts)
# placed = False
# if lll & 1:
# if lll > 1:
# i = 1
# print("[extract] found %d tooltips" % (lll - 1))
# for part in parts[1:]:
# if i & 1:
# placed = True
# if 'a class="footnote-url" href=' in part:
# print("[extract] footnote: " + part)
# fn = 'a class="footnote-url" href="'
# link = part.split(fn, 1)[1].split('"', 1)[0]
# extracted_part = part.split(fn, 1)[0] + " " + part.split("/", 1)[-1]
# newparts[i] = (
# "<Tooltip"
# + (' link="' + link + '" ' if link else "")
# + ">"
# + extracted_part
# + "</Tooltip>"
# )
# else:
# newparts[i] = "<Tooltip>%s</Tooltip>" % part
# # print('[extract] ' + newparts[i])
# else:
# # print('[extract] ' + part[:10] + '..')
# newparts[i] = part
# i += 1
# return ("".join(newparts), placed)
IMG_REGEX = r"\!\[(.*?)\]\((data\:image\/(png|jpeg|jpg);base64\,((?:[A-Za-z\d+\/]{4})*(?:[A-Za-z\d+\/]{3}="
IMG_REGEX = (
r"\!\[(.*?)\]\((data\:image\/(png|jpeg|jpg);base64\,((?:[A-Za-z\d+\/]{4})*(?:[A-Za-z\d+\/]{3}="
)
IMG_REGEX += r"|[A-Za-z\d+\/]{2}==)))\)"
parentDir = "/".join(os.getcwd().split("/")[:-1])
@@ -104,29 +104,29 @@ public = parentDir + "/discoursio-web/public"
cache = {}
def reextract_images(body, oid):
# change if you prefer regexp
matches = list(re.finditer(IMG_REGEX, body, re.IGNORECASE | re.MULTILINE))[1:]
i = 0
for match in matches:
print("[extract] image " + match.group(1))
ext = match.group(3)
name = oid + str(i)
link = public + "/upload/image-" + name + "." + ext
img = match.group(4)
title = match.group(1) # NOTE: this is not the title
if img not in cache:
content = base64.b64decode(img + "==")
print(str(len(img)) + " image bytes been written")
open("../" + link, "wb").write(content)
cache[img] = name
i += 1
else:
print("[extract] image cached " + cache[img])
body.replace(
str(match), "![" + title + "](" + cdn + link + ")"
) # WARNING: this does not work
return body
# def reextract_images(body, oid):
# # change if you prefer regexp
# matches = list(re.finditer(IMG_REGEX, body, re.IGNORECASE | re.MULTILINE))[1:]
# i = 0
# for match in matches:
# print("[extract] image " + match.group(1))
# ext = match.group(3)
# name = oid + str(i)
# link = public + "/upload/image-" + name + "." + ext
# img = match.group(4)
# title = match.group(1) # NOTE: this is not the title
# if img not in cache:
# content = base64.b64decode(img + "==")
# print(str(len(img)) + " image bytes been written")
# open("../" + link, "wb").write(content)
# cache[img] = name
# i += 1
# else:
# print("[extract] image cached " + cache[img])
# body.replace(
# str(match), "![" + title + "](" + cdn + link + ")"
# ) # WARNING: this does not work
# return body
IMAGES = {
@@ -137,163 +137,11 @@ IMAGES = {
b64 = ";base64,"
def extract_imageparts(bodyparts, prefix):
# recursive loop
newparts = list(bodyparts)
for current in bodyparts:
i = bodyparts.index(current)
for mime in IMAGES.keys():
if mime == current[-len(mime) :] and (i + 1 < len(bodyparts)):
print("[extract] " + mime)
next = bodyparts[i + 1]
ext = IMAGES[mime]
b64end = next.index(")")
b64encoded = next[:b64end]
name = prefix + "-" + str(len(cache))
link = "/upload/image-" + name + "." + ext
print("[extract] name: " + name)
print("[extract] link: " + link)
print("[extract] %d bytes" % len(b64encoded))
if b64encoded not in cache:
try:
content = base64.b64decode(b64encoded + "==")
open(public + link, "wb").write(content)
print(
"[extract] "
+ str(len(content))
+ " image bytes been written"
)
cache[b64encoded] = name
except Exception:
raise Exception
# raise Exception('[extract] error decoding image %r' %b64encoded)
else:
print("[extract] cached link " + cache[b64encoded])
name = cache[b64encoded]
link = cdn + "/upload/image-" + name + "." + ext
newparts[i] = (
current[: -len(mime)]
+ current[-len(mime) :]
+ link
+ next[-b64end:]
)
newparts[i + 1] = next[:-b64end]
break
return (
extract_imageparts(
newparts[i] + newparts[i + 1] + b64.join(bodyparts[(i + 2) :]), prefix
)
if len(bodyparts) > (i + 1)
else "".join(newparts)
)
def extract_dataimages(parts, prefix):
newparts = list(parts)
for part in parts:
i = parts.index(part)
if part.endswith("]("):
[ext, rest] = parts[i + 1].split(b64)
name = prefix + "-" + str(len(cache))
if ext == "/jpeg":
ext = "jpg"
else:
ext = ext.replace("/", "")
link = "/upload/image-" + name + "." + ext
print("[extract] filename: " + link)
b64end = rest.find(")")
if b64end != -1:
b64encoded = rest[:b64end]
print("[extract] %d text bytes" % len(b64encoded))
# write if not cached
if b64encoded not in cache:
try:
content = base64.b64decode(b64encoded + "==")
open(public + link, "wb").write(content)
print("[extract] " + str(len(content)) + " image bytes")
cache[b64encoded] = name
except Exception:
raise Exception
# raise Exception('[extract] error decoding image %r' %b64encoded)
else:
print("[extract] 0 image bytes, cached for " + cache[b64encoded])
name = cache[b64encoded]
# update link with CDN
link = cdn + "/upload/image-" + name + "." + ext
# patch newparts
newparts[i + 1] = link + rest[b64end:]
else:
raise Exception("cannot find the end of base64 encoded string")
else:
print("[extract] dataimage skipping part " + str(i))
continue
return "".join(newparts)
di = "data:image"
def extract_md_images(body, prefix):
newbody = ""
body = (
body.replace("\n! [](" + di, "\n ![](" + di)
.replace("\n[](" + di, "\n![](" + di)
.replace(" [](" + di, " ![](" + di)
)
parts = body.split(di)
if len(parts) > 1:
newbody = extract_dataimages(parts, prefix)
else:
newbody = body
return newbody
def cleanup_md(body):
newbody = (
body.replace("<", "")
.replace(">", "")
.replace("{", "(")
.replace("}", ")")
.replace("", "...")
.replace(" __ ", " ")
.replace("_ _", " ")
.replace("****", "")
.replace("\u00a0", " ")
.replace("\u02c6", "^")
.replace("\u00a0", " ")
.replace("\ufeff", "")
.replace("\u200b", "")
.replace("\u200c", "")
) # .replace('\u2212', '-')
return newbody
def extract_md(body, shout_dict = None):
newbody = body
if newbody:
newbody = cleanup_md(newbody)
if not newbody:
raise Exception("cleanup error")
if shout_dict:
uid = shout_dict['id'] or uuid.uuid4()
newbody = extract_md_images(newbody, uid)
if not newbody:
raise Exception("extract_images error")
newbody, placed = extract_footnotes(body, shout_dict)
if not newbody:
raise Exception("extract_footnotes error")
return newbody
def extract_media(entry):
''' normalized media extraction method '''
"""normalized media extraction method"""
# media [ { title pic url body } ]}
kind = entry.get("type")
if not kind:
@@ -311,7 +159,7 @@ def extract_media(entry):
url = m.get("fileUrl") or m.get("url", "")
pic = ""
if m.get("thumborId"):
pic = cdn + "/unsafe/1600x/" + m["thumborId"]
pic = cdn + "/unsafe/" + m["thumborId"]
# url
if not url:
@@ -323,12 +171,7 @@ def extract_media(entry):
url = "https://vimeo.com/" + m["vimeoId"]
# body
body = m.get("body") or m.get("literatureBody") or ""
media.append({
"url": url,
"pic": pic,
"title": title,
"body": body
})
media.append({"url": url, "pic": pic, "title": title, "body": body})
return media
@@ -398,9 +241,7 @@ def cleanup_html(body: str) -> str:
r"<h4>\s*</h4>",
r"<div>\s*</div>",
]
regex_replace = {
r"<br>\s*</p>": "</p>"
}
regex_replace = {r"<br>\s*</p>": "</p>"}
changed = True
while changed:
# we need several iterations to clean nested tags this way
@@ -414,16 +255,17 @@ def cleanup_html(body: str) -> str:
changed = True
return new_body
def extract_html(entry, shout_id = None, cleanup=False):
body_orig = (entry.get("body") or "").replace('\(', '(').replace('\)', ')')
def extract_html(entry, shout_id=None, cleanup=False):
body_orig = (entry.get("body") or "").replace(r"\(", "(").replace(r"\)", ")")
if cleanup:
# we do that before bs parsing to catch the invalid html
body_clean = cleanup_html(body_orig)
if body_clean != body_orig:
print(f"[migration] html cleaned for slug {entry.get('slug', None)}")
body_orig = body_clean
if shout_id:
extract_footnotes(body_orig, shout_id)
# if shout_id:
# extract_footnotes(body_orig, shout_id)
body_html = str(BeautifulSoup(body_orig, features="html.parser"))
if cleanup:
# we do that after bs parsing because it can add dummy tags

View File

@@ -33,7 +33,7 @@ __version__ = (2020, 1, 16)
# TODO: Support decoded entities with UNIFIABLE.
class HTML2Text(html.parser.HTMLParser):
class HTML2Text(html.parser.HTMLParser): # noqa: C901
def __init__(
self,
out: Optional[OutCallback] = None,
@@ -85,7 +85,7 @@ class HTML2Text(html.parser.HTMLParser):
self.tag_callback = None
self.open_quote = config.OPEN_QUOTE # covered in cli
self.close_quote = config.CLOSE_QUOTE # covered in cli
self.header_id = None
self.header_id: str | None = None
self.span_highlight = False
self.span_lead = False
@@ -119,9 +119,7 @@ class HTML2Text(html.parser.HTMLParser):
self.lastWasList = False
self.style = 0
self.style_def = {} # type: Dict[str, Dict[str, str]]
self.tag_stack = (
[]
) # type: List[Tuple[str, Dict[str, Optional[str]], Dict[str, str]]]
self.tag_stack = [] # type: List[Tuple[str, Dict[str, Optional[str]], Dict[str, str]]]
self.emphasis = 0
self.drop_white_space = 0
self.inheader = False
@@ -227,7 +225,7 @@ class HTML2Text(html.parser.HTMLParser):
return i
return None
def handle_emphasis(
def handle_emphasis( # noqa: C901
self, start: bool, tag_style: Dict[str, str], parent_style: Dict[str, str]
) -> None:
"""
@@ -300,7 +298,7 @@ class HTML2Text(html.parser.HTMLParser):
if strikethrough:
self.quiet -= 1
def handle_tag(
def handle_tag( # noqa: C901
self, tag: str, attrs: Dict[str, Optional[str]], start: bool
) -> None:
self.current_tag = tag
@@ -333,9 +331,7 @@ class HTML2Text(html.parser.HTMLParser):
tag_style = element_style(attrs, self.style_def, parent_style)
self.tag_stack.append((tag, attrs, tag_style))
else:
dummy, attrs, tag_style = (
self.tag_stack.pop() if self.tag_stack else (None, {}, {})
)
dummy, attrs, tag_style = self.tag_stack.pop() if self.tag_stack else (None, {}, {})
if self.tag_stack:
parent_style = self.tag_stack[-1][2]
@@ -385,11 +381,7 @@ class HTML2Text(html.parser.HTMLParser):
):
self.o("`") # NOTE: same as <code>
self.span_highlight = True
elif (
self.current_class == "lead"
and not self.inheader
and not self.span_highlight
):
elif self.current_class == "lead" and not self.inheader and not self.span_highlight:
# self.o("==") # NOTE: CriticMarkup {==
self.span_lead = True
else:
@@ -479,11 +471,7 @@ class HTML2Text(html.parser.HTMLParser):
and not self.span_lead
and not self.span_highlight
):
if (
start
and self.preceding_data
and self.preceding_data[-1] == self.strong_mark[0]
):
if start and self.preceding_data and self.preceding_data[-1] == self.strong_mark[0]:
strong = " " + self.strong_mark
self.preceding_data += " "
else:
@@ -548,13 +536,8 @@ class HTML2Text(html.parser.HTMLParser):
"href" in attrs
and not attrs["href"].startswith("#_ftn")
and attrs["href"] is not None
and not (
self.skip_internal_links and attrs["href"].startswith("#")
)
and not (
self.ignore_mailto_links
and attrs["href"].startswith("mailto:")
)
and not (self.skip_internal_links and attrs["href"].startswith("#"))
and not (self.ignore_mailto_links and attrs["href"].startswith("mailto:"))
):
self.astack.append(attrs)
self.maybe_automatic_link = attrs["href"]
@@ -591,7 +574,7 @@ class HTML2Text(html.parser.HTMLParser):
if tag == "img" and start and not self.ignore_images:
# skip cloudinary images
if "src" in attrs and "cloudinary" not in attrs["src"]:
if "src" in attrs and ("cloudinary" not in attrs["src"]):
assert attrs["src"] is not None
if not self.images_to_alt:
attrs["href"] = attrs["src"]
@@ -638,9 +621,7 @@ class HTML2Text(html.parser.HTMLParser):
self.o("![" + escape_md(alt) + "]")
if self.inline_links:
href = attrs.get("href") or ""
self.o(
"(" + escape_md(urlparse.urljoin(self.baseurl, href)) + ")"
)
self.o("(" + escape_md(urlparse.urljoin(self.baseurl, href)) + ")")
else:
i = self.previousIndex(attrs)
if i is not None:
@@ -696,9 +677,7 @@ class HTML2Text(html.parser.HTMLParser):
# WARNING: does not line up <ol><li>s > 9 correctly.
parent_list = None
for list in self.list:
self.o(
" " if parent_list == "ol" and list.name == "ul" else " "
)
self.o(" " if parent_list == "ol" and list.name == "ul" else " ")
parent_list = list.name
if li.name == "ul":
@@ -787,7 +766,7 @@ class HTML2Text(html.parser.HTMLParser):
self.pbr()
self.br_toggle = " "
def o(
def o( # noqa: C901
self, data: str, puredata: bool = False, force: Union[bool, str] = False
) -> None:
"""
@@ -864,9 +843,7 @@ class HTML2Text(html.parser.HTMLParser):
self.out(" ")
self.space = False
if self.a and (
(self.p_p == 2 and self.links_each_paragraph) or force == "end"
):
if self.a and ((self.p_p == 2 and self.links_each_paragraph) or force == "end"):
if force == "end":
self.out("\n")
@@ -925,11 +902,7 @@ class HTML2Text(html.parser.HTMLParser):
if self.maybe_automatic_link is not None:
href = self.maybe_automatic_link
if (
href == data
and self.absolute_url_matcher.match(href)
and self.use_automatic_links
):
if href == data and self.absolute_url_matcher.match(href) and self.use_automatic_links:
self.o("<" + data + ">")
self.empty_link = False
return
@@ -980,7 +953,7 @@ class HTML2Text(html.parser.HTMLParser):
return nest_count
def optwrap(self, text: str) -> str:
def optwrap(self, text: str) -> str: # noqa: C901
"""
Wrap all paragraphs in the provided text.
@@ -1000,9 +973,7 @@ class HTML2Text(html.parser.HTMLParser):
self.inline_links = False
for para in text.split("\n"):
if len(para) > 0:
if not skipwrap(
para, self.wrap_links, self.wrap_list_items, self.wrap_tables
):
if not skipwrap(para, self.wrap_links, self.wrap_list_items, self.wrap_tables):
indent = ""
if para.startswith(" " + self.ul_item_mark):
# list item continuation: add a double indent to the
@@ -1043,12 +1014,10 @@ class HTML2Text(html.parser.HTMLParser):
return result
def html2text(
html: str, baseurl: str = "", bodywidth: Optional[int] = config.BODY_WIDTH
) -> str:
def html2text(html: str, baseurl: str = "", bodywidth: int = config.BODY_WIDTH) -> str:
h = html.strip() or ""
if h:
h = HTML2Text(baseurl=baseurl, bodywidth=bodywidth)
h = h.handle(html.strip())
h2t = HTML2Text(baseurl=baseurl, bodywidth=bodywidth)
h = h2t.handle(html.strip())
# print('[html2text] %d bytes' % len(html))
return h
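The rewritten entry point above also untangles a name-shadowing bug: the old body reused h for the stripped input, the HTML2Text instance, and the output string. A quick usage sketch of the fixed function (hedged; the module path matches the imports used elsewhere in this diff, and bodywidth=0 is the upstream html2text convention for disabling line wrapping):

    from migration.html2text import html2text

    # convert a fragment to Markdown; with wrapping disabled the expected
    # output is the flat string "Hello **world**" plus trailing newlines
    md = html2text("<p>Hello <strong>world</strong></p>", bodywidth=0)
    print(md.strip())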


@@ -117,10 +117,7 @@ def main() -> None:
dest="images_with_size",
action="store_true",
default=config.IMAGES_WITH_SIZE,
help=(
"Write image tags with height and width attrs as raw html to retain "
"dimensions"
),
help=("Write image tags with height and width attrs as raw html to retain " "dimensions"),
)
p.add_argument(
"-g",
@@ -260,9 +257,7 @@ def main() -> None:
default=config.CLOSE_QUOTE,
help="The character used to close quotes",
)
p.add_argument(
"--version", action="version", version=".".join(map(str, __version__))
)
p.add_argument("--version", action="version", version=".".join(map(str, __version__)))
p.add_argument("filename", nargs="?")
p.add_argument("encoding", nargs="?", default="utf-8")
args = p.parse_args()


@@ -4,9 +4,7 @@ from typing import Dict, List, Optional
from . import config
unifiable_n = {
html.entities.name2codepoint[k]: v
for k, v in config.UNIFIABLE.items()
if k != "nbsp"
html.entities.name2codepoint[k]: v for k, v in config.UNIFIABLE.items() if k != "nbsp"
}
@@ -68,12 +66,14 @@ def element_style(
:rtype: dict
"""
style = parent_style.copy()
if attrs.get("class"):
for css_class in attrs["class"].split():
attrs_class = attrs.get("class")
if attrs_class:
for css_class in attrs_class.split():
css_style = style_def.get("." + css_class, {})
style.update(css_style)
if attrs.get("style"):
immediate_style = dumb_property_dict(attrs["style"])
attrs_style = attrs.get("style")
if attrs_style:
immediate_style = dumb_property_dict(attrs_style)
style.update(immediate_style)
return style
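This hunk, like the list_numbering_start one just below, binds attrs.get(...) to a local name before testing it. Besides dropping the duplicate dictionary lookup, the bound name lets a type checker narrow Optional[str] to str inside the branch. A minimal sketch of the pattern, assuming the same Dict[str, Optional[str]] shape these helpers use:

    from typing import Dict, Optional

    def numbering_start(attrs: Dict[str, Optional[str]]) -> int:
        attrs_start = attrs.get("start")  # Optional[str]
        if attrs_start:  # mypy narrows attrs_start to str from here on
            try:
                return int(attrs_start) - 1
            except ValueError:
                pass  # non-numeric start attribute falls through to the default
        return 0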
@@ -147,18 +147,17 @@ def list_numbering_start(attrs: Dict[str, Optional[str]]) -> int:
:rtype: int or None
"""
if attrs.get("start"):
attrs_start = attrs.get("start")
if attrs_start:
try:
return int(attrs["start"]) - 1
return int(attrs_start) - 1
except ValueError:
pass
return 0
def skipwrap(
para: str, wrap_links: bool, wrap_list_items: bool, wrap_tables: bool
) -> bool:
def skipwrap(para: str, wrap_links: bool, wrap_list_items: bool, wrap_tables: bool) -> bool:
# If it appears to contain a link
# don't wrap
if not wrap_links and config.RE_LINK.search(para):
@@ -236,9 +235,7 @@ def reformat_table(lines: List[str], right_margin: int) -> List[str]:
max_width += [len(x) + right_margin for x in cols[-(num_cols - max_cols) :]]
max_cols = num_cols
max_width = [
max(len(x) + right_margin, old_len) for x, old_len in zip(cols, max_width)
]
max_width = [max(len(x) + right_margin, old_len) for x, old_len in zip(cols, max_width)]
# reformat
new_lines = []
@@ -247,15 +244,13 @@ def reformat_table(lines: List[str], right_margin: int) -> List[str]:
if set(line.strip()) == set("-|"):
filler = "-"
new_cols = [
x.rstrip() + (filler * (M - len(x.rstrip())))
for x, M in zip(cols, max_width)
x.rstrip() + (filler * (M - len(x.rstrip()))) for x, M in zip(cols, max_width)
]
new_lines.append("|-" + "|".join(new_cols) + "|")
else:
filler = " "
new_cols = [
x.rstrip() + (filler * (M - len(x.rstrip())))
for x, M in zip(cols, max_width)
x.rstrip() + (filler * (M - len(x.rstrip()))) for x, M in zip(cols, max_width)
]
new_lines.append("| " + "|".join(new_cols) + "|")
return new_lines


@@ -1 +0,0 @@
__all__ = (["users", "topics", "content_items", "comments"],)


@@ -5,61 +5,48 @@ from dateutil.parser import parse as date_parse
from base.orm import local_session
from migration.html2text import html2text
from orm.reaction import Reaction, ReactionKind
from orm.shout import ShoutReactionsFollower
from orm.shout import Shout, ShoutReactionsFollower
from orm.topic import TopicFollower
from orm.user import User
from orm.shout import Shout
ts = datetime.now(tz=timezone.utc)
def auto_followers(session, topics, reaction_dict):
# creating shout's reactions following for reaction author
following1 = session.query(
ShoutReactionsFollower
).where(
ShoutReactionsFollower.follower == reaction_dict["createdBy"]
).filter(
ShoutReactionsFollower.shout == reaction_dict["shout"]
).first()
following1 = (
session.query(ShoutReactionsFollower)
.where(ShoutReactionsFollower.follower == reaction_dict["createdBy"])
.filter(ShoutReactionsFollower.shout == reaction_dict["shout"])
.first()
)
if not following1:
following1 = ShoutReactionsFollower.create(
follower=reaction_dict["createdBy"],
shout=reaction_dict["shout"],
auto=True
follower=reaction_dict["createdBy"], shout=reaction_dict["shout"], auto=True
)
session.add(following1)
# creating topics followings for reaction author
for t in topics:
tf = session.query(
TopicFollower
).where(
TopicFollower.follower == reaction_dict["createdBy"]
).filter(
TopicFollower.topic == t['id']
).first()
tf = (
session.query(TopicFollower)
.where(TopicFollower.follower == reaction_dict["createdBy"])
.filter(TopicFollower.topic == t["id"])
.first()
)
if not tf:
topic_following = TopicFollower.create(
follower=reaction_dict["createdBy"],
topic=t['id'],
auto=True
follower=reaction_dict["createdBy"], topic=t["id"], auto=True
)
session.add(topic_following)
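Every block in this migration follows the same query-then-create shape that the reformatting makes explicit: one fluent .query().where().filter().first() expression, then an insert only when nothing was found. A condensed sketch of that get-or-create idiom, reusing the model names imported above:

    def ensure_shout_follower(session, user_id, shout_id):
        # look for an existing follow row first
        following = (
            session.query(ShoutReactionsFollower)
            .where(ShoutReactionsFollower.follower == user_id)
            .filter(ShoutReactionsFollower.shout == shout_id)
            .first()
        )
        if not following:
            # create the missing row; auto=True marks it machine-generated
            following = ShoutReactionsFollower.create(
                follower=user_id, shout=shout_id, auto=True
            )
            session.add(following)
        return following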
def migrate_ratings(session, entry, reaction_dict):
for comment_rating_old in entry.get("ratings", []):
rater = (
session.query(User)
.filter(User.oid == comment_rating_old["createdBy"])
.first()
)
rater = session.query(User).filter(User.oid == comment_rating_old["createdBy"]).first()
re_reaction_dict = {
"shout": reaction_dict["shout"],
"replyTo": reaction_dict["id"],
"kind": ReactionKind.LIKE
if comment_rating_old["value"] > 0
else ReactionKind.DISLIKE,
"kind": ReactionKind.LIKE if comment_rating_old["value"] > 0 else ReactionKind.DISLIKE,
"createdBy": rater.id if rater else 1,
}
cts = comment_rating_old.get("createdAt")
@@ -68,18 +55,15 @@ def migrate_ratings(session, entry, reaction_dict):
try:
# creating reaction from old rating
rr = Reaction.create(**re_reaction_dict)
following2 = session.query(
ShoutReactionsFollower
).where(
ShoutReactionsFollower.follower == re_reaction_dict['createdBy']
).filter(
ShoutReactionsFollower.shout == rr.shout
).first()
following2 = (
session.query(ShoutReactionsFollower)
.where(ShoutReactionsFollower.follower == re_reaction_dict["createdBy"])
.filter(ShoutReactionsFollower.shout == rr.shout)
.first()
)
if not following2:
following2 = ShoutReactionsFollower.create(
follower=re_reaction_dict['createdBy'],
shout=rr.shout,
auto=True
follower=re_reaction_dict["createdBy"], shout=rr.shout, auto=True
)
session.add(following2)
session.add(rr)
@@ -150,9 +134,7 @@ async def migrate(entry, storage):
else:
stage = "author and old id found"
try:
shout = session.query(
Shout
).where(Shout.slug == old_shout["slug"]).one()
shout = session.query(Shout).where(Shout.slug == old_shout["slug"]).one()
if shout:
reaction_dict["shout"] = shout.id
reaction_dict["createdBy"] = author.id if author else 1
@@ -178,9 +160,9 @@ async def migrate(entry, storage):
def migrate_2stage(old_comment, idmap):
if old_comment.get('body'):
new_id = idmap.get(old_comment.get('oid'))
new_id = idmap.get(old_comment.get('_id'))
if old_comment.get("body"):
new_id = idmap.get(old_comment.get("oid"))
new_id = idmap.get(old_comment.get("_id"))
if new_id:
new_replyto_id = None
old_replyto_id = old_comment.get("replyTo")
@@ -190,17 +172,20 @@ def migrate_2stage(old_comment, idmap):
comment = session.query(Reaction).where(Reaction.id == new_id).first()
try:
if new_replyto_id:
new_reply = session.query(Reaction).where(Reaction.id == new_replyto_id).first()
new_reply = (
session.query(Reaction).where(Reaction.id == new_replyto_id).first()
)
if not new_reply:
print(new_replyto_id)
raise Exception("cannot find reply by id!")
comment.replyTo = new_reply.id
session.add(comment)
srf = session.query(ShoutReactionsFollower).where(
ShoutReactionsFollower.shout == comment.shout
).filter(
ShoutReactionsFollower.follower == comment.createdBy
).first()
srf = (
session.query(ShoutReactionsFollower)
.where(ShoutReactionsFollower.shout == comment.shout)
.filter(ShoutReactionsFollower.follower == comment.createdBy)
.first()
)
if not srf:
srf = ShoutReactionsFollower.create(
shout=comment.shout, follower=comment.createdBy, auto=True


@@ -1,16 +1,18 @@
from datetime import datetime, timezone
import json
import re
from datetime import datetime, timezone
from dateutil.parser import parse as date_parse
from sqlalchemy.exc import IntegrityError
from transliterate import translit
from base.orm import local_session
from migration.extract import extract_html, extract_media
from orm.reaction import Reaction, ReactionKind
from orm.shout import Shout, ShoutTopic, ShoutReactionsFollower
from orm.shout import Shout, ShoutReactionsFollower, ShoutTopic
from orm.topic import Topic, TopicFollower
from orm.user import User
from orm.topic import TopicFollower, Topic
from services.stat.viewed import ViewedStorage
import re
OLD_DATE = "2016-03-05 22:22:00.350000"
ts = datetime.now(tz=timezone.utc)
@@ -33,7 +35,7 @@ def get_shout_slug(entry):
slug = friend.get("slug", "")
if slug:
break
slug = re.sub('[^0-9a-zA-Z]+', '-', slug)
slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
return slug
@@ -41,27 +43,27 @@ def create_author_from_app(app):
user = None
userdata = None
# check if email is used
if app['email']:
if app["email"]:
with local_session() as session:
user = session.query(User).where(User.email == app['email']).first()
user = session.query(User).where(User.email == app["email"]).first()
if not user:
# print('[migration] app %r' % app)
name = app.get('name')
name = app.get("name")
if name:
slug = translit(name, "ru", reversed=True).lower()
slug = re.sub('[^0-9a-zA-Z]+', '-', slug)
print('[migration] created slug %s' % slug)
slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
print("[migration] created slug %s" % slug)
# check if slug is used
if slug:
user = session.query(User).where(User.slug == slug).first()
# get slug from email
if user:
slug = app['email'].split('@')[0]
slug = app["email"].split("@")[0]
user = session.query(User).where(User.slug == slug).first()
# one more try
if user:
slug += '-author'
slug += "-author"
user = session.query(User).where(User.slug == slug).first()
# create user with application data
@@ -79,7 +81,7 @@ def create_author_from_app(app):
user = User.create(**userdata)
session.add(user)
session.commit()
userdata['id'] = user.id
userdata["id"] = user.id
userdata = user.dict()
return userdata
@@ -91,11 +93,12 @@ async def create_shout(shout_dict):
s = Shout.create(**shout_dict)
author = s.authors[0]
with local_session() as session:
srf = session.query(ShoutReactionsFollower).where(
ShoutReactionsFollower.shout == s.id
).filter(
ShoutReactionsFollower.follower == author.id
).first()
srf = (
session.query(ShoutReactionsFollower)
.where(ShoutReactionsFollower.shout == s.id)
.filter(ShoutReactionsFollower.follower == author.id)
.first()
)
if not srf:
srf = ShoutReactionsFollower.create(shout=s.id, follower=author.id, auto=True)
session.add(srf)
@@ -116,14 +119,14 @@ async def get_user(entry, storage):
elif user_oid:
userdata = storage["users"]["by_oid"].get(user_oid)
if not userdata:
print('no userdata by oid, anonymous')
print("no userdata by oid, anonymous")
userdata = anondict
print(app)
# cleanup slug
if userdata:
slug = userdata.get("slug", "")
if slug:
slug = re.sub('[^0-9a-zA-Z]+', '-', slug)
slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
userdata["slug"] = slug
else:
userdata = anondict
@@ -137,24 +140,27 @@ async def migrate(entry, storage):
r = {
"layout": type2layout[entry["type"]],
"title": entry["title"],
"authors": [author, ],
"authors": [
author,
],
"slug": get_shout_slug(entry),
"cover": (
"https://assets.discours.io/unsafe/1600x/" +
entry["thumborId"] if entry.get("thumborId") else entry.get("image", {}).get("url")
"https://images.discours.io/unsafe/" + entry["thumborId"]
if entry.get("thumborId")
else entry.get("image", {}).get("url")
),
"visibility": "public" if entry.get("published") else "authors",
"visibility": "public" if entry.get("published") else "community",
"publishedAt": date_parse(entry.get("publishedAt")) if entry.get("published") else None,
"deletedAt": date_parse(entry.get("deletedAt")) if entry.get("deletedAt") else None,
"createdAt": date_parse(entry.get("createdAt", OLD_DATE)),
"updatedAt": date_parse(entry["updatedAt"]) if "updatedAt" in entry else ts,
"createdBy": author.id,
"topics": await add_topics_follower(entry, storage, author),
"body": extract_html(entry, cleanup=True)
"body": extract_html(entry, cleanup=True),
}
# main topic patch
r['mainTopic'] = r['topics'][0]
r["mainTopic"] = r["topics"][0]
# published author auto-confirm
if entry.get("published"):
@@ -177,14 +183,16 @@ async def migrate(entry, storage):
shout_dict["oid"] = entry.get("_id", "")
shout = await create_shout(shout_dict)
except IntegrityError as e:
print('[migration] create_shout integrity error', e)
print("[migration] create_shout integrity error", e)
shout = await resolve_create_shout(shout_dict)
except Exception as e:
raise Exception(e)
# update data
shout_dict = shout.dict()
shout_dict["authors"] = [author.dict(), ]
shout_dict["authors"] = [
author.dict(),
]
# shout topics aftermath
shout_dict["topics"] = await topics_aftermath(r, storage)
@@ -193,7 +201,9 @@ async def migrate(entry, storage):
await content_ratings_to_reactions(entry, shout_dict["slug"])
# shout views
await ViewedStorage.increment(shout_dict["slug"], amount=entry.get("views", 1), viewer='old-discours')
await ViewedStorage.increment(
shout_dict["slug"], amount=entry.get("views", 1), viewer="old-discours"
)
# del shout_dict['ratings']
storage["shouts"]["by_oid"][entry["_id"]] = shout_dict
@@ -205,7 +215,9 @@ async def add_topics_follower(entry, storage, user):
topics = set([])
category = entry.get("category")
topics_by_oid = storage["topics"]["by_oid"]
oids = [category, ] + entry.get("tags", [])
oids = [
category,
] + entry.get("tags", [])
for toid in oids:
tslug = topics_by_oid.get(toid, {}).get("slug")
if tslug:
@@ -217,23 +229,18 @@ async def add_topics_follower(entry, storage, user):
try:
tpc = session.query(Topic).where(Topic.slug == tpcslug).first()
if tpc:
tf = session.query(
TopicFollower
).where(
TopicFollower.follower == user.id
).filter(
TopicFollower.topic == tpc.id
).first()
tf = (
session.query(TopicFollower)
.where(TopicFollower.follower == user.id)
.filter(TopicFollower.topic == tpc.id)
.first()
)
if not tf:
tf = TopicFollower.create(
topic=tpc.id,
follower=user.id,
auto=True
)
tf = TopicFollower.create(topic=tpc.id, follower=user.id, auto=True)
session.add(tf)
session.commit()
except IntegrityError:
print('[migration.shout] hidden by topic ' + tpc.slug)
print("[migration.shout] hidden by topic " + tpc.slug)
# main topic
maintopic = storage["replacements"].get(topics_by_oid.get(category, {}).get("slug"))
if maintopic in ttt:
@@ -254,7 +261,7 @@ async def process_user(userdata, storage, oid):
if not user:
try:
slug = userdata["slug"].lower().strip()
slug = re.sub('[^0-9a-zA-Z]+', '-', slug)
slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
userdata["slug"] = slug
user = User.create(**userdata)
session.add(user)
@@ -282,9 +289,9 @@ async def resolve_create_shout(shout_dict):
s = session.query(Shout).filter(Shout.slug == shout_dict["slug"]).first()
bump = False
if s:
if s.createdAt != shout_dict['createdAt']:
if s.createdAt != shout_dict["createdAt"]:
# create new with different slug
shout_dict["slug"] += '-' + shout_dict["layout"]
shout_dict["slug"] += "-" + shout_dict["layout"]
try:
await create_shout(shout_dict)
except IntegrityError as e:
@@ -295,10 +302,7 @@ async def resolve_create_shout(shout_dict):
for key in shout_dict:
if key in s.__dict__:
if s.__dict__[key] != shout_dict[key]:
print(
"[migration] shout already exists, but differs in %s"
% key
)
print("[migration] shout already exists, but differs in %s" % key)
bump = True
else:
print("[migration] shout already exists, but lacks %s" % key)
@@ -344,9 +348,7 @@ async def topics_aftermath(entry, storage):
)
if not shout_topic_new:
try:
ShoutTopic.create(
**{"shout": shout.id, "topic": new_topic.id}
)
ShoutTopic.create(**{"shout": shout.id, "topic": new_topic.id})
except Exception:
print("[migration] shout topic error: " + newslug)
session.commit()
@@ -363,9 +365,7 @@ async def content_ratings_to_reactions(entry, slug):
with local_session() as session:
for content_rating in entry.get("ratings", []):
rater = (
session.query(User)
.filter(User.oid == content_rating["createdBy"])
.first()
session.query(User).filter(User.oid == content_rating["createdBy"]).first()
) or User.default_user
shout = session.query(Shout).where(Shout.slug == slug).first()
cts = content_rating.get("createdAt")
@@ -375,7 +375,7 @@ async def content_ratings_to_reactions(entry, slug):
if content_rating["value"] > 0
else ReactionKind.DISLIKE,
"createdBy": rater.id,
"shout": shout.id
"shout": shout.id,
}
reaction = (
session.query(Reaction)


@@ -1,42 +1,35 @@
from base.orm import local_session
from migration.extract import extract_md
from migration.html2text import html2text
from orm.reaction import Reaction, ReactionKind
# from base.orm import local_session
# from migration.extract import extract_md
# from migration.html2text import html2text
# from orm.reaction import Reaction, ReactionKind
def migrate(entry, storage):
post_oid = entry['contentItem']
print(post_oid)
shout_dict = storage['shouts']['by_oid'].get(post_oid)
if shout_dict:
print(shout_dict['body'])
remark = {
"shout": shout_dict['id'],
"body": extract_md(
html2text(entry['body']),
shout_dict
),
"kind": ReactionKind.REMARK
}
if entry.get('textBefore'):
remark['range'] = str(
shout_dict['body']
.index(
entry['textBefore'] or ''
)
) + ':' + str(
shout_dict['body']
.index(
entry['textAfter'] or ''
) + len(
entry['textAfter'] or ''
)
)
with local_session() as session:
rmrk = Reaction.create(**remark)
session.commit()
del rmrk["_sa_instance_state"]
return rmrk
return
# def migrate(entry, storage):
# post_oid = entry["contentItem"]
# print(post_oid)
# shout_dict = storage["shouts"]["by_oid"].get(post_oid)
# if shout_dict:
# print(shout_dict["body"])
# remark = {
# "shout": shout_dict["id"],
# "body": extract_md(html2text(entry["body"]), shout_dict),
# "kind": ReactionKind.REMARK,
# }
#
# if entry.get("textBefore"):
# remark["range"] = (
# str(shout_dict["body"].index(entry["textBefore"] or ""))
# + ":"
# + str(
# shout_dict["body"].index(entry["textAfter"] or "")
# + len(entry["textAfter"] or "")
# )
# )
#
# with local_session() as session:
# rmrk = Reaction.create(**remark)
# session.commit()
# del rmrk["_sa_instance_state"]
# return rmrk
# return


@@ -1,5 +1,4 @@
from base.orm import local_session
from migration.extract import extract_md
from migration.html2text import html2text
from orm import Topic
@@ -10,7 +9,7 @@ def migrate(entry):
"slug": entry["slug"],
"oid": entry["_id"],
"title": entry["title"].replace("&nbsp;", " "),
"body": extract_md(html2text(body_orig))
"body": html2text(body_orig),
}
with local_session() as session:


@@ -8,7 +8,7 @@ from base.orm import local_session
from orm.user import AuthorFollower, User, UserRating
def migrate(entry):
def migrate(entry): # noqa: C901
if "subscribedTo" in entry:
del entry["subscribedTo"]
email = entry["emails"][0]["address"]
@@ -23,7 +23,7 @@ def migrate(entry):
"muted": False, # amnesty
"links": [],
"name": "anonymous",
"password": entry["services"]["password"].get("bcrypt")
"password": entry["services"]["password"].get("bcrypt"),
}
if "updatedAt" in entry:
@@ -33,9 +33,13 @@ def migrate(entry):
if entry.get("profile"):
# slug
slug = entry["profile"].get("path").lower()
slug = re.sub('[^0-9a-zA-Z]+', '-', slug).strip()
slug = re.sub("[^0-9a-zA-Z]+", "-", slug).strip()
user_dict["slug"] = slug
bio = (entry.get("profile", {"bio": ""}).get("bio") or "").replace('\(', '(').replace('\)', ')')
bio = (
(entry.get("profile", {"bio": ""}).get("bio") or "")
.replace(r"\(", "(")
.replace(r"\)", ")")
)
bio_text = BeautifulSoup(bio, features="lxml").text
if len(bio_text) > 120:
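The bio cleanup above swaps '\(' for the raw string r"\(". At runtime both spell the same two characters, since \( is not a recognized escape, but the un-raw form triggers Python's invalid-escape warning (a DeprecationWarning, promoted to SyntaxWarning in 3.12). A standard-library-only check:

    # both literals denote backslash + parenthesis; only the raw form
    # avoids the invalid-escape warning
    assert r"\(" == "\\("
    print(r"\(escaped\)".replace(r"\(", "(").replace(r"\)", ")"))  # -> (escaped)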
@@ -46,8 +50,7 @@ def migrate(entry):
# userpic
try:
user_dict["userpic"] = (
"https://assets.discours.io/unsafe/100x/"
+ entry["profile"]["thumborId"]
"https://images.discours.io/unsafe/" + entry["profile"]["thumborId"]
)
except KeyError:
try:
@@ -62,11 +65,7 @@ def migrate(entry):
name = (name + " " + ln) if ln else name
if not name:
name = slug if slug else "anonymous"
name = (
entry["profile"]["path"].lower().strip().replace(" ", "-")
if len(name) < 2
else name
)
name = entry["profile"]["path"].lower().strip().replace(" ", "-") if len(name) < 2 else name
user_dict["name"] = name
# links
@@ -95,9 +94,7 @@ def migrate(entry):
except IntegrityError:
print("[migration] cannot create user " + user_dict["slug"])
with local_session() as session:
old_user = (
session.query(User).filter(User.slug == user_dict["slug"]).first()
)
old_user = session.query(User).filter(User.slug == user_dict["slug"]).first()
old_user.oid = oid
old_user.password = user_dict["password"]
session.commit()
@@ -114,7 +111,7 @@ def post_migrate():
"slug": "old-discours",
"username": "old-discours",
"email": "old@discours.io",
"name": "Просмотры на старой версии сайта"
"name": "Просмотры на старой версии сайта",
}
with local_session() as session:
@@ -147,12 +144,8 @@ def migrate_2stage(entry, id_map):
}
user_rating = UserRating.create(**user_rating_dict)
if user_rating_dict['value'] > 0:
af = AuthorFollower.create(
author=user.id,
follower=rater.id,
auto=True
)
if user_rating_dict["value"] > 0:
af = AuthorFollower.create(author=user.id, follower=rater.id, auto=True)
session.add(af)
session.add(user_rating)
session.commit()


@@ -1,226 +1,91 @@
{{ $proxy_settings := "proxy_http_version 1.1; proxy_set_header Upgrade $http_upgrade; proxy_set_header Connection $http_connection; proxy_set_header Host $http_host; proxy_set_header X-Request-Start $msec;" }}
{{ $gzip_settings := "gzip on; gzip_min_length 1100; gzip_buffers 4 32k; gzip_types text/css text/javascript text/xml text/plain text/x-component application/javascript application/x-javascript application/json application/xml application/rss+xml font/truetype application/x-font-ttf font/opentype application/vnd.ms-fontobject image/svg+xml; gzip_vary on; gzip_comp_level 6;" }}
{{ $cors_headers_options := "if ($request_method = 'OPTIONS') { add_header 'Access-Control-Allow-Origin' '$allow_origin' always; add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS'; add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization'; add_header 'Access-Control-Allow-Credentials' 'true'; add_header 'Access-Control-Max-Age' 1728000; add_header 'Content-Type' 'text/plain; charset=utf-8'; add_header 'Content-Length' 0; return 204; }" }}
{{ $cors_headers_post := "if ($request_method = 'POST') { add_header 'Access-Control-Allow-Origin' '$allow_origin' always; add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS' always; add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization' always; add_header 'Access-Control-Expose-Headers' 'Content-Length,Content-Range' always; add_header 'Access-Control-Allow-Credentials' 'true' always; }" }}
{{ $cors_headers_get := "if ($request_method = 'GET') { add_header 'Access-Control-Allow-Origin' '$allow_origin' always; add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS' always; add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization' always; add_header 'Access-Control-Expose-Headers' 'Content-Length,Content-Range' always; add_header 'Access-Control-Allow-Credentials' 'true' always; }" }}
{{ range $port_map := .PROXY_PORT_MAP | split " " }}
{{ $port_map_list := $port_map | split ":" }}
{{ $scheme := index $port_map_list 0 }}
{{ $listen_port := index $port_map_list 1 }}
{{ $upstream_port := index $port_map_list 2 }}
map $http_origin $allow_origin {
~^https?:\/\/((.*\.)?localhost(:\d+)?|discoursio-webapp(-(.*))?\.vercel\.app|(.*\.)?discours\.io)$ $http_origin;
default "";
}
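This map block produces the $allow_origin value consumed by the CORS header blocks in this template: the request's Origin header is echoed back only when it matches the allow-list regex, and stays empty otherwise. One way to sanity-check the pattern outside nginx (a sketch; the regex is transcribed by hand from the map above):

    import re

    ALLOWED = re.compile(
        r"^https?://((.*\.)?localhost(:\d+)?"
        r"|discoursio-webapp(-(.*))?\.vercel\.app"
        r"|(.*\.)?discours\.io)$"
    )

    for origin in (
        "https://discours.io",
        "http://localhost:3000",
        "https://discoursio-webapp-pr-1.vercel.app",
        "https://evil.example",
    ):
        # nginx sets $allow_origin to the header itself on a match, "" otherwise
        print(origin, bool(ALLOWED.match(origin)))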
{{ if eq $scheme "http" }}
server {
listen [{{ $.NGINX_BIND_ADDRESS_IP6 }}]:{{ $listen_port }};
listen {{ if $.NGINX_BIND_ADDRESS_IP4 }}{{ $.NGINX_BIND_ADDRESS_IP4 }}:{{end}}{{ $listen_port }};
{{ if $.NOSSL_SERVER_NAME }}server_name {{ $.NOSSL_SERVER_NAME }}; {{ end }}
access_log {{ $.NGINX_ACCESS_LOG_PATH }}{{ if and ($.NGINX_ACCESS_LOG_FORMAT) (ne $.NGINX_ACCESS_LOG_PATH "off") }} {{ $.NGINX_ACCESS_LOG_FORMAT }}{{ end }};
error_log {{ $.NGINX_ERROR_LOG_PATH }};
{{ if (and (eq $listen_port "80") ($.SSL_INUSE)) }}
include {{ $.DOKKU_ROOT }}/{{ $.APP }}/nginx.conf.d/*.conf;
location / {
return 301 https://$host:{{ $.PROXY_SSL_PORT }}$request_uri;
}
{{ else }}
location / {
{{ if eq $scheme "http" }}
listen [::]:{{ $listen_port }};
listen {{ $listen_port }};
server_name {{ $.NOSSL_SERVER_NAME }};
access_log /var/log/nginx/{{ $.APP }}-access.log;
error_log /var/log/nginx/{{ $.APP }}-error.log;
gzip on;
gzip_min_length 1100;
gzip_buffers 4 32k;
gzip_types text/css text/javascript text/xml text/plain text/x-component application/javascript application/x-javascript application/json application/xml application/rss+xml font/truetype application/x-font-ttf font/opentype application/vnd.ms-fontobject image/svg+xml;
gzip_vary on;
gzip_comp_level 6;
{{ else if eq $scheme "https" }}
listen [::]:{{ $listen_port }} ssl http2;
listen {{ $listen_port }} ssl http2;
server_name {{ $.NOSSL_SERVER_NAME }};
access_log /var/log/nginx/{{ $.APP }}-access.log;
error_log /var/log/nginx/{{ $.APP }}-error.log;
ssl_certificate {{ $.APP_SSL_PATH }}/server.crt;
ssl_certificate_key {{ $.APP_SSL_PATH }}/server.key;
ssl_protocols TLSv1.2 TLSv1.3;
ssl_prefer_server_ciphers off;
proxy_pass http://{{ $.APP }}-{{ $upstream_port }};
proxy_http_version 1.1;
proxy_read_timeout {{ $.PROXY_READ_TIMEOUT }};
proxy_buffer_size {{ $.PROXY_BUFFER_SIZE }};
proxy_buffering {{ $.PROXY_BUFFERING }};
proxy_buffers {{ $.PROXY_BUFFERS }};
proxy_busy_buffers_size {{ $.PROXY_BUSY_BUFFERS_SIZE }};
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $http_connection;
proxy_set_header Host $http_host;
proxy_set_header X-Forwarded-For {{ $.PROXY_X_FORWARDED_FOR }};
proxy_set_header X-Forwarded-Port {{ $.PROXY_X_FORWARDED_PORT }};
proxy_set_header X-Forwarded-Proto {{ $.PROXY_X_FORWARDED_PROTO }};
proxy_set_header X-Request-Start $msec;
{{ if $.PROXY_X_FORWARDED_SSL }}proxy_set_header X-Forwarded-Ssl {{ $.PROXY_X_FORWARDED_SSL }};{{ end }}
}
keepalive_timeout 70;
{{ end }}
{{ if $.CLIENT_MAX_BODY_SIZE }}client_max_body_size {{ $.CLIENT_MAX_BODY_SIZE }};{{ end }}
include {{ $.DOKKU_ROOT }}/{{ $.APP }}/nginx.conf.d/*.conf;
error_page 400 401 402 403 405 406 407 408 409 410 411 412 413 414 415 416 417 418 420 422 423 424 426 428 429 431 444 449 450 451 /400-error.html;
location / {
proxy_pass http://{{ $.APP }}-{{ $upstream_port }};
{{ $proxy_settings }}
{{ $gzip_settings }}
{{ $cors_headers_options }}
{{ $cors_headers_post }}
{{ $cors_headers_get }}
}
location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
expires 30d; # This means that the client can cache these resources for 30 days.
add_header Cache-Control "public, no-transform";
}
error_page 400 401 402 403 405 406 407 408 409 410 411 412 413 414 415 416 417 418 420 422 423 424 426 428 429 431 444 449 450 451 /400-error.html;
location /400-error.html {
root {{ $.DOKKU_LIB_ROOT }}/data/nginx-vhosts/dokku-errors;
root /var/lib/dokku/data/nginx-vhosts/dokku-errors;
internal;
}
error_page 404 /404-error.html;
location /404-error.html {
root {{ $.DOKKU_LIB_ROOT }}/data/nginx-vhosts/dokku-errors;
internal;
}
error_page 500 501 502 503 504 505 506 507 508 509 510 511 /500-error.html;
location /500-error.html {
root {{ $.DOKKU_LIB_ROOT }}/data/nginx-vhosts/dokku-errors;
internal;
}
{{ end }}
}
{{ else if eq $scheme "https"}}
server {
listen [{{ $.NGINX_BIND_ADDRESS_IP6 }}]:{{ $listen_port }} ssl {{ if eq $.HTTP2_SUPPORTED "true" }}http2{{ else if eq $.SPDY_SUPPORTED "true" }}spdy{{ end }};
listen {{ if $.NGINX_BIND_ADDRESS_IP4 }}{{ $.NGINX_BIND_ADDRESS_IP4 }}:{{end}}{{ $listen_port }} ssl {{ if eq $.HTTP2_SUPPORTED "true" }}http2{{ else if eq $.SPDY_SUPPORTED "true" }}spdy{{ end }};
{{ if $.SSL_SERVER_NAME }}server_name {{ $.SSL_SERVER_NAME }}; {{ end }}
{{ if $.NOSSL_SERVER_NAME }}server_name {{ $.NOSSL_SERVER_NAME }}; {{ end }}
access_log {{ $.NGINX_ACCESS_LOG_PATH }}{{ if and ($.NGINX_ACCESS_LOG_FORMAT) (ne $.NGINX_ACCESS_LOG_PATH "off") }} {{ $.NGINX_ACCESS_LOG_FORMAT }}{{ end }};
error_log {{ $.NGINX_ERROR_LOG_PATH }};
ssl_certificate {{ $.APP_SSL_PATH }}/server.crt;
ssl_certificate_key {{ $.APP_SSL_PATH }}/server.key;
ssl_protocols TLSv1.2 {{ if eq $.TLS13_SUPPORTED "true" }}TLSv1.3{{ end }};
ssl_prefer_server_ciphers off;
keepalive_timeout 70;
{{ if and (eq $.SPDY_SUPPORTED "true") (ne $.HTTP2_SUPPORTED "true") }}add_header Alternate-Protocol {{ $.PROXY_SSL_PORT }}:npn-spdy/2;{{ end }}
location / {
gzip on;
gzip_min_length 1100;
gzip_buffers 4 32k;
gzip_types text/css text/javascript text/xml text/plain text/x-component application/javascript application/x-javascript application/json application/xml application/rss+xml font/truetype application/x-font-ttf font/opentype application/vnd.ms-fontobject image/svg+xml;
gzip_vary on;
gzip_comp_level 6;
proxy_pass http://{{ $.APP }}-{{ $upstream_port }};
{{ if eq $.HTTP2_PUSH_SUPPORTED "true" }}http2_push_preload on; {{ end }}
proxy_http_version 1.1;
proxy_read_timeout {{ $.PROXY_READ_TIMEOUT }};
proxy_buffer_size {{ $.PROXY_BUFFER_SIZE }};
proxy_buffering {{ $.PROXY_BUFFERING }};
proxy_buffers {{ $.PROXY_BUFFERS }};
proxy_busy_buffers_size {{ $.PROXY_BUSY_BUFFERS_SIZE }};
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $http_connection;
proxy_set_header Host $http_host;
proxy_set_header X-Forwarded-For {{ $.PROXY_X_FORWARDED_FOR }};
proxy_set_header X-Forwarded-Port {{ $.PROXY_X_FORWARDED_PORT }};
proxy_set_header X-Forwarded-Proto {{ $.PROXY_X_FORWARDED_PROTO }};
proxy_set_header X-Request-Start $msec;
{{ if $.PROXY_X_FORWARDED_SSL }}proxy_set_header X-Forwarded-Ssl {{ $.PROXY_X_FORWARDED_SSL }};{{ end }}
if ($request_method = 'OPTIONS') {
add_header 'Access-Control-Allow-Origin' '$allow_origin' always;
add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
#
# Custom headers and headers various browsers *should* be OK with but aren't
#
add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization';
add_header 'Access-Control-Allow-Credentials' 'true';
#
# Tell client that this pre-flight info is valid for 20 days
#
add_header 'Access-Control-Max-Age' 1728000;
add_header 'Content-Type' 'text/plain; charset=utf-8';
add_header 'Content-Length' 0;
return 204;
}
if ($request_method = 'POST') {
add_header 'Access-Control-Allow-Origin' '$allow_origin' always;
add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS' always;
add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization' always;
add_header 'Access-Control-Expose-Headers' 'Content-Length,Content-Range' always;
add_header 'Access-Control-Allow-Credentials' 'true' always;
}
if ($request_method = 'GET') {
add_header 'Access-Control-Allow-Origin' '$allow_origin' always;
add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS' always;
add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization' always;
add_header 'Access-Control-Expose-Headers' 'Content-Length,Content-Range' always;
add_header 'Access-Control-Allow-Credentials' 'true' always;
}
}
{{ if $.CLIENT_MAX_BODY_SIZE }}client_max_body_size {{ $.CLIENT_MAX_BODY_SIZE }};{{ end }}
include {{ $.DOKKU_ROOT }}/{{ $.APP }}/nginx.conf.d/*.conf;
error_page 400 401 402 403 405 406 407 408 409 410 411 412 413 414 415 416 417 418 420 422 423 424 426 428 429 431 444 449 450 451 /400-error.html;
location /400-error.html {
root {{ $.DOKKU_LIB_ROOT }}/data/nginx-vhosts/dokku-errors;
internal;
}
error_page 404 /404-error.html;
location /404-error.html {
root {{ $.DOKKU_LIB_ROOT }}/data/nginx-vhosts/dokku-errors;
root /var/lib/dokku/data/nginx-vhosts/dokku-errors;
internal;
}
error_page 500 501 503 504 505 506 507 508 509 510 511 /500-error.html;
location /500-error.html {
root {{ $.DOKKU_LIB_ROOT }}/data/nginx-vhosts/dokku-errors;
root /var/lib/dokku/data/nginx-vhosts/dokku-errors;
internal;
}
error_page 502 /502-error.html;
location /502-error.html {
root {{ $.DOKKU_LIB_ROOT }}/data/nginx-vhosts/dokku-errors;
root /var/lib/dokku/data/nginx-vhosts/dokku-errors;
internal;
}
}
{{ else if eq $scheme "grpc"}}
{{ if eq $.GRPC_SUPPORTED "true"}}{{ if eq $.HTTP2_SUPPORTED "true"}}
server {
listen [{{ $.NGINX_BIND_ADDRESS_IP6 }}]:{{ $listen_port }} http2;
listen {{ if $.NGINX_BIND_ADDRESS_IP4 }}{{ $.NGINX_BIND_ADDRESS_IP4 }}:{{end}}{{ $listen_port }} http2;
{{ if $.NOSSL_SERVER_NAME }}server_name {{ $.NOSSL_SERVER_NAME }}; {{ end }}
access_log {{ $.NGINX_ACCESS_LOG_PATH }}{{ if and ($.NGINX_ACCESS_LOG_FORMAT) (ne $.NGINX_ACCESS_LOG_PATH "off") }} {{ $.NGINX_ACCESS_LOG_FORMAT }}{{ end }};
error_log {{ $.NGINX_ERROR_LOG_PATH }};
location / {
grpc_pass grpc://{{ $.APP }}-{{ $upstream_port }};
}
{{ if $.CLIENT_MAX_BODY_SIZE }}client_max_body_size {{ $.CLIENT_MAX_BODY_SIZE }};{{ end }}
# include /home/dokku/gateway/nginx.conf.d/*.conf;
include {{ $.DOKKU_ROOT }}/{{ $.APP }}/nginx.conf.d/*.conf;
}
{{ end }}{{ end }}
{{ else if eq $scheme "grpcs"}}
{{ if eq $.GRPC_SUPPORTED "true"}}{{ if eq $.HTTP2_SUPPORTED "true"}}
server {
listen [{{ $.NGINX_BIND_ADDRESS_IP6 }}]:{{ $listen_port }} ssl http2;
listen {{ if $.NGINX_BIND_ADDRESS_IP4 }}{{ $.NGINX_BIND_ADDRESS_IP4 }}:{{end}}{{ $listen_port }} ssl http2;
{{ if $.NOSSL_SERVER_NAME }}server_name {{ $.NOSSL_SERVER_NAME }}; {{ end }}
access_log {{ $.NGINX_ACCESS_LOG_PATH }}{{ if and ($.NGINX_ACCESS_LOG_FORMAT) (ne $.NGINX_ACCESS_LOG_PATH "off") }} {{ $.NGINX_ACCESS_LOG_FORMAT }}{{ end }};
error_log {{ $.NGINX_ERROR_LOG_PATH }};
ssl_certificate {{ $.APP_SSL_PATH }}/server.crt;
ssl_certificate_key {{ $.APP_SSL_PATH }}/server.key;
ssl_protocols TLSv1.2 {{ if eq $.TLS13_SUPPORTED "true" }}TLSv1.3{{ end }};
ssl_prefer_server_ciphers off;
location / {
grpc_pass grpc://{{ $.APP }}-{{ $upstream_port }};
}
{{ if $.CLIENT_MAX_BODY_SIZE }}client_max_body_size {{ $.CLIENT_MAX_BODY_SIZE }};{{ end }}
include {{ $.DOKKU_ROOT }}/{{ $.APP }}/nginx.conf.d/*.conf;
}
{{ end }}{{ end }}
{{ end }}
{{ end }}
{{ if $.DOKKU_APP_WEB_LISTENERS }}
{{ range $upstream_port := $.PROXY_UPSTREAM_PORTS | split " " }}
upstream {{ $.APP }}-{{ $upstream_port }} {
{{ range $listeners := $.DOKKU_APP_WEB_LISTENERS | split " " }}
{{ $listener_list := $listeners | split ":" }}
{{ $listener_ip := index $listener_list 0 }}
server {{ $listener_ip }}:{{ $upstream_port }};{{ end }}
{{ $listener_port := index $listener_list 1 }}
server {{ $listener_ip }}:{{ $upstream_port }};
{{ end }}
}
{{ end }}{{ end }}
{{ end }}


@@ -1,7 +1,7 @@
from base.orm import Base, engine
from orm.community import Community
from orm.notification import Notification
from orm.rbac import Operation, Resource, Permission, Role
from orm.rbac import Operation, Permission, Resource, Role
from orm.reaction import Reaction
from orm.shout import Shout
from orm.topic import Topic, TopicFollower
@@ -32,5 +32,5 @@ __all__ = [
"Notification",
"Reaction",
"UserRating",
"init_tables"
"init_tables",
]


@@ -1,6 +1,4 @@
from datetime import datetime
from sqlalchemy import Column, DateTime, ForeignKey, String
from sqlalchemy import Column, DateTime, ForeignKey, String, func
from base.orm import Base
@@ -8,7 +6,7 @@ from base.orm import Base
class ShoutCollection(Base):
__tablename__ = "shout_collection"
id = None # type: ignore
id = None
shout = Column(ForeignKey("shout.id"), primary_key=True)
collection = Column(ForeignKey("collection.id"), primary_key=True)
@@ -20,6 +18,6 @@ class Collection(Base):
title = Column(String, nullable=False, comment="Title")
body = Column(String, nullable=True, comment="Body")
pic = Column(String, nullable=True, comment="Picture")
createdAt = Column(DateTime, default=datetime.now, comment="Created At")
createdAt = Column(DateTime(timezone=True), server_default=func.now(), comment="Created At")
createdBy = Column(ForeignKey("user.id"), comment="Created By")
publishedAt = Column(DateTime, default=datetime.now, comment="Published At")
publishedAt = Column(DateTime(timezone=True), server_default=func.now(), comment="Published At")
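This file and the models below all apply one recipe: DateTime(timezone=True) plus server_default=func.now(). With default=datetime.now the timestamp is computed in the Python process at flush time and is naive; with server_default the database emits DEFAULT now(), so rows get consistent, timezone-aware stamps regardless of which app instance inserts them. A minimal side-by-side sketch with stock SQLAlchemy:

    from datetime import datetime
    from sqlalchemy import Column, DateTime, Integer, func
    from sqlalchemy.orm import declarative_base

    Base = declarative_base()

    class Example(Base):
        __tablename__ = "example"
        id = Column(Integer, primary_key=True)
        # old style: naive datetime computed by the application at INSERT time
        clientStamped = Column(DateTime, default=datetime.now)
        # new style: timezone-aware, computed by the database (SQL DEFAULT now())
        serverStamped = Column(DateTime(timezone=True), server_default=func.now())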


@@ -1,17 +1,16 @@
from datetime import datetime
from sqlalchemy import Column, DateTime, ForeignKey, String, func
from sqlalchemy import Column, String, ForeignKey, DateTime
from base.orm import Base, local_session
class CommunityFollower(Base):
__tablename__ = "community_followers"
id = None # type: ignore
follower = Column(ForeignKey("user.id"), primary_key=True)
community = Column(ForeignKey("community.id"), primary_key=True)
id = None
follower: Column = Column(ForeignKey("user.id"), primary_key=True)
community: Column = Column(ForeignKey("community.id"), primary_key=True)
joinedAt = Column(
DateTime, nullable=False, default=datetime.now, comment="Created at"
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
# role = Column(ForeignKey(Role.id), nullable=False, comment="Role for member")
@@ -24,18 +23,16 @@ class Community(Base):
desc = Column(String, nullable=False, default="")
pic = Column(String, nullable=False, default="")
createdAt = Column(
DateTime, nullable=False, default=datetime.now, comment="Created at"
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
@staticmethod
def init_table():
with local_session() as session:
d = (
session.query(Community).filter(Community.slug == "discours").first()
)
d = session.query(Community).filter(Community.slug == "discours").first()
if not d:
d = Community.create(name="Дискурс", slug="discours")
session.add(d)
session.commit()
Community.default_community = d
print('[orm] default community id: %s' % d.id)
print("[orm] default community id: %s" % d.id)


@@ -1,9 +1,9 @@
from datetime import datetime
from sqlalchemy import Column, Enum, ForeignKey, DateTime, Boolean, Integer
from enum import Enum as Enumeration
from sqlalchemy import Boolean, Column, DateTime, Enum, ForeignKey, Integer, func
from sqlalchemy.dialects.postgresql import JSONB
from base.orm import Base
from enum import Enum as Enumeration
class NotificationType(Enumeration):
@@ -14,10 +14,12 @@ class NotificationType(Enumeration):
class Notification(Base):
__tablename__ = "notification"
shout = Column(ForeignKey("shout.id"), index=True)
reaction = Column(ForeignKey("reaction.id"), index=True)
user = Column(ForeignKey("user.id"), index=True)
createdAt = Column(DateTime, nullable=False, default=datetime.now, index=True)
shout: Column = Column(ForeignKey("shout.id"), index=True)
reaction: Column = Column(ForeignKey("reaction.id"), index=True)
user: Column = Column(ForeignKey("user.id"), index=True)
createdAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), index=True
)
seen = Column(Boolean, nullable=False, default=False, index=True)
type = Column(Enum(NotificationType), nullable=False)
data = Column(JSONB, nullable=True)


@@ -1,9 +1,9 @@
import warnings
from sqlalchemy import String, Column, ForeignKey, UniqueConstraint, TypeDecorator
from sqlalchemy import Column, ForeignKey, String, TypeDecorator, UniqueConstraint
from sqlalchemy.orm import relationship
from base.orm import Base, REGISTRY, engine, local_session
from base.orm import REGISTRY, Base, local_session
# Role Based Access Control #
@@ -121,16 +121,23 @@ class Operation(Base):
class Resource(Base):
__tablename__ = "resource"
resourceClass = Column(
String, nullable=False, unique=True, comment="Resource class"
)
resourceClass = Column(String, nullable=False, unique=True, comment="Resource class")
name = Column(String, nullable=False, unique=True, comment="Resource name")
# TODO: community = Column(ForeignKey())
@staticmethod
def init_table():
with local_session() as session:
for res in ["shout", "topic", "reaction", "chat", "message", "invite", "community", "user"]:
for res in [
"shout",
"topic",
"reaction",
"chat",
"message",
"invite",
"community",
"user",
]:
r = session.query(Resource).filter(Resource.name == res).first()
if not r:
r = Resource.create(name=res, resourceClass=res)
@@ -145,29 +152,27 @@ class Permission(Base):
{"extend_existing": True},
)
role = Column(
ForeignKey("role.id", ondelete="CASCADE"), nullable=False, comment="Role"
)
operation = Column(
role: Column = Column(ForeignKey("role.id", ondelete="CASCADE"), nullable=False, comment="Role")
operation: Column = Column(
ForeignKey("operation.id", ondelete="CASCADE"),
nullable=False,
comment="Operation",
)
resource = Column(
resource: Column = Column(
ForeignKey("resource.id", ondelete="CASCADE"),
nullable=False,
comment="Resource",
)
if __name__ == "__main__":
Base.metadata.create_all(engine)
ops = [
Permission(role=1, operation=1, resource=1),
Permission(role=1, operation=2, resource=1),
Permission(role=1, operation=3, resource=1),
Permission(role=1, operation=4, resource=1),
Permission(role=2, operation=4, resource=1),
]
global_session.add_all(ops)
global_session.commit()
# if __name__ == "__main__":
# Base.metadata.create_all(engine)
# ops = [
# Permission(role=1, operation=1, resource=1),
# Permission(role=1, operation=2, resource=1),
# Permission(role=1, operation=3, resource=1),
# Permission(role=1, operation=4, resource=1),
# Permission(role=2, operation=4, resource=1),
# ]
# global_session.add_all(ops)
# global_session.commit()
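The role: Column = Column(...) spelling recurring in these models is a typing aid, matching the sqlalchemy-stubs pin that appears in the dev requirements later in this diff: without the explicit annotation, a checker may type the attribute by its descriptor value and reject filter expressions. A hedged sketch of the convention; the runtime mapping is unchanged:

    from sqlalchemy import Column, ForeignKey, Integer
    from sqlalchemy.orm import declarative_base

    Base = declarative_base()

    class PermissionSketch(Base):
        __tablename__ = "permission_sketch"
        id = Column(Integer, primary_key=True)
        # the ": Column" annotation keeps mypy + sqlalchemy-stubs happy on
        # expressions like query(PermissionSketch).filter(PermissionSketch.role == 1)
        role: Column = Column(ForeignKey("role.id", ondelete="CASCADE"), nullable=False)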


@@ -1,7 +1,6 @@
from datetime import datetime
from enum import Enum as Enumeration
from sqlalchemy import Column, DateTime, Enum, ForeignKey, String
from sqlalchemy import Column, DateTime, Enum, ForeignKey, String, func
from base.orm import Base
@@ -28,15 +27,19 @@ class Reaction(Base):
__tablename__ = "reaction"
body = Column(String, nullable=True, comment="Reaction Body")
createdAt = Column(
DateTime, nullable=False, default=datetime.now, comment="Created at"
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
createdBy = Column(ForeignKey("user.id"), nullable=False, index=True, comment="Sender")
updatedAt = Column(DateTime, nullable=True, comment="Updated at")
updatedBy = Column(ForeignKey("user.id"), nullable=True, index=True, comment="Last Editor")
deletedAt = Column(DateTime, nullable=True, comment="Deleted at")
deletedBy = Column(ForeignKey("user.id"), nullable=True, index=True, comment="Deleted by")
shout = Column(ForeignKey("shout.id"), nullable=False, index=True)
replyTo = Column(
createdBy: Column = Column(ForeignKey("user.id"), nullable=False, index=True, comment="Sender")
updatedAt = Column(DateTime(timezone=True), nullable=True, comment="Updated at")
updatedBy: Column = Column(
ForeignKey("user.id"), nullable=True, index=True, comment="Last Editor"
)
deletedAt = Column(DateTime(timezone=True), nullable=True, comment="Deleted at")
deletedBy: Column = Column(
ForeignKey("user.id"), nullable=True, index=True, comment="Deleted by"
)
shout: Column = Column(ForeignKey("shout.id"), nullable=False, index=True)
replyTo: Column = Column(
ForeignKey("reaction.id"), nullable=True, comment="Reply to reaction ID"
)
range = Column(String, nullable=True, comment="Range in format <start index>:<end>")


@@ -1,6 +1,13 @@
from datetime import datetime
from sqlalchemy import Boolean, Column, DateTime, ForeignKey, Integer, String, JSON
from sqlalchemy import (
JSON,
Boolean,
Column,
DateTime,
ForeignKey,
Integer,
String,
func,
)
from sqlalchemy.orm import column_property, relationship
from base.orm import Base, local_session
@@ -12,44 +19,46 @@ from orm.user import User
class ShoutTopic(Base):
__tablename__ = "shout_topic"
id = None # type: ignore
shout = Column(ForeignKey("shout.id"), primary_key=True, index=True)
topic = Column(ForeignKey("topic.id"), primary_key=True, index=True)
id = None
shout: Column = Column(ForeignKey("shout.id"), primary_key=True, index=True)
topic: Column = Column(ForeignKey("topic.id"), primary_key=True, index=True)
class ShoutReactionsFollower(Base):
__tablename__ = "shout_reactions_followers"
id = None # type: ignore
follower = Column(ForeignKey("user.id"), primary_key=True, index=True)
shout = Column(ForeignKey("shout.id"), primary_key=True, index=True)
id = None
follower: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
shout: Column = Column(ForeignKey("shout.id"), primary_key=True, index=True)
auto = Column(Boolean, nullable=False, default=False)
createdAt = Column(
DateTime, nullable=False, default=datetime.now, comment="Created at"
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
deletedAt = Column(DateTime, nullable=True)
deletedAt = Column(DateTime(timezone=True), nullable=True)
class ShoutAuthor(Base):
__tablename__ = "shout_author"
id = None # type: ignore
shout = Column(ForeignKey("shout.id"), primary_key=True, index=True)
user = Column(ForeignKey("user.id"), primary_key=True, index=True)
caption = Column(String, nullable=True, default="")
id = None
shout: Column = Column(ForeignKey("shout.id"), primary_key=True, index=True)
user: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
caption: Column = Column(String, nullable=True, default="")
class Shout(Base):
__tablename__ = "shout"
# timestamps
createdAt = Column(DateTime, nullable=False, default=datetime.now, comment="Created at")
updatedAt = Column(DateTime, nullable=True, comment="Updated at")
publishedAt = Column(DateTime, nullable=True)
deletedAt = Column(DateTime, nullable=True)
createdAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
updatedAt = Column(DateTime(timezone=True), nullable=True, comment="Updated at")
publishedAt = Column(DateTime(timezone=True), nullable=True)
deletedAt = Column(DateTime(timezone=True), nullable=True)
createdBy = Column(ForeignKey("user.id"), comment="Created By")
deletedBy = Column(ForeignKey("user.id"), nullable=True)
createdBy: Column = Column(ForeignKey("user.id"), comment="Created By")
deletedBy: Column = Column(ForeignKey("user.id"), nullable=True)
slug = Column(String, unique=True)
cover = Column(String, nullable=True, comment="Cover image url")
@@ -71,11 +80,11 @@ class Shout(Base):
reactions = relationship(lambda: Reaction)
# TODO: these field should be used or modified
community = Column(ForeignKey("community.id"), default=1)
lang = Column(String, nullable=False, default='ru', comment="Language")
mainTopic = Column(ForeignKey("topic.slug"), nullable=True)
community: Column = Column(ForeignKey("community.id"), default=1)
lang = Column(String, nullable=False, default="ru", comment="Language")
mainTopic: Column = Column(ForeignKey("topic.slug"), nullable=True)
visibility = Column(String, nullable=True) # owner authors community public
versionOf = Column(ForeignKey("shout.id"), nullable=True)
versionOf: Column = Column(ForeignKey("shout.id"), nullable=True)
oid = Column(String, nullable=True)
@staticmethod
@@ -83,12 +92,7 @@ class Shout(Base):
with local_session() as session:
s = session.query(Shout).first()
if not s:
entry = {
"slug": "genesis-block",
"body": "",
"title": "Ничего",
"lang": "ru"
}
entry = {"slug": "genesis-block", "body": "", "title": "Ничего", "lang": "ru"}
s = Shout.create(**entry)
session.add(s)
session.commit()


@@ -1,6 +1,4 @@
from datetime import datetime
from sqlalchemy import Boolean, Column, DateTime, ForeignKey, String
from sqlalchemy import Boolean, Column, DateTime, ForeignKey, String, func
from base.orm import Base
@@ -8,11 +6,11 @@ from base.orm import Base
class TopicFollower(Base):
__tablename__ = "topic_followers"
id = None # type: ignore
follower = Column(ForeignKey("user.id"), primary_key=True, index=True)
topic = Column(ForeignKey("topic.id"), primary_key=True, index=True)
id = None
follower: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
topic: Column = Column(ForeignKey("topic.id"), primary_key=True, index=True)
createdAt = Column(
DateTime, nullable=False, default=datetime.now, comment="Created at"
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
auto = Column(Boolean, nullable=False, default=False)
@@ -24,7 +22,5 @@ class Topic(Base):
title = Column(String, nullable=False, comment="Title")
body = Column(String, nullable=True, comment="Body")
pic = Column(String, nullable=True, comment="Picture")
community = Column(
ForeignKey("community.id"), default=1, comment="Community"
)
community: Column = Column(ForeignKey("community.id"), default=1, comment="Community")
oid = Column(String, nullable=True, comment="Old ID")


@@ -1,8 +1,7 @@
from datetime import datetime
from sqlalchemy import JSON as JSONType
from sqlalchemy import Boolean, Column, DateTime, ForeignKey, Integer, String
from sqlalchemy import Boolean, Column, DateTime, ForeignKey, Integer, String, func
from sqlalchemy.orm import relationship
from base.orm import Base, local_session
from orm.rbac import Role
@@ -10,10 +9,10 @@ from orm.rbac import Role
class UserRating(Base):
__tablename__ = "user_rating"
id = None # type: ignore
rater = Column(ForeignKey("user.id"), primary_key=True, index=True)
user = Column(ForeignKey("user.id"), primary_key=True, index=True)
value = Column(Integer)
id = None
rater: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
user: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
value: Column = Column(Integer)
@staticmethod
def init_table():
@@ -23,7 +22,7 @@ class UserRating(Base):
class UserRole(Base):
__tablename__ = "user_role"
id = None # type: ignore
id = None
user = Column(ForeignKey("user.id"), primary_key=True, index=True)
role = Column(ForeignKey("role.id"), primary_key=True, index=True)
@@ -31,11 +30,11 @@ class UserRole(Base):
class AuthorFollower(Base):
__tablename__ = "author_follower"
id = None # type: ignore
follower = Column(ForeignKey("user.id"), primary_key=True, index=True)
author = Column(ForeignKey("user.id"), primary_key=True, index=True)
id = None
follower: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
author: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
createdAt = Column(
DateTime, nullable=False, default=datetime.now, comment="Created at"
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
auto = Column(Boolean, nullable=False, default=False)
@@ -55,12 +54,12 @@ class User(Base):
muted = Column(Boolean, default=False)
emailConfirmed = Column(Boolean, default=False)
createdAt = Column(
DateTime, nullable=False, default=datetime.now, comment="Created at"
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
lastSeen = Column(
DateTime, nullable=False, default=datetime.now, comment="Was online at"
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Was online at"
)
deletedAt = Column(DateTime, nullable=True, comment="Deleted at")
deletedAt = Column(DateTime(timezone=True), nullable=True, comment="Deleted at")
links = Column(JSONType, nullable=True, comment="Links")
oauth = Column(String, nullable=True)
ratings = relationship(UserRating, foreign_keys=UserRating.user)
@@ -103,4 +102,4 @@ class User(Base):
# if __name__ == "__main__":
# print(User.get_permission(user_id=1)) # type: ignore
# print(User.get_permission(user_id=1))

poetry.lock (generated file, 1802 lines): diff suppressed because it is too large.


@@ -1,4 +1,8 @@
isort
brunette
flake8
mypy
black==23.10.1
flake8==6.1.0
gql_schema_codegen==1.0.1
isort==5.12.0
mypy==1.6.1
pre-commit==3.5.0
pymongo-stubs==0.2.0
sqlalchemy-stubs==0.4


@@ -1,40 +1,37 @@
python-frontmatter~=1.0.0
aioredis~=2.0.1
aiohttp
aiohttp==3.8.6
alembic==1.11.3
ariadne>=0.17.0
PyYAML>=5.4
pyjwt>=2.6.0
starlette~=0.23.1
sqlalchemy>=1.4.41
graphql-core>=3.0.3
gql~=3.4.0
uvicorn>=0.18.3
pydantic>=1.10.2
passlib~=1.7.4
authlib>=1.1.0
httpx>=0.23.0
psycopg2-binary
transliterate~=1.10.2
requests~=2.28.1
bcrypt>=4.0.0
bson~=0.5.10
flake8
DateTime~=4.7
asyncio~=3.4.3
python-dateutil~=2.8.2
authlib==1.2.1
bcrypt>=4.0.0
beautifulsoup4~=4.11.1
lxml
sentry-sdk>=1.14.0
# sse_starlette
graphql-ws
nltk~=3.8.1
pymystem3~=0.2.0
transformers~=4.28.1
boto3~=1.28.2
botocore~=1.31.2
python-multipart~=0.0.6
alembic==1.11.3
bson~=0.5.10
DateTime~=4.7
gql~=3.4.0
graphql-core>=3.0.3
httpx>=0.23.0
itsdangerous
lxml
Mako==1.2.4
MarkupSafe==2.1.3
nltk~=3.8.1
passlib~=1.7.4
psycopg2-binary
pydantic>=1.10.2
pyjwt>=2.6.0
pymystem3~=0.2.0
python-dateutil~=2.8.2
python-frontmatter~=1.0.0
python-multipart~=0.0.6
PyYAML>=5.4
requests~=2.28.1
sentry-sdk>=1.14.0
sqlalchemy>=1.4.41
sse-starlette==1.6.5
itsdangerous
starlette~=0.23.1
transliterate~=1.10.2
uvicorn>=0.18.3
redis


@@ -53,4 +53,3 @@ echo "Start migration"
python3 server.py migrate
if [ $? -ne 0 ]; then { echo "Migration failed, aborting." ; exit 1; } fi
echo 'Done!'


@@ -1,67 +1,46 @@
# flake8: noqa
from resolvers.auth import (
login,
sign_out,
is_email_used,
register_by_email,
confirm_email,
auth_send_link,
confirm_email,
get_current_user,
is_email_used,
login,
register_by_email,
sign_out,
)
from resolvers.create.migrate import markdown_body
from resolvers.create.editor import create_shout, delete_shout, update_shout
from resolvers.zine.profile import (
load_authors_by,
rate_user,
update_profile,
get_authors_all
)
from resolvers.zine.reactions import (
create_reaction,
delete_reaction,
update_reaction,
reactions_unfollow,
reactions_follow,
load_reactions_by
)
from resolvers.zine.topics import (
topic_follow,
topic_unfollow,
topics_by_author,
topics_by_community,
topics_all,
get_topic
)
from resolvers.zine.following import (
follow,
unfollow
)
from resolvers.zine.load import (
load_shout,
load_shouts_by
)
from resolvers.inbox.chats import (
create_chat,
delete_chat,
update_chat
)
from resolvers.inbox.chats import create_chat, delete_chat, update_chat
from resolvers.inbox.load import load_chats, load_messages_by, load_recipients
from resolvers.inbox.messages import (
create_message,
delete_message,
mark_as_read,
update_message,
mark_as_read
)
from resolvers.inbox.load import (
load_chats,
load_messages_by,
load_recipients
)
from resolvers.inbox.search import search_recipients
from resolvers.notifications import load_notifications
from resolvers.zine.following import follow, unfollow
from resolvers.zine.load import load_shout, load_shouts_by
from resolvers.zine.profile import (
get_authors_all,
load_authors_by,
rate_user,
update_profile,
)
from resolvers.zine.reactions import (
create_reaction,
delete_reaction,
load_reactions_by,
reactions_follow,
reactions_unfollow,
update_reaction,
)
from resolvers.zine.topics import (
get_topic,
topic_follow,
topic_unfollow,
topics_all,
topics_by_author,
topics_by_community,
)


@@ -1,25 +1,23 @@
# -*- coding: utf-8 -*-
import re
from datetime import datetime, timezone
from urllib.parse import quote_plus
from graphql.type import GraphQLResolveInfo
from starlette.responses import RedirectResponse
from transliterate import translit
import re
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from auth.email import send_auth_email
from auth.identity import Identity, Password
from auth.jwtcodec import JWTCodec
from auth.tokenstorage import TokenStorage
from base.exceptions import (BaseHttpException, InvalidPassword, InvalidToken,
ObjectNotExist, Unauthorized)
from base.exceptions import InvalidPassword, InvalidToken, ObjectNotExist, Unauthorized
from base.orm import local_session
from base.resolvers import mutation, query
from orm import Role, User
from resolvers.zine.profile import user_subscriptions
from settings import SESSION_TOKEN_HEADER, FRONTEND_URL
from settings import SESSION_TOKEN_HEADER
@mutation.field("getSession")
@@ -33,18 +31,14 @@ async def get_current_user(_, info):
user.lastSeen = datetime.now(tz=timezone.utc)
session.commit()
return {
"token": token,
"user": user,
"news": await user_subscriptions(user.id),
}
return {"token": token, "user": user}
@mutation.field("confirmEmail")
async def confirm_email(_, info, token):
"""confirm owning email address"""
try:
print('[resolvers.auth] confirm email by token')
print("[resolvers.auth] confirm email by token")
payload = JWTCodec.decode(token)
user_id = payload.user_id
await TokenStorage.get(f"{user_id}-{payload.username}-{token}")
@@ -55,11 +49,7 @@ async def confirm_email(_, info, token):
user.lastSeen = datetime.now(tz=timezone.utc)
session.add(user)
session.commit()
return {
"token": session_token,
"user": user,
"news": await user_subscriptions(user.id)
}
return {"token": session_token, "user": user}
except InvalidToken as e:
raise InvalidToken(e.message)
except Exception as e:
@@ -67,19 +57,6 @@ async def confirm_email(_, info, token):
return {"error": "email is not confirmed"}
async def confirm_email_handler(request):
token = request.path_params["token"] # one time
request.session["token"] = token
res = await confirm_email(None, {}, token)
print('[resolvers.auth] confirm_email request: %r' % request)
if "error" in res:
raise BaseHttpException(res['error'])
else:
response = RedirectResponse(url=FRONTEND_URL)
response.set_cookie("token", res["token"]) # session token
return response
def create_user(user_dict):
user = User(**user_dict)
with local_session() as session:
@@ -90,22 +67,22 @@ def create_user(user_dict):
def generate_unique_slug(src):
print('[resolvers.auth] generating slug from: ' + src)
print("[resolvers.auth] generating slug from: " + src)
slug = translit(src, "ru", reversed=True).replace(".", "-").lower()
slug = re.sub('[^0-9a-zA-Z]+', '-', slug)
slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
if slug != src:
print('[resolvers.auth] translited name: ' + slug)
print("[resolvers.auth] translited name: " + slug)
c = 1
with local_session() as session:
user = session.query(User).where(User.slug == slug).first()
while user:
user = session.query(User).where(User.slug == slug).first()
slug = slug + '-' + str(c)
slug = slug + "-" + str(c)
c += 1
if not user:
unique_slug = slug
print('[resolvers.auth] ' + unique_slug)
return quote_plus(unique_slug.replace('\'', '')).replace('+', '-')
print("[resolvers.auth] " + unique_slug)
return quote_plus(unique_slug.replace("'", "")).replace("+", "-")
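Taken together, the slug pipeline above is: transliterate the Cyrillic source, collapse non-alphanumerics to hyphens, then suffix a counter until the slug is unused. A minimal standalone sketch of the same idea, with the User.slug lookup stubbed as a plain set (the taken argument is hypothetical) and the counter suffix applied before re-checking, so the slug that is verified free is the one returned:

    import re
    from urllib.parse import quote_plus

    from transliterate import translit

    def slugify(src: str, taken: set) -> str:
        # "Проверка" -> "proverka"; anything non-alphanumeric collapses to "-"
        slug = translit(src, "ru", reversed=True).replace(".", "-").lower()
        slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
        base, c = slug, 1
        while slug in taken:  # stand-in for session.query(User).where(User.slug == slug)
            slug = f"{base}-{c}"
            c += 1
        return quote_plus(slug.replace("'", "")).replace("+", "-")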
@mutation.field("registerUser")
@@ -120,12 +97,12 @@ async def register_by_email(_, _info, email: str, password: str = "", name: str
slug = generate_unique_slug(name)
user = session.query(User).where(User.slug == slug).first()
if user:
slug = generate_unique_slug(email.split('@')[0])
slug = generate_unique_slug(email.split("@")[0])
user_dict = {
"email": email,
"username": email, # will be used to store phone number or some messenger network id
"name": name,
"slug": slug
"slug": slug,
}
if password:
user_dict["password"] = Password.encode(password)
@@ -175,11 +152,7 @@ async def login(_, info, email: str, password: str = "", lang: str = "ru"):
user = Identity.password(orm_user, password)
session_token = await TokenStorage.create_session(user)
print(f"[auth] user {email} authorized")
return {
"token": session_token,
"user": user,
"news": await user_subscriptions(user.id),
}
return {"token": session_token, "user": user}
except InvalidPassword:
print(f"[auth] {email}: invalid password")
raise InvalidPassword("invalid password") # contains webserver status

View File

@@ -18,21 +18,23 @@ async def create_shout(_, info, inp):
auth: AuthCredentials = info.context["request"].auth
with local_session() as session:
topics = session.query(Topic).filter(Topic.slug.in_(inp.get('topics', []))).all()
topics = session.query(Topic).filter(Topic.slug.in_(inp.get("topics", []))).all()
new_shout = Shout.create(**{
"title": inp.get("title"),
"subtitle": inp.get('subtitle'),
"lead": inp.get('lead'),
"description": inp.get('description'),
"body": inp.get("body", ''),
"layout": inp.get("layout"),
"authors": inp.get("authors", []),
"slug": inp.get("slug"),
"mainTopic": inp.get("mainTopic"),
"visibility": "owner",
"createdBy": auth.user_id
})
new_shout = Shout.create(
**{
"title": inp.get("title"),
"subtitle": inp.get("subtitle"),
"lead": inp.get("lead"),
"description": inp.get("description"),
"body": inp.get("body", ""),
"layout": inp.get("layout"),
"authors": inp.get("authors", []),
"slug": inp.get("slug"),
"mainTopic": inp.get("mainTopic"),
"visibility": "owner",
"createdBy": auth.user_id,
}
)
for topic in topics:
t = ShoutTopic.create(topic=topic.id, shout=new_shout.id)
@@ -60,14 +62,19 @@ async def create_shout(_, info, inp):
@mutation.field("updateShout")
@login_required
async def update_shout(_, info, shout_id, shout_input=None, publish=False):
async def update_shout(_, info, shout_id, shout_input=None, publish=False): # noqa: C901
auth: AuthCredentials = info.context["request"].auth
with local_session() as session:
shout = session.query(Shout).options(
joinedload(Shout.authors),
joinedload(Shout.topics),
).filter(Shout.id == shout_id).first()
shout = (
session.query(Shout)
.options(
joinedload(Shout.authors),
joinedload(Shout.topics),
)
.filter(Shout.id == shout_id)
.first()
)
if not shout:
return {"error": "shout not found"}
@@ -94,25 +101,34 @@ async def update_shout(_, info, shout_id, shout_input=None, publish=False):
session.commit()
for new_topic_to_link in new_topics_to_link:
created_unlinked_topic = ShoutTopic.create(shout=shout.id, topic=new_topic_to_link.id)
created_unlinked_topic = ShoutTopic.create(
shout=shout.id, topic=new_topic_to_link.id
)
session.add(created_unlinked_topic)
existing_topics_input = [topic_input for topic_input in topics_input if topic_input.get("id", 0) > 0]
existing_topic_to_link_ids = [existing_topic_input["id"] for existing_topic_input in existing_topics_input
if existing_topic_input["id"] not in [topic.id for topic in shout.topics]]
existing_topics_input = [
topic_input for topic_input in topics_input if topic_input.get("id", 0) > 0
]
existing_topic_to_link_ids = [
existing_topic_input["id"]
for existing_topic_input in existing_topics_input
if existing_topic_input["id"] not in [topic.id for topic in shout.topics]
]
for existing_topic_to_link_id in existing_topic_to_link_ids:
created_unlinked_topic = ShoutTopic.create(shout=shout.id, topic=existing_topic_to_link_id)
created_unlinked_topic = ShoutTopic.create(
shout=shout.id, topic=existing_topic_to_link_id
)
session.add(created_unlinked_topic)
topic_to_unlink_ids = [topic.id for topic in shout.topics
if topic.id not in [topic_input["id"] for topic_input in existing_topics_input]]
topic_to_unlink_ids = [
topic.id
for topic in shout.topics
if topic.id not in [topic_input["id"] for topic_input in existing_topics_input]
]
shout_topics_to_remove = session.query(ShoutTopic).filter(
and_(
ShoutTopic.shout == shout.id,
ShoutTopic.topic.in_(topic_to_unlink_ids)
)
and_(ShoutTopic.shout == shout.id, ShoutTopic.topic.in_(topic_to_unlink_ids))
)
for shout_topic_to_remove in shout_topics_to_remove:
@@ -120,13 +136,13 @@ async def update_shout(_, info, shout_id, shout_input=None, publish=False):
shout_input["mainTopic"] = shout_input["mainTopic"]["slug"]
if shout_input["mainTopic"] == '':
if shout_input["mainTopic"] == "":
del shout_input["mainTopic"]
shout.update(shout_input)
updated = True
if publish and shout.visibility == 'owner':
if publish and shout.visibility == "owner":
shout.visibility = "community"
shout.publishedAt = datetime.now(tz=timezone.utc)
updated = True
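The topic bookkeeping in updateShout above boils down to two set differences. With hypothetical ids, the rule the three list comprehensions implement:

    current = {1, 2, 3}             # ids already linked via shout.topics
    incoming = {2, 3, 5}            # ids from topics_input with id > 0
    to_link = incoming - current    # {5}: ShoutTopic rows created
    to_unlink = current - incoming  # {1}: ShoutTopic rows deleted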

View File

@@ -1,11 +0,0 @@
from base.resolvers import query
from resolvers.auth import login_required
from migration.extract import extract_md
@login_required
@query.field("markdownBody")
def markdown_body(_, info, body: str):
body = extract_md(body)
return body

View File

@@ -24,27 +24,24 @@ async def update_chat(_, info, chat_new: Chat):
chat_id = chat_new["id"]
chat = await redis.execute("GET", f"chats/{chat_id}")
if not chat:
return {
"error": "chat not exist"
}
return {"error": "chat not exist"}
chat = dict(json.loads(chat))
# TODO
if auth.user_id in chat["admins"]:
chat.update({
"title": chat_new.get("title", chat["title"]),
"description": chat_new.get("description", chat["description"]),
"updatedAt": int(datetime.now(tz=timezone.utc).timestamp()),
"admins": chat_new.get("admins", chat.get("admins") or []),
"users": chat_new.get("users", chat["users"])
})
chat.update(
{
"title": chat_new.get("title", chat["title"]),
"description": chat_new.get("description", chat["description"]),
"updatedAt": int(datetime.now(tz=timezone.utc).timestamp()),
"admins": chat_new.get("admins", chat.get("admins") or []),
"users": chat_new.get("users", chat["users"]),
}
)
await redis.execute("SET", f"chats/{chat.id}", json.dumps(chat))
await redis.execute("COMMIT")
return {
"error": None,
"chat": chat
}
return {"error": None, "chat": chat}
@mutation.field("createChat")
@@ -52,7 +49,7 @@ async def update_chat(_, info, chat_new: Chat):
async def create_chat(_, info, title="", members=[]):
auth: AuthCredentials = info.context["request"].auth
chat = {}
print('create_chat members: %r' % members)
print("create_chat members: %r" % members)
if auth.user_id not in members:
members.append(int(auth.user_id))
@@ -74,15 +71,12 @@ async def create_chat(_, info, title="", members=[]):
chat = await redis.execute("GET", f"chats/{c.decode('utf-8')}")
if chat:
chat = json.loads(chat)
if chat['title'] == "":
print('[inbox] createChat found old chat')
if chat["title"] == "":
print("[inbox] createChat found old chat")
print(chat)
break
if chat:
return {
"chat": chat,
"error": "existed"
}
return {"chat": chat, "error": "existed"}
chat_id = str(uuid.uuid4())
chat = {
@@ -92,7 +86,7 @@ async def create_chat(_, info, title="", members=[]):
"createdBy": auth.user_id,
"createdAt": int(datetime.now(tz=timezone.utc).timestamp()),
"updatedAt": int(datetime.now(tz=timezone.utc).timestamp()),
"admins": members if (len(members) == 2 and title == "") else []
"admins": members if (len(members) == 2 and title == "") else [],
}
for m in members:
@@ -100,10 +94,7 @@ async def create_chat(_, info, title="", members=[]):
await redis.execute("SET", f"chats/{chat_id}", json.dumps(chat))
await redis.execute("SET", f"chats/{chat_id}/next_message_id", str(0))
await redis.execute("COMMIT")
return {
"error": None,
"chat": chat
}
return {"error": None, "chat": chat}
@mutation.field("deleteChat")
@@ -114,11 +105,9 @@ async def delete_chat(_, info, chat_id: str):
chat = await redis.execute("GET", f"/chats/{chat_id}")
if chat:
chat = dict(json.loads(chat))
if auth.user_id in chat['admins']:
if auth.user_id in chat["admins"]:
await redis.execute("DEL", f"chats/{chat_id}")
await redis.execute("SREM", "chats_by_user/" + str(auth.user_id), chat_id)
await redis.execute("COMMIT")
else:
return {
"error": "chat not exist"
}
return {"error": "chat not exist"}

View File

@@ -1,28 +1,27 @@
import json
# from datetime import datetime, timedelta, timezone
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.redis import redis
from base.orm import local_session
from base.redis import redis
from base.resolvers import query
from orm.user import User
from resolvers.zine.profile import followed_authors
from .unread import get_unread_counter
# from datetime import datetime, timedelta, timezone
async def load_messages(chat_id: str, limit: int = 5, offset: int = 0, ids=[]):
''' load :limit messages for :chat_id with :offset '''
"""load :limit messages for :chat_id with :offset"""
messages = []
message_ids = []
if ids:
message_ids += ids
try:
if limit:
mids = await redis.lrange(f"chats/{chat_id}/message_ids",
offset,
offset + limit
)
mids = await redis.lrange(f"chats/{chat_id}/message_ids", offset, offset + limit)
mids = [mid.decode("utf-8") for mid in mids]
message_ids += mids
except Exception as e:
@@ -30,10 +29,10 @@ async def load_messages(chat_id: str, limit: int = 5, offset: int = 0, ids=[]):
if message_ids:
message_keys = [f"chats/{chat_id}/messages/{mid}" for mid in message_ids]
messages = await redis.mget(*message_keys)
messages = [json.loads(msg.decode('utf-8')) for msg in messages]
messages = [json.loads(msg.decode("utf-8")) for msg in messages]
replies = []
for m in messages:
rt = m.get('replyTo')
rt = m.get("replyTo")
if rt:
rt = int(rt)
if rt not in message_ids:
@@ -46,14 +45,14 @@ async def load_messages(chat_id: str, limit: int = 5, offset: int = 0, ids=[]):
@query.field("loadChats")
@login_required
async def load_chats(_, info, limit: int = 50, offset: int = 0):
""" load :limit chats of current user with :offset """
"""load :limit chats of current user with :offset"""
auth: AuthCredentials = info.context["request"].auth
cids = await redis.execute("SMEMBERS", "chats_by_user/" + str(auth.user_id))
if cids:
cids = list(cids)[offset:offset + limit]
cids = list(cids)[offset : offset + limit]
if not cids:
print('[inbox.load] no chats were found')
print("[inbox.load] no chats were found")
cids = []
onliners = await redis.execute("SMEMBERS", "users-online")
if not onliners:
@@ -64,62 +63,50 @@ async def load_chats(_, info, limit: int = 50, offset: int = 0):
c = await redis.execute("GET", "chats/" + cid)
if c:
c = dict(json.loads(c))
c['messages'] = await load_messages(cid, 5, 0)
c['unread'] = await get_unread_counter(cid, auth.user_id)
c["messages"] = await load_messages(cid, 5, 0)
c["unread"] = await get_unread_counter(cid, auth.user_id)
with local_session() as session:
c['members'] = []
c["members"] = []
for uid in c["users"]:
a = session.query(User).where(User.id == uid).first()
if a:
c['members'].append({
"id": a.id,
"slug": a.slug,
"userpic": a.userpic,
"name": a.name,
"lastSeen": a.lastSeen,
"online": a.id in onliners
})
c["members"].append(
{
"id": a.id,
"slug": a.slug,
"userpic": a.userpic,
"name": a.name,
"lastSeen": a.lastSeen,
"online": a.id in onliners,
}
)
chats.append(c)
return {
"chats": chats,
"error": None
}
return {"chats": chats, "error": None}
@query.field("loadMessagesBy")
@login_required
async def load_messages_by(_, info, by, limit: int = 10, offset: int = 0):
''' load :limit messages of :chat_id with :offset '''
"""load :limit messages of :chat_id with :offset"""
auth: AuthCredentials = info.context["request"].auth
userchats = await redis.execute("SMEMBERS", "chats_by_user/" + str(auth.user_id))
userchats = [c.decode('utf-8') for c in userchats]
userchats = [c.decode("utf-8") for c in userchats]
# print('[inbox] userchats: %r' % userchats)
if userchats:
# print('[inbox] loading messages by...')
messages = []
by_chat = by.get('chat')
by_chat = by.get("chat")
if by_chat in userchats:
chat = await redis.execute("GET", f"chats/{by_chat}")
# print(chat)
if not chat:
return {
"messages": [],
"error": "chat not exist"
}
return {"messages": [], "error": "chat not exist"}
# everyone's messages in filtered chat
messages = await load_messages(by_chat, limit, offset)
return {
"messages": sorted(
list(messages),
key=lambda m: m['createdAt']
),
"error": None
}
return {"messages": sorted(list(messages), key=lambda m: m["createdAt"]), "error": None}
else:
return {
"error": "Cannot access messages of this chat"
}
return {"error": "Cannot access messages of this chat"}
@query.field("loadRecipients")
@@ -138,15 +125,14 @@ async def load_recipients(_, info, limit=50, offset=0):
chat_users += session.query(User).where(User.emailConfirmed).limit(limit).offset(offset)
members = []
for a in chat_users:
members.append({
"id": a.id,
"slug": a.slug,
"userpic": a.userpic,
"name": a.name,
"lastSeen": a.lastSeen,
"online": a.id in onliners
})
return {
"members": members,
"error": None
}
members.append(
{
"id": a.id,
"slug": a.slug,
"userpic": a.userpic,
"name": a.name,
"lastSeen": a.lastSeen,
"online": a.id in onliners,
}
)
return {"members": members, "error": None}

View File

@@ -1,62 +1,54 @@
import asyncio
import json
from typing import Any
from datetime import datetime, timezone
from graphql.type import GraphQLResolveInfo
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.redis import redis
from base.resolvers import mutation
from services.following import FollowingManager, FollowingResult, Following
from validations.inbox import Message
from services.following import FollowingManager, FollowingResult
@mutation.field("createMessage")
@login_required
async def create_message(_, info, chat: str, body: str, replyTo=None):
""" create message with :body for :chat_id replying to :replyTo optionally """
"""create message with :body for :chat_id replying to :replyTo optionally"""
auth: AuthCredentials = info.context["request"].auth
chat = await redis.execute("GET", f"chats/{chat}")
if not chat:
return {
"error": "chat is not exist"
}
return {"error": "chat is not exist"}
else:
chat = dict(json.loads(chat))
message_id = await redis.execute("GET", f"chats/{chat['id']}/next_message_id")
chat_dict = dict(json.loads(chat))
message_id = await redis.execute("GET", f"chats/{chat_dict['id']}/next_message_id")
message_id = int(message_id)
new_message = {
"chatId": chat['id'],
"chatId": chat_dict["id"],
"id": message_id,
"author": auth.user_id,
"body": body,
"createdAt": int(datetime.now(tz=timezone.utc).timestamp())
"createdAt": int(datetime.now(tz=timezone.utc).timestamp()),
}
if replyTo:
new_message['replyTo'] = replyTo
chat['updatedAt'] = new_message['createdAt']
await redis.execute("SET", f"chats/{chat['id']}", json.dumps(chat))
new_message["replyTo"] = replyTo
chat_dict["updatedAt"] = new_message["createdAt"]
await redis.execute("SET", f"chats/{chat_dict['id']}", json.dumps(chat))
print(f"[inbox] creating message {new_message}")
await redis.execute(
"SET", f"chats/{chat['id']}/messages/{message_id}", json.dumps(new_message)
"SET", f"chats/{chat_dict['id']}/messages/{message_id}", json.dumps(new_message)
)
await redis.execute("LPUSH", f"chats/{chat['id']}/message_ids", str(message_id))
await redis.execute("SET", f"chats/{chat['id']}/next_message_id", str(message_id + 1))
await redis.execute("LPUSH", f"chats/{chat_dict['id']}/message_ids", str(message_id))
await redis.execute("SET", f"chats/{chat_dict['id']}/next_message_id", str(message_id + 1))
users = chat["users"]
users = chat_dict["users"]
for user_slug in users:
await redis.execute(
"LPUSH", f"chats/{chat['id']}/unread/{user_slug}", str(message_id)
"LPUSH", f"chats/{chat_dict['id']}/unread/{user_slug}", str(message_id)
)
result = FollowingResult("NEW", 'chat', new_message)
await FollowingManager.push('chat', result)
result = FollowingResult("NEW", "chat", new_message)
await FollowingManager.push("chat", result)
return {
"message": new_message,
"error": None
}
return {"message": new_message, "error": None}
@mutation.field("updateMessage")
@@ -81,13 +73,10 @@ async def update_message(_, info, chat_id: str, message_id: int, body: str):
await redis.execute("SET", f"chats/{chat_id}/messages/{message_id}", json.dumps(message))
result = FollowingResult("UPDATED", 'chat', message)
await FollowingManager.push('chat', result)
result = FollowingResult("UPDATED", "chat", message)
await FollowingManager.push("chat", result)
return {
"message": message,
"error": None
}
return {"message": message, "error": None}
@mutation.field("deleteMessage")
@@ -114,7 +103,7 @@ async def delete_message(_, info, chat_id: str, message_id: int):
for user_id in users:
await redis.execute("LREM", f"chats/{chat_id}/unread/{user_id}", 0, str(message_id))
result = FollowingResult("DELETED", 'chat', message)
result = FollowingResult("DELETED", "chat", message)
await FollowingManager.push("chat", result)
return {}
@@ -137,6 +126,4 @@ async def mark_as_read(_, info, chat_id: str, messages: [int]):
for message_id in messages:
await redis.execute("LREM", f"chats/{chat_id}/unread/{auth.user_id}", 0, str(message_id))
return {
"error": None
}
return {"error": None}

View File

@@ -1,10 +1,11 @@
import json
from datetime import datetime, timezone, timedelta
from datetime import datetime, timedelta, timezone
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.orm import local_session
from base.redis import redis
from base.resolvers import query
from base.orm import local_session
from orm.user import AuthorFollower, User
from resolvers.inbox.load import load_messages
@@ -17,7 +18,7 @@ async def search_recipients(_, info, query: str, limit: int = 50, offset: int =
auth: AuthCredentials = info.context["request"].auth
talk_before = await redis.execute("GET", f"/chats_by_user/{auth.user_id}")
if talk_before:
talk_before = list(json.loads(talk_before))[offset:offset + limit]
talk_before = list(json.loads(talk_before))[offset : offset + limit]
for chat_id in talk_before:
members = await redis.execute("GET", f"/chats/{chat_id}/users")
if members:
@@ -31,23 +32,24 @@ async def search_recipients(_, info, query: str, limit: int = 50, offset: int =
with local_session() as session:
# followings
result += session.query(AuthorFollower.author).join(
User, User.id == AuthorFollower.follower
).where(
User.slug.startswith(query)
).offset(offset + len(result)).limit(more_amount)
result += (
session.query(AuthorFollower.author)
.join(User, User.id == AuthorFollower.follower)
.where(User.slug.startswith(query))
.offset(offset + len(result))
.limit(more_amount)
)
more_amount = limit
# followers
result += session.query(AuthorFollower.follower).join(
User, User.id == AuthorFollower.author
).where(
User.slug.startswith(query)
).offset(offset + len(result)).limit(offset + len(result) + limit)
return {
"members": list(result),
"error": None
}
result += (
session.query(AuthorFollower.follower)
.join(User, User.id == AuthorFollower.author)
.where(User.slug.startswith(query))
.offset(offset + len(result))
.limit(offset + len(result) + limit)
)
return {"members": list(result), "error": None}
@query.field("searchMessages")
@@ -57,22 +59,22 @@ async def search_user_chats(by, messages, user_id: int, limit, offset):
cids.union(set(await redis.execute("SMEMBERS", "chats_by_user/" + str(user_id))))
messages = []
by_author = by.get('author')
by_author = by.get("author")
if by_author:
# all author's messages
cids.union(set(await redis.execute("SMEMBERS", f"chats_by_user/{by_author}")))
# author's messages in filtered chat
messages.union(set(filter(lambda m: m["author"] == by_author, list(messages))))
for c in cids:
c = c.decode('utf-8')
c = c.decode("utf-8")
messages = await load_messages(c, limit, offset)
body_like = by.get('body')
body_like = by.get("body")
if body_like:
# search in all messages in all user's chats
for c in cids:
# FIXME: use redis scan here
c = c.decode('utf-8')
c = c.decode("utf-8")
mmm = await load_messages(c, limit, offset)
for m in mmm:
if body_like in m["body"]:
@@ -83,13 +85,12 @@ async def search_user_chats(by, messages, user_id: int, limit, offset):
days = by.get("days")
if days:
messages.extend(filter(
list(messages),
key=lambda m: (
datetime.now(tz=timezone.utc) - int(m["createdAt"]) < timedelta(days=by["days"])
))
return {
"messages": messages,
"error": None
}
messages = list(
filter(
# keep only messages from the last N days; createdAt is epoch seconds
lambda m: (
datetime.now(tz=timezone.utc).timestamp() - int(m["createdAt"])
< timedelta(days=by["days"]).total_seconds()
),
messages,
)
)
return {"messages": messages, "error": None}

View File

@@ -1,5 +1,4 @@
from base.redis import redis
import json
async def get_unread_counter(chat_id: str, user_id: int):
@@ -9,14 +8,3 @@ async def get_unread_counter(chat_id: str, user_id: int):
return unread
except Exception:
return 0
async def get_total_unread_counter(user_id: int):
chats = await redis.execute("GET", f"chats_by_user/{str(user_id)}")
unread = 0
if chats:
chats = json.loads(chats)
for chat_id in chats:
n = await get_unread_counter(chat_id.decode('utf-8'), user_id)
unread += n
return unread

View File

@@ -1,9 +1,9 @@
from sqlalchemy import select, desc, and_, update
from sqlalchemy import and_, desc, select, update
from auth.credentials import AuthCredentials
from base.resolvers import query, mutation
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.orm import local_session
from base.resolvers import mutation, query
from orm import Notification
@@ -16,25 +16,26 @@ async def load_notifications(_, info, params=None):
auth: AuthCredentials = info.context["request"].auth
user_id = auth.user_id
limit = params.get('limit', 50)
offset = params.get('offset', 0)
limit = params.get("limit", 50)
offset = params.get("offset", 0)
q = select(Notification).where(
Notification.user == user_id
).order_by(desc(Notification.createdAt)).limit(limit).offset(offset)
q = (
select(Notification)
.where(Notification.user == user_id)
.order_by(desc(Notification.createdAt))
.limit(limit)
.offset(offset)
)
notifications = []
with local_session() as session:
total_count = session.query(Notification).where(
Notification.user == user_id
).count()
total_count = session.query(Notification).where(Notification.user == user_id).count()
total_unread_count = session.query(Notification).where(
and_(
Notification.user == user_id,
Notification.seen == False
)
).count()
total_unread_count = (
session.query(Notification)
.where(and_(Notification.user == user_id, Notification.seen == False)) # noqa: E712
.count()
)
for [notification] in session.execute(q):
notification.type = notification.type.name
@@ -43,7 +44,7 @@ async def load_notifications(_, info, params=None):
return {
"notifications": notifications,
"totalCount": total_count,
"totalUnreadCount": total_unread_count
"totalUnreadCount": total_unread_count,
}
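A note on the # noqa: E712 markers above: Notification.seen == False has to stay an == expression so SQLAlchemy can compile it to SQL, which is exactly what flake8's E712 dislikes. If the noqa comments are unwanted, the lint-clean equivalent is the .is_() operator on the same column:

    # compiles to "notification.seen IS false" without tripping E712
    q = q.where(Notification.seen.is_(False))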
@@ -54,9 +55,11 @@ async def mark_notification_as_read(_, info, notification_id: int):
user_id = auth.user_id
with local_session() as session:
notification = session.query(Notification).where(
and_(Notification.id == notification_id, Notification.user == user_id)
).one()
notification = (
session.query(Notification)
.where(and_(Notification.id == notification_id, Notification.user == user_id))
.one()
)
notification.seen = True
session.commit()
@@ -69,12 +72,11 @@ async def mark_all_notifications_as_read(_, info):
auth: AuthCredentials = info.context["request"].auth
user_id = auth.user_id
statement = update(Notification).where(
and_(
Notification.user == user_id,
Notification.seen == False
)
).values(seen=True)
statement = (
update(Notification)
.where(and_(Notification.user == user_id, Notification.seen == False)) # noqa: E712
.values(seen=True)
)
with local_session() as session:
try:

View File

@@ -2,33 +2,36 @@ import os
import shutil
import tempfile
import uuid
import boto3
from botocore.exceptions import BotoCoreError, ClientError
from starlette.responses import JSONResponse
STORJ_ACCESS_KEY = os.environ.get('STORJ_ACCESS_KEY')
STORJ_SECRET_KEY = os.environ.get('STORJ_SECRET_KEY')
STORJ_END_POINT = os.environ.get('STORJ_END_POINT')
STORJ_BUCKET_NAME = os.environ.get('STORJ_BUCKET_NAME')
CDN_DOMAIN = os.environ.get('CDN_DOMAIN')
STORJ_ACCESS_KEY = os.environ.get("STORJ_ACCESS_KEY")
STORJ_SECRET_KEY = os.environ.get("STORJ_SECRET_KEY")
STORJ_END_POINT = os.environ.get("STORJ_END_POINT")
STORJ_BUCKET_NAME = os.environ.get("STORJ_BUCKET_NAME")
CDN_DOMAIN = os.environ.get("CDN_DOMAIN")
async def upload_handler(request):
form = await request.form()
file = form.get('file')
file = form.get("file")
if file is None:
return JSONResponse({'error': 'No file uploaded'}, status_code=400)
return JSONResponse({"error": "No file uploaded"}, status_code=400)
file_name, file_extension = os.path.splitext(file.filename)
key = str(uuid.uuid4()) + file_extension
key = "files/" + str(uuid.uuid4()) + file_extension
# Create an S3 client with Storj configuration
s3 = boto3.client('s3',
aws_access_key_id=STORJ_ACCESS_KEY,
aws_secret_access_key=STORJ_SECRET_KEY,
endpoint_url=STORJ_END_POINT)
s3 = boto3.client(
"s3",
aws_access_key_id=STORJ_ACCESS_KEY,
aws_secret_access_key=STORJ_SECRET_KEY,
endpoint_url=STORJ_END_POINT,
)
try:
# Save the uploaded file to a temporary file
@@ -39,18 +42,13 @@ async def upload_handler(request):
Filename=tmp_file.name,
Bucket=STORJ_BUCKET_NAME,
Key=key,
ExtraArgs={
"ContentType": file.content_type
}
ExtraArgs={"ContentType": file.content_type},
)
url = 'http://' + CDN_DOMAIN + '/' + key
url = "https://" + CDN_DOMAIN + "/" + key
return JSONResponse({'url': url, 'originalFilename': file.filename})
return JSONResponse({"url": url, "originalFilename": file.filename})
except (BotoCoreError, ClientError) as e:
print(e)
return JSONResponse({'error': 'Failed to upload file'}, status_code=500)
return JSONResponse({"error": "Failed to upload file"}, status_code=500)

View File

@@ -1,41 +1,36 @@
import asyncio
from base.orm import local_session
from base.resolvers import mutation
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.resolvers import mutation
# from resolvers.community import community_follow, community_unfollow
from orm.user import AuthorFollower
from orm.topic import TopicFollower
from orm.shout import ShoutReactionsFollower
from resolvers.zine.profile import author_follow, author_unfollow
from resolvers.zine.reactions import reactions_follow, reactions_unfollow
from resolvers.zine.topics import topic_follow, topic_unfollow
from services.following import Following, FollowingManager, FollowingResult
from graphql.type import GraphQLResolveInfo
from services.following import FollowingManager, FollowingResult
@mutation.field("follow")
@login_required
async def follow(_, info, what, slug):
async def follow(_, info, what, slug): # noqa: C901
auth: AuthCredentials = info.context["request"].auth
try:
if what == "AUTHOR":
if author_follow(auth.user_id, slug):
result = FollowingResult("NEW", 'author', slug)
await FollowingManager.push('author', result)
result = FollowingResult("NEW", "author", slug)
await FollowingManager.push("author", result)
elif what == "TOPIC":
if topic_follow(auth.user_id, slug):
result = FollowingResult("NEW", 'topic', slug)
await FollowingManager.push('topic', result)
result = FollowingResult("NEW", "topic", slug)
await FollowingManager.push("topic", result)
elif what == "COMMUNITY":
if False: # TODO: use community_follow(auth.user_id, slug):
result = FollowingResult("NEW", 'community', slug)
await FollowingManager.push('community', result)
result = FollowingResult("NEW", "community", slug)
await FollowingManager.push("community", result)
elif what == "REACTIONS":
if reactions_follow(auth.user_id, slug):
result = FollowingResult("NEW", 'shout', slug)
await FollowingManager.push('shout', result)
result = FollowingResult("NEW", "shout", slug)
await FollowingManager.push("shout", result)
except Exception as e:
print(Exception(e))
return {"error": str(e)}
@@ -45,26 +40,26 @@ async def follow(_, info, what, slug):
@mutation.field("unfollow")
@login_required
async def unfollow(_, info, what, slug):
async def unfollow(_, info, what, slug): # noqa: C901
auth: AuthCredentials = info.context["request"].auth
try:
if what == "AUTHOR":
if author_unfollow(auth.user_id, slug):
result = FollowingResult("DELETED", 'author', slug)
await FollowingManager.push('author', result)
result = FollowingResult("DELETED", "author", slug)
await FollowingManager.push("author", result)
elif what == "TOPIC":
if topic_unfollow(auth.user_id, slug):
result = FollowingResult("DELETED", 'topic', slug)
await FollowingManager.push('topic', result)
result = FollowingResult("DELETED", "topic", slug)
await FollowingManager.push("topic", result)
elif what == "COMMUNITY":
if False: # TODO: use community_unfollow(auth.user_id, slug):
result = FollowingResult("DELETED", 'community', slug)
await FollowingManager.push('community', result)
result = FollowingResult("DELETED", "community", slug)
await FollowingManager.push("community", result)
elif what == "REACTIONS":
if reactions_unfollow(auth.user_id, slug):
result = FollowingResult("DELETED", 'shout', slug)
await FollowingManager.push('shout', result)
result = FollowingResult("DELETED", "shout", slug)
await FollowingManager.push("shout", result)
except Exception as e:
return {"error": str(e)}

View File

@@ -1,33 +1,50 @@
from datetime import datetime, timedelta, timezone
import json
from datetime import datetime, timedelta
from sqlalchemy.orm import joinedload, aliased
from sqlalchemy.sql.expression import desc, asc, select, func, case, and_, text, nulls_last
from sqlalchemy.orm import aliased, joinedload
from sqlalchemy.sql.expression import (
and_,
asc,
case,
desc,
distinct,
func,
nulls_last,
select,
)
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.exceptions import ObjectNotExist, OperationNotAllowed
from base.exceptions import ObjectNotExist
from base.orm import local_session
from base.resolvers import query
from orm import TopicFollower
from orm.reaction import Reaction, ReactionKind
from orm.shout import Shout, ShoutAuthor, ShoutTopic
from orm.user import AuthorFollower
from resolvers.zine.topics import get_random_topic
def add_stat_columns(q):
aliased_reaction = aliased(Reaction)
q = q.outerjoin(aliased_reaction).add_columns(
func.sum(
aliased_reaction.id
).label('reacted_stat'),
func.sum(
case(
(aliased_reaction.kind == ReactionKind.COMMENT, 1),
else_=0
)
).label('commented_stat'),
func.sum(case(
def get_shouts_from_query(q):
shouts = []
with local_session() as session:
for [shout, reacted_stat, commented_stat, rating_stat, last_comment] in session.execute(
q
).unique():
shouts.append(shout)
shout.stat = {
"viewed": shout.views,
"reacted": reacted_stat,
"commented": commented_stat,
"rating": rating_stat,
}
return shouts
def get_rating_func(aliased_reaction):
return func.sum(
case(
# do not count comments' reactions
(aliased_reaction.replyTo.is_not(None), 0),
(aliased_reaction.kind == ReactionKind.AGREE, 1),
@@ -38,47 +55,77 @@ def add_stat_columns(q):
(aliased_reaction.kind == ReactionKind.REJECT, -1),
(aliased_reaction.kind == ReactionKind.LIKE, 1),
(aliased_reaction.kind == ReactionKind.DISLIKE, -1),
else_=0)
).label('rating_stat'),
func.max(case(
(aliased_reaction.kind != ReactionKind.COMMENT, None),
else_=aliased_reaction.createdAt
)).label('last_comment'))
else_=0,
)
)
def add_stat_columns(q):
aliased_reaction = aliased(Reaction)
q = q.outerjoin(aliased_reaction).add_columns(
func.sum(aliased_reaction.id).label("reacted_stat"),
func.sum(case((aliased_reaction.kind == ReactionKind.COMMENT, 1), else_=0)).label(
"commented_stat"
),
get_rating_func(aliased_reaction).label("rating_stat"),
func.max(
case(
(aliased_reaction.kind != ReactionKind.COMMENT, None),
else_=aliased_reaction.createdAt,
)
).label("last_comment"),
)
return q
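To make the rating weights concrete: AGREE, PROOF, ACCEPT and LIKE count +1, their counterparts count -1, and anything with a replyTo is zeroed so reactions inside comment threads don't move the rating. A toy recomputation over hypothetical rows:

    WEIGHTS = {"AGREE": 1, "DISAGREE": -1, "PROOF": 1, "DISPROOF": -1,
               "ACCEPT": 1, "REJECT": -1, "LIKE": 1, "DISLIKE": -1}

    rows = [("LIKE", None), ("LIKE", None), ("AGREE", None),
            ("DISLIKE", None), ("LIKE", 42)]  # (kind, replyTo); the last is a reply
    rating = sum(0 if reply is not None else WEIGHTS.get(kind, 0) for kind, reply in rows)
    assert rating == 2  # 1 + 1 + 1 - 1, the reply reaction is ignored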
def apply_filters(q, filters, user_id=None):
# use_published_date is a quick fix; it will be reworked as part of the tech-debt cleanup
def apply_filters(q, filters, user_id=None, use_published_date=False): # noqa: C901
if filters.get("reacted") and user_id:
q.join(Reaction, Reaction.createdBy == user_id)
v = filters.get("visibility")
if v == "public":
q = q.filter(Shout.visibility == filters.get("visibility"))
q = q.filter(Shout.visibility == "public")
if v == "community":
q = q.filter(Shout.visibility.in_(["public", "community"]))
if filters.get("layout"):
q = q.filter(Shout.layout == filters.get("layout"))
if filters.get('excludeLayout'):
if filters.get("excludeLayout"):
q = q.filter(Shout.layout != filters.get("excludeLayout"))
if filters.get("author"):
q = q.filter(Shout.authors.any(slug=filters.get("author")))
if filters.get("topic"):
q = q.filter(Shout.topics.any(slug=filters.get("topic")))
if filters.get("title"):
q = q.filter(Shout.title.ilike(f'%{filters.get("title")}%'))
if filters.get("body"):
q = q.filter(Shout.body.ilike(f'%{filters.get("body")}%'))
if filters.get("days"):
before = datetime.now(tz=timezone.utc) - timedelta(days=int(filters.get("days")) or 30)
q = q.filter(Shout.createdAt > before)
if filters.get("fromDate"):
# fromDate: '2022-12-31'
date_from = datetime.strptime(filters.get("fromDate"), "%Y-%m-%d")
if use_published_date:
q = q.filter(Shout.publishedAt >= date_from)
else:
q = q.filter(Shout.createdAt >= date_from)
if filters.get("toDate"):
# toDate: '2023-12-31'
date_to = datetime.strptime(filters.get("toDate"), "%Y-%m-%d")
if use_published_date:
q = q.filter(Shout.publishedAt < (date_to + timedelta(days=1)))
else:
q = q.filter(Shout.createdAt < (date_to + timedelta(days=1)))
return q
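Note how toDate stays inclusive: the filter compares strictly below toDate plus one day, so the whole final day is kept. For example:

    from datetime import datetime, timedelta

    date_to = datetime.strptime("2023-12-31", "%Y-%m-%d")
    upper = date_to + timedelta(days=1)  # 2024-01-01 00:00:00
    # Shout.createdAt < upper admits everything through 2023-12-31 23:59:59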
@query.field("loadShout")
async def load_shout(_, info, slug=None, shout_id=None):
# for testing; will be removed soon
if slug == "testtesttest":
with open("test/test.json") as json_file:
test_shout = json.load(json_file)["data"]["loadShout"]
test_shout["createdAt"] = datetime.fromisoformat(test_shout["createdAt"])
test_shout["publishedAt"] = datetime.fromisoformat(test_shout["publishedAt"])
return test_shout
with local_session() as session:
q = select(Shout).options(
joinedload(Shout.authors),
@@ -87,27 +134,23 @@ async def load_shout(_, info, slug=None, shout_id=None):
q = add_stat_columns(q)
if slug is not None:
q = q.filter(
Shout.slug == slug
)
q = q.filter(Shout.slug == slug)
if shout_id is not None:
q = q.filter(
Shout.id == shout_id
)
q = q.filter(Shout.id == shout_id)
q = q.filter(
Shout.deletedAt.is_(None)
).group_by(Shout.id)
q = q.filter(Shout.deletedAt.is_(None)).group_by(Shout.id)
try:
[shout, reacted_stat, commented_stat, rating_stat, last_comment] = session.execute(q).first()
[shout, reacted_stat, commented_stat, rating_stat, last_comment] = session.execute(
q
).first()
shout.stat = {
"viewed": shout.views,
"reacted": reacted_stat,
"commented": commented_stat,
"rating": rating_stat
"rating": rating_stat,
}
for author_caption in session.query(ShoutAuthor).join(Shout).where(Shout.slug == slug):
@@ -131,7 +174,8 @@ async def load_shouts_by(_, info, options):
topic: 'culture',
title: 'something',
body: 'something else',
days: 30
fromDate: '2022-12-31',
toDate: '2023-12-31'
}
offset: 0
limit: 50
@@ -142,14 +186,13 @@ async def load_shouts_by(_, info, options):
:return: Shout[]
"""
q = select(Shout).options(
joinedload(Shout.authors),
joinedload(Shout.topics),
).where(
and_(
Shout.deletedAt.is_(None),
Shout.layout.is_not(None)
q = (
select(Shout)
.options(
joinedload(Shout.authors),
joinedload(Shout.topics),
)
.where(and_(Shout.deletedAt.is_(None), Shout.layout.is_not(None)))
)
q = add_stat_columns(q)
@@ -159,27 +202,149 @@ async def load_shouts_by(_, info, options):
order_by = options.get("order_by", Shout.publishedAt)
query_order_by = desc(order_by) if options.get('order_by_desc', True) else asc(order_by)
query_order_by = desc(order_by) if options.get("order_by_desc", True) else asc(order_by)
offset = options.get("offset", 0)
limit = options.get("limit", 10)
q = q.group_by(Shout.id).order_by(nulls_last(query_order_by)).limit(limit).offset(offset)
shouts = []
with local_session() as session:
shouts_map = {}
for [shout, reacted_stat, commented_stat, rating_stat, last_comment] in session.execute(q).unique():
shouts.append(shout)
shout.stat = {
"viewed": shout.views,
"reacted": reacted_stat,
"commented": commented_stat,
"rating": rating_stat
}
shouts_map[shout.id] = shout
return shouts
return get_shouts_from_query(q)
@query.field("loadRandomTopShouts")
async def load_random_top_shouts(_, info, params):
"""
:param params: {
filters: {
layout: 'music',
excludeLayout: 'article',
fromDate: '2022-12-31'
toDate: '2023-12-31'
}
fromRandomCount: 100,
limit: 50
}
:return: Shout[]
"""
aliased_reaction = aliased(Reaction)
subquery = (
select(Shout.id)
.outerjoin(aliased_reaction)
.where(and_(Shout.deletedAt.is_(None), Shout.layout.is_not(None)))
)
subquery = apply_filters(subquery, params.get("filters", {}), use_published_date=True)
subquery = subquery.group_by(Shout.id).order_by(desc(get_rating_func(aliased_reaction)))
from_random_count = params.get("fromRandomCount")
if from_random_count:
subquery = subquery.limit(from_random_count)
q = (
select(Shout)
.options(
joinedload(Shout.authors),
joinedload(Shout.topics),
)
.where(Shout.id.in_(subquery))
)
q = add_stat_columns(q)
limit = params.get("limit", 10)
q = q.group_by(Shout.id).order_by(func.random()).limit(limit)
# print(q.compile(compile_kwargs={"literal_binds": True}))
return get_shouts_from_query(q)
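The query above samples in two stages: an inner rating-ordered pool capped at fromRandomCount, then a uniform random draw of limit rows from that pool. The same idea over in-memory data (hypothetical dicts with a rating key):

    import random

    def random_top(shouts, from_random_count=100, limit=50):
        # stage 1: the best-rated pool; stage 2: a uniform sample from it
        pool = sorted(shouts, key=lambda s: s["rating"], reverse=True)[:from_random_count]
        return random.sample(pool, min(limit, len(pool)))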
@query.field("loadRandomTopicShouts")
async def load_random_topic_shouts(_, info, limit):
topic = get_random_topic()
q = (
select(Shout)
.options(
joinedload(Shout.authors),
joinedload(Shout.topics),
)
.join(ShoutTopic, and_(Shout.id == ShoutTopic.shout, ShoutTopic.topic == topic.id))
.where(
and_(Shout.deletedAt.is_(None), Shout.layout.is_not(None), Shout.visibility == "public")
)
)
q = add_stat_columns(q)
q = q.group_by(Shout.id).order_by(desc(Shout.createdAt)).limit(limit)
shouts = get_shouts_from_query(q)
return {"topic": topic, "shouts": shouts}
@query.field("loadUnratedShouts")
async def load_unrated_shouts(_, info, limit):
auth: AuthCredentials = info.context["request"].auth
user_id = auth.user_id
aliased_reaction = aliased(Reaction)
q = (
select(Shout)
.options(
joinedload(Shout.authors),
joinedload(Shout.topics),
)
.outerjoin(
Reaction,
and_(
Reaction.shout == Shout.id,
Reaction.replyTo.is_(None),
Reaction.kind.in_([ReactionKind.LIKE, ReactionKind.DISLIKE]),
),
)
)
if user_id:
q = q.outerjoin(
aliased_reaction,
and_(
aliased_reaction.shout == Shout.id,
aliased_reaction.replyTo.is_(None),
aliased_reaction.kind.in_([ReactionKind.LIKE, ReactionKind.DISLIKE]),
aliased_reaction.createdBy == user_id,
),
)
q = q.where(
and_(
Shout.deletedAt.is_(None),
Shout.layout.is_not(None),
Shout.createdAt >= (datetime.now() - timedelta(days=14)).date(),
)
)
if user_id:
q = q.where(Shout.createdBy != user_id)
# keep shouts with at most 4 top-level LIKE/DISLIKE votes; COUNT(DISTINCT Reaction.id) ignores the outer join's NULL row
q = q.having(func.count(distinct(Reaction.id)) <= 4)
if user_id:
q = q.having(func.count(distinct(aliased_reaction.id)) == 0)
q = add_stat_columns(q)
q = q.group_by(Shout.id).order_by(func.random()).limit(limit)
# print(q.compile(compile_kwargs={"literal_binds": True}))
return get_shouts_from_query(q)
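Restated, loadUnratedShouts keeps shouts from the last 14 days that have collected at most four top-level LIKE/DISLIKE votes, none of them from the requesting user, who also must not be the author. The same predicate over hypothetical in-memory rows:

    def is_unrated(shout, user_id=None):
        votes = [r for r in shout["reactions"]
                 if r["kind"] in ("LIKE", "DISLIKE") and r["replyTo"] is None]
        if user_id is not None:
            if shout["createdBy"] == user_id:
                return False  # own shouts are excluded
            if any(v["createdBy"] == user_id for v in votes):
                return False  # already voted on by this user
        return len(votes) <= 4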
@query.field("loadDrafts")
@@ -188,11 +353,13 @@ async def get_drafts(_, info):
auth: AuthCredentials = info.context["request"].auth
user_id = auth.user_id
q = select(Shout).options(
joinedload(Shout.authors),
joinedload(Shout.topics),
).where(
and_(Shout.deletedAt.is_(None), Shout.createdBy == user_id)
q = (
select(Shout)
.options(
joinedload(Shout.authors),
joinedload(Shout.topics),
)
.where(and_(Shout.deletedAt.is_(None), Shout.createdBy == user_id))
)
q = q.group_by(Shout.id)
@@ -211,24 +378,27 @@ async def get_my_feed(_, info, options):
auth: AuthCredentials = info.context["request"].auth
user_id = auth.user_id
subquery = select(Shout.id).join(
ShoutAuthor
).join(
AuthorFollower, AuthorFollower.follower == user_id
).join(
ShoutTopic
).join(
TopicFollower, TopicFollower.follower == user_id
user_followed_authors = select(AuthorFollower.author).where(AuthorFollower.follower == user_id)
user_followed_topics = select(TopicFollower.topic).where(TopicFollower.follower == user_id)
subquery = (
select(Shout.id)
.where(Shout.id == ShoutAuthor.shout)
.where(Shout.id == ShoutTopic.shout)
.where(
(ShoutAuthor.user.in_(user_followed_authors))
| (ShoutTopic.topic.in_(user_followed_topics))
)
)
q = select(Shout).options(
joinedload(Shout.authors),
joinedload(Shout.topics),
).where(
and_(
Shout.publishedAt.is_not(None),
Shout.deletedAt.is_(None),
Shout.id.in_(subquery)
q = (
select(Shout)
.options(
joinedload(Shout.authors),
joinedload(Shout.topics),
)
.where(
and_(Shout.publishedAt.is_not(None), Shout.deletedAt.is_(None), Shout.id.in_(subquery))
)
)
@@ -237,23 +407,12 @@ async def get_my_feed(_, info, options):
order_by = options.get("order_by", Shout.publishedAt)
query_order_by = desc(order_by) if options.get('order_by_desc', True) else asc(order_by)
query_order_by = desc(order_by) if options.get("order_by_desc", True) else asc(order_by)
offset = options.get("offset", 0)
limit = options.get("limit", 10)
q = q.group_by(Shout.id).order_by(nulls_last(query_order_by)).limit(limit).offset(offset)
shouts = []
with local_session() as session:
shouts_map = {}
for [shout, reacted_stat, commented_stat, rating_stat, last_comment] in session.execute(q).unique():
shouts.append(shout)
shout.stat = {
"viewed": shout.views,
"reacted": reacted_stat,
"commented": commented_stat,
"rating": rating_stat
}
shouts_map[shout.id] = shout
# print(q.compile(compile_kwargs={"literal_binds": True}))
return shouts
return get_shouts_from_query(q)

View File

@@ -1,6 +1,7 @@
from typing import List
from datetime import datetime, timedelta, timezone
from sqlalchemy import and_, func, distinct, select, literal
from typing import List
from sqlalchemy import and_, distinct, func, literal, select
from sqlalchemy.orm import aliased, joinedload
from auth.authenticate import login_required
@@ -9,11 +10,8 @@ from base.orm import local_session
from base.resolvers import mutation, query
from orm.reaction import Reaction, ReactionKind
from orm.shout import ShoutAuthor, ShoutTopic
from orm.topic import Topic
from orm.topic import Topic, TopicFollower
from orm.user import AuthorFollower, Role, User, UserRating, UserRole
# from .community import followed_communities
from resolvers.inbox.unread import get_total_unread_counter
from resolvers.zine.topics import followed_by_user
@@ -24,27 +22,27 @@ def add_author_stat_columns(q):
# user_rating_aliased = aliased(UserRating)
q = q.outerjoin(shout_author_aliased).add_columns(
func.count(distinct(shout_author_aliased.shout)).label('shouts_stat')
func.count(distinct(shout_author_aliased.shout)).label("shouts_stat")
)
q = q.outerjoin(author_followers, author_followers.author == User.id).add_columns(
func.count(distinct(author_followers.follower)).label('followers_stat')
func.count(distinct(author_followers.follower)).label("followers_stat")
)
q = q.outerjoin(author_following, author_following.follower == User.id).add_columns(
func.count(distinct(author_following.author)).label('followings_stat')
func.count(distinct(author_following.author)).label("followings_stat")
)
q = q.add_columns(literal(0).label('rating_stat'))
q = q.add_columns(literal(0).label("rating_stat"))
# FIXME
# q = q.outerjoin(user_rating_aliased, user_rating_aliased.user == User.id).add_columns(
# # TODO: check
# func.sum(user_rating_aliased.value).label('rating_stat')
# )
q = q.add_columns(literal(0).label('commented_stat'))
# q = q.outerjoin(Reaction, and_(Reaction.createdBy == User.id, Reaction.body.is_not(None))).add_columns(
# func.count(distinct(Reaction.id)).label('commented_stat')
# )
q = q.add_columns(literal(0).label("commented_stat"))
# q = q.outerjoin(
# Reaction, and_(Reaction.createdBy == User.id, Reaction.body.is_not(None))
# ).add_columns(func.count(distinct(Reaction.id)).label("commented_stat"))
q = q.group_by(User.id)
@@ -58,7 +56,7 @@ def add_stat(author, stat_columns):
"followers": followers_stat,
"followings": followings_stat,
"rating": rating_stat,
"commented": commented_stat
"commented": commented_stat,
}
return author
@@ -74,34 +72,6 @@ def get_authors_from_query(q):
return authors
async def user_subscriptions(user_id: int):
return {
"unread": await get_total_unread_counter(user_id), # unread inbox messages counter
"topics": [t.slug for t in await followed_topics(user_id)], # followed topics slugs
"authors": [a.slug for a in await followed_authors(user_id)], # followed authors slugs
"reactions": await followed_reactions(user_id)
# "communities": [c.slug for c in followed_communities(slug)], # communities
}
# @query.field("userFollowedDiscussions")
# @login_required
async def followed_discussions(_, info, user_id) -> List[Topic]:
return await followed_reactions(user_id)
async def followed_reactions(user_id):
with local_session() as session:
user = session.query(User).where(User.id == user_id).first()
return session.query(
Reaction.shout
).where(
Reaction.createdBy == user.id
).filter(
Reaction.createdAt > user.lastSeen
).all()
# dufok mod (^*^') :
@query.field("userFollowedTopics")
async def get_followed_topics(_, info, slug) -> List[Topic]:
@@ -150,10 +120,10 @@ async def user_followers(_, _info, slug) -> List[User]:
q = add_author_stat_columns(q)
aliased_user = aliased(User)
q = q.join(AuthorFollower, AuthorFollower.follower == User.id).join(
aliased_user, aliased_user.id == AuthorFollower.author
).where(
aliased_user.slug == slug
q = (
q.join(AuthorFollower, AuthorFollower.follower == User.id)
.join(aliased_user, aliased_user.id == AuthorFollower.author)
.where(aliased_user.slug == slug)
)
return get_authors_from_query(q)
@@ -181,15 +151,10 @@ async def update_profile(_, info, profile):
with local_session() as session:
user = session.query(User).filter(User.id == user_id).one()
if not user:
return {
"error": "canoot find user"
}
return {"error": "canoot find user"}
user.update(profile)
session.commit()
return {
"error": None,
"author": user
}
return {"error": None, "author": user}
@mutation.field("rateUser")
@@ -223,7 +188,8 @@ def author_follow(user_id, slug):
session.add(af)
session.commit()
return True
except:
except Exception as e:
print(e)
return False
@@ -231,13 +197,10 @@ def author_follow(user_id, slug):
def author_unfollow(user_id, slug):
with local_session() as session:
flw = (
session.query(
AuthorFollower
).join(User, User.id == AuthorFollower.author).filter(
and_(
AuthorFollower.follower == user_id, User.slug == slug
)
).first()
session.query(AuthorFollower)
.join(User, User.id == AuthorFollower.author)
.filter(and_(AuthorFollower.follower == user_id, User.slug == slug))
.first()
)
if flw:
session.delete(flw)
@@ -263,12 +226,11 @@ async def get_author(_, _info, slug):
[author] = get_authors_from_query(q)
with local_session() as session:
comments_count = session.query(Reaction).where(
and_(
Reaction.createdBy == author.id,
Reaction.kind == ReactionKind.COMMENT
)
).count()
comments_count = (
session.query(Reaction)
.where(and_(Reaction.createdBy == author.id, Reaction.kind == ReactionKind.COMMENT))
.count()
)
author.stat["commented"] = comments_count
return author
@@ -291,8 +253,33 @@ async def load_authors_by(_, info, by, limit, offset):
days_before = datetime.now(tz=timezone.utc) - timedelta(days=by["createdAt"])
q = q.filter(User.createdAt > days_before)
q = q.order_by(
by.get("order", User.createdAt)
).limit(limit).offset(offset)
q = q.order_by(by.get("order", User.createdAt)).limit(limit).offset(offset)
return get_authors_from_query(q)
@query.field("loadMySubscriptions")
@login_required
async def load_my_subscriptions(_, info):
auth = info.context["request"].auth
user_id = auth.user_id
authors_query = (
select(User)
.join(AuthorFollower, AuthorFollower.author == User.id)
.where(AuthorFollower.follower == user_id)
)
topics_query = select(Topic).join(TopicFollower).where(TopicFollower.follower == user_id)
topics = []
authors = []
with local_session() as session:
for [author] in session.execute(authors_query):
authors.append(author)
for [topic] in session.execute(topics_query):
topics.append(topic)
return {"topics": topics, "authors": authors}

View File

@@ -1,5 +1,6 @@
from datetime import datetime, timedelta, timezone
from sqlalchemy import and_, asc, desc, select, text, func, case
from sqlalchemy import and_, asc, case, desc, func, select, text
from sqlalchemy.orm import aliased
from auth.authenticate import login_required
@@ -17,26 +18,22 @@ def add_reaction_stat_columns(q):
aliased_reaction = aliased(Reaction)
q = q.outerjoin(aliased_reaction, Reaction.id == aliased_reaction.replyTo).add_columns(
func.sum(
aliased_reaction.id
).label('reacted_stat'),
func.sum(
case(
(aliased_reaction.body.is_not(None), 1),
else_=0
)
).label('commented_stat'),
func.sum(case(
(aliased_reaction.kind == ReactionKind.AGREE, 1),
(aliased_reaction.kind == ReactionKind.DISAGREE, -1),
(aliased_reaction.kind == ReactionKind.PROOF, 1),
(aliased_reaction.kind == ReactionKind.DISPROOF, -1),
(aliased_reaction.kind == ReactionKind.ACCEPT, 1),
(aliased_reaction.kind == ReactionKind.REJECT, -1),
(aliased_reaction.kind == ReactionKind.LIKE, 1),
(aliased_reaction.kind == ReactionKind.DISLIKE, -1),
else_=0)
).label('rating_stat'))
func.sum(aliased_reaction.id).label("reacted_stat"),
func.sum(case((aliased_reaction.body.is_not(None), 1), else_=0)).label("commented_stat"),
func.sum(
case(
(aliased_reaction.kind == ReactionKind.AGREE, 1),
(aliased_reaction.kind == ReactionKind.DISAGREE, -1),
(aliased_reaction.kind == ReactionKind.PROOF, 1),
(aliased_reaction.kind == ReactionKind.DISPROOF, -1),
(aliased_reaction.kind == ReactionKind.ACCEPT, 1),
(aliased_reaction.kind == ReactionKind.REJECT, -1),
(aliased_reaction.kind == ReactionKind.LIKE, 1),
(aliased_reaction.kind == ReactionKind.DISLIKE, -1),
else_=0,
)
).label("rating_stat"),
)
return q
@@ -47,22 +44,25 @@ def reactions_follow(user_id, shout_id: int, auto=False):
shout = session.query(Shout).where(Shout.id == shout_id).one()
following = (
session.query(ShoutReactionsFollower).where(and_(
ShoutReactionsFollower.follower == user_id,
ShoutReactionsFollower.shout == shout.id,
)).first()
session.query(ShoutReactionsFollower)
.where(
and_(
ShoutReactionsFollower.follower == user_id,
ShoutReactionsFollower.shout == shout.id,
)
)
.first()
)
if not following:
following = ShoutReactionsFollower.create(
follower=user_id,
shout=shout.id,
auto=auto
follower=user_id, shout=shout.id, auto=auto
)
session.add(following)
session.commit()
return True
except:
except Exception as e:
print(e)
return False
@@ -72,46 +72,52 @@ def reactions_unfollow(user_id: int, shout_id: int):
shout = session.query(Shout).where(Shout.id == shout_id).one()
following = (
session.query(ShoutReactionsFollower).where(and_(
ShoutReactionsFollower.follower == user_id,
ShoutReactionsFollower.shout == shout.id
)).first()
session.query(ShoutReactionsFollower)
.where(
and_(
ShoutReactionsFollower.follower == user_id,
ShoutReactionsFollower.shout == shout.id,
)
)
.first()
)
if following:
session.delete(following)
session.commit()
return True
except:
except Exception as e:
print(e)
pass
return False
def is_published_author(session, user_id):
''' checks if user has at least one publication '''
return session.query(
Shout
).where(
Shout.authors.contains(user_id)
).filter(
and_(
Shout.publishedAt.is_not(None),
Shout.deletedAt.is_(None)
)
).count() > 0
"""checks if user has at least one publication"""
return (
session.query(Shout)
.where(Shout.authors.contains(user_id))
.filter(and_(Shout.publishedAt.is_not(None), Shout.deletedAt.is_(None)))
.count()
> 0
)
def check_to_publish(session, user_id, reaction):
''' set shout to public if publicated approvers amount > 4 '''
"""set shout to public if publicated approvers amount > 4"""
if not reaction.replyTo and reaction.kind in [
ReactionKind.ACCEPT,
ReactionKind.LIKE,
ReactionKind.PROOF
ReactionKind.PROOF,
]:
if is_published_author(session, user_id):
# now count how many approvers have already voted
approvers_reactions = session.query(Reaction).where(Reaction.shout == reaction.shout).all()
approvers = [user_id, ]
approvers_reactions = (
session.query(Reaction).where(Reaction.shout == reaction.shout).all()
)
approvers = [
user_id,
]
for ar in approvers_reactions:
a = ar.createdBy
if is_published_author(session, a):
@@ -122,21 +128,17 @@ def check_to_publish(session, user_id, reaction):
def check_to_hide(session, user_id, reaction):
''' hides any shout if 20% of reactions are negative '''
"""hides any shout if 20% of reactions are negative"""
if not reaction.replyTo and reaction.kind in [
ReactionKind.REJECT,
ReactionKind.DISLIKE,
ReactionKind.DISPROOF
ReactionKind.DISPROOF,
]:
# if is_published_author(user):
approvers_reactions = session.query(Reaction).where(Reaction.shout == reaction.shout).all()
rejects = 0
for r in approvers_reactions:
if r.kind in [
ReactionKind.REJECT,
ReactionKind.DISLIKE,
ReactionKind.DISPROOF
]:
if r.kind in [ReactionKind.REJECT, ReactionKind.DISLIKE, ReactionKind.DISPROOF]:
rejects += 1
if rejects and len(approvers_reactions) / rejects < 5:  # guard against zero rejects
return True
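The hide threshold reads inverted but is the usual percentage rule: len(approvers_reactions) / rejects < 5 is the same as rejects exceeding 20% of all reactions, with the zero-rejects guard patched in above. For instance:

    def should_hide(total: int, rejects: int) -> bool:
        # mirrors check_to_hide: hide once negatives exceed 20%
        return rejects > 0 and total / rejects < 5

    assert should_hide(10, 3)      # 30% negative -> hidden
    assert not should_hide(10, 2)  # exactly 20% -> kept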
@@ -146,14 +148,14 @@ def check_to_hide(session, user_id, reaction):
def set_published(session, shout_id):
s = session.query(Shout).where(Shout.id == shout_id).first()
s.publishedAt = datetime.now(tz=timezone.utc)
s.visibility = text('public')
s.visibility = text("public")
session.add(s)
session.commit()
def set_hidden(session, shout_id):
s = session.query(Shout).where(Shout.id == shout_id).first()
s.visibility = text('community')
s.visibility = text("community")
session.add(s)
session.commit()
@@ -162,37 +164,46 @@ def set_hidden(session, shout_id):
@login_required
async def create_reaction(_, info, reaction):
auth: AuthCredentials = info.context["request"].auth
reaction['createdBy'] = auth.user_id
reaction["createdBy"] = auth.user_id
rdict = {}
with local_session() as session:
shout = session.query(Shout).where(Shout.id == reaction["shout"]).one()
author = session.query(User).where(User.id == auth.user_id).one()
if reaction["kind"] in [
ReactionKind.DISLIKE.name,
ReactionKind.LIKE.name
]:
existing_reaction = session.query(Reaction).where(
and_(
Reaction.shout == reaction["shout"],
Reaction.createdBy == auth.user_id,
Reaction.kind == reaction["kind"],
Reaction.replyTo == reaction.get("replyTo")
if reaction["kind"] in [ReactionKind.DISLIKE.name, ReactionKind.LIKE.name]:
existing_reaction = (
session.query(Reaction)
.where(
and_(
Reaction.shout == reaction["shout"],
Reaction.createdBy == auth.user_id,
Reaction.kind == reaction["kind"],
Reaction.replyTo == reaction.get("replyTo"),
)
)
).first()
.first()
)
if existing_reaction is not None:
raise OperationNotAllowed("You can't vote twice")
            opposite_reaction_kind = (
                ReactionKind.DISLIKE
                if reaction["kind"] == ReactionKind.LIKE.name
                else ReactionKind.LIKE
            )
            opposite_reaction = (
                session.query(Reaction)
                .where(
                    and_(
                        Reaction.shout == reaction["shout"],
                        Reaction.createdBy == auth.user_id,
                        Reaction.kind == opposite_reaction_kind,
                        Reaction.replyTo == reaction.get("replyTo"),
                    )
                )
                .first()
            )
if opposite_reaction is not None:
session.delete(opposite_reaction)
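Taken together, the two queries above enforce one invariant: a user holds at most one live vote per target, and a new opposite vote replaces the old one. A plain-dict sketch (names hypothetical):

def apply_vote(votes, user_id, kind):
    # a repeated identical vote is rejected
    if votes.get(user_id) == kind:
        raise ValueError("You can't vote twice")
    # an opposite LIKE/DISLIKE is simply overwritten
    votes[user_id] = kind

votes = {}
apply_vote(votes, 1, "LIKE")
apply_vote(votes, 1, "DISLIKE")  # replaces the earlier LIKE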
@@ -221,8 +232,8 @@ async def create_reaction(_, info, reaction):
await notification_service.handle_new_reaction(r.id)
rdict = r.dict()
        rdict["shout"] = shout.dict()
        rdict["createdBy"] = author.dict()
# self-regulation mechanics
if check_to_hide(session, auth.user_id, r):
@@ -235,11 +246,7 @@ async def create_reaction(_, info, reaction):
except Exception as e:
print(f"[resolvers.reactions] error on reactions autofollowing: {e}")
    rdict["stat"] = {"commented": 0, "reacted": 0, "rating": 0}
return {"reaction": rdict}
@@ -269,11 +276,7 @@ async def update_reaction(_, info, id, reaction={}):
if reaction.get("range"):
r.range = reaction.get("range")
session.commit()
        r.stat = {"commented": commented_stat, "reacted": reacted_stat, "rating": rating_stat}
return {"reaction": r}
@@ -290,17 +293,12 @@ async def delete_reaction(_, info, id):
if r.createdBy != auth.user_id:
return {"error": "access denied"}
        if r.kind in [ReactionKind.LIKE, ReactionKind.DISLIKE]:
session.delete(r)
else:
r.deletedAt = datetime.now(tz=timezone.utc)
session.commit()
    return {"reaction": r}
@query.field("loadReactionsBy")
@@ -321,12 +319,10 @@ async def load_reactions_by(_, _info, by, limit=50, offset=0):
:return: Reaction[]
"""
    q = (
        select(Reaction, User, Shout)
        .join(User, Reaction.createdBy == User.id)
        .join(Shout, Reaction.shout == Shout.id)
    )
if by.get("shout"):
@@ -344,7 +340,7 @@ async def load_reactions_by(_, _info, by, limit=50, offset=0):
if by.get("comment"):
q = q.filter(func.length(Reaction.body) > 0)
    if len(by.get("search", "")) > 2:
q = q.filter(Reaction.body.ilike(f'%{by["body"]}%'))
if by.get("days"):
@@ -352,13 +348,9 @@ async def load_reactions_by(_, _info, by, limit=50, offset=0):
q = q.filter(Reaction.createdAt > after)
order_way = asc if by.get("sort", "").startswith("-") else desc
order_field = by.get("sort", "").replace('-', '') or Reaction.createdAt
order_field = by.get("sort", "").replace("-", "") or Reaction.createdAt
    q = q.group_by(Reaction.id, User.id, Shout.id).order_by(order_way(order_field))
q = add_reaction_stat_columns(q)
@@ -367,13 +359,15 @@ async def load_reactions_by(_, _info, by, limit=50, offset=0):
reactions = []
with local_session() as session:
        for [reaction, user, shout, reacted_stat, commented_stat, rating_stat] in session.execute(
            q
        ):
reaction.createdBy = user
reaction.shout = shout
            reaction.stat = {
                "rating": rating_stat,
                "commented": commented_stat,
                "reacted": reacted_stat,
            }
reaction.kind = reaction.kind.name

View File

@@ -1,24 +1,26 @@
from sqlalchemy import and_, distinct, func, select
from sqlalchemy.orm import aliased
from auth.authenticate import login_required
from base.orm import local_session
from base.resolvers import mutation, query
from orm import User
from orm.shout import ShoutAuthor, ShoutTopic
from orm.topic import Topic, TopicFollower
def add_topic_stat_columns(q):
aliased_shout_author = aliased(ShoutAuthor)
aliased_topic_follower = aliased(TopicFollower)
aliased_shout_topic = aliased(ShoutTopic)
    q = (
        q.outerjoin(aliased_shout_topic, Topic.id == aliased_shout_topic.topic)
        .add_columns(func.count(distinct(aliased_shout_topic.shout)).label("shouts_stat"))
        .outerjoin(aliased_shout_author, aliased_shout_topic.shout == aliased_shout_author.shout)
        .add_columns(func.count(distinct(aliased_shout_author.user)).label("authors_stat"))
        .outerjoin(aliased_topic_follower)
        .add_columns(func.count(distinct(aliased_topic_follower.follower)).label("followers_stat"))
    )
q = q.group_by(Topic.id)
@@ -28,11 +30,7 @@ def add_topic_stat_columns(q):
def add_stat(topic, stat_columns):
[shouts_stat, authors_stat, followers_stat] = stat_columns
    topic.stat = {"shouts": shouts_stat, "authors": authors_stat, "followers": followers_stat}
return topic
@@ -86,7 +84,8 @@ async def get_topic(_, _info, slug):
q = add_topic_stat_columns(q)
topics = get_topics_from_query(q)
    if topics:
        return topics[0]
@mutation.field("createTopic")
@@ -125,7 +124,8 @@ def topic_follow(user_id, slug):
session.add(following)
session.commit()
return True
    except Exception as e:
        print(e)
return False
@@ -133,22 +133,34 @@ def topic_unfollow(user_id, slug):
try:
with local_session() as session:
sub = (
                session.query(TopicFollower)
                .join(Topic)
                .filter(and_(TopicFollower.follower == user_id, Topic.slug == slug))
                .first()
)
if sub:
session.delete(sub)
session.commit()
return True
    except Exception as e:
        print(e)
        pass
return False
def get_random_topic():
q = select(Topic)
q = q.join(ShoutTopic)
q = q.group_by(Topic.id)
q = q.having(func.count(distinct(ShoutTopic.shout)) > 10)
q = q.order_by(func.random()).limit(1)
with local_session() as session:
topics = session.execute(q).first()
if topics:
return topics[0]
@query.field("topicsRandom")
async def topics_random(_, info, amount=12):
q = select(Topic)

robo_migrate_a2.sh Normal file
View File

@@ -0,0 +1,220 @@
#!/bin/bash
# This version is a2.1 because the postgres DSN was updated to use an IP address
export PATH="$PATH:/usr/local/sbin:/usr/sbin:/sbin"
APP="discoursio-api"
SSH_KEY="/root/.ssh/id_rsa"
YMD=$(date "+%Y-%m-%d")
DUMP_PATH="/var/lib/dokku/data/storage/discoursio-api/migration/dump"
DATA_PATH="/var/lib/dokku/data/storage/discoursio-api/migration/data"
SCRIPT_PATH="/root/robo_script"
MONGO_DB_PATH="/var/backups/mongodb"
POSTGRES_DB_PATH="/var/backups/postgres"
CONTAINER_ID=$(docker ps | grep "$APP" | /bin/awk '{print $1}')
OLD_DB=$(dokku postgres:app-links "$APP")
NEW_DB="discoursio-db-$YMD"
DSN_OLD_DB=$(dokku config:get "$APP" DATABASE_URL)
LAST_DB_MONGO=$(find "$MONGO_DB_PATH" -printf '%T@ %p\n' | sort -nk1 | grep discours | tail -n 1 | /bin/awk '{print $2}')
LAST_DB_POSTGRES=$(find "$POSTGRES_DB_PATH" -printf '%T@ %p\n' | sort -nk1 | grep discours | tail -n 1 | /bin/awk '{print $2}')
NEW_HOST="testapi.discours.io"
NEW_PATH="/root/."
increase_swap() {
echo "Make Swap 6GB"
swapoff -a
dd if=/dev/zero of=/swap_file bs=1M count=6144
chmod 600 /swap_file
mkswap /swap_file
swapon /swap_file
}
check_container() {
if [ -z "$CONTAINER_ID" ]; then
echo "Container $APP is not Running"
exit 1
fi
echo "Container $APP is running"
}
check_dump_dir() {
if [ ! -d $DUMP_PATH ]; then
echo "$DUMP_PATH dosn't exist"
exit 1
else
echo "$DUMP_PATH exist (^.-)"
fi
if [ ! -d $DATA_PATH ]; then
echo "$DATA_PATH dosn't exist"
exit 1
else
echo "$DATA_PATH exist (-.^)"
fi
}
check_old_db() {
if [ -z "$OLD_DB" ]; then
echo "DB postgres is not set"
exit 1
fi
echo "DB postgres is set"
}
check_app_config() {
if dokku docker-options:report $APP | grep -q $DUMP_PATH && dokku docker-options:report $APP | grep -q $DATA_PATH; then
echo "DUMP_PATH and DATA_PATH exist in $APP config"
else
echo "DUMP_PATH or DATA_PATH does not exist in $APP config"
exit 1
fi
}
untar_mongo_db() {
if [ -d "$DUMP_PATH/discours" ]; then
echo "$DUMP_PATH/discours File exists"
else
tar xzf $LAST_DB_MONGO && mv *.bson/discours $DUMP_PATH/ && rm -R *.bson
fi
echo "Untar Bson from mongoDB"
}
bson_mode() {
CONTAINER_ID=$(docker ps | grep "$APP" | /bin/awk '{print $1}')
if [ -z "$CONTAINER_ID" ]; then
echo "Container $APP is not Running"
exit 1
fi
docker exec -t "$CONTAINER_ID" rm -rf dump
docker exec -t "$CONTAINER_ID" ln -s /migration/dump dump
docker exec -t "$CONTAINER_ID" rm -rf migration/data
docker exec -t "$CONTAINER_ID" ln -s /migration/data migration/data
docker exec -t "$CONTAINER_ID" python3 server.py bson
}
create_new_postgres_db() {
echo "Create NEW postgres DB"
dokku postgres:create "$NEW_DB"
# Get the internal IP address
INTERNAL_IP=$(dokku postgres:info "$NEW_DB" | grep 'Internal ip:' | awk '{print $3}')
# Get the DSN without the hostname
DSN=$(dokku postgres:info "$NEW_DB" --dsn | sed 's/postgres/postgresql/')
# Replace the hostname with the internal IP address
DSN_NEW_DB=$(echo "$DSN" | sed "s@dokku-postgres-$NEW_DB@$INTERNAL_IP@")
echo "$DSN_NEW_DB"
dokku postgres:link "$NEW_DB" "$APP" -a "MIGRATION_DATABASE"
dokku config:set "$APP" MIGRATION_DATABASE_URL="$DSN_NEW_DB" --no-restart
# Wait for 120 seconds
echo "Waiting for 120 seconds..."
for i in {1..120}; do
sleep 1
echo -n "(^.^') "
done
}
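For reference, a Python sketch of what the two sed substitutions above do to the DSN (values are made up):

dsn = "postgres://user:pass@dokku-postgres-discoursio-db-2024-01-08:5432/db"
internal_ip = "172.17.0.5"
dsn = dsn.replace("postgres", "postgresql", 1)  # sed 's/postgres/postgresql/'
dsn = dsn.replace("dokku-postgres-discoursio-db-2024-01-08", internal_ip)
print(dsn)  # postgresql://user:pass@172.17.0.5:5432/db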
migrate_jsons() {
CONTAINER_ID=$(docker ps | grep $APP | /bin/awk '{print $1}')
if [ -z "$CONTAINER_ID" ]; then
echo "Container $APP is not Running"
exit 1
fi
docker exec -t "$CONTAINER_ID" rm -rf dump
docker exec -t "$CONTAINER_ID" ln -s /migration/dump dump
docker exec -t "$CONTAINER_ID" rm -rf migration/data
docker exec -t "$CONTAINER_ID" ln -s /migration/data migration/data
docker exec -t --env DATABASE_URL="$DSN_NEW_DB" "$CONTAINER_ID" python3 server.py migrate
}
restart_and_clean() {
dokku ps:stop "$APP"
dokku config:unset "$APP" MIGRATION_DATABASE_URL --no-restart
dokku config:unset "$APP" DATABASE_URL --no-restart
dokku config:set "$APP" DATABASE_URL="$DSN_NEW_DB" --no-restart
dokku postgres:unlink "$OLD_DB" "$APP"
dokku ps:start "$APP"
}
send_postgres_dump() {
echo "send postgres.dump to $NEW_HOST"
scp -i "$SSH_KEY" -r "$LAST_DB_POSTGRES" "root@$NEW_HOST:$NEW_PATH"
}
delete_files() {
rm -rf $DUMP_PATH/*
rm -rf $DATA_PATH/*
}
configure_pgweb() {
echo "config PGWEB"
dokku ps:stop pgweb
dokku config:unset pgweb DATABASE_URL --no-restart
dokku postgres:unlink "$OLD_DB" pgweb
dokku postgres:link "$NEW_DB" pgweb -a "DATABASE"
dokku postgres:destroy "$OLD_DB" -f
dokku ps:start pgweb
}
rm_old_db() {
echo "remove old DB"
dokku postgres:destroy "$OLD_DB" -f
}
decrease_swap() {
echo "make swap 2gb again"
swapoff -a
dd if=/dev/zero of=/swap_file bs=1M count=2048
chmod 600 /swap_file
mkswap /swap_file
swapon /swap_file
}
# Main script flow
increase_swap
check_container
check_dump_dir
check_old_db
check_app_config
untar_mongo_db
if bson_mode; then
create_new_postgres_db
else
echo "BSON move didn't work well! ERROR!"
decrease_swap
delete_files
exit 1
fi
if migrate_jsons; then
restart_and_clean
else
echo "MIGRATE move didn't work well! ERROR!"
delete_files
rm_old_db
decrease_swap
exit 1
fi
send_postgres_dump
delete_files
#configure_pgweb
rm_old_db
decrease_swap

runtime.txt Normal file
View File

@@ -0,0 +1 @@
python-3.11.7

View File

@@ -8,19 +8,10 @@ enum MessageStatus {
DELETED
}
type UserFollowings {
unread: Int
topics: [String]
authors: [String]
reactions: [Int]
communities: [String]
}
type AuthResult {
error: String
token: String
user: User
news: UserFollowings
}
type ChatMember {
@@ -221,14 +212,13 @@ input AuthorsBy {
}
input LoadShoutsFilters {
title: String
body: String
topic: String
author: String
layout: String
excludeLayout: String
visibility: String
days: Int
fromDate: String
toDate: String
reacted: Boolean
}
@@ -241,6 +231,12 @@ input LoadShoutsOptions {
order_by_desc: Boolean
}
input LoadRandomTopShoutsParams {
filters: LoadShoutsFilters
limit: Int!
fromRandomCount: Int
}
input ReactionBy {
shout: String # slug
shouts: [String]
@@ -263,6 +259,16 @@ type NotificationsQueryResult {
totalUnreadCount: Int!
}
type MySubscriptionsQueryResult {
topics: [Topic]!
authors: [Author]!
}
type RandomTopicShoutsQueryResult {
topic: Topic!
shouts: [Shout]!
}
type Query {
# inbox
loadChats( limit: Int, offset: Int): Result! # your chats
@@ -280,6 +286,9 @@ type Query {
loadAuthorsBy(by: AuthorsBy, limit: Int, offset: Int): [Author]!
loadShout(slug: String, shout_id: Int): Shout
loadShouts(options: LoadShoutsOptions): [Shout]!
loadRandomTopShouts(params: LoadRandomTopShoutsParams): [Shout]!
loadRandomTopicShouts(limit: Int!): RandomTopicShoutsQueryResult!
loadUnratedShouts(limit: Int!): [Shout]!
loadDrafts: [Shout]!
loadReactionsBy(by: ReactionBy!, limit: Int, offset: Int): [Reaction]!
userFollowers(slug: String!): [Author]!
@@ -300,6 +309,8 @@ type Query {
topicsByAuthor(author: String!): [Topic]!
loadNotifications(params: NotificationsQueryParams!): NotificationsQueryResult!
loadMySubscriptions: MySubscriptionsQueryResult
}
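A hypothetical client-side query against the new loadRandomTopShouts field, written in the same gql style used elsewhere in this repo; the requested Shout sub-fields (title, slug) are assumptions:

from gql import gql

random_top_shouts = gql(
    """
    query {
      loadRandomTopShouts(params: { limit: 10, fromRandomCount: 100 }) {
        title
        slug
      }
    }
    """
)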
############################################ Entities

View File

@@ -1,8 +1,9 @@
import os
import sys
import uvicorn
from settings import DEV_SERVER_PID_FILE_NAME, PORT
def exception_handler(exception_type, exception, traceback, debug_hook=sys.excepthook):
@@ -10,47 +11,36 @@ def exception_handler(exception_type, exception, traceback, debug_hook=sys.excep
log_settings = {
    "version": 1,
    "disable_existing_loggers": True,
    "formatters": {
        "default": {
            "()": "uvicorn.logging.DefaultFormatter",
            "fmt": "%(levelprefix)s %(message)s",
            "use_colors": None,
        },
        "access": {
            "()": "uvicorn.logging.AccessFormatter",
            "fmt": '%(levelprefix)s %(client_addr)s - "%(request_line)s" %(status_code)s',
        },
    },
    "handlers": {
        "default": {
            "formatter": "default",
            "class": "logging.StreamHandler",
            "stream": "ext://sys.stderr",
        },
        "access": {
            "formatter": "access",
            "class": "logging.StreamHandler",
            "stream": "ext://sys.stdout",
        },
    },
    "loggers": {
        "uvicorn": {"handlers": ["default"], "level": "INFO"},
        "uvicorn.error": {"level": "INFO", "handlers": ["default"], "propagate": True},
        "uvicorn.access": {"handlers": ["access"], "level": "INFO", "propagate": False},
    },
}
local_headers = [
@@ -58,7 +48,8 @@ local_headers = [
("Access-Control-Allow-Origin", "https://localhost:3000"),
(
"Access-Control-Allow-Headers",
"DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization",
"DNT,User-Agent,X-Requested-With,If-Modified-Since,"
+ " Cache-Control,Content-Type,Range,Authorization",
),
("Access-Control-Expose-Headers", "Content-Length,Content-Range"),
("Access-Control-Allow-Credentials", "true"),
@@ -86,24 +77,20 @@ if __name__ == "__main__":
# log_config=log_settings,
log_level=None,
access_log=True,
            reload=want_reload,
) # , ssl_keyfile="discours.key", ssl_certfile="discours.crt")
elif x == "migrate":
from migration import process
print("MODE: MIGRATE")
process()
elif x == "bson":
from migration.bson2json import json_tables
print("MODE: BSON")
json_tables()
else:
sys.excepthook = exception_handler
        uvicorn.run("main:app", host="0.0.0.0", port=PORT, proxy_headers=True, server_header=True)

View File

@@ -18,12 +18,7 @@ class Following:
class FollowingManager:
lock = asyncio.Lock()
    data = {"author": [], "topic": [], "shout": [], "chat": []}
@staticmethod
async def register(kind, uid):
@@ -39,13 +34,13 @@ class FollowingManager:
async def push(kind, payload):
try:
async with FollowingManager.lock:
                if kind == "chat":
                    for chat in FollowingManager["chat"]:
if payload.message["chatId"] == chat.uid:
chat.queue.put_nowait(payload)
else:
for entity in FollowingManager[kind]:
                        if payload.shout["createdBy"] == entity.uid:
entity.queue.put_nowait(payload)
except Exception as e:
print(Exception(e))
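The dispatch rule above, reduced to a generic sketch: one queue per subscriber, payloads routed to whoever follows the matching entity. Names here are illustrative, not the project's API.

import asyncio

class Subscriber:
    def __init__(self, uid):
        self.uid = uid
        self.queue = asyncio.Queue()

def dispatch(subscribers, created_by, payload):
    # deliver the payload to every subscriber following its creator
    for entity in subscribers:
        if created_by == entity.uid:
            entity.queue.put_nowait(payload)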

View File

@@ -1,13 +1,13 @@
from base.orm import local_session
from services.search import SearchService
from services.stat.viewed import ViewedStorage
async def storages_init():
with local_session() as session:
        print("[main] initialize SearchService")
        await SearchService.init(session)
        print("[main] SearchService initialized")
        print("[main] initialize storages")
        await ViewedStorage.init()
        print("[main] storages initialized")

View File

@@ -5,32 +5,24 @@ from datetime import datetime, timezone
from sqlalchemy import and_
from base.orm import local_session
from orm import Notification, Reaction, Shout, User
from orm.notification import NotificationType
from orm.reaction import ReactionKind
from services.notifications.sse import connection_manager
def shout_to_shout_data(shout):
    return {"title": shout.title, "slug": shout.slug}
def user_to_user_data(user):
return {
"id": user.id,
"name": user.name,
"slug": user.slug,
"userpic": user.userpic
}
return {"id": user.id, "name": user.name, "slug": user.slug, "userpic": user.userpic}
def update_prev_notification(notification, user, reaction):
notification_data = json.loads(notification.data)
notification_data["users"] = [u for u in notification_data["users"] if u['id'] != user.id]
notification_data["users"] = [u for u in notification_data["users"] if u["id"] != user.id]
notification_data["users"].append(user_to_user_data(user))
if notification_data["reactionIds"] is None:
@@ -57,34 +49,45 @@ class NewReactionNotificator:
if reaction.kind == ReactionKind.COMMENT:
parent_reaction = None
if reaction.replyTo:
                parent_reaction = (
                    session.query(Reaction).where(Reaction.id == reaction.replyTo).one()
                )
if parent_reaction.createdBy != reaction.createdBy:
                    prev_new_reply_notification = (
                        session.query(Notification)
                        .where(
                            and_(
                                Notification.user == shout.createdBy,
                                Notification.type == NotificationType.NEW_REPLY,
                                Notification.shout == shout.id,
                                Notification.reaction == parent_reaction.id,
                                Notification.seen == False,  # noqa: E712
                            )
                        )
                        .first()
                    )
if prev_new_reply_notification:
update_prev_notification(prev_new_reply_notification, user, reaction)
else:
                        reply_notification_data = json.dumps(
                            {
                                "shout": shout_to_shout_data(shout),
                                "users": [user_to_user_data(user)],
                                "reactionIds": [reaction.id],
                            },
                            ensure_ascii=False,
                        )
                        reply_notification = Notification.create(
                            **{
                                "user": parent_reaction.createdBy,
                                "type": NotificationType.NEW_REPLY,
                                "shout": shout.id,
                                "reaction": parent_reaction.id,
                                "data": reply_notification_data,
                            }
                        )
session.add(reply_notification)
@@ -93,30 +96,39 @@ class NewReactionNotificator:
if reaction.createdBy != shout.createdBy and (
parent_reaction is None or parent_reaction.createdBy != shout.createdBy
):
                prev_new_comment_notification = (
                    session.query(Notification)
                    .where(
                        and_(
                            Notification.user == shout.createdBy,
                            Notification.type == NotificationType.NEW_COMMENT,
                            Notification.shout == shout.id,
                            Notification.seen == False,  # noqa: E712
                        )
                    )
                    .first()
                )
if prev_new_comment_notification:
update_prev_notification(prev_new_comment_notification, user, reaction)
else:
                    notification_data_string = json.dumps(
                        {
                            "shout": shout_to_shout_data(shout),
                            "users": [user_to_user_data(user)],
                            "reactionIds": [reaction.id],
                        },
                        ensure_ascii=False,
                    )
                    author_notification = Notification.create(
                        **{
                            "user": shout.createdBy,
                            "type": NotificationType.NEW_COMMENT,
                            "shout": shout.id,
                            "data": notification_data_string,
                        }
                    )
session.add(author_notification)
@@ -130,7 +142,7 @@ class NewReactionNotificator:
class NotificationService:
def __init__(self):
        self._queue = asyncio.Queue(maxsize=1000)
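The new maxsize bounds the queue and so adds backpressure: await queue.put(...) blocks once 1000 notifications are pending, and put_nowait(...) raises asyncio.QueueFull instead of letting the queue grow without limit. A minimal demonstration:

import asyncio

async def demo():
    q = asyncio.Queue(maxsize=1)
    await q.put("first")
    try:
        q.put_nowait("second")  # queue already full
    except asyncio.QueueFull:
        print("queue full, producer must wait")

asyncio.run(demo())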
async def handle_new_reaction(self, reaction_id):
notificator = NewReactionNotificator(reaction_id)
@@ -142,7 +154,7 @@ class NotificationService:
try:
await notificator.run()
except Exception as e:
            print(f"[NotificationService.worker] error: {str(e)}")
notification_service = NotificationService()

View File

@@ -1,8 +1,8 @@
import asyncio
import json

from sse_starlette.sse import EventSourceResponse
from starlette.requests import Request
class ConnectionManager:
@@ -28,9 +28,7 @@ class ConnectionManager:
return
for connection in self.connections_by_user_id[user_id]:
            data = {"type": "newNotifications"}
data_string = json.dumps(data, ensure_ascii=False)
await connection.put(data_string)

View File

@@ -1,5 +1,7 @@
import asyncio
import json
from typing import List
from base.redis import redis
from orm.shout import Shout
from resolvers.zine.load import load_shouts_by
@@ -7,25 +9,20 @@ from resolvers.zine.load import load_shouts_by
class SearchService:
lock = asyncio.Lock()
    # cache = {}
@staticmethod
async def init(session):
async with SearchService.lock:
            print("[search.service] did nothing")
            # SearchService.cache = {}
@staticmethod
    async def search(text, limit, offset) -> List[Shout]:
cached = await redis.execute("GET", text)
if not cached:
async with SearchService.lock:
                options = {"title": text, "body": text, "limit": limit, "offset": offset}
payload = await load_shouts_by(None, None, options)
await redis.execute("SET", text, json.dumps(payload))
return payload
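The method above is a read-through cache; schematically (fetch stands in for load_shouts_by, redis for the project's wrapper):

import json

async def cached_search(redis, text, fetch):
    # return the cached payload if the exact search text was seen before
    cached = await redis.execute("GET", text)
    if cached:
        return json.loads(cached)
    # otherwise compute, store, and return
    payload = await fetch(text)
    await redis.execute("SET", text, json.dumps(payload))
    return payload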

View File

@@ -1,18 +1,18 @@
import asyncio
import time
from datetime import datetime, timedelta, timezone
from os import environ, path
from ssl import create_default_context
from gql import Client, gql
from gql.transport.aiohttp import AIOHTTPTransport
from sqlalchemy import func
from base.orm import local_session
from orm import Topic
from orm.shout import Shout, ShoutTopic
load_facts = gql("""
load_facts = gql(
"""
query getDomains {
domains {
id
@@ -25,9 +25,11 @@ query getDomains {
}
}
}
""")
"""
)
load_pages = gql("""
load_pages = gql(
"""
query getDomains {
domains {
title
@@ -41,8 +43,9 @@ query getDomains {
}
}
}
""")
schema_str = open(path.dirname(__file__) + '/ackee.graphql').read()
"""
)
schema_str = open(path.dirname(__file__) + "/ackee.graphql").read()
token = environ.get("ACKEE_TOKEN", "")
@@ -50,10 +53,8 @@ def create_client(headers=None, schema=None):
return Client(
schema=schema,
transport=AIOHTTPTransport(
url="https://ackee.discours.io/api",
ssl=create_default_context(),
headers=headers
)
url="https://ackee.discours.io/api", ssl=create_default_context(), headers=headers
),
)
@@ -71,13 +72,13 @@ class ViewedStorage:
@staticmethod
async def init():
""" graphql client connection using permanent token """
"""graphql client connection using permanent token"""
self = ViewedStorage
async with self.lock:
if token:
                self.client = create_client(
                    {"Authorization": "Bearer %s" % str(token)}, schema=schema_str
                )
print("[stat.viewed] * authorized permanentely by ackee.discours.io: %s" % token)
else:
print("[stat.viewed] * please set ACKEE_TOKEN")
@@ -85,7 +86,7 @@ class ViewedStorage:
@staticmethod
async def update_pages():
""" query all the pages from ackee sorted by views count """
"""query all the pages from ackee sorted by views count"""
print("[stat.viewed] ⎧ updating ackee pages data ---")
start = time.time()
self = ViewedStorage
@@ -96,7 +97,7 @@ class ViewedStorage:
try:
for page in self.pages:
p = page["value"].split("?")[0]
                slug = p.split("discours.io/")[-1]
shouts[slug] = page["count"]
for slug in shouts.keys():
await ViewedStorage.increment(slug, shouts[slug])
@@ -118,7 +119,7 @@ class ViewedStorage:
# unused yet
@staticmethod
async def get_shout(shout_slug):
""" getting shout views metric by slug """
"""getting shout views metric by slug"""
self = ViewedStorage
async with self.lock:
shout_views = self.by_shouts.get(shout_slug)
@@ -136,7 +137,7 @@ class ViewedStorage:
@staticmethod
async def get_topic(topic_slug):
""" getting topic views value summed """
"""getting topic views value summed"""
self = ViewedStorage
topic_views = 0
async with self.lock:
@@ -146,24 +147,28 @@ class ViewedStorage:
@staticmethod
def update_topics(session, shout_slug):
""" updates topics counters by shout slug """
"""updates topics counters by shout slug"""
self = ViewedStorage
        for [shout_topic, topic] in (
            session.query(ShoutTopic, Topic)
            .join(Topic)
            .join(Shout)
            .where(Shout.slug == shout_slug)
            .all()
        ):
if not self.by_topics.get(topic.slug):
self.by_topics[topic.slug] = {}
self.by_topics[topic.slug][shout_slug] = self.by_shouts[shout_slug]
@staticmethod
    async def increment(shout_slug, amount=1, viewer="ackee"):
        """the only way to change views counter"""
self = ViewedStorage
async with self.lock:
            # TODO: optimize, currently we execute 1 DB transaction per shout
with local_session() as session:
shout = session.query(Shout).where(Shout.slug == shout_slug).one()
            if viewer == "old-discours":
# this is needed for old db migration
if shout.viewsOld == amount:
print(f"viewsOld amount: {amount}")
@@ -185,7 +190,7 @@ class ViewedStorage:
@staticmethod
async def worker():
""" async task worker """
"""async task worker"""
failed = 0
self = ViewedStorage
if self.disabled:
@@ -205,9 +210,10 @@ class ViewedStorage:
if failed == 0:
when = datetime.now(timezone.utc) + timedelta(seconds=self.period)
t = format(when.astimezone().isoformat())
print("[stat.viewed] ⎩ next update: %s" % (
t.split("T")[0] + " " + t.split("T")[1].split(".")[0]
))
print(
"[stat.viewed] ⎩ next update: %s"
% (t.split("T")[0] + " " + t.split("T")[1].split(".")[0])
)
await asyncio.sleep(self.period)
else:
await asyncio.sleep(10)

View File

@@ -3,9 +3,10 @@ from os import environ
PORT = 8080
DB_URL = (
    environ.get("DATABASE_URL")
    or environ.get("DB_URL")
    or "postgresql://postgres@localhost:5432/discoursio"
).replace("postgres://", "postgresql://")
JWT_ALGORITHM = "HS256"
JWT_SECRET_KEY = environ.get("JWT_SECRET_KEY") or "8f1bd7696ffb482d8486dfbc6e7d16dd-secret-key"
SESSION_TOKEN_LIFE_SPAN = 30 * 24 * 60 * 60 # 1 month in seconds
@@ -22,7 +23,7 @@ for provider in OAUTH_PROVIDERS:
"id": environ.get(provider + "_OAUTH_ID"),
"key": environ.get(provider + "_OAUTH_KEY"),
}
FRONTEND_URL = environ.get("FRONTEND_URL") or "http://localhost:3000"
FRONTEND_URL = environ.get("FRONTEND_URL") or "https://localhost:3000"
SHOUTS_REPO = "content"
SESSION_TOKEN_HEADER = "Authorization"
@@ -30,4 +31,4 @@ SENTRY_DSN = environ.get("SENTRY_DSN")
SESSION_SECRET_KEY = environ.get("SESSION_SECRET_KEY") or "!secret"
# for local development
DEV_SERVER_PID_FILE_NAME = "dev-server.pid"

View File

@@ -1,23 +1,13 @@
[isort]
# https://github.com/PyCQA/isort
profile = black

[flake8]
# https://github.com/PyCQA/flake8
exclude = .git,.mypy_cache,schema_types.py
max-line-length = 100
max-complexity = 10
# select = B,C,E,F,W,T4,B9
# E203: Whitespace before ':'
# E266: Too many leading '#' for block comment
# E501: Line too long (82 > 79 characters)
@@ -25,15 +15,12 @@ select = B,C,E,F,W,T4,B9
# W503: Line break occurred before a binary operator
# F403: 'from module import *' used; unable to detect undefined names
# C901: Function is too complex
# ignore = E203,E266,E501,E722,W503,F403,C901
extend-ignore = E203
[mypy]
# https://github.com/python/mypy
ignore_missing_imports = true
warn_return_any = false
warn_unused_configs = true
exclude = schema_types.py
explicit_package_bases = true
check_untyped_defs = true
plugins = sqlmypy

File diff suppressed because one or more lines are too long

View File

@@ -1,4 +1,5 @@
from typing import Optional, Text
from pydantic import BaseModel

View File

@@ -1,4 +1,5 @@
from typing import List, Optional, Text
from pydantic import BaseModel
@@ -20,6 +21,7 @@ class Member(BaseModel):
class Chat(BaseModel):
id: int
createdAt: int
createdBy: int
users: List[int]