Compare commits


1971 Commits
main ... dev

Author SHA1 Message Date
9f16ee022b cors-applevel-fix
Some checks failed
Deploy on push / deploy (push) Failing after 40s
2025-06-02 23:59:09 +03:00
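The failing commit above moves CORS handling to the application layer. A minimal sketch of what an app-level CORS setup can look like in a Starlette application (the stack suggested by the granian and main.py commits further down); the origin list, methods, and headers here are assumptions, not the repo's actual configuration:

```python
# Hypothetical app-level CORS wiring; the real main.py may differ.
from starlette.applications import Starlette
from starlette.middleware import Middleware
from starlette.middleware.cors import CORSMiddleware

# Assumed origins, based on domains that appear in the commit history below.
ALLOWED_ORIGINS = [
    "https://discours.io",
    "https://dscrs.site",
    "https://testing.dscrs.site",
]

middleware = [
    Middleware(
        CORSMiddleware,
        allow_origins=ALLOWED_ORIGINS,
        allow_methods=["GET", "POST", "OPTIONS"],
        allow_headers=["Authorization", "Content-Type"],
        allow_credentials=True,
    )
]

app = Starlette(middleware=middleware)
```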
7bbb847eb1 tests, maintenance fixes
All checks were successful
Deploy on push / deploy (push) Successful in 1m36s
2025-05-16 09:22:53 +03:00
8a60bec73a tests upgrade 2025-05-16 09:11:39 +03:00
a6b3b21894 draft-topics-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 45s
2025-05-07 10:37:18 +03:00
51de649686 draft-topics-fix
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-05-07 10:22:30 +03:00
2b7d5a25b5 unpublish-fix7
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-05-03 11:52:14 +03:00
32cb810f51 unpublish-fix7 2025-05-03 11:52:10 +03:00
d2a8c23076 unpublish-fix5
All checks were successful
Deploy on push / deploy (push) Successful in 45s
2025-05-03 11:47:35 +03:00
96afda77a6 unpublish-fix5
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-05-03 11:35:03 +03:00
785548d055 cache-revalidation-fix
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-05-03 11:11:14 +03:00
d6202561a9 unpublish-fix4
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-05-03 11:07:03 +03:00
3fbd2e677a unpublish-fix3
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-05-03 11:00:19 +03:00
4f1eab513a unpublish-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-05-03 10:57:55 +03:00
44852a1553 delete-draft-fix2 2025-05-03 10:56:34 +03:00
58ec60262b unpublish,delete-draft-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m23s
2025-05-03 10:53:40 +03:00
5f3d90fc90 draft-publication-debug
All checks were successful
Deploy on push / deploy (push) Successful in 48s
2025-04-28 16:24:08 +03:00
f71fc7fde9 draft-publication-info
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-04-28 11:10:18 +03:00
ed71405082 topic-commented-stat-fix4
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-04-28 10:30:58 +03:00
79e1f15a2e topic-commented-stat-fix3
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-04-28 10:30:04 +03:00
b17acae0af topic-commented-stat-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-04-28 10:24:48 +03:00
d293819ad9 topic-commented-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 48s
2025-04-28 10:13:29 +03:00
bcbfdd76e9 html wrap fix
All checks were successful
Deploy on push / deploy (push) Successful in 48s
2025-04-27 12:53:49 +03:00
b735bf8cab draft-creator-adding
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-04-27 09:15:07 +03:00
20fd40df0e updated-by-auto-fix
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-04-26 23:46:07 +03:00
bde3211a5f updated-by-auto
All checks were successful
Deploy on push / deploy (push) Successful in 49s
2025-04-26 17:07:02 +03:00
4cd8883d72 validhtmlfix 2025-04-26 17:02:55 +03:00
0939e91700 empty-body-fix
All checks were successful
Deploy on push / deploy (push) Successful in 45s
2025-04-26 16:19:33 +03:00
dfbdfba2f0 draft-create-fix5
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-04-26 16:13:07 +03:00
b66e347c91 draft-create-fix
All checks were successful
Deploy on push / deploy (push) Successful in 45s
2025-04-26 16:03:41 +03:00
6d9513f1b2 reaction-by-fix4
All checks were successful
Deploy on push / deploy (push) Successful in 45s
2025-04-26 15:57:51 +03:00
af7fbd2fc9 reaction-by-fix3
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-04-26 15:50:20 +03:00
631ad47fe8 reaction-by-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-04-26 15:47:44 +03:00
3b3cc1c1d8 reaction-by-fix
All checks were successful
Deploy on push / deploy (push) Successful in 35s
2025-04-26 15:42:15 +03:00
e4943f524c reaction-by-upgrade2
All checks were successful
Deploy on push / deploy (push) Successful in 36s
2025-04-26 15:35:31 +03:00
e7684c9c05 reaction-by-upgrade
All checks were successful
Deploy on push / deploy (push) Successful in 36s
2025-04-26 14:10:05 +03:00
bdae2abe25 drafts schema restore + publish/unpublish fixes
All checks were successful
Deploy on push / deploy (push) Successful in 32s
2025-04-26 13:11:12 +03:00
a310d59432 draft-resolvers
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-04-26 11:45:16 +03:00
6b2ac09f74 unpublish-fix
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-04-26 10:16:55 +03:00
5024e963e3 notify-draft-hotfix
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-04-24 12:12:48 +03:00
aaa6022a53 draft-create-fix4
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-04-16 14:17:59 +03:00
d6ada44c7f draft-create-fix3
All checks were successful
Deploy on push / deploy (push) Successful in 48s
2025-04-16 11:51:19 +03:00
243f836f0a draft-create-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-04-16 11:48:47 +03:00
536c094e72 draft-create-fix
All checks were successful
Deploy on push / deploy (push) Successful in 48s
2025-04-16 11:45:38 +03:00
6920351b82 schema-fix
All checks were successful
Deploy on push / deploy (push) Successful in 52s
2025-04-15 20:30:12 +03:00
eb216a5f36 draft-seo-handling
All checks were successful
Deploy on push / deploy (push) Successful in 1m10s
2025-04-15 20:16:01 +03:00
bd129efde6 update-seo-handling 2025-04-15 20:14:42 +03:00
b9f6033e66 generate seo text when draft created 2025-04-15 20:09:22 +03:00
710f522c8f schema-upgrade
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-04-14 19:53:14 +03:00
0de4404cb1 draft-community
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-04-14 16:02:19 +03:00
1c61e889d6 update-draft-fix
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-04-10 22:51:07 +03:00
fdedb75a2c topics-comments-stat
All checks were successful
Deploy on push / deploy (push) Successful in 45s
2025-04-10 19:14:27 +03:00
f20000f1f6 topic.stat.authors-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 45s
2025-04-10 18:46:09 +03:00
7d50638b3a topic.stat.authors-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m20s
2025-04-10 18:39:31 +03:00
abbc074474 updateby-fix
All checks were successful
Deploy on push / deploy (push) Successful in 54s
2025-03-31 14:39:02 +03:00
4f599e097f [0.4.17] - 2025-03-26
All checks were successful
Deploy on push / deploy (push) Successful in 54s
- Fixed the `'Reaction' object is not subscriptable` error in hierarchical comments:
  - Modified `get_reactions_with_stat()` to convert Reaction objects to dictionaries (see the sketch below)
  - Added default values for the limit/offset parameters
  - Fixed the `load_first_replies()` implementation with proper parameter passing
  - Added a doctest with example usage
  - Limited child comments to 100 per parent for performance
2025-03-26 08:54:10 +03:00
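A minimal, self-contained sketch of the dict-conversion fix this entry describes, using a stand-in Reaction model (the real model and query are richer):

```python
# Return plain dicts instead of ORM objects so callers can safely use
# reaction["field"] subscripting downstream.
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Reaction(Base):  # stand-in for the project's Reaction model
    __tablename__ = "reaction"
    id = Column(Integer, primary_key=True)
    shout = Column(Integer)
    kind = Column(String)

def get_reactions_with_stat(session: Session, shout_id: int,
                            limit: int = 10, offset: int = 0) -> list[dict]:
    """Load reactions for a shout as dictionaries.

    Default limit/offset values mirror the changelog note above.

    >>> rows = get_reactions_with_stat(session, shout_id=1)  # doctest: +SKIP
    >>> isinstance(rows[0], dict)  # doctest: +SKIP
    True
    """
    rows = (
        session.query(Reaction)
        .filter(Reaction.shout == shout_id)
        .limit(limit)
        .offset(offset)
        .all()
    )
    # The dict conversion is what fixes "'Reaction' object is not subscriptable".
    return [{c.name: getattr(r, c.name) for c in Reaction.__table__.columns} for r in rows]
```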
a5eaf4bb65 commented->comments_count
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2025-03-26 08:25:18 +03:00
3c56fdfaea get_topics_paginated-fix
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-03-22 18:49:15 +03:00
81a8bf3c58 query-type-fix
All checks were successful
Deploy on push / deploy (push) Successful in 49s
2025-03-22 18:44:31 +03:00
fe9984e2d8 type-fix
All checks were successful
Deploy on push / deploy (push) Successful in 43s
2025-03-22 18:39:14 +03:00
369ff757b0 [0.4.16] - 2025-03-22
All checks were successful
Deploy on push / deploy (push) Successful in 6s
- Added hierarchical comments pagination:
  - Created a new GraphQL query `load_comments_branch` for efficient loading of hierarchical comments (see the query sketch below)
  - Added the ability to load root comments with their first N replies
  - Added pagination for both root and child comments
  - Used the existing `commented` field in the `Stat` type to display the number of replies
  - Added a special `first_replies` field to store the first replies to a comment
  - Optimized SQL queries for efficient loading of comment hierarchies
  - Implemented a flexible comment sorting system (by time, by rating)
2025-03-22 13:37:43 +03:00
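A hedged example of how a client might call the new query. Apart from `load_comments_branch`, `first_replies`, and `stat.commented`, which are named in the entry above, the argument and field names below are assumptions rather than the confirmed schema:

```python
# Sketch of a client call to the hierarchical-comments query.
import requests  # any GraphQL-capable HTTP client would do

LOAD_COMMENTS_BRANCH = """
query LoadCommentsBranch($shout: Int!, $limit: Int, $offset: Int, $sort: String) {
  load_comments_branch(shout: $shout, limit: $limit, offset: $offset, sort_by: $sort) {
    id
    body
    stat { commented }  # number of replies under this comment
    first_replies {     # the first N child comments, preloaded
      id
      body
    }
  }
}
"""

def load_comments_branch(endpoint: str, shout_id: int) -> dict:
    """Fetch root comments sorted by rating, with their first replies inlined."""
    payload = {
        "query": LOAD_COMMENTS_BRANCH,
        "variables": {"shout": shout_id, "limit": 10, "offset": 0, "sort": "rating"},
    }
    return requests.post(endpoint, json=payload, timeout=10).json()
```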
615f1fe468 topics+authors-reimplemented-cache
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2025-03-22 11:47:19 +03:00
86ddb50cb8 topics caching upgrade 2025-03-22 09:31:53 +03:00
31c32143d0 reaction-to-feature-fix
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-03-21 12:34:10 +03:00
b63c387806 jsonfix3
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-03-20 12:52:44 +03:00
dbbfd42e08 redeploy
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-03-20 12:35:55 +03:00
47e12b4452 fx2
Some checks failed
Deploy on push / deploy (push) Failing after 16s
2025-03-20 12:33:27 +03:00
e1a1b4dc7d fx
All checks were successful
Deploy on push / deploy (push) Successful in 44s
2025-03-20 12:25:18 +03:00
ca01181f37 jsonfix
All checks were successful
Deploy on push / deploy (push) Successful in 44s
2025-03-20 12:24:30 +03:00
0aff77eda6 portfix
All checks were successful
Deploy on push / deploy (push) Successful in 49s
2025-03-20 12:13:14 +03:00
8a95aa1209 jsonload-fix
All checks were successful
Deploy on push / deploy (push) Successful in 45s
2025-03-20 12:05:58 +03:00
a4a3c35f4d lesscode
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2025-03-20 12:04:47 +03:00
edece36ecc jsonenc-fix
All checks were successful
Deploy on push / deploy (push) Successful in 12s
2025-03-20 11:59:43 +03:00
247fc98760 cachedep-fix+orjson+fmt
All checks were successful
Deploy on push / deploy (push) Successful in 1m16s
2025-03-20 11:55:21 +03:00
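The `+orjson` part of this commit suggests swapping the stdlib JSON encoder for orjson. A tiny sketch of that swap; the actual call sites in the repo are assumptions:

```python
# Drop-in helpers around orjson, which is faster than stdlib json.
import orjson

def dumps(obj) -> str:
    # orjson.dumps returns bytes, so decode for APIs that expect str.
    return orjson.dumps(obj).decode("utf-8")

def loads(data):
    # orjson.loads accepts both bytes and str.
    return orjson.loads(data)
```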
a1781b3800 depfix
All checks were successful
Deploy on push / deploy (push) Successful in 1m4s
2025-03-20 11:36:12 +03:00
450c73c060 nothreads
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2025-03-20 11:30:36 +03:00
3a1924279f redeploy
All checks were successful
Deploy on push / deploy (push) Successful in 49s
2025-03-20 11:23:37 +03:00
094e7e6fe2 granian-fix
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2025-03-20 11:19:29 +03:00
ae48a18536 comment-delete-handling-patch
All checks were successful
Deploy on push / deploy (push) Successful in 1m15s
2025-03-20 11:01:39 +03:00
354bda0efa drafts-fix
All checks were successful
Deploy on push / deploy (push) Successful in 59s
2025-03-13 22:21:43 +03:00
856f4ffc85 i 2025-03-09 21:01:52 +03:00
20eba36c65 create-draft-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m47s
2025-02-27 16:16:41 +03:00
8cd0c8ea4c less-logs
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-02-19 00:23:42 +03:00
2939cd8adc pyright-conf
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-02-19 00:21:51 +03:00
41d8253094 lesslogs
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-02-14 21:49:21 +03:00
5263d1657e 0.4.11-b
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-02-12 22:34:57 +03:00
1de3d163c1 0.4.11-create_draft-fix
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-02-12 21:59:05 +03:00
d3ed335fde main_topic-fix7-debug
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2025-02-12 20:04:06 +03:00
f84be7b11b main_topic-fix7
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-02-12 19:33:02 +03:00
b011c0fd48 main_topic-fix6
All checks were successful
Deploy on push / deploy (push) Successful in 54s
2025-02-12 19:21:21 +03:00
fe661a5008 main_topic-json-fix
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-02-12 02:23:51 +03:00
e97823f99c main_topic-fix4
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-02-12 00:55:55 +03:00
a9dd593ac8 main_topic-fix3
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-02-12 00:47:39 +03:00
1585e55342 main_topic-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-02-12 00:39:25 +03:00
52b608da99 main_topic-fix
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-02-12 00:31:18 +03:00
5a4f75537d debug more
All checks were successful
Deploy on push / deploy (push) Successful in 58s
2025-02-11 23:47:54 +03:00
ce4a401c1a minor-debug
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-02-11 23:44:29 +03:00
7814e3d64d 0.4.10
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-02-11 12:40:55 +03:00
9191d83f84 usermoved
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2025-02-11 12:24:02 +03:00
5d87035885 0.4.10-a
All checks were successful
Deploy on push / deploy (push) Successful in 44s
2025-02-11 12:00:35 +03:00
25b61c6b29 simple-dockerfile
All checks were successful
Deploy on push / deploy (push) Successful in 1m41s
2025-02-10 19:10:13 +03:00
9671ef2508 author-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m22s
2025-02-10 18:38:26 +03:00
759520f024 initdb-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m13s
2025-02-10 18:15:54 +03:00
a84d8a0c7e 0.4.9-c
All checks were successful
Deploy on push / deploy (push) Successful in 7s
2025-02-10 18:04:08 +03:00
20173f7d1c trigdeploy
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2025-02-10 11:30:58 +03:00
4a835bbfba 0.4.9-b
All checks were successful
Deploy on push / deploy (push) Successful in 2m38s
2025-02-09 22:26:50 +03:00
37a9a284ef 0.4.9-drafts 2025-02-09 17:18:01 +03:00
dce05342df fmt
All checks were successful
Deploy on push / deploy (push) Successful in 58s
2025-02-04 15:27:59 +03:00
56db33d7f1 get_my_rates_comments-fix
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2025-02-04 02:53:01 +03:00
40b4703b1a get_cached_topic_followers-fix
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2025-02-04 01:40:00 +03:00
747d550d80 fix-revalidation2
All checks were successful
Deploy on push / deploy (push) Successful in 58s
2025-02-04 00:08:25 +03:00
84de0c5538 fix-revalidation
All checks were successful
Deploy on push / deploy (push) Successful in 59s
2025-02-04 00:01:54 +03:00
33ddfc6c31 after-handlers-cache
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-02-03 23:22:45 +03:00
26b862d601 more-revalidation
All checks were successful
Deploy on push / deploy (push) Successful in 1m32s
2025-02-03 23:16:50 +03:00
9fe5fea238 editor-fix
All checks were successful
Deploy on push / deploy (push) Successful in 58s
2025-02-03 19:06:00 +03:00
0347b6f5ff logs-update-shout-5
All checks were successful
Deploy on push / deploy (push) Successful in 59s
2025-02-02 21:57:51 +03:00
ffb75e53f7 logs-update-shout-4
All checks were successful
Deploy on push / deploy (push) Successful in 58s
2025-02-02 21:55:22 +03:00
582ba75643 logs-update-shout-3
All checks were successful
Deploy on push / deploy (push) Successful in 58s
2025-02-02 21:49:28 +03:00
2db1da3194 logs-update-shout-2
All checks were successful
Deploy on push / deploy (push) Successful in 58s
2025-02-02 21:45:24 +03:00
fd6b0ce5fd logs-update-shout
All checks were successful
Deploy on push / deploy (push) Successful in 58s
2025-02-02 21:41:03 +03:00
Stepan Vladovskiy
670a477f9a debug: ok, moved the map to the top layer of nginx; this version is now without the map
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-01-29 14:38:33 -03:00
Stepan Vladovskiy
46945197d9 debug: with the hardcoded domain testing.dscrs.site, and 'none' as the default, for understanding
All checks were successful
Deploy on push / deploy (push) Successful in 1m0s
2025-01-29 13:59:47 -03:00
Stepan Vladovskiy
4ebc64d13a fix: so the problem may be somewhere else, because the map is working fine; we are trying to find where the override happens. Modified main.py with some extra rules. Maybe it helps
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2025-01-28 20:03:56 -03:00
Stepan Vladovskiy
bc9560e56e feat: safe version. Debugging gave no results, so this is the simple version. For code beauty it can be rewritten from the previous no-debug version
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2025-01-28 19:38:19 -03:00
Stepan Vladovskiy
38f5aab9e0 debug: not the safe version; back to the safe map function
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2025-01-28 19:27:24 -03:00
Stepan Vladovskiy
95f49a7ca5 debug: rewrite the nginx file to work without the variables logic
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-01-28 19:23:02 -03:00
Stepan Vladovskiy
cd8f5977af debug: just in case, testing the mapping issue and trying to find where the filtering happens; in this variant, if the origin is dscrs.site then the allowed origin is discours.io
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2025-01-28 18:57:27 -03:00
Stepan Vladovskiy
a218d1309b debug: no force options and simple regexp logic
All checks were successful
Deploy on push / deploy (push) Successful in 59s
2025-01-28 18:24:10 -03:00
Stepan Vladovskiy
113d4807b2 feat: sv with force flag
All checks were successful
Deploy on push / deploy (push) Successful in 1m2s
2025-01-28 17:55:41 -03:00
Stepan Vladovskiy
9bc3cdbd0b debug: clean up the redundant testing in the CORS policy mapping, and add a custom log for the Allow-Origin header
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2025-01-28 17:48:59 -03:00
Stepan Vladovskiy
79e6402df3 debug: added a separate rule for dscrs.site in the map part of the nginx config 2025-01-28 17:48:59 -03:00
Stepan Vladovskiy
ec2e9444e3 debug: nginx conf sigil with custom header logs and the backslash for dscrs.site 2025-01-28 17:48:59 -03:00
Stepan Vladovskiy
a86a2fee85 debug: nginx conf sigil file without the custom log; add backslash handling for the dscrs.site domain 2025-01-28 17:48:59 -03:00
Stepan Vladovskiy
aec67b9db8 debug: layer with logs added to debug the missing allow_origin for the dscrs.site domain; fix the backslash 2025-01-28 17:48:59 -03:00
Stepan Vladovskiy
0bbe1d428a debug: layer with logs added to debug the missing allow_origin for the dscrs.site domain 2025-01-28 17:48:59 -03:00
Stepan Vladovskiy
a05f0afa8b debug: layer with logs added to debug the missing allow_origin for the dscrs.site domain 2025-01-28 17:48:59 -03:00
5e2842774a media-field-workarounds
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2025-01-28 15:38:10 +03:00
e17690f27b nostat
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2025-01-26 18:16:33 +03:00
cb990b61a3 gqldata 2025-01-26 18:01:04 +03:00
cc837288bb simpler-reader-field 2025-01-26 17:59:08 +03:00
4a26e4f75b fmt 2025-01-26 17:53:16 +03:00
eee2c1a13d fieldresolver-fix 2025-01-26 17:52:45 +03:00
209d5c1a5e shout-media-field-resolver 2025-01-25 15:31:23 +03:00
4f4affaca4 cache-invalidate-fix-3 2025-01-25 15:19:19 +03:00
d59710309d cache-invalidate-fix-2 2025-01-25 11:57:10 +03:00
88525276c2 cache-invalidate-fix 2025-01-25 11:23:20 +03:00
1f4b3d3eee create-shout-fix6 2025-01-22 00:43:59 +03:00
76a4c5fb53 create-shout-fix5 2025-01-21 21:54:23 +03:00
8f6b96cb0f create-shout-fix4 2025-01-21 20:53:27 +03:00
76a707c7fd create-shout-fix3 2025-01-21 20:39:54 +03:00
ae584abb5b create-shout-fix2 2025-01-21 19:58:20 +03:00
eff8278cc3 create-shout-fix 2025-01-21 19:33:28 +03:00
8432a00691 create-shout-fix2 2025-01-21 18:28:03 +03:00
1ed185a701 create-shout-fix 2025-01-21 18:19:25 +03:00
562ce3296e published_at-revert2 2025-01-21 17:52:04 +03:00
ddc2d69e54 published_at-revert 2025-01-21 17:50:02 +03:00
f6863b32e8 published_at-fix5 2025-01-21 17:44:29 +03:00
9bf9f3d384 published_at-fix4 2025-01-21 16:40:52 +03:00
998d01c751 published_at-fix3 2025-01-21 15:57:22 +03:00
57d04ddf1c published_at-fix2 2025-01-21 13:34:20 +03:00
0ba2d2ecee published_at-fix 2025-01-21 13:11:15 +03:00
839cc84c26 stat-syntetic 2025-01-21 10:21:38 +03:00
c80c282118 prepare-topics-authors-dicts 2025-01-21 10:09:49 +03:00
5acae03c55 fmt 2025-01-21 10:09:28 +03:00
49be05d4db shout-create-fix 2025-01-18 10:57:34 +03:00
ae7580252b invcache-fix6 2025-01-16 06:49:15 +03:00
7c85f51436 invcache-fix5 2025-01-16 06:42:12 +03:00
83ec475cc8 invcache-fix4 2025-01-16 06:01:47 +03:00
c1c095a73c invcache-fix3 2025-01-16 06:00:15 +03:00
c4e84364c6 invcache-fix 2025-01-16 05:53:37 +03:00
8287b82554 invalidate-cache-fix 2025-01-16 05:46:31 +03:00
56fe8bebbe invalidate-cache-fix 2025-01-16 05:45:53 +03:00
4fffd1025f debug-update-shout-2 2025-01-16 05:42:53 +03:00
576e1ea152 debug-update-shout 2025-01-16 05:34:43 +03:00
5e1021a18e corsfix-34 2024-12-24 14:22:49 +03:00
dcbdd01f53 cors-fix-33
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2024-12-24 14:04:52 +03:00
608bf8f33a tokencheckfix
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2024-12-22 11:33:57 +03:00
48994d8bfd claims
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2024-12-22 00:34:35 +03:00
4ffcbf36d3 nobearer
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2024-12-22 00:30:04 +03:00
e539e0334f Merge branch 'dev' of https://dev.discours.io/discours.io/core into dev
All checks were successful
Deploy on push / deploy (push) Successful in 55s
2024-12-22 00:24:29 +03:00
1898b3ef3f auth-debug 2024-12-22 00:22:26 +03:00
Stepan Vladovskiy
1100a1b66f debug: add dscrs.site map in cors
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2024-12-20 14:47:40 -03:00
Stepan Vladovskiy
04a0a6ddf4 debug: Sigil back to map with only discours.io domain
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2024-12-20 14:35:59 -03:00
bfbb307d6b corsfix8
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-12-17 20:26:17 +03:00
1c573f9a12 corsfix7 2024-12-17 20:17:19 +03:00
6b1533402a corsfix6 2024-12-17 20:14:01 +03:00
fdf5f795da corsfix5 2024-12-17 20:09:39 +03:00
daf5336410 corsfix4 2024-12-17 20:06:15 +03:00
0923dc61d6 corsfix3 2024-12-17 20:02:41 +03:00
4275131645 corslogs 2024-12-17 19:52:49 +03:00
c64d5971ee corsfix2 2024-12-17 19:51:00 +03:00
3968bc3910 sigilfix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-12-17 19:46:47 +03:00
99b0748129 mapfix2 2024-12-17 00:31:02 +03:00
fcaac9cc41 mapfix 2024-12-17 00:27:07 +03:00
b5c6535ee8 wh5
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-12-16 20:14:11 +03:00
cf6150b155 wh4 2024-12-16 20:10:39 +03:00
5d1bfeaa9a headercase 2024-12-16 20:07:56 +03:00
c4ffc08bae webfk 2024-12-16 20:03:00 +03:00
f73f3608c0 webhook-fix2 2024-12-16 19:50:25 +03:00
5944d9542e webhook-fix 2024-12-16 19:44:24 +03:00
2aefcd2708 corsfix 2024-12-16 19:39:31 +03:00
3af4c1ac7a issuer-port-fix 2024-12-16 19:23:45 +03:00
aff0e8b1df webhookfix 2024-12-16 19:13:16 +03:00
e4a9bfa08b authdev2 2024-12-16 19:06:47 +03:00
a41a5ad39a authdev 2024-12-16 18:57:10 +03:00
434d59a7ba nginx-fix10 2024-12-16 14:05:26 +03:00
407de622ec allow-origin-fix 2024-12-16 14:01:05 +03:00
be03e7b931 viewed-storage-update
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-12-12 02:03:19 +03:00
d02ae5bd3f fmt+debug
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-12-12 01:04:11 +03:00
87506b0478 check-inner-logix
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-12-12 00:32:27 +03:00
3a819007c1 morelogs-update
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-12-12 00:29:04 +03:00
961ba9c616 warnbetter
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-12-12 00:21:51 +03:00
7b58c7537e warn-not-found
All checks were successful
Deploy on push / deploy (push) Successful in 4s
2024-12-12 00:20:43 +03:00
a1486b3bba comments-rates-fix
All checks were successful
Deploy on push / deploy (push) Successful in 7s
2024-12-11 23:49:58 +03:00
f3c06e1969 mutation-fix-2 2024-12-11 23:06:55 +03:00
354d9c20a3 mutation-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-12-11 23:04:45 +03:00
fbcee18db1 fmt
All checks were successful
Deploy on push / deploy (push) Successful in 7s
2024-12-11 23:02:14 +03:00
c5d21c3554 check-webhook
All checks were successful
Deploy on push / deploy (push) Successful in 7s
2024-12-11 23:01:03 +03:00
4410311b80 webhook-is-mutation
All checks were successful
Deploy on push / deploy (push) Successful in 7s
2024-12-11 22:54:37 +03:00
8f5ee384ff logsdebug
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-12-11 22:52:25 +03:00
bffc48e5d9 log-auth-graphql
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-12-11 22:49:08 +03:00
9cead2ab0e search-off
All checks were successful
Deploy on push / deploy (push) Successful in 7s
2024-12-11 22:31:41 +03:00
444c853f54 webhook-fix
All checks were successful
Deploy on push / deploy (push) Successful in 7s
2024-12-11 22:21:05 +03:00
7751b0d0f8 startup-fixes
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-12-11 22:10:48 +03:00
fe93439194 webhook-add
All checks were successful
Deploy on push / deploy (push) Successful in 7s
2024-12-11 22:07:36 +03:00
6762b18135 get-author-followers-fix2 2024-12-11 21:34:43 +03:00
9439d71249 get-author-followers-fix
All checks were successful
Deploy on push / deploy (push) Successful in 7s
2024-12-11 21:25:03 +03:00
b8f86e5d5e last-commented-fix
All checks were successful
Deploy on push / deploy (push) Successful in 8s
2024-12-04 18:25:51 +03:00
597fd6ad55 last_commented_at
All checks were successful
Deploy on push / deploy (push) Successful in 8s
2024-12-04 17:40:45 +03:00
a71a6fcc41 saerch-fail-toler
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-22 20:32:14 +03:00
9dde136c9c search-fail-tolerance
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-22 20:23:45 +03:00
779cb9a87c following-error
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-22 20:19:56 +03:00
79f7c914d7 v0.4.7 2024-11-20 23:59:11 +03:00
a9d181db8f fixapi
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-18 23:23:20 +03:00
283ad80632 fasternomyreate
All checks were successful
Deploy on push / deploy (push) Successful in 7s
2024-11-18 22:24:54 +03:00
e9f9582110 sqlsynt2 2024-11-18 22:21:15 +03:00
3a5449df79 sqlsynt
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-18 22:19:06 +03:00
cf88c165ee nomyratestat2
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-18 22:16:42 +03:00
2fec47d363 nomyratestat
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-18 22:13:49 +03:00
6966d900fa myrates-api-minor-fix3 2024-11-18 22:10:25 +03:00
773615e201 myrates-api-minor-fix2 2024-11-18 22:05:45 +03:00
080ba76684 myrates-api-minor-fix
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-18 22:03:11 +03:00
25f929026f commend-id-fix
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-18 13:14:32 +03:00
47a8493824 no-my-rate
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-18 11:31:19 +03:00
821a4c0df1 info-context-debug
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-14 14:11:51 +03:00
1a371b191a ..
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-14 14:00:33 +03:00
471781f942 debug-stat-wip
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-14 13:42:40 +03:00
b4eff32427 authorized-context-debug
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-14 13:33:09 +03:00
2d0ca1c7bf myrate-fix+log
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-14 13:25:33 +03:00
88812da592 myrate-fix
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-14 13:21:32 +03:00
bffa4aa1ef unrated-fix5
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-14 01:36:15 +03:00
4adf3d5a1e unrated-fix3 2024-11-14 01:32:00 +03:00
4b111951b7 unrated-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-14 00:55:25 +03:00
b91e4ddfd1 unrated-fix2 2024-11-14 00:29:15 +03:00
cd90e7a2d0 unrated-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-14 00:26:12 +03:00
af2d8caebe toler-none2
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-12 18:52:48 +03:00
f32b6a6a27 toler-none
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-12 18:49:44 +03:00
8116160b4d my_rate-stat
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-12 17:56:20 +03:00
34511a8edf join-maintopic-unrated
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-03 11:32:05 +03:00
08fb1d3510 create-reaction-shout
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 22:38:40 +03:00
6d61e038e7 create-reaction-fix-4
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 22:34:20 +03:00
bcb602d3cf create-reaction-fix3 2024-11-02 19:48:43 +03:00
f4a8a653d0 create-reaction-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 19:16:52 +03:00
2c981bc972 create-reaction-fkx2
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 13:52:03 +03:00
b322219173 create-reaction-fkx
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 13:49:22 +03:00
52567557e8 debug-create-reaction
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-02 13:44:00 +03:00
3f1ef8dfd8 proposals-fix
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-02 13:35:30 +03:00
1b43f742d3 tolerate-double-follow
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 12:33:52 +03:00
5f3f00366f tolerate-double-follow 2024-11-02 12:33:35 +03:00
a61bb6da20 unfollow-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 12:12:19 +03:00
11611fd577 following-fixes+fmt
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 12:09:24 +03:00
09a6d085fd revalidation-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 11:56:47 +03:00
d4548f71c7 lesslogs
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 11:49:30 +03:00
9b67f1aa21 notify-follower-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 11:42:24 +03:00
2e91f9399a revalidation-follower-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 11:40:02 +03:00
0eb95e238b following-debug
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 11:35:02 +03:00
65bd2ef9cf author-created-at-fix
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-02 06:27:31 +03:00
9a6c995589 lgos
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-02 04:44:07 +03:00
8965395377 viewed-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 04:28:16 +03:00
38d39dd618 debug-create-reaction
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-02 04:24:41 +03:00
0c009495a3 async-revised
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-02 00:26:57 +03:00
54c59d26b9 media-item-type
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-01 22:57:20 +03:00
92e49c8ad9 group-by-shout
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 22:23:23 +03:00
493e6cf92c psql8
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 22:17:56 +03:00
1dcc0cf8c5 psql7
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 22:11:42 +03:00
d3daf2800e psql6
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-01 22:04:39 +03:00
d0b5c2d3f9 psql5
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 22:01:41 +03:00
0930e80b9b psql2 2024-11-01 21:52:25 +03:00
044d28cfe9 psql
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 21:49:31 +03:00
4b4234314d fields-group 2024-11-01 21:45:51 +03:00
baa8d56799 .
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 21:42:20 +03:00
d40728aec9 nodist4 2024-11-01 21:39:05 +03:00
c78347b6f9 nodist2 2024-11-01 21:35:33 +03:00
021765340a nodist
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 21:30:52 +03:00
567507c412 groupby-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 21:25:25 +03:00
8bf0566d72 row.stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 21:17:51 +03:00
0874794140 stat-dict
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 21:09:53 +03:00
154477e1ad logfix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 21:04:30 +03:00
f495953f6a media-item-type
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 21:03:09 +03:00
fba0f34020 nodistinct
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 20:28:59 +03:00
4752ef19b2 order-by-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 20:27:25 +03:00
3e50902f07 json-distinct-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 20:24:09 +03:00
a0f29eb5b8 json-builder-compat
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 20:11:58 +03:00
fcbbe4fcac fixed-shouts-load
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 20:02:46 +03:00
4ef5d172a0 results-fix
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-11-01 17:26:45 +03:00
31bd421e22 merged-hub
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-01 15:06:21 +03:00
dd60d1a1c4 deployfix
All checks were successful
Deploy on push / deploy (push) Successful in 5s
2024-11-01 14:33:34 +03:00
1892ea666a apply-options-moved
All checks were successful
Deploy on push / deploy (push) Successful in 1m26s
2024-11-01 14:29:58 +03:00
3a5297015f rating-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m15s
2024-11-01 14:09:22 +03:00
8ad00f0fa5 case-whens-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m21s
2024-11-01 14:07:10 +03:00
3247a3674f feed-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m22s
2024-11-01 14:00:19 +03:00
d88f905609 reworked-feed+reader
All checks were successful
Deploy on push / deploy (push) Successful in 1m16s
2024-11-01 13:50:47 +03:00
a01a3f1d7a reader-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 1m18s
2024-11-01 12:27:13 +03:00
75e7079087 reader-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m15s
2024-11-01 12:06:23 +03:00
7f58bf48fe row-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m16s
2024-11-01 11:57:49 +03:00
f7c41532a5 feed-fixes
All checks were successful
Deploy on push / deploy (push) Successful in 1m16s
2024-11-01 11:29:41 +03:00
a105372b15 norandomtopic-onserver-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m17s
2024-11-01 11:09:16 +03:00
54e26fb863 main-topic
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-11-01 10:29:18 +03:00
600d52414e txt
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-11-01 10:04:32 +03:00
5a9a02d3a4 0.4.6 2024-11-01 09:50:19 +03:00
bcac627345 main-x-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-11-01 07:51:33 +03:00
5dd47b3cd4 maintopic-nullable
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-11-01 07:43:10 +03:00
c9328041ce main_-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 21:58:24 +03:00
ddd18f8d70 media-type
Some checks failed
Deploy on push / deploy (push) Failing after 11s
2024-10-31 21:45:55 +03:00
1ccc5fb9e7 more-agile-query-shout-api
Some checks failed
Deploy on push / deploy (push) Failing after 16s
2024-10-31 21:11:54 +03:00
fc930a539b 5random
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 20:42:09 +03:00
e7b4e59b65 authors_and_topics-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 20:34:25 +03:00
e2b6ae5e81 agile-query
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 20:28:52 +03:00
827300366d unrated-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 19:57:09 +03:00
8c05589168 optimized-query
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 19:48:06 +03:00
f29eb5f35a separate-subq
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 19:11:41 +03:00
62370b94b3 reader-query-optimized
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 19:06:58 +03:00
1114c7766d get-shout-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 18:37:00 +03:00
0c83b9c401 query-shouts-simpler 2024-10-31 18:28:09 +03:00
f437119711 unrated-sort-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-10-31 17:47:07 +03:00
eaa23134de comments-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 17:27:07 +03:00
00fe5d91a7 dictify
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-10-31 15:41:18 +03:00
071022c63b _sa_instance..
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-10-31 15:37:32 +03:00
3ace2093b2 keep-json
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-10-31 15:32:13 +03:00
42e06bd2e6 jsongix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 15:28:41 +03:00
6dd6fd764a no-create-json
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 15:25:22 +03:00
21888c6d00 ismain-field
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 15:18:27 +03:00
2bc0ac1cff maintopicslug
Some checks failed
Deploy on push / deploy (push) Failing after 11s
2024-10-31 15:14:58 +03:00
bf3fd4b39a captionfix-2
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 15:10:26 +03:00
7eed615991 author-captions-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 15:05:22 +03:00
6e56eba0c2 oneval-subq
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 14:50:45 +03:00
5f2f4262a5 scalar
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 14:48:15 +03:00
882ef0288a whensfix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 14:29:47 +03:00
9416165699 minorfix-3
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 14:27:13 +03:00
c72588800f minorfi4
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 14:20:22 +03:00
1c6678d55d minorfix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 14:14:54 +03:00
91e4e751d8 readerfix2 2024-10-31 14:11:59 +03:00
bc4432c057 readerfix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 14:09:33 +03:00
38185273af get-shouts-with-stats-fix8 2024-10-31 14:04:28 +03:00
5fb7ba074c get-shouts-with-stats-fix7 2024-10-31 14:02:36 +03:00
d83be5247b get-shouts-with-stats-fix6 2024-10-31 14:00:56 +03:00
0f87ac6a00 get-shouts-with-stats-fix5 2024-10-31 13:59:18 +03:00
f61a2d07fe get-shouts-with-stats-fix4
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 13:52:32 +03:00
d48577b191 get-shouts-with-stats-fix3 2024-10-31 13:46:33 +03:00
4aec829c74 get-shouts-with-stats-fix2 2024-10-31 13:42:46 +03:00
d8496bf094 get-shouts-with-stats-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 13:39:38 +03:00
55a0474602 reader-reactionalias-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 13:25:05 +03:00
751f3de4b1 jsonify2
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-31 12:49:18 +03:00
5b211c349e create_shout-community-1-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m15s
2024-10-31 09:33:17 +03:00
a578e8160e unrated-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m14s
2024-10-24 16:27:16 +03:00
9ac533ee73 fmt 2024-10-24 00:01:09 +03:00
d9644f901e more-toler3
All checks were successful
Deploy on push / deploy (push) Successful in 1m12s
2024-10-24 00:00:04 +03:00
0a26f2986f more-toler2
Some checks failed
Deploy on push / deploy (push) Has been cancelled
2024-10-23 23:59:17 +03:00
7cf3f91dac more-toler
All checks were successful
Deploy on push / deploy (push) Successful in 1m13s
2024-10-23 23:57:52 +03:00
33bedbcd67 restoring-test
All checks were successful
Deploy on push / deploy (push) Successful in 1m15s
2024-10-23 11:29:44 +03:00
8de91a8232 hgetall-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m12s
2024-10-23 11:25:56 +03:00
23514ca5a4 get_shout-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m12s
2024-10-23 11:22:07 +03:00
79ab0d6a4c init-create-fix
All checks were successful
Deploy on push / deploy (push) Successful in 31s
2024-10-21 20:21:31 +03:00
1476d4262d trick-import
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-21 20:19:52 +03:00
724f901bbd community-stat-fixes
All checks were successful
Deploy on push / deploy (push) Successful in 1m4s
2024-10-21 16:57:03 +03:00
a4e48eb3f4 commynity-cudl
All checks were successful
Deploy on push / deploy (push) Successful in 1m5s
2024-10-21 16:42:50 +03:00
c6f160c8cf update-api-3
All checks were successful
Deploy on push / deploy (push) Successful in 1m12s
2024-10-21 12:15:44 +03:00
62f2876ade queryfix
All checks were successful
Deploy on push / deploy (push) Successful in 1m12s
2024-10-21 11:53:00 +03:00
93b7c6bf4d rolesfix
All checks were successful
Deploy on push / deploy (push) Successful in 1m4s
2024-10-21 11:48:51 +03:00
635ff4285e communityfollower-roles
All checks were successful
Deploy on push / deploy (push) Successful in 1m3s
2024-10-21 11:29:57 +03:00
0cf963240e virtual-cols-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m6s
2024-10-21 11:08:16 +03:00
160f02e67f 0.4.5-api-update
All checks were successful
Deploy on push / deploy (push) Successful in 1m49s
2024-10-21 10:52:23 +03:00
045d2ddadf create-all-tables-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 1m35s
2024-10-15 19:52:12 +03:00
63ebf3af2d create-all-tables-fix 2024-10-15 19:50:17 +03:00
bf33cdc95c fixed-coales
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-15 11:12:09 +03:00
76aeddbde2 ignoreup 2024-10-15 10:07:44 +03:00
3b1c4475c6 readme-update 2024-10-14 19:06:30 +03:00
5966512a8f poetry-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-10-14 18:28:16 +03:00
8b65c87750 add-fakeredis
All checks were successful
Deploy on push / deploy (push) Successful in 1m19s
2024-10-14 13:08:43 +03:00
6f6b619c11 graphql-handler-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m12s
2024-10-14 12:31:55 +03:00
3188a67661 async+fmt-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m12s
2024-10-14 12:19:30 +03:00
4e7fb953ba try-to-fix-2 2024-10-14 12:13:18 +03:00
173c865a69 try-to-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m46s
2024-10-14 11:11:13 +03:00
d5ba8d1cde correlate-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m12s
2024-10-14 10:47:38 +03:00
998db09c09 shout-query-substat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m12s
2024-10-14 09:37:40 +03:00
78d575863d logfixes 2024-10-14 09:33:31 +03:00
503e859b5c query-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m33s
2024-10-14 09:23:11 +03:00
5dc61dc397 db-init-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m13s
2024-10-14 09:12:20 +03:00
7c86d95f5e sqlite-support
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-10-14 02:05:20 +03:00
5c40ab3d00 312
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-10-13 00:49:06 +03:00
31867d3c6c ex-support
All checks were successful
Deploy on push / deploy (push) Successful in 1m59s
2024-09-27 10:18:08 +03:00
e2b54b37dd sentry-log-detailed
All checks were successful
Deploy on push / deploy (push) Successful in 1m13s
2024-08-26 21:18:33 +03:00
6a6df10825 fixd2
All checks were successful
Deploy on push / deploy (push) Successful in 1m11s
2024-08-22 16:04:23 +03:00
15ffc9eb3e restore-authorizer-dev
All checks were successful
Deploy on push / deploy (push) Successful in 1m12s
2024-08-22 15:55:26 +03:00
5095b0b4c0 get-with-stat-as-arg
All checks were successful
Deploy on push / deploy (push) Successful in 1m11s
2024-08-14 18:33:11 +03:00
4c126fd859 cache-author-fiz 2024-08-14 16:30:52 +03:00
8f3fded5fe nosp
Some checks failed
Deploy on push / deploy (push) Failing after 11s
2024-08-14 14:32:40 +03:00
96ea356c62 ms2 2024-08-12 11:16:25 +03:00
4c8f7d5958 ms
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-12 11:13:36 +03:00
c5ee827230 merged
Some checks failed
Deploy on push / deploy (push) Failing after 11s
2024-08-12 11:00:01 +03:00
208de158bc imports sort
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-09 09:37:06 +03:00
d0c1f33227 nodistinct
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-09 08:17:40 +03:00
71db929fa4 comments-counter-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-09 07:44:23 +03:00
56f1506450 followers-ids-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-09 07:35:45 +03:00
fae5f6f735 get-objects
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-09 07:26:04 +03:00
983f25d6d3 debug-followers
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-09 07:22:55 +03:00
1c9f6f30d9 debug:get_cached_topic_followers
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-09 07:14:33 +03:00
4a7b305ad4 fmt
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-08 18:57:03 +03:00
b5deb8889a follower-stat-fix 2024-08-08 18:56:49 +03:00
218bbd54da redis-fix-3
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-08 18:14:49 +03:00
531e4bf32c redis-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-08 18:13:51 +03:00
65bbbdb2b0 is_main-fix2
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-08 18:06:06 +03:00
13acff1708 get_cached_topic_followers-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-08 18:00:50 +03:00
ff9c0a0b82 redis-fixes
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-08 17:55:34 +03:00
69a848d6a7 redis-fix 2024-08-08 17:54:15 +03:00
6a13e3bb0f is_main-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-08 17:48:53 +03:00
e4266b0bab get_cached_topic_by_slug-fix
Some checks failed
Deploy on push / deploy (push) Failing after 11s
2024-08-08 17:46:25 +03:00
5bd9c9750d parse_aggregated_string-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-08 17:39:37 +03:00
e46de27ba9 get-shout-fix2
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-08 17:36:20 +03:00
7bb70c41df get-shout-fix 2024-08-08 17:36:11 +03:00
a771cd0617 reaction
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-08 17:33:55 +03:00
21d9b75a09 fix-string-agg 2024-08-08 16:20:45 +03:00
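The `string_agg` / `parse_aggregated_string` commits around this point suggest a pattern of collapsing joined rows into one aggregated string column and splitting it back in Python. A sketch under stand-in models; the real join and delimiter may differ:

```python
# Aggregate each shout's topic slugs into one string column, then parse.
from sqlalchemy import Column, Integer, String, func, select
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Topic(Base):  # stand-in models for illustration only
    __tablename__ = "topic"
    id = Column(Integer, primary_key=True)
    slug = Column(String)

class ShoutTopic(Base):
    __tablename__ = "shout_topic"
    id = Column(Integer, primary_key=True)
    shout = Column(Integer)
    topic = Column(Integer)

def shout_topics_aggregated(session: Session):
    # One row per shout, topic slugs collapsed via Postgres string_agg.
    stmt = (
        select(ShoutTopic.shout, func.string_agg(Topic.slug, ",").label("topics"))
        .join(Topic, Topic.id == ShoutTopic.topic)
        .group_by(ShoutTopic.shout)
    )
    return session.execute(stmt).all()

def parse_aggregated_string(value):
    # Split the aggregated column back into a list, tolerating NULL/empty.
    return value.split(",") if value else []
```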
71015c2ca3 fix-topic-non-body
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-08-08 16:16:40 +03:00
ea99219283 fmt
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-08 16:10:45 +03:00
0533863230 minor-fixes 2024-08-08 16:10:31 +03:00
a5ec1838b1 parser-fix
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-08-07 18:13:40 +03:00
7fb4b5bd18 follower-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 18:11:32 +03:00
87aa39959a tricky-followers-count
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 18:09:44 +03:00
8b377123e1 follower-groupby2
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 18:06:31 +03:00
fb687d50dd follower-groupby
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 18:03:30 +03:00
64e0e0ce79 followers_stat
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-08-07 18:01:12 +03:00
5a6a318b60 ismain-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-07 17:53:59 +03:00
1ce12c0980 parse-agregated-string
Some checks failed
Deploy on push / deploy (push) Failing after 11s
2024-08-07 17:52:23 +03:00
9c374d789e string_agg
Some checks failed
Deploy on push / deploy (push) Failing after 11s
2024-08-07 17:45:22 +03:00
f9a91e3a66 from-clause
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 15:36:05 +03:00
c551ca2e70 nogroupby2
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 15:31:13 +03:00
6a4785cdac nogroupby
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 15:10:37 +03:00
ec7b25df3c dist 2024-08-07 15:04:17 +03:00
c601fcc2a4 alc
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-08-07 14:54:13 +03:00
1524f141b8 distinct
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-08-07 14:49:15 +03:00
50f2c9d161 subsub3 2024-08-07 14:41:22 +03:00
7712832b76 subsub2 2024-08-07 14:38:42 +03:00
a973da5bb4 subsub
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 14:35:50 +03:00
3fde67a87d sqltypes-text
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 14:29:31 +03:00
19c9ef462e CITEXT
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 14:26:41 +03:00
56c010975c array_agg
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-08-07 14:22:24 +03:00
572f63f12b reader-loads-move 2024-08-07 14:18:05 +03:00
a01ca30f5b stat-docs-reactions-apifix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 14:02:36 +03:00
6517fc9550 groupby
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 13:57:37 +03:00
dcd9f9e0bf json-agg-fix2 2024-08-07 13:53:44 +03:00
26d83aba7a json-agg-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 13:51:35 +03:00
087f6a7157 shouts-distinc-topics-authors-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 13:47:10 +03:00
7e89a3471f import-fix
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-08-07 13:37:50 +03:00
1f9b320f04 viewed-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 13:37:08 +03:00
eba97e967b thread-lock-fix2
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-07 13:30:41 +03:00
2f65a538fa thread-lock-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-07 13:25:48 +03:00
57d25b637d sync-viewed-stat
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-07 13:15:58 +03:00
9c7a62c384 selectinload2
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 12:57:48 +03:00
41482bfd4b selectinload 2024-08-07 12:57:01 +03:00
d369cfe333 ident-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 12:49:25 +03:00
2082e2a6e5 discussed-fix
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-07 12:48:57 +03:00
7a8f0a1c21 reader-oneloop
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-08-07 12:38:15 +03:00
3febfff1db postquery-topics-authors3 2024-08-07 12:29:51 +03:00
ad320ae83e postquery-topics-authors2 2024-08-07 12:23:56 +03:00
5609184d3b all 2024-08-07 12:22:51 +03:00
1e8d2aba0a postquery-topics-authors
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 12:18:29 +03:00
ebec80f198 precache-fix2 2024-08-07 11:56:13 +03:00
2a21decc94 precache-debug 2024-08-07 11:53:31 +03:00
520b39cb0b groupbyfix 2024-08-07 11:52:16 +03:00
1b46184781 groupbyfix 2024-08-07 11:52:07 +03:00
c1675cdf32 precache-fix2 2024-08-07 11:40:32 +03:00
c5a5e449d4 precache-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 11:38:34 +03:00
69a5dfcc45 shouts-load-optimisations
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 11:35:59 +03:00
7c48a6a1dc dict-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 10:30:51 +03:00
1af63dee81 shout-stats-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 10:22:37 +03:00
d4982017f6 refactored-starting
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 09:51:09 +03:00
60a56fd098 moved 2024-08-07 08:57:56 +03:00
1d4fa4b977 loop-fix-4 2024-08-07 08:42:59 +03:00
8b1e42de1c loop-fix-3 2024-08-07 08:35:38 +03:00
6bab1b0189 loop-fix-2 2024-08-07 08:33:02 +03:00
26fcd4ba50 loop-fix
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-08-07 08:31:11 +03:00
c731639aa4 get-cached-topic
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 08:25:47 +03:00
b358a6f4a9 nocacheshout
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 08:22:08 +03:00
df25eaf905 query-fitness
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 07:27:56 +03:00
821c81dd9c redis-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-07 07:18:49 +03:00
3981fa3181 revalidation-manager
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-06 21:44:33 +03:00
a577b5510d cache-fix3
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-06 20:55:19 +03:00
1612778baa cache-fix3 2024-08-06 20:23:23 +03:00
4cbe78f81f cache-fix2 2024-08-06 20:20:20 +03:00
31d38c016e get_cached_author_by_user_id
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-06 20:05:24 +03:00
08eebd6071 cache-part2 2024-08-06 19:59:27 +03:00
c276a0eeb0 caching-wip1 2024-08-06 19:55:27 +03:00
9f91490441 trigger-fix-2
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-06 19:45:42 +03:00
e0a44ae199 indexing2
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-06 19:03:43 +03:00
ab388af35f indexing
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-06 19:01:50 +03:00
95977f0853 semaphore
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-06 18:57:35 +03:00
b823862cec caching-fix
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-08-06 18:53:25 +03:00
522718f3a1 last-comment-revert
Some checks failed
Deploy on push / deploy (push) Failing after 10s
2024-08-06 18:18:51 +03:00
dfd476411f nossl
All checks were successful
Deploy on push / deploy (push) Successful in 1m10s
2024-08-06 14:44:01 +03:00
626d76f406 fmt2
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-08-06 14:37:50 +03:00
c576fc0241 fmt
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-08-06 14:34:12 +03:00
385c8ce04b logging-fix 2024-08-06 14:33:52 +03:00
34c16c8cdf logging-sentry
All checks were successful
Deploy on push / deploy (push) Successful in 2m0s
2024-08-06 13:47:49 +03:00
2f4c8acaa2 reaction.likes fix
Some checks failed
Deploy on push / deploy (push) Failing after 56s
2024-07-30 05:19:16 +03:00
960a00101c load-comment-ratings
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-07-26 19:04:40 +03:00
c46dc759d7 load-shout-comments-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 57s
2024-07-26 16:56:30 +03:00
16728f1d49 group-by-asked
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-07-26 16:42:26 +03:00
4c625db853 group_by-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 2m7s
2024-07-23 17:36:26 +03:00
fce78df549 group_by-fix 2024-07-23 17:35:45 +03:00
a4411cfa34 comment-ratings
All checks were successful
Deploy on push / deploy (push) Successful in 1m31s
2024-07-22 11:32:47 +03:00
a43a44302b reactions-api-update
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-07-22 10:42:41 +03:00
451f041206 aliased-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-07-18 12:13:30 +03:00
6595d12108 unrated-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m10s
2024-07-18 12:07:53 +03:00
983ad12dd3 slug-filter-author-2
All checks were successful
Deploy on push / deploy (push) Successful in 1m11s
2024-07-18 09:09:48 +03:00
3ff52f944c slug-filter-author
All checks were successful
Deploy on push / deploy (push) Successful in 1m10s
2024-07-18 09:05:10 +03:00
77282ade62 load-shouts-discussed-coauthored
All checks were successful
Deploy on push / deploy (push) Successful in 2m4s
2024-07-16 01:06:43 +03:00
1223c633d4 followed-by
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-07-03 15:35:12 +03:00
d55a3050fc Merge branch 'dev' of https://dev.discours.io/discours.io/core into dev
All checks were successful
Deploy on push / deploy (push) Successful in 1m11s
2024-07-03 11:57:26 +03:00
62a2280a80 load-shouts-followed-fix 2024-07-03 11:57:17 +03:00
Stepan Vladovskiy
c57fca0aee test: increase users from one IP to 10000 to see if something changes on stress tests
All checks were successful
Deploy on push / deploy (push) Successful in 2m2s
2024-07-03 01:40:00 -03:00
612f91a708 followers-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m53s
2024-06-17 14:52:09 +03:00
a25a434ea2 check-existing-on-create
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-12 13:00:35 +03:00
ac9f1d8a40 followers-empty-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-12 12:52:07 +03:00
e32baa8d8f stat-aliased-select-fix-3 2024-06-12 12:48:09 +03:00
9580282c79 stat-aliased-select-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m10s
2024-06-12 12:26:53 +03:00
c24f3bbb4a faster-response
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-11 22:46:35 +03:00
04e20b29ee author-with-stat-cache-nonblock-2
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-06-11 17:51:34 +03:00
b2fdc9a453 parser-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-11 14:46:10 +03:00
8708efece2 stabfix
All checks were successful
Deploy on push / deploy (push) Successful in 1m10s
2024-06-09 15:49:37 +03:00
51f56c0f1f issue#842-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-09 14:02:24 +03:00
e58fbe263f reaction.shout-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-09 09:13:21 +03:00
ea28f5346c auth-debug
All checks were successful
Deploy on push / deploy (push) Successful in 1m54s
2024-06-09 09:07:48 +03:00
4743581395 load_authors_by-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-06 14:16:16 +03:00
3f12bcfd39 precache-debug-2
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-06 12:49:22 +03:00
10ad7089f4 precache-debug
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-06 12:47:43 +03:00
8d371e6519 log-loza
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-06 12:42:54 +03:00
76ee4a387c shout-link-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-06 12:37:55 +03:00
7a4c02d11d typo-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-06 12:33:58 +03:00
ae861aa8b4 fix-select-by-topic
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-06 12:23:47 +03:00
ddc5254e5f log-response
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-06 11:24:26 +03:00
543b2e6b4d load_shouts_unrated-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-06 11:13:54 +03:00
626e899ca3 get-cached-topic-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-06 11:07:49 +03:00
f5ebd0ada9 text-order-by-fix 2024-06-06 11:06:18 +03:00
afe710d955 debug-precache
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-06 09:56:21 +03:00
1946d5eda2 int
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-06 09:40:39 +03:00
3476d6e6d1 get_cached_topic_by_slug-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-06 09:23:45 +03:00
85f63a0e17 precache-synced-with-cache
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-06 08:06:10 +03:00
1cc779e17b get-author-id-by-user-id-2
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-05 23:42:25 +03:00
b04fc1ba65 get-author-id-by-user-id
Some checks failed
Deploy on push / deploy (push) Has been cancelled
2024-06-05 23:42:09 +03:00
5afa046f18 get-author-by-user
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-05 23:15:19 +03:00
fbf21ae3f9 strip-more-2
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-05 23:07:29 +03:00
12439b6ef2 strip-more
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-05 23:03:13 +03:00
b72ef072e4 fix-cache-topic
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-05 22:56:48 +03:00
ee6a636e68 fix-cache-author
Some checks failed
Deploy on push / deploy (push) Has been cancelled
2024-06-05 22:55:44 +03:00
e942fdbffa debug-precache
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-05 22:27:23 +03:00
13e609bcf7 fixed-redis-intfix4
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-05 22:20:39 +03:00
d5d5a69ab4 userid-renewal-toler
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-05 21:44:51 +03:00
53545605d0 fixed-redis-cache-4
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-05 21:40:32 +03:00
d93fa4cb4b cached-author-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-05 21:04:48 +03:00
35ef4357fb simpler-cache-author
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-05 18:51:12 +03:00
d3fe4c4aff get_cached_author-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 1m6s
2024-06-05 18:48:41 +03:00
1e0d0f465a get_cached_author-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-05 18:46:01 +03:00
6e80942beb reactions-follow-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m10s
2024-06-05 18:29:15 +03:00
67636e6d17 author-id-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m21s
2024-06-05 18:18:03 +03:00
713fb4d62b 0.4.1-following-update 2024-06-05 17:45:55 +03:00
67c299939c toler-no-author
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-05 16:23:53 +03:00
1042eb6e58 less-bloat-log
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-04 12:55:12 +03:00
db2ae09ead aifix
All checks were successful
Deploy on push / deploy (push) Successful in 1m28s
2024-06-04 11:51:39 +03:00
708bdaa7f6 ruff-update 2024-06-04 09:10:52 +03:00
9c02333e2b precache-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-04 09:07:46 +03:00
bfc177a811 exc-mw-connected
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-04 08:15:59 +03:00
d53256bcd7 exc-mw
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-04 08:10:57 +03:00
231de135ca search-fin
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-02 19:28:21 +03:00
5f36b7c6e2 search-with-images40
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 19:24:23 +03:00
23e46df8a9 search-with-images39
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 19:21:50 +03:00
6b8b61fa37 search-with-images38
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 19:19:30 +03:00
25964b6797 search-with-images36
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-02 19:14:18 +03:00
c0b3e90943 search-with-images35
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-02 19:09:02 +03:00
9c4ddea33d search-with-images34
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-02 19:06:26 +03:00
f41359b8c9 search-with-images33
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-06-02 18:59:00 +03:00
44b797c1de search-with-images32
Some checks failed
Deploy on push / deploy (push) Has been cancelled
2024-06-02 18:58:24 +03:00
4933553d50 search-with-images31
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-02 18:52:57 +03:00
93c9fcc248 search-with-images30
Some checks failed
Deploy on push / deploy (push) Has been cancelled
2024-06-02 18:52:34 +03:00
2365485a68 search-with-images29
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 18:48:54 +03:00
27bea7d06f search-with-images28
All checks were successful
Deploy on push / deploy (push) Successful in 1m10s
2024-06-02 18:47:01 +03:00
c29838b6ee search-with-images27
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 18:44:27 +03:00
c8baa6abf9 search-with-images26
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-02 18:41:09 +03:00
9358a86df1 search-with-images25
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-02 18:39:05 +03:00
7e8757ec72 search-with-images24
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-02 18:36:11 +03:00
c1fe419ff9 search-with-images22
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-02 18:34:15 +03:00
ebf1309b48 search-with-images22
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 18:01:17 +03:00
d83b459408 search-with-images20
All checks were successful
Deploy on push / deploy (push) Successful in 1m34s
2024-06-02 17:56:24 +03:00
db8472ae06 search-with-images19
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 17:47:27 +03:00
9d265fa3f9 search-with-images17
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 17:36:34 +03:00
5169cff892 search-with-images16
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 17:25:09 +03:00
8f2bd30d54 search-with-images15
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 17:16:43 +03:00
b8266c41fc search-with-images14
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 17:12:34 +03:00
1a601b93eb search-with-images13
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 17:08:50 +03:00
1b838676e3 search-with-images12
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 17:07:29 +03:00
8cc9d0d4d3 search-with-images11
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 17:01:22 +03:00
8e77a57bc1 search-with-images10
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-02 16:48:11 +03:00
e74c9688c8 search-with-images9
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-02 16:40:47 +03:00
60d6743fcd search-with-images8
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-02 16:38:38 +03:00
f42d81b9fc search-with-images7
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 16:36:12 +03:00
774240ca73 search-with-images6
All checks were successful
Deploy on push / deploy (push) Successful in 1m9s
2024-06-02 16:18:19 +03:00
fb2c31a81b search-with-images6
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 16:14:01 +03:00
eba991f4f5 search-with-images5
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 16:10:09 +03:00
0fdb056460 search-with-images4
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 16:05:59 +03:00
17da2c8359 search-with-images3
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 16:00:09 +03:00
0abb4d605d search-with-images2
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-02 15:58:14 +03:00
465d9093bd search-with-images
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-06-02 15:56:17 +03:00
67e4cacb28 scoreidcache-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-02 15:32:02 +03:00
a3d1d1b067 search-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-06-02 14:11:46 +03:00
2e5919f3e6 search-fix
Some checks failed
Deploy on push / deploy (push) Has been cancelled
2024-06-02 14:10:49 +03:00
9b2db3cc1d d-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m7s
2024-05-30 22:04:28 +03:00
9307fc97fb follower-topics-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-05-30 21:29:25 +03:00
b3a998fec2 followers-cache-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 1m27s
2024-05-30 21:13:50 +03:00
5ba7f5e3c9 followers-cache-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 48s
2024-05-30 21:10:16 +03:00
9212fbe6b5 followers-cache-fix
All checks were successful
Deploy on push / deploy (push) Successful in 48s
2024-05-30 20:55:47 +03:00
8dcd985c67 cache-fix-10
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2024-05-30 20:49:37 +03:00
c9dcd6a9c9 cache-fix-9
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2024-05-30 20:43:18 +03:00
afef19fae3 cache-fix-8
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2024-05-30 20:26:53 +03:00
2e2dc80718 cache-fix-7
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2024-05-30 20:23:32 +03:00
abc5381adb cache-fix-6
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2024-05-30 20:22:10 +03:00
75dd4120ec cache-fix-5
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2024-05-30 20:20:02 +03:00
b0637da11d cache-fix-4
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2024-05-30 20:14:26 +03:00
968935869e cache-refactored-4
All checks were successful
Deploy on push / deploy (push) Successful in 48s
2024-05-30 19:42:38 +03:00
74e000c96b cache-refactored-3
All checks were successful
Deploy on push / deploy (push) Successful in 48s
2024-05-30 19:29:57 +03:00
8dd885b6a8 cache-refactored2
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2024-05-30 19:16:50 +03:00
042cf595f7 cache-refactored
All checks were successful
Deploy on push / deploy (push) Successful in 39s
2024-05-30 19:15:11 +03:00
3712ecf8ae author:user-key
All checks were successful
Deploy on push / deploy (push) Successful in 49s
2024-05-30 18:07:41 +03:00
d20647c825 cache-fix
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2024-05-30 17:49:33 +03:00
98010ed1bc get-with-stat-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 47s
2024-05-30 15:05:06 +03:00
76d4fc675f get-with-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-30 14:45:41 +03:00
e4cc182db4 get-with-stat-debug-2
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-30 14:40:04 +03:00
9ca7a42d56 get-with-stat-debug
All checks were successful
Deploy on push / deploy (push) Successful in 30s
2024-05-30 14:38:14 +03:00
570c8a97e3 shouts-stat-fix-5
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-30 14:29:00 +03:00
3bde3ea5e9 shouts-stat-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-30 14:25:35 +03:00
d54e2a2f3f shouts-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-30 14:01:34 +03:00
a1ee49ba54 postmerge
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-30 12:49:46 +03:00
e638ad81e2 fmt+follows-refactored 2024-05-30 07:12:00 +03:00
bce43096b1 postmerge-fixes
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-27 20:14:40 +03:00
19d10b6219 Merge branch 'v2' into dev
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-27 20:11:04 +03:00
a9ab2e8bb2 cached-empty-fix-4 2024-05-27 20:03:07 +03:00
9a94e5ac56 cached-empty-fix-3 2024-05-27 19:59:16 +03:00
d93311541e cached-empty-fix-2 2024-05-27 19:57:22 +03:00
01d2d90df1 cached-empty-fix 2024-05-27 19:55:56 +03:00
7b72963b24 reply-to-fix 2024-05-27 19:39:48 +03:00
c90783f461 async-fix-3 2024-05-27 19:38:34 +03:00
9d9adfbdfa async-fix-2 2024-05-27 19:36:25 +03:00
f43624ca3d async-fix 2024-05-27 19:29:51 +03:00
3f6f7f1aa0 get-followers-fix 2024-05-27 18:30:28 +03:00
da89b20e5c session-close-fix 2024-05-26 02:17:45 +03:00
c4817c1e52 logfix
All checks were successful
Deploy on push / deploy (push) Successful in 29s
2024-05-24 13:25:05 +03:00
c444895945 log-response
All checks were successful
Deploy on push / deploy (push) Successful in 1m50s
2024-05-24 13:14:19 +03:00
9791ba4b49 result-fix6
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-21 04:40:48 +03:00
6ed144327c result-fix5
All checks were successful
Deploy on push / deploy (push) Successful in 31s
2024-05-21 04:34:08 +03:00
472801199c result-fix4
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-21 04:28:15 +03:00
a3514e6874 result-fix3
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-21 04:25:40 +03:00
95b2b97dd4 result-fix2
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-21 04:24:58 +03:00
df934a8fd2 result-fix
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-21 04:22:36 +03:00
d89fa283dc cache-postrefactor-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-21 02:56:58 +03:00
1592065a8c postfixing-reimplemented-cache
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-21 02:01:18 +03:00
4c1fbf64a2 cache-reimplement-2
All checks were successful
Deploy on push / deploy (push) Successful in 1m4s
2024-05-21 01:40:57 +03:00
3742528e3a follows-returns
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-20 19:11:07 +03:00
232892d397 isort
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-20 16:46:05 +03:00
e0b3562e80 follow/unfollow-handling-noauthor 2024-05-20 16:23:49 +03:00
71c2e8ea13 notopicid
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-18 19:30:25 +03:00
b73cce5431 create-reaction-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-18 17:41:04 +03:00
0d618116e1 compound-fix
All checks were successful
Deploy on push / deploy (push) Successful in 28s
2024-05-18 17:31:45 +03:00
b7dbaa6e73 aliased-union-fix
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-18 16:16:09 +03:00
5fe51e03bb fix-get-stat
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-18 15:40:15 +03:00
306caf9520 logs-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-18 15:26:22 +03:00
e6f42b388a logs-fix
Some checks failed
Deploy on push / deploy (push) Has been cancelled
2024-05-18 15:25:53 +03:00
fd7bd385fc queries-refactoring-2
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-18 14:15:05 +03:00
7d97f40826 cache-when-created
All checks were successful
Deploy on push / deploy (push) Successful in 28s
2024-05-18 13:57:30 +03:00
bc01dfb125 media-indexed
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-18 13:16:39 +03:00
5dfb890b84 no-delete-on-create
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-18 12:57:18 +03:00
2beb584e87 search-index-softer-check
All checks were successful
Deploy on push / deploy (push) Successful in 32s
2024-05-18 12:55:34 +03:00
1f3607b4d3 search-compare-fix
All checks were successful
Deploy on push / deploy (push) Successful in 32s
2024-05-18 12:51:41 +03:00
0051492bd3 proper-searchfields
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-18 12:48:43 +03:00
0f5df77d28 create-reaction-unauthorized-handling
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-18 12:38:46 +03:00
c80229b7b9 delete-if-wrong
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-05-18 12:11:34 +03:00
8bc7a471cd index-struct-sync
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-18 11:58:47 +03:00
91a2854537 no-remove-index-fix
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-18 11:52:17 +03:00
3d8e484187 no-delete-index 2024-05-18 11:32:30 +03:00
be6d2454b1 search-info-2
All checks were successful
Deploy on push / deploy (push) Successful in 41s
2024-05-18 11:28:38 +03:00
4e97a22642 search-service-improve
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-18 11:22:13 +03:00
a749ade30b fmt
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-18 11:00:46 +03:00
3d90d9c81d search-cached-non-empty-only-fix 2024-05-18 11:00:01 +03:00
102eae1c98 sentry-on
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-09 08:32:25 +03:00
75cd8b9f71 get-author-ref
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-09 00:02:59 +03:00
a18ad12ff7 lesslog 2024-05-08 23:57:31 +03:00
f7fdd6fd76 sentry-off
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-08 23:48:11 +03:00
80685fd1cc follows-result-update
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-08 23:42:57 +03:00
69409f92e1 redis-set-set-fix
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-07 21:56:07 +03:00
cfcb858bba new-profile-followers 2024-05-07 19:17:18 +03:00
8618e1eff7 followers-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-07 00:10:54 +03:00
e0a5c654d8 fmt
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-05-07 00:06:31 +03:00
e61db5d6e5 logs-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-05-07 00:03:58 +03:00
fac25ab4f4 followers-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-05-07 00:02:15 +03:00
ceeeb23c26 delete-reaction-update-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-05-06 22:44:30 +03:00
ce90fedacb delete-reaction-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-05-06 22:38:19 +03:00
0179c69b82 delete-reaction-debug 2024-05-06 22:37:38 +03:00
dac79b53ca api-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-06 21:18:19 +03:00
b372fd81d5 drafts-api-common
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-06 21:14:17 +03:00
205019ce39 handle-no-author-profile
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-06 21:01:10 +03:00
9c4d88c8fd handle-noauthor 2024-05-06 20:59:56 +03:00
dd2becaab2 cache_by_id-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-06 20:41:34 +03:00
658c8c7702 followers-cache-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-06 20:30:49 +03:00
809b980145 load-authors-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-06 20:04:50 +03:00
1185880f8e authors-all-fix
All checks were successful
Deploy on push / deploy (push) Successful in 37s
2024-05-06 20:00:26 +03:00
499ecb501d load-authors-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-06 19:40:51 +03:00
b3e7d24d9d shouts-counter-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-05-06 19:27:51 +03:00
78b12d4f33 followers-cache-debug-2 2024-05-06 12:46:42 +03:00
5caa2d1f8c followers-cache-debug
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-06 12:41:53 +03:00
c46f264c4b followers-fix
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-06 12:38:39 +03:00
f6b21174bf unique-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 37s
2024-05-06 11:27:15 +03:00
d15b36a0f1 unique-fix
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-06 11:23:14 +03:00
232cdbfad8 docker-check-logger
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-06 10:58:31 +03:00
55e28162fe subprocess-fix
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-06 10:53:27 +03:00
49eec2de46 Merge branch 'dev' of https://dev.discours.io/discours.io/core into dev
All checks were successful
Deploy on push / deploy (push) Successful in 30s
2024-05-06 10:30:05 +03:00
52f5a4e813 followers-cache-fix 2024-05-06 10:29:50 +03:00
a5d99fa517 cache-follower-fix 2024-05-06 10:25:09 +03:00
Stepan Vladovskiy
2a08e6204e feat: sv - nginx sigil with /upload
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-06 04:20:18 -03:00
Stepan Vladovskiy
ab6dcde170 feat: sv - nginx sigil with /upload and now / at the end of proxy_pass 2024-05-06 04:18:24 -03:00
Stepan Vladovskiy
bf9e571cd8 feat: sv - nginx sigil with /upload 2024-05-06 04:07:07 -03:00
Stepan Vladovskiy
e38df1f9d5 debug: sv - nginx sigil old fashion, a lot of /// all around 2024-05-06 03:28:31 -03:00
Stepan Vladovskiy
449f63f540 debug: sv - back configs in nginx.sigil 2024-05-06 02:56:40 -03:00
Stepan Vladovskiy
22106ad657 debug: sv - trying different configs in nginx.sigil 2024-05-06 02:54:58 -03:00
Stepan Vladovskiy
4c274eee2e debug: /upload instead of /upload/ in sigil 2024-05-06 02:04:55 -03:00
Stepan Vladovskiy
b3caccb786 debug: sv - sigil style for uploader without / at the end of the proxy_pass 2024-05-05 23:11:31 -03:00
Stepan Vladovskiy
fc033734f5 debug: with proxy-pass in nginx to uploader
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-05-05 16:55:21 -03:00
Stepan Vladovskiy
2fb21847c3 debug: nginx.conf.sigil right names
Some checks failed
Deploy on push / deploy (push) Failing after 22s
2024-05-05 16:48:11 -03:00
Stepan Vladovskiy
e4d83d35eb debug: without uploader check in server.py
Some checks failed
Deploy on push / deploy (push) Failing after 22s
2024-05-05 16:44:09 -03:00
Stepan Vladovskiy
98d7c522fb debug: run check with dokku, not docker, to find uploader
Some checks failed
Deploy on push / deploy (push) Failing after 32s
2024-05-05 16:41:39 -03:00
Stepan Vladovskiy
e6f88ffff0 feat: run check with dokku, not docker, to find uploader
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2024-05-05 16:40:33 -03:00
Stepan Vladovskiy
d26f8c4903 feat: gitea workflow if uploader then stop installer
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-05-05 16:35:19 -03:00
Stepan Vladovskiy
89021ea018 feat: gitea workflow with Uploader check running, plus a checker in server.py too 2024-05-05 16:35:19 -03:00
0d87d3d889 unique-follows-debug
All checks were successful
Deploy on push / deploy (push) Successful in 35s
2024-05-05 21:38:59 +03:00
2b5fb704ba follow/unfollow-cache-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-05-05 21:04:38 +03:00
13d144f838 cant-follow-catch-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-05-05 20:44:57 +03:00
ac5674d18f following-cache-anyway-found
All checks were successful
Deploy on push / deploy (push) Successful in 27s
2024-05-05 20:17:07 +03:00
3ab42ecb72 following-cache-anyway-found 2024-05-05 20:16:45 +03:00
cfe9ac1005 follow-fix
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-05-05 19:49:07 +03:00
e50a6358a8 merged
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-05-05 18:46:46 +03:00
f6cb7e18d1 cache-updates-fix 2024-05-05 18:46:16 +03:00
Stepan Vladovskiy
526d2c3e4e feat: sv - in nginx client_max_body_size=100M, solution for uploading large files
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-05-05 03:15:47 -03:00
c9205a698f typo-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-05-05 00:00:58 +03:00
dc791d4e7a same-rating-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-05-04 23:48:55 +03:00
b2f7b06a93 topic caching
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-05-03 14:12:57 +03:00
db33410675 lesslog
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-05-02 09:12:47 +03:00
6c58f09402 feed-filter-fix-2 2024-05-01 05:08:54 +03:00
79f21387a5 feed-filter-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-05-01 05:02:35 +03:00
dc9c66c00f follow-fmt
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-05-01 04:01:21 +03:00
c68322e550 follow-fix 2024-05-01 04:00:54 +03:00
88de00706d follow-fix 2024-05-01 04:00:37 +03:00
658a2400c4 debug-4 2024-05-01 03:46:16 +03:00
12e42f2023 debug-2 2024-05-01 03:38:49 +03:00
f1bda441b4 debug-2 2024-05-01 03:35:31 +03:00
026bad95e2 debug
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-05-01 03:29:25 +03:00
831684922a get-my-shout-dbg 2024-05-01 03:09:54 +03:00
435e97ab04 get-my-shout-debug 2024-05-01 02:46:19 +03:00
883e98c3d3 get-my-shout-debug 2024-05-01 02:42:25 +03:00
94bf54b192 get-my-shout-fix
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-05-01 02:38:17 +03:00
9aacb75e84 auth-debug
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-04-30 16:18:50 +03:00
61c7f5a0dc drafts-debug
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-04-30 14:10:01 +03:00
a7f163009e fastify-load-authors-2
All checks were successful
Deploy on push / deploy (push) Successful in 49s
2024-04-30 12:35:51 +03:00
ab6ef76a34 fastify-load-authors
All checks were successful
Deploy on push / deploy (push) Successful in 39s
2024-04-30 12:33:41 +03:00
a992941aef logs-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-27 13:30:19 +03:00
9dc986b08c start-date-views-fix
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-04-27 01:51:45 +03:00
653b18041e str-time
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-04-27 01:48:15 +03:00
868b2ba16a removed-search-vector
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-04-27 01:43:42 +03:00
2e4d70db28 viewed-fix+fmt+outerjoin-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m1s
2024-04-27 01:41:47 +03:00
89956d6240 get-shout-debug
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-26 11:43:22 +03:00
7f1794891c cache-follower-fix
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-26 11:21:00 +03:00
ee24f2f1db my-shout-not-published
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-26 11:06:13 +03:00
cfed40ddd9 not-error-expired-token
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-04-26 07:21:03 +03:00
899016907c shouts-load-debug-2
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-25 14:06:21 +03:00
54e82f99eb shouts-load-debug
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-04-25 14:03:30 +03:00
605d60f126 featured-filter-fix 2024-04-25 14:01:16 +03:00
b1bd9a4829 feed-featured-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-04-25 12:19:42 +03:00
54766ffa42 feed-featured-fix 2024-04-25 12:08:20 +03:00
27d5272032 fmt
All checks were successful
Deploy on push / deploy (push) Successful in 58s
2024-04-25 12:07:30 +03:00
e68196ce0b counters-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-25 11:47:13 +03:00
c4148254ed get-topic-fix-5
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-25 11:25:39 +03:00
1e8b6b156b get-topic-fix-4 2024-04-25 11:25:23 +03:00
b1d459d7fa get-topic-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-04-25 11:24:16 +03:00
961d86c8f9 get-topic-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-04-25 11:22:18 +03:00
1b22276237 get-topic-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-25 11:20:57 +03:00
0b185c1c2d fmt
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-24 10:42:33 +03:00
5dbb0ccb12 region-cache-fix 2024-04-24 10:42:09 +03:00
e90d5aefb2 stat-resolver-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-04-24 10:30:32 +03:00
c1a66500e5 sort-order-reimplement-synthetic-stat
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-23 16:05:27 +03:00
54980faf49 select-from-fix
All checks were successful
Deploy on push / deploy (push) Successful in 28s
2024-04-23 15:46:59 +03:00
83204d1dff left-join-fix 2024-04-23 15:27:19 +03:00
870d5b62dc isort
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-23 15:15:18 +03:00
0b4c0faa79 stat-fix 2024-04-23 15:14:59 +03:00
f64d0a09a8 color-fix
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-04-23 14:41:19 +03:00
8436bc4305 separate-stat-query
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-23 14:31:34 +03:00
8e130027f0 auth-request-data-log
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-22 12:48:57 +03:00
b7d82d9cc5 refactored-author-on-login-required
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-19 18:22:07 +03:00
0ca6676474 Merge branch 'dev' of https://dev.discours.io/discours.io/core into dev
All checks were successful
Deploy on push / deploy (push) Successful in 56s
2024-04-18 12:34:32 +03:00
1a685e458d following-fix 2024-04-18 12:34:04 +03:00
47bc3adb69 fixes 2024-04-17 20:30:05 +03:00
372185e336 webhook-fix 2024-04-17 19:54:38 +03:00
519f5e4624 use-email-login-webhook 2024-04-17 19:20:35 +03:00
c25d7e3ab6 fmt 2024-04-17 18:32:23 +03:00
937b154c6b family-name-fix 2024-04-17 18:31:11 +03:00
Stepan Vladovskiy
994cd05d85 feat: no force push dev to staging
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-11 16:14:26 -03:00
Stepan Vladovskiy
52280c29ea feat: force push dev to staging
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-11 16:12:11 -03:00
Stepan Vladovskiy
dce4d77706 feat: force push dev to staging
All checks were successful
Deploy on push / deploy (push) Successful in 6s
2024-04-11 16:11:17 -03:00
Stepan Vladovskiy
9ce0426b7e feat: force push to staging inbox 2024-04-11 15:41:00 -03:00
9911a9410d ..
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2024-04-10 16:09:03 +03:00
25868ec27b logger-fix
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 22:37:58 +03:00
25a65d09d6 tolerate
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 22:35:11 +03:00
cd99041bcc add-author-stat-fix
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 22:32:15 +03:00
1110f7d8ec any-id-fix-2
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2024-04-09 22:24:47 +03:00
e0df7e7436 any-id-fix
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 22:09:26 +03:00
44647bbf39 author-stat-fix
Some checks failed
Deploy on push / deploy (push) Failing after 5s
2024-04-09 22:06:00 +03:00
103fcfd045 trace-fix
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 22:02:26 +03:00
3f2c00a1df get-author-fix-3
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 21:53:35 +03:00
3cc680754b get-author-fix-3
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2024-04-09 21:51:24 +03:00
d7db2689c8 get-author-fix
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2024-04-09 21:46:21 +03:00
23288d1f91 query-debug-3
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 21:39:59 +03:00
1b00086148 query-debug-2
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 21:15:38 +03:00
0501b0f38e outerjoin-fix
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 21:08:47 +03:00
6703e3d093 authors-stat-fix 2024-04-09 21:06:34 +03:00
10c24fe400 topic-stat-fix 2024-04-09 21:05:24 +03:00
489c3c3232 any-fix-4 2024-04-09 20:59:03 +03:00
64f473e037 any-fix-3 2024-04-09 20:56:47 +03:00
202c8461f5 any-fix-2 2024-04-09 20:55:35 +03:00
cf64090ac3 any-fix
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 20:54:00 +03:00
f22b37cc91 has-fix
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 20:51:32 +03:00
e9fa53aff9 glitchtip
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 19:50:27 +03:00
d3262accc5 shout-topic-comments
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2024-04-09 19:48:02 +03:00
142a5f09af ..
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 19:40:44 +03:00
c6a4f04779 topic-stat-fix
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-04-09 19:38:02 +03:00
4fe15d1440 topic-stat-join-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-09 18:06:29 +03:00
e529ecbe41 params-fix-2 2024-04-09 17:57:22 +03:00
7be4642f5d params-fix
All checks were successful
Deploy on push / deploy (push) Successful in 35s
2024-04-09 17:55:07 +03:00
3fd94dc0fa notification-check
All checks were successful
Deploy on push / deploy (push) Successful in 34s
2024-04-09 17:51:23 +03:00
9e6f81606b import-fix
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-04-09 16:59:41 +03:00
2bf456b343 reactions-cache-update
All checks were successful
Deploy on push / deploy (push) Successful in 34s
2024-04-09 16:43:06 +03:00
1769b0925b follow-if-liked-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-04-09 14:03:50 +03:00
5e8c1ac30b tolerate-notifier-fails 2024-04-09 13:46:27 +03:00
6e17b89f26 author-get-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-04-09 13:41:30 +03:00
739b7b40d6 follower-id-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-04-09 13:38:44 +03:00
b3eda4a0e1 result-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-09 13:32:11 +03:00
dd0c5d15fd fix2
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-04-09 13:30:48 +03:00
e587ed05df found-author-fix 2024-04-09 13:30:02 +03:00
5bbfd2249f topic-cache-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-04-09 11:30:20 +03:00
d3ae078b20 refactored-cache-following
All checks were successful
Deploy on push / deploy (push) Successful in 36s
2024-04-09 11:17:32 +03:00
b802bb029a cache-upgrade
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-08 21:33:47 +03:00
d1cd69eb2a async-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-08 12:42:45 +03:00
c301256751 precommit
All checks were successful
Deploy on push / deploy (push) Successful in 49s
2024-04-08 10:38:58 +03:00
df15e63dde reindex-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-08 10:23:54 +03:00
aa1693cc16 sentry-init-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-08 09:17:05 +03:00
Stepan Vladovskiy
8aa133aab1 feat: nginx with limit_conn_zone 10m, changed place
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-04-07 14:31:38 -03:00
Stepan Vladovskiy
acaea73a38 feat: with limit_conn_zone 10m
Some checks failed
Deploy on push / deploy (push) Failing after 20s
2024-04-07 14:29:21 -03:00
Stepan Vladovskiy
f4c43f7c00 feat: events worker_connections in global nginx.conf
Some checks failed
Deploy on push / deploy (push) Failing after 20s
2024-04-07 14:25:56 -03:00
Stepan Vladovskiy
7c19291ba9 feat: nginx worker events config in Dockerfile
Some checks failed
Deploy on push / deploy (push) Failing after 9s
2024-04-07 13:45:59 -03:00
Stepan Vladovskiy
0da9c87f5a feat: nginx with cach, keepalive, proxy_read, users_from_one_ip, workers
Some checks failed
Deploy on push / deploy (push) Failing after 21s
2024-04-07 13:39:37 -03:00
Stepan Vladovskiy
c9369e3c08 feat: simple glitchtip setup, without all the extras
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-03 01:21:19 -03:00
Stepan Vladovskiy
4166f8e695 feat: make it all like in the docs
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-04-01 17:44:18 -03:00
Stepan Vladovskiy
c8776df610 debug: with glitchtip and middleware in main.py
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-04-01 00:25:14 -03:00
Stepan Vladovskiy
deb8da2363 feat: with glitchtip and middleware in main.py
All checks were successful
Deploy on push / deploy (push) Successful in 38s
2024-04-01 00:11:48 -03:00
Stepan Vladovskiy
1970b197a5 feat: with glitchtip in main.py
All checks were successful
Deploy on push / deploy (push) Successful in 34s
2024-04-01 00:01:38 -03:00
232f41b905 isolate-ratings
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-29 14:44:44 +03:00
c159490413 rating-fix-9
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-03-29 03:03:37 +03:00
dd840b63ca rating-fix-8
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-29 02:56:25 +03:00
d06b8eaa4e rating-fix-7
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-29 02:50:38 +03:00
d529daea25 rating-fix-6
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-29 02:45:23 +03:00
489e6b39a9 rating-fix-5
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-03-29 02:40:08 +03:00
943b52e067 rating-fix-4
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-29 02:37:26 +03:00
99895d1b94 rating-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-29 02:31:59 +03:00
3f68e25230 rating-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-03-29 02:29:16 +03:00
9cc0c5b011 rating-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-29 02:15:38 +03:00
a4dd56ee44 comments-rating-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-29 01:49:30 +03:00
53c067ff80 comments-rating-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-29 01:34:50 +03:00
cc8f08588c comments-rating-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-03-29 00:36:19 +03:00
b8f08c3411 comments-rating
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-03-29 00:29:28 +03:00
8f532b0023 author-stat-fix-9 2024-03-28 23:59:53 +03:00
4b5c101f2f author-stat-fix-8 2024-03-28 23:59:26 +03:00
f8f3a32556 author-stat-fix-7 2024-03-28 23:39:12 +03:00
8ff0e6786b author-stat-fix-6
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-03-28 23:33:56 +03:00
e9c852d23d author-stat-fix-5
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 23:26:45 +03:00
feede764bf author-stat-fix-4
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 23:19:07 +03:00
e426a2b087 author-stat-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 22:53:02 +03:00
284250770e author-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 22:51:09 +03:00
d74a6dedaa comments-count-fix
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 22:41:48 +03:00
0a767a14b6 author-rating-4
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-03-28 22:31:33 +03:00
2f4019ca6f author-rating-3
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 22:29:51 +03:00
b023773cc6 author-rating-2
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-03-28 22:26:46 +03:00
34e12975fe get-author-stat-debug
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-03-28 22:10:01 +03:00
c9605cf918 get
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 20:45:26 +03:00
ea16de3f1a rating-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 20:45:03 +03:00
d6bf3e1602 rating-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 20:42:22 +03:00
029e6af161 debloat-get-author
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 20:36:35 +03:00
5c41312b1d with-stat-cached-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-03-28 19:51:09 +03:00
495b296508 with-stat-cached-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 19:45:21 +03:00
1eeff25b4d with-stat-cached-fix
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-03-28 19:40:54 +03:00
1f012ae5c9 revalidate-stat
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-03-28 19:39:10 +03:00
77440388d3 author-refactored
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 19:36:27 +03:00
736877d50e cached-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 19:22:47 +03:00
0f57bea256 renew-stat 2024-03-28 19:21:57 +03:00
9647ec9708 scalar
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 19:16:47 +03:00
a4957ef0ad stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 19:14:39 +03:00
2d538a292a refactored-get-author-4
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 19:11:26 +03:00
9d8831d7ed refactored-get-author-3
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 19:08:55 +03:00
8826af02b5 refactored-get-author
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-03-28 19:05:27 +03:00
e103b283cb dblog-debug5
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 16:37:04 +03:00
9a12cbcdde dblog-debug3
All checks were successful
Deploy on push / deploy (push) Successful in 36s
2024-03-28 16:34:27 +03:00
6bc4fe42c4 dblog-debug3
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 16:30:04 +03:00
556857fc28 dblog-debug2
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 16:28:17 +03:00
23fb4227ad dblog-debug
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 16:25:51 +03:00
057b43730e dblog-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 16:17:34 +03:00
bb0412bb5c dblog-fix
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 16:05:28 +03:00
e9be761420 dblog-fix
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 16:01:48 +03:00
9bda7cef95 fmt
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 15:56:32 +03:00
7f913050ee author-follows-result-type-debug
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-03-28 15:48:58 +03:00
73c3d47f1b author-follows-result-type-debug
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 15:43:41 +03:00
72b9bb407d compact-author-ratings
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 15:38:14 +03:00
1eb3d54dd0 author-follows-result-type-
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 15:11:08 +03:00
e7149e905a author-follows-result-type
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 15:04:46 +03:00
2ee87c975a get_author-follows-fixed 2024-03-28 14:58:47 +03:00
cf6230e8d6 get_author-follows-fixed
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 14:57:21 +03:00
054077c99e get_author-follows-debug-3
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 14:56:08 +03:00
3d28370362 get_author-follows-debug-2
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 14:38:03 +03:00
6c9fd23e67 get_author-follows-debug
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-03-28 14:13:18 +03:00
95c54ff0c4 get_author-follows-debug
All checks were successful
Deploy on push / deploy (push) Successful in 22s
2024-03-28 14:09:11 +03:00
e2faec5893 scalar-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 14:05:46 +03:00
6f016f236d caching-fixes 2024-03-28 14:05:06 +03:00
7907e5bc4f get_author_follows-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-28 13:37:28 +03:00
65fd4df5ef get_author_follows-fix 2024-03-28 13:33:41 +03:00
235b908766 logger-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m17s
2024-03-26 11:57:00 +03:00
3eacc142f2 unrated-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-25 21:07:32 +03:00
9eb2ad21d0 filters-fix
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-03-25 20:41:28 +03:00
f03a6d0efe filter-my-fix
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-03-25 20:38:46 +03:00
e9611fc8c1 feed-filters-fix
All checks were successful
Deploy on push / deploy (push) Successful in 21s
2024-03-25 20:28:58 +03:00
337fa82fb4 last-comment-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-25 19:50:23 +03:00
d92d280595 typo-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-03-25 15:31:16 +03:00
fab57469d3 random-top-shouts
All checks were successful
Deploy on push / deploy (push) Successful in 46s
2024-03-25 15:03:03 +03:00
4daf746976 views-independent 2024-03-19 16:24:25 +03:00
e97ffacd23 update-after-debug-2 2024-03-18 15:01:43 +03:00
c346481ade update-after-debug 2024-03-18 15:01:10 +03:00
818b4ccae9 debug-get-with-stat 2024-03-14 10:21:04 +03:00
837763ed64 get-author-fix-2 2024-03-14 09:59:38 +03:00
ab36dfe233 debug-get-author 2024-03-14 09:55:14 +03:00
64b1498215 authorid-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-03-14 01:35:09 +03:00
ff7c5df8de trigdeploy
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-03-13 23:02:41 +03:00
3231e42428 query-fix 2024-03-13 15:53:40 +03:00
324f069844 following-error-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-03-13 15:35:49 +03:00
1dd34d5818 following-error 2024-03-13 15:34:17 +03:00
4c0f3087db nginx-ports
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-03-13 12:50:40 +03:00
13bff800f0 author-id-faster
Some checks failed
Deploy on push / deploy (push) Failing after 20s
2024-03-13 12:44:08 +03:00
13e2a4b7ba log-color-fix-4
Some checks failed
Deploy on push / deploy (push) Failing after 20s
2024-03-12 18:27:58 +03:00
9a15cda218 log-color-fix-3 2024-03-12 18:24:44 +03:00
695c9a97eb log-color-fix 2024-03-12 18:17:28 +03:00
b6691b1b7b logger-fix 2024-03-12 18:14:34 +03:00
4667168636 logs-fix 2024-03-12 17:48:34 +03:00
9c7c5fb8d2 multiline-logger-fix 2024-03-12 17:40:55 +03:00
e99acd591a cached-all 2024-03-12 17:26:52 +03:00
a3303837d5 cached-load-fix-2 2024-03-12 17:00:20 +03:00
567f41c0c3 cached-load-fix 2024-03-12 16:50:14 +03:00
23547546cb cached-authors-fix 2024-03-12 16:46:18 +03:00
0b8776a87f topics-fix 2024-03-12 16:23:01 +03:00
358cc86197 debug-topics 2024-03-12 16:21:28 +03:00
6064f0326a dogpiled-cache-authors 2024-03-12 16:18:07 +03:00
625836afee authorsby-not-cached 2024-03-12 16:07:21 +03:00
3e57ef5948 views-log-fix 2024-03-12 15:57:46 +03:00
9b7aa57a18 cache-reform 2024-03-12 15:50:57 +03:00
d1a510b093 use-cached-following 2024-03-12 15:28:20 +03:00
26a527473f use-cached-authors 2024-03-12 15:26:36 +03:00
d5a9a18c04 dict-fix 2024-03-12 15:05:45 +03:00
480485c20a circular-fix 2024-03-12 15:01:45 +03:00
37319c2091 cache-events-fix 2024-03-12 14:59:36 +03:00
91ffcb85df typechecker-column-fix 2024-03-12 10:52:32 +03:00
04f7231fe9 refactored-2 2024-03-12 10:36:34 +03:00
a7944f5176 refactored 2024-03-12 10:35:33 +03:00
0e1df1e7ca followers-update-fix 2024-03-12 08:00:42 +03:00
059dd0f9b4 remove-follow-debug 2024-03-11 17:07:37 +03:00
78dbde6273 dict-fix 2024-03-11 16:58:31 +03:00
e6f5cfcb8d return-error-on-follow 2024-03-11 16:52:16 +03:00
ebf08ea2ed clean 2024-03-11 16:17:52 +03:00
c6e045d5ee follows-brushed 2024-03-11 16:12:28 +03:00
4bc469ab04 dbeug-follow-3 2024-03-11 15:50:44 +03:00
11f3cdeb7c dbeug-follow-2 2024-03-11 15:41:24 +03:00
9944277908 tuple-fix 2024-03-11 15:21:34 +03:00
8b5a50b7ae author-id-fix 2024-03-11 15:19:10 +03:00
b45ad1082d author-id-fix 2024-03-11 15:18:51 +03:00
10f8faccdd follows-return 2024-03-11 15:15:28 +03:00
4898e43f57 follow-unfollow-2 2024-03-11 15:13:46 +03:00
df55b68a5a follow-unfollow 2024-03-11 14:49:42 +03:00
23be0da876 search-log 2024-03-11 13:47:12 +03:00
e50bbcdb7c debug-unfollow 2024-03-11 13:44:48 +03:00
b3196f6dcb reaction-after-fix 2024-03-11 13:41:15 +03:00
ebbd1d729e reaction-after-debug 2024-03-11 13:39:12 +03:00
e6cd0ecadc unfollow-fix-2 2024-03-11 13:37:35 +03:00
1572c77882 remove-logs 2024-03-11 12:43:37 +03:00
bda2b7b59a unfollow-debug 2024-03-11 12:41:00 +03:00
7234eb9519 less-log 2024-03-11 12:37:34 +03:00
b18ba16aab update_follows_for_author-call-fix 2024-03-11 12:25:08 +03:00
b58406866c set_follows_authors_cache-fix 2024-03-11 12:22:28 +03:00
9933545383 debug-get_author_follows 2024-03-11 12:20:50 +03:00
1c7729a5b9 query-fixes 2024-03-11 12:10:14 +03:00
e23c49b6c6 redis-fix 2024-03-11 12:07:19 +03:00
5f7087b0df follow-fix 2024-03-11 12:03:41 +03:00
1162c62a9b logger-auth 2024-03-11 11:59:20 +03:00
6243c27390 handle-no-userid 2024-03-11 11:56:14 +03:00
bf1068d070 follow-unfollow-fix 2024-03-11 11:33:39 +03:00
20cc14adc6 debug-follow 2024-03-11 11:16:12 +03:00
94be60304e refactored-get-my-shout-topics
Some checks failed
Deploy on push / deploy (push) Failing after 20s
2024-03-07 14:46:03 +03:00
0182b501fe refactored-get-my-shout-2 2024-03-07 14:44:07 +03:00
0d111bda47 refactored-get-my-shout 2024-03-07 14:42:48 +03:00
6f3ed3704a get-my-shout-api-fix 2024-03-07 14:29:45 +03:00
61088320c9 patch-main-topic-fix 2024-03-07 11:55:23 +03:00
e378cbd442 rm-reaction-fix 2024-03-07 10:18:05 +03:00
c84aae40d3 rm-reaction-debug 2024-03-07 08:13:19 +03:00
e4e681a9ab logs-with-params 2024-03-06 22:18:32 +03:00
5c7b28de90 custom-encoder-fix-3 2024-03-06 22:05:17 +03:00
7a5cbf7438 custom-encoder-fix-2 2024-03-06 22:00:37 +03:00
2b89ab7c78 custom-encoder-fix 2024-03-06 21:57:04 +03:00
4aa4303a59 groupby-fix-2 2024-03-06 15:17:46 +03:00
b13d57ca17 groupby-fix 2024-03-06 15:16:29 +03:00
54eeb5b549 subquery-fix-2 2024-03-06 15:13:05 +03:00
83f12202a8 subquery-fix 2024-03-06 15:11:01 +03:00
045217c011 shout-id-fix 2024-03-06 15:08:20 +03:00
30f5b09a51 typo-fix 2024-03-06 14:27:30 +03:00
7199539a28 reaction-cudl-log 2024-03-06 14:09:21 +03:00
2c1bfaf0fe topics-comments-stat 2024-03-06 13:43:30 +03:00
70c5233305 oauth-name-patch 2024-03-06 13:04:23 +03:00
b82a4bb2fa add_author_stat-fix-2 2024-03-06 12:34:17 +03:00
9f881c0641 add_author_stat-fix+fmt 2024-03-06 12:25:55 +03:00
70589a35da cosmetics 2024-03-06 12:15:26 +03:00
6e046a677c less-cond 2024-03-06 12:09:46 +03:00
c55f696bf3 typo-fix 2024-03-06 12:07:40 +03:00
8bbbe2b0c7 delete-reaction-fix
Some checks failed
Deploy on push / deploy (push) Failing after 18s
2024-03-06 12:03:26 +03:00
cb535cffea forbidden-fix
Some checks failed
Deploy on push / deploy (push) Failing after 20s
2024-03-06 10:44:08 +03:00
b09ea39668 get-my-shout-resolver 2024-03-05 20:12:17 +03:00
5d8c46e76c drafts-resolver-1
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-03-05 18:53:18 +03:00
b5727b1b85 update-shout-fix-10 2024-03-05 18:17:48 +03:00
13f6c43df2 update-shout-fix-9 2024-03-05 18:15:21 +03:00
f378925a16 update-shout-fix-8 2024-03-05 18:13:39 +03:00
f68778e529 update-shout-fix-7 2024-03-05 18:10:58 +03:00
fa76d6c7b4 update-shout-fix-6 2024-03-05 18:04:47 +03:00
ee7c464065 update-shout-fix-5 2024-03-05 18:01:47 +03:00
78c7a41c46 update-shout-fix-4 2024-03-05 18:01:29 +03:00
5943f9bf81 update-shout-fix-3 2024-03-05 17:59:00 +03:00
7c75c2accc update-shout-fix-2 2024-03-05 17:53:49 +03:00
12a9880815 update-shout-fix
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-03-05 16:59:55 +03:00
130942d9dd decor-order-fix 2024-03-05 14:56:19 +03:00
005889c470 less-scope-exception-5 2024-03-05 14:50:50 +03:00
16c425fd5e less-scope-exception-4 2024-03-05 14:45:53 +03:00
cc3e7b982b less-scope-exception- 2024-03-05 14:44:19 +03:00
3e96366887 less-scope-exception-2 2024-03-05 14:39:47 +03:00
c8b55d0d5b less-scope-exception 2024-03-05 14:38:04 +03:00
1099f8a185 401-ex 2024-03-05 12:50:01 +03:00
8a449bbe7a get-shout-access
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-03-05 12:00:45 +03:00
ef25ebc7bc result-fix-2
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-03-04 21:08:01 +03:00
2f4747a5de result-fix 2024-03-04 20:34:11 +03:00
e4915dcd7d debug-logs
Some checks failed
Deploy on push / deploy (push) Failing after 20s
2024-03-04 20:25:47 +03:00
b62f40d549 webhook-fix
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-03-04 20:24:17 +03:00
21bcda1e3b webhook-fix 2024-03-04 19:08:21 +03:00
5ff28ce31b schema-update 2024-03-04 15:48:04 +03:00
36fefd93be offset-entity-fix
Some checks failed
Deploy on push / deploy (push) Failing after 18s
2024-03-04 15:47:17 +03:00
abfe9f6e0e notifier-fixes
Some checks failed
Deploy on push / deploy (push) Failing after 18s
2024-03-04 13:43:02 +03:00
88ca5a1362 notifier-schema-fix
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-03-04 10:59:14 +03:00
3016a75332 notifier-integration
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-03-04 10:35:33 +03:00
ad0dc98bc9 webhook-fix 2024-03-03 16:59:15 +03:00
ab7d677a20 long-queries-only 2024-03-01 12:20:06 +03:00
da0a709ce7 sqlalchemy-warn-fix 2024-03-01 12:18:06 +03:00
ef36e38007 groupby-fix 2024-03-01 10:32:18 +03:00
3a04a69d24 typofix+topic-stat 2024-03-01 09:59:19 +03:00
c41ae4ba98 comments-stat-subquery 2024-03-01 09:56:36 +03:00
b0136fd9bc follows-return
Some checks failed
Deploy on push / deploy (push) Failing after 20s
2024-03-01 00:51:49 +03:00
bdf78bb45d comments-order-hotfix-2
Some checks failed
Deploy on push / deploy (push) Failing after 20s
2024-02-29 20:00:35 +03:00
bd905021ae coalesce-desc-sord 2024-02-29 15:52:36 +03:00
978595c246 drafts-ordered 2024-02-29 15:50:26 +03:00
dfbfa9335c get-author-slug-fix 2024-02-29 15:47:32 +03:00
1a563420d3 reaction-sort-type 2024-02-29 15:39:55 +03:00
4d992f1b60 aliased-revert 2024-02-29 15:21:46 +03:00
cc16163673 order-field-fix-2 2024-02-29 15:19:53 +03:00
395120ad7a order-field-fix 2024-02-29 15:17:42 +03:00
acb804f78c sa-warns-back 2024-02-29 15:15:04 +03:00
0437052280 comments-sort-order-fix 2024-02-29 15:10:59 +03:00
fc3bb52431 reindex-fix-6 2024-02-29 14:56:50 +03:00
cb85e24a11 recreate-fixed-4 2024-02-29 14:48:08 +03:00
c8acf6a9ac recreate-fixed-2
Some checks failed
Deploy on push / deploy (push) Failing after 18s
2024-02-29 14:41:32 +03:00
8de765ed50 recreate-fixed 2024-02-29 14:28:51 +03:00
7ad9b7919a search-logs
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-02-29 14:24:53 +03:00
5df82704b3 indexing-fix- 2024-02-29 14:17:10 +03:00
2b530131e5 indexing-fix-5 2024-02-29 14:12:35 +03:00
67d1a3ae5c indexing-fix-4 2024-02-29 14:11:48 +03:00
ca3065f741 indexing-fix-3 2024-02-29 14:09:50 +03:00
f07fd646d3 indexing-fix
Some checks failed
Deploy on push / deploy (push) Failing after 18s
2024-02-29 14:04:24 +03:00
0ea4e596d2 indexing 2024-02-29 13:55:44 +03:00
14c2750d92 search-thread-2 2024-02-29 13:49:34 +03:00
b4f86526a2 search-thread 2024-02-29 13:48:20 +03:00
24cbba0746 search-reindex-fix
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-02-29 13:43:41 +03:00
e656920f7b search-reindex-fix 2024-02-29 13:43:30 +03:00
435279735b viewes-service-refactor
Some checks failed
Deploy on push / deploy (push) Failing after 19s
2024-02-29 13:18:17 +03:00
9f30f251d6 update-shout-fix 2024-02-29 13:14:14 +03:00
d28024a69b logs-fix 2024-02-29 13:04:25 +03:00
cfb0ba910f redeploy 2024-02-29 12:14:45 +03:00
62b90d73a7 views-logs-fix 2024-02-29 11:48:18 +03:00
aaa39e0a0d no-cursor-events
All checks were successful
Deploy on push / deploy (push) Successful in 1m31s
2024-02-29 11:29:28 +03:00
5bec25fc23 less-logs
Some checks failed
Deploy on push / deploy (push) Failing after 5s
2024-02-29 11:12:54 +03:00
a3c94a9ab7 load-authors-by-fix
Some checks failed
Deploy on push / deploy (push) Failing after 5s
2024-02-29 11:00:41 +03:00
5e8b7cfe98 followers-cache-fixes 2024-02-29 10:42:17 +03:00
977b86a3c6 fix-followers-save 2024-02-29 10:39:07 +03:00
5e400a7618 redis-keys-renamed 2024-02-29 10:34:22 +03:00
10248ffd8c debug-followers-cache 2024-02-29 10:31:49 +03:00
f774c54cc2 followers-cached
Some checks failed
Deploy on push / deploy (push) Failing after 5s
2024-02-29 10:23:08 +03:00
caf45f3d42 .dict-fxt
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-02-29 10:02:29 +03:00
ad5b4a81c3 get-author-debug
Some checks failed
Deploy on push / deploy (push) Failing after 5s
2024-02-29 09:48:41 +03:00
ceecef6a7a return-none
Some checks failed
Deploy on push / deploy (push) Failing after 5s
2024-02-29 09:44:04 +03:00
b26da8f316 search-debug
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-02-29 09:34:40 +03:00
f52c13e082 staging-deploy-test
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-02-29 07:56:23 +03:00
31320c9972 revert-2-queries-less-price
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-28 19:24:05 +03:00
b99ed1a7d1 groupby-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-28 19:14:57 +03:00
6c0b43bd14 random-topic-shouts-patch-2
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-28 18:20:58 +03:00
7a3ce4a982 .c
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-28 18:15:19 +03:00
ac1fc151ab random-topic-shouts-patch
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-28 18:11:51 +03:00
129c4bccf4 get-followers-scalar-fix
All checks were successful
Deploy on push / deploy (push) Successful in 28s
2024-02-27 17:03:21 +03:00
a993741cf2 get-followers-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-27 16:56:00 +03:00
04d918749f get-followers-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-27 16:52:11 +03:00
fa7b05a86e get-author-followers-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-27 16:42:26 +03:00
eadae7f639 logger-improved-2
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-27 16:41:09 +03:00
4c328370c2 logger-improved
All checks were successful
Deploy on push / deploy (push) Successful in 29s
2024-02-27 16:33:25 +03:00
eb295549fb update-tolerate 2024-02-27 16:28:54 +03:00
2e68128dfc cache-refactored
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-27 15:40:53 +03:00
564a8c10b7 cache-author-with-stat
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-27 14:53:13 +03:00
8d058b4902 delete-shout-tolerate
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-27 14:29:28 +03:00
52f46555a7 auth-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-27 14:16:54 +03:00
fc0e3b5541 authlogs2
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-02-27 14:06:00 +03:00
def6921215 authlogs
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-27 13:56:21 +03:00
a962435898 root-auth-logs-3
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-02-27 13:55:11 +03:00
7434c47755 root-auth-logs-2 2024-02-27 13:54:47 +03:00
401c058f32 root-auth-logs
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-02-27 13:49:06 +03:00
9f49cde0d7 notuple
All checks were successful
Deploy on push / deploy (push) Successful in 30s
2024-02-27 13:40:56 +03:00
03568ecea0 login-required-async-fix
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-02-27 13:21:50 +03:00
4ee4c3595a async-create-shout-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-27 13:07:14 +03:00
82e129a589 less-fields-author-serlect-after-reaction 2024-02-27 12:58:24 +03:00
193332f6d8 Merge branch 'dev' of https://dev.discours.io/discours.io/core into dev
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-27 12:49:17 +03:00
cbd8ba6b68 authors-subquery-json-fix 2024-02-27 12:47:42 +03:00
Stepan Vladovskiy
145c5cdbc2 feat: Cors with mp3 and clean up basura
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-27 06:05:01 -03:00
ef2f8dca82 compound-select-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-27 11:22:48 +03:00
a5636af259 compound-select-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-27 11:19:46 +03:00
8914dfc8b0 select_from-author-2
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-27 11:09:04 +03:00
23b7fe7af9 select_from-author
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-27 11:07:24 +03:00
1214dc03d9 less-logs
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-27 10:53:53 +03:00
fc6b8d3a08 debug-cached-authpr
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-27 10:41:36 +03:00
3efcfef537 sort-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 20:26:57 +03:00
be27e7306c sort-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 20:07:42 +03:00
02b504cc4f no-distinct
All checks were successful
Deploy on push / deploy (push) Successful in 30s
2024-02-26 20:05:07 +03:00
02b2aad813 no-comments-stat
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 19:50:54 +03:00
2ae3f2875f comments_stat-0
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 19:44:13 +03:00
fbee450bde comments_stat
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 19:38:22 +03:00
248620622a reactions-distinct
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-02-26 18:16:52 +03:00
172b3af6df no-distinct-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-26 18:12:09 +03:00
c905666591 json-as-dict
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-26 18:04:34 +03:00
72aa96a99f dict-patch
All checks were successful
Deploy on push / deploy (push) Successful in 26s
2024-02-26 18:00:55 +03:00
431b14bf5b orderby-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 16:04:39 +03:00
3c0a1cf592 less-redis-log
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 15:56:13 +03:00
851a661c6f distinct-reactions
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 15:46:30 +03:00
fec363063d distinct
All checks were successful
Deploy on push / deploy (push) Successful in 29s
2024-02-26 15:21:12 +03:00
ced8c9f75c error-handle-create-shout-2
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 12:52:22 +03:00
4a57866c3d error-handle-create-shout
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-26 12:22:55 +03:00
a93fa7fb18 async-login-requiered
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-26 12:14:08 +03:00
2257c3375a nodict
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-26 11:57:18 +03:00
ecbeb5b85e shout-author-create-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 11:52:57 +03:00
33a59a4acc after-shouts-update-fix
All checks were successful
Deploy on push / deploy (push) Successful in 33s
2024-02-26 11:47:52 +03:00
886ca8c0ff setex-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-26 05:52:08 +03:00
ebbbe05237 get-author-fix-6
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-26 05:43:35 +03:00
8fb161470f preparing-cache-data
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-26 05:36:18 +03:00
28d2227c39 get-author-fix-5
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-26 05:23:18 +03:00
8b8a284e59 more-caching
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-26 05:06:27 +03:00
732bd2b098 caching-follows
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-26 04:58:27 +03:00
f40eff2822 events-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-26 04:46:23 +03:00
eab1700b0d get-author-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-26 04:22:06 +03:00
a00c68068f follows-cache-fix
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-26 03:49:56 +03:00
5478ff45e7 get-author-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 23s
2024-02-26 02:07:46 +03:00
8635fd9c08 comments-stat-off-2
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-26 01:24:32 +03:00
90a6e23e61 comments-stat-off
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 01:10:15 +03:00
152730526f get-author-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-26 01:06:10 +03:00
f12d2fc560 get-author-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 01:03:11 +03:00
a7f14ee473 author.stat.comments
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 00:29:14 +03:00
5ca072dfaa events-trigger-query-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-26 00:06:37 +03:00
b02b8276a6 get-author-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-25 22:45:36 +03:00
8be96daae4 cache-update-fix
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-25 21:47:14 +03:00
fc774adb9f search-authors-fix
All checks were successful
Deploy on push / deploy (push) Successful in 25s
2024-02-25 21:43:30 +03:00
8b3cfebc47 Merge remote-tracking branch 'origin/dev' into dev 2024-02-25 21:27:17 +03:00
f596a9bf2c update-author_cache 2024-02-25 21:27:07 +03:00
7a89bb2783 update-author_cache
All checks were successful
Deploy on push / deploy (push) Successful in 1m26s
2024-02-25 21:16:34 +03:00
314c54969b sa-warning-fix
Some checks failed
Deploy on push / deploy (push) Failing after 1m4s
2024-02-25 20:58:48 +03:00
c7fe7f458c aliased-author-fix
All checks were successful
Deploy on push / deploy (push) Successful in 2m12s
2024-02-25 19:44:33 +03:00
9ea10ba5c1 dockerfile-revert
All checks were successful
Deploy on push / deploy (push) Successful in 1m46s
2024-02-25 19:32:36 +03:00
695c5efbc8 dockerfile-update-4
Some checks failed
Deploy on push / deploy (push) Failing after 2m39s
2024-02-25 19:29:32 +03:00
feea5845a8 dockerfile-update-3
Some checks failed
Deploy on push / deploy (push) Failing after 28s
2024-02-25 19:27:41 +03:00
3b5a6973ef dockerfile-fix
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-02-25 19:08:20 +03:00
b12db9af0e faster-get-author
Some checks failed
Deploy on push / deploy (push) Failing after 8s
2024-02-25 19:02:15 +03:00
1e922e3161 create-all-fix
Some checks failed
Deploy on push / deploy (push) Failing after 35s
2024-02-25 18:36:08 +03:00
a760d253b3 configure-mappers-call-fix-3
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2024-02-25 18:26:23 +03:00
b5240d9508 configure-mappers-call-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 2m10s
2024-02-25 18:19:12 +03:00
4dbd593cba configure-mappers-call-fix
Some checks failed
Deploy on push / deploy (push) Failing after 35s
2024-02-25 18:08:02 +03:00
309ac2d929 desc-order-fix 2024-02-25 17:58:09 +03:00
2e635abe5e sql-text-fix-order
All checks were successful
Deploy on push / deploy (push) Successful in 1m56s
2024-02-25 17:49:15 +03:00
26c12b2aad order-by-text-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m39s
2024-02-25 17:39:38 +03:00
ad1bb4af19 search-pg-catalog 2024-02-25 16:46:27 +03:00
2222f6fc19 searchable
All checks were successful
Deploy on push / deploy (push) Successful in 4m21s
2024-02-25 16:43:04 +03:00
4b83f5d0f5 sql-text-fix
All checks were successful
Deploy on push / deploy (push) Successful in 3m35s
2024-02-25 16:15:07 +03:00
857a3648a3 pgtrgm-add
All checks were successful
Deploy on push / deploy (push) Successful in 1m49s
2024-02-25 16:12:08 +03:00
a4745df71b sql-text-fix
All checks were successful
Deploy on push / deploy (push) Successful in 2m1s
2024-02-25 16:04:15 +03:00
8b15ef9429 fmt
Some checks failed
Deploy on push / deploy (push) Failing after 7s
2024-02-25 16:00:50 +03:00
07a9e7ef56 engine-exec-2
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-02-25 15:56:28 +03:00
146d49be5b table-name-fix-2
Some checks failed
Deploy on push / deploy (push) Failing after 1m43s
2024-02-25 15:47:28 +03:00
ccc5c98a14 typo-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m43s
2024-02-25 15:33:07 +03:00
a149091e3c search-authors-fmt
All checks were successful
Deploy on push / deploy (push) Successful in 1m25s
2024-02-25 15:22:48 +03:00
9aabfacf84 little-redis-cache
All checks were successful
Deploy on push / deploy (push) Successful in 1m25s
2024-02-25 14:58:16 +03:00
9c6a349cc7 re-alias-author
All checks were successful
Deploy on push / deploy (push) Successful in 1m8s
2024-02-25 14:41:04 +03:00
fc58208bdd more-logs
All checks were successful
Deploy on push / deploy (push) Successful in 1m29s
2024-02-25 14:39:26 +03:00
60e7cd03b7 logs
All checks were successful
Deploy on push / deploy (push) Successful in 2m23s
2024-02-25 14:26:44 +03:00
5d8638867d no-ratings-check
All checks were successful
Deploy on push / deploy (push) Successful in 1m30s
2024-02-25 14:08:09 +03:00
fc0e4bb2df aliased-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m45s
2024-02-25 13:54:28 +03:00
c863dda81b ratings-subquery-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 1m35s
2024-02-25 13:45:33 +03:00
8d47c02511 ratings-subquery-fix 2024-02-25 13:43:12 +03:00
c216161ece one-joined-query
All checks were successful
Deploy on push / deploy (push) Successful in 1m32s
2024-02-25 13:29:57 +03:00
eb4a4fef61 import-fix-3
All checks were successful
Deploy on push / deploy (push) Successful in 1m19s
2024-02-25 12:10:09 +03:00
7370c8ca2d import-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 1m30s
2024-02-25 12:06:41 +03:00
42313184b0 import-fix
All checks were successful
Deploy on push / deploy (push) Successful in 3m59s
2024-02-25 11:35:06 +03:00
efa6ac7d60 get-author-followers-fix
All checks were successful
Deploy on push / deploy (push) Successful in 2m10s
2024-02-25 11:27:08 +03:00
b2357e0afb debug-stat
All checks were successful
Deploy on push / deploy (push) Successful in 1m21s
2024-02-25 09:48:16 +03:00
d58bbe3499 load-authors-by-rating
All checks were successful
Deploy on push / deploy (push) Successful in 1m46s
2024-02-25 09:31:06 +03:00
40305ad35d fix-sawarning
All checks were successful
Deploy on push / deploy (push) Successful in 1m40s
2024-02-25 00:42:22 +03:00
3097c33e44 full-traceback-on-sawarning
All checks were successful
Deploy on push / deploy (push) Successful in 2m11s
2024-02-25 00:06:54 +03:00
6f11652320 fix-int
All checks were successful
Deploy on push / deploy (push) Successful in 1m37s
2024-02-24 21:56:09 +03:00
f5b3cd8f97 debug-query-follows
All checks were successful
Deploy on push / deploy (push) Successful in 1m24s
2024-02-24 21:52:16 +03:00
eaaace4d28 fmt
All checks were successful
Deploy on push / deploy (push) Successful in 3m45s
2024-02-24 21:45:38 +03:00
12137eccda minor-fixes
All checks were successful
Deploy on push / deploy (push) Successful in 1m48s
2024-02-24 21:30:19 +03:00
d7c9622ffa int-id-fix
All checks were successful
Deploy on push / deploy (push) Successful in 5m45s
2024-02-24 21:15:11 +03:00
5e72a08e0f circular-fix-2
All checks were successful
Deploy on push / deploy (push) Successful in 3m49s
2024-02-24 20:42:19 +03:00
a3244fc74b circular-fix
Some checks failed
Deploy on push / deploy (push) Failing after 6s
2024-02-24 19:53:47 +03:00
f1444cbe10 stat-fn-moved
All checks were successful
Deploy on push / deploy (push) Successful in 1m59s
2024-02-24 19:23:53 +03:00
3e58164ae8 ratings-true
All checks were successful
Deploy on push / deploy (push) Successful in 1m42s
2024-02-24 19:12:35 +03:00
003fa1bbac types fixes 2024-02-24 13:22:35 +03:00
0ca83cc91e cache authors by id 2024-02-24 09:26:31 +03:00
02a7b64449 unauthorized-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m52s
2024-02-24 00:00:46 +03:00
dae2c7b689 select-from-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m42s
2024-02-23 23:42:49 +03:00
11ea8b7efb fieldname-fix
All checks were successful
Deploy on push / deploy (push) Successful in 2m2s
2024-02-23 23:26:12 +03:00
1edf93f7ce follows-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m43s
2024-02-23 23:15:16 +03:00
8b9ac594cd query-fixed
All checks were successful
Deploy on push / deploy (push) Successful in 2m36s
2024-02-23 22:43:50 +03:00
fbbc408df6 clean
All checks were successful
Deploy on push / deploy (push) Successful in 1m27s
2024-02-23 22:24:48 +03:00
f16f345040 topics-with-stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m23s
2024-02-23 22:14:08 +03:00
2f81a5cf12 coalesce
All checks were successful
Deploy on push / deploy (push) Successful in 1m37s
2024-02-23 21:34:02 +03:00
586672b279 fieldnames-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m48s
2024-02-23 21:27:38 +03:00
f04e20426f topic-stat-query-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m19s
2024-02-23 21:22:55 +03:00
a05072fd71 separated-follows
All checks were successful
Deploy on push / deploy (push) Successful in 1m43s
2024-02-23 21:10:11 +03:00
3bc7946ab3 stat-fix
All checks were successful
Deploy on push / deploy (push) Successful in 1m32s
2024-02-23 20:25:52 +03:00
e80b3ac770 fmt+refactor
All checks were successful
Deploy on push / deploy (push) Successful in 24s
2024-02-23 19:35:40 +03:00
14947225a6 same-shout-on-update-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-02-23 15:38:13 +03:00
2e2eba68a2 db-adapter-fixes
All checks were successful
Deploy to core / deploy (push) Successful in 1m7s
2024-02-23 15:02:14 +03:00
32bc750071 revert-auth-nocache
All checks were successful
Deploy to core / deploy (push) Successful in 1m35s
2024-02-23 14:53:14 +03:00
a0f75c0505 stat-fix-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m43s
2024-02-23 14:43:14 +03:00
5b34cab6bc add-columns-stat
All checks were successful
Deploy to core / deploy (push) Successful in 23s
2024-02-23 14:40:38 +03:00
cc80c92ad3 stat-fix
All checks were successful
Deploy to core / deploy (push) Successful in 3m59s
2024-02-23 14:34:43 +03:00
a55fa8d2ff trace-more
All checks were successful
Deploy to core / deploy (push) Successful in 1m30s
2024-02-23 14:23:13 +03:00
9999c362d4 auth-cache-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m58s
2024-02-23 14:19:54 +03:00
64012344cb alias-fix-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m25s
2024-02-23 14:09:12 +03:00
6e0da78658 alias-fix
All checks were successful
Deploy to core / deploy (push) Successful in 2m7s
2024-02-23 14:05:46 +03:00
14e2828e2d aliased-more
All checks were successful
Deploy to core / deploy (push) Successful in 2m1s
2024-02-23 13:52:31 +03:00
595e4ba87d nosearchinfo
All checks were successful
Deploy to core / deploy (push) Successful in 4m8s
2024-02-23 13:40:40 +03:00
72aa21c9cd get-topic-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m15s
2024-02-23 13:34:31 +03:00
17f79e1622 Merge branch 'dev' of https://dev.discours.io/discours.io/core into dev
All checks were successful
Deploy to core / deploy (push) Successful in 5m51s
2024-02-23 10:20:13 +03:00
ec08e85e8f select-from-fix 2024-02-23 10:14:58 +03:00
6ed09d5851 reverte 2024-02-23 04:08:29 +03:00
f8b4b0b96f gixing-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m7s
2024-02-23 03:59:28 +03:00
ef7f2d7b92 aliasing
All checks were successful
Deploy to core / deploy (push) Successful in 1m25s
2024-02-23 03:28:46 +03:00
8d97463c1d join-clause-groupby-fixes
All checks were successful
Deploy to core / deploy (push) Successful in 1m6s
2024-02-23 03:03:34 +03:00
60c7ab5fe4 separate-getter-follows
All checks were successful
Deploy to core / deploy (push) Successful in 3m44s
2024-02-23 02:53:19 +03:00
392cfb19bd separate-getter
All checks were successful
Deploy to core / deploy (push) Successful in 1m53s
2024-02-23 02:49:34 +03:00
3d34c6c540 stat-refactored
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-02-23 02:08:43 +03:00
b0e2551e9b groupby-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m23s
2024-02-23 00:03:12 +03:00
54f7dd9c1f select-from
All checks were successful
Deploy to core / deploy (push) Successful in 1m19s
2024-02-22 23:53:28 +03:00
d69f29bda3 move-author-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m26s
2024-02-22 23:32:26 +03:00
f8dafda86b no-select-from-fix 2024-02-22 23:13:29 +03:00
96b698f7ff select-from-fix-aliased
All checks were successful
Deploy to core / deploy (push) Successful in 1m53s
2024-02-22 23:13:00 +03:00
a877e1a7b8 select-from-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m44s
2024-02-22 23:07:08 +03:00
00b7aab220 debug-auth
All checks were successful
Deploy to core / deploy (push) Successful in 2m0s
2024-02-22 23:01:13 +03:00
5303aef4f0 alias-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m9s
2024-02-22 22:56:58 +03:00
078e8ab7d1 aliased
All checks were successful
Deploy to core / deploy (push) Successful in 2m1s
2024-02-22 21:22:22 +03:00
ebf342c73b webhook-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m28s
2024-02-22 21:18:20 +03:00
ce736e2624 session-commit-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m15s
2024-02-22 21:10:43 +03:00
88a0d58751 update-last-seen-aware
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-02-22 13:20:14 +03:00
4a1ee2ac80 add-topic-stats
Some checks failed
Deploy to core / deploy (push) Failing after 6s
2024-02-22 13:12:34 +03:00
a5416143df query-follows-fix
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-02-22 13:12:01 +03:00
d9abea9840 get-user-followsx
All checks were successful
Deploy to core / deploy (push) Successful in 1m25s
2024-02-22 13:07:09 +03:00
0f038ac6d7 caching-author-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m10s
2024-02-22 13:01:38 +03:00
187c14d6b0 slug-patch-on-create
All checks were successful
Deploy to core / deploy (push) Successful in 1m29s
2024-02-22 12:23:46 +03:00
8d06f59702 port-fix
Some checks failed
Deploy to core / deploy (push) Failing after 6s
2024-02-21 23:14:06 +03:00
750f00c6ac 1sec-delay
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-02-21 23:12:47 +03:00
aed1885278 row-adapt
All checks were successful
Deploy to core / deploy (push) Successful in 2m39s
2024-02-21 22:29:27 +03:00
1796d0c82d small-fix
Some checks failed
Deploy to core / deploy (push) Failing after 6m38s
2024-02-21 22:20:17 +03:00
fc3f859602 profiling-less
Some checks failed
Deploy to core / deploy (push) Failing after 6s
2024-02-21 22:16:29 +03:00
d50064a97e query-fix7
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-02-21 22:12:31 +03:00
332be3f12b query-fix6
All checks were successful
Deploy to core / deploy (push) Successful in 4m16s
2024-02-21 22:03:57 +03:00
da33ae92a9 query-fix5
All checks were successful
Deploy to core / deploy (push) Successful in 2m40s
2024-02-21 21:53:11 +03:00
f49fb2d01d db-profiling-simple
All checks were successful
Deploy to core / deploy (push) Successful in 4m19s
2024-02-21 21:47:00 +03:00
296721d2b1 fix-queru-more-2
All checks were successful
Deploy to core / deploy (push) Successful in 2m35s
2024-02-21 21:33:27 +03:00
5f4e30866f fix-queru-more
All checks were successful
Deploy to core / deploy (push) Successful in 2m31s
2024-02-21 21:25:23 +03:00
1c04125921 noworker-5
All checks were successful
Deploy to core / deploy (push) Successful in 3m11s
2024-02-21 21:04:57 +03:00
3db2efdf79 noworker-3
All checks were successful
Deploy to core / deploy (push) Successful in 4m47s
2024-02-21 20:50:26 +03:00
cb64cd66da noworker
All checks were successful
Deploy to core / deploy (push) Successful in 2m51s
2024-02-21 20:46:29 +03:00
9b2d1c96ba fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m10s
2024-02-21 20:38:12 +03:00
1f0d5ae8e8 batch-load-fix
Some checks failed
Deploy to core / deploy (push) Failing after 16m25s
2024-02-21 20:12:47 +03:00
784f790b83 stats-follows
All checks were successful
Deploy to core / deploy (push) Successful in 3m20s
2024-02-21 19:48:33 +03:00
1eac614e35 Merge branch 'dev' of https://dev.discours.io/discours.io/core into dev 2024-02-21 19:48:04 +03:00
214af0cf51 fmt 2024-02-21 19:45:53 +03:00
823e59ea74 fmt
Some checks failed
Deploy to core / deploy (push) Failing after 15m33s
2024-02-21 19:14:58 +03:00
88cd6e1060 dict-query-fix
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-02-21 19:12:24 +03:00
5b8347ee54 query-fix-2 2024-02-21 19:11:49 +03:00
2e07219732 initial-delay 2024-02-21 19:06:39 +03:00
59c46172c4 almost-dict
Some checks are pending
Deploy to core / deploy (push) Waiting to run
2024-02-21 19:03:49 +03:00
2e3d85b43d select-fix
Some checks are pending
Deploy to core / deploy (push) Waiting to run
2024-02-21 18:55:21 +03:00
b7cbef01a3 dictify
All checks were successful
Deploy to core / deploy (push) Successful in 2m31s
2024-02-21 18:51:37 +03:00
3f361b1af7 sqlfix
Some checks are pending
Deploy to core / deploy (push) Waiting to run
2024-02-21 18:38:15 +03:00
3ae706d6db healhchecks-warn-out
All checks were successful
Deploy to core / deploy (push) Successful in 1m55s
2024-02-21 18:33:42 +03:00
960cdf30da batch-render-follows
Some checks are pending
Deploy to core / deploy (push) Waiting to run
2024-02-21 18:26:18 +03:00
ab31d0d296 query_follows-fix
Some checks are pending
Deploy to core / deploy (push) Waiting to run
2024-02-21 18:13:43 +03:00
67fa44b062 redis-save-fi
Some checks failed
Deploy to core / deploy (push) Failing after 1m45s
2024-02-21 18:07:02 +03:00
74e639737e profiling-fix-2 2024-02-21 18:03:02 +03:00
be9f62eb76 profiling-db
Some checks failed
Deploy to core / deploy (push) Failing after 2m5s
2024-02-21 17:55:54 +03:00
e69046a1f8 cache-fixed
Some checks failed
Deploy to core / deploy (push) Failing after 15m39s
2024-02-21 17:37:58 +03:00
63f5a708b7 update-redis-api
Some checks failed
Deploy to core / deploy (push) Failing after 1m27s
2024-02-21 16:06:24 +03:00
33330fb052 logger-restore
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-02-21 14:23:42 +03:00
a40eb878be async-events-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m35s
2024-02-21 14:21:04 +03:00
9da452c2f0 follower-resolver-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m30s
2024-02-21 13:59:17 +03:00
3b867ded20 redis-hset-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m25s
2024-02-21 13:51:07 +03:00
2cfcab744e fmt
All checks were successful
Deploy to core / deploy (push) Successful in 1m55s
2024-02-21 13:47:33 +03:00
f75eb13971 less-log 2024-02-21 13:45:33 +03:00
9118ae9268 logger-query-id
All checks were successful
Deploy to core / deploy (push) Successful in 1m13s
2024-02-21 13:44:36 +03:00
4ca884f257 debug-get-author-but-userid
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-02-21 13:27:00 +03:00
9c14f4b4d3 logger-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m51s
2024-02-21 13:22:46 +03:00
fb48bee8df get_author_by_user_id-fix
Some checks failed
Deploy to core / deploy (push) Failing after 6s
2024-02-21 13:16:39 +03:00
ba436de055 lesslog
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-02-21 13:13:43 +03:00
253ee11bb9 logger-timing-logix-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m27s
2024-02-21 13:09:40 +03:00
731f9a45df logger-timing-logix-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-02-21 13:02:49 +03:00
73f020ae5d fix-circular
All checks were successful
Deploy to core / deploy (push) Successful in 1m59s
2024-02-21 12:34:12 +03:00
762857ffbe trigger-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m59s
2024-02-21 12:22:55 +03:00
8f6416a73c trigger-get-author-fixes
Some checks failed
Deploy to core / deploy (push) Failing after 3m32s
2024-02-21 12:10:30 +03:00
4cde1c14b4 handle-shouts-paginating
Some checks failed
Deploy to core / deploy (push) Failing after 1m33s
2024-02-21 11:59:47 +03:00
ee577c75fd graphql-schema-update
Some checks failed
Deploy to core / deploy (push) Failing after 1m46s
2024-02-21 11:52:57 +03:00
9eee73acf3 shouts-follows
All checks were successful
Deploy to core / deploy (push) Successful in 5m54s
2024-02-21 11:35:13 +03:00
7cf702eb98 fmt
All checks were successful
Deploy to core / deploy (push) Successful in 2m0s
2024-02-21 10:27:16 +03:00
4f26812340 appdata-triggers
All checks were successful
Deploy to core / deploy (push) Successful in 1m16s
2024-02-20 21:57:39 +03:00
66f1c654cf format-multiline-log-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m8s
2024-02-20 19:45:55 +03:00
abc752c629 format-multiline-log-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m10s
2024-02-20 19:42:14 +03:00
333340056e logger-3301
All checks were successful
Deploy to core / deploy (push) Successful in 1m25s
2024-02-20 19:37:20 +03:00
3c03688544 logger-3000
Some checks failed
Deploy to core / deploy (push) Failing after 1m9s
2024-02-20 19:33:24 +03:00
b59a8ef323 root-logger
All checks were successful
Deploy to core / deploy (push) Successful in 1m27s
2024-02-20 19:23:38 +03:00
183755e637 one-logger
All checks were successful
Deploy to core / deploy (push) Successful in 1m24s
2024-02-20 19:19:46 +03:00
822815fdac logger-3
All checks were successful
Deploy to core / deploy (push) Successful in 1m10s
2024-02-20 19:01:50 +03:00
9f10a23345 typo-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m24s
2024-02-20 18:46:30 +03:00
86754c341d log
Some checks failed
Deploy to core / deploy (push) Failing after 2m7s
2024-02-20 18:42:14 +03:00
20e9add575 log
Some checks failed
Deploy to core / deploy (push) Failing after 3m51s
2024-02-20 18:37:53 +03:00
b5cdface63 logger-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m25s
2024-02-20 18:28:38 +03:00
dd2301343f loggerfix
All checks were successful
Deploy to core / deploy (push) Successful in 1m25s
2024-02-20 18:22:54 +03:00
f7d0d10d50 debug-auth 2024-02-20 18:20:57 +03:00
e85c179d93 muiltilinelog
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-02-20 18:16:17 +03:00
d8a4481aab logger-fix
All checks were successful
Deploy to core / deploy (push) Successful in 3m17s
2024-02-20 18:10:36 +03:00
cbb4533855 depfix
All checks were successful
Deploy to core / deploy (push) Successful in 1m32s
2024-02-20 18:04:59 +03:00
40e52b4d71 nosentry
Some checks failed
Deploy to core / deploy (push) Failing after 1m56s
2024-02-20 17:54:43 +03:00
f283ea048b logs-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m28s
2024-02-20 17:49:21 +03:00
0febd91b25 logs-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m22s
2024-02-20 17:46:33 +03:00
0e701020bb lesslog
All checks were successful
Deploy to core / deploy (push) Successful in 3m34s
2024-02-20 17:27:30 +03:00
0d1b73878e debug-auth 2024-02-20 17:22:55 +03:00
5af3dcb132 typo-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m26s
2024-02-20 12:58:16 +03:00
8b08e23801 fixmodel
All checks were successful
Deploy to core / deploy (push) Successful in 1m27s
2024-02-20 12:53:15 +03:00
6377bc3d64 revert
Some checks failed
Deploy to core / deploy (push) Failing after 6s
2024-02-20 12:40:22 +03:00
811086de83 simpler-author-model
All checks were successful
Deploy to core / deploy (push) Successful in 1m22s
2024-02-20 12:04:45 +03:00
a00fe8b8ef orm-update2
All checks were successful
Deploy to core / deploy (push) Successful in 1m28s
2024-02-20 11:53:55 +03:00
d590884dca change-index
All checks were successful
Deploy to core / deploy (push) Successful in 3m51s
2024-02-20 11:47:37 +03:00
da9ccbd0cc ratings-model-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m29s
2024-02-20 10:52:30 +03:00
69984788fa no-unique-index
All checks were successful
Deploy to core / deploy (push) Successful in 1m33s
2024-02-19 17:22:38 +03:00
981a4c4fce buildsystemver-fix-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m10s
2024-02-19 16:29:05 +03:00
67d6d7134a buildsystemver-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m24s
2024-02-19 16:23:24 +03:00
2d75593cc2 realname-core
All checks were successful
Deploy to core / deploy (push) Successful in 1m57s
2024-02-19 16:18:35 +03:00
e483ea9329 no-searchclient-info
All checks were successful
Deploy to core / deploy (push) Successful in 1m24s
2024-02-19 16:07:52 +03:00
09887bc516 handle-exception
All checks were successful
Deploy to core / deploy (push) Successful in 1m23s
2024-02-19 15:51:09 +03:00
74233e96ff auth-cached-fix-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m31s
2024-02-19 15:31:51 +03:00
e5edc97ab1 auth-cached-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m52s
2024-02-19 15:18:25 +03:00
75edee4fe9 we-all-made-of-stars
All checks were successful
Deploy to core / deploy (push) Successful in 1m21s
2024-02-19 14:54:13 +03:00
37230a8392 ruff-up 2024-02-19 14:46:45 +03:00
6d3c0ee39e isort+authfix
All checks were successful
Deploy to core / deploy (push) Successful in 1m36s
2024-02-19 14:45:55 +03:00
b89060f15f model-index-slug
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-02-19 13:25:47 +03:00
8193bd0178 all-cached-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m30s
2024-02-19 13:16:44 +03:00
1fa97908b2 auth-cached-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m38s
2024-02-19 12:56:58 +03:00
a39db6991c depfix
All checks were successful
Deploy to core / deploy (push) Successful in 2m18s
2024-02-19 12:49:33 +03:00
add5f6df63 cache-jwt-validation
Some checks failed
Deploy to core / deploy (push) Failing after 28s
2024-02-19 12:40:26 +03:00
cf8934c605 schema-main 2024-02-19 11:58:31 +03:00
680242f1e3 schema-main 2024-02-19 11:58:02 +03:00
0301d8041d schema-move-test
All checks were successful
Deploy to core / deploy (push) Successful in 1m54s
2024-02-19 11:20:13 +03:00
2464b91f9b fix-env
Some checks failed
Deploy to core / deploy (push) Failing after 5s
2024-02-19 11:16:48 +03:00
ddf203a869 healthcheck
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-02-19 11:15:53 +03:00
b01bf77d8e fix-pjs
Some checks failed
Deploy to core / deploy (push) Failing after 1m18s
2024-02-19 11:13:05 +03:00
22466e65e2 4threads-1worker
Some checks failed
Deploy to core / deploy (push) Failing after 8s
2024-02-19 11:12:00 +03:00
e4036c8a79 no-aiohttp
Some checks failed
Deploy to core / deploy (push) Failing after 9s
2024-02-19 11:11:13 +03:00
5772db6a36 query-time-log
Some checks failed
Deploy to core / deploy (push) Failing after 10s
2024-02-19 11:10:12 +03:00
f01dde845c fixrating
Some checks failed
Deploy to core / deploy (push) Failing after 5s
2024-02-19 10:33:15 +03:00
e6720ccaaf restore-struct
Some checks failed
Deploy to core / deploy (push) Failing after 7s
2024-02-19 10:14:14 +03:00
7b8e9fbea6 sql-profiling
Some checks failed
Deploy to core / deploy (push) Failing after 1m24s
2024-02-19 10:06:46 +03:00
aa55e952aa dockerfile fix 2024-02-19 09:56:23 +03:00
f74358be76 rating in orm
Some checks failed
Deploy to core / deploy (push) Failing after 3m55s
2024-02-19 09:50:15 +03:00
ca22ac9b13 dev-deploy
All checks were successful
Deploy to core / deploy (push) Successful in 1m21s
2024-02-19 09:40:30 +03:00
1092b8a2ca ml
All checks were successful
Deploy to core / deploy (push) Successful in 6s
2024-02-19 09:38:18 +03:00
a1ed480567 shout-id
All checks were successful
Deploy to core / deploy (push) Successful in 6s
2024-02-17 21:55:50 +03:00
f3df37a41b update-reaction-fix-3
All checks were successful
Deploy to core / deploy (push) Successful in 1m24s
2024-02-17 21:44:22 +03:00
c6df11dc7d update-reaction-fix-2
Some checks failed
Deploy to core / deploy (push) Failing after 1m9s
2024-02-17 21:04:01 +03:00
47ecf4bd1a dockerfile-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m52s
2024-02-17 13:25:24 +03:00
93d536bdba dockerfile-fix
Some checks failed
Deploy to core / deploy (push) Failing after 30s
2024-02-17 13:21:52 +03:00
8a4e4ce6d5 linter-update
Some checks failed
Deploy to core / deploy (push) Failing after 50s
2024-02-17 13:18:54 +03:00
92246bc9d1 create-update-shout-fix
Some checks failed
Deploy to core / deploy (push) Failing after 46s
2024-02-17 09:35:11 +03:00
6ef2c47e11 id-optional-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m58s
2024-02-16 19:59:12 +03:00
0a74ed0f63 update-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m37s
2024-02-16 19:46:57 +03:00
7aaa9e8d8b sentry-enable
Some checks failed
Deploy to core / deploy (push) Failing after 46s
2024-02-16 12:44:19 +03:00
9a2d7b6f11 fmt
All checks were successful
Deploy to core / deploy (push) Successful in 1m53s
2024-02-16 12:40:41 +03:00
994469c2e3 cleaner-main
All checks were successful
Deploy to core / deploy (push) Successful in 1m51s
2024-02-16 12:34:39 +03:00
79ec5a1841 log-fix
All checks were successful
Deploy to core / deploy (push) Successful in 2m25s
2024-02-16 12:16:00 +03:00
233c71385f more-instance-check
All checks were successful
Deploy to core / deploy (push) Successful in 2m8s
2024-02-15 18:17:18 +03:00
e9ed01e797 postprocess-query-for-order-4
All checks were successful
Deploy to core / deploy (push) Successful in 2m29s
2024-02-14 12:07:55 +03:00
2e60fd2cc7 postprocess-query-for-order-3
All checks were successful
Deploy to core / deploy (push) Successful in 2m9s
2024-02-14 12:05:19 +03:00
9b174d94c6 postprocess-query-for-order-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m43s
2024-02-14 10:51:43 +03:00
3488282c14 postprocess-query-for-order
All checks were successful
Deploy to core / deploy (push) Successful in 2m27s
2024-02-14 10:47:54 +03:00
c732ec8136 reply-to-empty-fix
Some checks failed
Deploy to core / deploy (push) Failing after 50s
2024-02-07 19:50:01 +03:00
180dab1c06 filter-rating-only
All checks were successful
Deploy to core / deploy (push) Successful in 1m58s
2024-02-07 18:39:55 +03:00
85931d04ba delete-reaction-fix
Some checks failed
Deploy to core / deploy (push) Failing after 2m11s
2024-02-07 16:41:17 +03:00
7746d1992f fmt
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-02-05 12:47:26 +03:00
77dddedae6 no-notify-on-entity-create
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-02-05 10:08:11 +03:00
23468e4b3e debug-3
All checks were successful
Deploy to core / deploy (push) Successful in 1m38s
2024-02-03 20:13:51 +03:00
e7a1697f11 get-my-followings-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m34s
2024-02-03 20:08:22 +03:00
e4846f8abb readme-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m36s
2024-02-03 18:40:25 +03:00
33193b2345 update_profile-
All checks were successful
Deploy to core / deploy (push) Successful in 1m37s
2024-02-03 17:44:28 +03:00
2008345e69 common-result-type
All checks were successful
Deploy to core / deploy (push) Successful in 1m38s
2024-02-03 17:35:57 +03:00
d3b2eddf58 return-type-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m30s
2024-02-03 17:31:00 +03:00
18521f3fc5 schema-fix
Some checks failed
Deploy to core / deploy (push) Failing after 1m33s
2024-02-03 17:18:20 +03:00
1b4315fcce bye-following-manageer
All checks were successful
Deploy to core / deploy (push) Successful in 1m43s
2024-02-03 17:00:48 +03:00
53ceac108f full-preload
Some checks failed
Deploy to core / deploy (push) Failing after 1m32s
2024-02-03 16:17:00 +03:00
066770febc logs
All checks were successful
Deploy to core / deploy (push) Successful in 1m37s
2024-02-03 12:51:52 +03:00
83390912e9 following-manager-upgrade
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-02-03 12:48:36 +03:00
7f04eba208 comment-filter-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-02-03 12:10:38 +03:00
dea03ffa4c load-reactions-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-02-03 01:39:57 +03:00
d6151c00c8 update-fix-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m49s
2024-02-02 23:59:42 +03:00
b0e981ece4 update-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m34s
2024-02-02 23:49:12 +03:00
7cd7447796 revied
All checks were successful
Deploy to core / deploy (push) Successful in 1m37s
2024-02-02 23:38:42 +03:00
8cc7e21338 revised
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-02-02 23:38:16 +03:00
6d3bd13218 check-twice
All checks were successful
Deploy to core / deploy (push) Successful in 1m38s
2024-02-02 23:16:04 +03:00
516945ddec publish-fix
Some checks failed
Deploy to core / deploy (push) Failing after 5s
2024-02-02 21:04:21 +03:00
410d426ea5 indexing-serializer-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-02-02 20:54:17 +03:00
1be8eeb810 typo-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-02-02 19:57:34 +03:00
61528e5269 visibility-no-need
All checks were successful
Deploy to core / deploy (push) Successful in 1m36s
2024-02-02 19:36:30 +03:00
e3ee65f79a unfeature-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m37s
2024-02-02 19:29:26 +03:00
fa2b0eeffa name
All checks were successful
Deploy to core / deploy (push) Successful in 29s
2024-02-02 16:00:57 +03:00
d1f4b05e8d name
Some checks failed
Deploy to core / deploy (push) Failing after 6s
2024-02-02 15:59:56 +03:00
7a3830653e fmt
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-02-02 15:59:22 +03:00
08b69e5d0a packaging-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-02-02 15:16:53 +03:00
c00361b2ec featured-id-patch
All checks were successful
Deploy to core / deploy (push) Successful in 1m40s
2024-02-02 15:05:20 +03:00
bd5f910f8c delete-shout-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-01-31 22:47:30 +03:00
fbbe6b0751 following-set-fix-4
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-01-31 18:23:00 +03:00
a6d604f233 following-set-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m38s
2024-01-31 18:18:36 +03:00
5a810fa126 following-fix-3
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-01-31 17:48:36 +03:00
77907c73e0 following-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m43s
2024-01-31 17:45:02 +03:00
ff30960608 following-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-01-31 17:11:53 +03:00
1fb37f8aa0 create-reaction-fix-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-01-31 03:09:58 +03:00
75cff9dbed create-reaction-fox
All checks were successful
Deploy to core / deploy (push) Successful in 1m40s
2024-01-31 02:46:52 +03:00
880e295b45 unique-reactions
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-01-31 01:53:54 +03:00
fceb3b61c7 logs-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m44s
2024-01-30 14:00:53 +03:00
e28f03d7db author-shouts-counter-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m49s
2024-01-30 11:58:17 +03:00
e4d7284681 reacted-stat-restore
All checks were successful
Deploy to core / deploy (push) Successful in 1m38s
2024-01-29 15:20:28 +03:00
325927739e info-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-01-29 13:02:14 +03:00
774a5ee596 reorg-code
Some checks failed
Deploy to core / deploy (push) Failing after 1m32s
2024-01-29 11:09:10 +03:00
b975e174ca lesslog
Some checks failed
Deploy to core / deploy (push) Failing after 1m32s
2024-01-29 11:04:50 +03:00
98b379c8e1 lock-more-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-01-29 11:01:04 +03:00
133067d09a await-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m40s
2024-01-29 10:48:36 +03:00
e6f12e9106 lesslog
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-01-29 10:47:31 +03:00
e6366d15f6 debug-search-results
All checks were successful
Deploy to core / deploy (push) Successful in 1m45s
2024-01-29 10:37:21 +03:00
ae9e025959 cache-success-only
All checks were successful
Deploy to core / deploy (push) Successful in 1m46s
2024-01-29 09:45:00 +03:00
2f2fa346ed bloatcodeless
All checks were successful
Deploy to core / deploy (push) Successful in 1m38s
2024-01-29 07:04:37 +03:00
b9d602eedf not-error-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m34s
2024-01-29 06:56:34 +03:00
9f9ea93526 release-lock-fix
Some checks failed
Deploy to core / deploy (push) Failing after 6s
2024-01-29 06:52:51 +03:00
520b43ee48 bypass-fix
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-01-29 06:51:26 +03:00
d595a18de4 logs-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m34s
2024-01-29 06:48:11 +03:00
f164fd66d4 index-restruct-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m43s
2024-01-29 06:45:07 +03:00
5002e85177 index-restruct
All checks were successful
Deploy to core / deploy (push) Successful in 1m40s
2024-01-29 06:42:02 +03:00
56bf5b2874 simpler-disabled
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-01-29 06:18:36 +03:00
8a88a98b53 ignore-unavial-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m36s
2024-01-29 06:09:40 +03:00
4b9382c47d stability-2
Some checks failed
Deploy to core / deploy (push) Failing after 1m33s
2024-01-29 06:03:37 +03:00
cf23d343d1 stability-fail
All checks were successful
Deploy to core / deploy (push) Successful in 1m35s
2024-01-29 05:56:28 +03:00
9e18697cac disabling
All checks were successful
Deploy to core / deploy (push) Successful in 1m35s
2024-01-29 05:37:10 +03:00
b574673f00 search-indicies
All checks were successful
Deploy to core / deploy (push) Successful in 1m35s
2024-01-29 05:26:49 +03:00
62018534fd index-name-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m34s
2024-01-29 05:20:24 +03:00
f86d2f0cd6 readme-fix 2024-01-29 05:17:47 +03:00
4a6863c474 check-if-exists
All checks were successful
Deploy to core / deploy (push) Successful in 1m33s
2024-01-29 05:13:37 +03:00
f38ee9239f tolerate-error
All checks were successful
Deploy to core / deploy (push) Successful in 1m35s
2024-01-29 05:07:30 +03:00
ff3ccc6174 opensearch-client-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m33s
2024-01-29 05:03:20 +03:00
69eb41fc8d opensearch-client
Some checks failed
Deploy to core / deploy (push) Failing after 1m30s
2024-01-29 05:00:54 +03:00
6c398fc593 disabled-logix
Some checks failed
Deploy to core / deploy (push) Failing after 1m30s
2024-01-29 04:47:53 +03:00
258bb4e779 logs-fox
Some checks failed
Deploy to core / deploy (push) Failing after 5s
2024-01-29 04:43:02 +03:00
e1a27b55cd inner-search-3
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-01-29 04:41:46 +03:00
2663d1cbc5 allow-selfsigned
All checks were successful
Deploy to core / deploy (push) Successful in 1m38s
2024-01-29 04:21:28 +03:00
8ff1949170 inner-search-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m36s
2024-01-29 04:09:54 +03:00
2c2932caeb inner-search
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-01-29 03:27:30 +03:00
35f7a35f27 scored-subquery-fix-3
All checks were successful
Deploy to core / deploy (push) Successful in 1m38s
2024-01-29 01:57:34 +03:00
1066b85e1b scored-subquery-fix-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m43s
2024-01-29 01:25:47 +03:00
982d424e1b merged
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-01-29 00:43:13 +03:00
f749ac7999 scored-subquery-fix 2024-01-29 00:42:03 +03:00
Stepan Vladovskii
84078c7cfe feat: no force any more for CI deploy from Gitea
All checks were successful
Deploy to core / deploy (push) Successful in 1m43s
2024-01-28 18:37:47 -03:00
86f2c51f5a virtual-score-column-fix-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-01-29 00:31:48 +03:00
18fc08f6c8 virtual-score-column-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m44s
2024-01-29 00:28:04 +03:00
b92431e802 search-simpler-query-fix-6
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-01-28 23:57:34 +03:00
01b9091310 search-simpler-query-fix-5
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-01-28 23:52:58 +03:00
77114c66ec search-simpler-query-fix-4
All checks were successful
Deploy to core / deploy (push) Successful in 1m39s
2024-01-28 23:46:01 +03:00
30a281a693 search-simpler-query-fix-2-3
All checks were successful
Deploy to core / deploy (push) Successful in 1m47s
2024-01-28 23:42:35 +03:00
c061e5cdb3 search-simpler-query-fix-2
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-01-28 23:27:40 +03:00
5e4ef40b21 search-simpler-query-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m45s
2024-01-28 23:21:02 +03:00
00a672f96e slug-string-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-01-28 18:56:06 +03:00
263ceac5a3 found-keys-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m42s
2024-01-28 18:51:12 +03:00
c90b0bd994 ga-metric-fieldname-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m44s
2024-01-28 18:33:04 +03:00
ef9fbe7c88 trig-ga
All checks were successful
Deploy to core / deploy (push) Successful in 1m41s
2024-01-28 18:17:34 +03:00
4bd7e7d0a1 creds-fix
All checks were successful
Deploy to core / deploy (push) Successful in 1m59s
2024-01-28 16:38:17 +03:00
7f203bf900 deps-fix
Some checks failed
Deploy to core / deploy (push) Failing after 2m13s
2024-01-28 16:33:45 +03:00
ebdfdb2613 ga4-data-api-usage
Some checks failed
Deploy to core / deploy (push) Failing after 2m16s
2024-01-28 16:26:40 +03:00
bba87bbf1d daterange-fix
All checks were successful
Deploy to core / deploy (push) Successful in 2m44s
2024-01-28 15:54:38 +03:00
bd004f6fce no-view-id
All checks were successful
Deploy to core / deploy (push) Successful in 3m21s
2024-01-28 15:40:44 +03:00
753a77ae72 daterange-format-fix
Some checks are pending
Deploy to core / deploy (push) Waiting to run
2024-01-28 14:28:03 +03:00
37b6776bdb viewed-service-fix
Some checks are pending
Deploy to core / deploy (push) Waiting to run
2024-01-28 14:20:22 +03:00
38645d063a logs-fix 2024-01-28 12:03:41 +03:00
08845152d1 Merge branch 'feature/core' of v2.discours.io:core into feature/core 2024-01-28 11:42:40 +03:00
dd2ef55f04 Merge branch 'feature/core' of v2.discours.io:core into feature/core
Some checks failed
Deploy to core / deploy (push) Has been cancelled
2024-01-28 11:40:00 +03:00
a98284522b Merge branch 'feature/core' of v2.discours.io:core into feature/core 2024-01-28 10:03:51 +03:00
8a0da7381b Merge branch 'feature/core' of https://dev.discours.io/discours.io/core into feature/core 2024-01-28 10:01:28 +03:00
Stepan Vladovskii
bed2f89964 debug: main.py with import sentry-sdk
All checks were successful
Deploy to core / deploy (push) Successful in 1m38s
2024-01-27 22:11:39 -03:00
Stepan Vladovskii
0eef9b3061 debug: main.py with import sentry-sdk
Some checks failed
Deploy to core / deploy (push) Failing after 1m29s
2024-01-27 22:02:22 -03:00
Stepan Vladovskii
d7a3c840ea feat: gitea runner push branch feature/core to v2.discours.io/core feauter/core
Some checks failed
Deploy to core / deploy (push) Failing after 1m32s
2024-01-27 21:55:12 -03:00
Stepan Vladovskii
2c9155cd54 feat: gitea runner branch feature/core to v2.discours.io/core
All checks were successful
Deploy to core / deploy (push) Successful in 6s
2024-01-27 21:50:13 -03:00
Stepan Vladovskii
405337da27 feat: add Sentry Reddis perfomance monitoring 2024-01-27 21:45:24 -03:00
to
f73c2094d9 Update README.md 2024-01-27 08:48:03 +00:00
to
7235d2acc4 Update README.md 2024-01-27 08:45:29 +00:00
to
db33c625db Update README.md 2024-01-27 08:36:03 +00:00
7e4aa83b8e joined-search-fix 2024-01-26 18:28:02 +03:00
6116254d9f search-fix-3 2024-01-26 18:19:10 +03:00
90f164521b search-fix-2 2024-01-26 18:09:25 +03:00
24da021a62 search-fix-2 2024-01-26 17:58:01 +03:00
e7e9089b7c query-fix 2024-01-26 13:28:49 +03:00
59dec8cad6 query-fix 2024-01-26 04:24:47 +03:00
1b80d596cb search-fix-2 2024-01-26 04:05:25 +03:00
3f703ad357 add-granian 2024-01-25 22:58:35 +03:00
e2f2976572 portfix 2024-01-25 22:55:00 +03:00
f3acf878aa Merge branch 'feature/core' of https://dev.discours.io/discours.io/core into feature/core 2024-01-25 22:47:40 +03:00
4a5f1d634a granian+precommit 2024-01-25 22:41:27 +03:00
ad3fd32a6e precommit-3 2024-01-25 11:05:28 +03:00
623e532533 precommit-installed 2024-01-25 11:04:00 +03:00
9aea7b02fb precommit 2024-01-25 11:02:31 +03:00
Stepan Vladovskii
1db943acc0 debug: deploy in branch main of core dokku app 2024-01-24 22:47:36 -03:00
Stepan Vladovskii
ebbbcc97f2 feat: yess, it was deploy on staging
All checks were successful
Deploy to core / deploy (push) Successful in 5s
2024-01-24 21:07:05 -03:00
Stepan Vladovskii
e8d85d9914 debug: simplify main.yml for actions
All checks were successful
Deploy to core / deploy (push) Successful in 1m56s
2024-01-24 20:43:10 -03:00
Stepan Vladovskii
2d73a5b874 debug: simplify main.yml for actions 2024-01-24 20:42:51 -03:00
Stepan Vladovskii
1883f0d733 debug: simplify main.yml for actions
Some checks failed
Deploy to core / deploy (push) Failing after 5s
2024-01-24 20:39:26 -03:00
Stepan Vladovskii
3332088b21 debug: actions without yml strange contex 2024-01-24 19:14:41 -03:00
Stepan Vladovskii
284f91b851 feat: change workflow for use branch feature/core in app core 2024-01-24 19:04:36 -03:00
ccbbc04051 .. 2024-01-24 18:19:26 +03:00
7fe026cb41 Merge branch 'feature/core' of https://dev.discours.io/discours.io/core into feature/core
All checks were successful
deploy / deploy (push) Successful in 1m55s
2024-01-24 15:36:46 +03:00
8c33955d5c redis-service-fix 2024-01-24 15:36:34 +03:00
Stepan Vladovskii
ac31a96a89 feat: migrate CI to v2 strange update of yml
All checks were successful
deploy / deploy (push) Successful in 1m48s
2024-01-23 23:31:49 -03:00
Stepan Vladovskii
0923070111 feat: migrate CI to v2
Some checks failed
deploy / deploy (push) Failing after 5s
2024-01-23 22:42:38 -03:00
06699a000a delete-reaction-schema-fix 2024-01-23 22:52:40 +03:00
f5f5cea184 load-shouts-feed 2024-01-23 22:20:43 +03:00
92dd45d278 auth-uncache 2024-01-23 22:12:22 +03:00
86e142292f cache-fix 2024-01-23 21:59:46 +03:00
c41fe8b6c9 cached-auth 2024-01-23 21:34:51 +03:00
987eb8c078 visibility-fix 2024-01-23 19:51:26 +03:00
3a6c805bcf rating-fix 2024-01-23 18:07:37 +03:00
e2e85376f0 no-return-reaction-fix 2024-01-23 17:14:43 +03:00
3f65652a5f 0.2.21-ga 2024-01-23 16:04:38 +03:00
954e6dabb7 no-rating-stat 2024-01-23 11:50:58 +03:00
d6dc374b01 community-stats-fix 2024-01-23 05:03:23 +03:00
ce5077a529 reacted_shouts_updates-fix 2024-01-23 04:58:45 +03:00
43f0c517b3 load-random-top-fix 2024-01-23 04:34:48 +03:00
e0395b0ab6 unread-fixes 2024-01-23 04:03:15 +03:00
6f5b5c364a self-regulation-logix-fix 2024-01-23 03:12:59 +03:00
8f846b6f7a refactored 2024-01-23 03:06:48 +03:00
c6088c5705 notifier-call-fix 2024-01-23 02:47:23 +03:00
f4e8f29fdd following-fix-2 2024-01-23 02:41:37 +03:00
5548d6d1f7 following-fix 2024-01-23 02:37:18 +03:00
6c5ce12b7e wrap-order-fix 2024-01-23 02:28:54 +03:00
bb2edd13e9 follow-debug 2024-01-23 02:23:31 +03:00
adbcec2511 reaction-kind-fix 2024-01-23 02:09:42 +03:00
0a38ae8e7e rating-fix 2024-01-23 02:08:59 +03:00
438baeb1a2 reaction-api-upgrade 2024-01-23 01:57:25 +03:00
4cb70d951a rating-sum-fix 2024-01-23 01:51:38 +03:00
9782cf402e create-reaction-fix-7 2024-01-23 01:21:01 +03:00
257ff43eaa create-reaction-debug-6 2024-01-23 01:11:34 +03:00
31f2414064 create-reaction-debug-4 2024-01-23 00:51:50 +03:00
3e6354afed craete-reaction-fix 2024-01-23 00:36:52 +03:00
8eb36f0cc3 create-reaction-debug-2 2024-01-23 00:27:57 +03:00
6be7ada9a1 create-reaction-revision 2024-01-22 23:54:02 +03:00
ad45cd4b10 minor-fixes
Some checks failed
deploy / deploy (push) Failing after 4s
2024-01-22 22:21:41 +03:00
0ebea28cce schema-upgrade
Some checks failed
deploy / deploy (push) Failing after 5s
2024-01-22 21:38:38 +03:00
a3688ba29a viewed-by-author-by-topic-feat
Some checks failed
deploy / deploy (push) Failing after 5s
2024-01-22 21:20:17 +03:00
f67ef7dd05 create-shout-fix 2024-01-22 19:57:48 +03:00
ff6637a51e precounted-views-import 2024-01-22 19:17:39 +03:00
f08a00e3c2 imports-fix 2024-01-22 18:48:58 +03:00
cdb54dbbe0 schema-path-fix 2024-01-22 18:45:35 +03:00
9bd458c47c add
Some checks failed
deploy / deploy (push) Failing after 6s
2024-01-22 18:42:45 +03:00
7b5330625b get-my-followed-fix-2 2024-01-18 15:30:53 +03:00
4320c9674c get-my-followed-fix
Some checks failed
deploy / deploy (push) Failing after 4s
2024-01-18 15:12:40 +03:00
9812b308b3 load_shouts_random_top-fix
Some checks failed
deploy / deploy (push) Failing after 4s
2024-01-18 14:45:47 +03:00
a43eaee8e0 ackee-load-fix
Some checks failed
deploy / deploy (push) Failing after 8s
2024-01-13 15:57:35 +03:00
033a8b6534 viewed-service-fix
Some checks failed
deploy / deploy (push) Failing after 4s
2024-01-13 15:44:56 +03:00
8f690af6ef from-topic-follower-fix
Some checks failed
deploy / deploy (push) Failing after 5s
2024-01-13 11:49:12 +03:00
8050a7e828 reactions-by-fix+reacted-shouts-fix
Some checks failed
deploy / deploy (push) Failing after 4s
2024-01-13 11:15:45 +03:00
d561deeb73 v2-deploy
Some checks failed
deploy / deploy (push) Failing after 5s
2024-01-13 11:03:35 +03:00
9c804bc873 get-my-followed-fix
Some checks failed
deploy / deploy (push) Has been cancelled
2024-01-13 11:01:59 +03:00
28f1f1cc57 reactions-sort-groupby-fix
All checks were successful
deploy / deploy (push) Successful in 1m44s
2024-01-13 10:27:45 +03:00
3a0683137d reactions-order-fix 2024-01-13 09:59:56 +03:00
10be35c78c dokku-conf 2024-01-11 20:23:02 +03:00
bd31c0afc5 no-presence-sigil 2024-01-11 20:02:39 +03:00
d9e1fb5161 no-gateway-sigil 2024-01-11 19:52:10 +03:00
3175fbd4a4 start-fix
All checks were successful
deploy / deploy (push) Successful in 1m36s
2024-01-10 16:36:42 +03:00
1b2b060b23 0.2.19-fixes
Some checks failed
deploy / deploy (push) Failing after 1m35s
2024-01-10 16:29:49 +03:00
14dc1c761a fix-get-author-i
All checks were successful
deploy / deploy (push) Successful in 1m31s
2023-12-29 02:31:44 +03:00
aa9ffd3053 ratings-update
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-28 01:37:54 +03:00
0ba38ac700 author-fix-3
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-28 01:09:38 +03:00
9968fb27f4 author-fix
All checks were successful
deploy / deploy (push) Successful in 1m29s
2023-12-28 01:05:52 +03:00
6207f7d3ed author-rating-fix
All checks were successful
deploy / deploy (push) Successful in 1m33s
2023-12-28 00:30:18 +03:00
da3e7e55fd logs-gic
All checks were successful
deploy / deploy (push) Successful in 1m32s
2023-12-25 10:48:50 +03:00
48b8209e23 search-query-fix-7
All checks were successful
deploy / deploy (push) Successful in 1m28s
2023-12-25 06:16:40 +03:00
c4c7ce0ad4 search-query-fix-7
All checks were successful
deploy / deploy (push) Successful in 1m31s
2023-12-25 05:04:53 +03:00
5492887a10 search-query-fix-6
All checks were successful
deploy / deploy (push) Successful in 1m27s
2023-12-25 05:01:49 +03:00
ec70549e48 search-query-fix-5
All checks were successful
deploy / deploy (push) Successful in 1m28s
2023-12-25 04:56:30 +03:00
c76e1625f3 search-query-fix-4
All checks were successful
deploy / deploy (push) Successful in 1m29s
2023-12-25 04:52:40 +03:00
d528da9b4a search-query-fix-3
All checks were successful
deploy / deploy (push) Successful in 1m27s
2023-12-25 04:45:21 +03:00
f4f1b3bb45 search-query-fix
Some checks failed
deploy / deploy (push) Failing after 22s
2023-12-25 04:35:21 +03:00
15fbc56d78 search-results-fix
Some checks failed
deploy / deploy (push) Failing after 1m24s
2023-12-25 04:27:02 +03:00
a4b0fd1a46 add-role-feature
Some checks failed
deploy / deploy (push) Failing after 1m23s
2023-12-25 01:42:39 +03:00
2547bd111b logs-fix
All checks were successful
deploy / deploy (push) Successful in 1m28s
2023-12-25 01:13:17 +03:00
935a12945d case-fix
All checks were successful
deploy / deploy (push) Successful in 1m26s
2023-12-25 01:08:31 +03:00
0ea9f45854 load-random-topic-fix
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-25 01:06:27 +03:00
c236768c07 trig
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-25 00:02:54 +03:00
88d33f96b0 commented-fix-2
All checks were successful
deploy / deploy (push) Successful in 1m28s
2023-12-24 21:38:16 +03:00
f9abe421aa commented-fix
All checks were successful
deploy / deploy (push) Successful in 1m29s
2023-12-24 20:46:50 +03:00
8c67438d01 commented-outerjoin-fix
Some checks failed
deploy / deploy (push) Failing after 5s
2023-12-24 18:34:06 +03:00
392712c604 sqlalchemy-debug
Some checks failed
deploy / deploy (push) Has been cancelled
2023-12-24 17:25:57 +03:00
8856bfc978 resolvers-fix
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-23 22:00:22 +03:00
bf2c5b67e3 cache-fix
Some checks failed
deploy / deploy (push) Failing after 9s
2023-12-23 08:40:41 +03:00
8e28e3d86d model-fix
All checks were successful
deploy / deploy (push) Successful in 1m22s
2023-12-22 21:25:21 +03:00
4fb581de2d random-topic-fix-2
All checks were successful
deploy / deploy (push) Successful in 1m23s
2023-12-22 21:22:23 +03:00
d9d2e5e954 random-topic-fix
All checks were successful
deploy / deploy (push) Successful in 1m25s
2023-12-22 21:15:26 +03:00
d65687afb3 unrated-fi
All checks were successful
deploy / deploy (push) Successful in 1m35s
2023-12-22 21:12:42 +03:00
d3ea567797 postmerge
All checks were successful
deploy / deploy (push) Successful in 1m26s
2023-12-22 21:08:37 +03:00
4e769332b7 viewed-fix
All checks were successful
deploy / deploy (push) Successful in 1m22s
2023-12-22 12:09:24 +03:00
b502c581f7 search-result-schema-fix-5
All checks were successful
deploy / deploy (push) Successful in 1m29s
2023-12-19 15:42:46 +03:00
56cdd4e0f9 search-result-schema-fix-4
All checks were successful
deploy / deploy (push) Successful in 1m19s
2023-12-19 15:32:34 +03:00
d14f0c2f95 search-result-schema-fix-3
All checks were successful
deploy / deploy (push) Successful in 1m21s
2023-12-19 15:28:55 +03:00
5aa8258f16 search-result-schema-fix
All checks were successful
deploy / deploy (push) Successful in 1m21s
2023-12-19 15:18:58 +03:00
71000aad35 search-debug
All checks were successful
deploy / deploy (push) Successful in 1m22s
2023-12-19 15:03:27 +03:00
f52db8f9e5 get-authors-all
All checks were successful
deploy / deploy (push) Successful in 2m3s
2023-12-19 11:09:50 +03:00
8e8952dd46 last-seen-mark-remove
All checks were successful
deploy / deploy (push) Successful in 1m21s
2023-12-18 18:37:39 +03:00
8830908307 auth-connector-less
All checks were successful
deploy / deploy (push) Successful in 1m20s
2023-12-18 10:12:17 +03:00
64b571fccd schema-fix
All checks were successful
deploy / deploy (push) Successful in 1m33s
2023-12-18 03:55:12 +03:00
a2ab5e8473 update-last-seen-author
All checks were successful
deploy / deploy (push) Successful in 1m24s
2023-12-18 01:20:00 +03:00
a6c5243c06 viewed-service-fixes
All checks were successful
deploy / deploy (push) Successful in 1m23s
2023-12-17 23:30:20 +03:00
2c6b872acb following-fix-5
All checks were successful
deploy / deploy (push) Successful in 1m25s
2023-12-17 15:30:28 +03:00
5bac172cce less-logs-auth
All checks were successful
deploy / deploy (push) Successful in 1m25s
2023-12-17 15:27:26 +03:00
49fe665d4d following-fix-4
All checks were successful
deploy / deploy (push) Successful in 1m26s
2023-12-17 15:22:07 +03:00
5cccaf43f7 following-fix-3
All checks were successful
deploy / deploy (push) Successful in 1m25s
2023-12-17 15:15:08 +03:00
ea5b9e5b09 following-fix
All checks were successful
deploy / deploy (push) Successful in 1m25s
2023-12-17 15:07:53 +03:00
a79f3cd5ec community-author-fix-2
All checks were successful
deploy / deploy (push) Successful in 1m26s
2023-12-17 09:23:15 +03:00
af4c1efd1c less-logs
All checks were successful
deploy / deploy (push) Successful in 1m26s
2023-12-17 09:20:33 +03:00
312900cec1 community-author-fix
All checks were successful
deploy / deploy (push) Successful in 1m41s
2023-12-17 09:17:23 +03:00
edf20466d6 formatting
All checks were successful
deploy / deploy (push) Successful in 1m28s
2023-12-17 08:40:05 +03:00
509f4409ff upgraded-resolvers-fix
All checks were successful
deploy / deploy (push) Successful in 1m29s
2023-12-17 08:28:34 +03:00
bb0a218eb7 new-resolvers
All checks were successful
deploy / deploy (push) Successful in 1m25s
2023-12-17 08:16:08 +03:00
81173f989a version-upgrade-0.2.18
All checks were successful
deploy / deploy (push) Successful in 1m25s
2023-12-17 08:08:35 +03:00
4697b44504 import-fix
All checks were successful
deploy / deploy (push) Successful in 1m55s
2023-12-17 07:59:16 +03:00
cd0ba88462 comminity-author-link-name-fix
Some checks failed
deploy / deploy (push) Failing after 1m22s
2023-12-16 20:03:00 +03:00
d0ce4dd3d3 webhook-name-fix
Some checks failed
deploy / deploy (push) Failing after 1m24s
2023-12-16 19:59:43 +03:00
692dd9cfe0 resolvers-updates
Some checks failed
deploy / deploy (push) Failing after 1m30s
2023-12-16 18:24:30 +03:00
bf7bc03e50 webhook-fix-3
All checks were successful
deploy / deploy (push) Successful in 1m32s
2023-12-15 19:27:23 +03:00
642c4eeb9d debug-webhook
All checks were successful
deploy / deploy (push) Successful in 1m32s
2023-12-15 18:46:53 +03:00
7e16ee97fa webhook-debug
All checks were successful
deploy / deploy (push) Successful in 1m28s
2023-12-15 18:28:44 +03:00
a8ee8cde0b author-hook-fix-2
All checks were successful
deploy / deploy (push) Successful in 1m27s
2023-12-15 17:37:32 +03:00
f9afe3d9dd author-hook-fix
All checks were successful
deploy / deploy (push) Successful in 1m29s
2023-12-15 17:25:21 +03:00
1ca23cc159 author-debug
All checks were successful
deploy / deploy (push) Successful in 1m35s
2023-12-15 16:59:03 +03:00
50016c0ba7 auth-debug
All checks were successful
deploy / deploy (push) Successful in 1m33s
2023-12-15 16:55:12 +03:00
db7aee730f debug-get-author
All checks were successful
deploy / deploy (push) Successful in 1m41s
2023-12-15 16:48:47 +03:00
68978fa1c0 json-fix
All checks were successful
deploy / deploy (push) Successful in 32s
2023-12-14 03:06:35 +03:00
ab9be5ef14 encode-try
All checks were successful
deploy / deploy (push) Successful in 1m42s
2023-12-14 00:57:32 +03:00
2f13943781 fix-operation
All checks were successful
deploy / deploy (push) Successful in 1m40s
2023-12-14 00:53:37 +03:00
afb65d396b operation-name-fix
All checks were successful
deploy / deploy (push) Successful in 1m34s
2023-12-14 00:47:02 +03:00
b36a655090 logs-fix
All checks were successful
deploy / deploy (push) Successful in 1m36s
2023-12-14 00:17:20 +03:00
8fb2764bc1 debug-gql
All checks were successful
deploy / deploy (push) Successful in 1m32s
2023-12-14 00:10:34 +03:00
2518e0357b dep-fix-2
All checks were successful
deploy / deploy (push) Successful in 1m33s
2023-12-13 23:54:38 +03:00
2fb48d76b6 dep-fix
All checks were successful
deploy / deploy (push) Successful in 1m39s
2023-12-13 23:48:42 +03:00
510402032d auth-connector-fix-3
Some checks failed
deploy / deploy (push) Failing after 1m28s
2023-12-13 23:42:52 +03:00
f51d7539eb auth-connector-fix-2
Some checks failed
deploy / deploy (push) Has been cancelled
2023-12-13 23:42:19 +03:00
99349dcad6 auth-connector-fix
Some checks failed
deploy / deploy (push) Failing after 1m31s
2023-12-13 23:39:25 +03:00
c97bd9c784 debug-get-author-2
All checks were successful
deploy / deploy (push) Successful in 1m34s
2023-12-13 22:59:21 +03:00
c68900babf debug-response-3
All checks were successful
deploy / deploy (push) Successful in 1m34s
2023-12-13 21:33:23 +03:00
d1447d3c05 debug-response
All checks were successful
deploy / deploy (push) Successful in 1m35s
2023-12-13 20:49:26 +03:00
fa0e815f13 gql-fix
All checks were successful
deploy / deploy (push) Successful in 1m36s
2023-12-13 20:42:00 +03:00
a86739ed1b debug-response
All checks were successful
deploy / deploy (push) Successful in 1m36s
2023-12-13 20:13:57 +03:00
29c02158b7 debug-authors-2
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-13 16:32:02 +03:00
4bd5109034 debug-authors
All checks were successful
deploy / deploy (push) Successful in 1m33s
2023-12-13 16:27:51 +03:00
359cfb1b75 validate-jwt
All checks were successful
deploy / deploy (push) Successful in 1m35s
2023-12-13 16:20:06 +03:00
a72dd5675e authorizer-full-vars-fix-2
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-12 11:25:34 +03:00
d27a6897cc authorizer-full-vars-fix
All checks were successful
deploy / deploy (push) Successful in 1m31s
2023-12-12 11:19:22 +03:00
74ca120879 authorizer-connector-debug
All checks were successful
deploy / deploy (push) Successful in 1m32s
2023-12-12 10:30:32 +03:00
954c3740cd authorizer-connector-fix-7
All checks were successful
deploy / deploy (push) Successful in 1m42s
2023-12-12 08:00:46 +03:00
3b7b47599c authorizer-connector-fix-7
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-11 23:06:51 +03:00
2f3ceae8c2 authorizer-connector-fix-6
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-11 22:56:59 +03:00
27612186de authorizer-connector-fix-5
All checks were successful
deploy / deploy (push) Successful in 1m34s
2023-12-11 22:50:13 +03:00
54acfe2b89 authorizer-connector-fix-4
All checks were successful
deploy / deploy (push) Successful in 1m32s
2023-12-11 22:39:58 +03:00
ccfeb89e66 authorizer-connector-fix-3
All checks were successful
deploy / deploy (push) Successful in 1m31s
2023-12-11 22:36:46 +03:00
7937fb89d4 authorizer-connector-fix-2
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-11 22:12:18 +03:00
7d0268ec52 authorizer-connector-fix
Some checks failed
deploy / deploy (push) Has been cancelled
2023-12-11 22:10:45 +03:00
2184fcf1f9 reaction-order-fix
All checks were successful
deploy / deploy (push) Successful in 1m48s
2023-12-11 17:57:34 +03:00
159c151ae7 reactions-sort-order-fix
Some checks failed
deploy / deploy (push) Failing after 1m22s
2023-12-10 01:47:22 +03:00
de63f313a5 paginated-authors
All checks were successful
deploy / deploy (push) Successful in 1m29s
2023-12-09 22:02:04 +03:00
275a1f9a08 pop-fix-3
All checks were successful
deploy / deploy (push) Successful in 1m28s
2023-12-09 21:21:38 +03:00
1f6f722eef pop-fix-2
All checks were successful
deploy / deploy (push) Successful in 1m30s
2023-12-09 21:15:30 +03:00
b992a73698 pop-fix
All checks were successful
deploy / deploy (push) Successful in 1m55s
2023-12-09 21:03:53 +03:00
d37f68869c alchemy-fix
All checks were successful
deploy / deploy (push) Successful in 1m28s
2023-12-09 20:15:57 +03:00
0b69b0b856 import-fix
All checks were successful
deploy / deploy (push) Successful in 1m28s
2023-12-09 20:12:04 +03:00
3acedcc7d6 main_topic-fix
All checks were successful
deploy / deploy (push) Successful in 1m28s
2023-12-09 19:45:02 +03:00
724e9bd5a0 add-main_topic
All checks were successful
deploy / deploy (push) Successful in 1m32s
2023-12-09 19:22:47 +03:00
c1adaf3ed6 schema-fix-author-fix
All checks were successful
deploy / deploy (push) Successful in 1m42s
2023-12-07 21:29:25 +03:00
bb55cfaefe get-shout-fix 2023-12-03 01:42:16 +03:00
b93d91528b search-resolver-fix 2023-12-03 01:22:16 +03:00
4f857e1425 revert-fix 2023-12-03 01:14:36 +03:00
748e3c6db3 join-cond-2 2023-12-03 00:39:06 +03:00
e2271e38e1 join-cond 2023-12-03 00:36:22 +03:00
a6df648af1 stat-fix-8 2023-12-03 00:29:57 +03:00
a3294de4dc stat-fix-7 2023-12-02 23:46:01 +03:00
89c453fedc stat-fix-6 2023-12-02 23:44:36 +03:00
ebe034a527 stat-fix-5 2023-12-02 23:38:28 +03:00
2e3e79f51e stat-fix-4 2023-12-02 23:35:06 +03:00
fcdaabd10d stat-fix-3 2023-12-02 23:30:06 +03:00
807f6ba5b1 stat-fix-2 2023-12-02 23:23:16 +03:00
16bbe995b7 stat-fix 2023-12-02 23:17:26 +03:00
6c607732a8 all-authors-fix-2 2023-12-02 22:45:41 +03:00
1cdf286594 all-authors-fix 2023-12-02 22:33:00 +03:00
fc3745f07e groupby-fix-2 2023-12-02 22:17:09 +03:00
a8b8637057 groupby-fix 2023-12-02 22:13:47 +03:00
34940178ad resolvers-fix 2023-12-02 09:25:08 +03:00
5fe27f9c0c .. 2023-12-01 13:00:10 +03:00
c049f882f3 joinedload-fix-5
All checks were successful
deploy / deploy (push) Successful in 5s
2023-11-30 20:08:56 +03:00
dbab772e62 joinedload-fix-2 2023-11-30 19:41:53 +03:00
e82ca2e385 joinedload-fix 2023-11-30 19:37:53 +03:00
f1ccef7919 no-debug 2023-11-30 16:07:30 +03:00
5f0a8f3b10 replyto-fix 2023-11-30 15:12:12 +03:00
95507ffa48 topicstat-fix 2023-11-30 15:07:08 +03:00
ecf0727631 joined-createdby-fix 2023-11-30 14:04:55 +03:00
e2f2dff755 topics-sql-debug 2023-11-30 13:30:50 +03:00
919aaa951f string-enum-fix 2023-11-30 11:40:27 +03:00
1362eaa125 createdby-fix 2023-11-30 11:27:06 +03:00
685988c219 createdby 2023-11-30 11:04:03 +03:00
2d3f7a51b4 enum-fix 2023-11-30 10:38:41 +03:00
537d588853 stats-fix 2023-11-30 00:21:22 +03:00
f57719d182 author-stats 2023-11-29 23:53:26 +03:00
ece918ac2c plus-fix 2023-11-29 23:22:39 +03:00
a0ee3a1be9 less-classes 2023-11-29 21:11:05 +03:00
dc80255fc7 schema-fix 2023-11-29 15:14:21 +03:00
28853c3a4b published-filter 2023-11-29 15:11:05 +03:00
4a1d7280fc schema-fix 2023-11-29 15:01:51 +03:00
ecaa4ffbc5 param-fox 2023-11-29 14:28:08 +03:00
3454766063 reaction-fix 2023-11-29 14:24:59 +03:00
cd955ecf8a createdat-fix 2023-11-29 14:16:09 +03:00
a950f57efc groupby-createdby 2023-11-29 13:56:26 +03:00
cdb9d31fa4 query-fix 2023-11-29 13:50:20 +03:00
6bac6b737e isnot-fix 2023-11-29 13:44:40 +03:00
af761f916f reactions-filters-fix
Some checks failed
deploy / deploy (push) Has been cancelled
2023-11-29 12:59:00 +03:00
f930822d8a filters-fix-2
Some checks are pending
deploy / deploy (push) Waiting to run
2023-11-29 12:33:33 +03:00
64e8c8afd7 filters-fix 2023-11-29 12:29:09 +03:00
44b7a3da98 visibility-fix-2
Some checks are pending
deploy / deploy (push) Waiting to run
2023-11-29 12:19:01 +03:00
0920af7e77 visibility-filter-fix
Some checks are pending
deploy / deploy (push) Waiting to run
2023-11-29 12:16:37 +03:00
fe4e37663e pyrafixes
Some checks are pending
deploy / deploy (push) Waiting to run
2023-11-29 11:00:00 +03:00
63eb952655 aiohttp-try 2023-11-29 10:23:41 +03:00
36ab83d02f shoutauthor-fix
Some checks failed
deploy / deploy (push) Has been cancelled
2023-11-29 09:32:24 +03:00
cefc77e8e4 sentry-add
Some checks are pending
deploy / deploy (push) Waiting to run
2023-11-29 09:14:23 +03:00
4b77cea690 sentry-integrations 2023-11-29 07:48:31 +03:00
4ca9491824 routes-fix
Some checks failed
deploy / deploy (push) Has been cancelled
2023-11-29 00:19:33 +03:00
6cd2fc0f80 typed-endpoint
Some checks are pending
deploy / deploy (push) Waiting to run
2023-11-29 00:13:46 +03:00
aaf4c0b876 trig-ci
Some checks failed
deploy / deploy (push) Has been cancelled
2023-11-28 23:13:42 +03:00
269c0e449f webhook-fix
All checks were successful
deploy / deploy (push) Successful in 9s
2023-11-28 22:13:53 +03:00
0c2af2bdf4 new-author-webhook-endpoint
All checks were successful
deploy / deploy (push) Successful in 2m14s
2023-11-28 22:07:53 +03:00
a241a098b9 create-invite-fix
All checks were successful
deploy / deploy (push) Successful in 2m20s
2023-11-28 15:56:32 +03:00
01d7dadd78 load-shouts-filters
All checks were successful
deploy / deploy (push) Successful in 2m17s
2023-11-28 14:17:21 +03:00
168a7079f6 schema-fix
All checks were successful
deploy / deploy (push) Successful in 2m16s
2023-11-28 13:59:36 +03:00
a21efb99df author-invitee-fix
All checks were successful
deploy / deploy (push) Successful in 2m8s
2023-11-28 13:55:05 +03:00
0240005ed1 invite-feature
All checks were successful
deploy / deploy (push) Successful in 2m10s
2023-11-28 13:46:06 +03:00
13ba5ebaed shout-followers
All checks were successful
deploy / deploy (push) Successful in 2m20s
2023-11-28 12:11:45 +03:00
20f7c22441 0.2.16-resolvers-revision
All checks were successful
deploy / deploy (push) Successful in 2m22s
2023-11-28 10:53:48 +03:00
3cf86d9e6e isnot-fix
All checks were successful
deploy / deploy (push) Successful in 2m11s
2023-11-28 08:56:57 +03:00
14ae7fbcc9 resolvers-fix
All checks were successful
deploy / deploy (push) Successful in 2m14s
2023-11-27 21:18:52 +03:00
5f8ec549df emptybody-fix
All checks were successful
deploy / deploy (push) Successful in 2m12s
2023-11-27 21:03:59 +03:00
3b0aedf959 loadshouts-fix
All checks were successful
deploy / deploy (push) Successful in 2m10s
2023-11-27 20:35:26 +03:00
53a0f2e328 iffix
All checks were successful
deploy / deploy (push) Successful in 2m20s
2023-11-27 19:15:34 +03:00
caa2dbfdf3 reaction-model-fix
All checks were successful
deploy / deploy (push) Successful in 2m19s
2023-11-27 19:03:47 +03:00
909ddbd79d pyright-fix
All checks were successful
deploy / deploy (push) Successful in 2m16s
2023-11-27 11:12:42 +03:00
fe60d625e5 notest 2023-11-24 05:21:31 +03:00
4e7250acef logs-fix
All checks were successful
deploy / deploy (push) Successful in 2m5s
2023-11-24 04:53:30 +03:00
167eed436d my-subs-fix
All checks were successful
deploy / deploy (push) Successful in 2m6s
2023-11-24 04:13:55 +03:00
7257f52aeb query-schema-fix
All checks were successful
deploy / deploy (push) Successful in 2m2s
2023-11-24 02:10:13 +03:00
a63cf24812 0.2.15
Some checks failed
deploy / deploy (push) Failing after 1m58s
2023-11-24 02:00:28 +03:00
c150d28447 schema-fix 2023-11-23 23:30:00 +03:00
7d5dc8b8cd nochecks
All checks were successful
deploy / deploy (push) Successful in 2m11s
2023-11-23 01:19:50 +03:00
3ab5d53439 curl-fix
Some checks failed
deploy / deploy (push) Failing after 22s
2023-11-23 00:19:15 +03:00
4b85b602c2 community-fix-2
Some checks failed
deploy / deploy (push) Failing after 1m50s
2023-11-23 00:12:23 +03:00
bdae67804e community-fix
Some checks failed
deploy / deploy (push) Failing after 1m56s
2023-11-23 00:05:04 +03:00
af5746c5d8 imports-fix
Some checks failed
deploy / deploy (push) Failing after 2m1s
2023-11-22 21:23:15 +03:00
3379376016 binary-back-dburl-fix
Some checks failed
deploy / deploy (push) Failing after 1m53s
2023-11-22 21:06:45 +03:00
998340baf8 psycopg2-ix
Some checks failed
deploy / deploy (push) Failing after 1m36s
2023-11-22 21:04:51 +03:00
9ee850ddb7 import-fix
Some checks failed
deploy / deploy (push) Failing after 25s
2023-11-22 20:56:25 +03:00
db76ba3733 0.2.14
Some checks failed
deploy / deploy (push) Failing after 2m1s
2023-11-22 19:38:39 +03:00
e2082b48d3 orm-fix
Some checks failed
deploy / deploy (push) Failing after 1m46s
2023-11-04 12:43:08 +03:00
435d1e4505 new-version-0-2-13
Some checks failed
deploy / deploy (push) Failing after 1m54s
2023-11-03 13:10:22 +03:00
1f5e5472c9 refactoring
Some checks failed
deploy / deploy (push) Failing after 1m56s
2023-10-25 21:33:53 +03:00
20e1fa989a shout-community-fix
Some checks failed
deploy / deploy (push) Failing after 2m18s
2023-10-25 20:25:53 +03:00
04dedaa3a3 updates-fixes
Some checks failed
deploy / deploy (push) Failing after 2m0s
2023-10-25 20:02:01 +03:00
46e684b28d core-update
Some checks failed
deploy / deploy (push) Failing after 2m0s
2023-10-25 19:55:30 +03:00
e151034bab fix-imports
Some checks failed
deploy / deploy (push) Failing after 1m43s
2023-10-23 17:51:13 +03:00
bf241a8fbd merged-isolated-core
Some checks failed
deploy / deploy (push) Failing after 1m46s
2023-10-23 17:47:11 +03:00
b675188013 upd
All checks were successful
deploy / deploy (push) Successful in 1m33s
2023-10-19 17:42:42 +03:00
fa7a04077a feat: /connect/
All checks were successful
deploy / deploy (push) Successful in 28s
2023-10-18 07:33:36 -03:00
24be18abf1 feat: /connect/=
All checks were successful
deploy / deploy (push) Successful in 27s
2023-10-18 07:30:22 -03:00
83b5c2c139 feat: presence-8080
Some checks failed
deploy / deploy (push) Failing after 5s
2023-10-18 07:27:06 -03:00
9e84d6ea37 feat: presence-8080
Some checks are pending
deploy / deploy (push) Waiting to run
2023-10-18 07:25:34 -03:00
4da963f9c8 Noop commit to sync with server changes
All checks were successful
deploy / deploy (push) Successful in 29s
2023-10-18 07:18:34 -03:00
c1d6a2d4e3 feat: add wildcat to /connect for handle Token
Some checks failed
deploy / deploy (push) Failing after 5s
2023-10-18 07:13:10 -03:00
518bc4020b feat: add wildcat to /connect for handle Token
Some checks are pending
deploy / deploy (push) Waiting to run
2023-10-18 07:11:08 -03:00
e13cdd7298 feat: add wildcat to /connect for handle Token
All checks were successful
deploy / deploy (push) Successful in 28s
2023-10-18 06:57:40 -03:00
4fec0ca7fb fix-follow-author-notification
All checks were successful
deploy / deploy (push) Successful in 30s
2023-10-16 22:24:10 +03:00
b13d532da2 postmerge
All checks were successful
deploy / deploy (push) Successful in 27s
2023-10-16 20:45:01 +03:00
b03ac825b6 unread-fix3
All checks were successful
deploy / deploy (push) Successful in 27s
2023-10-16 19:13:39 +03:00
49423ffb93 unread-fix-2
All checks were successful
deploy / deploy (push) Successful in 29s
2023-10-16 19:00:18 +03:00
faa97d27c2 unread-fix
All checks were successful
deploy / deploy (push) Successful in 27s
2023-10-16 18:56:03 +03:00
6e0cb18909 cleanup-notifications
All checks were successful
deploy / deploy (push) Successful in 29s
2023-10-16 18:30:54 +03:00
066bf72547 cleanup-orm
Some checks failed
deploy / deploy (push) Failing after 23s
2023-10-16 18:28:43 +03:00
bc08ece4c3 user User for awhile, filter follower fields
Some checks failed
deploy / deploy (push) Failing after 23s
2023-10-16 18:25:15 +03:00
562a919fca post-merge
Some checks failed
deploy / deploy (push) Failing after 22s
2023-10-16 18:21:05 +03:00
51ad266b62 Merge branch 'main' of https://github.com/Discours/discours-backend into feature/refactoring-services 2023-10-16 18:19:06 +03:00
15ef976538 using presence service 2023-10-16 18:18:29 +03:00
823b3c56c1 presence service interface fix 2023-10-16 17:51:08 +03:00
34e6a03a89 following manager does not manage chats 2023-10-16 17:50:40 +03:00
0c75902a64 fix-unread 2023-10-16 17:50:05 +03:00
582a21408e feat:test
All checks were successful
deploy / deploy (push) Successful in 1m40s
2023-10-16 09:14:37 -03:00
9a7852e17c feat: add to CI/CD piplin
All checks were successful
deploy / deploy (push) Successful in 29s
2023-10-15 15:33:11 -03:00
cbd4c41d32 feat: add to CI/CD piplin
Some checks failed
deploy / deploy (push) Failing after 22s
2023-10-15 15:32:06 -03:00
fd304768b7 feat: add to CI/CD piplin
Some checks failed
deploy / deploy (push) Failing after 5s
2023-10-15 15:27:44 -03:00
fe078809d6 feat: add to CI/CD piplin 2023-10-15 15:26:48 -03:00
36d36defd8 debug: sigil / after proxy connect 2023-10-15 15:26:48 -03:00
6047a3b259 unread-counter-fix-2 2023-10-13 15:20:06 +03:00
f86da630e8 redis-debug-fix 2023-10-13 15:17:44 +03:00
7348e5d9fe unread-counter-fix2 2023-10-13 15:13:01 +03:00
f5da6d450b unread-counter-fix 2023-10-13 15:10:56 +03:00
882ff39f28 redis-debug 2023-10-13 15:01:35 +03:00
7cd5929df2 token-type-tolerance 2023-10-13 14:47:31 +03:00
e9f68c8fb1 token-type-tolerance 2023-10-13 14:45:24 +03:00
792d60453a new-query-fix2 2023-10-13 14:35:10 +03:00
e648091a3c new-query-fix 2023-10-13 14:32:55 +03:00
1b7aa6aa0a some-more-queries-fix-3 2023-10-13 14:07:13 +03:00
d881f9da27 some-more-queries-fix-2 2023-10-13 14:02:44 +03:00
3f1aff2d0f some-more-queries-fix 2023-10-13 14:00:30 +03:00
d4dbf5c0ae some-more-queries 2023-10-13 13:59:24 +03:00
fed154c7f1 fix-redis 2023-10-13 13:48:17 +03:00
c1abace1c0 few-more-resolvers-fix-2 2023-10-13 13:46:34 +03:00
31824cccc9 few-more-resolvers-fix 2023-10-13 13:45:27 +03:00
85a9077792 few-more-resolvers 2023-10-13 13:41:47 +03:00
bbd8f61408 redis update 2023-10-13 13:13:45 +03:00
82618bf7f3 merged 2023-10-11 23:00:15 +03:00
9720b9f26b Merge branch 'feature/refactoring-services' of https://dev.discours.io/discours.io/backend into feature/refactoring-services 2023-10-11 22:59:13 +03:00
2c15852e9b fix-str 2023-10-11 22:59:05 +03:00
e39450d33b fix: sigil proxy for /connect 2023-10-11 10:49:52 -03:00
df2f097e11 fix: sigil proxy for /connect 2023-10-11 10:42:18 -03:00
a14c70e8c7 unmerge 2023-10-11 15:56:28 +03:00
9c651a6d72 debug 2023-10-11 15:41:04 +03:00
09d77bb1d1 merge-fix-7 2023-10-11 13:07:49 +03:00
eca3de7579 merge-fix-6 2023-10-11 13:02:17 +03:00
2fafe8b618 merged-fix-5 2023-10-11 12:26:08 +03:00
62020bd668 merged-fix-4 2023-10-11 12:23:09 +03:00
f1bdd7a0f8 merged-fix-3 2023-10-11 12:20:58 +03:00
6e63be30e0 merged-fix-2 2023-10-11 12:00:36 +03:00
d89235e82a merged-fix 2023-10-11 11:57:58 +03:00
6252671b85 merged 2023-10-11 11:56:46 +03:00
0e8b39bed6 Merge branch 'main' of dev.discours.io:discoursio-api into feature/refactoring-services 2023-10-11 10:28:04 +03:00
d50a510d52 fix-profile 2023-10-11 08:36:40 +03:00
e1245d1f46 feat: sigil with logs and reguest methods 2023-10-10 09:13:25 -03:00
fbeaac5cad feat: sigil with logs and reguest methods 2023-10-10 09:13:25 -03:00
d6913d6ff5 feat: sigil with logs and reguest methods 2023-10-10 07:52:43 -03:00
93b86eab86 feat: sigil with logs and reguest methods 2023-10-10 07:48:33 -03:00
0eed70c102 port=8080 2023-10-10 01:09:15 +03:00
14fa314e2a fix-load 2023-10-10 00:34:51 +03:00
ad97aa2227 fix-slug-raise-error 2023-10-10 00:29:22 +03:00
57aa4caa84 started-log 2023-10-10 00:22:16 +03:00
0bd44d1fab new-sigi 2023-10-09 23:47:18 +03:00
177a47ba7c _Service-redeploy3 2023-10-06 12:57:08 +03:00
32b00d5065 merged 2023-10-06 12:51:48 +03:00
01be3ac95e schema-sdl-serv 2023-10-06 12:51:07 +03:00
d1366d0b88 feat: @read about keys 2023-10-06 06:14:24 -03:00
fada9a289a feat: right schema in schema.py 2023-10-06 06:05:01 -03:00
6d56e8b3a7 feat: right schema in schema.py 2023-10-06 06:02:11 -03:00
c5ea08f939 feat: add to SDL full Query Mutation schema 2023-10-06 05:47:41 -03:00
d9f47183c8 feat: add in schema.py resolver fro _server 2023-10-06 03:51:23 -03:00
ilya-bkv
6ddfc11a91 getAuthor add stat 2023-10-06 03:39:09 -03:00
2697ec4fcd _Service-redeploy2 2023-10-06 06:39:01 +03:00
e244549a1d _Service 2023-10-06 06:29:52 +03:00
150449a0cf port=80 2023-10-06 05:33:51 +03:00
aa5709c695 fix-reqs2 2023-10-06 03:56:27 +03:00
8a3aa1dae6 fix-reqs 2023-10-06 03:55:43 +03:00
12f65bd8fa fix-poetry-deps 2023-10-06 03:33:48 +03:00
bab6990c87 fix-poetryenv 2023-10-06 03:31:45 +03:00
34f9139742 fix-dockerfile 2023-10-06 03:24:40 +03:00
b64d9d5014 poetry-rty 2023-10-06 03:22:37 +03:00
12416c1b83 path-fix 2023-10-06 02:03:36 +03:00
b2e196d261 forked-ariadne 2023-10-06 02:01:18 +03:00
0e8e8f4d04 git+ssh 2023-10-06 01:49:34 +03:00
8de2eb385b async-fix 2023-10-06 01:45:32 +03:00
12c43dbf32 fix-sync 2023-10-06 01:22:58 +03:00
d34597e349 debug-stat 2023-10-06 01:15:23 +03:00
78a3354d5f logs-fix 2023-10-06 01:12:34 +03:00
720d8a4a68 no-sigil-here2 2023-10-06 01:02:51 +03:00
ffa3fbb252 no-sigil-here 2023-10-06 01:02:14 +03:00
400fff4ef0 schema-no-subs2 2023-10-06 00:42:34 +03:00
4f0377c57d schema-no-subs 2023-10-06 00:34:08 +03:00
7761ccf2d5 schema-path-fix 2023-10-06 00:22:54 +03:00
4de1e64ba2 schema-fix 2023-10-06 00:20:02 +03:00
bbc5dc441d requests-transport 2023-10-06 00:17:24 +03:00
120208a621 rollback-requests 2023-10-06 00:10:46 +03:00
8524d0f843 edge 2023-10-06 00:05:15 +03:00
d26d444975 deps-workaround2 2023-10-06 00:02:25 +03:00
e0bd938a6e deps-workaround 2023-10-05 23:57:04 +03:00
aed91c6375 deps... 2023-10-05 23:55:23 +03:00
34f3098a0d import-fix6 2023-10-05 23:50:14 +03:00
c57f3857a6 import-fix4 2023-10-05 23:47:51 +03:00
c665c0056c import-fix4 2023-10-05 23:45:21 +03:00
d30b4c7d2b import-fix3 2023-10-05 23:42:48 +03:00
f468ccca93 import-fix2 2023-10-05 23:34:02 +03:00
d5b0aaba9b import-fix 2023-10-05 23:31:21 +03:00
da5bbc79b4 deps... 2023-10-05 23:25:52 +03:00
3c936e7860 deps... 2023-10-05 23:22:11 +03:00
46044a0f98 migration-removed 2023-10-05 23:18:55 +03:00
5fedd007c7 git-dep3 2023-10-05 23:18:06 +03:00
3d659caa6e git-dep2 2023-10-05 23:05:09 +03:00
9d2cd9f21f git-dep 2023-10-05 23:04:09 +03:00
f068869727 git-install 2023-10-05 23:01:25 +03:00
45d187786b fix-imports 2023-10-05 22:59:50 +03:00
f6e3320e18 async-mail 2023-10-05 22:47:02 +03:00
9537814718 deps-fixes 2023-10-05 22:38:35 +03:00
458823b894 dockerfile-fix 2023-10-05 22:19:20 +03:00
b8e6f7bb5a requests-removed+fixes 2023-10-05 22:18:05 +03:00
fbc85f6c2d aioredis-removed 2023-10-05 22:00:24 +03:00
deac939ed8 restructured,inbox-removed 2023-10-05 21:46:18 +03:00
157 changed files with 11755 additions and 11746 deletions

1
.cursorignore Normal file

@@ -0,0 +1 @@
# Add directories or file patterns to ignore during indexing (e.g. foo/ or *.csv)

View File

@@ -1,9 +1,5 @@
-name: 'Deploy to discoursio-api'
-on:
-  push:
-    branches:
-      - main
+name: 'Deploy on push'
+on: [push]
 jobs:
   deploy:
     runs-on: ubuntu-latest
@@ -21,11 +17,19 @@ jobs:
         id: branch_name
         run: echo "::set-output name=branch::$(echo ${GITHUB_REF##*/})"
-      - name: Push to dokku
+      - name: Push to dokku for main branch
+        if: github.ref == 'refs/heads/main'
         uses: dokku/github-action@master
         with:
           branch: 'main'
           git_remote_url: 'ssh://dokku@v2.discours.io:22/discoursio-api'
           ssh_private_key: ${{ secrets.SSH_PRIVATE_KEY }}
+      - name: Push to dokku for dev branch
+        if: github.ref == 'refs/heads/dev'
+        uses: dokku/github-action@master
+        with:
+          branch: 'dev'
+          force: true
+          git_remote_url: 'ssh://dokku@v2.discours.io:22/core'
+          ssh_private_key: ${{ secrets.SSH_PRIVATE_KEY }}

View File

@@ -1,16 +0,0 @@
-name: Checks
-on: [pull_request]
-jobs:
-  build:
-    runs-on: ubuntu-latest
-    name: Checks
-    steps:
-      - uses: actions/checkout@v2
-      - uses: actions/setup-python@v2
-        with:
-          python-version: 3.10.6
-      - run: pip install --upgrade pip
-      - run: pip install -r requirements.txt
-      - run: pip install -r requirements-dev.txt
-      - run: ./checks.sh

View File

@@ -17,11 +17,11 @@ jobs:
       - uses: webfactory/ssh-agent@v0.8.0
         with:
-          ssh-private-key: ${{ secrets.SSH_PRIVATE_KEY }}
+          ssh-private-key: ${{ github.action.secrets.SSH_PRIVATE_KEY }}
       - name: Push to dokku
         env:
-          HOST_KEY: ${{ secrets.HOST_KEY }}
+          HOST_KEY: ${{ github.action.secrets.HOST_KEY }}
         run: |
           echo $HOST_KEY > ~/.ssh/known_hosts
           git remote add dokku dokku@v2.discours.io:discoursio-api

23
.gitignore vendored

@@ -147,11 +147,20 @@ migration/content/**/*.md
*.csv
dev-server.pid
backups/
.ruff_cache
.venv
poetry.lock
.devcontainer/devcontainer.json
localhost-key.pem
.gitignore
discoursio.db
localhost.pem
.ruff_cache
.jj
.zed
dokku_config
*.db
*.sqlite3
views.json
*.pem
*.key
*.crt
*cache.json
.cursor
node_modules/

View File

@@ -1,44 +1,18 @@
exclude: |
(?x)(
^tests/unit_tests/resource|
_grpc.py|
_pb2.py
)
default_language_version:
python: python3.10
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.5.0
hooks:
- id: check-added-large-files
- id: check-case-conflict
- id: check-docstring-first
- id: check-json
- id: check-merge-conflict
- id: check-toml
- id: check-yaml
- id: check-toml
- id: end-of-file-fixer
- id: trailing-whitespace
- id: requirements-txt-fixer
- id: check-added-large-files
- id: detect-private-key
- id: check-ast
- id: check-merge-conflict
- repo: https://github.com/timothycrosley/isort
rev: 5.12.0
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.4.7
hooks:
- id: isort
- repo: https://github.com/ambv/black
rev: 23.10.1
hooks:
- id: black
- repo: https://github.com/PyCQA/flake8
rev: 6.1.0
hooks:
- id: flake8
# - repo: https://github.com/python/mypy
# rev: v1.6.1
# hooks:
# - id: mypy
- id: ruff
args: [--fix]

340
CHANGELOG.md Normal file

@@ -0,0 +1,340 @@
#### [0.4.20] - 2025-05-03
- Fixed a bug in the `CacheRevalidationManager` class: added initialization of the `_redis` attribute
- Improved Redis connection handling in the cache revalidation manager:
  - automatic connection recovery when the connection is lost
  - connection check before performing cache operations
  - additional logging to simplify troubleshooting
- Fixed the `unpublish_shout` resolver (see the sketch below):
  - correct construction of the synthetic `publication` field with `published_at: null`
  - returns a complete data dictionary instead of a model object
  - improved loading of related data (authors, topics) to build the response correctly
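A minimal sketch of the payload shape described above, assuming illustrative field names (only the synthetic `publication` field with `published_at: null` is confirmed by the entry):

```python
def unpublish_shout_payload(shout) -> dict:
    """Return a plain dict instead of the ORM object so the GraphQL
    layer receives fully serialized data (a sketch, not the repo code)."""
    return {
        "id": shout.id,
        "slug": shout.slug,
        # related data loaded eagerly and serialized up front
        "authors": [{"id": a.id, "name": a.name} for a in shout.authors],
        "topics": [{"id": t.id, "slug": t.slug} for t in shout.topics],
        # synthetic field: an unpublished shout carries a null timestamp
        "publication": {"id": shout.id, "published_at": None},
    }
```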
#### [0.4.19] - 2025-04-14
- dropped `Shout.description` and `Draft.description` to be UX-generated
- use redis to init views counters after migrator
#### [0.4.18] - 2025-04-10
- Fixed `Topic.stat.authors` and `Topic.stat.comments`
- Fixed unique constraint violation for empty slug values (see the sketch below):
  - Modified `update_draft` resolver to handle empty slug values
  - Modified `create_draft` resolver to prevent empty slug values
  - Added validation to prevent inserting or updating drafts with empty slug
  - Fixed database error "duplicate key value violates unique constraint draft_slug_key"
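The slug guard might look roughly like this; the helper name and the timestamp-based fallback are hypothetical:

```python
import time

def ensure_slug(draft_data: dict) -> dict:
    """Avoid inserting an empty slug, which would collide on the
    `draft_slug_key` unique constraint (sketch, names assumed)."""
    slug = (draft_data.get("slug") or "").strip()
    if not slug:
        # unique placeholder instead of an empty string
        slug = f"draft-{int(time.time() * 1000)}"
    draft_data["slug"] = slug
    return draft_data
```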
#### [0.4.17] - 2025-03-26
- Fixed `'Reaction' object is not subscriptable` error in hierarchical comments (see the sketch below):
  - Modified `get_reactions_with_stat()` to convert Reaction objects to dictionaries
  - Added default values for limit/offset parameters
  - Fixed `load_first_replies()` implementation with proper parameter passing
  - Added doctest with example usage
  - Limited child comments to 100 per parent for performance
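A rough sketch of the dictionary conversion, with assumed field names and a placeholder stat payload:

```python
def reaction_to_dict(reaction, stat: dict) -> dict:
    """Convert an ORM Reaction to a dict so downstream code can index
    into it (fixes `'Reaction' object is not subscriptable`)."""
    return {
        "id": reaction.id,
        "kind": reaction.kind,
        "body": reaction.body,
        "reply_to": reaction.reply_to,
        "created_at": reaction.created_at,
        "stat": stat,
    }

def get_reactions_with_stat(session, query, limit: int = 10, offset: int = 0):
    """Sketch: default limit/offset plus row-to-dict conversion."""
    rows = session.execute(query.limit(limit).offset(offset)).all()
    # each row is a 1-tuple holding the Reaction entity
    return [reaction_to_dict(row[0], {"comments_count": 0, "rating": 0}) for row in rows]
```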
#### [0.4.16] - 2025-03-22
- Added hierarchical comments pagination (example query below):
  - Created new GraphQL query `load_comments_branch` for efficient loading of hierarchical comments
  - Ability to load root comments with their first N replies
  - Added pagination for both root and child comments
  - Using existing `comments_count` field in `Stat` type to display number of replies
  - Added special `first_replies` field to store first replies to a comment
  - Optimized SQL queries for efficient loading of comment hierarchies
  - Implemented flexible comment sorting system (by time, rating)
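For illustration, a client-side query against `load_comments_branch` might look like this; argument and field names are assumed from the bullets above, not verified against the schema:

```python
# hypothetical GraphQL document exercising the new query
LOAD_COMMENTS_BRANCH = """
query LoadCommentsBranch($shout: Int!, $limit: Int, $offset: Int) {
  load_comments_branch(shout: $shout, limit: $limit, offset: $offset) {
    id
    body
    stat { comments_count rating }       # reply count per root comment
    first_replies { id body created_at } # first N replies preloaded
  }
}
"""
```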
#### [0.4.15] - 2025-03-22
- Upgraded caching system, described in `docs/caching.md`
- Module `cache/memorycache.py` removed
- Enhanced caching system with backward compatibility:
  - Unified cache key generation with support for existing naming patterns
  - Improved Redis operation function with better error handling
  - Updated precache module to use consistent Redis interface
  - Integrated revalidator with the invalidation system for better performance
  - Added comprehensive documentation for the caching system
  - Enhanced cached_query to support template-based cache keys
  - Standardized error handling across all cache operations
- Optimized cache invalidation system:
  - Added targeted invalidation for individual entities (authors, topics)
  - Improved revalidation manager with individual object processing
  - Implemented batched processing for high-volume invalidations
  - Reduced Redis operations by using precise key invalidation instead of prefix-based wipes
  - Added special handling for slug changes in topics
- Unified caching system for all models:
  - Implemented abstract functions `cache_data`, `get_cached_data` and `invalidate_cache_by_prefix`
  - Added `cached_query` function for unified approach to query caching (sketch below)
  - Updated resolvers `author.py` and `topic.py` to use the new caching API
  - Improved logging for cache operations to simplify debugging
  - Optimized Redis memory usage through key format unification
- Improved caching and sorting in Topic and Author modules:
  - Added support for dictionary sorting parameters in `by` for both modules
  - Optimized cache key generation for stable behavior with various parameters
  - Enhanced sorting logic with direction support and arbitrary fields
  - Added `by` parameter support in the API for getting topics by community
- Performance optimizations for author-related queries:
  - Added SQLAlchemy-managed indexes to `Author`, `AuthorFollower`, `AuthorRating` and `AuthorBookmark` models
  - Implemented persistent Redis caching for author queries without TTL (invalidated only on changes)
  - Optimized author retrieval with separate endpoints:
    - `get_authors_all` - returns all non-deleted authors without statistics
    - `load_authors_by` - optimized to use caching and efficient sorting and pagination
  - Improved SQL queries with optimized JOIN conditions and efficient filtering
  - Added pre-aggregation of statistics (shouts count, followers count) in single efficient queries
  - Implemented robust cache invalidation on author updates
  - Created necessary indexes for author lookups by user ID, slug, and timestamps
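A simplified sketch of what a template-keyed `cached_query` helper could look like; the signature and Redis client usage are assumptions, not the repo's actual API:

```python
import json

async def cached_query(redis, key_template: str, fetch_fn, ttl: int | None = None, **params):
    # template-based cache keys, e.g. "author:{author_id}:shouts"
    key = key_template.format(**params)
    cached = await redis.get(key)
    if cached is not None:
        return json.loads(cached)
    result = await fetch_fn(**params)
    if ttl:
        await redis.set(key, json.dumps(result), ex=ttl)
    else:
        # persistent entry: lives until explicitly invalidated on change
        await redis.set(key, json.dumps(result))
    return result
```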
#### [0.4.14] - 2025-03-21
- Significant performance improvements for topic queries:
  - Added database indexes to optimize JOIN operations
  - Implemented persistent Redis caching for topic queries (no TTL, invalidated only on changes)
  - Optimized topic retrieval with separate endpoints for different use cases:
    - `get_topics_all` - returns all topics without statistics for lightweight listing
    - `get_topics_by_community` - adds pagination and optimized filtering by community
  - Added SQLAlchemy-managed indexes directly in ORM models for automatic schema maintenance
  - Created `sync_indexes()` function for automatic index synchronization during app startup (sketch below)
  - Reduced database load by pre-aggregating statistics in optimized SQL queries
  - Added robust cache invalidation on topic create/update/delete operations
  - Improved query optimization with proper JOIN conditions and specific partial indexes
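A hypothetical sketch of `sync_indexes()`: compare the indexes declared on the ORM models with what the database reports, and create only the missing ones:

```python
from sqlalchemy import inspect

def sync_indexes(engine, models: list) -> None:
    inspector = inspect(engine)
    for model in models:
        table = model.__table__
        existing = {ix["name"] for ix in inspector.get_indexes(table.name)}
        for index in table.indexes:
            if index.name not in existing:
                index.create(bind=engine)  # CREATE INDEX only when absent
```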
#### [0.4.13] - 2025-03-20
- Fixed Topic objects serialization error in cache/memorycache.py
- Improved CustomJSONEncoder to support SQLAlchemy models with dict() method (see the sketch below)
- Enhanced error handling in cache_on_arguments decorator
- Modified `load_reactions_by` to include deleted reactions when `include_deleted=true` for proper comment tree building
- Fixed featured/unfeatured logic in reaction processing:
  - Dislike reactions now properly take precedence over likes
  - Featured status now requires more than 4 likes from users with featured articles
  - Removed unnecessary filters for deleted reactions since rating reactions are physically deleted
  - Author's featured status now based on having non-deleted articles with featured_at
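A minimal sketch of the encoder change; the `dict()` method name comes from the entry, the rest is the standard `json` hook:

```python
import json

class CustomJSONEncoder(json.JSONEncoder):
    def default(self, obj):
        # serialize SQLAlchemy models that expose a dict() method
        if hasattr(obj, "dict") and callable(obj.dict):
            return obj.dict()
        return super().default(obj)
```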
#### [0.4.12] - 2025-03-19
- `delete_reaction` detects comments and uses `deleted_at` update
- `check_to_unfeature` etc. update
- dogpile dep in `services/memorycache.py` optimized
#### [0.4.11] - 2025-02-12
- `create_draft` resolver requires draft_id fixed
- `create_draft` resolver defaults body and title fields to empty string
#### [0.4.9] - 2025-02-09
- `Shout.draft` field added
- `Draft` entity added
- `create_draft`, `update_draft`, `delete_draft` mutations and resolvers added
- `create_shout`, `update_shout`, `delete_shout` mutations removed from GraphQL API
- `load_drafts` resolver implemented
- `publish_` and `unpublish_` mutations and resolvers added
- `create_`, `update_`, `delete_` mutations and resolvers added for `Draft` entity
- tests with pytest for original auth, shouts, drafts
- `Dockerfile` and `pyproject.toml` removed for simplicity, in favor of `Procfile` and `requirements.txt`
#### [0.4.8] - 2025-02-03
- `Reaction.deleted_at` filter on `update_reaction` resolver added
- `triggers` module updated with `after_shout_handler`, `after_reaction_handler` for cache revalidation
- `after_shout_handler`, `after_reaction_handler` now also handle `deleted_at` field
- `get_cached_topic_followers` fixed
- `get_my_rates_comments` fixed
#### [0.4.7]
- `get_my_rates_shouts` resolver added (sketch below) with:
  - `shout_id` and `my_rate` fields in response
  - filters by `Reaction.deleted_at.is_(None)`
  - filters by `Reaction.kind.in_([ReactionKind.LIKE.value, ReactionKind.DISLIKE.value])`
  - filters by `Reaction.reply_to.is_(None)`
  - uses `local_session()` context manager
  - returns empty list on errors
- SQLAlchemy syntax updated:
  - `select()` statement fixed for newer versions
  - `Reaction` model direct selection instead of labeled columns
  - proper row access with `row[0].shout` and `row[0].kind`
- GraphQL resolver fixes:
  - added root parameter `_` to match schema
  - proper async/await handling with `@login_required`
  - error logging added via `logger.error()`
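Putting the listed filters together, the resolver core might read as follows; the import paths, `Reaction.created_by`, and the `local_session` location are assumptions:

```python
from sqlalchemy import select

from orm.reaction import Reaction, ReactionKind  # repo models, path assumed
from services.db import local_session  # context manager named in the entry

def my_rates_shouts(author_id: int, shout_ids: list[int]) -> list[dict]:
    try:
        with local_session() as session:
            q = (
                select(Reaction)  # direct model selection, no labeled columns
                .where(Reaction.created_by == author_id)
                .where(Reaction.shout.in_(shout_ids))
                .where(Reaction.deleted_at.is_(None))
                .where(Reaction.kind.in_([ReactionKind.LIKE.value, ReactionKind.DISLIKE.value]))
                .where(Reaction.reply_to.is_(None))
            )
            rows = session.execute(q).all()
            # each row is a 1-tuple: access via row[0].shout / row[0].kind
            return [{"shout_id": row[0].shout, "my_rate": row[0].kind} for row in rows]
    except Exception:
        return []  # the entry specifies an empty list on errors
```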
#### [0.4.6]
- login_accepted decorator added
- `docs` added
- optimized and unified `load_shouts_*` resolvers with `LoadShoutsOptions` (sketch below)
- `load_shouts_bookmarked` resolver fixed
- resolvers updates:
  - new resolvers group `feed`
  - `load_shouts_authored_by` resolver added
  - `load_shouts_with_topic` resolver added
  - `load_shouts_followed` removed
  - `load_shouts_random_topic` removed
  - `get_topics_random` removed
- model updates:
  - `ShoutsOrderBy` enum added
  - `Shout.main_topic` from `ShoutTopic.main` as `Topic` type output
  - `Shout.created_by` as `Author` type output
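A sketch of what the options object and order enum could look like; only the two type names come from the changelog, the members and fields are assumptions:

```python
from dataclasses import dataclass, field
from enum import Enum

class ShoutsOrderBy(Enum):
    CREATED_AT = "created_at"
    RATING = "rating"
    LAST_COMMENTED_AT = "last_commented_at"

@dataclass
class LoadShoutsOptions:
    filters: dict = field(default_factory=dict)
    order_by: ShoutsOrderBy = ShoutsOrderBy.CREATED_AT
    limit: int = 10
    offset: int = 0
```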
#### [0.4.5]
- `bookmark_shout` mutation resolver added
- `load_shouts_bookmarked` resolver added
- `get_communities_by_author` resolver added
- `get_communities_all` resolver fixed
- `Community` stats in orm
- `Community` CUDL resolvers added
- `Reaction` filter by `Reaction.kind`s
- `ReactionSort` enum added
- `CommunityFollowerRole` enum added
- `InviteStatus` enum added
- `Topic.parents` ids added
- `get_shout` resolver accepts slug or shout_id
#### [0.4.4]
- `followers_stat` removed for shout
- sqlite3 support added
- `rating_stat` and `comments_count` fixes
#### [0.4.3]
- cache reimplemented
- load shouts queries unified
- `followers_stat` removed from shout
#### [0.4.2]
- reactions load resolvers separated for ratings (no stats) and comments
- reactions stats improved
- `load_comment_ratings` separate resolver
#### [0.4.1]
- follow/unfollow logic updated and unified with cache
#### [0.4.0]
- chore: version migrator synced
- feat: precache_data on start
- fix: store id list for following cache data
- fix: shouts stat filter out deleted
#### [0.3.5]
- cache isolated to services
- topics followers and authors cached
- redis stores lists of ids
#### [0.3.4]
- `load_authors_by` from cache
#### [0.3.3]
- feat: sentry integration enabled with glitchtip
- fix: reindex on update shout
- packages upgrade, isort
- separated stats queries for author and topic
- fix: feed featured filter
- fts search removed
#### [0.3.2]
- redis cache for what author follows
- redis cache for followers
- graphql add query: get topic followers
#### [0.3.1]
- enabling sentry
- long query log report added
- editor fixes
- authors links cannot be updated by `update_shout` anymore
#### [0.3.0]
- `Shout.featured_at` timestamp of the frontpage featuring event
- added proposal-accepting logic
- schema modulized
- Shout.visibility removed
#### [0.2.22]
- added precommit hook
- fmt
- granian asgi
#### [0.2.21]
- fix: rating logic
- fix: `load_top_random_shouts`
- resolvers: `add_stat_*` refactored
- services: use google analytics
- services: minor fixes search
#### [0.2.20]
- services: ackee removed
- services: following manager fixed
- services: import views.json
#### [0.2.19]
- fix: adding `author` role
- fix: stripping `user_id` in auth connector
#### [0.2.18]
- schema: added `Shout.seo` string field
- resolvers: added `/new-author` webhook resolver
- resolvers: added reader.load_shouts_top_random
- resolvers: added reader.load_shouts_unrated
- resolvers: community follower id property name is `.author`
- resolvers: `get_authors_all` and `load_authors_by`
- services: auth connector upgraded
#### [0.2.17]
- schema: enum types workaround, `ReactionKind`, `InviteStatus`, `ShoutVisibility`
- schema: `Shout.created_by`, `Shout.updated_by`
- schema: `Shout.authors` can be empty
- resolvers: optimized `reacted_shouts_updates` query
#### [0.2.16]
- resolvers: collab inviting logic
- resolvers: queries and mutations revision and renaming
- resolvers: `delete_topic(slug)` implemented
- resolvers: added `get_shout_followers`
- resolvers: `load_shouts_by` filters implemented
- orm: invite entity
- schema: `Reaction.range` -> `Reaction.quote`
- filters: `time_ago` -> `after`
- httpx -> aiohttp
#### [0.2.15]
- schema: `Shout.created_by` removed
- schema: `Shout.mainTopic` removed
- services: cached elasticsearch connector
- services: auth is using `user_id` from authorizer
- resolvers: `notify_*` usage fixes
- resolvers: `getAuthor` now accepts slug, `user_id` or `author_id`
- resolvers: login_required usage fixes
#### [0.2.14]
- schema: some fixes from migrator
- schema: `.days` -> `.time_ago`
- schema: `excludeLayout` + `layout` in filters -> `layouts`
- services: db access simpler, no contextmanager
- services: removed Base.create() method
- services: rediscache updated
- resolvers: get_reacted_shouts_updates as followedReactions query
#### [0.2.13]
- services: db context manager
- services: `ViewedStorage` fixes
- services: views are not stored in core db anymore
- schema: snake case in model fields names
- schema: no DateTime scalar
- resolvers: `get_my_feed` comments filter reactions body.is_not('')
- resolvers: `get_my_feed` query fix
- resolvers: `LoadReactionsBy.days` -> `LoadReactionsBy.time_ago`
- resolvers: `LoadShoutsBy.days` -> `LoadShoutsBy.time_ago`
#### [0.2.12]
- `Author.userpic` -> `Author.pic`
- `CommunityFollower.role` is string now
- `Author.user` is string now
#### [0.2.11]
- redis interface updated
- `viewed` interface updated
- `presence` interface updated
- notify on create, update, delete for reaction and shout
- notify on follow / unfollow author
- use pyproject
- devmode fixed
#### [0.2.10]
- community resolvers connected
#### [0.2.9]
- starlette is back, aiohttp removed
- aioredis replaced with aredis
#### [0.2.8]
- refactored
#### [0.2.7]
- `loadFollowedReactions` now with `

5
CHECKS
View File

@ -1,5 +0,0 @@
WAIT=10
TIMEOUT=10
ATTEMPTS=3
/

View File

@ -1,11 +1,18 @@
FROM python:3.11-slim
FROM python:slim
RUN apt-get update && apt-get install -y \
postgresql-client \
curl \
&& rm -rf /var/lib/apt/lists/*
WORKDIR /app
EXPOSE 8080
ADD nginx.conf.sigil ./
COPY requirements.txt .
RUN apt update && apt install -y git gcc curl postgresql
RUN pip install -r requirements.txt
COPY . .
CMD python server.py
EXPOSE 8000
CMD ["python", "-m", "granian", "main:app", "--interface", "asgi", "--host", "0.0.0.0", "--port", "8000"]

123
README.md
View File

@ -1,47 +1,102 @@
# discoursio-api
# GraphQL API Backend
Backend service providing a GraphQL API for a content management system with reactions, ratings, and comments.
- sqlalchemy
- redis
- ariadne
- starlette
- uvicorn
## Core Features
on osx
```
brew install redis nginx postgres
brew services start redis
### Shouts (Posts)
- CRUD operations via GraphQL mutations
- Rich filtering and sorting options
- Support for multiple authors and topics
- Rating system with likes/dislikes
- Comments and nested replies
- Bookmarks and following
### Reactions System
- `ReactionKind` types: LIKE, DISLIKE, COMMENT
- Rating calculation for shouts and comments
- User-specific reaction tracking
- Reaction stats and aggregations
- Nested comments support
### Authors & Topics
- Author profiles with stats
- Topic categorization and hierarchy
- Following system for authors/topics
- Activity tracking and stats
- Community features
## Tech Stack
- **[Python](https://www.python.org/)** 3.12+
- **GraphQL** with [Ariadne](https://ariadnegraphql.org/)
- **[SQLAlchemy](https://docs.sqlalchemy.org/en/20/orm/)**
- **[PostgreSQL](https://www.postgresql.org/)/[SQLite](https://www.sqlite.org/)** support
- **[Starlette](https://www.starlette.io/)** for ASGI server
- **[Redis](https://redis.io/)** for caching
## Development
### Prepare environment:
```shell
python3.12 -m venv .venv
source .venv/bin/activate
```
on debian/ubuntu
```
apt install redis nginx
### Run server
First, certificates are required to run the server.
```shell
mkcert -install
mkcert localhost
```
# Local development
Then, run the server:
Install deps first
```
pip install -r requirements.txt
pip install -r requirements-dev.txt
pre-commit install
```shell
python server.py dev
```
Create database from backup
```
./restdb.sh
### Useful Commands
```shell
# Linting and import sorting
ruff check . --fix --select I
# Code formatting
ruff format . --line-length=120
# Run tests
pytest
# Type checking
mypy .
```
Start local server
### Code Style
We use:
- Ruff for linting and import sorting
- Line length: 120 characters
- Python type hints
- Docstrings for public methods
### GraphQL Development
Test queries in GraphQL Playground at `http://localhost:8000`:
```graphql
# Example query
query GetShout($slug: String) {
get_shout(slug: $slug) {
id
title
main_author {
name
}
}
}
```
python3 server.py dev
```
# How to do an authorized request
Put the header 'Authorization' with token from signIn query or registerUser mutation.
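For example (a sketch; the token value is a placeholder and the query is illustrative):
```shell
curl http://localhost:8000 \
  -H 'Content-Type: application/json' \
  -H 'Authorization: <token from signIn or registerUser>' \
  -d '{"query": "{ get_shout(slug: \"example\") { id title } }"}'
```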
# How to debug Ackee
Set ACKEE_TOKEN var

6
__init__.py Normal file
View File

@ -0,0 +1,6 @@
import os
import sys
# Get the path to the project root directory
root_path = os.path.abspath(os.path.dirname(__file__))
sys.path.append(root_path)

View File

@ -1,110 +0,0 @@
# A generic, single database configuration.
[alembic]
# path to migration scripts
script_location = alembic
# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .
# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =
# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; This defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:alembic/versions
# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os # Use os.pathsep. Default configuration used for new projects.
# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
sqlalchemy.url = %(DB_URL)
[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
qualname =
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

View File

@ -1,3 +0,0 @@
Generic single-database configuration.
https://alembic.sqlalchemy.org/en/latest/tutorial.html

View File

@ -3,7 +3,7 @@ from logging.config import fileConfig
from sqlalchemy import engine_from_config, pool
from alembic import context
from base.orm import Base
from services.db import Base
from settings import DB_URL
# this is the Alembic Config object, which provides

View File

@ -1,26 +0,0 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
${upgrades if upgrades else "pass"}
def downgrade() -> None:
${downgrades if downgrades else "pass"}

View File

@ -1,26 +0,0 @@
"""init alembic
Revision ID: fe943b098418
Revises:
Create Date: 2023-08-19 01:37:57.031933
"""
from typing import Sequence, Union
# import sqlalchemy as sa
# from alembic import op
# revision identifiers, used by Alembic.
revision: str = "fe943b098418"
down_revision: Union[str, None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
pass
def downgrade() -> None:
pass

15
app.json Normal file
View File

@ -0,0 +1,15 @@
{
"healthchecks": {
"web": [
{
"type": "startup",
"name": "web check",
"description": "Checking if the app responds to the GET /",
"path": "/",
"attempts": 3,
"warn": true,
"initialDelay": 1
}
]
}
}

View File

@ -7,26 +7,22 @@ from starlette.authentication import AuthenticationBackend
from starlette.requests import HTTPConnection
from auth.credentials import AuthCredentials, AuthUser
from auth.exceptions import OperationNotAllowed
from auth.tokenstorage import SessionToken
from base.exceptions import OperationNotAllowed
from base.orm import local_session
from orm.user import Role, User
from auth.usermodel import Role, User
from services.db import local_session
from settings import SESSION_TOKEN_HEADER
class JWTAuthenticate(AuthenticationBackend):
async def authenticate(
self, request: HTTPConnection
) -> Optional[Tuple[AuthCredentials, AuthUser]]:
async def authenticate(self, request: HTTPConnection) -> Optional[Tuple[AuthCredentials, AuthUser]]:
if SESSION_TOKEN_HEADER not in request.headers:
return AuthCredentials(scopes={}), AuthUser(user_id=None, username="")
token = request.headers.get(SESSION_TOKEN_HEADER)
if not token:
print("[auth.authenticate] no token in header %s" % SESSION_TOKEN_HEADER)
return AuthCredentials(scopes={}, error_message=str("no token")), AuthUser(
user_id=None, username=""
)
return AuthCredentials(scopes={}, error_message=str("no token")), AuthUser(user_id=None, username="")
if len(token.split(".")) > 1:
payload = await SessionToken.verify(token)
@ -52,20 +48,14 @@ class JWTAuthenticate(AuthenticationBackend):
except exc.NoResultFound:
pass
return AuthCredentials(scopes={}, error_message=str("Invalid token")), AuthUser(
user_id=None, username=""
)
return AuthCredentials(scopes={}, error_message=str("Invalid token")), AuthUser(user_id=None, username="")
def login_required(func):
@wraps(func)
async def wrap(parent, info: GraphQLResolveInfo, *args, **kwargs):
# debug only
# print('[auth.authenticate] login required for %r with info %r' % (func, info))
auth: AuthCredentials = info.context["request"].auth
# print(auth)
if not auth or not auth.logged_in:
# raise Unauthorized(auth.error_message or "Please login")
return {"error": "Please login first"}
return await func(parent, info, *args, **kwargs)
@ -75,9 +65,7 @@ def login_required(func):
def permission_required(resource, operation, func):
@wraps(func)
async def wrap(parent, info: GraphQLResolveInfo, *args, **kwargs):
print(
"[auth.authenticate] permission_required for %r with info %r" % (func, info)
) # debug only
print("[auth.authenticate] permission_required for %r with info %r" % (func, info)) # debug only
auth: AuthCredentials = info.context["request"].auth
if not auth.logged_in:
raise OperationNotAllowed(auth.error_message or "Please login")
@ -87,3 +75,22 @@ def permission_required(resource, operation, func):
return await func(parent, info, *args, **kwargs)
return wrap
def login_accepted(func):
@wraps(func)
async def wrap(parent, info: GraphQLResolveInfo, *args, **kwargs):
auth: AuthCredentials = info.context["request"].auth
# If authorized, add the author's data to the context
if auth and auth.logged_in:
info.context["author"] = auth.author
info.context["user_id"] = auth.author.get("id")
else:
# Clear author data from the context when authorization is absent
info.context["author"] = None
info.context["user_id"] = None
return await func(parent, info, *args, **kwargs)
return wrap

View File

@ -1,15 +1,15 @@
from binascii import hexlify
from hashlib import sha256
from jwt import DecodeError, ExpiredSignatureError
from passlib.hash import bcrypt
from auth.exceptions import ExpiredToken, InvalidToken
from auth.jwtcodec import JWTCodec
from auth.tokenstorage import TokenStorage
from orm.user import User
# from base.exceptions import InvalidPassword, InvalidToken
from base.orm import local_session
from orm import User
from services.db import local_session
class Password:
@ -33,8 +33,8 @@ class Password:
Verify that password hash is equal to specified hash. Hash format:
$2a$10$Ro0CUfOqk6cXEKf3dyaM7OhSCvnwM9s4wIX9JeLapehKK5YdLxKcm
__ __ ____________________________________________________ # noqa: W605
| | | Salt (22) | Hash
\__/\/ \____________________/\_____________________________/ # noqa: W605
| | Salt Hash
| Cost
Version
@ -80,10 +80,10 @@ class Identity:
if not await TokenStorage.exist(f"{payload.user_id}-{payload.username}-{token}"):
# raise InvalidToken("Login token has expired, please login again")
return {"error": "Token has expired"}
except ExpiredSignatureError:
except ExpiredToken:
# raise InvalidToken("Login token has expired, please try again")
return {"error": "Token has expired"}
except DecodeError:
except InvalidToken:
# raise InvalidToken("token format error") from e
return {"error": "Token format error"}
with local_session() as session:

View File

@ -1,15 +1,23 @@
from datetime import datetime, timezone
import jwt
from pydantic import BaseModel
from base.exceptions import ExpiredToken, InvalidToken
from auth.exceptions import ExpiredToken, InvalidToken
from settings import JWT_ALGORITHM, JWT_SECRET_KEY
from validations.auth import AuthInput, TokenPayload
class TokenPayload(BaseModel):
user_id: str
username: str
exp: datetime
iat: datetime
iss: str
class JWTCodec:
@staticmethod
def encode(user: AuthInput, exp: datetime) -> str:
def encode(user, exp: datetime) -> str:
payload = {
"user_id": user.id,
"username": user.email or user.phone,
@ -23,7 +31,7 @@ class JWTCodec:
print("[auth.jwtcodec] JWT encode error %r" % e)
@staticmethod
def decode(token: str, verify_exp: bool = True) -> TokenPayload:
def decode(token: str, verify_exp: bool = True):
r = None
payload = None
try:

View File

@ -5,18 +5,17 @@ from datetime import datetime, timezone
from urllib.parse import quote_plus
from graphql.type import GraphQLResolveInfo
from transliterate import translit
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from auth.email import send_auth_email
from auth.exceptions import InvalidPassword, InvalidToken, ObjectNotExist, Unauthorized
from auth.identity import Identity, Password
from auth.jwtcodec import JWTCodec
from auth.tokenstorage import TokenStorage
from base.exceptions import InvalidPassword, InvalidToken, ObjectNotExist, Unauthorized
from base.orm import local_session
from base.resolvers import mutation, query
from orm import Role, User
from services.db import local_session
from services.schema import mutation, query
from settings import SESSION_TOKEN_HEADER
@ -66,9 +65,50 @@ def create_user(user_dict):
return user
def replace_translit(src):
ruchars = "абвгдеёжзийклмнопрстуфхцчшщъыьэюя."
enchars = [
"a",
"b",
"v",
"g",
"d",
"e",
"yo",
"zh",
"z",
"i",
"y",
"k",
"l",
"m",
"n",
"o",
"p",
"r",
"s",
"t",
"u",
"f",
"h",
"c",
"ch",
"sh",
"sch",
"",
"y",
"'",
"e",
"yu",
"ya",
"-",
]
# str.maketrans needs a dict here, since some replacements are multi-character
return src.translate(str.maketrans(dict(zip(ruchars, enchars))))
def generate_unique_slug(src):
print("[resolvers.auth] generating slug from: " + src)
slug = translit(src, "ru", reversed=True).replace(".", "-").lower()
slug = replace_translit(src.lower())
slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
if slug != src:
print("[resolvers.auth] translited name: " + slug)

View File

@ -1,9 +1,9 @@
from datetime import datetime, timedelta, timezone
from auth.jwtcodec import JWTCodec
from base.redis import redis
from auth.validations import AuthInput
from services.redis import redis
from settings import ONETIME_TOKEN_LIFE_SPAN, SESSION_TOKEN_LIFE_SPAN
from validations.auth import AuthInput
async def save(token_key, life_span, auto_delete=True):

116
auth/validations.py Normal file
View File

@ -0,0 +1,116 @@
import re
from datetime import datetime
from typing import Dict, List, Optional, Union
from pydantic import BaseModel, Field, field_validator
# RFC 5322 compliant email regex pattern
EMAIL_PATTERN = r"^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$"
class AuthInput(BaseModel):
"""Base model for authentication input validation"""
user_id: str = Field(description="Unique user identifier")
username: str = Field(min_length=2, max_length=50)
token: str = Field(min_length=32)
@field_validator("user_id")
@classmethod
def validate_user_id(cls, v: str) -> str:
if not v.strip():
raise ValueError("user_id cannot be empty")
return v
class UserRegistrationInput(BaseModel):
"""Validation model for user registration"""
email: str = Field(max_length=254) # Max email length per RFC 5321
password: str = Field(min_length=8, max_length=100)
name: str = Field(min_length=2, max_length=50)
@field_validator("email")
@classmethod
def validate_email(cls, v: str) -> str:
"""Validate email format"""
if not re.match(EMAIL_PATTERN, v):
raise ValueError("Invalid email format")
return v.lower()
@field_validator("password")
@classmethod
def validate_password_strength(cls, v: str) -> str:
"""Validate password meets security requirements"""
if not any(c.isupper() for c in v):
raise ValueError("Password must contain at least one uppercase letter")
if not any(c.islower() for c in v):
raise ValueError("Password must contain at least one lowercase letter")
if not any(c.isdigit() for c in v):
raise ValueError("Password must contain at least one number")
if not any(c in "!@#$%^&*()_+-=[]{}|;:,.<>?" for c in v):
raise ValueError("Password must contain at least one special character")
return v
class UserLoginInput(BaseModel):
"""Validation model for user login"""
email: str = Field(max_length=254)
password: str = Field(min_length=8, max_length=100)
@field_validator("email")
@classmethod
def validate_email(cls, v: str) -> str:
if not re.match(EMAIL_PATTERN, v):
raise ValueError("Invalid email format")
return v.lower()
class TokenPayload(BaseModel):
"""Validation model for JWT token payload"""
user_id: str
username: str
exp: datetime
iat: datetime
scopes: Optional[List[str]] = []
class OAuthInput(BaseModel):
"""Validation model for OAuth input"""
provider: str = Field(pattern="^(google|github|facebook)$")
code: str
redirect_uri: Optional[str] = None
@field_validator("provider")
@classmethod
def validate_provider(cls, v: str) -> str:
valid_providers = ["google", "github", "facebook"]
if v.lower() not in valid_providers:
raise ValueError(f"Provider must be one of: {', '.join(valid_providers)}")
return v.lower()
class AuthResponse(BaseModel):
"""Validation model for authentication responses"""
success: bool
token: Optional[str] = None
error: Optional[str] = None
user: Optional[Dict[str, Union[str, int, bool]]] = None
@field_validator("error")
@classmethod
def validate_error_if_not_success(cls, v: Optional[str], info) -> Optional[str]:
if not info.data.get("success") and not v:
raise ValueError("Error message required when success is False")
return v
@field_validator("token")
@classmethod
def validate_token_if_success(cls, v: Optional[str], info) -> Optional[str]:
if info.data.get("success") and not v:
raise ValueError("Token required when success is True")
return v

View File

@ -1,57 +0,0 @@
from typing import Any, Callable, Dict, Generic, TypeVar
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import Session
from sqlalchemy.sql.schema import Table
from settings import DB_URL
engine = create_engine(DB_URL, echo=False, pool_size=10, max_overflow=20)
T = TypeVar("T")
REGISTRY: Dict[str, type] = {}
def local_session():
return Session(bind=engine, expire_on_commit=False)
DeclarativeBase = declarative_base() # type: Any
class Base(DeclarativeBase):
__table__: Table
__tablename__: str
__new__: Callable
__init__: Callable
__allow_unmapped__ = True
__abstract__ = True
__table_args__ = {"extend_existing": True}
id = Column(Integer, primary_key=True)
def __init_subclass__(cls, **kwargs):
REGISTRY[cls.__name__] = cls
@classmethod
def create(cls: Generic[T], **kwargs) -> Generic[T]:
instance = cls(**kwargs)
return instance.save()
def save(self) -> Generic[T]:
with local_session() as session:
session.add(self)
session.commit()
return self
def update(self, input):
column_names = self.__table__.columns.keys()
for name, value in input.items():
if name in column_names:
setattr(self, name, value)
def dict(self) -> Dict[str, Any]:
column_names = self.__table__.columns.keys()
return {c: getattr(self, c) for c in column_names}

View File

@ -1,13 +0,0 @@
from ariadne import MutationType, QueryType, ScalarType
datetime_scalar = ScalarType("DateTime")
@datetime_scalar.serializer
def serialize_datetime(value):
return value.isoformat()
query = QueryType()
mutation = MutationType()
resolvers = [query, mutation, datetime_scalar]

628
cache/cache.py vendored Normal file
View File

@ -0,0 +1,628 @@
"""
Caching system for the Discours platform
----------------------------------------
This module provides a comprehensive caching solution with these key components:
1. KEY NAMING CONVENTIONS:
- Entity-based keys: "entity:property:value" (e.g., "author:id:123")
- Collection keys: "entity:collection:params" (e.g., "authors:stats:limit=10:offset=0")
- Special case keys: Maintained for backwards compatibility (e.g., "topic_shouts_123")
2. CORE FUNCTIONS:
- cached_query(): High-level function for retrieving cached data or executing queries
3. ENTITY-SPECIFIC FUNCTIONS:
- cache_author(), cache_topic(): Cache entity data
- get_cached_author(), get_cached_topic(): Retrieve entity data from cache
- invalidate_cache_by_prefix(): Invalidate all keys with a specific prefix
4. CACHE INVALIDATION STRATEGY:
- Direct invalidation via invalidate_* functions for immediate changes
- Delayed invalidation via revalidation_manager for background processing
- Event-based triggers for automatic cache updates (see triggers.py)
To maintain consistency with the existing codebase, this module preserves
the original key naming patterns while providing a more structured approach
for new cache operations.
"""
import asyncio
import json
from typing import Any, Dict, List, Optional, Union
import orjson
from sqlalchemy import and_, join, select
from orm.author import Author, AuthorFollower
from orm.shout import Shout, ShoutAuthor, ShoutTopic
from orm.topic import Topic, TopicFollower
from services.db import local_session
from services.redis import redis
from utils.encoders import CustomJSONEncoder
from utils.logger import root_logger as logger
DEFAULT_FOLLOWS = {
"topics": [],
"authors": [],
"shouts": [],
"communities": [{"id": 1, "name": "Дискурс", "slug": "discours", "pic": ""}],
}
CACHE_TTL = 300 # 5 minutes
# Key templates for common entity types
# These are used throughout the codebase and should be maintained for compatibility
CACHE_KEYS = {
"TOPIC_ID": "topic:id:{}",
"TOPIC_SLUG": "topic:slug:{}",
"TOPIC_AUTHORS": "topic:authors:{}",
"TOPIC_FOLLOWERS": "topic:followers:{}",
"TOPIC_SHOUTS": "topic_shouts_{}",
"AUTHOR_ID": "author:id:{}",
"AUTHOR_USER": "author:user:{}",
"SHOUTS": "shouts:{}",
}
# Cache topic data
async def cache_topic(topic: dict):
payload = json.dumps(topic, cls=CustomJSONEncoder)
await asyncio.gather(
redis.execute("SET", f"topic:id:{topic['id']}", payload),
redis.execute("SET", f"topic:slug:{topic['slug']}", payload),
)
# Cache author data
async def cache_author(author: dict):
payload = json.dumps(author, cls=CustomJSONEncoder)
await asyncio.gather(
redis.execute("SET", f"author:user:{author['user'].strip()}", str(author["id"])),
redis.execute("SET", f"author:id:{author['id']}", payload),
)
# Cache follows data
async def cache_follows(follower_id: int, entity_type: str, entity_id: int, is_insert=True):
key = f"author:follows-{entity_type}s:{follower_id}"
follows_str = await redis.execute("GET", key)
follows = orjson.loads(follows_str) if follows_str else DEFAULT_FOLLOWS[entity_type]
if is_insert:
if entity_id not in follows:
follows.append(entity_id)
else:
follows = [eid for eid in follows if eid != entity_id]
await redis.execute("SET", key, json.dumps(follows, cls=CustomJSONEncoder))
await update_follower_stat(follower_id, entity_type, len(follows))
# Update follower statistics
async def update_follower_stat(follower_id, entity_type, count):
follower_key = f"author:id:{follower_id}"
follower_str = await redis.execute("GET", follower_key)
follower = orjson.loads(follower_str) if follower_str else None
if follower:
follower["stat"] = {f"{entity_type}s": count}
await cache_author(follower)
# Get author from cache
async def get_cached_author(author_id: int, get_with_stat):
author_key = f"author:id:{author_id}"
result = await redis.execute("GET", author_key)
if result:
return orjson.loads(result)
# Load from database if not found in cache
q = select(Author).where(Author.id == author_id)
authors = get_with_stat(q)
if authors:
author = authors[0]
await cache_author(author.dict())
return author.dict()
return None
# Function to get cached topic
async def get_cached_topic(topic_id: int):
"""
Fetch topic data from cache or database by id.
Args:
topic_id (int): The identifier for the topic.
Returns:
dict: Topic data or None if not found.
"""
topic_key = f"topic:id:{topic_id}"
cached_topic = await redis.execute("GET", topic_key)
if cached_topic:
return orjson.loads(cached_topic)
# If not in cache, fetch from the database
with local_session() as session:
topic = session.execute(select(Topic).where(Topic.id == topic_id)).scalar_one_or_none()
if topic:
topic_dict = topic.dict()
await redis.execute("SET", topic_key, json.dumps(topic_dict, cls=CustomJSONEncoder))
return topic_dict
return None
# Get topic by slug from cache
async def get_cached_topic_by_slug(slug: str, get_with_stat):
topic_key = f"topic:slug:{slug}"
result = await redis.execute("GET", topic_key)
if result:
return orjson.loads(result)
# Load from database if not found in cache
topic_query = select(Topic).where(Topic.slug == slug)
topics = get_with_stat(topic_query)
if topics:
topic_dict = topics[0].dict()
await cache_topic(topic_dict)
return topic_dict
return None
# Get list of authors by ID from cache
async def get_cached_authors_by_ids(author_ids: List[int]) -> List[dict]:
# Fetch all author data concurrently
keys = [f"author:id:{author_id}" for author_id in author_ids]
results = await asyncio.gather(*(redis.execute("GET", key) for key in keys))
authors = [orjson.loads(result) if result else None for result in results]
# Load missing authors from database and cache
missing_indices = [index for index, author in enumerate(authors) if author is None]
if missing_indices:
missing_ids = [author_ids[index] for index in missing_indices]
with local_session() as session:
query = select(Author).where(Author.id.in_(missing_ids))
missing_authors = session.execute(query).scalars().all()
await asyncio.gather(*(cache_author(author.dict()) for author in missing_authors))
# Map DB results by id: rows may not come back in missing_ids order
authors_by_id = {a.id: a.dict() for a in missing_authors}
for index in missing_indices:
authors[index] = authors_by_id.get(author_ids[index])
return authors
async def get_cached_topic_followers(topic_id: int):
"""
Get a topic's followers by ID, using the Redis cache.
Args:
topic_id: topic ID
Returns:
List[dict]: list of followers with their data
"""
try:
cache_key = CACHE_KEYS["TOPIC_FOLLOWERS"].format(topic_id)
cached = await redis.execute("GET", cache_key)
if cached:
followers_ids = orjson.loads(cached)
logger.debug(f"Found {len(followers_ids)} cached followers for topic #{topic_id}")
return await get_cached_authors_by_ids(followers_ids)
with local_session() as session:
followers_ids = [
f[0]
for f in session.query(Author.id)
.join(TopicFollower, TopicFollower.follower == Author.id)
.filter(TopicFollower.topic == topic_id)
.all()
]
await redis.execute("SETEX", cache_key, CACHE_TTL, orjson.dumps(followers_ids))
followers = await get_cached_authors_by_ids(followers_ids)
logger.debug(f"Cached {len(followers)} followers for topic #{topic_id}")
return followers
except Exception as e:
logger.error(f"Error getting followers for topic #{topic_id}: {str(e)}")
return []
# Get cached author followers
async def get_cached_author_followers(author_id: int):
# Check cache for data
cached = await redis.execute("GET", f"author:followers:{author_id}")
if cached:
followers_ids = orjson.loads(cached)
followers = await get_cached_authors_by_ids(followers_ids)
logger.debug(f"Cached followers for author #{author_id}: {len(followers)}")
return followers
# Query database if cache is empty
with local_session() as session:
followers_ids = [
f[0]
for f in session.query(Author.id)
.join(AuthorFollower, AuthorFollower.follower == Author.id)
.filter(AuthorFollower.author == author_id, Author.id != author_id)
.all()
]
await redis.execute("SET", f"author:followers:{author_id}", orjson.dumps(followers_ids))
followers = await get_cached_authors_by_ids(followers_ids)
return followers
# Get cached follower authors
async def get_cached_follower_authors(author_id: int):
# Attempt to retrieve authors from cache
cached = await redis.execute("GET", f"author:follows-authors:{author_id}")
if cached:
authors_ids = orjson.loads(cached)
else:
# Query authors from database
with local_session() as session:
authors_ids = [
a[0]
for a in session.execute(
select(Author.id)
.select_from(join(Author, AuthorFollower, Author.id == AuthorFollower.author))
.where(AuthorFollower.follower == author_id)
).all()
]
await redis.execute("SET", f"author:follows-authors:{author_id}", orjson.dumps(authors_ids))
authors = await get_cached_authors_by_ids(authors_ids)
return authors
# Get cached follower topics
async def get_cached_follower_topics(author_id: int):
# Attempt to retrieve topics from cache
cached = await redis.execute("GET", f"author:follows-topics:{author_id}")
if cached:
topics_ids = orjson.loads(cached)
else:
# Load topics from database and cache them
with local_session() as session:
topics_ids = [
t[0]
for t in session.query(Topic.id)
.join(TopicFollower, TopicFollower.topic == Topic.id)
.where(TopicFollower.follower == author_id)
.all()
]
await redis.execute("SET", f"author:follows-topics:{author_id}", orjson.dumps(topics_ids))
topics = []
for topic_id in topics_ids:
topic_str = await redis.execute("GET", f"topic:id:{topic_id}")
if topic_str:
topic = orjson.loads(topic_str)
if topic and topic not in topics:
topics.append(topic)
logger.debug(f"Cached topics for author#{author_id}: {len(topics)}")
return topics
# Get author by user ID from cache
async def get_cached_author_by_user_id(user_id: str, get_with_stat):
"""
Retrieve author information by user_id, checking the cache first, then the database.
Args:
user_id (str): The user identifier for which to retrieve the author.
Returns:
dict: Dictionary with author data or None if not found.
"""
# Attempt to find author ID by user_id in Redis cache
author_id = await redis.execute("GET", f"author:user:{user_id.strip()}")
if author_id:
# If ID is found, get full author data by ID
author_data = await redis.execute("GET", f"author:id:{author_id}")
if author_data:
return orjson.loads(author_data)
# If data is not found in cache, query the database
author_query = select(Author).where(Author.user == user_id)
authors = get_with_stat(author_query)
if authors:
# Cache the retrieved author data
author = authors[0]
author_dict = author.dict()
await asyncio.gather(
redis.execute("SET", f"author:user:{user_id.strip()}", str(author.id)),
redis.execute("SET", f"author:id:{author.id}", orjson.dumps(author_dict)),
)
return author_dict
# Return None if author is not found
return None
# Get cached topic authors
async def get_cached_topic_authors(topic_id: int):
"""
Retrieve a list of authors for a given topic, using cache or database.
Args:
topic_id (int): The identifier of the topic for which to retrieve authors.
Returns:
List[dict]: A list of dictionaries containing author data.
"""
# Attempt to get a list of author IDs from cache
rkey = f"topic:authors:{topic_id}"
cached_authors_ids = await redis.execute("GET", rkey)
if cached_authors_ids:
authors_ids = orjson.loads(cached_authors_ids)
else:
# If cache is empty, get data from the database
with local_session() as session:
query = (
select(ShoutAuthor.author)
.select_from(join(ShoutTopic, Shout, ShoutTopic.shout == Shout.id))
.join(ShoutAuthor, ShoutAuthor.shout == Shout.id)
.where(and_(ShoutTopic.topic == topic_id, Shout.published_at.is_not(None), Shout.deleted_at.is_(None)))
)
authors_ids = [author_id for (author_id,) in session.execute(query).all()]
# Cache the retrieved author IDs
await redis.execute("SET", rkey, orjson.dumps(authors_ids))
# Retrieve full author details from cached IDs
if authors_ids:
authors = await get_cached_authors_by_ids(authors_ids)
logger.debug(f"Topic#{topic_id} authors fetched and cached: {len(authors)} authors found.")
return authors
return []
async def invalidate_shouts_cache(cache_keys: List[str]):
"""
Invalidate cached shout collections for the given keys.
"""
for key in cache_keys:
try:
# Build the full cache key
cache_key = f"shouts:{key}"
# Delete the main cache entry
await redis.execute("DEL", cache_key)
logger.debug(f"Invalidated cache key: {cache_key}")
# Mark the key as invalidated, with a TTL
await redis.execute("SETEX", f"{cache_key}:invalidated", CACHE_TTL, "1")
# If this is a topic cache, invalidate related keys as well
if key.startswith("topic_"):
topic_id = key.split("_")[1]
related_keys = [
f"topic:id:{topic_id}",
f"topic:authors:{topic_id}",
f"topic:followers:{topic_id}",
f"topic:stats:{topic_id}",
]
for related_key in related_keys:
await redis.execute("DEL", related_key)
logger.debug(f"Invalidated related key: {related_key}")
except Exception as e:
logger.error(f"Error invalidating cache key {key}: {e}")
async def cache_topic_shouts(topic_id: int, shouts: List[dict]):
"""Кэширует список публикаций для темы"""
key = f"topic_shouts_{topic_id}"
payload = json.dumps(shouts, cls=CustomJSONEncoder)
await redis.execute("SETEX", key, CACHE_TTL, payload)
async def get_cached_topic_shouts(topic_id: int) -> Optional[List[dict]]:
"""Get the cached list of shouts for a topic (None if not cached)"""
key = f"topic_shouts_{topic_id}"
cached = await redis.execute("GET", key)
if cached:
return orjson.loads(cached)
return None
async def cache_related_entities(shout: Shout):
"""
Cache all entities related to a shout (authors and topics)
"""
tasks = []
for author in shout.authors:
tasks.append(cache_by_id(Author, author.id, cache_author))
for topic in shout.topics:
tasks.append(cache_by_id(Topic, topic.id, cache_topic))
await asyncio.gather(*tasks)
async def invalidate_shout_related_cache(shout: Shout, author_id: int):
"""
Invalidate all cache entries related to a shout and its associations
Args:
shout: the shout object
author_id: author ID
"""
cache_keys = {
"feed", # основная лента
f"author_{author_id}", # публикации автора
"random_top", # случайные топовые
"unrated", # неоцененные
"recent", # последние
"coauthored", # совместные
}
# Добавляем ключи авторов
cache_keys.update(f"author_{a.id}" for a in shout.authors)
cache_keys.update(f"authored_{a.id}" for a in shout.authors)
# Добавляем ключи тем
cache_keys.update(f"topic_{t.id}" for t in shout.topics)
cache_keys.update(f"topic_shouts_{t.id}" for t in shout.topics)
await invalidate_shouts_cache(list(cache_keys))
# Function removed - direct Redis calls used throughout the module instead
async def get_cached_entity(entity_type: str, entity_id: int, get_method, cache_method):
"""
Generic function for retrieving a cached entity
Args:
entity_type: 'author' or 'topic'
entity_id: entity ID
get_method: method to fetch from the DB
cache_method: caching method
"""
key = f"{entity_type}:id:{entity_id}"
cached = await redis.execute("GET", key)
if cached:
return orjson.loads(cached)
entity = await get_method(entity_id)
if entity:
await cache_method(entity)
return entity
return None
async def cache_by_id(entity, entity_id: int, cache_method):
"""
Cache an entity by ID using the given caching method
Args:
entity: entity class (Author/Topic)
entity_id: entity ID
cache_method: caching function
"""
from resolvers.stat import get_with_stat
caching_query = select(entity).filter(entity.id == entity_id)
result = get_with_stat(caching_query)
if not result or not result[0]:
logger.warning(f"{entity.__name__} with id {entity_id} not found")
return
x = result[0]
d = x.dict()
await cache_method(d)
return d
# Generic function for saving data to the cache
async def cache_data(key: str, data: Any, ttl: Optional[int] = None) -> None:
"""
Save data to the cache under the given key.
Args:
key: cache key
data: data to store
ttl: cache lifetime in seconds (None - indefinite)
"""
try:
payload = json.dumps(data, cls=CustomJSONEncoder)
if ttl:
await redis.execute("SETEX", key, ttl, payload)
else:
await redis.execute("SET", key, payload)
logger.debug(f"Данные сохранены в кеш по ключу {key}")
except Exception as e:
logger.error(f"Ошибка при сохранении данных в кеш: {e}")
# Универсальная функция для получения данных из кеша
async def get_cached_data(key: str) -> Optional[Any]:
"""
Get data from the cache by the given key.
Args:
key: cache key
Returns:
Any: data from the cache, or None if there is none
"""
try:
cached_data = await redis.execute("GET", key)
if cached_data:
loaded = orjson.loads(cached_data)
logger.debug(f"Данные получены из кеша по ключу {key}: {len(loaded)}")
return loaded
return None
except Exception as e:
logger.error(f"Ошибка при получении данных из кеша: {e}")
return None
# Универсальная функция для инвалидации кеша по префиксу
async def invalidate_cache_by_prefix(prefix: str) -> None:
"""
Инвалидирует все ключи кеша с указанным префиксом.
Args:
prefix: Префикс ключей кеша для инвалидации
"""
try:
keys = await redis.execute("KEYS", f"{prefix}:*")
if keys:
await redis.execute("DEL", *keys)
logger.debug(f"Удалено {len(keys)} ключей кеша с префиксом {prefix}")
except Exception as e:
logger.error(f"Ошибка при инвалидации кеша: {e}")
# Универсальная функция для получения и кеширования данных
async def cached_query(
cache_key: str,
query_func: callable,
ttl: Optional[int] = None,
force_refresh: bool = False,
use_key_format: bool = True,
**query_params,
) -> Any:
"""
Gets data from cache or executes query and saves result to cache.
Supports existing key formats for compatibility.
Args:
cache_key: Cache key or key template from CACHE_KEYS
query_func: Function to execute the query
ttl: Cache TTL in seconds (None - indefinite)
force_refresh: Force cache refresh
use_key_format: Whether to check if cache_key matches a key template in CACHE_KEYS
**query_params: Parameters to pass to the query function
Returns:
Any: Data from cache or query result
"""
# Check if cache_key matches a pattern in CACHE_KEYS
actual_key = cache_key
if use_key_format and "{}" in cache_key:
# Look for a template match in CACHE_KEYS
for key_name, key_format in CACHE_KEYS.items():
if cache_key == key_format:
# We have a match, now look for the id or value to format with
for param_name, param_value in query_params.items():
if param_name in ["id", "slug", "user", "topic_id", "author_id"]:
actual_key = cache_key.format(param_value)
break
# If not forcing refresh, try to get data from cache
if not force_refresh:
cached_result = await get_cached_data(actual_key)
if cached_result is not None:
return cached_result
# If data not in cache or refresh required, execute query
try:
result = await query_func(**query_params)
if result is not None:
# Save result to cache
await cache_data(actual_key, result, ttl)
return result
except Exception as e:
logger.error(f"Error executing query for caching: {e}")
# In case of error, return data from cache if not forcing refresh
if not force_refresh:
return await get_cached_data(actual_key)
raise

133
cache/precache.py vendored Normal file
View File

@ -0,0 +1,133 @@
import asyncio
import json
from sqlalchemy import and_, join, select
from cache.cache import cache_author, cache_topic
from orm.author import Author, AuthorFollower
from orm.shout import Shout, ShoutAuthor, ShoutReactionsFollower, ShoutTopic
from orm.topic import Topic, TopicFollower
from resolvers.stat import get_with_stat
from services.db import local_session
from services.redis import redis
from utils.encoders import CustomJSONEncoder
from utils.logger import root_logger as logger
# Precache an author's followers
async def precache_authors_followers(author_id, session):
authors_followers = set()
followers_query = select(AuthorFollower.follower).where(AuthorFollower.author == author_id)
result = session.execute(followers_query)
authors_followers.update(row[0] for row in result if row[0])
followers_payload = json.dumps(list(authors_followers), cls=CustomJSONEncoder)
await redis.execute("SET", f"author:followers:{author_id}", followers_payload)
# Precache an author's follows
async def precache_authors_follows(author_id, session):
follows_topics_query = select(TopicFollower.topic).where(TopicFollower.follower == author_id)
follows_authors_query = select(AuthorFollower.author).where(AuthorFollower.follower == author_id)
follows_shouts_query = select(ShoutReactionsFollower.shout).where(ShoutReactionsFollower.follower == author_id)
follows_topics = {row[0] for row in session.execute(follows_topics_query) if row[0]}
follows_authors = {row[0] for row in session.execute(follows_authors_query) if row[0]}
follows_shouts = {row[0] for row in session.execute(follows_shouts_query) if row[0]}
topics_payload = json.dumps(list(follows_topics), cls=CustomJSONEncoder)
authors_payload = json.dumps(list(follows_authors), cls=CustomJSONEncoder)
shouts_payload = json.dumps(list(follows_shouts), cls=CustomJSONEncoder)
await asyncio.gather(
redis.execute("SET", f"author:follows-topics:{author_id}", topics_payload),
redis.execute("SET", f"author:follows-authors:{author_id}", authors_payload),
redis.execute("SET", f"author:follows-shouts:{author_id}", shouts_payload),
)
# Precache topic authors
async def precache_topics_authors(topic_id: int, session):
topic_authors_query = (
select(ShoutAuthor.author)
.select_from(join(ShoutTopic, Shout, ShoutTopic.shout == Shout.id))
.join(ShoutAuthor, ShoutAuthor.shout == Shout.id)
.filter(
and_(
ShoutTopic.topic == topic_id,
Shout.published_at.is_not(None),
Shout.deleted_at.is_(None),
)
)
)
topic_authors = {row[0] for row in session.execute(topic_authors_query) if row[0]}
authors_payload = json.dumps(list(topic_authors), cls=CustomJSONEncoder)
await redis.execute("SET", f"topic:authors:{topic_id}", authors_payload)
# Precache topic followers
async def precache_topics_followers(topic_id: int, session):
followers_query = select(TopicFollower.follower).where(TopicFollower.topic == topic_id)
topic_followers = {row[0] for row in session.execute(followers_query) if row[0]}
followers_payload = json.dumps(list(topic_followers), cls=CustomJSONEncoder)
await redis.execute("SET", f"topic:followers:{topic_id}", followers_payload)
async def precache_data():
logger.info("precaching...")
try:
key = "authorizer_env"
# cache reset
value = await redis.execute("HGETALL", key)
await redis.execute("FLUSHDB")
logger.info("redis: FLUSHDB")
# Convert the dict into a flat argument list for HSET
if value:
# If the value is a dict, flatten it for HSET
if isinstance(value, dict):
flattened = []
for field, val in value.items():
flattened.extend([field, val])
await redis.execute("HSET", key, *flattened)
else:
# Assume the value is already a flat list
await redis.execute("HSET", key, *value)
logger.info(f"redis hash '{key}' was restored")
with local_session() as session:
# topics
q = select(Topic).where(Topic.community == 1)
topics = get_with_stat(q)
for topic in topics:
topic_dict = topic.dict() if hasattr(topic, "dict") else topic
await cache_topic(topic_dict)
await asyncio.gather(
precache_topics_followers(topic_dict["id"], session),
precache_topics_authors(topic_dict["id"], session),
)
logger.info(f"{len(topics)} topics and their followings precached")
# authors
authors = get_with_stat(select(Author).where(Author.user.is_not(None)))
logger.info(f"{len(authors)} authors found in database")
for author in authors:
if isinstance(author, Author):
profile = author.dict()
author_id = profile.get("id")
user_id = profile.get("user", "").strip()
if author_id and user_id:
await cache_author(profile)
await asyncio.gather(
precache_authors_followers(author_id, session), precache_authors_follows(author_id, session)
)
else:
logger.error(f"fail caching {author}")
logger.info(f"{len(authors)} authors and their followings precached")
except Exception as exc:
import traceback
traceback.print_exc()
logger.error(f"Error in precache_data: {exc}")

172
cache/revalidator.py vendored Normal file
View File

@ -0,0 +1,172 @@
import asyncio
from cache.cache import (
cache_author,
cache_topic,
get_cached_author,
get_cached_topic,
invalidate_cache_by_prefix,
)
from resolvers.stat import get_with_stat
from services.redis import redis
from utils.logger import root_logger as logger
CACHE_REVALIDATION_INTERVAL = 300 # 5 minutes
class CacheRevalidationManager:
def __init__(self, interval=CACHE_REVALIDATION_INTERVAL):
"""Инициализация менеджера с заданным интервалом проверки (в секундах)."""
self.interval = interval
self.items_to_revalidate = {"authors": set(), "topics": set(), "shouts": set(), "reactions": set()}
self.lock = asyncio.Lock()
self.running = True
self.MAX_BATCH_SIZE = 10 # maximum number of items to process one by one
self._redis = redis # keep a reference to the Redis client
async def start(self):
"""Запуск фонового воркера для ревалидации кэша."""
# Проверяем, что у нас есть соединение с Redis
if not self._redis._client:
logger.warning("Redis connection not established. Waiting for connection...")
try:
await self._redis.connect()
logger.info("Redis connection established for revalidation manager")
except Exception as e:
logger.error(f"Failed to connect to Redis: {e}")
self.task = asyncio.create_task(self.revalidate_cache())
async def revalidate_cache(self):
"""Циклическая проверка и ревалидация кэша каждые self.interval секунд."""
try:
while self.running:
await asyncio.sleep(self.interval)
await self.process_revalidation()
except asyncio.CancelledError:
logger.info("Revalidation worker was stopped.")
except Exception as e:
logger.error(f"An error occurred in the revalidation worker: {e}")
async def process_revalidation(self):
"""Обновление кэша для всех сущностей, требующих ревалидации."""
# Проверяем соединение с Redis
if not self._redis._client:
return # Выходим из метода, если не удалось подключиться
async with self.lock:
# Ревалидация кэша авторов
if self.items_to_revalidate["authors"]:
logger.debug(f"Revalidating {len(self.items_to_revalidate['authors'])} authors")
for author_id in self.items_to_revalidate["authors"]:
if author_id == "all":
await invalidate_cache_by_prefix("authors")
break
author = await get_cached_author(author_id, get_with_stat)
if author:
await cache_author(author)
self.items_to_revalidate["authors"].clear()
# Revalidate the topic cache
if self.items_to_revalidate["topics"]:
logger.debug(f"Revalidating {len(self.items_to_revalidate['topics'])} topics")
for topic_id in self.items_to_revalidate["topics"]:
if topic_id == "all":
await invalidate_cache_by_prefix("topics")
break
topic = await get_cached_topic(topic_id)
if topic:
await cache_topic(topic)
self.items_to_revalidate["topics"].clear()
# Revalidate shouts (posts)
if self.items_to_revalidate["shouts"]:
shouts_count = len(self.items_to_revalidate["shouts"])
logger.debug(f"Revalidating {shouts_count} shouts")
# Check for the special 'all' flag
if "all" in self.items_to_revalidate["shouts"]:
await invalidate_cache_by_prefix("shouts")
# If there are many items (but not 'all'), use a targeted approach
elif shouts_count > self.MAX_BATCH_SIZE:
# Invalidate only the collection keys, which cover many entities
collection_keys = await asyncio.create_task(self._redis.execute("KEYS", "shouts:*"))
if collection_keys:
await self._redis.execute("DEL", *collection_keys)
logger.debug(f"Удалено {len(collection_keys)} коллекционных ключей шаутов")
# Обновляем кеш каждого конкретного шаута
for shout_id in self.items_to_revalidate["shouts"]:
if shout_id != "all":
# Точечная инвалидация для каждого shout_id
specific_keys = [f"shout:id:{shout_id}"]
for key in specific_keys:
await self._redis.execute("DEL", key)
logger.debug(f"Удален ключ кеша {key}")
else:
# Если элементов немного, обрабатываем каждый
for shout_id in self.items_to_revalidate["shouts"]:
if shout_id != "all":
# Точечная инвалидация для каждого shout_id
specific_keys = [f"shout:id:{shout_id}"]
for key in specific_keys:
await self._redis.execute("DEL", key)
logger.debug(f"Удален ключ кеша {key}")
self.items_to_revalidate["shouts"].clear()
# Likewise for reactions: targeted invalidation
if self.items_to_revalidate["reactions"]:
reactions_count = len(self.items_to_revalidate["reactions"])
logger.debug(f"Revalidating {reactions_count} reactions")
if "all" in self.items_to_revalidate["reactions"]:
await invalidate_cache_by_prefix("reactions")
elif reactions_count > self.MAX_BATCH_SIZE:
# Invalidate only the collection keys for reactions
collection_keys = await asyncio.create_task(self._redis.execute("KEYS", "reactions:*"))
if collection_keys:
await self._redis.execute("DEL", *collection_keys)
logger.debug(f"Удалено {len(collection_keys)} коллекционных ключей реакций")
# Точечная инвалидация для каждой реакции
for reaction_id in self.items_to_revalidate["reactions"]:
if reaction_id != "all":
specific_keys = [f"reaction:id:{reaction_id}"]
for key in specific_keys:
await self._redis.execute("DEL", key)
logger.debug(f"Удален ключ кеша {key}")
else:
# Точечная инвалидация для каждой реакции
for reaction_id in self.items_to_revalidate["reactions"]:
if reaction_id != "all":
specific_keys = [f"reaction:id:{reaction_id}"]
for key in specific_keys:
await self._redis.execute("DEL", key)
logger.debug(f"Удален ключ кеша {key}")
self.items_to_revalidate["reactions"].clear()
def mark_for_revalidation(self, entity_id, entity_type):
"""Отметить сущность для ревалидации."""
if entity_id and entity_type:
self.items_to_revalidate[entity_type].add(entity_id)
def invalidate_all(self, entity_type):
"""Пометить для инвалидации все элементы указанного типа."""
logger.debug(f"Marking all {entity_type} for invalidation")
# Special flag for full invalidation
self.items_to_revalidate[entity_type].add("all")
async def stop(self):
"""Остановка фонового воркера."""
self.running = False
if hasattr(self, "task"):
self.task.cancel()
try:
await self.task
except asyncio.CancelledError:
pass
revalidation_manager = CacheRevalidationManager()

147
cache/triggers.py vendored Normal file
View File

@ -0,0 +1,147 @@
from sqlalchemy import event
from cache.revalidator import revalidation_manager
from orm.author import Author, AuthorFollower
from orm.reaction import Reaction, ReactionKind
from orm.shout import Shout, ShoutAuthor, ShoutReactionsFollower
from orm.topic import Topic, TopicFollower
from services.db import local_session
from utils.logger import root_logger as logger
def mark_for_revalidation(entity, *args):
"""Отметка сущности для ревалидации."""
entity_type = (
"authors"
if isinstance(entity, Author)
else "topics"
if isinstance(entity, Topic)
else "reactions"
if isinstance(entity, Reaction)
else "shouts"
if isinstance(entity, Shout)
else None
)
if entity_type:
revalidation_manager.mark_for_revalidation(entity.id, entity_type)
def after_follower_handler(mapper, connection, target, is_delete=False):
"""Обработчик добавления, обновления или удаления подписки."""
entity_type = None
if isinstance(target, AuthorFollower):
entity_type = "authors"
elif isinstance(target, TopicFollower):
entity_type = "topics"
elif isinstance(target, ShoutReactionsFollower):
entity_type = "shouts"
if entity_type:
revalidation_manager.mark_for_revalidation(
target.author if entity_type == "authors" else target.topic, entity_type
)
if not is_delete:
revalidation_manager.mark_for_revalidation(target.follower, "authors")
def after_shout_handler(mapper, connection, target):
"""Обработчик изменения статуса публикации"""
if not isinstance(target, Shout):
return
# Проверяем изменение статуса публикации
# was_published = target.published_at is not None and target.deleted_at is None
# Всегда обновляем счетчики для авторов и тем при любом изменении поста
for author in target.authors:
revalidation_manager.mark_for_revalidation(author.id, "authors")
for topic in target.topics:
revalidation_manager.mark_for_revalidation(topic.id, "topics")
# Refresh the post itself
revalidation_manager.mark_for_revalidation(target.id, "shouts")
def after_reaction_handler(mapper, connection, target):
"""Обработчик для комментариев"""
if not isinstance(target, Reaction):
return
# Проверяем что это комментарий
is_comment = target.kind == ReactionKind.COMMENT.value
# Получаем связанный пост
shout_id = target.shout if isinstance(target.shout, int) else target.shout.id
if not shout_id:
return
# Обновляем счетчики для автора комментария
if target.created_by:
revalidation_manager.mark_for_revalidation(target.created_by, "authors")
# Обновляем счетчики для поста
revalidation_manager.mark_for_revalidation(shout_id, "shouts")
if is_comment:
# Для комментариев обновляем также авторов и темы
with local_session() as session:
shout = (
session.query(Shout)
.filter(
Shout.id == shout_id,
Shout.published_at.is_not(None),
Shout.deleted_at.is_(None),
)
.first()
)
if shout:
for author in shout.authors:
revalidation_manager.mark_for_revalidation(author.id, "authors")
for topic in shout.topics:
revalidation_manager.mark_for_revalidation(topic.id, "topics")
def events_register():
"""Регистрация обработчиков событий для всех сущностей."""
event.listen(ShoutAuthor, "after_insert", mark_for_revalidation)
event.listen(ShoutAuthor, "after_update", mark_for_revalidation)
event.listen(ShoutAuthor, "after_delete", mark_for_revalidation)
event.listen(AuthorFollower, "after_insert", after_follower_handler)
event.listen(AuthorFollower, "after_update", after_follower_handler)
event.listen(
AuthorFollower,
"after_delete",
lambda *args: after_follower_handler(*args, is_delete=True),
)
event.listen(TopicFollower, "after_insert", after_follower_handler)
event.listen(TopicFollower, "after_update", after_follower_handler)
event.listen(
TopicFollower,
"after_delete",
lambda *args: after_follower_handler(*args, is_delete=True),
)
event.listen(ShoutReactionsFollower, "after_insert", after_follower_handler)
event.listen(ShoutReactionsFollower, "after_update", after_follower_handler)
event.listen(
ShoutReactionsFollower,
"after_delete",
lambda *args: after_follower_handler(*args, is_delete=True),
)
event.listen(Reaction, "after_update", mark_for_revalidation)
event.listen(Author, "after_update", mark_for_revalidation)
event.listen(Topic, "after_update", mark_for_revalidation)
event.listen(Shout, "after_update", after_shout_handler)
event.listen(Shout, "after_delete", after_shout_handler)
event.listen(Reaction, "after_insert", after_reaction_handler)
event.listen(Reaction, "after_update", after_reaction_handler)
event.listen(Reaction, "after_delete", after_reaction_handler)
logger.info("Event handlers registered successfully.")

View File

@ -1,10 +0,0 @@
#!/usr/bin/env bash
echo "> isort"
isort .
echo "> black"
black .
echo "> flake8"
flake8 .
# echo "> mypy"
# mypy .

295
docs/caching.md Normal file
View File

@ -0,0 +1,295 @@
# Discours Caching System
## Overview
The Discours caching system is a comprehensive solution for improving platform performance. It uses Redis to store frequently requested data and to reduce the load on the primary database.
Caching is implemented as a multi-level system consisting of several modules:
- `cache.py` - the core module with caching functions
- `revalidator.py` - the asynchronous cache revalidation manager
- `triggers.py` - SQLAlchemy event triggers for automatic revalidation
- `precache.py` - pre-caching of data at application startup
## Key Components
### 1. Cache Key Formats
The system supports several key formats for compatibility and convenience:
- **Entity keys**: `entity:property:value` (e.g., `author:id:123`)
- **Collection keys**: `entity:collection:params` (e.g., `authors:stats:limit=10:offset=0`)
- **Special keys**: for backward compatibility (e.g., `topic_shouts_123`)
All standard key formats are stored in the `CACHE_KEYS` dictionary:
```python
CACHE_KEYS = {
    "TOPIC_ID": "topic:id:{}",
    "TOPIC_SLUG": "topic:slug:{}",
    "AUTHOR_ID": "author:id:{}",
    # and others...
}
```
### 2. Core Caching Functions
#### Key Structure
Instead of generating keys through helper functions, the system follows strict key-naming conventions:
1. **Keys for individual entities** follow the pattern:
```
entity:property:value
```
For example:
- `topic:id:123` - the topic with ID 123
- `author:slug:john-doe` - the author with slug "john-doe"
- `shout:id:456` - the publication with ID 456
2. **Keys for collections** follow the pattern:
```
entity:collection[:filter1=value1:filter2=value2:...]
```
For example:
- `topics:all:basic` - the basic list of all topics
- `authors:stats:limit=10:offset=0:sort=name` - a sorted, paginated list of authors
- `shouts:feed:limit=20:community=1` - a publication feed filtered by community
3. **Special key formats** for backward compatibility:
```
entity_action_id
```
For example:
- `topic_shouts_123` - publications for the topic with ID 123
Across all modules, developers must construct keys explicitly according to these conventions, which keeps caching uniform and predictable.
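For illustration, keys built by hand under these conventions look like this (a minimal sketch; the filter values are arbitrary examples):
```python
topic_key = CACHE_KEYS["TOPIC_ID"].format(123)  # "topic:id:123"
feed_key = "shouts:feed:limit=20:community=1"   # collection key with filters
legacy_key = "topic_shouts_123"                 # backward-compatible format
```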
#### Working with Cached Data
```python
async def cache_data(key, data, ttl=None)
async def get_cached_data(key)
```
These functions provide a universal interface for storing data in and retrieving it from the cache. They use Redis directly via `redis.execute()` calls.
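A minimal sketch of what these two functions can look like, assuming `redis.execute()` forwards raw Redis commands and `orjson` handles serialization (both are named in the implementation details below); this is an outline, not the exact implementation:
```python
import orjson

from services.redis import redis

async def cache_data(key: str, data, ttl: int | None = None) -> None:
    # Serialize to JSON bytes and store, with an optional expiry
    payload = orjson.dumps(data)
    if ttl:
        await redis.execute("SET", key, payload, "EX", ttl)
    else:
        await redis.execute("SET", key, payload)

async def get_cached_data(key: str):
    # Return the deserialized value, or None on a cache miss
    cached = await redis.execute("GET", key)
    return orjson.loads(cached) if cached else None
```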
#### High-Level Query Caching
```python
async def cached_query(cache_key, query_func, ttl=None, force_refresh=False, **query_params)
```
The `cached_query` function combines reading from the cache with running the query when the cache is empty. It is the primary function resolvers should use to cache query results.
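Its cache-aside behavior can be outlined as follows (a sketch under the assumption that `query_func` is awaitable and accepts `query_params`, not the exact implementation):
```python
async def cached_query(cache_key, query_func, ttl=None, force_refresh=False, **query_params):
    # Cache-aside: read first, fall back to the query, then populate the cache
    if not force_refresh:
        cached = await get_cached_data(cache_key)
        if cached is not None:
            return cached
    result = await query_func(**query_params)
    if result is not None:
        await cache_data(cache_key, result, ttl)
    return result
```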
### 3. Entity Caching
Dedicated functions exist for the main entity types:
```python
async def cache_topic(topic: dict)
async def cache_author(author: dict)
async def get_cached_topic(topic_id: int)
async def get_cached_author(author_id: int, get_with_stat)
```
These functions simplify working with the most common data types and ensure a consistent approach to caching them.
### 4. Working with Relations
The following functions handle relations between entities:
```python
async def cache_follows(follower_id, entity_type, entity_id, is_insert=True)
async def get_cached_topic_followers(topic_id)
async def get_cached_author_followers(author_id)
async def get_cached_follower_topics(author_id)
```
They make it efficient to cache and retrieve information about follows and the relations between authors, topics, and publications.
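For example, a follow resolver might record a new relation and read it back (a hypothetical usage; the IDs are arbitrary):
```python
# Record that author 42 followed topic 7, then fetch the topic's followers
await cache_follows(follower_id=42, entity_type="topic", entity_id=7, is_insert=True)
followers = await get_cached_topic_followers(7)
```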
## Cache Invalidation System
### 1. Direct Invalidation
The system supports two types of cache invalidation:
#### 1.1. Invalidation by Prefix
```python
async def invalidate_cache_by_prefix(prefix)
```
Invalidates every cache key that starts with the given prefix. Resolvers use it to drop a whole group of caches on bulk changes.
#### 1.2. Targeted Invalidation
```python
async def invalidate_authors_cache(author_id=None)
async def invalidate_topics_cache(topic_id=None)
```
These functions invalidate the cache for a single entity only, which reduces the load on Redis and avoids needlessly discarding cached data. If no entity ID is given, they fall back to prefix invalidation.
Examples of targeted invalidation:
```python
# Invalidate the cache for author with ID 123 only
await invalidate_authors_cache(123)
# Invalidate the cache for topic with ID 456 only
await invalidate_topics_cache(456)
```
### 2. Deferred Invalidation
The `revalidator.py` module implements deferred cache invalidation through the `CacheRevalidationManager` class:
```python
class CacheRevalidationManager:
    def __init__(self, interval=CACHE_REVALIDATION_INTERVAL):
        # ...
        self._redis = redis  # Direct reference to the Redis service
    async def start(self):
        # Check and establish the Redis connection
        # ...
    async def process_revalidation(self):
        # Process the items queued for revalidation
        # ...
    def mark_for_revalidation(self, entity_id, entity_type):
        # Queue an entity for revalidation
        # ...
```
The revalidation manager runs as an asynchronous background process that periodically (every 5 minutes by default) checks whether any entities are awaiting revalidation.
**Interaction with Redis:**
- CacheRevalidationManager keeps a direct reference to the Redis service in its `_redis` attribute
- On startup it checks the Redis connection and establishes a new one if necessary
- The connection is verified automatically before every revalidation pass
- The system restores the connection on its own whenever it is lost
**Implementation details:**
- Authors and topics are revalidated record by record
- Shouts and reactions are processed in batches, with a threshold of 10 items
- Once the threshold is reached, the system switches from per-record processing to invalidating whole collections
- A special `all` flag triggers full invalidation of every record of the given type
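That batching rule can be sketched like this (an illustrative outline only: the queue attribute and the `invalidate_shouts_cache` helper are assumptions, not the manager's actual internals):
```python
BATCH_THRESHOLD = 10  # the threshold described above

async def process_revalidation(self):
    # Authors and topics: revalidate each queued record individually
    for author_id in self.queue.pop("authors", set()):
        await invalidate_authors_cache(author_id)
    for topic_id in self.queue.pop("topics", set()):
        await invalidate_topics_cache(topic_id)
    # Shouts: switch to collection-level invalidation past the threshold
    shout_ids = self.queue.pop("shouts", set())
    if "all" in shout_ids or len(shout_ids) >= BATCH_THRESHOLD:
        await invalidate_cache_by_prefix("shouts")
    else:
        for shout_id in shout_ids:
            await invalidate_shouts_cache(shout_id)  # hypothetical per-shout helper
```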
### 3. Automatic Invalidation via Triggers
The `triggers.py` module registers SQLAlchemy event handlers that automatically mark entities for revalidation whenever data changes in the database:
```python
def events_register():
    event.listen(Author, "after_update", mark_for_revalidation)
    event.listen(Topic, "after_update", mark_for_revalidation)
    # and others...
```
The triggers have the following characteristics:
- They react to insert, update, and delete events
- They mark the affected entities for deferred revalidation
- They account for relations between entities (e.g., when a topic changes, its related shouts are refreshed too)
## Pre-Caching
The `precache.py` module pre-caches frequently used data at application startup:
```python
async def precache_data():
    # ...
```
This function runs when the application starts and fills the cache with the data users request most often.
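A rough outline of what the warm-up can do, assuming it simply iterates over the core entities (the actual selection and serialization logic is not shown here):
```python
async def precache_data():
    # Warm the cache with the entities requested most often
    with local_session() as session:
        for topic in session.query(Topic).all():
            await cache_topic(topic.dict())  # assumes a dict() serializer on the model
        for author in session.query(Author).all():
            await cache_author(author.dict())
```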
## Usage Examples
### Simple Caching of a Query Result
```python
async def get_topics_with_stats(limit=10, offset=0, by="title"):
    # Build the cache key according to the conventions
    cache_key = f"topics:stats:limit={limit}:offset={offset}:sort={by}"
    cached_data = await get_cached_data(cache_key)
    if cached_data:
        return cached_data
    # Query the database
    result = ...  # data-fetching logic
    await cache_data(cache_key, result, ttl=300)
    return result
```
### Using the Generic cached_query Function
```python
async def get_topics_with_stats(limit=10, offset=0, by="title"):
    async def fetch_data(limit, offset, by):
        # Data-fetching logic
        return result
    # Build the cache key according to the conventions
    cache_key = f"topics:stats:limit={limit}:offset={offset}:sort={by}"
    return await cached_query(
        cache_key,
        fetch_data,
        ttl=300,
        limit=limit,
        offset=offset,
        by=by
    )
```
### Targeted Cache Invalidation on Data Changes
```python
async def update_topic(topic_id, new_data):
    # Update the data in the database
    # ...
    # Invalidate the cache for the modified topic only
    await invalidate_topics_cache(topic_id)
    return updated_topic
```
## Debugging and Monitoring
The caching system uses a logger to track its operations:
```python
logger.debug(f"Cache hit for key {key}")
logger.debug(f"Removed {len(keys)} cache keys with prefix {prefix}")
logger.error(f"Cache invalidation error: {e}")
```
This makes it possible to monitor how the cache behaves and to catch potential problems early.
## Usage Recommendations
1. **Follow the key-naming conventions** - this is critical for cache consistency and predictability.
2. **Do not invent your own key formats** - use the existing patterns to keep things uniform.
3. **Do not forget invalidation** - always invalidate the cache when data changes.
4. **Prefer targeted invalidation** over prefix invalidation to reduce the load on Redis.
5. **Set sensible TTLs** - use different TTL values depending on how often the data changes.
6. **Do not cache large volumes of data** - cache only what genuinely improves performance.
## Implementation Details
- **Serialization**: `orjson` is used for efficient serialization and deserialization.
- **Date and time formatting**: `CustomJSONEncoder` handles dates correctly.
- **Asynchrony**: all caching operations are asynchronous to minimize their impact on API performance.
- **Direct Redis interaction**: every operation goes through direct `redis.execute()` calls with error handling.
- **Batch processing**: bulk operations use a threshold beyond which optimized strategies kick in.
## Known Limitations
1. **Data consistency** - the system does not guarantee absolute consistency between the cache and the database.
2. **Memory** - the volume of cached data must be monitored to avoid Redis memory issues.
3. **Redis performance** - under a heavy operation load, Redis itself can become a bottleneck.

165
docs/comments-pagination.md Normal file
View File

@ -0,0 +1,165 @@
# Comment Pagination
## Overview
A branch-based comment pagination system has been implemented that loads and displays nested discussion threads efficiently. Its main benefits:
1. Only the comments that are needed get loaded, not the whole tree
2. Reduced load on both the server and the client
3. Efficient navigation through large discussions
4. Preloading of the first N replies for better UX
## API for Hierarchical Comment Loading
### GraphQL Query `load_comments_branch`
### GraphQL запрос `load_comments_branch`
```graphql
query LoadCommentsBranch(
$shout: Int!,
$parentId: Int,
$limit: Int,
$offset: Int,
$sort: ReactionSort,
$childrenLimit: Int,
$childrenOffset: Int
) {
load_comments_branch(
shout: $shout,
parent_id: $parentId,
limit: $limit,
offset: $offset,
sort: $sort,
children_limit: $childrenLimit,
children_offset: $childrenOffset
) {
id
body
created_at
created_by {
id
name
slug
pic
}
kind
reply_to
stat {
rating
comments_count
}
first_replies {
id
body
created_at
created_by {
id
name
slug
pic
}
kind
reply_to
stat {
rating
comments_count
}
}
}
}
```
### Query Parameters
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| shout | Int! | - | ID of the article the comments belong to |
| parent_id | Int | null | ID of the parent comment. When null, root comments are loaded |
| limit | Int | 10 | Maximum number of comments to load |
| offset | Int | 0 | Pagination offset |
| sort | ReactionSort | newest | Sort order: newest, oldest, like |
| children_limit | Int | 3 | Maximum number of child comments per parent comment |
| children_offset | Int | 0 | Pagination offset for child comments |
### Response Fields
Each comment contains the following core fields:
- `id`: comment ID
- `body`: comment text
- `created_at`: creation time
- `created_by`: author information
- `kind`: reaction type (COMMENT)
- `reply_to`: parent comment ID (null for root comments)
- `first_replies`: the first N child comments
- `stat`: comment statistics, including:
  - `comments_count`: the number of replies to the comment
  - `rating`: the comment's rating
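On the server side, branch loading can be sketched roughly as follows (a hypothetical outline using the ORM models from this repository; sorting is fixed to newest-first here, and the real resolver also computes `stat`):
```python
from orm.reaction import Reaction, ReactionKind
from services.db import local_session

async def load_comments_branch_sketch(shout_id, parent_id=None, limit=10, offset=0,
                                      children_limit=3, children_offset=0):
    with local_session() as session:
        parent_filter = (
            Reaction.reply_to == parent_id if parent_id is not None
            else Reaction.reply_to.is_(None)
        )
        comments = (
            session.query(Reaction)
            .filter(
                Reaction.shout == shout_id,
                Reaction.kind == ReactionKind.COMMENT.value,
                parent_filter,
            )
            .order_by(Reaction.created_at.desc())  # "newest"
            .limit(limit)
            .offset(offset)
            .all()
        )
        for comment in comments:
            # Preload the first N replies so the client avoids extra round trips
            comment.first_replies = (
                session.query(Reaction)
                .filter(
                    Reaction.reply_to == comment.id,
                    Reaction.kind == ReactionKind.COMMENT.value,
                )
                .order_by(Reaction.created_at.asc())
                .limit(children_limit)
                .offset(children_offset)
                .all()
            )
        return comments
```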
## Usage Examples
### Loading Root Comments with Their First Replies
```javascript
const { data } = await client.query({
query: LOAD_COMMENTS_BRANCH,
variables: {
shout: 222,
limit: 10,
offset: 0,
sort: "newest",
childrenLimit: 3
}
});
```
### Loading Replies to a Specific Comment
```javascript
const { data } = await client.query({
query: LOAD_COMMENTS_BRANCH,
variables: {
shout: 222,
    parentId: 123, // ID of the comment whose replies we are loading
limit: 10,
offset: 0,
sort: "oldest" // Сортируем ответы от старых к новым
}
});
```
### Paginating Child Comments
To load additional replies to a comment:
```javascript
const { data } = await client.query({
query: LOAD_COMMENTS_BRANCH,
variables: {
shout: 222,
parentId: 123,
limit: 10,
offset: 0,
childrenLimit: 5,
    childrenOffset: 3 // Skip the first 3 comments (already loaded)
}
});
```
## Client Implementation Recommendations
1. For working with complex discussion threads efficiently, it is recommended to:
   - Load only the root comments with their first N replies at first
   - When more replies exist (`stat.comments_count > first_replies.length`), show a "Show all replies" button
   - On click, load the remaining replies with a query that sets `parentId`
2. For sorting:
   - Default to `newest` to surface fresh discussions
   - Provide a sort toggle for the whole comment tree
   - Reload the data with the new `sort` parameter when the sort order changes
3. For performance:
   - Cache query results on the client
   - Use optimistic updates when adding or editing comments
   - Load comments in chunks when necessary (lazy loading)

48
docs/features.md Normal file
View File

@ -0,0 +1,48 @@
## Publication Views
- Google Analytics integration for tracking publication views
- Counts of unique users and total views
- Automatic statistics refresh when publication data is requested
## Multi-Domain Authorization
- Authorization support across multiple domains
- Automatic detection of the authorization server
- Correct CORS handling for all supported domains
## Caching System
- Redis serves as the primary caching mechanism
- The cache_on_arguments decorator supports both synchronous and asynchronous functions
- Automatic JSON serialization/deserialization using CustomJSONEncoder
- Fallback pickle serialization for complex objects
- Unique cache keys generated from the function signature and the passed arguments (see the sketch after this list)
- Configurable cache lifetime (TTL)
- Manual cache invalidation for specific functions and arguments
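A sketch of how such a decorator can derive keys from the call signature, reusing the `cache_data`/`get_cached_data` helpers described in docs/caching.md (an illustration only; the real `cache_on_arguments` may differ):
```python
import hashlib
import inspect
import json
from functools import wraps

def cache_on_arguments(ttl=300):
    def decorator(func):
        @wraps(func)
        async def wrapper(*args, **kwargs):
            # Derive a unique key from the function name and its arguments
            raw = json.dumps(
                {"fn": func.__qualname__, "args": args, "kwargs": kwargs},
                default=str, sort_keys=True,
            )
            key = f"cache:{hashlib.sha256(raw.encode()).hexdigest()}"
            cached = await get_cached_data(key)
            if cached is not None:
                return cached
            # Support both sync and async wrapped functions
            if inspect.iscoroutinefunction(func):
                result = await func(*args, **kwargs)
            else:
                result = func(*args, **kwargs)
            await cache_data(key, result, ttl)
            return result
        return wrapper
    return decorator
```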
## Webhooks
- Automatic webhook registration for the user.login event
- Prevention of duplicate webhook creation
- Automatic cleanup of stale webhooks
- Webhook authorization via WEBHOOK_SECRET
- Error handling for webhook operations
- Dynamic endpoint resolution based on the environment
## CORS Configuration
- Supported methods: GET, POST, OPTIONS
- Credentials support enabled
- Allowed headers: Authorization, Content-Type, X-Requested-With, DNT, Cache-Control
- Preflight responses cached for 20 days (1728000 seconds)
## Branch-Based Comment Pagination
- Efficient comment loading that respects the hierarchical structure
- A dedicated `load_comments_branch` query for optimized loading of a comment branch
- Loading an article's root comments together with their first replies
- Flexible pagination for both root and child comments
- The `stat.comments_count` field reports the number of replies to a comment
- A dedicated `first_replies` field holds the first replies to a comment
- Support for several sort orders (newest, oldest, popular)
- Optimized SQL queries to minimize database load

94
docs/follower.md Normal file
View File

@ -0,0 +1,94 @@
# Following System
## Overview
System supports following different entity types:
- Authors
- Topics
- Communities
- Shouts (Posts)
## GraphQL API
### Mutations
#### follow
Follow an entity (author/topic/community/shout).
**Parameters:**
- `what: String!` - Entity type (`AUTHOR`, `TOPIC`, `COMMUNITY`, `SHOUT`)
- `slug: String` - Entity slug
- `entity_id: Int` - Optional entity ID
**Returns:**
```typescript
{
authors?: Author[] // For AUTHOR type
topics?: Topic[] // For TOPIC type
communities?: Community[] // For COMMUNITY type
shouts?: Shout[] // For SHOUT type
error?: String // Error message if any
}
```
#### unfollow
Unfollow an entity.
**Parameters:** Same as `follow`
**Returns:** Same as `follow`
### Queries
#### get_shout_followers
Get list of users who reacted to a shout.
**Parameters:**
- `slug: String` - Shout slug
- `shout_id: Int` - Optional shout ID
**Returns:**
```typescript
Author[] // List of authors who reacted
```
## Caching System
### Supported Entity Types
- Authors: `cache_author`, `get_cached_follower_authors`
- Topics: `cache_topic`, `get_cached_follower_topics`
- Communities: No cache
- Shouts: No cache
### Cache Flow
1. On follow/unfollow:
- Update entity in cache
- Update follower's following list
2. Cache is updated before notifications
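Put together, a follow operation under these rules might look like the sketch below (`notify_follower`, the module paths, and the column names are assumptions based on the behavior described in this document, not the real resolver code):
```python
from cache.cache import cache_author, cache_follows  # assumed module path
from orm.author import Author, AuthorFollower        # assumed module path
from services.db import local_session
from services.notify import notify_follower          # assumed helper

async def follow_author_sketch(follower: Author, author: Author):
    with local_session() as session:
        # Duplicate-safe persistence of the follower row
        exists = (
            session.query(AuthorFollower)
            .filter_by(follower=follower.id, author=author.id)
            .first()
        )
        if not exists:
            session.add(AuthorFollower(follower=follower.id, author=author.id))
            session.commit()
    # Update caches before sending notifications, per the cache flow above
    await cache_author(author.dict())
    await cache_follows(follower.id, "author", author.id, is_insert=True)
    # Notify the followed author with follower info and the action type
    await notify_follower(follower.dict(), author.id, "follow")
```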
## Notifications
- Sent when author is followed/unfollowed
- Contains:
- Follower info
- Author ID
- Action type ("follow"/"unfollow")
## Error Handling
- Unauthorized access check
- Entity existence validation
- Duplicate follow prevention
- Full error logging
- Transaction safety with `local_session()`
## Database Schema
### Follower Tables
- `AuthorFollower`
- `TopicFollower`
- `CommunityFollower`
- `ShoutReactionsFollower`
Each table contains:
- `follower` - ID of following user
- `{entity_type}` - ID of followed entity

80
docs/load_shouts.md Normal file
View File

@ -0,0 +1,80 @@
# Publication Loading System
## Implementation Highlights
### Base Query
- Automatically loads the primary author
- Attaches the publication's primary topic
- Supports a flexible filtering system
- Optimizes queries based on the requested fields
### Statistics
- Like/dislike counts
- Comment count
- Date of the last reaction
- Statistics are loaded only when the `stat` field is requested
### Performance Optimizations
- Lazy loading of related data
- Results cached for 5 minutes
- Batch loading of authors and topics
- Subqueries for complex selections
## Feed Types
### Random Top Posts (load_shouts_random_top)
**Advantages:**
- Diverse content
- Fast selection from a cached pool of top posts
- Configurable pool size for the selection
**Limitations:**
- Refreshed once every 5 minutes
- Maximum pool size: 100 posts
- Only likes/dislikes are counted (no comments)
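A rough sketch of how such a pool-based selection can work, reusing the caching helpers described in docs/caching.md (`fetch_top_shout_ids` is a hypothetical helper standing in for the rating query):
```python
import random

async def load_shouts_random_top_sketch(limit=10, pool_size=100):
    cache_key = f"shouts:random_top:pool={pool_size}"
    pool = await get_cached_data(cache_key)
    if pool is None:
        # Rating = likes minus dislikes; comments are not counted
        pool = await fetch_top_shout_ids(pool_size)
        await cache_data(cache_key, pool, ttl=300)  # pool refreshed every 5 minutes
    return random.sample(pool, min(limit, len(pool)))
```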
### Unrated Posts (load_shouts_unrated)
**Advantages:**
- Helps surface new content
- Evens out the distribution of ratings
- Random output order
**Limitations:**
- Only posts with fewer than 3 reactions
- Comments are not counted
- No rating-based sorting
### Bookmarks (load_shouts_bookmarked)
**Advantages:**
- Personalized selection
- Quick access to saved posts
- All filters supported
**Limitations:**
- Requires authorization
- Limit on the number of bookmarks
- Caching disabled
## Important Notes
### Pagination
- Default page size: 10
- Maximum page size: 100
- Cursor-based pagination supported
### Caching
- TTL: 5 minutes
- Invalidation when a post changes
- A separate cache for each sort order
### Sorting
- By rating (likes minus dislikes)
- By comment count
- By date of the last reaction
- By publication date (the default)
### Security
- Access rights checks
- Filtering of deleted content
- SQL injection protection
- Input validation

82
docs/rating.md Normal file
View File

@ -0,0 +1,82 @@
# Rating System
## GraphQL Resolvers
### Queries
#### get_my_rates_shouts
Get user's reactions (LIKE/DISLIKE) for specified posts.
**Parameters:**
- `shouts: [Int!]!` - array of shout IDs
**Returns:**
```typescript
[{
shout_id: Int
my_rate: ReactionKind // LIKE or DISLIKE
}]
```
#### get_my_rates_comments
Get user's reactions (LIKE/DISLIKE) for specified comments.
**Parameters:**
- `comments: [Int!]!` - array of comment IDs
**Returns:**
```typescript
[{
comment_id: Int
my_rate: ReactionKind // LIKE or DISLIKE
}]
```
### Mutations
#### rate_author
Rate another author (karma system).
**Parameters:**
- `rated_slug: String!` - author's slug
- `value: Int!` - rating value (positive/negative)
## Rating Calculation
### Author Rating Components
#### Shouts Rating
- Calculated from LIKE/DISLIKE reactions on author's posts
- Each LIKE: +1
- Each DISLIKE: -1
- Excludes deleted reactions
- Excludes comment reactions
#### Comments Rating
- Calculated from LIKE/DISLIKE reactions on author's comments
- Each LIKE: +1
- Each DISLIKE: -1
- Only reactions left on COMMENT-type reactions are counted
- Excludes deleted reactions
#### Legacy Karma
- Based on direct author ratings via `rate_author` mutation
- Stored in `AuthorRating` table
- Each positive rating: +1
- Each negative rating: -1
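For illustration, the shouts-rating component above could be computed with a single SQLAlchemy aggregate along these lines (a sketch only; the optimized helpers listed below differ, and the join and column names are assumptions):
```python
from sqlalchemy import case, func

def count_author_shouts_rating_sketch(session, author_id: int) -> int:
    rating = (
        session.query(
            func.coalesce(
                func.sum(
                    case(
                        (Reaction.kind == ReactionKind.LIKE.value, 1),
                        (Reaction.kind == ReactionKind.DISLIKE.value, -1),
                        else_=0,
                    )
                ),
                0,
            )
        )
        .select_from(Reaction)
        .join(Shout, Shout.id == Reaction.shout)
        .join(ShoutAuthor, ShoutAuthor.shout == Shout.id)
        .filter(
            ShoutAuthor.author == author_id,
            Reaction.deleted_at.is_(None),  # excludes deleted reactions
            Reaction.reply_to.is_(None),    # excludes reactions left on comments
        )
        .scalar()
    )
    return int(rating or 0)
```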
### Helper Functions
- `count_author_comments_rating()` - Calculate comment rating
- `count_author_shouts_rating()` - Calculate posts rating
- `get_author_rating_old()` - Get legacy karma rating
- `get_author_rating_shouts()` - Get posts rating (optimized)
- `get_author_rating_comments()` - Get comments rating (optimized)
- `add_author_rating_columns()` - Add rating columns to author query
## Notes
- All ratings exclude deleted content
- Reactions are unique per user/content
- Rating calculations are optimized with SQLAlchemy
- System supports both direct author rating and content-based rating

View File

@ -1 +0,0 @@
python -m gql_schema_codegen -p ./schema.graphql -t ./schema_types.py

159
main.py
View File

@ -1,5 +1,6 @@
import asyncio
import os
import sys
from importlib import import_module
from os.path import exists
@ -7,88 +8,108 @@ from ariadne import load_schema_from_path, make_executable_schema
from ariadne.asgi import GraphQL
from starlette.applications import Starlette
from starlette.middleware import Middleware
from starlette.middleware.authentication import AuthenticationMiddleware
from starlette.middleware.sessions import SessionMiddleware
from starlette.middleware.cors import CORSMiddleware
from starlette.requests import Request
from starlette.responses import JSONResponse, Response
from starlette.routing import Route
from auth.authenticate import JWTAuthenticate
from auth.oauth import oauth_authorize, oauth_login
from base.redis import redis
from base.resolvers import resolvers
from orm import init_tables
from resolvers.upload import upload_handler
from services.main import storages_init
from services.notifications.notification_service import notification_service
from services.notifications.sse import sse_subscribe_handler
from services.stat.viewed import ViewedStorage
# from services.zine.gittask import GitTask
from settings import DEV_SERVER_PID_FILE_NAME, SENTRY_DSN, SESSION_SECRET_KEY
from cache.precache import precache_data
from cache.revalidator import revalidation_manager
from services.exception import ExceptionHandlerMiddleware
from services.redis import redis
from services.schema import create_all_tables, resolvers
from services.search import search_service
from services.viewed import ViewedStorage
from services.webhook import WebhookEndpoint, create_webhook_endpoint
from settings import DEV_SERVER_PID_FILE_NAME, MODE
import_module("resolvers")
schema = make_executable_schema(load_schema_from_path("schema.graphql"), resolvers)
middleware = [
Middleware(AuthenticationMiddleware, backend=JWTAuthenticate()),
Middleware(SessionMiddleware, secret_key=SESSION_SECRET_KEY),
]
schema = make_executable_schema(load_schema_from_path("schema/"), resolvers)
async def start_up():
init_tables()
await redis.connect()
await storages_init()
views_stat_task = asyncio.create_task(ViewedStorage().worker())
print(views_stat_task)
# git_task = asyncio.create_task(GitTask.git_task_worker())
# print(git_task)
notification_service_task = asyncio.create_task(notification_service.worker())
print(notification_service_task)
async def start():
if MODE == "development":
if not exists(DEV_SERVER_PID_FILE_NAME):
# pid file management
with open(DEV_SERVER_PID_FILE_NAME, "w", encoding="utf-8") as f:
f.write(str(os.getpid()))
print(f"[main] process started in {MODE} mode")
async def lifespan(_app):
try:
create_all_tables()
await asyncio.gather(
redis.connect(),
precache_data(),
ViewedStorage.init(),
create_webhook_endpoint(),
search_service.info(),
start(),
revalidation_manager.start(),
)
yield
finally:
tasks = [redis.disconnect(), ViewedStorage.stop(), revalidation_manager.stop()]
await asyncio.gather(*tasks, return_exceptions=True)
# Create the GraphQL instance
graphql_app = GraphQL(schema, debug=True)
# Wrap the GraphQL handler for better error handling
async def graphql_handler(request: Request):
if request.method not in ["GET", "POST"]:
return JSONResponse({"error": "Method Not Allowed"}, status_code=405)
try:
import sentry_sdk
sentry_sdk.init(SENTRY_DSN)
result = await graphql_app.handle_request(request)
if isinstance(result, Response):
return result
return JSONResponse(result)
except asyncio.CancelledError:
return JSONResponse({"error": "Request cancelled"}, status_code=499)
except Exception as e:
print("[sentry] init error")
print(e)
print(f"GraphQL error: {str(e)}")
return JSONResponse({"error": str(e)}, status_code=500)
async def dev_start_up():
if exists(DEV_SERVER_PID_FILE_NAME):
await redis.connect()
return
else:
with open(DEV_SERVER_PID_FILE_NAME, "w", encoding="utf-8") as f:
f.write(str(os.getpid()))
await start_up()
async def shutdown():
await redis.disconnect()
routes = [
Route("/oauth/{provider}", endpoint=oauth_login),
Route("/oauth-authorize", endpoint=oauth_authorize),
Route("/upload", endpoint=upload_handler, methods=["POST"]),
Route("/subscribe/{user_id}", endpoint=sse_subscribe_handler),
middleware = [
    # Error handling comes first
Middleware(ExceptionHandlerMiddleware),
    # CORS must run before other middleware to handle preflight requests correctly
Middleware(
CORSMiddleware,
allow_origins=[
"https://localhost:3000",
"https://testing.discours.io",
"https://testing3.discours.io",
"https://discours.io",
"https://new.discours.io"
],
        allow_methods=["GET", "POST", "OPTIONS"],  # OPTIONS listed explicitly
allow_headers=["*"],
allow_credentials=True,
),
]
# Updated route wiring in Starlette
app = Starlette(
on_startup=[start_up],
on_shutdown=[shutdown],
routes=[
Route("/", graphql_handler, methods=["GET", "POST"]),
Route("/new-author", WebhookEndpoint),
],
middleware=middleware,
routes=routes,
)
app.mount("/", GraphQL(schema))
dev_app = Starlette(
lifespan=lifespan,
debug=True,
on_startup=[dev_start_up],
on_shutdown=[shutdown],
middleware=middleware,
routes=routes,
)
dev_app.mount("/", GraphQL(schema, debug=True))
app.add_middleware(ExceptionHandlerMiddleware)
if "dev" in sys.argv:
app.add_middleware(
CORSMiddleware,
allow_origins=["https://localhost:3000"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)

View File

@ -1,18 +0,0 @@
database_name="discoursio"
echo "DATABASE MIGRATION STARTED"
echo "Dropping database $database_name"
dropdb $database_name --force
if [ $? -ne 0 ]; then { echo "Failed to drop database, aborting." ; exit 1; } fi
echo "Database $database_name dropped"
echo "Creating database $database_name"
createdb $database_name
if [ $? -ne 0 ]; then { echo "Failed to create database, aborting." ; exit 1; } fi
echo "Database $database_name successfully created"
echo "Start migration"
python3 server.py migrate
if [ $? -ne 0 ]; then { echo "Migration failed, aborting." ; exit 1; } fi
echo 'Done!'

View File

@ -1,279 +0,0 @@
""" cmd managed migration """
import asyncio
import gc
import json
import sys
from datetime import datetime, timezone
import bs4
from migration.export import export_mdx
from migration.tables.comments import migrate as migrateComment
from migration.tables.comments import migrate_2stage as migrateComment_2stage
from migration.tables.content_items import get_shout_slug
from migration.tables.content_items import migrate as migrateShout
# from migration.tables.remarks import migrate as migrateRemark
from migration.tables.topics import migrate as migrateTopic
from migration.tables.users import migrate as migrateUser
from migration.tables.users import migrate_2stage as migrateUser_2stage
from migration.tables.users import post_migrate as users_post_migrate
from orm import init_tables
from orm.reaction import Reaction
TODAY = datetime.strftime(datetime.now(tz=timezone.utc), "%Y%m%d")
OLD_DATE = "2016-03-05 22:22:00.350000"
async def users_handle(storage):
"""migrating users first"""
counter = 0
id_map = {}
print("[migration] migrating %d users" % (len(storage["users"]["data"])))
for entry in storage["users"]["data"]:
oid = entry["_id"]
user = migrateUser(entry)
storage["users"]["by_oid"][oid] = user # full
del user["password"]
del user["emailConfirmed"]
del user["username"]
del user["email"]
storage["users"]["by_slug"][user["slug"]] = user # public
id_map[user["oid"]] = user["slug"]
counter += 1
ce = 0
for entry in storage["users"]["data"]:
ce += migrateUser_2stage(entry, id_map)
users_post_migrate()
async def topics_handle(storage):
"""topics from categories and tags"""
counter = 0
for t in storage["topics"]["tags"] + storage["topics"]["cats"]:
if t["slug"] in storage["replacements"]:
t["slug"] = storage["replacements"][t["slug"]]
topic = migrateTopic(t)
storage["topics"]["by_oid"][t["_id"]] = topic
storage["topics"]["by_slug"][t["slug"]] = topic
counter += 1
else:
print("[migration] topic " + t["slug"] + " ignored")
for oldslug, newslug in storage["replacements"].items():
if oldslug != newslug and oldslug in storage["topics"]["by_slug"]:
oid = storage["topics"]["by_slug"][oldslug]["_id"]
del storage["topics"]["by_slug"][oldslug]
storage["topics"]["by_oid"][oid] = storage["topics"]["by_slug"][newslug]
print("[migration] " + str(counter) + " topics migrated")
print("[migration] " + str(len(storage["topics"]["by_oid"].values())) + " topics by oid")
print("[migration] " + str(len(storage["topics"]["by_slug"].values())) + " topics by slug")
async def shouts_handle(storage, args):
"""migrating content items one by one"""
counter = 0
discours_author = 0
anonymous_author = 0
pub_counter = 0
ignored = 0
topics_dataset_bodies = []
topics_dataset_tlist = []
for entry in storage["shouts"]["data"]:
gc.collect()
# slug
slug = get_shout_slug(entry)
# single slug mode
if "-" in args and slug not in args:
continue
# migrate
shout_dict = await migrateShout(entry, storage)
if shout_dict:
storage["shouts"]["by_oid"][entry["_id"]] = shout_dict
storage["shouts"]["by_slug"][shout_dict["slug"]] = shout_dict
# shouts.topics
if not shout_dict["topics"]:
print("[migration] no topics!")
# with author
author = shout_dict["authors"][0]
if author["slug"] == "discours":
discours_author += 1
if author["slug"] == "anonymous":
anonymous_author += 1
# print('[migration] ' + shout['slug'] + ' with author ' + author)
if entry.get("published"):
if "mdx" in args:
export_mdx(shout_dict)
pub_counter += 1
# print main counter
counter += 1
print(
"[migration] shouts_handle %d: %s @%s"
% ((counter + 1), shout_dict["slug"], author["slug"])
)
b = bs4.BeautifulSoup(shout_dict["body"], "html.parser")
texts = [shout_dict["title"].lower().replace(r"[^а-яА-Яa-zA-Z]", "")]
texts = texts + b.findAll(text=True)
topics_dataset_bodies.append(" ".join([x.strip().lower() for x in texts]))
topics_dataset_tlist.append(shout_dict["topics"])
else:
ignored += 1
# np.savetxt('topics_dataset.csv', (topics_dataset_bodies, topics_dataset_tlist), delimiter=',
# ', fmt='%s')
print("[migration] " + str(counter) + " content items were migrated")
print("[migration] " + str(pub_counter) + " have been published")
print("[migration] " + str(discours_author) + " authored by @discours")
print("[migration] " + str(anonymous_author) + " authored by @anonymous")
# async def remarks_handle(storage):
# print("[migration] comments")
# c = 0
# for entry_remark in storage["remarks"]["data"]:
# remark = await migrateRemark(entry_remark, storage)
# c += 1
# print("[migration] " + str(c) + " remarks migrated")
async def comments_handle(storage):
print("[migration] comments")
id_map = {}
ignored_counter = 0
missed_shouts = {}
for oldcomment in storage["reactions"]["data"]:
if not oldcomment.get("deleted"):
reaction = await migrateComment(oldcomment, storage)
if isinstance(reaction, str):
missed_shouts[reaction] = oldcomment
elif isinstance(reaction, Reaction):
reaction = reaction.dict()
rid = reaction["id"]
oid = reaction["oid"]
id_map[oid] = rid
else:
ignored_counter += 1
for reaction in storage["reactions"]["data"]:
migrateComment_2stage(reaction, id_map)
print("[migration] " + str(len(id_map)) + " comments migrated")
print("[migration] " + str(ignored_counter) + " comments ignored")
print("[migration] " + str(len(missed_shouts.keys())) + " commented shouts missed")
missed_counter = 0
for missed in missed_shouts.values():
missed_counter += len(missed)
print("[migration] " + str(missed_counter) + " comments dropped")
async def all_handle(storage, args):
print("[migration] handle everything")
await users_handle(storage)
await topics_handle(storage)
print("[migration] users and topics are migrated")
await shouts_handle(storage, args)
# print("[migration] remarks...")
# await remarks_handle(storage)
print("[migration] migrating comments")
await comments_handle(storage)
# export_email_subscriptions()
print("[migration] done!")
def data_load():
storage = {
"content_items": {
"by_oid": {},
"by_slug": {},
},
"shouts": {"by_oid": {}, "by_slug": {}, "data": []},
"reactions": {"by_oid": {}, "by_slug": {}, "by_content": {}, "data": []},
"topics": {
"by_oid": {},
"by_slug": {},
"cats": [],
"tags": [],
},
"remarks": {"data": []},
"users": {"by_oid": {}, "by_slug": {}, "data": []},
"replacements": json.loads(open("migration/tables/replacements.json").read()),
}
try:
users_data = json.loads(open("migration/data/users.json").read())
print("[migration.load] " + str(len(users_data)) + " users ")
tags_data = json.loads(open("migration/data/tags.json").read())
storage["topics"]["tags"] = tags_data
print("[migration.load] " + str(len(tags_data)) + " tags ")
cats_data = json.loads(open("migration/data/content_item_categories.json").read())
storage["topics"]["cats"] = cats_data
print("[migration.load] " + str(len(cats_data)) + " cats ")
comments_data = json.loads(open("migration/data/comments.json").read())
storage["reactions"]["data"] = comments_data
print("[migration.load] " + str(len(comments_data)) + " comments ")
content_data = json.loads(open("migration/data/content_items.json").read())
storage["shouts"]["data"] = content_data
print("[migration.load] " + str(len(content_data)) + " content items ")
remarks_data = json.loads(open("migration/data/remarks.json").read())
storage["remarks"]["data"] = remarks_data
print("[migration.load] " + str(len(remarks_data)) + " remarks data ")
# fill out storage
for x in users_data:
storage["users"]["by_oid"][x["_id"]] = x
# storage['users']['by_slug'][x['slug']] = x
# no user.slug yet
print("[migration.load] " + str(len(storage["users"]["by_oid"].keys())) + " users by oid")
for x in tags_data:
storage["topics"]["by_oid"][x["_id"]] = x
storage["topics"]["by_slug"][x["slug"]] = x
for x in cats_data:
storage["topics"]["by_oid"][x["_id"]] = x
storage["topics"]["by_slug"][x["slug"]] = x
print(
"[migration.load] " + str(len(storage["topics"]["by_slug"].keys())) + " topics by slug"
)
for item in content_data:
slug = get_shout_slug(item)
storage["content_items"]["by_slug"][slug] = item
storage["content_items"]["by_oid"][item["_id"]] = item
print("[migration.load] " + str(len(content_data)) + " content items")
for x in comments_data:
storage["reactions"]["by_oid"][x["_id"]] = x
cid = x["contentItem"]
storage["reactions"]["by_content"][cid] = x
ci = storage["content_items"]["by_oid"].get(cid, {})
if "slug" in ci:
storage["reactions"]["by_slug"][ci["slug"]] = x
print(
"[migration.load] "
+ str(len(storage["reactions"]["by_content"].keys()))
+ " with comments"
)
storage["users"]["data"] = users_data
storage["topics"]["tags"] = tags_data
storage["topics"]["cats"] = cats_data
storage["shouts"]["data"] = content_data
storage["reactions"]["data"] = comments_data
except Exception as e:
raise e
return storage
async def handling_migration():
init_tables()
await all_handle(data_load(), sys.argv)
def process():
loop = asyncio.get_event_loop()
loop.run_until_complete(handling_migration())
if __name__ == "__main__":
process()

View File

@ -1,33 +0,0 @@
import gc
import json
import os
import bson
from .utils import DateTimeEncoder
def json_tables():
print("[migration] unpack dump/discours/*.bson to migration/data/*.json")
data = {
"content_items": [],
"content_item_categories": [],
"tags": [],
"email_subscriptions": [],
"users": [],
"comments": [],
"remarks": [],
}
for table in data.keys():
print("[migration] bson2json for " + table)
gc.collect()
lc = []
bs = open("dump/discours/" + table + ".bson", "rb").read()
base = 0
while base < len(bs):
base, d = bson.decode_document(bs, base)
lc.append(d)
data[table] = lc
open(os.getcwd() + "/migration/data/" + table + ".json", "w").write(
json.dumps(lc, cls=DateTimeEncoder)
)

View File

@ -1,137 +0,0 @@
import json
import os
from datetime import datetime, timezone
import frontmatter
from .extract import extract_html, extract_media
from .utils import DateTimeEncoder
OLD_DATE = "2016-03-05 22:22:00.350000"
EXPORT_DEST = "../discoursio-web/data/"
parentDir = "/".join(os.getcwd().split("/")[:-1])
contentDir = parentDir + "/discoursio-web/content/"
ts = datetime.now(tz=timezone.utc)
def get_metadata(r):
authors = []
for a in r["authors"]:
authors.append(
{ # a short version for public listings
"slug": a.slug or "discours",
"name": a.name or "Дискурс",
"userpic": a.userpic or "https://discours.io/static/img/discours.png",
}
)
metadata = {}
metadata["title"] = r.get("title", "").replace("{", "(").replace("}", ")")
metadata["authors"] = authors
metadata["createdAt"] = r.get("createdAt", ts)
metadata["layout"] = r["layout"]
metadata["topics"] = [topic for topic in r["topics"]]
metadata["topics"].sort()
if r.get("cover", False):
metadata["cover"] = r.get("cover")
return metadata
def export_mdx(r):
# print('[export] mdx %s' % r['slug'])
content = ""
metadata = get_metadata(r)
content = frontmatter.dumps(frontmatter.Post(r["body"], **metadata))
ext = "mdx"
filepath = contentDir + r["slug"]
bc = bytes(content, "utf-8").decode("utf-8", "ignore")
open(filepath + "." + ext, "w").write(bc)
def export_body(shout, storage):
entry = storage["content_items"]["by_oid"][shout["oid"]]
if entry:
body = extract_html(entry)
media = extract_media(entry)
shout["body"] = body # prepare_html_body(entry) # prepare_md_body(entry)
shout["media"] = media
export_mdx(shout)
print("[export] html for %s" % shout["slug"])
open(contentDir + shout["slug"] + ".html", "w").write(body)
else:
raise Exception("no content_items entry found")
def export_slug(slug, storage):
shout = storage["shouts"]["by_slug"][slug]
shout = storage["shouts"]["by_slug"].get(slug)
assert shout, "[export] no shout found by slug: %s " % slug
author = shout["authors"][0]
assert author, "[export] no author error"
export_body(shout, storage)
def export_email_subscriptions():
email_subscriptions_data = json.loads(open("migration/data/email_subscriptions.json").read())
for data in email_subscriptions_data:
# TODO: migrate to mailgun list manually
# migrate_email_subscription(data)
pass
print("[migration] " + str(len(email_subscriptions_data)) + " email subscriptions exported")
def export_shouts(storage):
# update what was just migrated or load json again
if len(storage["users"]["by_slugs"].keys()) == 0:
storage["users"]["by_slugs"] = json.loads(open(EXPORT_DEST + "authors.json").read())
print("[migration] " + str(len(storage["users"]["by_slugs"].keys())) + " exported authors ")
if len(storage["shouts"]["by_slugs"].keys()) == 0:
storage["shouts"]["by_slugs"] = json.loads(open(EXPORT_DEST + "articles.json").read())
print(
"[migration] " + str(len(storage["shouts"]["by_slugs"].keys())) + " exported articles "
)
for slug in storage["shouts"]["by_slugs"].keys():
export_slug(slug, storage)
def export_json(export_articles={}, export_authors={}, export_topics={}, export_comments={}):
open(EXPORT_DEST + "authors.json", "w").write(
json.dumps(
export_authors,
cls=DateTimeEncoder,
indent=4,
sort_keys=True,
ensure_ascii=False,
)
)
print("[migration] " + str(len(export_authors.items())) + " authors exported")
open(EXPORT_DEST + "topics.json", "w").write(
json.dumps(
export_topics,
cls=DateTimeEncoder,
indent=4,
sort_keys=True,
ensure_ascii=False,
)
)
print("[migration] " + str(len(export_topics.keys())) + " topics exported")
open(EXPORT_DEST + "articles.json", "w").write(
json.dumps(
export_articles,
cls=DateTimeEncoder,
indent=4,
sort_keys=True,
ensure_ascii=False,
)
)
print("[migration] " + str(len(export_articles.items())) + " articles exported")
open(EXPORT_DEST + "comments.json", "w").write(
json.dumps(
export_comments,
cls=DateTimeEncoder,
indent=4,
sort_keys=True,
ensure_ascii=False,
)
)
print("[migration] " + str(len(export_comments.items())) + " exported articles with comments")

View File

@ -1,276 +0,0 @@
import os
import re
from bs4 import BeautifulSoup
TOOLTIP_REGEX = r"(\/\/\/(.+)\/\/\/)"
contentDir = os.path.join(
os.path.dirname(os.path.realpath(__file__)), "..", "..", "discoursio-web", "content"
)
cdn = "https://images.discours.io"
def replace_tooltips(body):
# change if you prefer regexp
newbody = body
matches = list(re.finditer(TOOLTIP_REGEX, body, re.IGNORECASE | re.MULTILINE))[1:]
for match in matches:
newbody = body.replace(
match.group(1), '<Tooltip text="' + match.group(2) + '" />'
) # NOTE: doesn't work
if len(matches) > 0:
print("[extract] found %d tooltips" % len(matches))
return newbody
# def extract_footnotes(body, shout_dict):
# parts = body.split("&&&")
# lll = len(parts)
# newparts = list(parts)
# placed = False
# if lll & 1:
# if lll > 1:
# i = 1
# print("[extract] found %d footnotes in body" % (lll - 1))
# for part in parts[1:]:
# if i & 1:
# placed = True
# if 'a class="footnote-url" href=' in part:
# print("[extract] footnote: " + part)
# fn = 'a class="footnote-url" href="'
# extracted_link = part.split(fn, 1)[1].split('"', 1)[0]
# extracted_body = part.split(fn, 1)[1].split(">", 1)[1].split("</a>", 1)[0]
# print("[extract] footnote link: " + extracted_link)
# with local_session() as session:
# Reaction.create(
# {
# "shout": shout_dict["id"],
# "kind": ReactionKind.FOOTNOTE,
# "body": extracted_body,
# "range": str(body.index(fn + link) - len("<"))
# + ":"
# + str(body.index(extracted_body) + len("</a>")),
# }
# )
# newparts[i] = "<a href='#'></a>"
# else:
# newparts[i] = part
# i += 1
# return ("".join(newparts), placed)
# def place_tooltips(body):
# parts = body.split("&&&")
# lll = len(parts)
# newparts = list(parts)
# placed = False
# if lll & 1:
# if lll > 1:
# i = 1
# print("[extract] found %d tooltips" % (lll - 1))
# for part in parts[1:]:
# if i & 1:
# placed = True
# if 'a class="footnote-url" href=' in part:
# print("[extract] footnote: " + part)
# fn = 'a class="footnote-url" href="'
# link = part.split(fn, 1)[1].split('"', 1)[0]
# extracted_part = part.split(fn, 1)[0] + " " + part.split("/", 1)[-1]
# newparts[i] = (
# "<Tooltip"
# + (' link="' + link + '" ' if link else "")
# + ">"
# + extracted_part
# + "</Tooltip>"
# )
# else:
# newparts[i] = "<Tooltip>%s</Tooltip>" % part
# # print('[extract] ' + newparts[i])
# else:
# # print('[extract] ' + part[:10] + '..')
# newparts[i] = part
# i += 1
# return ("".join(newparts), placed)
IMG_REGEX = (
r"\!\[(.*?)\]\((data\:image\/(png|jpeg|jpg);base64\,((?:[A-Za-z\d+\/]{4})*(?:[A-Za-z\d+\/]{3}="
)
IMG_REGEX += r"|[A-Za-z\d+\/]{2}==)))\)"
parentDir = "/".join(os.getcwd().split("/")[:-1])
public = parentDir + "/discoursio-web/public"
cache = {}
# def reextract_images(body, oid):
# # change if you prefer regexp
# matches = list(re.finditer(IMG_REGEX, body, re.IGNORECASE | re.MULTILINE))[1:]
# i = 0
# for match in matches:
# print("[extract] image " + match.group(1))
# ext = match.group(3)
# name = oid + str(i)
# link = public + "/upload/image-" + name + "." + ext
# img = match.group(4)
# title = match.group(1) # NOTE: this is not the title
# if img not in cache:
# content = base64.b64decode(img + "==")
# print(str(len(img)) + " image bytes been written")
# open("../" + link, "wb").write(content)
# cache[img] = name
# i += 1
# else:
# print("[extract] image cached " + cache[img])
# body.replace(
# str(match), "![" + title + "](" + cdn + link + ")"
# ) # WARNING: this does not work
# return body
IMAGES = {
"data:image/png": "png",
"data:image/jpg": "jpg",
"data:image/jpeg": "jpg",
}
b64 = ";base64,"
di = "data:image"
def extract_media(entry):
"""normalized media extraction method"""
# media [ { title pic url body } ]}
kind = entry.get("type")
if not kind:
print(entry)
raise Exception("shout no layout")
media = []
for m in entry.get("media") or []:
# title
title = m.get("title", "").replace("\n", " ").replace("&nbsp;", " ")
artist = m.get("performer") or m.get("artist")
if artist:
title = artist + " - " + title
# pic
url = m.get("fileUrl") or m.get("url", "")
pic = ""
if m.get("thumborId"):
pic = cdn + "/unsafe/" + m["thumborId"]
# url
if not url:
if kind == "Image":
url = pic
elif "youtubeId" in m:
url = "https://youtube.com/?watch=" + m["youtubeId"]
elif "vimeoId" in m:
url = "https://vimeo.com/" + m["vimeoId"]
# body
body = m.get("body") or m.get("literatureBody") or ""
media.append({"url": url, "pic": pic, "title": title, "body": body})
return media
def prepare_html_body(entry):
# body modifications
body = ""
kind = entry.get("type")
addon = ""
if kind == "Video":
addon = ""
for m in entry.get("media") or []:
if "youtubeId" in m:
addon += '<iframe width="420" height="345" src="http://www.youtube.com/embed/'
addon += m["youtubeId"]
addon += '?autoplay=1" frameborder="0" allowfullscreen></iframe>\n'
elif "vimeoId" in m:
addon += '<iframe src="https://player.vimeo.com/video/'
addon += m["vimeoId"]
addon += ' width="420" height="345" frameborder="0" allow="autoplay; fullscreen"'
addon += " allowfullscreen></iframe>"
else:
print("[extract] media is not supported")
print(m)
body += addon
elif kind == "Music":
addon = ""
for m in entry.get("media") or []:
artist = m.get("performer")
trackname = ""
if artist:
trackname += artist + " - "
if "title" in m:
trackname += m.get("title", "")
addon += "<figure><figcaption>"
addon += trackname
addon += '</figcaption><audio controls src="'
addon += m.get("fileUrl", "")
addon += '"></audio></figure>'
body += addon
body = extract_html(entry)
# if body_orig: body += extract_md(html2text(body_orig), entry['_id'])
return body
def cleanup_html(body: str) -> str:
new_body = body
regex_remove = [
r"style=\"width:\s*\d+px;height:\s*\d+px;\"",
r"style=\"width:\s*\d+px;\"",
r"style=\"color: #000000;\"",
r"style=\"float: none;\"",
r"style=\"background: white;\"",
r"class=\"Apple-interchange-newline\"",
r"class=\"MsoNormalCxSpMiddle\"",
r"class=\"MsoNormal\"",
r"lang=\"EN-US\"",
r"id=\"docs-internal-guid-[\w-]+\"",
r"<p>\s*</p>",
r"<span></span>",
r"<i>\s*</i>",
r"<b>\s*</b>",
r"<h1>\s*</h1>",
r"<h2>\s*</h2>",
r"<h3>\s*</h3>",
r"<h4>\s*</h4>",
r"<div>\s*</div>",
]
regex_replace = {r"<br>\s*</p>": "</p>"}
changed = True
while changed:
# we need several iterations to clean nested tags this way
changed = False
new_body_iteration = new_body
for regex in regex_remove:
new_body = re.sub(regex, "", new_body)
for regex, replace in regex_replace.items():
new_body = re.sub(regex, replace, new_body)
if new_body_iteration != new_body:
changed = True
return new_body
def extract_html(entry, shout_id=None, cleanup=False):
body_orig = (entry.get("body") or "").replace(r"\(", "(").replace(r"\)", ")")
if cleanup:
# we do that before bs parsing to catch the invalid html
body_clean = cleanup_html(body_orig)
if body_clean != body_orig:
print(f"[migration] html cleaned for slug {entry.get('slug', None)}")
body_orig = body_clean
# if shout_id:
# extract_footnotes(body_orig, shout_id)
body_html = str(BeautifulSoup(body_orig, features="html.parser"))
if cleanup:
# we do that after bs parsing because it can add dummy tags
body_clean_html = cleanup_html(body_html)
if body_clean_html != body_html:
print(f"[migration] html cleaned after bs4 for slug {entry.get('slug', None)}")
body_html = body_clean_html
return body_html

File diff suppressed because it is too large

View File

@ -1,3 +0,0 @@
from .cli import main
main()

View File

@ -1,318 +0,0 @@
import argparse
import sys
from . import HTML2Text, __version__, config
# noinspection DuplicatedCode
def main() -> None:
baseurl = ""
class bcolors:
HEADER = "\033[95m"
OKBLUE = "\033[94m"
OKGREEN = "\033[92m"
WARNING = "\033[93m"
FAIL = "\033[91m"
ENDC = "\033[0m"
BOLD = "\033[1m"
UNDERLINE = "\033[4m"
p = argparse.ArgumentParser()
p.add_argument(
"--default-image-alt",
dest="default_image_alt",
default=config.DEFAULT_IMAGE_ALT,
help="The default alt string for images with missing ones",
)
p.add_argument(
"--pad-tables",
dest="pad_tables",
action="store_true",
default=config.PAD_TABLES,
help="pad the cells to equal column width in tables",
)
p.add_argument(
"--no-wrap-links",
dest="wrap_links",
action="store_false",
default=config.WRAP_LINKS,
help="don't wrap links during conversion",
)
p.add_argument(
"--wrap-list-items",
dest="wrap_list_items",
action="store_true",
default=config.WRAP_LIST_ITEMS,
help="wrap list items during conversion",
)
p.add_argument(
"--wrap-tables",
dest="wrap_tables",
action="store_true",
default=config.WRAP_TABLES,
help="wrap tables",
)
p.add_argument(
"--ignore-emphasis",
dest="ignore_emphasis",
action="store_true",
default=config.IGNORE_EMPHASIS,
help="don't include any formatting for emphasis",
)
p.add_argument(
"--reference-links",
dest="inline_links",
action="store_false",
default=config.INLINE_LINKS,
help="use reference style links instead of inline links",
)
p.add_argument(
"--ignore-links",
dest="ignore_links",
action="store_true",
default=config.IGNORE_ANCHORS,
help="don't include any formatting for links",
)
p.add_argument(
"--ignore-mailto-links",
action="store_true",
dest="ignore_mailto_links",
default=config.IGNORE_MAILTO_LINKS,
help="don't include mailto: links",
)
p.add_argument(
"--protect-links",
dest="protect_links",
action="store_true",
default=config.PROTECT_LINKS,
help="protect links from line breaks surrounding them with angle brackets",
)
p.add_argument(
"--ignore-images",
dest="ignore_images",
action="store_true",
default=config.IGNORE_IMAGES,
help="don't include any formatting for images",
)
p.add_argument(
"--images-as-html",
dest="images_as_html",
action="store_true",
default=config.IMAGES_AS_HTML,
help=(
"Always write image tags as raw html; preserves `height`, `width` and "
"`alt` if possible."
),
)
p.add_argument(
"--images-to-alt",
dest="images_to_alt",
action="store_true",
default=config.IMAGES_TO_ALT,
help="Discard image data, only keep alt text",
)
p.add_argument(
"--images-with-size",
dest="images_with_size",
action="store_true",
default=config.IMAGES_WITH_SIZE,
help=("Write image tags with height and width attrs as raw html to retain " "dimensions"),
)
p.add_argument(
"-g",
"--google-doc",
action="store_true",
dest="google_doc",
default=False,
help="convert an html-exported Google Document",
)
p.add_argument(
"-d",
"--dash-unordered-list",
action="store_true",
dest="ul_style_dash",
default=False,
help="use a dash rather than a star for unordered list items",
)
p.add_argument(
"-e",
"--asterisk-emphasis",
action="store_true",
dest="em_style_asterisk",
default=False,
help="use an asterisk rather than an underscore for emphasized text",
)
p.add_argument(
"-b",
"--body-width",
dest="body_width",
type=int,
default=config.BODY_WIDTH,
help="number of characters per output line, 0 for no wrap",
)
p.add_argument(
"-i",
"--google-list-indent",
dest="list_indent",
type=int,
default=config.GOOGLE_LIST_INDENT,
help="number of pixels Google indents nested lists",
)
p.add_argument(
"-s",
"--hide-strikethrough",
action="store_true",
dest="hide_strikethrough",
default=False,
help="hide strike-through text. only relevant when -g is " "specified as well",
)
p.add_argument(
"--escape-all",
action="store_true",
dest="escape_snob",
default=False,
help=(
"Escape all special characters. Output is less readable, but avoids "
"corner case formatting issues."
),
)
p.add_argument(
"--bypass-tables",
action="store_true",
dest="bypass_tables",
default=config.BYPASS_TABLES,
help="Format tables in HTML rather than Markdown syntax.",
)
p.add_argument(
"--ignore-tables",
action="store_true",
dest="ignore_tables",
default=config.IGNORE_TABLES,
help="Ignore table-related tags (table, th, td, tr) " "while keeping rows.",
)
p.add_argument(
"--single-line-break",
action="store_true",
dest="single_line_break",
default=config.SINGLE_LINE_BREAK,
help=(
"Use a single line break after a block element rather than two line "
"breaks. NOTE: Requires --body-width=0"
),
)
p.add_argument(
"--unicode-snob",
action="store_true",
dest="unicode_snob",
default=config.UNICODE_SNOB,
help="Use unicode throughout document",
)
p.add_argument(
"--no-automatic-links",
action="store_false",
dest="use_automatic_links",
default=config.USE_AUTOMATIC_LINKS,
help="Do not use automatic links wherever applicable",
)
p.add_argument(
"--no-skip-internal-links",
action="store_false",
dest="skip_internal_links",
default=config.SKIP_INTERNAL_LINKS,
help="Do not skip internal links",
)
p.add_argument(
"--links-after-para",
action="store_true",
dest="links_each_paragraph",
default=config.LINKS_EACH_PARAGRAPH,
help="Put links after each paragraph instead of document",
)
p.add_argument(
"--mark-code",
action="store_true",
dest="mark_code",
default=config.MARK_CODE,
help="Mark program code blocks with [code]...[/code]",
)
p.add_argument(
"--decode-errors",
dest="decode_errors",
default=config.DECODE_ERRORS,
help=(
"What to do in case of decode errors.'ignore', 'strict' and 'replace' are "
"acceptable values"
),
)
p.add_argument(
"--open-quote",
dest="open_quote",
default=config.OPEN_QUOTE,
help="The character used to open quotes",
)
p.add_argument(
"--close-quote",
dest="close_quote",
default=config.CLOSE_QUOTE,
help="The character used to close quotes",
)
p.add_argument("--version", action="version", version=".".join(map(str, __version__)))
p.add_argument("filename", nargs="?")
p.add_argument("encoding", nargs="?", default="utf-8")
args = p.parse_args()
if args.filename and args.filename != "-":
with open(args.filename, "rb") as fp:
data = fp.read()
else:
data = sys.stdin.buffer.read()
try:
html = data.decode(args.encoding, args.decode_errors)
except UnicodeDecodeError as err:
warning = bcolors.WARNING + "Warning:" + bcolors.ENDC
warning += " Use the " + bcolors.OKGREEN
warning += "--decode-errors=ignore" + bcolors.ENDC + " flag."
print(warning)
raise err
h = HTML2Text(baseurl=baseurl)
# handle options
if args.ul_style_dash:
h.ul_item_mark = "-"
if args.em_style_asterisk:
h.emphasis_mark = "*"
h.strong_mark = "__"
h.body_width = args.body_width
h.google_list_indent = args.list_indent
h.ignore_emphasis = args.ignore_emphasis
h.ignore_links = args.ignore_links
h.ignore_mailto_links = args.ignore_mailto_links
h.protect_links = args.protect_links
h.ignore_images = args.ignore_images
h.images_as_html = args.images_as_html
h.images_to_alt = args.images_to_alt
h.images_with_size = args.images_with_size
h.google_doc = args.google_doc
h.hide_strikethrough = args.hide_strikethrough
h.escape_snob = args.escape_snob
h.bypass_tables = args.bypass_tables
h.ignore_tables = args.ignore_tables
h.single_line_break = args.single_line_break
h.inline_links = args.inline_links
h.unicode_snob = args.unicode_snob
h.use_automatic_links = args.use_automatic_links
h.skip_internal_links = args.skip_internal_links
h.links_each_paragraph = args.links_each_paragraph
h.mark_code = args.mark_code
h.wrap_links = args.wrap_links
h.wrap_list_items = args.wrap_list_items
h.wrap_tables = args.wrap_tables
h.pad_tables = args.pad_tables
h.default_image_alt = args.default_image_alt
h.open_quote = args.open_quote
h.close_quote = args.close_quote
sys.stdout.write(h.handle(html))
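A minimal sketch of driving the same converter programmatically instead of through the CLI above (the sample HTML and the attribute choices are illustrative, not part of the vendored module):

from html2text import HTML2Text

h = HTML2Text(baseurl="")
h.body_width = 0       # same effect as --body-width=0
h.unicode_snob = True  # same effect as --unicode-snob
print(h.handle("<h1>Title</h1><p>Hello <b>world</b></p>"))
# expected shape of the output: "# Title" followed by "Hello **world**"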

@@ -1,164 +0,0 @@
import re
# Use Unicode characters instead of their ascii pseudo-replacements
UNICODE_SNOB = True
# Marker to use for marking tables for padding post processing
TABLE_MARKER_FOR_PAD = "special_marker_for_table_padding"
# Escape all special characters. Output is less readable, but avoids
# corner case formatting issues.
ESCAPE_SNOB = True
# Put the links after each paragraph instead of at the end.
LINKS_EACH_PARAGRAPH = False
# Wrap long lines at position. 0 for no wrapping.
BODY_WIDTH = 0
# Don't show internal links (href="#local-anchor") -- corresponding link
# targets won't be visible in the plain text file anyway.
SKIP_INTERNAL_LINKS = False
# Use inline, rather than reference, formatting for images and links
INLINE_LINKS = True
# Protect links from line breaks surrounding them with angle brackets (in
# addition to their square brackets)
PROTECT_LINKS = True
WRAP_LINKS = True
# Wrap list items.
WRAP_LIST_ITEMS = False
# Wrap tables
WRAP_TABLES = False
# Number of pixels Google indents nested lists
GOOGLE_LIST_INDENT = 36
# Values Google and others may use to indicate bold text
BOLD_TEXT_STYLE_VALUES = ("bold", "700", "800", "900")
IGNORE_ANCHORS = False
IGNORE_MAILTO_LINKS = False
IGNORE_IMAGES = False
IMAGES_AS_HTML = False
IMAGES_TO_ALT = False
IMAGES_WITH_SIZE = False
IGNORE_EMPHASIS = False
MARK_CODE = True
DECODE_ERRORS = "strict"
DEFAULT_IMAGE_ALT = ""
PAD_TABLES = True
# Convert links with same href and text to <href> format
# if they are absolute links
USE_AUTOMATIC_LINKS = True
# For checking space-only lines on line 771
RE_SPACE = re.compile(r"\s+")  # was r"\s\+", which matched whitespace followed by a literal "+", not space-only runs
RE_ORDERED_LIST_MATCHER = re.compile(r"\d+\.\s")
RE_UNORDERED_LIST_MATCHER = re.compile(r"[-\*\+]\s")
RE_MD_CHARS_MATCHER = re.compile(r"([\\\[\]\(\)])")
RE_MD_CHARS_MATCHER_ALL = re.compile(r"([`\*_{}\[\]\(\)#!])")
# to find links in the text
RE_LINK = re.compile(r"(\[.*?\] ?\(.*?\))|(\[.*?\]:.*?)")
# to find table separators
RE_TABLE = re.compile(r" \| ")
RE_MD_DOT_MATCHER = re.compile(
r"""
^ # start of line
(\s*\d+) # optional whitespace and a number
(\.) # dot
(?=\s) # lookahead assert whitespace
""",
re.MULTILINE | re.VERBOSE,
)
RE_MD_PLUS_MATCHER = re.compile(
r"""
^
(\s*)
(\+)
(?=\s)
""",
flags=re.MULTILINE | re.VERBOSE,
)
RE_MD_DASH_MATCHER = re.compile(
r"""
^
(\s*)
(-)
(?=\s|\-) # followed by whitespace (bullet list, or spaced out hr)
# or another dash (header or hr)
""",
flags=re.MULTILINE | re.VERBOSE,
)
RE_SLASH_CHARS = r"\`*_{}[]()#+-.!"
RE_MD_BACKSLASH_MATCHER = re.compile(
r"""
(\\) # match one slash
(?=[%s]) # followed by a char that requires escaping
"""
% re.escape(RE_SLASH_CHARS),
flags=re.VERBOSE,
)
UNIFIABLE = {
"rsquo": "'",
"lsquo": "'",
"rdquo": '"',
"ldquo": '"',
"copy": "(C)",
"mdash": "--",
"nbsp": " ",
"rarr": "->",
"larr": "<-",
"middot": "*",
"ndash": "-",
"oelig": "oe",
"aelig": "ae",
"agrave": "a",
"aacute": "a",
"acirc": "a",
"atilde": "a",
"auml": "a",
"aring": "a",
"egrave": "e",
"eacute": "e",
"ecirc": "e",
"euml": "e",
"igrave": "i",
"iacute": "i",
"icirc": "i",
"iuml": "i",
"ograve": "o",
"oacute": "o",
"ocirc": "o",
"otilde": "o",
"ouml": "o",
"ugrave": "u",
"uacute": "u",
"ucirc": "u",
"uuml": "u",
"lrm": "",
"rlm": "",
}
# Format tables in HTML rather than Markdown syntax
BYPASS_TABLES = False
# Ignore table-related tags (table, th, td, tr) while keeping rows
IGNORE_TABLES = False
# Use a single line break after a block element rather than two line breaks.
# NOTE: Requires body width setting to be 0.
SINGLE_LINE_BREAK = False
# Use double quotation marks when converting the <q> tag.
OPEN_QUOTE = '"'
CLOSE_QUOTE = '"'
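These module-level constants are the converter's defaults: HTML2Text reads them at construction time, and the CLI flags above override them per run. A short sketch (names as in this file; the override is illustrative):

from html2text import HTML2Text, config

h = HTML2Text()
# body_width is seeded from config.BODY_WIDTH (0 here, i.e. no wrapping)
assert h.body_width == config.BODY_WIDTH
h.escape_snob = False  # per-instance override of the ESCAPE_SNOB default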

@@ -1,18 +0,0 @@
from typing import Dict, Optional
class AnchorElement:
__slots__ = ["attrs", "count", "outcount"]
def __init__(self, attrs: Dict[str, Optional[str]], count: int, outcount: int):
self.attrs = attrs
self.count = count
self.outcount = outcount
class ListElement:
__slots__ = ["name", "num"]
def __init__(self, name: str, num: int):
self.name = name
self.num = num

@@ -1,3 +0,0 @@
class OutCallback:
def __call__(self, s: str) -> None:
...
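Any callable with this signature satisfies the protocol. A sketch of a conforming sink (not part of the original module) that buffers converted chunks instead of printing them:

class ListSink:
    def __init__(self) -> None:
        self.chunks = []  # collected str chunks
    def __call__(self, s: str) -> None:
        self.chunks.append(s)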

@@ -1,282 +0,0 @@
import html.entities
from typing import Dict, List, Optional
from . import config
unifiable_n = {
html.entities.name2codepoint[k]: v for k, v in config.UNIFIABLE.items() if k != "nbsp"
}
def hn(tag: str) -> int:
if tag[0] == "h" and len(tag) == 2:
n = tag[1]
if "0" < n <= "9":
return int(n)
return 0
def dumb_property_dict(style: str) -> Dict[str, str]:
"""
:returns: A hash of css attributes
"""
return {
x.strip().lower(): y.strip().lower()
for x, y in [z.split(":", 1) for z in style.split(";") if ":" in z]
}
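# Doctest-style example (illustrative): keys and values are stripped and lower-cased
#   >>> dumb_property_dict("Color: RED; font-weight: Bold")
#   {'color': 'red', 'font-weight': 'bold'}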
def dumb_css_parser(data: str) -> Dict[str, Dict[str, str]]:
"""
:type data: str
:returns: A hash of css selectors, each of which contains a hash of
css attributes.
:rtype: dict
"""
# remove @import sentences
data += ";"
importIndex = data.find("@import")
while importIndex != -1:
data = data[0:importIndex] + data[data.find(";", importIndex) + 1 :]
importIndex = data.find("@import")
# parse the css
pairs = [x.split("{") for x in data.split("}") if "{" in x.strip()]
try:
elements = {a.strip(): dumb_property_dict(b) for a, b in pairs}
except ValueError:
elements = {} # not that important
return elements
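# Doctest-style example (illustrative): @import statements are removed first
#   >>> dumb_css_parser("@import url(x.css); .hl { color: red }")
#   {'.hl': {'color': 'red'}}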
def element_style(
attrs: Dict[str, Optional[str]],
style_def: Dict[str, Dict[str, str]],
parent_style: Dict[str, str],
) -> Dict[str, str]:
"""
:type attrs: dict
:type style_def: dict
:type parent_style: dict
:returns: A hash of the 'final' style attributes of the element
:rtype: dict
"""
style = parent_style.copy()
attrs_class = attrs.get("class")
if attrs_class:
for css_class in attrs_class.split():
css_style = style_def.get("." + css_class, {})
style.update(css_style)
attrs_style = attrs.get("style")
if attrs_style:
immediate_style = dumb_property_dict(attrs_style)
style.update(immediate_style)
return style
def google_list_style(style: Dict[str, str]) -> str:
"""
Finds out whether this is an ordered or unordered list
:type style: dict
:rtype: str
"""
if "list-style-type" in style:
list_style = style["list-style-type"]
if list_style in ["disc", "circle", "square", "none"]:
return "ul"
return "ol"
def google_has_height(style: Dict[str, str]) -> bool:
"""
Check if the style of the element has the 'height' attribute
explicitly defined
:type style: dict
:rtype: bool
"""
return "height" in style
def google_text_emphasis(style: Dict[str, str]) -> List[str]:
"""
:type style: dict
:returns: A list of all emphasis modifiers of the element
:rtype: list
"""
emphasis = []
if "text-decoration" in style:
emphasis.append(style["text-decoration"])
if "font-style" in style:
emphasis.append(style["font-style"])
if "font-weight" in style:
emphasis.append(style["font-weight"])
return emphasis
def google_fixed_width_font(style: Dict[str, str]) -> bool:
"""
Check if the css of the current element defines a fixed width font
:type style: dict
:rtype: bool
"""
font_family = ""
if "font-family" in style:
font_family = style["font-family"]
return "courier new" == font_family or "consolas" == font_family
def list_numbering_start(attrs: Dict[str, Optional[str]]) -> int:
"""
Extract numbering from list element attributes
:type attrs: dict
:rtype: int
"""
attrs_start = attrs.get("start")
if attrs_start:
try:
return int(attrs_start) - 1
except ValueError:
pass
return 0
def skipwrap(para: str, wrap_links: bool, wrap_list_items: bool, wrap_tables: bool) -> bool:
# If it appears to contain a link
# don't wrap
if not wrap_links and config.RE_LINK.search(para):
return True
# If the text begins with four spaces or one tab, it's a code block;
# don't wrap
if para[0:4] == "    " or para[0:1] == "\t":
return True
# If the text begins with only two "--", possibly preceded by
# whitespace, that's an emdash; so wrap.
stripped = para.lstrip()
if stripped[0:2] == "--" and len(stripped) > 2 and stripped[2] != "-":
return False
# I'm not sure what this is for; I thought it was to detect lists,
# but there's a <br>-inside-<span> case in one of the tests that
# also depends upon it.
if stripped[0:1] in ("-", "*") and not stripped[0:2] == "**":
return not wrap_list_items
# If text contains a pipe character it is likely a table
if not wrap_tables and config.RE_TABLE.search(para):
return True
# If the text begins with a single -, *, or +, followed by a space,
# or an integer, followed by a ., followed by a space (in either
# case optionally proceeded by whitespace), it's a list; don't wrap.
return bool(
config.RE_ORDERED_LIST_MATCHER.match(stripped)
or config.RE_UNORDERED_LIST_MATCHER.match(stripped)
)
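# Illustrative calls (signature: para, wrap_links, wrap_list_items, wrap_tables):
#   >>> skipwrap("    indented code", True, False, False)
#   True    (code blocks are never wrapped)
#   >>> skipwrap("-- leading em-dash", True, False, False)
#   False   (em-dash lines still wrap)
#   >>> skipwrap("1. ordered item", True, False, False)
#   True    (list items skip wrapping)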
def escape_md(text: str) -> str:
"""
Escapes markdown-sensitive characters within other markdown
constructs.
"""
return config.RE_MD_CHARS_MATCHER.sub(r"\\\1", text)
def escape_md_section(text: str, snob: bool = False) -> str:
"""
Escapes markdown-sensitive characters across whole document sections.
"""
text = config.RE_MD_BACKSLASH_MATCHER.sub(r"\\\1", text)
if snob:
text = config.RE_MD_CHARS_MATCHER_ALL.sub(r"\\\1", text)
text = config.RE_MD_DOT_MATCHER.sub(r"\1\\\2", text)
text = config.RE_MD_PLUS_MATCHER.sub(r"\1\\\2", text)
text = config.RE_MD_DASH_MATCHER.sub(r"\1\\\2", text)
return text
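# Doctest-style example (illustrative): markers that would start a Markdown list
# are escaped so converted text round-trips cleanly
#   >>> escape_md_section("- not a list\n1. not ordered")
#   '\\- not a list\n1\\. not ordered'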
def reformat_table(lines: List[str], right_margin: int) -> List[str]:
"""
Given the lines of a table
pads the cells and returns the new lines
"""
# find the maximum width of the columns
max_width = [len(x.rstrip()) + right_margin for x in lines[0].split("|")]
max_cols = len(max_width)
for line in lines:
cols = [x.rstrip() for x in line.split("|")]
num_cols = len(cols)
# don't drop any data if colspan attributes result in unequal lengths
if num_cols < max_cols:
cols += [""] * (max_cols - num_cols)
elif max_cols < num_cols:
max_width += [len(x) + right_margin for x in cols[-(num_cols - max_cols) :]]
max_cols = num_cols
max_width = [max(len(x) + right_margin, old_len) for x, old_len in zip(cols, max_width)]
# reformat
new_lines = []
for line in lines:
cols = [x.rstrip() for x in line.split("|")]
if set(line.strip()) == set("-|"):
filler = "-"
new_cols = [
x.rstrip() + (filler * (M - len(x.rstrip()))) for x, M in zip(cols, max_width)
]
new_lines.append("|-" + "|".join(new_cols) + "|")
else:
filler = " "
new_cols = [
x.rstrip() + (filler * (M - len(x.rstrip()))) for x, M in zip(cols, max_width)
]
new_lines.append("| " + "|".join(new_cols) + "|")
return new_lines
def pad_tables_in_text(text: str, right_margin: int = 1) -> str:
"""
Provide padding for tables in the text
"""
lines = text.split("\n")
table_buffer = [] # type: List[str]
table_started = False
new_lines = []
for line in lines:
# Toggle table started
if config.TABLE_MARKER_FOR_PAD in line:
table_started = not table_started
if not table_started:
table = reformat_table(table_buffer, right_margin)
new_lines.extend(table)
table_buffer = []
new_lines.append("")
continue
# Process lines
if table_started:
table_buffer.append(line)
else:
new_lines.append(line)
return "\n".join(new_lines)

@@ -1,196 +0,0 @@
from datetime import datetime, timezone
from dateutil.parser import parse as date_parse
from base.orm import local_session
from migration.html2text import html2text
from orm.reaction import Reaction, ReactionKind
from orm.shout import Shout, ShoutReactionsFollower
from orm.topic import TopicFollower
from orm.user import User
ts = datetime.now(tz=timezone.utc)
def auto_followers(session, topics, reaction_dict):
# creating shout's reactions following for reaction author
following1 = (
session.query(ShoutReactionsFollower)
.where(ShoutReactionsFollower.follower == reaction_dict["createdBy"])
.filter(ShoutReactionsFollower.shout == reaction_dict["shout"])
.first()
)
if not following1:
following1 = ShoutReactionsFollower.create(
follower=reaction_dict["createdBy"], shout=reaction_dict["shout"], auto=True
)
session.add(following1)
# creating topics followings for reaction author
for t in topics:
tf = (
session.query(TopicFollower)
.where(TopicFollower.follower == reaction_dict["createdBy"])
.filter(TopicFollower.topic == t["id"])
.first()
)
if not tf:
topic_following = TopicFollower.create(
follower=reaction_dict["createdBy"], topic=t["id"], auto=True
)
session.add(topic_following)
def migrate_ratings(session, entry, reaction_dict):
for comment_rating_old in entry.get("ratings", []):
rater = session.query(User).filter(User.oid == comment_rating_old["createdBy"]).first()
re_reaction_dict = {
"shout": reaction_dict["shout"],
"replyTo": reaction_dict["id"],
"kind": ReactionKind.LIKE if comment_rating_old["value"] > 0 else ReactionKind.DISLIKE,
"createdBy": rater.id if rater else 1,
}
cts = comment_rating_old.get("createdAt")
if cts:
re_reaction_dict["createdAt"] = date_parse(cts)
try:
# creating reaction from old rating
rr = Reaction.create(**re_reaction_dict)
following2 = (
session.query(ShoutReactionsFollower)
.where(ShoutReactionsFollower.follower == re_reaction_dict["createdBy"])
.filter(ShoutReactionsFollower.shout == rr.shout)
.first()
)
if not following2:
following2 = ShoutReactionsFollower.create(
follower=re_reaction_dict["createdBy"], shout=rr.shout, auto=True
)
session.add(following2)
session.add(rr)
except Exception as e:
print("[migration] comment rating error: %r" % re_reaction_dict)
raise e
session.commit()
async def migrate(entry, storage):
"""
{
"_id": "hdtwS8fSyFLxXCgSC",
"body": "<p>",
"contentItem": "mnK8KsJHPRi8DrybQ",
"createdBy": "bMFPuyNg6qAD2mhXe",
"thread": "01/",
"createdAt": "2016-04-19 04:33:53+00:00",
"ratings": [
{ "createdBy": "AqmRukvRiExNpAe8C", "value": 1 },
{ "createdBy": "YdE76Wth3yqymKEu5", "value": 1 }
],
"rating": 2,
"updatedAt": "2020-05-27 19:22:57.091000+00:00",
"updatedBy": "0"
}
->
type Reaction {
id: Int!
shout: Shout!
createdAt: DateTime!
createdBy: User!
updatedAt: DateTime
deletedAt: DateTime
deletedBy: User
range: String # full / 0:2340
kind: ReactionKind!
body: String
replyTo: Reaction
stat: Stat
old_id: String
old_thread: String
}
"""
old_ts = entry.get("createdAt")
reaction_dict = {
"createdAt": (ts if not old_ts else date_parse(old_ts)),
"body": html2text(entry.get("body", "")),
"oid": entry["_id"],
}
shout_oid = entry.get("contentItem")
if shout_oid not in storage["shouts"]["by_oid"]:
if len(storage["shouts"]["by_oid"]) > 0:
return shout_oid
else:
    print("[migration] no shouts migrated yet")
    raise Exception("no shouts migrated yet")
else:
stage = "started"
reaction = None
with local_session() as session:
author = session.query(User).filter(User.oid == entry["createdBy"]).first()
old_shout = storage["shouts"]["by_oid"].get(shout_oid)
if not old_shout:
raise Exception("no old shout in storage")
else:
stage = "author and old id found"
try:
shout = session.query(Shout).where(Shout.slug == old_shout["slug"]).one()
if shout:
reaction_dict["shout"] = shout.id
reaction_dict["createdBy"] = author.id if author else 1
reaction_dict["kind"] = ReactionKind.COMMENT
# creating reaction from old comment
reaction = Reaction.create(**reaction_dict)
session.add(reaction)
# session.commit()
stage = "new reaction commited"
reaction_dict = reaction.dict()
topics = [t.dict() for t in shout.topics]
auto_followers(session, topics, reaction_dict)
migrate_ratings(session, entry, reaction_dict)
return reaction
except Exception as e:
    print(e)
    print(reaction)
    raise Exception(stage) from e
def migrate_2stage(old_comment, idmap):
if old_comment.get("body"):
# the id map may be keyed by either the old "oid" or the raw "_id"
new_id = idmap.get(old_comment.get("oid")) or idmap.get(old_comment.get("_id"))
if new_id:
new_replyto_id = None
old_replyto_id = old_comment.get("replyTo")
if old_replyto_id:
new_replyto_id = int(idmap.get(old_replyto_id, "0"))
with local_session() as session:
comment = session.query(Reaction).where(Reaction.id == new_id).first()
try:
if new_replyto_id:
new_reply = (
session.query(Reaction).where(Reaction.id == new_replyto_id).first()
)
if not new_reply:
print(new_replyto_id)
raise Exception("cannot find reply by id!")
comment.replyTo = new_reply.id
session.add(comment)
srf = (
session.query(ShoutReactionsFollower)
.where(ShoutReactionsFollower.shout == comment.shout)
.filter(ShoutReactionsFollower.follower == comment.createdBy)
.first()
)
if not srf:
srf = ShoutReactionsFollower.create(
shout=comment.shout, follower=comment.createdBy, auto=True
)
session.add(srf)
session.commit()
except Exception as e:
    raise Exception("cannot find a comment by oldid") from e

@@ -1,399 +0,0 @@
import json
import re
from datetime import datetime, timezone
from dateutil.parser import parse as date_parse
from sqlalchemy.exc import IntegrityError
from transliterate import translit
from base.orm import local_session
from migration.extract import extract_html, extract_media
from orm.reaction import Reaction, ReactionKind
from orm.shout import Shout, ShoutReactionsFollower, ShoutTopic
from orm.topic import Topic, TopicFollower
from orm.user import User
from services.stat.viewed import ViewedStorage
OLD_DATE = "2016-03-05 22:22:00.350000"
ts = datetime.now(tz=timezone.utc)
type2layout = {
"Article": "article",
"Literature": "literature",
"Music": "music",
"Video": "video",
"Image": "image",
}
anondict = {"slug": "anonymous", "id": 1, "name": "Аноним"}
discours = {"slug": "discours", "id": 2, "name": "Дискурс"}
def get_shout_slug(entry):
slug = entry.get("slug", "")
if not slug:
for friend in entry.get("friendlySlugs", []):
slug = friend.get("slug", "")
if slug:
break
slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
return slug
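# Illustrative: non-alphanumeric runs collapse to a single dash, with a fallback
# to the first non-empty friendlySlug
#   >>> get_shout_slug({"slug": "hello, world"})
#   'hello-world'
#   >>> get_shout_slug({"slug": "", "friendlySlugs": [{"slug": "old-url"}]})
#   'old-url'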
def create_author_from_app(app):
user = None
userdata = None
# check if email is used
if app["email"]:
with local_session() as session:
user = session.query(User).where(User.email == app["email"]).first()
if not user:
# print('[migration] app %r' % app)
name = app.get("name")
if name:
slug = translit(name, "ru", reversed=True).lower()
slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
print("[migration] created slug %s" % slug)
# check if slug is used
if slug:
user = session.query(User).where(User.slug == slug).first()
# get slug from email
if user:
slug = app["email"].split("@")[0]
user = session.query(User).where(User.slug == slug).first()
# one more try
if user:
slug += "-author"
user = session.query(User).where(User.slug == slug).first()
# create user with application data
if not user:
userdata = {
"username": app["email"],
"email": app["email"],
"name": app.get("name", ""),
"emailConfirmed": False,
"slug": slug,
"createdAt": ts,
"lastSeen": ts,
}
# print('[migration] userdata %r' % userdata)
user = User.create(**userdata)
session.add(user)
session.commit()
userdata["id"] = user.id
userdata = user.dict()
return userdata
else:
raise Exception("app is not ok", app)
async def create_shout(shout_dict):
s = Shout.create(**shout_dict)
author = s.authors[0]
with local_session() as session:
srf = (
session.query(ShoutReactionsFollower)
.where(ShoutReactionsFollower.shout == s.id)
.filter(ShoutReactionsFollower.follower == author.id)
.first()
)
if not srf:
srf = ShoutReactionsFollower.create(shout=s.id, follower=author.id, auto=True)
session.add(srf)
session.commit()
return s
async def get_user(entry, storage):
app = entry.get("application")
userdata = None
user_oid = None
if app:
userdata = create_author_from_app(app)
else:
user_oid = entry.get("createdBy")
if user_oid == "0":
userdata = discours
elif user_oid:
userdata = storage["users"]["by_oid"].get(user_oid)
if not userdata:
print("no userdata by oid, anonymous")
userdata = anondict
print(app)
# cleanup slug
if userdata:
slug = userdata.get("slug", "")
if slug:
slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
userdata["slug"] = slug
else:
userdata = anondict
user = await process_user(userdata, storage, user_oid)
return user, user_oid
async def migrate(entry, storage):
author, user_oid = await get_user(entry, storage)
r = {
"layout": type2layout[entry["type"]],
"title": entry["title"],
"authors": [
author,
],
"slug": get_shout_slug(entry),
"cover": (
"https://images.discours.io/unsafe/" + entry["thumborId"]
if entry.get("thumborId")
else entry.get("image", {}).get("url")
),
"visibility": "public" if entry.get("published") else "community",
"publishedAt": date_parse(entry.get("publishedAt")) if entry.get("published") else None,
"deletedAt": date_parse(entry.get("deletedAt")) if entry.get("deletedAt") else None,
"createdAt": date_parse(entry.get("createdAt", OLD_DATE)),
"updatedAt": date_parse(entry["updatedAt"]) if "updatedAt" in entry else ts,
"createdBy": author.id,
"topics": await add_topics_follower(entry, storage, author),
"body": extract_html(entry, cleanup=True),
}
# main topic patch
r["mainTopic"] = r["topics"][0]
# published author auto-confirm
if entry.get("published"):
with local_session() as session:
# update user.emailConfirmed if published
author.emailConfirmed = True
session.add(author)
session.commit()
# media
media = extract_media(entry)
r["media"] = json.dumps(media, ensure_ascii=True) if media else None
# ----------------------------------- copy
shout_dict = r.copy()
del shout_dict["topics"]
try:
# save shout to db
shout_dict["oid"] = entry.get("_id", "")
shout = await create_shout(shout_dict)
except IntegrityError as e:
print("[migration] create_shout integrity error", e)
shout = await resolve_create_shout(shout_dict)
except Exception as e:
raise Exception(e)
# update data
shout_dict = shout.dict()
shout_dict["authors"] = [
author.dict(),
]
# shout topics aftermath
shout_dict["topics"] = await topics_aftermath(r, storage)
# content_item ratings to reactions
await content_ratings_to_reactions(entry, shout_dict["slug"])
# shout views
await ViewedStorage.increment(
shout_dict["slug"], amount=entry.get("views", 1), viewer="old-discours"
)
# del shout_dict['ratings']
storage["shouts"]["by_oid"][entry["_id"]] = shout_dict
storage["shouts"]["by_slug"][shout_dict["slug"]] = shout_dict
return shout_dict
async def add_topics_follower(entry, storage, user):
topics = set()
category = entry.get("category")
topics_by_oid = storage["topics"]["by_oid"]
oids = [
category,
] + entry.get("tags", [])
for toid in oids:
tslug = topics_by_oid.get(toid, {}).get("slug")
if tslug:
topics.add(tslug)
ttt = list(topics)
# add author as TopicFollower
with local_session() as session:
for tpcslug in topics:
try:
tpc = session.query(Topic).where(Topic.slug == tpcslug).first()
if tpc:
tf = (
session.query(TopicFollower)
.where(TopicFollower.follower == user.id)
.filter(TopicFollower.topic == tpc.id)
.first()
)
if not tf:
tf = TopicFollower.create(topic=tpc.id, follower=user.id, auto=True)
session.add(tf)
session.commit()
except IntegrityError:
print("[migration.shout] hidden by topic " + tpc.slug)
# main topic
maintopic = storage["replacements"].get(topics_by_oid.get(category, {}).get("slug"))
if maintopic in ttt:
ttt.remove(maintopic)
ttt.insert(0, maintopic)
return ttt
async def process_user(userdata, storage, oid):
with local_session() as session:
uid = userdata.get("id")  # may be missing; falls back to anonymous below
if not uid:
    print(userdata)
    print("[migration] userdata has no id field, falling back to @anonymous")
    userdata = anondict
    uid = 1
user = session.query(User).filter(User.id == uid).first()
if not user:
try:
slug = userdata["slug"].lower().strip()
slug = re.sub("[^0-9a-zA-Z]+", "-", slug)
userdata["slug"] = slug
user = User.create(**userdata)
session.add(user)
session.commit()
except IntegrityError:
print(f"[migration] user creating with slug {userdata['slug']}")
print("[migration] from userdata")
print(userdata)
raise Exception("[migration] cannot create user in content_items.get_user()")
if user.id == 946:
print("[migration] ***************** ALPINA")
if user.id == 2:
print("[migration] +++++++++++++++++ DISCOURS")
userdata["id"] = user.id
userdata["createdAt"] = user.createdAt
storage["users"]["by_slug"][userdata["slug"]] = userdata
storage["users"]["by_oid"][oid] = userdata
if not user:
raise Exception("could not get a user")
return user
async def resolve_create_shout(shout_dict):
with local_session() as session:
s = session.query(Shout).filter(Shout.slug == shout_dict["slug"]).first()
bump = False
if s:
if s.createdAt != shout_dict["createdAt"]:
# create new with different slug
shout_dict["slug"] += "-" + shout_dict["layout"]
try:
await create_shout(shout_dict)
except IntegrityError as e:
print(e)
bump = True
else:
# update old
for key in shout_dict:
if key in s.__dict__:
if s.__dict__[key] != shout_dict[key]:
print("[migration] shout already exists, but differs in %s" % key)
bump = True
else:
print("[migration] shout already exists, but lacks %s" % key)
bump = True
if bump:
s.update(shout_dict)
else:
print("[migration] something went wrong with shout: \n%r" % shout_dict)
raise Exception("")
session.commit()
return s
async def topics_aftermath(entry, storage):
r = []
for tpc in filter(lambda x: bool(x), entry["topics"]):
oldslug = tpc
newslug = storage["replacements"].get(oldslug, oldslug)
if newslug:
with local_session() as session:
shout = session.query(Shout).where(Shout.slug == entry["slug"]).first()
new_topic = session.query(Topic).where(Topic.slug == newslug).first()
shout_topic_old = (
session.query(ShoutTopic)
.join(Shout)
.join(Topic)
.filter(Shout.slug == entry["slug"])
.filter(Topic.slug == oldslug)
.first()
)
if shout_topic_old:
shout_topic_old.update({"topic": new_topic.id})
else:
shout_topic_new = (
session.query(ShoutTopic)
.join(Shout)
.join(Topic)
.filter(Shout.slug == entry["slug"])
.filter(Topic.slug == newslug)
.first()
)
if not shout_topic_new:
try:
ShoutTopic.create(**{"shout": shout.id, "topic": new_topic.id})
except Exception:
print("[migration] shout topic error: " + newslug)
session.commit()
if newslug not in r:
r.append(newslug)
else:
print("[migration] ignored topic slug: \n%r" % tpc["slug"])
# raise Exception
return r
async def content_ratings_to_reactions(entry, slug):
try:
with local_session() as session:
for content_rating in entry.get("ratings", []):
rater = (
session.query(User).filter(User.oid == content_rating["createdBy"]).first()
) or User.default_user
shout = session.query(Shout).where(Shout.slug == slug).first()
cts = content_rating.get("createdAt")
reaction_dict = {
"createdAt": date_parse(cts) if cts else None,
"kind": ReactionKind.LIKE
if content_rating["value"] > 0
else ReactionKind.DISLIKE,
"createdBy": rater.id,
"shout": shout.id,
}
reaction = (
session.query(Reaction)
.filter(Reaction.shout == reaction_dict["shout"])
.filter(Reaction.createdBy == reaction_dict["createdBy"])
.filter(Reaction.kind == reaction_dict["kind"])
.first()
)
if reaction:
k = ReactionKind.AGREE if content_rating["value"] > 0 else ReactionKind.DISAGREE
reaction_dict["kind"] = k
reaction.update(reaction_dict)
session.add(reaction)
else:
rea = Reaction.create(**reaction_dict)
session.add(rea)
# shout_dict['ratings'].append(reaction_dict)
session.commit()
except Exception:
print("[migration] content_item.ratings error: \n%r" % content_rating)

@@ -1,35 +0,0 @@
# from base.orm import local_session
# from migration.extract import extract_md
# from migration.html2text import html2text
# from orm.reaction import Reaction, ReactionKind
# def migrate(entry, storage):
# post_oid = entry["contentItem"]
# print(post_oid)
# shout_dict = storage["shouts"]["by_oid"].get(post_oid)
# if shout_dict:
# print(shout_dict["body"])
# remark = {
# "shout": shout_dict["id"],
# "body": extract_md(html2text(entry["body"]), shout_dict),
# "kind": ReactionKind.REMARK,
# }
#
# if entry.get("textBefore"):
# remark["range"] = (
# str(shout_dict["body"].index(entry["textBefore"] or ""))
# + ":"
# + str(
# shout_dict["body"].index(entry["textAfter"] or "")
# + len(entry["textAfter"] or "")
# )
# )
#
# with local_session() as session:
# rmrk = Reaction.create(**remark)
# session.commit()
# del rmrk["_sa_instance_state"]
# return rmrk
# return

@@ -1,828 +0,0 @@
{
"207": "207",
"1990-e": "90s",
"2000-e": "2000s",
"90-e": "90s",
"Georgia": "georgia",
"Japan": "japan",
"Sweden": "sweden",
"abstraktsiya": "abstract",
"absurdism": "absurdism",
"acclimatization": "acclimatisation",
"activism": "activism",
"adolf-gitler": "adolf-hitler",
"afrika": "africa",
"agata-kristi": "agatha-christie",
"agressivnoe-povedenie": "agression",
"agressiya": "agression",
"aktsii": "actions",
"aktsionizm": "actionism",
"alber-kamyu": "albert-kamus",
"albomy": "albums",
"aleksandr-griboedov": "aleksander-griboedov",
"aleksandr-pushkin": "aleksander-pushkin",
"aleksandr-solzhenitsyn": "aleksander-solzhenitsyn",
"aleksandr-vvedenskiy": "aleksander-vvedensky",
"aleksey-navalnyy": "alexey-navalny",
"alfavit": "alphabet",
"alkogol": "alcohol",
"alternativa": "alternative",
"alternative": "alternative",
"alternativnaya-istoriya": "alternative-history",
"amerika": "america",
"anarhizm": "anarchism",
"anatoliy-mariengof": "anatoly-mariengof",
"ancient-russia": "ancient-russia",
"andegraund": "underground",
"andrey-platonov": "andrey-platonov",
"andrey-rodionov": "andrey-rodionov",
"andrey-tarkovskiy": "andrey-tarkovsky",
"angliyskie-istorii": "english-stories",
"angliyskiy-yazyk": "english-langugae",
"ango": "ango",
"animation": "animation",
"animatsiya": "animation",
"anime": "anime",
"anri-volohonskiy": "anri-volohonsky",
"antifashizm": "anti-faschism",
"antiquity": "antiquity",
"antiutopiya": "dystopia",
"anton-dolin": "anton-dolin",
"antropology": "antropology",
"antropotsen": "antropocenus",
"architecture": "architecture",
"arheologiya": "archeology",
"arhetipy": "archetypes",
"arhiv": "archive",
"aristokraty": "aristocracy",
"aristotel": "aristotle",
"arktika": "arctic",
"armiya": "army",
"armiya-1": "army",
"art": "art",
"art-is": "art-is",
"artists": "artists",
"ateizm": "atheism",
"audio-poetry": "audio-poetry",
"audiopoeziya": "audio-poetry",
"audiospektakl": "audio-spectacles",
"auktsyon": "auktsyon",
"avangard": "avantgarde",
"avtofikshn": "autofiction",
"avtorskaya-pesnya": "bardsongs",
"azbuka-immigratsii": "immigration-basics",
"aziatskiy-kinematograf": "asian-cinema",
"b-movie": "b-movie",
"bannye-chteniya": "sauna-reading",
"bardsongs": "bardsongs",
"bdsm": "bdsm",
"beecake": "beecake",
"belarus": "belarus",
"belgiya": "belgium",
"bertold-breht": "berttold-brecht",
"bezumie": "madness",
"biography": "biography",
"biologiya": "biology",
"bipolyarnoe-rasstroystvo": "bipolar-disorder",
"bitniki": "beatnics",
"biznes": "business",
"blizhniy-vostok": "middle-east",
"blizost": "closeness",
"blocked-in-russia": "blocked-in-russia",
"blokada": "blockade",
"bob-dilan": "bob-dylan",
"bog": "god",
"bol": "pain",
"bolotnoe-delo": "bolotnaya-case",
"books": "books",
"boris-eltsin": "boris-eltsin",
"boris-godunov": "boris-godunov",
"boris-grebenschikov": "boris-grebenschikov",
"boris-nemtsov": "boris-nemtsov",
"boris-pasternak": "boris-pasternak",
"brak": "marriage",
"bret-iston-ellis": "bret-iston-ellis",
"buddizm": "buddhism",
"bullying": "bullying",
"bunt": "riot",
"burning-man": "burning-man",
"bytie": "being",
"byurokratiya": "bureaucracy",
"capitalism": "capitalism",
"censored-in-russia": "censored-in-russia",
"ch-rno-beloe": "black-and-white",
"ch-rnyy-yumor": "black-humour",
"chapters": "chapters",
"charity": "charity",
"chayldfri": "childfree",
"chechenskaya-voyna": "chechen-war",
"chechnya": "chechnya",
"chelovek": "male",
"chernobyl": "chernobyl",
"chernyy-yumor": "black-humour",
"children": "children",
"china": "china",
"chinovniki": "bureaucracy",
"chukotka": "chukotka",
"chuma": "plague",
"church": "church",
"cinema": "cinema",
"city": "city",
"civil-position": "civil-position",
"clips": "clips",
"collage": "collage",
"comics": "comics",
"conspiracy-theory": "conspiracy-theory",
"contemporary-art": "contemporary-art",
"contemporary-poetry": "poetry",
"contemporary-prose": "prose",
"coronavirus": "coronavirus",
"corruption": "corruption",
"creative-writing-school": "creative-writing-school",
"crime": "crime",
"criticism": "criticism",
"critiques": "reviews",
"culture": "culture",
"dadaizm": "dadaism",
"daniel-defo": "daniel-defoe",
"daniil-harms": "daniil-kharms",
"dante-aligeri": "dante-alighieri",
"darkveyv": "darkwave",
"death": "death",
"debaty": "debats",
"delo-seti": "seti-case",
"democracy": "democracy",
"demografiya": "demographics",
"demonstrations": "demonstrations",
"depression": "depression",
"derevnya": "village",
"derrida": "derrida",
"design": "design",
"detskie-doma": "orphanages",
"detstvo": "childhood",
"devid-linch": "david-linch",
"devyanostye": "90s",
"dialog": "dialogue",
"digital": "digital",
"digital-art": "digital-art",
"dinozavry": "dinosaurs",
"directing": "directing",
"diskurs": "discours",
"diskurs-1": "discourse",
"diskurs-analiz": "discourse-analytics",
"dissidenty": "dissidents",
"diy": "diy",
"dmitriy-donskoy": "dmitriy-donskoy",
"dmitriy-prigov": "dmitriy-prigov",
"dnevnik-1": "dairy",
"dnevniki": "dairies",
"documentary": "documentary",
"dokumentalnaya-poema": "documentary-poem",
"dokumentalnaya-poeziya": "documentary-poetry",
"dokumenty": "doсuments",
"domashnee-nasilie": "home-terror",
"donald-tramp": "donald-trump",
"donbass": "donbass",
"donbass-diary": "donbass-diary",
"donorstvo": "donation",
"dozhd": "rain",
"drama": "drama",
"dramaturgy": "dramaturgy",
"drawing": "drawing",
"drevo-zhizni": "tree-of-life",
"drugs": "drugs",
"duh": "spirit",
"dzhaz": "jazz",
"dzhek-keruak": "jack-keruak",
"dzhim-morrison": "jim-morrison",
"dzhordzh-romero": "george-romero",
"dzhordzho-agamben": "giorgio-agamben",
"ecology": "ecology",
"economics": "economics",
"eda": "food",
"editorial-statements": "editorial-statements",
"eduard-limonov": "eduard-limonov",
"education": "education",
"egor-letov": "egor-letov",
"ekspat": "expat",
"eksperiment": "experiments",
"eksperimentalnaya-muzyka": "experimental-music",
"ekspressionizm": "expressionism",
"ekstremizm": "extremism",
"ekzistentsializm-1": "existentialism",
"ekzistentsiya": "existence",
"elections": "elections",
"electronic": "electronics",
"electronics": "electronics",
"elena-glinskaya": "elena-glinskaya",
"elena-guro": "elena-guro",
"elizaveta-mnatsakanova": "elizaveta-mnatsakanova",
"embient": "ambient",
"emigration": "emigration",
"emil-dyurkgeym": "emile-durkheim",
"emotsii": "emotions",
"empiric": "empiric",
"epidemiya": "pandemic",
"erich-von-neff": "erich-von-neff",
"erotika": "erotics",
"essay": "essay",
"estetika": "aestetics",
"etika": "ethics",
"etno": "ethno",
"etnos": "ethnics",
"everyday-life": "everyday-life",
"evgeniy-onegin": "eugene-onegin",
"evolyutsiya": "evolution",
"exhibitions": "exhibitions",
"experience": "experiences",
"experimental": "experimental",
"experimental-music": "experimental-music",
"explanation": "explanation",
"faktcheking": "fact-checking",
"falsifikatsii": "falsifications",
"family": "family",
"fanfiki": "fan-fiction",
"fantastika": "sci-fi",
"fatalizm": "fatalism",
"fedor-dostoevskiy": "fedor-dostoevsky",
"fedor-ioannovich": "fedor-ioannovich",
"feleton": "feuilleton",
"feminism": "feminism",
"fenomenologiya": "phenomenology",
"fentezi": "fantasy",
"festival": "festival",
"festival-territoriya": "festival-territory",
"folk": "folk",
"folklor": "folklore",
"fotoreportazh": "photoreports",
"france": "france",
"frants-kafka": "franz-kafka",
"frederik-begbeder": "frederick-begbeder",
"freedom": "freedom",
"friendship": "friendship",
"fsb": "fsb",
"futbol": "footbool",
"future": "future",
"futuristy": "futurists",
"futurizm": "futurism",
"galereya": "gallery",
"galereya-anna-nova": "gallery-anna-nova",
"gdr": "gdr",
"gender": "gender",
"gendernyy-diskurs": "gender",
"gennadiy-aygi": "gennadiy-aygi",
"gerhard-rihter": "gerhard-rihter",
"germaniya": "germany",
"germenevtika": "hermeneutics",
"geroi": "heroes",
"girls": "girls",
"gkchp": "gkchp",
"glitch": "glitch",
"globalizatsiya": "globalisation",
"gollivud": "hollywood",
"gonzo": "gonzo",
"gore-ot-uma": "woe-from-wit",
"graffiti": "graffiti",
"graficheskaya-novella": "graphic-novell",
"graphics": "graphics",
"gravyura": "engraving",
"grazhdanskaya-oborona": "grazhdanskaya-oborona",
"gretsiya": "greece",
"griby": "mushrooms",
"gruziya-2": "georgia",
"gulag": "gulag",
"han-batyy": "khan-batyy",
"hayku": "haiku",
"health": "health",
"himiya": "chemistry",
"hip-hop": "hip-hop",
"history": "history",
"history-of-russia": "history-of-russia",
"holokost": "holocaust",
"horeografiya": "choreography",
"horror": "horror",
"hospis": "hospice",
"hristianstvo": "christianity",
"humans": "humans",
"humour": "humour",
"ideologiya": "ideology",
"idm": "idm",
"igil": "isis",
"igor-pomerantsev": "igor-pomerantsev",
"igra": "game",
"igra-prestolov": "game-of-throne",
"igry": "games",
"iisus-hristos": "jesus-christ",
"illness": "illness",
"illustration-history": "illustration-history",
"illustrations": "illustrations",
"imazhinizm": "imagism",
"immanuil-kant": "immanuel-kant",
"impressionizm": "impressionism",
"improvizatsiya": "improvisation",
"indi": "indie",
"individualizm": "individualism",
"infografika": "infographics",
"informatsiya": "information",
"ingmar-bergman": "ingmar-bergman",
"inklyuziya": "inclusion",
"installyatsiya": "installation",
"internet": "internet",
"interview": "interview",
"invalidnost": "disability",
"investigations": "investigations",
"iosif-brodskiy": "joseph-brodsky",
"iosif-stalin": "joseph-stalin",
"iskusstvennyy-intellekt": "artificial-intelligence",
"islam": "islam",
"istoriya-moskvy": "moscow-history",
"istoriya-nauki": "history-of-sceince",
"istoriya-o-medsestre": "nurse-story",
"istoriya-teatra": "theatre-history",
"italiya": "italy",
"italyanskiy-yazyk": "italian-language",
"iudaika": "judaica",
"ivan-groznyy": "ivan-grozny",
"ivan-iii-gorbatyy": "ivan-iii-gorbaty",
"ivan-kalita": "ivan-kalita",
"ivan-krylov": "ivan-krylov",
"izobreteniya": "inventions",
"izrail-1": "israel",
"jazz": "jazz",
"john-lennon": "john-lennon",
"journalism": "journalism",
"justice": "justice",
"k-pop": "k-pop",
"kalligrafiya": "calligraphy",
"karikatura": "caricatures",
"kartochki-rubinshteyna": "rubinstein-cards",
"katrin-nenasheva": "katrin-nenasheva",
"kavarga": "kavarga",
"kavkaz": "caucasus",
"kazan": "kazan",
"kiberbezopasnost": "cybersecurity",
"kinoklub": "cinema-club",
"kinokritika": "film-criticism",
"kirill-serebrennikov": "kirill-serebrennikov",
"kladbische": "cemetery",
"klassika": "classic",
"kollektivnoe-bessoznatelnoe": "сollective-unconscious",
"komediya": "comedy",
"kommunikatsii": "communications",
"kommunizm": "communism",
"kommuny": "communes",
"kompyuternye-igry": "computer-games",
"konets-vesny": "end-of-spring",
"konservatizm": "conservatism",
"kontrkultura": "counter-culture",
"kontseptualizm": "conceptualism",
"korotkometrazhka": "cinema-shorts",
"kosmos": "cosmos",
"kraudfanding": "crowdfunding",
"kriptovalyuty": "cryptocurrencies",
"krizis": "crisis",
"krov": "blood",
"krym": "crimea",
"kulturologiya": "culturology",
"kulty": "cults",
"kurdistan": "kurdistan",
"kurt-kobeyn": "kurt-cobain",
"kurt-vonnegut": "kurt-vonnegut",
"kvir": "queer",
"laboratoriya": "lab",
"language": "languages",
"lars-fon-trier": "lars-fon-trier",
"laws": "laws",
"lectures": "lectures",
"leto": "summer",
"lev-tolstoy": "leo-tolstoy",
"lgbt": "lgbt",
"liberalizm": "liberalism",
"libertarianstvo": "libertarianism",
"life": "life",
"likbez": "likbez",
"lingvistika": "linguistics",
"lirika": "lirics",
"literary-studies": "literary-studies",
"literature": "literature",
"literaturnyykaver": "literature-cover",
"lo-fi": "lo-fi",
"lomonosov": "lomonosov",
"love": "love",
"luzha-goluboy-krovi": "luzha-goluboy-krovi",
"lyudvig-vitgenshteyn": "ludwig-wittgenstein",
"lzhedmitriy": "false-dmitry",
"lzhenauka": "pseudoscience",
"magiya": "magic",
"maks-veber": "max-weber",
"manifests": "manifests",
"manipulyatsii-soznaniem": "mind-manipulation",
"marina-abramovich": "marina-abramovich",
"marketing": "marketing",
"marksizm": "marxism",
"marsel-dyushan": "marchel-duchamp",
"marsel-prust": "marcel-proust",
"martin-haydegger": "martin-hidegger",
"matematika": "maths",
"mayakovskiy": "vladimir-mayakovsky",
"media": "media",
"medicine": "medicine",
"memuary": "memoirs",
"menedzhment": "management",
"menty": "police",
"merab-mamardashvili": "merab-mamardashvili",
"mest": "revenge",
"metamodernizm": "metamodern",
"metavselennaya": "metaverse",
"metro": "metro",
"mifologiya": "mythology",
"mify": "myth",
"mihael-haneke": "michael-haneke",
"mihail-baryshnikov": "mihail-baryshnikov",
"mihail-bulgakov": "mihail-bulgakov",
"mikrotonalnaya-muzyka": "mikrotone-muzyka",
"minimalizm": "minimalism",
"minkult-privet": "minkult-privet",
"mir": "world",
"mirovozzrenie": "mindsets",
"mishel-fuko": "michel-foucault",
"mistika": "mystics",
"mitropolit-makariy": "mitropolit-makariy",
"mlm": "mlm",
"mobilizatsiya": "mobilisation",
"moda": "fashion",
"modernizm": "modernism",
"mokyumentari": "mockumentary",
"molodezh": "youth",
"moloko-plus": "moloko-plus",
"money": "money",
"monologs": "monologues",
"monstratsiya": "monstration",
"moralnaya-otvetstvennost": "moral-responsibility",
"more": "sea",
"moscow": "moscow",
"moshennichestvo": "frauds",
"moskovskiy-romanticheskiy-kontseptualizm": "moscow-romantic-conceptualism",
"moskovskoe-delo": "moscow-case",
"movies": "movies",
"mozg": "brain",
"multiplikatsiya": "animation",
"music": "music",
"musulmanstvo": "islam",
"muzei": "museum",
"muzey": "museum",
"muzhchiny": "man",
"myshlenie": "thinking",
"nagornyy-karabah": "nagorno-karabakh",
"nasilie-1": "violence",
"natsionalizm": "nationalism",
"natsionalnaya-ideya": "national-idea",
"natsizm": "nazism",
"natyurmort": "nature-morte",
"nauchpop": "pop-science",
"nbp": "nbp",
"nenavist": "hate",
"neofitsialnaya-literatura": "unofficial-literature",
"neoklassika": "neoclassic",
"neprozrachnye-smysly": "hidden-meanings",
"neravenstvo": "inequality",
"net-voyne": "no-war",
"new-year": "new-year",
"neyronauka": "neuro-science",
"neyroseti": "neural-networks",
"niu-vshe": "hse",
"nizhniy-novgorod": "nizhny-novgorod",
"nko": "nonprofits",
"nlo": "ufo",
"nobelevskaya-premiya": "nobel-prize",
"noize-mc": "noize-mc",
"nonkonformizm": "nonconformism",
"notforall": "notforall",
"novaya-drama": "new-drama",
"novosti": "news",
"noyz": "noise",
"nuar": "noir",
"oberiu": "oberiu",
"ocherk": "etudes",
"ochevidnyy-nuar": "ochevidnyy-nuar",
"odinochestvo": "loneliness",
"odna-kniga-odna-istoriya": "one-book-one-story",
"okrainy": "outskirts",
"omon": "swat",
"opinions": "opinions",
"oppozitsiya": "opposition",
"orhan-pamuk": "orhan-pamuk",
"ornitologiya": "ornitology",
"osen": "autumn",
"osip-mandelshtam": "osip-mandelshtam",
"oskar-uayld": "oscar-wilde",
"osoznanie": "awareness",
"otnosheniya": "relationship",
"pablo-pikasso": "pablo-picasso",
"painting": "painting",
"paintings": "painting",
"pamyat": "memory",
"pandemiya": "pandemic",
"parizh": "paris",
"patriotizm": "patriotism",
"patsifizm": "pacifism",
"paul-tselan": "paul-tselan",
"per-burd": "pierre-bourdieu",
"perezhivaniya": "worries",
"performance": "performance",
"peyzazh": "landscape",
"philology": "philology",
"philosophy": "philosophy",
"photo": "photography",
"photography": "photography",
"photoprojects": "photoprojects",
"plakaty": "posters",
"plastilin": "plasticine",
"plays": "plays",
"podrostki": "teenagers",
"poema": "poem",
"poems": "poems",
"poeticheskaya-proza": "poetic-prose",
"poetry": "poetry",
"poetry-of-squares": "poetry-of-squares",
"poetry-slam": "poetry-slam",
"pokoy": "peace",
"police": "police",
"politicheskoe-fentezi": "political-fantasy",
"politics": "politics",
"politzaklyuchennye": "political-prisoners",
"polsha": "poland",
"pomosch": "help",
"pop-art": "pop-art",
"pop-culture": "pop-culture",
"populyarnaya-psihologiya": "popular-psychology",
"pornografiya": "pornography",
"portret": "portrait",
"poslovitsy": "proverbs",
"post-pank": "post-punk",
"post-rok": "post-rock",
"postmodernism": "postmodernism",
"povest": "novells",
"povsednevnost": "everyday-life",
"power": "power",
"pravo": "right",
"pravoslavie": "orthodox",
"pravozaschitniki": "human-rights-activism",
"prazdnik": "holidays",
"predatelstvo": "betrayal",
"predprinimatelstvo": "entrepreneurship",
"premera": "premier",
"premiya-oskar": "oscar-prize",
"pribaltika-1": "baltic",
"priroda": "nature",
"prison": "prison",
"pritcha": "parable",
"privatnost": "privacy",
"progress": "progress",
"projects": "projects",
"prokrastinatsiya": "procrastination",
"propaganda": "propaganda",
"proschenie": "forgiveness",
"prose": "prose",
"proshloe": "past",
"prostitutsiya": "prostitution",
"prosveschenie": "enlightenment",
"protests": "protests",
"psalmy": "psalms",
"psihoanaliz": "psychoanalysis",
"psihodeliki": "psychodelics",
"pskov": "pskov",
"psychiatry": "psychiatry",
"psychology": "psychology",
"ptitsy": "birds",
"punk": "punk",
"r-b": "rnb",
"rasizm": "racism",
"realizm": "realism",
"redaktura": "editing",
"refleksiya": "reflection",
"reggi": "reggae",
"religion": "religion",
"rene-zhirar": "rene-girard",
"renesanss": "renessance",
"renovatsiya": "renovation",
"rep": "rap",
"reportage": "reportage",
"reportazh-1": "reportage",
"repressions": "repressions",
"research": "research",
"retroveyv": "retrowave",
"review": "review",
"revolution": "revolution",
"rezo-gabriadze": "rezo-gabriadze",
"risunki": "painting",
"roboty": "robots",
"rock": "rock",
"roditeli": "parents",
"romantizm": "romantism",
"romany": "novell",
"ronald-reygan": "ronald-reygan",
"roskomnadzor": "roskomnadzor",
"rossiyskoe-kino": "russian-cinema",
"rouling": "rowling",
"rozhava": "rojava",
"rpts": "rpts",
"rus-na-grani-sryva": "rus-na-grani-sryva",
"russia": "russia",
"russian-language": "russian-language",
"russian-literature": "russian-literature",
"russkaya-toska": "russian-toska",
"russkiy-mir": "russkiy-mir",
"salo": "lard",
"salvador-dali": "salvador-dali",
"samoidentifikatsiya": "self-identity",
"samoopredelenie": "self-definition",
"sankt-peterburg": "saint-petersburg",
"sasha-skochilenko": "sasha-skochilenko",
"satira": "satiric",
"saund-art": "sound-art",
"schaste": "happiness",
"school": "school",
"science": "science",
"sculpture": "sculpture",
"second-world-war": "second-world-war",
"sekond-hend": "second-hand",
"seksprosvet": "sex-education",
"seksualizirovannoe-nasilie": "sexualized-violence",
"seksualnoe-nasilie": "sexualized-violence",
"sekty": "sects",
"semi": "semi",
"semiotics": "semiotics",
"serbiya": "serbia",
"sergey-bodrov-mladshiy": "sergey-bodrov-junior",
"sergey-solov-v": "sergey-solovyov",
"serialy": "series",
"sever": "north",
"severnaya-koreya": "north-korea",
"sex": "sex",
"shotlandiya": "scotland",
"shugeyz": "shoegaze",
"siloviki": "siloviki",
"simeon-bekbulatovich": "simeon-bekbulatovich",
"simvolizm": "simbolism",
"siriya": "siria",
"skulptura": "sculpture",
"slavoy-zhizhek": "slavoj-zizek",
"smert-1": "death",
"smysl": "meaning",
"sny": "dreams",
"sobytiya": "events",
"social": "society",
"society": "society",
"sociology": "sociology",
"sofya-paleolog": "sofya-paleolog",
"sofya-vitovtovna": "sofya-vitovtovna",
"soobschestva": "communities",
"soprotivlenie": "resistence",
"sotsializm": "socialism",
"sotsialnaya-filosofiya": "social-philosophy",
"sotsiologiya-1": "sociology",
"sotsseti": "social-networks",
"sotvorenie-tretego-rima": "third-rome",
"sovremennost": "modernity",
"spaces": "spaces",
"spektakl": "spectacles",
"spetseffekty": "special-fx",
"spetsoperatsiya": "special-operation",
"spetssluzhby": "special-services",
"sport": "sport",
"srednevekove": "middle-age",
"state": "state",
"statistika": "statistics",
"stendap": "stand-up",
"stihi": "poetry",
"stoitsizm": "stoicism",
"stories": "stories",
"stoyanie-na-ugre": "stoyanie-na-ugre",
"strah": "fear",
"street-art": "street-art",
"stsenarii": "scenarios",
"sud": "court",
"summary": "summary",
"supergeroi": "superheroes",
"svetlana-aleksievich": "svetlana-aleksievich",
"svobodu-ivanu-golunovu": "free-ivan-golunov",
"syurrealizm": "surrealism",
"tales": "tales",
"tanets": "dance",
"tataro-mongolskoe-igo": "mongol-tatar-yoke",
"tatuirovki": "tattoo",
"technology": "technology",
"televidenie": "television",
"telo": "body",
"telo-kak-iskusstvo": "body-as-art",
"terrorizm": "terrorism",
"tests": "tests",
"text": "texts",
"the-beatles": "the-beatles",
"theater": "theater",
"theory": "theory",
"tokio": "tokio",
"torture": "torture",
"totalitarizm": "totalitarism",
"traditions": "traditions",
"tragicomedy": "tragicomedy",
"transgendernost": "transgender",
"translation": "translation",
"transport": "transport",
"travel": "travel",
"travma": "trauma",
"trendy": "trends",
"tretiy-reyh": "third-reich",
"triller": "thriller",
"tsar": "central-african-republic",
"tsar-edip": "oedipus",
"tsarevich-dmitriy": "tsarevich-dmitry",
"tsennosti": "values",
"tsenzura": "censorship",
"tseremonii": "ceremonies",
"turizm": "tourism",
"tvorchestvo": "creativity",
"ugnetennyy-zhilischnyy-klass": "oppressed-housing-class",
"uilyam-shekspir": "william-shakespeare",
"ukraina-2": "ukraine",
"ukraine": "ukraine",
"university": "university",
"urban-studies": "urban-studies",
"uroki-literatury": "literature-lessons",
"usa": "usa",
"ussr": "ussr",
"utopiya": "utopia",
"utrata": "loss",
"valter-benyamin": "valter-benyamin",
"varlam-shalamov": "varlam-shalamov",
"vasiliy-ii-temnyy": "basil-ii-temnyy",
"vasiliy-iii": "basil-iii",
"vdnh": "vdnh",
"vechnost": "ethernety",
"velikobritaniya": "great-britain",
"velimir-hlebnikov": "velimir-hlebnikov",
"velkom-tu-greyt-britn": "welcome-to-great-britain",
"venedikt-erofeev": "venedikt-erofeev",
"venetsiya": "veneece",
"vengriya": "hungary",
"verlibry": "free-verse",
"veschi": "things",
"vessels": "vessels",
"veterany": "veterans",
"video": "video",
"videoart": "videoart",
"videoklip": "clips",
"videopoeziya": "video-poetry",
"viktor-astafev": "viktor-astafev",
"viktor-pelevin": "viktor-pelevin",
"vilgelm-rayh": "wilhelm-reich",
"vinzavod": "vinzavod",
"violence": "violence",
"visual-culture": "visual-culture",
"vizualnaya-poeziya": "visual-poetry",
"vladimir-lenin": "vladimir-lenin",
"vladimir-mayakovskiy": "vladimir-mayakovsky",
"vladimir-nabokov": "vladimir-nabokov",
"vladimir-putin": "vladimir-putin",
"vladimir-sorokin": "vladimir-sorokin",
"vladimir-voynovich": "vladimir-voynovich",
"vnutrenniy-opyt": "inner-expirience",
"volga": "volga",
"volontery": "volonteurs",
"vong-karvay": "wong-karwai",
"vospominaniya": "memories",
"vostok": "east",
"voyna-na-ukraine": "war-in-ukraine",
"voyna-v-ukraine": "war-in-ukraine",
"vremya": "time",
"vudi-allen": "woody-allen",
"vynuzhdennye-otnosheniya": "forced-relationship",
"war": "war",
"war-in-ukraine-images": "war-in-ukrahine-images",
"women": "women",
"work": "work",
"writers": "writers",
"xx-century": "xx-century",
"yakob-yordans": "yakob-yordans",
"yan-vermeer": "yan-vermeer",
"yanka-dyagileva": "yanka-dyagileva",
"yaponskaya-literatura": "japan-literature",
"yazychestvo": "paganism",
"youth": "youth",
"yozef-rot": "yozef-rot",
"yurgen-habermas": "jorgen-habermas",
"za-liniey-mannergeyma": "behind-mannerheim-line",
"zabota": "care",
"zahar-prilepin": "zahar-prilepin",
"zakonodatelstvo": "laws",
"zakony-mira": "world-laws",
"zametki": "notes",
"zhelanie": "wish",
"zhivotnye": "animals",
"zhoze-saramago": "jose-saramago",
"zigmund-freyd": "sigmund-freud",
"zolotaya-orda": "golden-horde",
"zombi": "zombie",
"zombi-simpsony": "zombie-simpsons"
}
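This map is the storage["replacements"] table consulted by add_topics_follower and topics_aftermath above: legacy transliterated slugs on the left, canonical slugs on the right. A sketch of applying it (the file path is an assumption):

import json

with open("migration/tables/replacements.json", encoding="utf-8") as f:
    replacements = json.load(f)

def canonical_slug(old_slug: str) -> str:
    # unknown slugs pass through unchanged, mirroring .get(oldslug, oldslug) above
    return replacements.get(old_slug, old_slug)

assert canonical_slug("avangard") == "avantgarde"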

@@ -1,31 +0,0 @@
from base.orm import local_session
from migration.html2text import html2text
from orm import Topic
def migrate(entry):
body_orig = entry.get("description", "").replace("&nbsp;", " ")
topic_dict = {
"slug": entry["slug"],
"oid": entry["_id"],
"title": entry["title"].replace("&nbsp;", " "),
"body": html2text(body_orig),
}
with local_session() as session:
slug = topic_dict["slug"]
topic = session.query(Topic).filter(Topic.slug == slug).first() or Topic.create(
**topic_dict
)
if not topic:
    raise Exception("no topic!")
if len(topic.title) > len(topic_dict["title"]):
    Topic.update(topic, {"title": topic_dict["title"]})
if len(topic.body) < len(topic_dict["body"]):
    Topic.update(topic, {"body": topic_dict["body"]})
session.commit()
# print(topic.__dict__)
rt = topic.__dict__.copy()
del rt["_sa_instance_state"]
return rt

@@ -1,156 +0,0 @@
import re
from bs4 import BeautifulSoup
from dateutil.parser import parse
from sqlalchemy.exc import IntegrityError
from base.orm import local_session
from orm.user import AuthorFollower, User, UserRating
def migrate(entry): # noqa: C901
if "subscribedTo" in entry:
del entry["subscribedTo"]
email = entry["emails"][0]["address"]
user_dict = {
"oid": entry["_id"],
"roles": [],
"ratings": [],
"username": email,
"email": email,
"createdAt": parse(entry["createdAt"]),
"emailConfirmed": ("@discours.io" in email) or bool(entry["emails"][0]["verified"]),
"muted": False, # amnesty
"links": [],
"name": "anonymous",
"password": entry["services"]["password"].get("bcrypt"),
}
if "updatedAt" in entry:
user_dict["updatedAt"] = parse(entry["updatedAt"])
if "wasOnlineAt" in entry:
user_dict["lastSeen"] = parse(entry["wasOnlineAt"])
if entry.get("profile"):
# slug
slug = entry["profile"].get("path").lower()
slug = re.sub("[^0-9a-zA-Z]+", "-", slug).strip()
user_dict["slug"] = slug
bio = (
(entry.get("profile", {"bio": ""}).get("bio") or "")
.replace(r"\(", "(")
.replace(r"\)", ")")
)
bio_text = BeautifulSoup(bio, features="lxml").text
if len(bio_text) > 120:
user_dict["about"] = bio_text
else:
user_dict["bio"] = bio_text
# userpic
try:
user_dict["userpic"] = (
"https://images.discours.io/unsafe/" + entry["profile"]["thumborId"]
)
except KeyError:
try:
user_dict["userpic"] = entry["profile"]["image"]["url"]
except KeyError:
user_dict["userpic"] = ""
# name
fn = entry["profile"].get("firstName", "")
ln = entry["profile"].get("lastName", "")
name = fn if fn else ""
name = (name + " " + ln) if ln else name
if not name:
name = slug if slug else "anonymous"
name = entry["profile"]["path"].lower().strip().replace(" ", "-") if len(name) < 2 else name
user_dict["name"] = name
# links
fb = entry["profile"].get("facebook", False)
if fb:
user_dict["links"].append(fb)
vk = entry["profile"].get("vkontakte", False)
if vk:
user_dict["links"].append(vk)
tr = entry["profile"].get("twitter", False)
if tr:
user_dict["links"].append(tr)
ws = entry["profile"].get("website", False)
if ws:
user_dict["links"].append(ws)
# some checks
if not user_dict["slug"] and len(user_dict["links"]) > 0:
user_dict["slug"] = user_dict["links"][0].split("/")[-1]
user_dict["slug"] = user_dict.get("slug", user_dict["email"].split("@")[0])
oid = user_dict["oid"]
user_dict["slug"] = user_dict["slug"].lower().strip().replace(" ", "-")
try:
user = User.create(**user_dict.copy())
except IntegrityError:
print("[migration] cannot create user " + user_dict["slug"])
with local_session() as session:
old_user = session.query(User).filter(User.slug == user_dict["slug"]).first()
old_user.oid = oid
old_user.password = user_dict["password"]
session.commit()
user = old_user
if not user:
print("[migration] ERROR: cannot find user " + user_dict["slug"])
raise Exception
user_dict["id"] = user.id
return user_dict
def post_migrate():
old_discours_dict = {
"slug": "old-discours",
"username": "old-discours",
"email": "old@discours.io",
"name": "Просмотры на старой версии сайта",
}
with local_session() as session:
old_discours_user = User.create(**old_discours_dict)
session.add(old_discours_user)
session.commit()
def migrate_2stage(entry, id_map):
ce = 0
for rating_entry in entry.get("ratings", []):
rater_oid = rating_entry["createdBy"]
rater_slug = id_map.get(rater_oid)
if not rater_slug:
ce += 1
# print(rating_entry)
continue
oid = entry["_id"]
author_slug = id_map.get(oid)
with local_session() as session:
try:
rater = session.query(User).where(User.slug == rater_slug).one()
user = session.query(User).where(User.slug == author_slug).one()
user_rating_dict = {
"value": rating_entry["value"],
"rater": rater.id,
"user": user.id,
}
user_rating = UserRating.create(**user_rating_dict)
if user_rating_dict["value"] > 0:
af = AuthorFollower.create(author=user.id, follower=rater.id, auto=True)
session.add(af)
session.add(user_rating)
session.commit()
except IntegrityError:
print("[migration] cannot rate " + author_slug + "`s by " + rater_slug)
except Exception as e:
print(e)
return ce

View File

@ -1,10 +0,0 @@
from datetime import datetime
from json import JSONEncoder
class DateTimeEncoder(JSONEncoder):
def default(self, z):
if isinstance(z, datetime):
return str(z)
else:
return super().default(z)
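A one-line usage sketch: the encoder lets json.dumps serialize datetime values that the default encoder rejects.

import json
from datetime import datetime

print(json.dumps({"exported_at": datetime.now()}, cls=DateTimeEncoder))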

View File

@ -1,9 +1,14 @@
log_format custom '$remote_addr - $remote_user [$time_local] "$request" '
'origin=$http_origin status=$status '
'"$http_referer" "$http_user_agent"';
{{ $proxy_settings := "proxy_http_version 1.1; proxy_set_header Upgrade $http_upgrade; proxy_set_header Connection $http_connection; proxy_set_header Host $http_host; proxy_set_header X-Request-Start $msec;" }}
{{ $gzip_settings := "gzip on; gzip_min_length 1100; gzip_buffers 4 32k; gzip_types text/css text/javascript text/xml text/plain text/x-component application/javascript application/x-javascript application/json application/xml application/rss+xml font/truetype application/x-font-ttf font/opentype application/vnd.ms-fontobject image/svg+xml; gzip_vary on; gzip_comp_level 6;" }}
{{ $cors_headers_options := "if ($request_method = 'OPTIONS') { add_header 'Access-Control-Allow-Origin' '$allow_origin' always; add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS'; add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization'; add_header 'Access-Control-Allow-Credentials' 'true'; add_header 'Access-Control-Max-Age' 1728000; add_header 'Content-Type' 'text/plain; charset=utf-8'; add_header 'Content-Length' 0; return 204; }" }}
{{ $cors_headers_post := "if ($request_method = 'POST') { add_header 'Access-Control-Allow-Origin' '$allow_origin' always; add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS' always; add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization' always; add_header 'Access-Control-Expose-Headers' 'Content-Length,Content-Range' always; add_header 'Access-Control-Allow-Credentials' 'true' always; }" }}
{{ $cors_headers_get := "if ($request_method = 'GET') { add_header 'Access-Control-Allow-Origin' '$allow_origin' always; add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS' always; add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization' always; add_header 'Access-Control-Expose-Headers' 'Content-Length,Content-Range' always; add_header 'Access-Control-Allow-Credentials' 'true' always; }" }}
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m max_size=1g
inactive=60m use_temp_path=off;
limit_conn_zone $binary_remote_addr zone=addr:10m;
limit_req_zone $binary_remote_addr zone=req_zone:10m rate=20r/s;
{{ range $port_map := .PROXY_PORT_MAP | split " " }}
{{ $port_map_list := $port_map | split ":" }}
@ -16,14 +21,15 @@ server {
listen [::]:{{ $listen_port }};
listen {{ $listen_port }};
server_name {{ $.NOSSL_SERVER_NAME }};
access_log /var/log/nginx/{{ $.APP }}-access.log;
access_log /var/log/nginx/{{ $.APP }}-access.log custom;
error_log /var/log/nginx/{{ $.APP }}-error.log;
client_max_body_size 100M;
{{ else if eq $scheme "https" }}
listen [::]:{{ $listen_port }} ssl http2;
listen {{ $listen_port }} ssl http2;
server_name {{ $.NOSSL_SERVER_NAME }};
access_log /var/log/nginx/{{ $.APP }}-access.log;
access_log /var/log/nginx/{{ $.APP }}-access.log custom;
error_log /var/log/nginx/{{ $.APP }}-error.log;
ssl_certificate {{ $.APP_SSL_PATH }}/server.crt;
ssl_certificate_key {{ $.APP_SSL_PATH }}/server.key;
@ -31,21 +37,37 @@ server {
ssl_prefer_server_ciphers off;
keepalive_timeout 70;
keepalive_requests 500;
proxy_read_timeout 3600;
limit_conn addr 10000;
client_max_body_size 100M;
{{ end }}
location / {
proxy_pass http://{{ $.APP }}-{{ $upstream_port }};
proxy_pass http://{{ $.APP }}-{{ $upstream_port }};
{{ $proxy_settings }}
{{ $gzip_settings }}
{{ $cors_headers_options }}
{{ $cors_headers_post }}
{{ $cors_headers_get }}
proxy_cache my_cache;
proxy_cache_revalidate on;
proxy_cache_min_uses 2;
proxy_cache_use_stale error timeout updating http_500 http_502 http_503 http_504;
proxy_cache_background_update on;
proxy_cache_lock on;
# Raised connection and request limits (weakens DDoS protection)
limit_req zone=req_zone burst=10 nodelay;
}
location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
expires 30d; # This means that the client can cache these resources for 30 days.
add_header Cache-Control "public, no-transform";
proxy_pass http://{{ $.APP }}-{{ $upstream_port }};
expires 30d;
add_header Cache-Control "public, no-transform";
}
location ~* \.(mp3|wav|ogg|flac|aac|aif|webm)$ {
proxy_pass http://{{ $.APP }}-{{ $upstream_port }};
}
@ -73,7 +95,6 @@ server {
internal;
}
# include /home/dokku/gateway/nginx.conf.d/*.conf;
include {{ $.DOKKU_ROOT }}/{{ $.APP }}/nginx.conf.d/*.conf;
}
{{ end }}

View File

@ -1,36 +0,0 @@
from base.orm import Base, engine
from orm.community import Community
from orm.notification import Notification
from orm.rbac import Operation, Permission, Resource, Role
from orm.reaction import Reaction
from orm.shout import Shout
from orm.topic import Topic, TopicFollower
from orm.user import User, UserRating
def init_tables():
Base.metadata.create_all(engine)
Operation.init_table()
Resource.init_table()
User.init_table()
Community.init_table()
Role.init_table()
UserRating.init_table()
Shout.init_table()
print("[orm] tables initialized")
__all__ = [
"User",
"Role",
"Operation",
"Permission",
"Community",
"Shout",
"Topic",
"TopicFollower",
"Notification",
"Reaction",
"UserRating",
"init_tables",
]

136
orm/author.py Normal file
View File

@ -0,0 +1,136 @@
import time
from sqlalchemy import JSON, Boolean, Column, ForeignKey, Index, Integer, String
from services.db import Base
# from sqlalchemy_utils import TSVectorType
class AuthorRating(Base):
"""
Рейтинг автора от другого автора.
Attributes:
rater (int): ID оценивающего автора
author (int): ID оцениваемого автора
plus (bool): Положительная/отрицательная оценка
"""
__tablename__ = "author_rating"
id = None # type: ignore
rater = Column(ForeignKey("author.id"), primary_key=True)
author = Column(ForeignKey("author.id"), primary_key=True)
plus = Column(Boolean)
# Indexes
__table_args__ = (
# Fast lookup of all ratings received by a given author
Index("idx_author_rating_author", "author"),
# Fast lookup of all ratings left by a given author
Index("idx_author_rating_rater", "rater"),
)
class AuthorFollower(Base):
"""
Подписка одного автора на другого.
Attributes:
follower (int): ID подписчика
author (int): ID автора, на которого подписываются
created_at (int): Время создания подписки
auto (bool): Признак автоматической подписки
"""
__tablename__ = "author_follower"
id = None # type: ignore
follower = Column(ForeignKey("author.id"), primary_key=True)
author = Column(ForeignKey("author.id"), primary_key=True)
created_at = Column(Integer, nullable=False, default=lambda: int(time.time()))
auto = Column(Boolean, nullable=False, default=False)
# Indexes
__table_args__ = (
# Fast lookup of an author's followers
Index("idx_author_follower_author", "author"),
# Fast lookup of all authors a given author follows
Index("idx_author_follower_follower", "follower"),
)
class AuthorBookmark(Base):
"""
Закладка автора на публикацию.
Attributes:
author (int): ID автора
shout (int): ID публикации
"""
__tablename__ = "author_bookmark"
id = None # type: ignore
author = Column(ForeignKey("author.id"), primary_key=True)
shout = Column(ForeignKey("shout.id"), primary_key=True)
# Indexes
__table_args__ = (
# Fast lookup of all bookmarks of an author
Index("idx_author_bookmark_author", "author"),
# Fast lookup of all authors who bookmarked a publication
Index("idx_author_bookmark_shout", "shout"),
)
class Author(Base):
"""
Модель автора в системе.
Attributes:
name (str): Отображаемое имя
slug (str): Уникальный строковый идентификатор
bio (str): Краткая биография/статус
about (str): Полное описание
pic (str): URL изображения профиля
links (dict): Ссылки на социальные сети и сайты
created_at (int): Время создания профиля
last_seen (int): Время последнего посещения
updated_at (int): Время последнего обновления
deleted_at (int): Время удаления (если профиль удален)
"""
__tablename__ = "author"
name = Column(String, nullable=True, comment="Display name")
slug = Column(String, unique=True, comment="Author's slug")
bio = Column(String, nullable=True, comment="Bio") # status description
about = Column(String, nullable=True, comment="About") # long and formatted
pic = Column(String, nullable=True, comment="Picture")
links = Column(JSON, nullable=True, comment="Links")
created_at = Column(Integer, nullable=False, default=lambda: int(time.time()))
last_seen = Column(Integer, nullable=False, default=lambda: int(time.time()))
updated_at = Column(Integer, nullable=False, default=lambda: int(time.time()))
deleted_at = Column(Integer, nullable=True, comment="Deleted at")
# search_vector = Column(
# TSVectorType("name", "slug", "bio", "about", regconfig="pg_catalog.russian")
# )
# Indexes
__table_args__ = (
# Fast lookup by name
Index("idx_author_name", "name"),
# Fast lookup by slug
Index("idx_author_slug", "slug"),
# Partial index for filtering out deleted authors
Index(
"idx_author_deleted_at", "deleted_at", postgresql_where=deleted_at.is_(None)
),
# Sorting by creation time (newest authors)
Index("idx_author_created_at", "created_at"),
# Sorting by last-seen time
Index("idx_author_last_seen", "last_seen"),
)
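A usage sketch for the indexes above (assuming the local_session helper from services.db used elsewhere in this changeset):

from services.db import local_session

# The is_(None) filter matches the idx_author_deleted_at partial index,
# and the ordering is covered by idx_author_last_seen.
with local_session() as session:
    recently_active = (
        session.query(Author)
        .filter(Author.deleted_at.is_(None))
        .order_by(Author.last_seen.desc())
        .limit(20)
        .all()
    )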

View File

@ -1,12 +1,14 @@
from sqlalchemy import Column, DateTime, ForeignKey, String, func
import time
from base.orm import Base
from sqlalchemy import Column, ForeignKey, Integer, String
from services.db import Base
class ShoutCollection(Base):
__tablename__ = "shout_collection"
id = None
id = None # type: ignore
shout = Column(ForeignKey("shout.id"), primary_key=True)
collection = Column(ForeignKey("collection.id"), primary_key=True)
@ -18,6 +20,6 @@ class Collection(Base):
title = Column(String, nullable=False, comment="Title")
body = Column(String, nullable=True, comment="Body")
pic = Column(String, nullable=True, comment="Picture")
createdAt = Column(DateTime(timezone=True), server_default=func.now(), comment="Created At")
createdBy = Column(ForeignKey("user.id"), comment="Created By")
publishedAt = Column(DateTime(timezone=True), server_default=func.now(), comment="Published At")
created_at = Column(Integer, default=lambda: int(time.time()))
created_by = Column(ForeignKey("author.id"), comment="Created By")
published_at = Column(Integer, default=lambda: int(time.time()))

View File

@ -1,38 +1,106 @@
from sqlalchemy import Column, DateTime, ForeignKey, String, func
import enum
import time
from base.orm import Base, local_session
from sqlalchemy import Column, ForeignKey, Integer, String, Text, distinct, func
from sqlalchemy.ext.hybrid import hybrid_property
from orm.author import Author
from services.db import Base
class CommunityRole(enum.Enum):
READER = "reader" # can read and comment
AUTHOR = "author" # + can vote and invite collaborators
ARTIST = "artist" # + can be credited as featured artist
EXPERT = "expert" # + can add proof or disproof to shouts, can manage topics
EDITOR = "editor" # + can manage topics, comments and community settings
@classmethod
def as_string_array(cls, roles):
return [role.value for role in roles]
class CommunityFollower(Base):
__tablename__ = "community_followers"
__tablename__ = "community_author"
id = None
follower: Column = Column(ForeignKey("user.id"), primary_key=True)
community: Column = Column(ForeignKey("community.id"), primary_key=True)
joinedAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
# role = Column(ForeignKey(Role.id), nullable=False, comment="Role for member")
author = Column(ForeignKey("author.id"), primary_key=True)
community = Column(ForeignKey("community.id"), primary_key=True)
joined_at = Column(Integer, nullable=False, default=lambda: int(time.time()))
roles = Column(Text, nullable=True, comment="Roles (comma-separated)")
def set_roles(self, roles):
self.roles = ",".join(CommunityRole.as_string_array(roles))
def get_roles(self):
return [CommunityRole(role) for role in self.roles.split(",")] if self.roles else []
class Community(Base):
__tablename__ = "community"
name = Column(String, nullable=False, comment="Name")
slug = Column(String, nullable=False, unique=True, comment="Slug")
name = Column(String, nullable=False)
slug = Column(String, nullable=False, unique=True)
desc = Column(String, nullable=False, default="")
pic = Column(String, nullable=False, default="")
createdAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
created_at = Column(Integer, nullable=False, default=lambda: int(time.time()))
created_by = Column(ForeignKey("author.id"), nullable=False)
@staticmethod
def init_table():
with local_session() as session:
d = session.query(Community).filter(Community.slug == "discours").first()
if not d:
d = Community.create(name="Дискурс", slug="discours")
session.add(d)
session.commit()
Community.default_community = d
print("[orm] default community id: %s" % d.id)
@hybrid_property
def stat(self):
return CommunityStats(self)
@property
def role_list(self):
return self.roles.split(",") if self.roles else []
@role_list.setter
def role_list(self, value):
self.roles = ",".join(value) if value else None
class CommunityStats:
def __init__(self, community):
self.community = community
@property
def shouts(self):
from orm.shout import Shout
return self.community.session.query(func.count(Shout.id)).filter(Shout.community == self.community.id).scalar()
@property
def followers(self):
return (
self.community.session.query(func.count(CommunityFollower.author))
.filter(CommunityFollower.community == self.community.id)
.scalar()
)
@property
def authors(self):
from orm.shout import Shout
# author has a shout with community id and its featured_at is not null
return (
self.community.session.query(func.count(distinct(Author.id)))
.join(Shout)
.filter(Shout.community == self.community.id, Shout.featured_at.is_not(None), Author.id.in_(Shout.authors))
.scalar()
)
class CommunityAuthor(Base):
__tablename__ = "community_author"
id = Column(Integer, primary_key=True)
community_id = Column(Integer, ForeignKey("community.id"))
author_id = Column(Integer, ForeignKey("author.id"))
roles = Column(Text, nullable=True, comment="Roles (comma-separated)")
@property
def role_list(self):
return self.roles.split(",") if self.roles else []
@role_list.setter
def role_list(self, value):
self.roles = ",".join(value) if value else None

105
orm/draft.py Normal file
View File

@ -0,0 +1,105 @@
import time
from sqlalchemy import JSON, Boolean, Column, ForeignKey, Integer, String
from sqlalchemy.orm import relationship
from orm.author import Author
from orm.topic import Topic
from services.db import Base
class DraftTopic(Base):
__tablename__ = "draft_topic"
id = None # type: ignore
shout = Column(ForeignKey("draft.id"), primary_key=True, index=True)
topic = Column(ForeignKey("topic.id"), primary_key=True, index=True)
main = Column(Boolean, nullable=True)
class DraftAuthor(Base):
__tablename__ = "draft_author"
id = None # type: ignore
shout = Column(ForeignKey("draft.id"), primary_key=True, index=True)
author = Column(ForeignKey("author.id"), primary_key=True, index=True)
caption = Column(String, nullable=True, default="")
class Draft(Base):
__tablename__ = "draft"
# required
created_at: int = Column(Integer, nullable=False, default=lambda: int(time.time()))
# Author relation columns
created_by: int = Column("created_by", ForeignKey("author.id"), nullable=False)
community: int = Column("community", ForeignKey("community.id"), nullable=False, default=1)
# optional
layout: str = Column(String, nullable=True, default="article")
slug: str = Column(String, unique=True)
title: str = Column(String, nullable=True)
subtitle: str | None = Column(String, nullable=True)
lead: str | None = Column(String, nullable=True)
body: str = Column(String, nullable=False, comment="Body")
media: dict | None = Column(JSON, nullable=True)
cover: str | None = Column(String, nullable=True, comment="Cover image url")
cover_caption: str | None = Column(String, nullable=True, comment="Cover image alt caption")
lang: str = Column(String, nullable=False, default="ru", comment="Language")
seo: str | None = Column(String, nullable=True) # JSON
# auto
updated_at: int | None = Column(Integer, nullable=True, index=True)
deleted_at: int | None = Column(Integer, nullable=True, index=True)
updated_by: int | None = Column("updated_by", ForeignKey("author.id"), nullable=True)
deleted_by: int | None = Column("deleted_by", ForeignKey("author.id"), nullable=True)
# --- Relationships ---
# Only many-to-many relations via association tables
authors = relationship(Author, secondary="draft_author", lazy="select")
topics = relationship(Topic, secondary="draft_topic", lazy="select")
# Relation to Community (if needed as an object rather than an ID)
# community = relationship("Community", foreign_keys=[community_id], lazy="joined")
# For now community stays a plain ID column
# Relation to the publication (one-to-one or one-to-none)
# Loaded via joinedload in the resolver
publication = relationship(
"Shout",
primaryjoin="Draft.id == Shout.draft",
foreign_keys="Shout.draft",
uselist=False,
lazy="noload", # Не грузим по умолчанию, только через options
viewonly=True # Указываем, что это связь только для чтения
)
def dict(self):
"""
Сериализует объект Draft в словарь.
Гарантирует, что поля topics и authors всегда будут списками.
"""
return {
"id": self.id,
"created_at": self.created_at,
"created_by": self.created_by,
"community": self.community,
"layout": self.layout,
"slug": self.slug,
"title": self.title,
"subtitle": self.subtitle,
"lead": self.lead,
"body": self.body,
"media": self.media or [],
"cover": self.cover,
"cover_caption": self.cover_caption,
"lang": self.lang,
"seo": self.seo,
"updated_at": self.updated_at,
"deleted_at": self.deleted_at,
"updated_by": self.updated_by,
"deleted_by": self.deleted_by,
# Guarantee that topics and authors are always lists
"topics": [topic.dict() for topic in (self.topics or [])],
"authors": [author.dict() for author in (self.authors or [])]
}
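A sketch of the resolver-side load described in the comments above (the draft ID is a placeholder):

from sqlalchemy.orm import joinedload
from services.db import local_session

# publication is lazy="noload", so a resolver opts in explicitly.
with local_session() as session:
    draft = (
        session.query(Draft)
        .options(joinedload(Draft.publication))
        .filter(Draft.id == 42)
        .first()
    )
    if draft and draft.publication:
        print("already published as", draft.publication.slug)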

35
orm/invite.py Normal file
View File

@ -0,0 +1,35 @@
import enum
from sqlalchemy import Column, ForeignKey, String
from sqlalchemy.orm import relationship
from services.db import Base
class InviteStatus(enum.Enum):
PENDING = "PENDING"
ACCEPTED = "ACCEPTED"
REJECTED = "REJECTED"
@classmethod
def from_string(cls, value):
return cls(value)
class Invite(Base):
__tablename__ = "invite"
inviter_id = Column(ForeignKey("author.id"), primary_key=True)
author_id = Column(ForeignKey("author.id"), primary_key=True)
shout_id = Column(ForeignKey("shout.id"), primary_key=True)
status = Column(String, default=InviteStatus.PENDING.value)
inviter = relationship("Author", foreign_keys=[inviter_id])
author = relationship("Author", foreign_keys=[author_id])
shout = relationship("Shout")
def set_status(self, status: InviteStatus):
self.status = status.value
def get_status(self) -> InviteStatus:
return InviteStatus.from_string(self.status)
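A status round-trip sketch (IDs are hypothetical):

invite = Invite(inviter_id=1, author_id=2, shout_id=3)
invite.set_status(InviteStatus.ACCEPTED)
assert invite.status == "ACCEPTED"  # stored as a plain string
assert invite.get_status() is InviteStatus.ACCEPTED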

View File

@ -1,26 +1,63 @@
from enum import Enum as Enumeration
import enum
import time
from sqlalchemy import Boolean, Column, DateTime, Enum, ForeignKey, Integer, func
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy import JSON, Column, ForeignKey, Integer, String
from sqlalchemy.orm import relationship
from base.orm import Base
from orm.author import Author
from services.db import Base
class NotificationType(Enumeration):
NEW_COMMENT = 1
NEW_REPLY = 2
class NotificationEntity(enum.Enum):
REACTION = "reaction"
SHOUT = "shout"
FOLLOWER = "follower"
COMMUNITY = "community"
@classmethod
def from_string(cls, value):
return cls(value)
class NotificationAction(enum.Enum):
CREATE = "create"
UPDATE = "update"
DELETE = "delete"
SEEN = "seen"
FOLLOW = "follow"
UNFOLLOW = "unfollow"
@classmethod
def from_string(cls, value):
return cls(value)
class NotificationSeen(Base):
__tablename__ = "notification_seen"
viewer = Column(ForeignKey("author.id"), primary_key=True)
notification = Column(ForeignKey("notification.id"), primary_key=True)
class Notification(Base):
__tablename__ = "notification"
shout: Column = Column(ForeignKey("shout.id"), index=True)
reaction: Column = Column(ForeignKey("reaction.id"), index=True)
user: Column = Column(ForeignKey("user.id"), index=True)
createdAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), index=True
)
seen = Column(Boolean, nullable=False, default=False, index=True)
type = Column(Enum(NotificationType), nullable=False)
data = Column(JSONB, nullable=True)
occurrences = Column(Integer, default=1)
id = Column(Integer, primary_key=True, autoincrement=True)
created_at = Column(Integer, default=lambda: int(time.time()))  # per-row default; a str server_default would freeze the import-time timestamp
entity = Column(String, nullable=False)
action = Column(String, nullable=False)
payload = Column(JSON, nullable=True)
seen = relationship(Author, secondary="notification_seen")
def set_entity(self, entity: NotificationEntity):
self.entity = entity.value
def get_entity(self) -> NotificationEntity:
return NotificationEntity.from_string(self.entity)
def set_action(self, action: NotificationAction):
self.action = action.value
def get_action(self) -> NotificationAction:
return NotificationAction.from_string(self.action)
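A sketch of building a notification with the enum helpers (the payload is illustrative):

n = Notification(payload={"shout": 1})
n.set_entity(NotificationEntity.SHOUT)
n.set_action(NotificationAction.CREATE)
assert n.get_entity() is NotificationEntity.SHOUT
assert n.get_action() is NotificationAction.CREATE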

30
orm/rating.py Normal file
View File

@ -0,0 +1,30 @@
from orm.reaction import ReactionKind
PROPOSAL_REACTIONS = [
ReactionKind.ACCEPT.value,
ReactionKind.REJECT.value,
ReactionKind.AGREE.value,
ReactionKind.DISAGREE.value,
ReactionKind.ASK.value,
ReactionKind.PROPOSE.value,
]
PROOF_REACTIONS = [ReactionKind.PROOF.value, ReactionKind.DISPROOF.value]
RATING_REACTIONS = [ReactionKind.LIKE.value, ReactionKind.DISLIKE.value]
def is_negative(x):
return x in [
ReactionKind.DISLIKE.value,
ReactionKind.DISPROOF.value,
ReactionKind.REJECT.value,
]
def is_positive(x):
return x in [
ReactionKind.ACCEPT.value,
ReactionKind.LIKE.value,
ReactionKind.PROOF.value,
]
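These predicates reduce rating arithmetic to a one-liner; a sketch:

def rating_delta(kinds: list[str]) -> int:
    # Net score: +1 per positive kind, -1 per negative, 0 otherwise.
    return sum(1 if is_positive(k) else -1 if is_negative(k) else 0 for k in kinds)

assert rating_delta([ReactionKind.LIKE.value, ReactionKind.DISLIKE.value]) == 0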

View File

@ -1,178 +0,0 @@
import warnings
from sqlalchemy import Column, ForeignKey, String, TypeDecorator, UniqueConstraint
from sqlalchemy.orm import relationship
from base.orm import REGISTRY, Base, local_session
# Role Based Access Control #
class ClassType(TypeDecorator):
impl = String
@property
def python_type(self):
return NotImplemented
def process_literal_param(self, value, dialect):
return NotImplemented
def process_bind_param(self, value, dialect):
return value.__name__ if isinstance(value, type) else str(value)
def process_result_value(self, value, dialect):
class_ = REGISTRY.get(value)
if class_ is None:
warnings.warn(f"Can't find class <{value}>, find it yourself!", stacklevel=2)
return class_
class Role(Base):
__tablename__ = "role"
name = Column(String, nullable=False, comment="Role Name")
desc = Column(String, nullable=True, comment="Role Description")
community = Column(
ForeignKey("community.id", ondelete="CASCADE"),
nullable=False,
comment="Community",
)
permissions = relationship(lambda: Permission)
@staticmethod
def init_table():
with local_session() as session:
r = session.query(Role).filter(Role.name == "author").first()
if r:
Role.default_role = r
return
r1 = Role.create(
name="author",
desc="Role for an author",
community=1,
)
session.add(r1)
Role.default_role = r1
r2 = Role.create(
name="reader",
desc="Role for a reader",
community=1,
)
session.add(r2)
r3 = Role.create(
name="expert",
desc="Role for an expert",
community=1,
)
session.add(r3)
r4 = Role.create(
name="editor",
desc="Role for an editor",
community=1,
)
session.add(r4)
class Operation(Base):
__tablename__ = "operation"
name = Column(String, nullable=False, unique=True, comment="Operation Name")
@staticmethod
def init_table():
with local_session() as session:
for name in ["create", "update", "delete", "load"]:
"""
* everyone can:
- load shouts
- load topics
- load reactions
- create an account to become a READER
* readers can:
- update and delete their account
- load chats
- load messages
- create reaction of some shout's author allowed kinds
- create shout to become an AUTHOR
* authors can:
- update and delete their shout
- invite other authors to edit shout and chat
- manage allowed reactions for their shout
* pros can:
- create/update/delete their community
- create/update/delete topics for their community
"""
op = session.query(Operation).filter(Operation.name == name).first()
if not op:
op = Operation.create(name=name)
session.add(op)
session.commit()
class Resource(Base):
__tablename__ = "resource"
resourceClass = Column(String, nullable=False, unique=True, comment="Resource class")
name = Column(String, nullable=False, unique=True, comment="Resource name")
# TODO: community = Column(ForeignKey())
@staticmethod
def init_table():
with local_session() as session:
for res in [
"shout",
"topic",
"reaction",
"chat",
"message",
"invite",
"community",
"user",
]:
r = session.query(Resource).filter(Resource.name == res).first()
if not r:
r = Resource.create(name=res, resourceClass=res)
session.add(r)
session.commit()
class Permission(Base):
__tablename__ = "permission"
__table_args__ = (
UniqueConstraint("role", "operation", "resource"),
{"extend_existing": True},
)
role: Column = Column(ForeignKey("role.id", ondelete="CASCADE"), nullable=False, comment="Role")
operation: Column = Column(
ForeignKey("operation.id", ondelete="CASCADE"),
nullable=False,
comment="Operation",
)
resource: Column = Column(
ForeignKey("resource.id", ondelete="CASCADE"),
nullable=False,
comment="Resource",
)
# if __name__ == "__main__":
# Base.metadata.create_all(engine)
# ops = [
# Permission(role=1, operation=1, resource=1),
# Permission(role=1, operation=2, resource=1),
# Permission(role=1, operation=3, resource=1),
# Permission(role=1, operation=4, resource=1),
# Permission(role=2, operation=4, resource=1),
# ]
# global_session.add_all(ops)
# global_session.commit()

View File

@ -1,47 +1,45 @@
import time
from enum import Enum as Enumeration
from sqlalchemy import Column, DateTime, Enum, ForeignKey, String, func
from sqlalchemy import Column, ForeignKey, Integer, String
from base.orm import Base
from services.db import Base
class ReactionKind(Enumeration):
AGREE = 1 # +1
DISAGREE = 2 # -1
PROOF = 3 # +1
DISPROOF = 4 # -1
ASK = 5 # +0
PROPOSE = 6 # +0
QUOTE = 7 # +0 bookmark
COMMENT = 8 # +0
ACCEPT = 9 # +1
REJECT = 0 # -1
LIKE = 11 # +1
DISLIKE = 12 # -1
REMARK = 13 # 0
FOOTNOTE = 14 # 0
# TYPE = <reaction index> # rating diff
# editor mode
AGREE = "AGREE" # +1
DISAGREE = "DISAGREE" # -1
ASK = "ASK" # +0
PROPOSE = "PROPOSE" # +0
ACCEPT = "ACCEPT" # +1
REJECT = "REJECT" # -1
# expert mode
PROOF = "PROOF" # +1
DISPROOF = "DISPROOF" # -1
# public feed
QUOTE = "QUOTE" # +0 TODO: use to bookmark in collection
COMMENT = "COMMENT" # +0
LIKE = "LIKE" # +1
DISLIKE = "DISLIKE" # -1
class Reaction(Base):
__tablename__ = "reaction"
body = Column(String, nullable=True, comment="Reaction Body")
createdAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
createdBy: Column = Column(ForeignKey("user.id"), nullable=False, index=True, comment="Sender")
updatedAt = Column(DateTime(timezone=True), nullable=True, comment="Updated at")
updatedBy: Column = Column(
ForeignKey("user.id"), nullable=True, index=True, comment="Last Editor"
)
deletedAt = Column(DateTime(timezone=True), nullable=True, comment="Deleted at")
deletedBy: Column = Column(
ForeignKey("user.id"), nullable=True, index=True, comment="Deleted by"
)
shout: Column = Column(ForeignKey("shout.id"), nullable=False, index=True)
replyTo: Column = Column(
ForeignKey("reaction.id"), nullable=True, comment="Reply to reaction ID"
)
range = Column(String, nullable=True, comment="Range in format <start index>:<end>")
kind = Column(Enum(ReactionKind), nullable=False, comment="Reaction kind")
oid = Column(String, nullable=True, comment="Old ID")
body = Column(String, default="", comment="Reaction Body")
created_at = Column(Integer, nullable=False, default=lambda: int(time.time()), index=True)
updated_at = Column(Integer, nullable=True, comment="Updated at", index=True)
deleted_at = Column(Integer, nullable=True, comment="Deleted at", index=True)
deleted_by = Column(ForeignKey("author.id"), nullable=True)
reply_to = Column(ForeignKey("reaction.id"), nullable=True)
quote = Column(String, nullable=True, comment="Original quoted text")
shout = Column(ForeignKey("shout.id"), nullable=False, index=True)
created_by = Column(ForeignKey("author.id"), nullable=False)
kind = Column(String, nullable=False, index=True)
oid = Column(String)

View File

@ -1,98 +1,126 @@
from sqlalchemy import (
JSON,
Boolean,
Column,
DateTime,
ForeignKey,
Integer,
String,
func,
)
from sqlalchemy.orm import column_property, relationship
import time
from base.orm import Base, local_session
from sqlalchemy import JSON, Boolean, Column, ForeignKey, Index, Integer, String
from sqlalchemy.orm import relationship
from orm.author import Author
from orm.reaction import Reaction
from orm.topic import Topic
from orm.user import User
from services.db import Base
class ShoutTopic(Base):
"""
Связь между публикацией и темой.
Attributes:
shout (int): ID публикации
topic (int): ID темы
main (bool): Признак основной темы
"""
__tablename__ = "shout_topic"
id = None
shout: Column = Column(ForeignKey("shout.id"), primary_key=True, index=True)
topic: Column = Column(ForeignKey("topic.id"), primary_key=True, index=True)
id = None # type: ignore
shout = Column(ForeignKey("shout.id"), primary_key=True, index=True)
topic = Column(ForeignKey("topic.id"), primary_key=True, index=True)
main = Column(Boolean, nullable=True)
# Extra indexes
__table_args__ = (
# Composite index optimized for queries that look up publications by topic
Index("idx_shout_topic_topic_shout", "topic", "shout"),
)
class ShoutReactionsFollower(Base):
__tablename__ = "shout_reactions_followers"
id = None
follower: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
shout: Column = Column(ForeignKey("shout.id"), primary_key=True, index=True)
id = None # type: ignore
follower = Column(ForeignKey("author.id"), primary_key=True, index=True)
shout = Column(ForeignKey("shout.id"), primary_key=True, index=True)
auto = Column(Boolean, nullable=False, default=False)
createdAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
deletedAt = Column(DateTime(timezone=True), nullable=True)
created_at = Column(Integer, nullable=False, default=lambda: int(time.time()))
deleted_at = Column(Integer, nullable=True)
class ShoutAuthor(Base):
"""
Связь между публикацией и автором.
Attributes:
shout (int): ID публикации
author (int): ID автора
caption (str): Подпись автора
"""
__tablename__ = "shout_author"
id = None
shout: Column = Column(ForeignKey("shout.id"), primary_key=True, index=True)
user: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
caption: Column = Column(String, nullable=True, default="")
id = None # type: ignore
shout = Column(ForeignKey("shout.id"), primary_key=True, index=True)
author = Column(ForeignKey("author.id"), primary_key=True, index=True)
caption = Column(String, nullable=True, default="")
# Extra indexes
__table_args__ = (
# Index optimized for queries that look up publications by author
Index("idx_shout_author_author_shout", "author", "shout"),
)
class Shout(Base):
"""
Публикация в системе.
"""
__tablename__ = "shout"
# timestamps
createdAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
created_at: int = Column(Integer, nullable=False, default=lambda: int(time.time()))
updated_at: int | None = Column(Integer, nullable=True, index=True)
published_at: int | None = Column(Integer, nullable=True, index=True)
featured_at: int | None = Column(Integer, nullable=True, index=True)
deleted_at: int | None = Column(Integer, nullable=True, index=True)
created_by: int = Column(ForeignKey("author.id"), nullable=False)
updated_by: int | None = Column(ForeignKey("author.id"), nullable=True)
deleted_by: int | None = Column(ForeignKey("author.id"), nullable=True)
community: int = Column(ForeignKey("community.id"), nullable=False)
body: str = Column(String, nullable=False, comment="Body")
slug: str = Column(String, unique=True)
cover: str | None = Column(String, nullable=True, comment="Cover image url")
cover_caption: str | None = Column(String, nullable=True, comment="Cover image alt caption")
lead: str | None = Column(String, nullable=True)
title: str = Column(String, nullable=False)
subtitle: str | None = Column(String, nullable=True)
layout: str = Column(String, nullable=False, default="article")
media: dict | None = Column(JSON, nullable=True)
authors = relationship(Author, secondary="shout_author")
topics = relationship(Topic, secondary="shout_topic")
reactions = relationship(Reaction)
lang: str = Column(String, nullable=False, default="ru", comment="Language")
version_of: int | None = Column(ForeignKey("shout.id"), nullable=True)
oid: str | None = Column(String, nullable=True)
seo: str | None = Column(String, nullable=True) # JSON
draft: int | None = Column(ForeignKey("draft.id"), nullable=True)
# Indexes
__table_args__ = (
# Fast lookup of non-deleted publications
Index("idx_shout_deleted_at", "deleted_at", postgresql_where=deleted_at.is_(None)),
# Fast filtering by community
Index("idx_shout_community", "community"),
# Fast lookup by slug
Index("idx_shout_slug", "slug"),
# Composite index for filtering published, non-deleted publications
Index(
"idx_shout_published_deleted",
"published_at",
"deleted_at",
postgresql_where=published_at.is_not(None) & deleted_at.is_(None),
),
)
updatedAt = Column(DateTime(timezone=True), nullable=True, comment="Updated at")
publishedAt = Column(DateTime(timezone=True), nullable=True)
deletedAt = Column(DateTime(timezone=True), nullable=True)
createdBy: Column = Column(ForeignKey("user.id"), comment="Created By")
deletedBy: Column = Column(ForeignKey("user.id"), nullable=True)
slug = Column(String, unique=True)
cover = Column(String, nullable=True, comment="Cover image url")
lead = Column(String, nullable=True)
description = Column(String, nullable=True)
body = Column(String, nullable=False, comment="Body")
title = Column(String, nullable=True)
subtitle = Column(String, nullable=True)
layout = Column(String, nullable=True)
media = Column(JSON, nullable=True)
authors = relationship(lambda: User, secondary=ShoutAuthor.__tablename__)
topics = relationship(lambda: Topic, secondary=ShoutTopic.__tablename__)
# views from the old Discours website
viewsOld = Column(Integer, default=0)
# views from Ackee tracker on the new Discours website
viewsAckee = Column(Integer, default=0)
views = column_property(viewsOld + viewsAckee)
reactions = relationship(lambda: Reaction)
# TODO: these fields should be used or modified
community: Column = Column(ForeignKey("community.id"), default=1)
lang = Column(String, nullable=False, default="ru", comment="Language")
mainTopic: Column = Column(ForeignKey("topic.slug"), nullable=True)
visibility = Column(String, nullable=True) # owner authors community public
versionOf: Column = Column(ForeignKey("shout.id"), nullable=True)
oid = Column(String, nullable=True)
@staticmethod
def init_table():
with local_session() as session:
s = session.query(Shout).first()
if not s:
entry = {"slug": "genesis-block", "body": "", "title": "Ничего", "lang": "ru"}
s = Shout.create(**entry)
session.add(s)
session.commit()
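A sketch of the feed query that the composite index above targets (same local_session assumption as elsewhere in this changeset):

from services.db import local_session

# Published and not deleted: exactly the predicate of idx_shout_published_deleted.
with local_session() as session:
    feed = (
        session.query(Shout)
        .filter(Shout.published_at.is_not(None), Shout.deleted_at.is_(None))
        .order_by(Shout.published_at.desc())
        .limit(10)
        .all()
    )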

View File

@ -1,26 +1,66 @@
from sqlalchemy import Boolean, Column, DateTime, ForeignKey, String, func
import time
from base.orm import Base
from sqlalchemy import JSON, Boolean, Column, ForeignKey, Index, Integer, String
from services.db import Base
class TopicFollower(Base):
"""
Связь между топиком и его подписчиком.
Attributes:
follower (int): ID подписчика
topic (int): ID топика
created_at (int): Время создания связи
auto (bool): Автоматическая подписка
"""
__tablename__ = "topic_followers"
id = None
follower: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
topic: Column = Column(ForeignKey("topic.id"), primary_key=True, index=True)
createdAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
id = None # type: ignore
follower = Column(Integer, ForeignKey("author.id"), primary_key=True)
topic = Column(Integer, ForeignKey("topic.id"), primary_key=True)
created_at = Column(Integer, nullable=False, default=lambda: int(time.time()))  # lambda so the default is evaluated per row, not at import time
auto = Column(Boolean, nullable=False, default=False)
# Indexes
__table_args__ = (
# Fast lookup of a topic's followers
Index("idx_topic_followers_topic", "topic"),
# Fast lookup of all topics an author follows
Index("idx_topic_followers_follower", "follower"),
)
class Topic(Base):
"""
Модель топика (темы) публикаций.
Attributes:
slug (str): Уникальный строковый идентификатор темы
title (str): Название темы
body (str): Описание темы
pic (str): URL изображения темы
community (int): ID сообщества
oid (str): Старый ID
parent_ids (list): IDs родительских тем
"""
__tablename__ = "topic"
slug = Column(String, unique=True)
title = Column(String, nullable=False, comment="Title")
body = Column(String, nullable=True, comment="Body")
pic = Column(String, nullable=True, comment="Picture")
community: Column = Column(ForeignKey("community.id"), default=1, comment="Community")
community = Column(ForeignKey("community.id"), default=1)
oid = Column(String, nullable=True, comment="Old ID")
parent_ids = Column(JSON, nullable=True, comment="Parent Topic IDs")
# Indexes
__table_args__ = (
# Fast lookup by slug
Index("idx_topic_slug", "slug"),
# Fast lookup by community
Index("idx_topic_community", "community"),
)

View File

@ -1,105 +0,0 @@
from sqlalchemy import JSON as JSONType
from sqlalchemy import Boolean, Column, DateTime, ForeignKey, Integer, String, func
from sqlalchemy.orm import relationship
from base.orm import Base, local_session
from orm.rbac import Role
class UserRating(Base):
__tablename__ = "user_rating"
id = None
rater: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
user: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
value: Column = Column(Integer)
@staticmethod
def init_table():
pass
class UserRole(Base):
__tablename__ = "user_role"
id = None
user = Column(ForeignKey("user.id"), primary_key=True, index=True)
role = Column(ForeignKey("role.id"), primary_key=True, index=True)
class AuthorFollower(Base):
__tablename__ = "author_follower"
id = None
follower: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
author: Column = Column(ForeignKey("user.id"), primary_key=True, index=True)
createdAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
auto = Column(Boolean, nullable=False, default=False)
class User(Base):
__tablename__ = "user"
default_user = None
email = Column(String, unique=True, nullable=False, comment="Email")
username = Column(String, nullable=False, comment="Login")
password = Column(String, nullable=True, comment="Password")
bio = Column(String, nullable=True, comment="Bio") # status description
about = Column(String, nullable=True, comment="About") # long and formatted
userpic = Column(String, nullable=True, comment="Userpic")
name = Column(String, nullable=True, comment="Display name")
slug = Column(String, unique=True, comment="User's slug")
muted = Column(Boolean, default=False)
emailConfirmed = Column(Boolean, default=False)
createdAt = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Created at"
)
lastSeen = Column(
DateTime(timezone=True), nullable=False, server_default=func.now(), comment="Was online at"
)
deletedAt = Column(DateTime(timezone=True), nullable=True, comment="Deleted at")
links = Column(JSONType, nullable=True, comment="Links")
oauth = Column(String, nullable=True)
ratings = relationship(UserRating, foreign_keys=UserRating.user)
roles = relationship(lambda: Role, secondary=UserRole.__tablename__)
oid = Column(String, nullable=True)
@staticmethod
def init_table():
with local_session() as session:
default = session.query(User).filter(User.slug == "anonymous").first()
if not default:
default_dict = {
"email": "noreply@discours.io",
"username": "noreply@discours.io",
"name": "Аноним",
"slug": "anonymous",
}
default = User.create(**default_dict)
session.add(default)
discours_dict = {
"email": "welcome@discours.io",
"username": "welcome@discours.io",
"name": "Дискурс",
"slug": "discours",
}
discours = User.create(**discours_dict)
session.add(discours)
session.commit()
User.default_user = default
def get_permission(self):
scope = {}
for role in self.roles:
for p in role.permissions:
if p.resource not in scope:
scope[p.resource] = set()
scope[p.resource].add(p.operation)
print(scope)
return scope
# if __name__ == "__main__":
# print(User.get_permission(user_id=1))

1802
poetry.lock generated

File diff suppressed because it is too large

2
pyproject.toml Normal file
View File

@ -0,0 +1,2 @@
[tool.ruff]
line-length = 108

View File

@ -1,8 +0,0 @@
black==23.10.1
flake8==6.1.0
gql_schema_codegen==1.0.1
isort==5.12.0
mypy==1.6.1
pre-commit==3.5.0
pymongo-stubs==0.2.0
sqlalchemy-stubs==0.4

6
requirements.dev.txt Normal file
View File

@ -0,0 +1,6 @@
fakeredis
pytest
pytest-asyncio
pytest-cov
mypy
ruff

View File

@ -1,37 +1,18 @@
aiohttp==3.8.6
alembic==1.11.3
ariadne>=0.17.0
asyncio~=3.4.3
authlib==1.2.1
bcrypt>=4.0.0
beautifulsoup4~=4.11.1
boto3~=1.28.2
botocore~=1.31.2
bson~=0.5.10
DateTime~=4.7
gql~=3.4.0
graphql-core>=3.0.3
httpx>=0.23.0
itsdangerous
lxml
Mako==1.2.4
MarkupSafe==2.1.3
nltk~=3.8.1
passlib~=1.7.4
# own auth
bcrypt
authlib
passlib
opensearch-py
google-analytics-data
colorlog
psycopg2-binary
pydantic>=1.10.2
pyjwt>=2.6.0
pymystem3~=0.2.0
python-dateutil~=2.8.2
python-frontmatter~=1.0.0
python-multipart~=0.0.6
PyYAML>=5.4
requests~=2.28.1
sentry-sdk>=1.14.0
sqlalchemy>=1.4.41
sse-starlette==1.6.5
starlette~=0.23.1
transliterate~=1.10.2
uvicorn>=0.18.3
redis
httpx
redis[hiredis]
sentry-sdk[starlette,sqlalchemy]
starlette
gql
ariadne
granian
orjson
pydantic
trafilatura

View File

@ -1,55 +0,0 @@
database_name="discoursio"
remote_backup_dir="/var/backups/mongodb"
user="root"
host="v2.discours.io"
server="$user@$host"
dump_dir="./dump"
local_backup_filename="discours-backup.bson.gz.tar"
echo "DATABASE RESET STARTED"
echo "server: $server"
echo "remote backup directory: $remote_backup_dir"
echo "Searching for last backup file..."
last_backup_filename=$(ssh $server "ls -t $remote_backup_dir | head -1")
if [ $? -ne 0 ]; then { echo "Failed to get last backup filename, aborting." ; exit 1; } fi
echo "Last backup file found: $last_backup_filename"
echo "Downloading..."
scp $server:$remote_backup_dir/"$last_backup_filename" "$local_backup_filename"
if [ $? -ne 0 ]; then { echo "Failed to download backup file, aborting." ; exit 1; } fi
echo "Backup file $local_backup_filename downloaded successfully"
echo "Creating dump directory: $dump_dir"
mkdir -p "$dump_dir"
if [ $? -ne 0 ]; then { echo "Failed to create dump directory, aborting." ; exit 1; } fi
echo "$dump_dir directory created"
echo "Unpacking backup file $local_backup_filename to $dump_dir"
tar -xzf "$local_backup_filename" --directory "$dump_dir" --strip-components 1
if [ $? -ne 0 ]; then { echo "Failed to unpack backup, aborting." ; exit 1; } fi
echo "Backup file $local_backup_filename successfully unpacked to $dump_dir"
echo "Removing backup file $local_backup_filename"
rm "$local_backup_filename"
if [ $? -ne 0 ]; then { echo "Failed to remove backup file, aborting." ; exit 1; } fi
echo "Backup file removed"
echo "Dropping database $database_name"
dropdb $database_name --force
if [ $? -ne 0 ]; then { echo "Failed to drop database, aborting." ; exit 1; } fi
echo "Database $database_name dropped"
echo "Creating database $database_name"
createdb $database_name
if [ $? -ne 0 ]; then { echo "Failed to create database, aborting." ; exit 1; } fi
echo "Database $database_name successfully created"
echo "BSON -> JSON"
python3 server.py bson
if [ $? -ne 0 ]; then { echo "BSON -> JSON failed, aborting." ; exit 1; } fi
echo "Start migration"
python3 server.py migrate
if [ $? -ne 0 ]; then { echo "Migration failed, aborting." ; exit 1; } fi
echo 'Done!'

View File

@ -1,46 +1,132 @@
# flake8: noqa
from resolvers.auth import (
auth_send_link,
confirm_email,
get_current_user,
is_email_used,
login,
register_by_email,
sign_out,
)
from resolvers.create.editor import create_shout, delete_shout, update_shout
from resolvers.inbox.chats import create_chat, delete_chat, update_chat
from resolvers.inbox.load import load_chats, load_messages_by, load_recipients
from resolvers.inbox.messages import (
create_message,
delete_message,
mark_as_read,
update_message,
)
from resolvers.inbox.search import search_recipients
from resolvers.notifications import load_notifications
from resolvers.zine.following import follow, unfollow
from resolvers.zine.load import load_shout, load_shouts_by
from resolvers.zine.profile import (
from cache.triggers import events_register
from resolvers.author import ( # search_authors,
get_author,
get_author_followers,
get_author_follows,
get_author_follows_authors,
get_author_follows_topics,
get_author_id,
get_authors_all,
load_authors_by,
rate_user,
update_profile,
update_author,
)
from resolvers.zine.reactions import (
from resolvers.community import get_communities_all, get_community
from resolvers.draft import (
create_draft,
delete_draft,
load_drafts,
publish_draft,
update_draft,
)
from resolvers.editor import (
unpublish_shout,
)
from resolvers.feed import (
load_shouts_coauthored,
load_shouts_discussed,
load_shouts_feed,
load_shouts_followed_by,
)
from resolvers.follower import follow, get_shout_followers, unfollow
from resolvers.notifier import (
load_notifications,
notification_mark_seen,
notifications_seen_after,
notifications_seen_thread,
)
from resolvers.rating import get_my_rates_comments, get_my_rates_shouts, rate_author
from resolvers.reaction import (
create_reaction,
delete_reaction,
load_comment_ratings,
load_comments_branch,
load_reactions_by,
reactions_follow,
reactions_unfollow,
load_shout_comments,
load_shout_ratings,
update_reaction,
)
from resolvers.zine.topics import (
get_topic,
topic_follow,
topic_unfollow,
topics_all,
topics_by_author,
topics_by_community,
from resolvers.reader import (
get_shout,
load_shouts_by,
load_shouts_random_top,
load_shouts_search,
load_shouts_unrated,
)
from resolvers.topic import (
get_topic,
get_topic_authors,
get_topic_followers,
get_topics_all,
get_topics_by_author,
get_topics_by_community,
)
events_register()
__all__ = [
# author
"get_author",
"get_author_id",
"get_author_followers",
"get_author_follows",
"get_author_follows_topics",
"get_author_follows_authors",
"get_authors_all",
"load_authors_by",
"update_author",
## "search_authors",
# community
"get_community",
"get_communities_all",
# topic
"get_topic",
"get_topics_all",
"get_topics_by_community",
"get_topics_by_author",
"get_topic_followers",
"get_topic_authors",
# reader
"get_shout",
"load_shouts_by",
"load_shouts_random_top",
"load_shouts_search",
"load_shouts_unrated",
# feed
"load_shouts_feed",
"load_shouts_coauthored",
"load_shouts_discussed",
"load_shouts_with_topic",
"load_shouts_followed_by",
"load_shouts_authored_by",
# follower
"follow",
"unfollow",
"get_shout_followers",
# reaction
"create_reaction",
"update_reaction",
"delete_reaction",
"load_reactions_by",
"load_shout_comments",
"load_shout_ratings",
"load_comment_ratings",
"load_comments_branch",
# notifier
"load_notifications",
"notifications_seen_thread",
"notifications_seen_after",
"notification_mark_seen",
# rating
"rate_author",
"get_my_rates_comments",
"get_my_rates_shouts",
# draft
"load_drafts",
"create_draft",
"update_draft",
"delete_draft",
"publish_draft",
"publish_shout",
"unpublish_shout",
"unpublish_draft",
]

384
resolvers/author.py Normal file
View File

@ -0,0 +1,384 @@
import asyncio
import time
from typing import Optional
from sqlalchemy import select, text
from cache.cache import (
cache_author,
cached_query,
get_cached_author,
get_cached_author_by_user_id,
get_cached_author_followers,
get_cached_follower_authors,
get_cached_follower_topics,
invalidate_cache_by_prefix,
)
from orm.author import Author
from resolvers.stat import get_with_stat
from services.auth import login_required
from services.db import local_session
from services.redis import redis
from services.schema import mutation, query
from utils.logger import root_logger as logger
DEFAULT_COMMUNITIES = [1]
# Helper for fetching all authors without stats
async def get_all_authors():
"""
Получает всех авторов без статистики.
Используется для случаев, когда нужен полный список авторов без дополнительной информации.
Returns:
list: Список всех авторов без статистики
"""
cache_key = "authors:all:basic"
# Fetch all authors from the DB
async def fetch_all_authors():
logger.debug("Получаем список всех авторов из БД и кешируем результат")
with local_session() as session:
# Query basic author info
authors_query = select(Author).where(Author.deleted_at.is_(None))
authors = session.execute(authors_query).scalars().all()
# Convert authors to dicts
return [author.dict() for author in authors]
# Use the generic query-caching helper
return await cached_query(cache_key, fetch_all_authors)
# Helper for fetching authors with stats, paginated
async def get_authors_with_stats(limit=50, offset=0, by: Optional[str] = None):
"""
Получает авторов со статистикой с пагинацией.
Args:
limit: Максимальное количество возвращаемых авторов
offset: Смещение для пагинации
by: Опциональный параметр сортировки (new/active)
Returns:
list: Список авторов с их статистикой
"""
# Build the cache key
cache_key = f"authors:stats:limit={limit}:offset={offset}"
# Fetch authors from the DB
async def fetch_authors_with_stats():
logger.debug(f"Выполняем запрос на получение авторов со статистикой: limit={limit}, offset={offset}, by={by}")
with local_session() as session:
# Base query for authors
base_query = select(Author).where(Author.deleted_at.is_(None))
# Apply sorting
if by:
if isinstance(by, dict):
# Handle a dict of sort parameters
from sqlalchemy import asc, desc
for field, direction in by.items():
column = getattr(Author, field, None)
if column:
if direction.lower() == "desc":
base_query = base_query.order_by(desc(column))
else:
base_query = base_query.order_by(column)
elif by == "new":
base_query = base_query.order_by(desc(Author.created_at))
elif by == "active":
base_query = base_query.order_by(desc(Author.last_seen))
else:
# Default: sort by creation time
base_query = base_query.order_by(desc(Author.created_at))
else:
base_query = base_query.order_by(desc(Author.created_at))
# Apply limit and offset
base_query = base_query.limit(limit).offset(offset)
# Fetch the authors
authors = session.execute(base_query).scalars().all()
author_ids = [author.id for author in authors]
if not author_ids:
return []
# Optimized query for per-author publication stats
shouts_stats_query = f"""
SELECT sa.author, COUNT(DISTINCT s.id) as shouts_count
FROM shout_author sa
JOIN shout s ON sa.shout = s.id AND s.deleted_at IS NULL AND s.published_at IS NOT NULL
WHERE sa.author IN ({",".join(map(str, author_ids))})
GROUP BY sa.author
"""
shouts_stats = {row[0]: row[1] for row in session.execute(text(shouts_stats_query))}
# Query for per-author follower stats
followers_stats_query = f"""
SELECT author, COUNT(DISTINCT follower) as followers_count
FROM author_follower
WHERE author IN ({",".join(map(str, author_ids))})
GROUP BY author
"""
followers_stats = {row[0]: row[1] for row in session.execute(text(followers_stats_query))}
# Assemble the result with stats attached
result = []
for author in authors:
author_dict = author.dict()
author_dict["stat"] = {
"shouts": shouts_stats.get(author.id, 0),
"followers": followers_stats.get(author.id, 0),
}
result.append(author_dict)
# Cache each author individually for reuse by other functions
await cache_author(author_dict)
return result
# Use the generic query-caching helper
return await cached_query(cache_key, fetch_authors_with_stats)
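A call-site sketch (inside an async resolver; the values are placeholders):

# Second page of the most recently active authors, 20 per page.
authors = await get_authors_with_stats(limit=20, offset=20, by="active")
for a in authors:
    print(a["slug"], a["stat"]["shouts"], a["stat"]["followers"])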
# Author cache invalidation
async def invalidate_authors_cache(author_id=None):
"""
Инвалидирует кеши авторов при изменении данных.
Args:
author_id: Опциональный ID автора для точечной инвалидации.
Если не указан, инвалидируются все кеши авторов.
"""
if author_id:
# Targeted invalidation for a single author
logger.debug(f"Invalidating cache for author #{author_id}")
specific_keys = [
f"author:id:{author_id}",
f"author:followers:{author_id}",
f"author:follows-authors:{author_id}",
f"author:follows-topics:{author_id}",
f"author:follows-shouts:{author_id}",
]
# Get the author's user_id, if any
with local_session() as session:
author = session.query(Author).filter(Author.id == author_id).first()
if author and author.user:
specific_keys.append(f"author:user:{author.user.strip()}")
# Delete the specific keys
for key in specific_keys:
try:
await redis.execute("DEL", key)
logger.debug(f"Удален ключ кеша {key}")
except Exception as e:
logger.error(f"Ошибка при удалении ключа {key}: {e}")
# Also find and delete collection keys that include this author's data
collection_keys = await redis.execute("KEYS", "authors:stats:*")
if collection_keys:
await redis.execute("DEL", *collection_keys)
logger.debug(f"Удалено {len(collection_keys)} коллекционных ключей авторов")
else:
# Full invalidation of all author caches
logger.debug("Full author cache invalidation")
await invalidate_cache_by_prefix("authors")
@mutation.field("update_author")
@login_required
async def update_author(_, info, profile):
user_id = info.context.get("user_id")
if not user_id:
return {"error": "unauthorized", "author": None}
try:
with local_session() as session:
author = session.query(Author).where(Author.user == user_id).first()
if author:
Author.update(author, profile)
session.add(author)
session.commit()
author_query = select(Author).where(Author.user == user_id)
result = get_with_stat(author_query)
if result:
author_with_stat = result[0]
if isinstance(author_with_stat, Author):
author_dict = author_with_stat.dict()
# await cache_author(author_dict)
asyncio.create_task(cache_author(author_dict))
return {"error": None, "author": author}
except Exception as exc:
import traceback
logger.error(traceback.format_exc())
return {"error": exc, "author": None}
@query.field("get_authors_all")
async def get_authors_all(_, _info):
"""
Получает список всех авторов без статистики.
Returns:
list: Список всех авторов
"""
return await get_all_authors()
@query.field("get_author")
async def get_author(_, _info, slug="", author_id=0):
author_dict = None
try:
author_id = get_author_id_from(slug=slug, user="", author_id=author_id)
if not author_id:
raise ValueError("cant find")
author_dict = await get_cached_author(int(author_id), get_with_stat)
if not author_dict or not author_dict.get("stat"):
# update stat from db
author_query = select(Author).filter(Author.id == author_id)
result = get_with_stat(author_query)
if result:
author_with_stat = result[0]
if isinstance(author_with_stat, Author):
author_dict = author_with_stat.dict()
# await cache_author(author_dict)
asyncio.create_task(cache_author(author_dict))
except ValueError:
pass
except Exception as exc:
import traceback
logger.error(f"{exc}:\n{traceback.format_exc()}")
return author_dict
@query.field("get_author_id")
async def get_author_id(_, _info, user: str):
user_id = user.strip()
logger.info(f"getting author id for {user_id}")
author = None
try:
author = await get_cached_author_by_user_id(user_id, get_with_stat)
if author:
return author
author_query = select(Author).filter(Author.user == user_id)
result = get_with_stat(author_query)
if result:
author_with_stat = result[0]
if isinstance(author_with_stat, Author):
author_dict = author_with_stat.dict()
# await cache_author(author_dict)
asyncio.create_task(cache_author(author_dict))
return author_with_stat
except Exception as exc:
logger.error(f"Error getting author: {exc}")
return None
@query.field("load_authors_by")
async def load_authors_by(_, _info, by, limit, offset):
"""
Загружает авторов по заданному критерию с пагинацией.
Args:
by: Критерий сортировки авторов (new/active)
limit: Максимальное количество возвращаемых авторов
offset: Смещение для пагинации
Returns:
list: Список авторов с учетом критерия
"""
# Используем оптимизированную функцию для получения авторов
return await get_authors_with_stats(limit, offset, by)
def get_author_id_from(slug="", user=None, author_id=None):
try:
# an explicitly passed author_id wins over slug/user lookup
if author_id:
return author_id
with local_session() as session:
author = None
if slug:
author = session.query(Author).filter(Author.slug == slug).first()
if author:
author_id = author.id
return author_id
if user:
author = session.query(Author).filter(Author.user == user).first()
if author:
author_id = author.id
except Exception as exc:
logger.error(exc)
return author_id
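# Lookup precedence, illustrated (a hedged example; the values are invented):
#
#     get_author_id_from(author_id=42)         # -> 42, no DB query
#     get_author_id_from(slug="some-author")   # -> id resolved by slug
#     get_author_id_from(user="oauth|abc123")  # -> id resolved by auth user id
#     get_author_id_from()                     # -> None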
@query.field("get_author_follows")
async def get_author_follows(_, _info, slug="", user=None, author_id=0):
logger.debug(f"getting follows for @{slug}")
author_id = get_author_id_from(slug=slug, user=user, author_id=author_id)
if not author_id:
return {}
followed_authors = await get_cached_follower_authors(author_id)
followed_topics = await get_cached_follower_topics(author_id)
# TODO: Get followed communities too
return {
"authors": followed_authors,
"topics": followed_topics,
"communities": DEFAULT_COMMUNITIES,
"shouts": [],
}
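# Shape of the returned payload, for reference (a sketch; real entries depend on the
# cache and on DEFAULT_COMMUNITIES, which is defined elsewhere in this module):
#
#     {
#         "authors": [{"id": 1, "slug": "some-author", "...": "..."}],
#         "topics": [{"id": 5, "slug": "culture", "...": "..."}],
#         "communities": DEFAULT_COMMUNITIES,
#         "shouts": [],
#     }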
@query.field("get_author_follows_topics")
async def get_author_follows_topics(_, _info, slug="", user=None, author_id=None):
logger.debug(f"getting followed topics for @{slug}")
author_id = get_author_id_from(slug=slug, user=user, author_id=author_id)
if not author_id:
return []
followed_topics = await get_cached_follower_topics(author_id)
return followed_topics
@query.field("get_author_follows_authors")
async def get_author_follows_authors(_, _info, slug="", user=None, author_id=None):
logger.debug(f"getting followed authors for @{slug}")
author_id = get_author_id_from(slug=slug, user=user, author_id=author_id)
if not author_id:
return []
followed_authors = await get_cached_follower_authors(author_id)
return followed_authors
def create_author(user_id: str, slug: str, name: str = ""):
author = Author()
author.user = user_id  # link to the user_id from the auth system
author.slug = slug  # identifier from the auth system
author.created_at = author.updated_at = int(time.time())
author.name = name or slug  # fall back to slug if no name is given
with local_session() as session:
session.add(author)
session.commit()
return author
@query.field("get_author_followers")
async def get_author_followers(_, _info, slug: str = "", user: str = "", author_id: int = 0):
logger.debug(f"getting followers for author @{slug} or ID:{author_id}")
author_id = get_author_id_from(slug=slug, user=user, author_id=author_id)
if not author_id:
return []
followers = await get_cached_author_followers(author_id)
return followers

resolvers/bookmark.py Normal file

@@ -0,0 +1,83 @@
from graphql import GraphQLError
from sqlalchemy import and_, delete, insert
from orm.author import AuthorBookmark
from orm.shout import Shout
from resolvers.feed import apply_options
from resolvers.reader import get_shouts_with_links, query_with_stat
from services.auth import login_required
from services.common_result import CommonResult
from services.db import local_session
from services.schema import mutation, query
@query.field("load_shouts_bookmarked")
@login_required
def load_shouts_bookmarked(_, info, options):
"""
Load bookmarked shouts for the authenticated user.
Args:
limit (int): Maximum number of shouts to return.
offset (int): Number of shouts to skip.
Returns:
list: List of bookmarked shouts.
"""
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
if not author_id:
raise GraphQLError("User not authenticated")
q = query_with_stat(info)
q = q.join(AuthorBookmark)
q = q.filter(
and_(
Shout.id == AuthorBookmark.shout,
AuthorBookmark.author == author_id,
)
)
q, limit, offset = apply_options(q, options, author_id)
return get_shouts_with_links(info, q, limit, offset)
@mutation.field("toggle_bookmark_shout")
def toggle_bookmark_shout(_, info, slug: str) -> bool:
"""
Toggle bookmark status for a specific shout.
Args:
slug (str): Unique identifier of the shout.
Returns:
bool: True if the bookmark was added, False if it was removed.
"""
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
if not author_id:
raise GraphQLError("User not authenticated")
with local_session() as db:
shout = db.query(Shout).filter(Shout.slug == slug).first()
if not shout:
raise GraphQLError("Shout not found")
existing_bookmark = (
db.query(AuthorBookmark)
.filter(AuthorBookmark.author == author_id, AuthorBookmark.shout == shout.id)
.first()
)
if existing_bookmark:
db.execute(
delete(AuthorBookmark).where(AuthorBookmark.author == author_id, AuthorBookmark.shout == shout.id)
)
result = False
else:
db.execute(insert(AuthorBookmark).values(author=author_id, shout=shout.id))
result = True
db.commit()
return result
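# Example GraphQL usage (a sketch; the field name follows the mutation binding above):
#
#     mutation { toggle_bookmark_shout(slug: "some-shout") }
#
# The mutation resolves to true when a bookmark was created and false when an
# existing bookmark was removed.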

resolvers/collab.py Normal file

@@ -0,0 +1,147 @@
from orm.author import Author
from orm.invite import Invite, InviteStatus
from orm.shout import Shout
from services.auth import login_required
from services.db import local_session
from services.schema import mutation
@mutation.field("accept_invite")
@login_required
async def accept_invite(_, info, invite_id: int):
info.context["user_id"]
author_dict = info.context["author"]
author_id = author_dict.get("id")
if author_id:
author_id = int(author_id)
# Check if the user exists
with local_session() as session:
# Check if the invite exists
invite = session.query(Invite).filter(Invite.id == invite_id).first()
if invite and invite.author_id == author_id and invite.status == InviteStatus.PENDING.value:
# Add the user to the shout authors
shout = session.query(Shout).filter(Shout.id == invite.shout_id).first()
if shout:
if author_id not in shout.authors:
author = session.query(Author).filter(Author.id == author_id).first()
if author:
shout.authors.append(author)
session.add(shout)
session.delete(invite)
session.commit()
return {"success": True, "message": "Invite accepted"}
else:
return {"error": "Shout not found"}
else:
return {"error": "Invalid invite or already accepted/rejected"}
else:
return {"error": "Unauthorized"}
@mutation.field("reject_invite")
@login_required
async def reject_invite(_, info, invite_id: int):
info.context["user_id"]
author_dict = info.context["author"]
author_id = author_dict.get("id")
if author_id:
# Check if the user exists
with local_session() as session:
author_id = int(author_id)
# Check if the invite exists
invite = session.query(Invite).filter(Invite.id == invite_id).first()
if invite and invite.author_id == author_id and invite.status == InviteStatus.PENDING.value:
# Delete the invite
session.delete(invite)
session.commit()
return {"success": True, "message": "Invite rejected"}
else:
return {"error": "Invalid invite or already accepted/rejected"}
return {"error": "User not found"}
@mutation.field("create_invite")
@login_required
async def create_invite(_, info, slug: str = "", author_id: int = 0):
user_id = info.context["user_id"]
author_dict = info.context["author"]
author_id = author_dict.get("id")
if author_id:
# Check if the inviter is the owner of the shout
with local_session() as session:
shout = session.query(Shout).filter(Shout.slug == slug).first()
inviter = session.query(Author).filter(Author.user == user_id).first()
if inviter and shout and shout.authors and inviter.id == shout.created_by:
# Check if an invite already exists
existing_invite = (
session.query(Invite)
.filter(
Invite.inviter_id == inviter.id,
Invite.author_id == author_id,
Invite.shout_id == shout.id,
Invite.status == InviteStatus.PENDING.value,
)
.first()
)
if existing_invite:
return {"error": "Invite already sent"}
# Create a new invite
new_invite = Invite(
inviter_id=inviter.id,  # Author id, matching the Invite.inviter_id filter above
author_id=author_id,
shout_id=shout.id,
status=InviteStatus.PENDING.value,
)
session.add(new_invite)
session.commit()
return {"error": None, "invite": new_invite}
else:
return {"error": "Invalid author"}
else:
return {"error": "Access denied"}
@mutation.field("remove_author")
@login_required
async def remove_author(_, info, slug: str = "", author_id: int = 0):
user_id = info.context["user_id"]
with local_session() as session:
author = session.query(Author).filter(Author.user == user_id).first()
if author:
shout = session.query(Shout).filter(Shout.slug == slug).first()
# NOTE: owner should be first in a list
if shout and author.id == shout.created_by:
shout.authors = [a for a in shout.authors if a.id != author_id]
session.commit()
return {}
return {"error": "Access denied"}
@mutation.field("remove_invite")
@login_required
async def remove_invite(_, info, invite_id: int):
info.context["user_id"]
author_dict = info.context["author"]
author_id = author_dict.get("id")
if isinstance(author_id, int):
# Check if the user exists
with local_session() as session:
# Check if the invite exists
invite = session.query(Invite).filter(Invite.id == invite_id).first()
if isinstance(invite, Invite):
shout = session.query(Shout).filter(Shout.id == invite.shout_id).first()
if shout and shout.deleted_at is None and invite:
if invite.inviter_id == author_id or author_id == shout.created_by:
if invite.status == InviteStatus.PENDING.value:
# Delete the invite
session.delete(invite)
session.commit()
return {}
else:
return {"error": "Invalid invite or already accepted/rejected"}
else:
return {"error": "Author not found"}

resolvers/community.py Normal file

@@ -0,0 +1,97 @@
from orm.author import Author
from orm.community import Community, CommunityFollower
from services.db import local_session
from services.schema import mutation, query
@query.field("get_communities_all")
async def get_communities_all(_, _info):
return local_session().query(Community).all()
@query.field("get_community")
async def get_community(_, _info, slug: str):
q = local_session().query(Community).where(Community.slug == slug)
return q.first()
@query.field("get_communities_by_author")
async def get_communities_by_author(_, _info, slug="", user="", author_id=0):
with local_session() as session:
q = session.query(Community).join(CommunityFollower)
if slug:
author_id = session.query(Author).where(Author.slug == slug).first().id
q = q.where(CommunityFollower.author == author_id)
if user:
author_id = session.query(Author).where(Author.user == user).first().id
q = q.where(CommunityFollower.author == author_id)
if author_id:
q = q.where(CommunityFollower.author == author_id)
return q.all()
return []
@mutation.field("join_community")
async def join_community(_, info, slug: str):
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
with local_session() as session:
community = session.query(Community).where(Community.slug == slug).first()
if not community:
return {"ok": False, "error": "Community not found"}
session.add(CommunityFollower(community=community.id, author=author_id))
session.commit()
return {"ok": True}
@mutation.field("leave_community")
async def leave_community(_, info, slug: str):
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
with local_session() as session:
# CommunityFollower.community stores the community id, so resolve the slug first
community = session.query(Community).where(Community.slug == slug).first()
if not community:
return {"ok": False, "error": "Community not found"}
session.query(CommunityFollower).where(
CommunityFollower.author == author_id, CommunityFollower.community == community.id
).delete()
session.commit()
return {"ok": True}
@mutation.field("create_community")
async def create_community(_, info, community_data):
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
with local_session() as session:
session.add(Community(author=author_id, **community_data))
session.commit()
return {"ok": True}
@mutation.field("update_community")
async def update_community(_, info, community_data):
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
slug = community_data.get("slug")
if slug:
with local_session() as session:
try:
session.query(Community).where(Community.created_by == author_id, Community.slug == slug).update(
community_data
)
session.commit()
except Exception as e:
return {"ok": False, "error": str(e)}
return {"ok": True}
return {"ok": False, "error": "Please, set community slug in input"}
@mutation.field("delete_community")
async def delete_community(_, info, slug: str):
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
with local_session() as session:
try:
session.query(Community).where(Community.slug == slug, Community.created_by == author_id).delete()
session.commit()
return {"ok": True}
except Exception as e:
return {"ok": False, "error": str(e)}

@@ -1,179 +0,0 @@
from datetime import datetime, timezone
from sqlalchemy import and_
from sqlalchemy.orm import joinedload
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.orm import local_session
from base.resolvers import mutation
from orm.shout import Shout, ShoutAuthor, ShoutTopic
from orm.topic import Topic
from resolvers.zine.reactions import reactions_follow, reactions_unfollow
@mutation.field("createShout")
@login_required
async def create_shout(_, info, inp):
auth: AuthCredentials = info.context["request"].auth
with local_session() as session:
topics = session.query(Topic).filter(Topic.slug.in_(inp.get("topics", []))).all()
new_shout = Shout.create(
**{
"title": inp.get("title"),
"subtitle": inp.get("subtitle"),
"lead": inp.get("lead"),
"description": inp.get("description"),
"body": inp.get("body", ""),
"layout": inp.get("layout"),
"authors": inp.get("authors", []),
"slug": inp.get("slug"),
"mainTopic": inp.get("mainTopic"),
"visibility": "owner",
"createdBy": auth.user_id,
}
)
for topic in topics:
t = ShoutTopic.create(topic=topic.id, shout=new_shout.id)
session.add(t)
# NOTE: shout made by one first author
sa = ShoutAuthor.create(shout=new_shout.id, user=auth.user_id)
session.add(sa)
session.add(new_shout)
reactions_follow(auth.user_id, new_shout.id, True)
session.commit()
# TODO
# GitTask(inp, user.username, user.email, "new shout %s" % new_shout.slug)
if new_shout.slug is None:
new_shout.slug = f"draft-{new_shout.id}"
session.commit()
return {"shout": new_shout}
@mutation.field("updateShout")
@login_required
async def update_shout(_, info, shout_id, shout_input=None, publish=False): # noqa: C901
auth: AuthCredentials = info.context["request"].auth
with local_session() as session:
shout = (
session.query(Shout)
.options(
joinedload(Shout.authors),
joinedload(Shout.topics),
)
.filter(Shout.id == shout_id)
.first()
)
if not shout:
return {"error": "shout not found"}
if shout.createdBy != auth.user_id:
return {"error": "access denied"}
updated = False
if shout_input is not None:
topics_input = shout_input["topics"]
del shout_input["topics"]
new_topics_to_link = []
new_topics = [topic_input for topic_input in topics_input if topic_input["id"] < 0]
for new_topic in new_topics:
del new_topic["id"]
created_new_topic = Topic.create(**new_topic)
session.add(created_new_topic)
new_topics_to_link.append(created_new_topic)
if len(new_topics) > 0:
session.commit()
for new_topic_to_link in new_topics_to_link:
created_unlinked_topic = ShoutTopic.create(
shout=shout.id, topic=new_topic_to_link.id
)
session.add(created_unlinked_topic)
existing_topics_input = [
topic_input for topic_input in topics_input if topic_input.get("id", 0) > 0
]
existing_topic_to_link_ids = [
existing_topic_input["id"]
for existing_topic_input in existing_topics_input
if existing_topic_input["id"] not in [topic.id for topic in shout.topics]
]
for existing_topic_to_link_id in existing_topic_to_link_ids:
created_unlinked_topic = ShoutTopic.create(
shout=shout.id, topic=existing_topic_to_link_id
)
session.add(created_unlinked_topic)
topic_to_unlink_ids = [
topic.id
for topic in shout.topics
if topic.id not in [topic_input["id"] for topic_input in existing_topics_input]
]
shout_topics_to_remove = session.query(ShoutTopic).filter(
and_(ShoutTopic.shout == shout.id, ShoutTopic.topic.in_(topic_to_unlink_ids))
)
for shout_topic_to_remove in shout_topics_to_remove:
session.delete(shout_topic_to_remove)
shout_input["mainTopic"] = shout_input["mainTopic"]["slug"]
if shout_input["mainTopic"] == "":
del shout_input["mainTopic"]
shout.update(shout_input)
updated = True
if publish and shout.visibility == "owner":
shout.visibility = "community"
shout.publishedAt = datetime.now(tz=timezone.utc)
updated = True
if updated:
shout.updatedAt = datetime.now(tz=timezone.utc)
session.commit()
# GitTask(inp, user.username, user.email, "update shout %s" % slug)
return {"shout": shout}
@mutation.field("deleteShout")
@login_required
async def delete_shout(_, info, shout_id):
auth: AuthCredentials = info.context["request"].auth
with local_session() as session:
shout = session.query(Shout).filter(Shout.id == shout_id).first()
if not shout:
return {"error": "invalid shout id"}
if auth.user_id != shout.createdBy:
return {"error": "access denied"}
for author_id in shout.authors:
reactions_unfollow(author_id, shout_id)
shout.deletedAt = datetime.now(tz=timezone.utc)
session.commit()
return {}

resolvers/draft.py Normal file

@@ -0,0 +1,475 @@
import time
import trafilatura
from sqlalchemy.orm import joinedload
from cache.cache import (
cache_author,
cache_by_id,
cache_topic,
invalidate_shout_related_cache,
invalidate_shouts_cache,
)
from orm.author import Author
from orm.draft import Draft, DraftAuthor, DraftTopic
from orm.shout import Shout, ShoutAuthor, ShoutTopic
from orm.topic import Topic
from services.auth import login_required
from services.db import local_session
from services.notify import notify_shout
from services.schema import mutation, query
from services.search import search_service
from utils.html_wrapper import wrap_html_fragment
from utils.logger import root_logger as logger
def create_shout_from_draft(session, draft, author_id):
"""
Создаёт новый объект публикации (Shout) на основе черновика.
Args:
session: SQLAlchemy сессия (не используется, для совместимости)
draft (Draft): Объект черновика
author_id (int): ID автора публикации
Returns:
Shout: Новый объект публикации (не сохранённый в базе)
Пример:
>>> from orm.draft import Draft
>>> draft = Draft(id=1, title='Заголовок', body='Текст', slug='slug', created_by=1)
>>> shout = create_shout_from_draft(None, draft, 1)
>>> shout.title
'Заголовок'
>>> shout.body
'Текст'
>>> shout.created_by
1
"""
# Создаем новую публикацию
shout = Shout(
body=draft.body or "",
slug=draft.slug,
cover=draft.cover,
cover_caption=draft.cover_caption,
lead=draft.lead,
title=draft.title or "",
subtitle=draft.subtitle,
layout=draft.layout or "article",
media=draft.media or [],
lang=draft.lang or "ru",
seo=draft.seo,
created_by=author_id,
community=draft.community,
draft=draft.id,
deleted_at=None,
)
# Initialize empty arrays for relations
shout.topics = []
shout.authors = []
return shout
@query.field("load_drafts")
@login_required
async def load_drafts(_, info):
"""
Загружает все черновики, доступные текущему пользователю.
Предварительно загружает связанные объекты (topics, authors, publication),
чтобы избежать ошибок с отсоединенными объектами при сериализации.
Returns:
dict: Список черновиков или сообщение об ошибке
"""
user_id = info.context.get("user_id")
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
if not user_id or not author_id:
return {"error": "User ID and author ID are required"}
try:
with local_session() as session:
# Preload authors, topics and the linked publication
drafts_query = (
session.query(Draft)
.options(
joinedload(Draft.topics),
joinedload(Draft.authors),
joinedload(Draft.publication)  # preload the linked publication
)
.filter(Draft.authors.any(Author.id == author_id))
)
drafts = drafts_query.all()
# Convert objects to dicts while they are still bound to the session
drafts_data = []
for draft in drafts:
draft_dict = draft.dict()
# Always return an array for topics, even if it is empty
draft_dict["topics"] = [topic.dict() for topic in (draft.topics or [])]
draft_dict["authors"] = [author.dict() for author in (draft.authors or [])]
# Attach publication info, if present
if draft.publication:
draft_dict["publication"] = {
"id": draft.publication.id,
"slug": draft.publication.slug,
"published_at": draft.publication.published_at
}
else:
draft_dict["publication"] = None
drafts_data.append(draft_dict)
return {"drafts": drafts_data}
except Exception as e:
logger.error(f"Failed to load drafts: {e}", exc_info=True)
return {"error": f"Failed to load drafts: {str(e)}"}
@mutation.field("create_draft")
@login_required
async def create_draft(_, info, draft_input):
"""Create a new draft.
Args:
info: GraphQL context
draft_input (dict): Draft data including optional fields:
- title (str, required) - draft title
- body (str, required) - draft body text
- slug (str)
- etc.
Returns:
dict: Contains either:
- draft: The created draft object
- error: Error message if creation failed
Example:
>>> async def test_create():
... context = {'user_id': '123', 'author': {'id': 1}}
... info = type('Info', (), {'context': context})()
... result = await create_draft(None, info, {'title': 'Test'})
... assert result.get('error') is None
... assert result['draft'].title == 'Test'
... return result
"""
user_id = info.context.get("user_id")
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
if not user_id or not author_id:
return {"error": "Author ID is required"}
# Check required fields
if "body" not in draft_input or not draft_input["body"]:
draft_input["body"] = ""  # empty string instead of NULL
if "title" not in draft_input or not draft_input["title"]:
draft_input["title"] = ""  # empty string instead of NULL
# slug must be either non-empty or not passed at all
if "slug" in draft_input and (draft_input["slug"] is None or draft_input["slug"] == ""):
# drop an empty slug from the input when creating a draft
del draft_input["slug"]
try:
with local_session() as session:
# Remove id from input if present since it's auto-generated
if "id" in draft_input:
del draft_input["id"]
# Set the creation time and the author's ID
draft_input["created_at"] = int(time.time())
draft_input["created_by"] = author_id
draft = Draft(**draft_input)
session.add(draft)
session.flush()
# Add the creator as an author
da = DraftAuthor(shout=draft.id, author=author_id)
session.add(da)
session.commit()
return {"draft": draft}
except Exception as e:
logger.error(f"Failed to create draft: {e}", exc_info=True)
return {"error": f"Failed to create draft: {str(e)}"}
def generate_teaser(body, limit=300):
body_html = wrap_html_fragment(body)
body_text = trafilatura.extract(body_html, include_comments=False, include_tables=False)
if not body_text:
return ""  # trafilatura returns None for empty/invalid input
# keep only the sentences that fit entirely within the limit
return ". ".join(body_text[:limit].split(". ")[:-1])
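# Worked example (hedged: assumes trafilatura extracts the plain text as-is):
# for body "<p>First. Second. Third.</p>" and limit=12, body_text[:12] is
# "First. Secon"; splitting on ". " and dropping the last partial piece keeps
# only "First":
#
#     generate_teaser("<p>First. Second. Third.</p>", 12)  # -> "First"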
@mutation.field("update_draft")
@login_required
async def update_draft(_, info, draft_id: int, draft_input):
"""Обновляет черновик публикации.
Args:
draft_id: ID черновика для обновления
draft_input: Данные для обновления черновика согласно схеме DraftInput:
- layout: String
- author_ids: [Int!]
- topic_ids: [Int!]
- main_topic_id: Int
- media: [MediaItemInput]
- lead: String
- subtitle: String
- lang: String
- seo: String
- body: String
- title: String
- slug: String
- cover: String
- cover_caption: String
Returns:
dict: Обновленный черновик или сообщение об ошибке
"""
user_id = info.context.get("user_id")
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
if not user_id or not author_id:
return {"error": "Author ID are required"}
try:
with local_session() as session:
draft = session.query(Draft).filter(Draft.id == draft_id).first()
if not draft:
return {"error": "Draft not found"}
# Filter the input, keeping only the allowed fields
allowed_fields = {
"layout", "author_ids", "topic_ids", "main_topic_id",
"media", "lead", "subtitle", "lang", "seo", "body",
"title", "slug", "cover", "cover_caption"
}
filtered_input = {k: v for k, v in draft_input.items() if k in allowed_fields}
# Validate slug
if "slug" in filtered_input and not filtered_input["slug"]:
del filtered_input["slug"]
# Update author links if provided
if "author_ids" in filtered_input:
author_ids = filtered_input.pop("author_ids")
if author_ids:
# Clear the current links
session.query(DraftAuthor).filter(DraftAuthor.shout == draft_id).delete()
# Add the new links
for aid in author_ids:
da = DraftAuthor(shout=draft_id, author=aid)
session.add(da)
# Update topic links if provided
if "topic_ids" in filtered_input:
topic_ids = filtered_input.pop("topic_ids")
main_topic_id = filtered_input.pop("main_topic_id", None)
if topic_ids:
# Clear the current links
session.query(DraftTopic).filter(DraftTopic.shout == draft_id).delete()
# Add the new links
for tid in topic_ids:
dt = DraftTopic(
shout=draft_id,
topic=tid,
main=(tid == main_topic_id) if main_topic_id else False
)
session.add(dt)
# Generate SEO if not provided
if "seo" not in filtered_input and not draft.seo:
body_src = filtered_input.get("body", draft.body)
lead_src = filtered_input.get("lead", draft.lead)
body_html = wrap_html_fragment(body_src)
lead_html = wrap_html_fragment(lead_src)
try:
body_text = trafilatura.extract(body_html, include_comments=False, include_tables=False) if body_src else None
lead_text = trafilatura.extract(lead_html, include_comments=False, include_tables=False) if lead_src else None
body_teaser = generate_teaser(body_text, 300) if body_text else ""
filtered_input["seo"] = lead_text if lead_text else body_teaser
except Exception as e:
logger.warning(f"Failed to generate SEO for draft {draft_id}: {e}")
# Update the draft's main fields
for key, value in filtered_input.items():
setattr(draft, key, value)
# Update metadata
draft.updated_at = int(time.time())
draft.updated_by = author_id
session.commit()
# Convert the object to a dict for the response
draft_dict = draft.dict()
draft_dict["topics"] = [topic.dict() for topic in draft.topics]
draft_dict["authors"] = [author.dict() for author in draft.authors]
# Put the author object into updated_by
draft_dict["updated_by"] = author_dict
return {"draft": draft_dict}
except Exception as e:
logger.error(f"Failed to update draft: {e}", exc_info=True)
return {"error": f"Failed to update draft: {str(e)}"}
@mutation.field("delete_draft")
@login_required
async def delete_draft(_, info, draft_id: int):
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
with local_session() as session:
draft = session.query(Draft).filter(Draft.id == draft_id).first()
if not draft:
return {"error": "Draft not found"}
if author_id != draft.created_by and not any(a.id == author_id for a in draft.authors):
return {"error": "You are not allowed to delete this draft"}
session.delete(draft)
session.commit()
return {"draft": draft}
def validate_html_content(html_content: str) -> tuple[bool, str]:
"""
Проверяет валидность HTML контента через trafilatura.
Args:
html_content: HTML строка для проверки
Returns:
tuple[bool, str]: (валидность, сообщение об ошибке)
Example:
>>> is_valid, error = validate_html_content("<p>Valid HTML</p>")
>>> is_valid
True
>>> error
''
>>> is_valid, error = validate_html_content("Invalid < HTML")
>>> is_valid
False
>>> 'Invalid HTML' in error
True
"""
if not html_content or not html_content.strip():
return False, "Content is empty"
try:
html_content = wrap_html_fragment(html_content)
extracted = trafilatura.extract(html_content)
if not extracted:
return False, "Invalid HTML structure or empty content"
return True, ""
except Exception as e:
logger.error(f"HTML validation error: {e}", exc_info=True)
return False, f"Invalid HTML content: {str(e)}"
@mutation.field("publish_draft")
@login_required
async def publish_draft(_, info, draft_id: int):
"""
Публикует черновик, создавая новый Shout или обновляя существующий.
Args:
draft_id (int): ID черновика для публикации
Returns:
dict: Результат публикации с shout или сообщением об ошибке
"""
user_id = info.context.get("user_id")
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
if not user_id or not author_id:
return {"error": "Author ID is required"}
try:
with local_session() as session:
# Load the draft with all its relations
draft = (
session.query(Draft)
.options(
joinedload(Draft.topics),
joinedload(Draft.authors),
joinedload(Draft.publication)
)
.filter(Draft.id == draft_id)
.first()
)
if not draft:
return {"error": "Draft not found"}
# Validate the HTML in body
is_valid, error = validate_html_content(draft.body)
if not is_valid:
return {"error": f"Cannot publish draft: {error}"}
# Check whether this draft already has a publication
if draft.publication:
shout = draft.publication
# Update the existing publication
for field in ["body", "title", "subtitle", "lead", "cover", "cover_caption", "media", "lang", "seo"]:
if hasattr(draft, field):
setattr(shout, field, getattr(draft, field))
shout.updated_at = int(time.time())
shout.updated_by = author_id
else:
# Create a new publication
shout = create_shout_from_draft(session, draft, author_id)
now = int(time.time())
shout.created_at = now
shout.published_at = now
session.add(shout)
session.flush()  # obtain the new shout's ID
# Clear existing links
session.query(ShoutAuthor).filter(ShoutAuthor.shout == shout.id).delete()
session.query(ShoutTopic).filter(ShoutTopic.shout == shout.id).delete()
# Add authors
for author in (draft.authors or []):
sa = ShoutAuthor(shout=shout.id, author=author.id)
session.add(sa)
# Add topics
for topic in (draft.topics or []):
st = ShoutTopic(
topic=topic.id,
shout=shout.id,
main=topic.main if hasattr(topic, "main") else False
)
session.add(st)
session.commit()
# Invalidate caches; both helpers are coroutines (they are awaited in resolvers/editor.py)
await invalidate_shouts_cache()
await invalidate_shout_related_cache(shout, author_id)
# Notify about the publication
await notify_shout(shout.id)
# Update the search index
search_service.index_shout(shout)
logger.info(f"Successfully published shout #{shout.id} from draft #{draft_id}")
logger.debug(f"Shout data: {shout.dict()}")
return {"shout": shout}
except Exception as e:
logger.error(f"Failed to publish draft {draft_id}: {e}", exc_info=True)
return {"error": f"Failed to publish draft: {str(e)}"}

resolvers/editor.py Normal file

@@ -0,0 +1,826 @@
import time
import orjson
import trafilatura
from sqlalchemy import and_, desc, select
from sqlalchemy.orm import joinedload, selectinload
from sqlalchemy.sql.functions import coalesce
from cache.cache import (
cache_author,
cache_topic,
invalidate_shout_related_cache,
invalidate_shouts_cache,
)
from orm.author import Author
from orm.draft import Draft
from orm.shout import Shout, ShoutAuthor, ShoutTopic
from orm.topic import Topic
from resolvers.follower import follow, unfollow
from resolvers.stat import get_with_stat
from services.auth import login_required
from services.db import local_session
from services.notify import notify_shout
from services.schema import mutation, query
from services.search import search_service
from utils.html_wrapper import wrap_html_fragment
from utils.logger import root_logger as logger
async def cache_by_id(entity, entity_id: int, cache_method):
"""Cache an entity by its ID using the provided cache method.
Args:
entity: The SQLAlchemy model class to query
entity_id (int): The ID of the entity to cache
cache_method: The caching function to use
Returns:
dict: The cached entity data if successful, None if entity not found
Example:
>>> async def test_cache():
... author = await cache_by_id(Author, 1, cache_author)
... assert author['id'] == 1
... assert 'name' in author
... return author
"""
caching_query = select(entity).filter(entity.id == entity_id)
result = get_with_stat(caching_query)
if not result or not result[0]:
logger.warning(f"{entity.__name__} with id {entity_id} not found")
return
x = result[0]
d = x.dict() # convert object to dictionary
await cache_method(d)  # cache_author / cache_topic are coroutines elsewhere in this codebase
return d
@query.field("get_my_shout")
@login_required
async def get_my_shout(_, info, shout_id: int):
"""Get a shout by ID if the requesting user has permission to view it.
DEPRECATED: use `load_drafts` instead
Args:
info: GraphQL resolver info containing context
shout_id (int): ID of the shout to retrieve
Returns:
dict: Contains either:
- error (str): Error message if retrieval failed
- shout (Shout): The requested shout if found and accessible
Permissions:
User must be:
- The shout creator
- Listed as an author
- Have editor role
Example:
>>> async def test_get_my_shout():
... context = {'user_id': '123', 'author': {'id': 1}, 'roles': []}
... info = type('Info', (), {'context': context})()
... result = await get_my_shout(None, info, 1)
... assert result['error'] is None
... assert result['shout'].id == 1
... return result
"""
user_id = info.context.get("user_id", "")
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
roles = info.context.get("roles", [])
shout = None
if not user_id or not author_id:
return {"error": "unauthorized", "shout": None}
with local_session() as session:
shout = (
session.query(Shout)
.filter(Shout.id == shout_id)
.options(joinedload(Shout.authors), joinedload(Shout.topics))
.filter(Shout.deleted_at.is_(None))
.first()
)
if not shout:
return {"error": "no shout found", "shout": None}
# Convert the media JSON into a list of MediaItem objects
if hasattr(shout, "media") and shout.media:
if isinstance(shout.media, str):
try:
shout.media = orjson.loads(shout.media)
except Exception as e:
logger.error(f"Error parsing shout media: {e}")
shout.media = []
if not isinstance(shout.media, list):
shout.media = [shout.media] if shout.media else []
else:
shout.media = []
logger.debug(f"got {len(shout.authors)} shout authors, created by {shout.created_by}")
is_editor = "editor" in roles
logger.debug(f"viewer is{'' if is_editor else ' not'} editor")
is_creator = author_id == shout.created_by
logger.debug(f"viewer is{'' if is_creator else ' not'} creator")
is_author = bool(list(filter(lambda x: x.id == int(author_id), [x for x in shout.authors])))
logger.debug(f"viewer is{'' if is_creator else ' not'} author")
can_edit = is_editor or is_author or is_creator
if not can_edit:
return {"error": "forbidden", "shout": None}
logger.debug("got shout editor with data")
return {"error": None, "shout": shout}
@query.field("get_shouts_drafts")
@login_required
async def get_shouts_drafts(_, info):
# user_id = info.context.get("user_id")
author_dict = info.context.get("author")
if not author_dict:
return {"error": "author profile was not found"}
author_id = author_dict.get("id")
shouts = []
with local_session() as session:
if author_id:
q = (
select(Shout)
.options(joinedload(Shout.authors), joinedload(Shout.topics))
.filter(and_(Shout.deleted_at.is_(None), Shout.created_by == int(author_id)))
.filter(Shout.published_at.is_(None))
.order_by(desc(coalesce(Shout.updated_at, Shout.created_at)))
.group_by(Shout.id)
)
shouts = [shout for [shout] in session.execute(q).unique()]
return {"shouts": shouts}
# @mutation.field("create_shout")
# @login_required
async def create_shout(_, info, inp):
logger.info(f"Starting create_shout with input: {inp}")
user_id = info.context.get("user_id")
author_dict = info.context.get("author")
logger.debug(f"Context user_id: {user_id}, author: {author_dict}")
if not author_dict:
logger.error("Author profile not found in context")
return {"error": "author profile was not found"}
author_id = author_dict.get("id")
if user_id and author_id:
try:
with local_session() as session:
author_id = int(author_id)
current_time = int(time.time())
slug = inp.get("slug") or f"draft-{current_time}"
logger.info(f"Creating shout with input: {inp}")
# Create the publication without topics
body = inp.get("body", "")
lead = inp.get("lead", "")
body_html = wrap_html_fragment(body)
lead_html = wrap_html_fragment(lead)
body_text = trafilatura.extract(body_html) or ""
lead_text = trafilatura.extract(lead_html) or ""
# default SEO: the lead text, else the first whole sentences of the body (~300 chars)
seo = inp.get("seo") or lead_text.strip() or ". ".join(body_text.strip()[:300].split(". ")[:-1])
new_shout = Shout(
slug=slug,
body=body,
seo=seo,
lead=lead,
layout=inp.get("layout", "article"),
title=inp.get("title", ""),
created_by=author_id,
created_at=current_time,
community=1,
)
# Ensure the slug is unique
logger.debug(f"Checking for existing slug: {slug}")
same_slug_shout = session.query(Shout).filter(Shout.slug == new_shout.slug).first()
c = 1
while same_slug_shout is not None:
logger.debug(f"Found duplicate slug, trying iteration {c}")
new_shout.slug = f"{slug}-{c}"
same_slug_shout = session.query(Shout).filter(Shout.slug == new_shout.slug).first()
c += 1
try:
logger.info("Creating new shout object")
session.add(new_shout)
session.commit()
logger.info(f"Created shout with ID: {new_shout.id}")
except Exception as e:
logger.error(f"Error creating shout object: {e}", exc_info=True)
return {"error": f"Database error: {str(e)}"}
# Link the author
try:
logger.debug(f"Linking author {author_id} to shout {new_shout.id}")
sa = ShoutAuthor(shout=new_shout.id, author=author_id)
session.add(sa)
except Exception as e:
logger.error(f"Error linking author: {e}", exc_info=True)
return {"error": f"Error linking author: {str(e)}"}
# Link the topics
input_topics = inp.get("topics", [])
if input_topics:
try:
logger.debug(f"Linking topics: {[t.slug for t in input_topics]}")
main_topic = inp.get("main_topic")
for topic in input_topics:
st = ShoutTopic(
topic=topic.id,
shout=new_shout.id,
main=(topic.slug == main_topic) if main_topic else False,
)
session.add(st)
logger.debug(f"Added topic {topic.slug} {'(main)' if st.main else ''}")
except Exception as e:
logger.error(f"Error linking topics: {e}", exc_info=True)
return {"error": f"Error linking topics: {str(e)}"}
try:
session.commit()
logger.info("Final commit successful")
except Exception as e:
logger.error(f"Error in final commit: {e}", exc_info=True)
return {"error": f"Error in final commit: {str(e)}"}
# Fetch the created publication
shout = session.query(Shout).filter(Shout.id == new_shout.id).first()
# Subscribe the author
try:
logger.debug("Following created shout")
await follow(None, info, "shout", shout.slug)
except Exception as e:
logger.warning(f"Error following shout: {e}", exc_info=True)
logger.info(f"Successfully created shout {shout.id}")
return {"shout": shout}
except Exception as e:
logger.error(f"Unexpected error in create_shout: {e}", exc_info=True)
return {"error": f"Unexpected error: {str(e)}"}
error_msg = "cant create shout" if user_id else "unauthorized"
logger.error(f"Create shout failed: {error_msg}")
return {"error": error_msg}
def patch_main_topic(session, main_topic_slug, shout):
"""Update the main topic for a shout."""
logger.info(f"Starting patch_main_topic for shout#{shout.id} with slug '{main_topic_slug}'")
logger.debug(f"Current shout topics: {[(t.topic.slug, t.main) for t in shout.topics]}")
with session.begin():
# Fetch the current main topic
old_main = (
session.query(ShoutTopic).filter(and_(ShoutTopic.shout == shout.id, ShoutTopic.main.is_(True))).first()
)
if old_main:
logger.info(f"Found current main topic: {old_main.topic.slug}")
else:
logger.info("No current main topic found")
# Find the new main topic
main_topic = session.query(Topic).filter(Topic.slug == main_topic_slug).first()
if not main_topic:
logger.error(f"Main topic with slug '{main_topic_slug}' not found")
return
logger.info(f"Found new main topic: {main_topic.slug} (id={main_topic.id})")
# Find the link to the new main topic
new_main = (
session.query(ShoutTopic)
.filter(and_(ShoutTopic.shout == shout.id, ShoutTopic.topic == main_topic.id))
.first()
)
logger.debug(f"Found new main topic relation: {new_main is not None}")
if old_main and new_main and old_main is not new_main:
logger.info(f"Updating main topic flags: {old_main.topic.slug} -> {new_main.topic.slug}")
old_main.main = False
session.add(old_main)
new_main.main = True
session.add(new_main)
session.flush()
logger.info(f"Main topic updated for shout#{shout.id}")
else:
logger.warning(f"No changes needed for main topic (old={old_main is not None}, new={new_main is not None})")
def patch_topics(session, shout, topics_input):
"""Update the topics associated with a shout.
Args:
session: SQLAlchemy session
shout (Shout): The shout to update
topics_input (list): List of topic dicts with fields:
- id (int): Topic ID (<0 for new topics)
- slug (str): Topic slug
- title (str): Topic title (for new topics)
Side Effects:
- Creates new topics if needed
- Updates shout-topic associations
- Refreshes shout object with new topics
Example:
>>> def test_patch_topics():
... topics = [
... {'id': -1, 'slug': 'new-topic', 'title': 'New Topic'},
... {'id': 1, 'slug': 'existing-topic'}
... ]
... with local_session() as session:
... shout = session.query(Shout).first()
... patch_topics(session, shout, topics)
... assert len(shout.topics) == 2
... assert any(t.slug == 'new-topic' for t in shout.topics)
... return shout.topics
"""
logger.info(f"Starting patch_topics for shout#{shout.id}")
logger.info(f"Received topics_input: {topics_input}")
# Create new topics, if any
new_topics_to_link = [Topic(**new_topic) for new_topic in topics_input if new_topic["id"] < 0]
if new_topics_to_link:
logger.info(f"Creating new topics: {[t.dict() for t in new_topics_to_link]}")
session.add_all(new_topics_to_link)
session.flush()
# Fetch the current links
current_links = session.query(ShoutTopic).filter(ShoutTopic.shout == shout.id).all()
logger.info(f"Current topic links: {[{t.topic: t.main} for t in current_links]}")
# Remove the old links
if current_links:
logger.info(f"Removing old topic links for shout#{shout.id}")
for link in current_links:
session.delete(link)
session.flush()
# Create the new links
for topic_input in topics_input:
topic_id = topic_input["id"]
if topic_id < 0:
topic = next(t for t in new_topics_to_link if t.slug == topic_input["slug"])
topic_id = topic.id
logger.info(f"Creating new topic link: shout#{shout.id} -> topic#{topic_id}")
new_link = ShoutTopic(shout=shout.id, topic=topic_id, main=False)
session.add(new_link)
session.flush()
# Refresh the relations on the shout object
session.refresh(shout)
logger.info(f"Successfully updated topics for shout#{shout.id}")
logger.info(f"Final shout topics: {[t.dict() for t in shout.topics]}")
# @mutation.field("update_shout")
# @login_required
async def update_shout(_, info, shout_id: int, shout_input=None, publish=False):
logger.info(f"Starting update_shout with id={shout_id}, publish={publish}")
logger.debug(f"Full shout_input: {shout_input}") # DraftInput
user_id = info.context.get("user_id")
roles = info.context.get("roles", [])
author_dict = info.context.get("author")
if not author_dict:
logger.error("Author profile not found")
return {"error": "author profile was not found"}
author_id = author_dict.get("id")
shout_input = shout_input or {}
current_time = int(time.time())
shout_id = shout_id or shout_input.get("id", shout_id)
slug = shout_input.get("slug")
if not user_id:
logger.error("Unauthorized update attempt")
return {"error": "unauthorized"}
try:
with local_session() as session:
if author_id:
logger.info(f"Processing update for shout#{shout_id} by author #{author_id}")
shout_by_id = (
session.query(Shout)
.options(joinedload(Shout.topics).joinedload(ShoutTopic.topic), joinedload(Shout.authors))
.filter(Shout.id == shout_id)
.first()
)
if not shout_by_id:
logger.error(f"shout#{shout_id} not found")
return {"error": "shout not found"}
logger.info(f"Found shout#{shout_id}")
# Log the current topics
current_topics = (
[{"id": t.id, "slug": t.slug, "title": t.title} for t in shout_by_id.topics]
if shout_by_id.topics
else []
)
logger.info(f"Current topics for shout#{shout_id}: {current_topics}")
if slug != shout_by_id.slug:
same_slug_shout = session.query(Shout).filter(Shout.slug == slug).first()
c = 1
while same_slug_shout is not None:
c += 1
slug = f"{slug}-{c}"
same_slug_shout = session.query(Shout).filter(Shout.slug == slug).first()
shout_input["slug"] = slug
logger.info(f"shout#{shout_id} slug patched")
if any(x.id == author_id for x in shout_by_id.authors) or "editor" in roles:
logger.info(f"Author #{author_id} has permission to edit shout#{shout_id}")
# topics patch
topics_input = shout_input.get("topics")
if topics_input:
logger.info(f"Received topics_input for shout#{shout_id}: {topics_input}")
try:
patch_topics(session, shout_by_id, topics_input)
logger.info(f"Successfully patched topics for shout#{shout_id}")
# Refresh session relations after patch_topics
session.refresh(shout_by_id)
except Exception as e:
logger.error(f"Error patching topics: {e}", exc_info=True)
return {"error": f"Failed to update topics: {str(e)}"}
del shout_input["topics"]
for tpc in topics_input:
await cache_by_id(Topic, tpc["id"], cache_topic)
else:
logger.warning(f"No topics_input received for shout#{shout_id}")
# main topic
main_topic = shout_input.get("main_topic")
if main_topic:
logger.info(f"Updating main topic for shout#{shout_id} to {main_topic}")
patch_main_topic(session, main_topic, shout_by_id)
shout_input["updated_at"] = current_time
if publish:
logger.info(f"Publishing shout#{shout_id}")
shout_input["published_at"] = current_time
# Ensure the author link exists
logger.info(f"Checking author link for shout#{shout_id} and author#{author_id}")
author_link = (
session.query(ShoutAuthor)
.filter(and_(ShoutAuthor.shout == shout_id, ShoutAuthor.author == author_id))
.first()
)
if not author_link:
logger.info(f"Adding missing author link for shout#{shout_id}")
sa = ShoutAuthor(shout=shout_id, author=author_id)
session.add(sa)
session.flush()
logger.info("Author link added successfully")
else:
logger.info("Author link already exists")
# Log the final state before saving
logger.info(f"Final shout_input for update: {shout_input}")
Shout.update(shout_by_id, shout_input)
session.add(shout_by_id)
try:
session.commit()
# Refresh the object after commit to pick up all relations
session.refresh(shout_by_id)
logger.info(f"Successfully committed updates for shout#{shout_id}")
except Exception as e:
logger.error(f"Commit failed: {e}", exc_info=True)
return {"error": f"Failed to save changes: {str(e)}"}
# Verify the topics after the update
updated_topics = (
[{"id": t.id, "slug": t.slug, "title": t.title} for t in shout_by_id.topics]
if shout_by_id.topics
else []
)
logger.info(f"Updated topics for shout#{shout_id}: {updated_topics}")
# Invalidate the cache after the shout update
try:
logger.info("Invalidating cache after shout update")
cache_keys = [
"feed",  # the feed
f"author_{author_id}",  # the author's publications
"random_top",  # random top shouts
"unrated",  # unrated shouts
]
# Add keys for the publication's topics
for topic in shout_by_id.topics:
cache_keys.append(f"topic_{topic.id}")
cache_keys.append(f"topic_shouts_{topic.id}")
await invalidate_shouts_cache(cache_keys)
await invalidate_shout_related_cache(shout_by_id, author_id)
# Refresh topic and author caches
for topic in shout_by_id.topics:
await cache_by_id(Topic, topic.id, cache_topic)
for author in shout_by_id.authors:
await cache_author(author.dict())
logger.info("Cache invalidated successfully")
except Exception as cache_error:
logger.warning(f"Cache invalidation error: {cache_error}", exc_info=True)
if not publish:
await notify_shout(shout_by_id.dict(), "update")
else:
await notify_shout(shout_by_id.dict(), "published")
# search service indexing
search_service.index(shout_by_id)
for a in shout_by_id.authors:
await cache_by_id(Author, a.id, cache_author)
logger.info(f"shout#{shout_id} updated")
# Fetch the full shout data with relations
shout_with_relations = (
session.query(Shout)
.options(joinedload(Shout.topics).joinedload(ShoutTopic.topic), joinedload(Shout.authors))
.filter(Shout.id == shout_id)
.first()
)
# Build a dict with the base fields
shout_dict = shout_with_relations.dict()
# Explicitly attach related data
shout_dict["topics"] = (
[
{"id": topic.id, "slug": topic.slug, "title": topic.title}
for topic in shout_with_relations.topics
]
if shout_with_relations.topics
else []
)
# Add main_topic to the shout dictionary
shout_dict["main_topic"] = get_main_topic(shout_with_relations.topics)
shout_dict["authors"] = (
[
{"id": author.id, "name": author.name, "slug": author.slug}
for author in shout_with_relations.authors
]
if shout_with_relations.authors
else []
)
logger.info(f"Final shout data with relations: {shout_dict}")
logger.debug(
f"Loaded topics details: {[(t.topic.slug if t.topic else 'no-topic', t.main) for t in shout_with_relations.topics]}"
)
return {"shout": shout_dict, "error": None}
else:
logger.warning(f"Access denied: author #{author_id} cannot edit shout#{shout_id}")
return {"error": "access denied", "shout": None}
except Exception as exc:
logger.error(f"Unexpected error in update_shout: {exc}", exc_info=True)
logger.error(f"Failed input data: {shout_input}")
return {"error": "cant update shout"}
return {"error": "cant update shout"}
# @mutation.field("delete_shout")
# @login_required
async def delete_shout(_, info, shout_id: int):
user_id = info.context.get("user_id")
roles = info.context.get("roles", [])
author_dict = info.context.get("author")
if not author_dict:
return {"error": "author profile was not found"}
author_id = author_dict.get("id")
if user_id and author_id:
author_id = int(author_id)
with local_session() as session:
shout = session.query(Shout).filter(Shout.id == shout_id).first()
if not isinstance(shout, Shout):
return {"error": "invalid shout id"}
shout_dict = shout.dict()
# NOTE: only owner and editor can mark the shout as deleted
if shout_dict["created_by"] == author_id or "editor" in roles:
shout_dict["deleted_at"] = int(time.time())
Shout.update(shout, shout_dict)
session.add(shout)
session.commit()
for author in shout.authors:
await cache_by_id(Author, author.id, cache_author)
info.context["author"] = author.dict()
info.context["user_id"] = author.user
unfollow(None, info, "shout", shout.slug)
for topic in shout.topics:
await cache_by_id(Topic, topic.id, cache_topic)
await notify_shout(shout_dict, "delete")
return {"error": None}
else:
return {"error": "access denied"}
def get_main_topic(topics):
"""Get the main topic from a list of ShoutTopic objects."""
logger.info(f"Starting get_main_topic with {len(topics) if topics else 0} topics")
logger.debug(
f"Topics data: {[(t.slug, getattr(t, 'main', False)) for t in topics] if topics else []}"
)
if not topics:
logger.warning("No topics provided to get_main_topic")
return {"id": 0, "title": "no topic", "slug": "notopic", "is_main": True}
# Check whether topics is a list of ShoutTopic link rows or of Topic rows
if hasattr(topics[0], 'topic') and topics[0].topic:
# ShoutTopic objects (the old format)
# Find first main topic in original order
main_topic_rel = next((st for st in topics if getattr(st, 'main', False)), None)
logger.debug(
f"Found main topic relation: {main_topic_rel.topic.slug if main_topic_rel and main_topic_rel.topic else None}"
)
if main_topic_rel and main_topic_rel.topic:
result = {
"slug": main_topic_rel.topic.slug,
"title": main_topic_rel.topic.title,
"id": main_topic_rel.topic.id,
"is_main": True,
}
logger.info(f"Returning main topic: {result}")
return result
# If no main found but topics exist, return first
if topics and topics[0].topic:
logger.info(f"No main topic found, using first topic: {topics[0].topic.slug}")
result = {
"slug": topics[0].topic.slug,
"title": topics[0].topic.title,
"id": topics[0].topic.id,
"is_main": True,
}
return result
else:
# Topic objects (the new format produced by selectinload)
# After switching to selectinload we simply get a list of Topic objects
if topics:
logger.info(f"Using first topic as main: {topics[0].slug}")
result = {
"slug": topics[0].slug,
"title": topics[0].title,
"id": topics[0].id,
"is_main": True,
}
return result
logger.warning("No valid topics found, returning default")
return {"slug": "notopic", "title": "no topic", "id": 0, "is_main": True}
@mutation.field("unpublish_shout")
@login_required
async def unpublish_shout(_, info, shout_id: int):
"""Снимает публикацию (shout) с публикации.
Предзагружает связанный черновик (draft) и его авторов/темы, чтобы избежать
ошибок при последующем доступе к ним в GraphQL.
Args:
shout_id: ID публикации для снятия с публикации
Returns:
dict: Снятая с публикации публикация или сообщение об ошибке
"""
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
if not author_id:
# ideally this should also verify that the author may unpublish
return {"error": "Author ID is required"}
shout = None
with local_session() as session:
try:
# Load the Shout with all relations to build a proper response
shout = (
session.query(Shout)
.options(
joinedload(Shout.authors),
selectinload(Shout.topics)
)
.filter(Shout.id == shout_id)
.first()
)
if not shout:
logger.warning(f"Shout not found for unpublish: ID {shout_id}")
return {"error": "Shout not found"}
# If the publication has a linked draft, load it with its relationships
if shout.draft:
# Load the draft separately with its relations
draft = (
session.query(Draft)
.options(
selectinload(Draft.authors),
selectinload(Draft.topics)
)
.filter(Draft.id == shout.draft)
.first()
)
# Attach the draft to the publication manually for API access
if draft:
shout.draft_obj = draft
# TODO: add an access check here if needed
# if author_id not in [a.id for a in shout.authors]:  # requires selectinload(Shout.authors) above
# logger.warning(f"Author {author_id} denied unpublishing shout {shout_id}")
# return {"error": "Access denied"}
# Remember the old slug and id for the publication field
shout_slug = shout.slug
shout_id_for_publication = shout.id
# Unpublish: set published_at to None
shout.published_at = None
session.commit()
# Build a complete dict for the response
shout_dict = shout.dict()
# Attach related data
shout_dict["topics"] = (
[
{"id": topic.id, "slug": topic.slug, "title": topic.title}
for topic in shout.topics
]
if shout.topics
else []
)
# Attach main_topic
shout_dict["main_topic"] = get_main_topic(shout.topics)
# Attach the authors
shout_dict["authors"] = (
[
{"id": author.id, "name": author.name, "slug": author.slug}
for author in shout.authors
]
if shout.authors
else []
)
# Important: update the publication field to reflect the unpublished state
shout_dict["publication"] = {
"id": shout_id_for_publication,
"slug": shout_slug,
"published_at": None  # the key change: published_at is set to None
}
# Cache invalidation
try:
cache_keys = [
"feed",  # the feed
f"author_{author_id}",  # the author's publications
"random_top",  # random top shouts
"unrated",  # unrated shouts
]
await invalidate_shout_related_cache(shout, author_id)
await invalidate_shouts_cache(cache_keys)
logger.info(f"Cache invalidated after unpublishing shout {shout_id}")
except Exception as cache_err:
logger.error(f"Failed to invalidate cache for unpublish shout {shout_id}: {cache_err}")
except Exception as e:
session.rollback()
logger.error(f"Failed to unpublish shout {shout_id}: {e}", exc_info=True)
return {"error": f"Failed to unpublish shout: {str(e)}"}
# Return the assembled dict instead of the ORM object
logger.info(f"Shout {shout_id} unpublished successfully by author {author_id}")
return {"shout": shout_dict}

resolvers/feed.py Normal file

@@ -0,0 +1,198 @@
from typing import List
from sqlalchemy import and_, select
from orm.author import Author, AuthorFollower
from orm.shout import Shout, ShoutAuthor, ShoutReactionsFollower, ShoutTopic
from orm.topic import Topic, TopicFollower
from resolvers.reader import (
apply_options,
get_shouts_with_links,
has_field,
query_with_stat,
)
from services.auth import login_required
from services.db import local_session
from services.schema import query
from utils.logger import root_logger as logger
@query.field("load_shouts_coauthored")
@login_required
async def load_shouts_coauthored(_, info, options):
"""
Загрузка публикаций, написанных в соавторстве с пользователем.
:param info: Информаци о контексте GraphQL.
:param options: Опции фильтрации и сортировки.
:return: Список публикаций в соавтостве.
"""
author_id = info.context.get("author", {}).get("id")
if not author_id:
return []
q = query_with_stat(info)
q = q.filter(Shout.authors.any(id=author_id))
q, limit, offset = apply_options(q, options)
return get_shouts_with_links(info, q, limit, offset=offset)
@query.field("load_shouts_discussed")
@login_required
async def load_shouts_discussed(_, info, options):
"""
Загрузка публикаций, которые обсуждались пользователем.
:param info: Информация о контексте GraphQL.
:param options: Опции фильтрации и сортировки.
:return: Список публикаций, обсужденых пользователем.
"""
author_id = info.context.get("author", {}).get("id")
if not author_id:
return []
q = query_with_stat(info)
options["filters"]["commented"] = True
q, limit, offset = apply_options(q, options, author_id)
return get_shouts_with_links(info, q, limit, offset=offset)
def shouts_by_follower(info, follower_id: int, options):
"""
Load publications the author follows:
- by authors
- by topics
- by reactions
:param info: GraphQL context info.
:param follower_id: Author ID.
:param options: Filtering and sorting options.
:return: List of publications.
"""
q = query_with_stat(info)
reader_followed_authors = select(AuthorFollower.author).where(AuthorFollower.follower == follower_id)
reader_followed_topics = select(TopicFollower.topic).where(TopicFollower.follower == follower_id)
reader_followed_shouts = select(ShoutReactionsFollower.shout).where(ShoutReactionsFollower.follower == follower_id)
followed_subquery = (
select(Shout.id)
.join(ShoutAuthor, ShoutAuthor.shout == Shout.id)
.join(ShoutTopic, ShoutTopic.shout == Shout.id)
.where(
ShoutAuthor.author.in_(reader_followed_authors)
| ShoutTopic.topic.in_(reader_followed_topics)
| Shout.id.in_(reader_followed_shouts)
)
.scalar_subquery()
)
q = q.filter(Shout.id.in_(followed_subquery))
q, limit, offset = apply_options(q, options)
shouts = get_shouts_with_links(info, q, limit, offset=offset)
return shouts
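The followed_subquery above collects shout ids from three follow sources and feeds them back through Shout.id.in_(...). Below is a self-contained sketch of that in_(scalar subquery) pattern, runnable against in-memory SQLite; the table and column names are illustrative, not the project's schema.

from sqlalchemy import Column, Integer, create_engine, select
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Item(Base):
    __tablename__ = "item"
    id = Column(Integer, primary_key=True)
    owner = Column(Integer)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([Item(owner=1), Item(owner=2)])
    session.commit()
    # the scalar subquery selects the ids we care about...
    owned = select(Item.id).where(Item.owner == 1).scalar_subquery()
    # ...and the outer query filters on membership in it
    rows = session.execute(select(Item).where(Item.id.in_(owned))).scalars().all()
    print([i.id for i in rows])  # -> [1]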
@query.field("load_shouts_followed_by")
async def load_shouts_followed_by(_, info, slug: str, options) -> List[Shout]:
"""
Load publications followed by the author identified by slug.
:param info: GraphQL context info.
:param slug: Author slug.
:param options: Filtering and sorting options.
:return: List of publications.
"""
with local_session() as session:
author = session.query(Author).filter(Author.slug == slug).first()
if author:
follower_id = author.dict()["id"]
shouts = shouts_by_follower(info, follower_id, options)
return shouts
return []
@query.field("load_shouts_feed")
@login_required
async def load_shouts_feed(_, info, options) -> List[Shout]:
"""
Load publications the authenticated user follows.
:param info: GraphQL context info.
:param options: Filtering and sorting options.
:return: List of publications.
"""
author_id = info.context.get("author", {}).get("id")
return shouts_by_follower(info, author_id, options) if author_id else []
@query.field("load_shouts_authored_by")
async def load_shouts_authored_by(_, info, slug: str, options) -> List[Shout]:
"""
Load publications written by the author identified by slug.
:param info: GraphQL context info.
:param slug: Author slug.
:param options: Filtering and sorting options.
:return: List of publications.
"""
with local_session() as session:
author = session.query(Author).filter(Author.slug == slug).first()
if author:
try:
author_id: int = author.dict()["id"]
q = (
query_with_stat(info)
if has_field(info, "stat")
else select(Shout).filter(and_(Shout.published_at.is_not(None), Shout.deleted_at.is_(None)))
)
q = q.filter(Shout.authors.any(id=author_id))
q, limit, offset = apply_options(q, options, author_id)
shouts = get_shouts_with_links(info, q, limit, offset=offset)
return shouts
except Exception as error:
logger.debug(error)
return []
@query.field("load_shouts_with_topic")
async def load_shouts_with_topic(_, info, slug: str, options) -> List[Shout]:
"""
Load publications related to the topic identified by slug.
:param info: GraphQL context info.
:param slug: Topic slug.
:param options: Filtering and sorting options.
:return: List of publications.
"""
with local_session() as session:
topic = session.query(Topic).filter(Topic.slug == slug).first()
if topic:
try:
topic_id: int = topic.dict()["id"]
q = (
query_with_stat(info)
if has_field(info, "stat")
else select(Shout).filter(and_(Shout.published_at.is_not(None), Shout.deleted_at.is_(None)))
)
q = q.filter(Shout.topics.any(id=topic_id))
q, limit, offset = apply_options(q, options)
shouts = get_shouts_with_links(info, q, limit, offset=offset)
return shouts
except Exception as error:
logger.debug(error)
return []
def apply_filters(q, filters):
"""
Apply filters to the query.
"""
logger.info(f"Applying filters: {filters}")
if filters.get("published"):
q = q.filter(Shout.published_at.is_not(None))
logger.info("Added published filter")
if filters.get("topic"):
topic_slug = filters["topic"]
q = q.join(ShoutTopic).join(Topic).filter(Topic.slug == topic_slug)
logger.info(f"Added topic filter: {topic_slug}")
return q
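Usage sketch for apply_filters: start from a base select and narrow it down. "culture" is a hypothetical topic slug used only for illustration.

from sqlalchemy import select

q = select(Shout)
q = apply_filters(q, {"published": True, "topic": "culture"})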

222
resolvers/follower.py Normal file

@@ -0,0 +1,222 @@
from typing import List
from graphql import GraphQLError
from sqlalchemy import select
from sqlalchemy.sql import and_
from cache.cache import (
cache_author,
cache_topic,
get_cached_follower_authors,
get_cached_follower_topics,
)
from orm.author import Author, AuthorFollower
from orm.community import Community, CommunityFollower
from orm.reaction import Reaction
from orm.shout import Shout, ShoutReactionsFollower
from orm.topic import Topic, TopicFollower
from resolvers.stat import get_with_stat
from services.auth import login_required
from services.db import local_session
from services.notify import notify_follower
from services.schema import mutation, query
from utils.logger import root_logger as logger
@mutation.field("follow")
@login_required
async def follow(_, info, what, slug="", entity_id=0):
logger.debug("Начало выполнения функции 'follow'")
user_id = info.context.get("user_id")
follower_dict = info.context.get("author")
logger.debug(f"follower: {follower_dict}")
if not user_id or not follower_dict:
return GraphQLError("unauthorized")
follower_id = follower_dict.get("id")
logger.debug(f"follower_id: {follower_id}")
entity_classes = {
"AUTHOR": (Author, AuthorFollower, get_cached_follower_authors, cache_author),
"TOPIC": (Topic, TopicFollower, get_cached_follower_topics, cache_topic),
"COMMUNITY": (Community, CommunityFollower, None, None), # Нет методов кэша для сообщества
"SHOUT": (Shout, ShoutReactionsFollower, None, None), # Нет методов кэша для shout
}
if what not in entity_classes:
logger.error(f"Неверный тип для следования: {what}")
return {"error": "invalid follow type"}
entity_class, follower_class, get_cached_follows_method, cache_method = entity_classes[what]
entity_type = what.lower()
entity_dict = None
try:
logger.debug("Попытка получить сущность из базы данных")
with local_session() as session:
entity_query = select(entity_class).filter(entity_class.slug == slug)
entities = get_with_stat(entity_query)
entity = entities[0] if entities else None
if not entity:
logger.warning(f"{what.lower()} not found by slug: {slug}")
return {"error": f"{what.lower()} not found"}
if not entity_id and entity:
entity_id = entity.id
entity_dict = entity.dict()
logger.debug(f"entity_id: {entity_id}, entity_dict: {entity_dict}")
if entity_id:
logger.debug("Проверка существующей подписки")
with local_session() as session:
existing_sub = (
session.query(follower_class)
.filter(follower_class.follower == follower_id, getattr(follower_class, entity_type) == entity_id)
.first()
)
if existing_sub:
logger.info(f"Пользователь {follower_id} уже подписан на {what.lower()} с ID {entity_id}")
else:
logger.debug("Добавление новой записи в базу данных")
sub = follower_class(follower=follower_id, **{entity_type: entity_id})
logger.debug(f"Создан объект подписки: {sub}")
session.add(sub)
session.commit()
logger.info(f"Пользователь {follower_id} подписался на {what.lower()} с ID {entity_id}")
follows = None
if cache_method:
logger.debug("Обновление кэша")
await cache_method(entity_dict)
if get_cached_follows_method:
logger.debug("Получение подписок из кэша")
existing_follows = await get_cached_follows_method(follower_id)
follows = [*existing_follows, entity_dict] if not existing_sub else existing_follows
logger.debug("Обновлен список подписок")
if what == "AUTHOR" and not existing_sub:
logger.debug("Отправка уведомления автору о подписке")
await notify_follower(follower=follower_dict, author_id=entity_id, action="follow")
except Exception as exc:
logger.exception("Произошла ошибка в функции 'follow'")
return {"error": str(exc)}
return {f"{what.lower()}s": follows}
@mutation.field("unfollow")
@login_required
async def unfollow(_, info, what, slug="", entity_id=0):
logger.debug("Начало выполнения функции 'unfollow'")
user_id = info.context.get("user_id")
follower_dict = info.context.get("author")
logger.debug(f"follower: {follower_dict}")
if not user_id or not follower_dict:
logger.warning("Неавторизованный доступ при попытке отписаться")
return {"error": "unauthorized"}
follower_id = follower_dict.get("id")
logger.debug(f"follower_id: {follower_id}")
entity_classes = {
"AUTHOR": (Author, AuthorFollower, get_cached_follower_authors, cache_author),
"TOPIC": (Topic, TopicFollower, get_cached_follower_topics, cache_topic),
"COMMUNITY": (Community, CommunityFollower, None, None), # Нет методов кэша для сообщества
"SHOUT": (Shout, ShoutReactionsFollower, None, None), # Нет методов кэша для shout
}
if what not in entity_classes:
logger.error(f"Неверный тип для отписки: {what}")
return {"error": "invalid unfollow type"}
entity_class, follower_class, get_cached_follows_method, cache_method = entity_classes[what]
entity_type = what.lower()
follows = []
error = None
try:
logger.debug("Попытка получить сущность из базы данных")
with local_session() as session:
entity = session.query(entity_class).filter(entity_class.slug == slug).first()
logger.debug(f"Полученная сущность: {entity}")
if not entity:
logger.warning(f"{what.lower()} не найден по slug: {slug}")
return {"error": f"{what.lower()} not found"}
if entity and not entity_id:
entity_id = entity.id
logger.debug(f"entity_id: {entity_id}")
sub = (
session.query(follower_class)
.filter(
and_(
getattr(follower_class, "follower") == follower_id,
getattr(follower_class, entity_type) == entity_id,
)
)
.first()
)
logger.debug(f"Найдена подписка для удаления: {sub}")
if sub:
session.delete(sub)
session.commit()
logger.info(f"Пользователь {follower_id} отписался от {what.lower()} с ID {entity_id}")
if cache_method:
logger.debug("Обновление кэша после отписки")
await cache_method(entity.dict())
if get_cached_follows_method:
logger.debug("Получение подписок из кэша")
existing_follows = await get_cached_follows_method(follower_id)
follows = filter(lambda x: x["id"] != entity_id, existing_follows)
logger.debug("Обновлен список подписок")
if what == "AUTHOR":
logger.debug("Отправка уведомления автору об отписке")
await notify_follower(follower=follower_dict, author_id=entity_id, action="unfollow")
else:
return {"error": "following was not found", f"{entity_type}s": follows}
except Exception as exc:
logger.exception("Произошла ошибка в функции 'unfollow'")
import traceback
traceback.print_exc()
return {"error": str(exc)}
# logger.debug(f"Функция 'unfollow' завершена успешно с результатом: {entity_type}s={follows}, error={error}")
return {f"{entity_type}s": follows, "error": error}
@query.field("get_shout_followers")
def get_shout_followers(_, _info, slug: str = "", shout_id: int | None = None) -> List[Author]:
logger.debug("Начало выполнения функции 'get_shout_followers'")
followers = []
try:
with local_session() as session:
shout = None
if slug:
shout = session.query(Shout).filter(Shout.slug == slug).first()
logger.debug(f"Найден shout по slug: {slug} -> {shout}")
elif shout_id:
shout = session.query(Shout).filter(Shout.id == shout_id).first()
logger.debug(f"Найден shout по ID: {shout_id} -> {shout}")
if shout:
reactions = session.query(Reaction).filter(Reaction.shout == shout.id).all()
logger.debug(f"Полученные реакции для shout ID {shout.id}: {reactions}")
for r in reactions:
followers.append(r.created_by)
logger.debug(f"Добавлен follower: {r.created_by}")
except Exception as _exc:
import traceback
traceback.print_exc()
logger.exception("Произошла ошибка в функции 'get_shout_followers'")
return []
# logger.debug(f"Функция 'get_shout_followers' завершена с {len(followers)} подписчиками")
return followers


@@ -1,113 +0,0 @@
import json
import uuid
from datetime import datetime, timezone
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.redis import redis
from base.resolvers import mutation
from validations.inbox import Chat
@mutation.field("updateChat")
@login_required
async def update_chat(_, info, chat_new: Chat):
"""
updating chat
requires info["request"].user.slug to be in chat["admins"]
:param info: GraphQLInfo with request
:param chat_new: dict with chat data
:return: Result { error chat }
"""
auth: AuthCredentials = info.context["request"].auth
chat_id = chat_new["id"]
chat = await redis.execute("GET", f"chats/{chat_id}")
if not chat:
return {"error": "chat not exist"}
chat = dict(json.loads(chat))
# TODO
if auth.user_id in chat["admins"]:
chat.update(
{
"title": chat_new.get("title", chat["title"]),
"description": chat_new.get("description", chat["description"]),
"updatedAt": int(datetime.now(tz=timezone.utc).timestamp()),
"admins": chat_new.get("admins", chat.get("admins") or []),
"users": chat_new.get("users", chat["users"]),
}
)
await redis.execute("SET", f"chats/{chat.id}", json.dumps(chat))
await redis.execute("COMMIT")
return {"error": None, "chat": chat}
@mutation.field("createChat")
@login_required
async def create_chat(_, info, title="", members=[]):
auth: AuthCredentials = info.context["request"].auth
chat = {}
print("create_chat members: %r" % members)
if auth.user_id not in members:
members.append(int(auth.user_id))
# reuse a chat created earlier if one exists
if len(members) == 2 and title == "":
chat = None
print(members)
chatset1 = await redis.execute("SMEMBERS", f"chats_by_user/{members[0]}")
if not chatset1:
chatset1 = set([])
print(chatset1)
chatset2 = await redis.execute("SMEMBERS", f"chats_by_user/{members[1]}")
if not chatset2:
chatset2 = set([])
print(chatset2)
chatset = chatset1.intersection(chatset2)
print(chatset)
for c in chatset:
chat = await redis.execute("GET", f"chats/{c.decode('utf-8')}")
if chat:
chat = json.loads(chat)
if chat["title"] == "":
print("[inbox] createChat found old chat")
print(chat)
break
if chat:
return {"chat": chat, "error": "existed"}
chat_id = str(uuid.uuid4())
chat = {
"id": chat_id,
"users": members,
"title": title,
"createdBy": auth.user_id,
"createdAt": int(datetime.now(tz=timezone.utc).timestamp()),
"updatedAt": int(datetime.now(tz=timezone.utc).timestamp()),
"admins": members if (len(members) == 2 and title == "") else [],
}
for m in members:
await redis.execute("SADD", f"chats_by_user/{m}", chat_id)
await redis.execute("SET", f"chats/{chat_id}", json.dumps(chat))
await redis.execute("SET", f"chats/{chat_id}/next_message_id", str(0))
await redis.execute("COMMIT")
return {"error": None, "chat": chat}
@mutation.field("deleteChat")
@login_required
async def delete_chat(_, info, chat_id: str):
auth: AuthCredentials = info.context["request"].auth
chat = await redis.execute("GET", f"/chats/{chat_id}")
if chat:
chat = dict(json.loads(chat))
if auth.user_id in chat["admins"]:
await redis.execute("DEL", f"chats/{chat_id}")
await redis.execute("SREM", "chats_by_user/" + str(auth.user_id), chat_id)
await redis.execute("COMMIT")
else:
return {"error": "chat not exist"}


@@ -1,138 +0,0 @@
import json
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.orm import local_session
from base.redis import redis
from base.resolvers import query
from orm.user import User
from resolvers.zine.profile import followed_authors
from .unread import get_unread_counter
# from datetime import datetime, timedelta, timezone
async def load_messages(chat_id: str, limit: int = 5, offset: int = 0, ids=[]):
"""load :limit messages for :chat_id with :offset"""
messages = []
message_ids = []
if ids:
message_ids += ids
try:
if limit:
mids = await redis.lrange(f"chats/{chat_id}/message_ids", offset, offset + limit)
mids = [mid.decode("utf-8") for mid in mids]
message_ids += mids
except Exception as e:
print(e)
if message_ids:
message_keys = [f"chats/{chat_id}/messages/{mid}" for mid in message_ids]
messages = await redis.mget(*message_keys)
messages = [json.loads(msg.decode("utf-8")) for msg in messages]
replies = []
for m in messages:
rt = m.get("replyTo")
if rt:
rt = int(rt)
if rt not in message_ids:
replies.append(rt)
if replies:
messages += await load_messages(chat_id, limit=0, ids=replies)
return messages
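A usage sketch for load_messages: fetch the five most recent messages of a chat; replied-to messages outside that window are pulled in by the recursive replyTo pass above. The chat id is hypothetical and a live Redis with chat data is assumed.

import asyncio

async def demo(chat_id: str):
    msgs = await load_messages(chat_id, limit=5, offset=0)
    for m in msgs:
        print(m["id"], m.get("replyTo"), m["body"][:40])

# asyncio.run(demo("2f3a..."))  # requires a running Redis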
@query.field("loadChats")
@login_required
async def load_chats(_, info, limit: int = 50, offset: int = 0):
"""load :limit chats of current user with :offset"""
auth: AuthCredentials = info.context["request"].auth
cids = await redis.execute("SMEMBERS", "chats_by_user/" + str(auth.user_id))
if cids:
cids = list(cids)[offset : offset + limit]
if not cids:
print("[inbox.load] no chats were found")
cids = []
onliners = await redis.execute("SMEMBERS", "users-online")
if not onliners:
onliners = []
chats = []
for cid in cids:
cid = cid.decode("utf-8")
c = await redis.execute("GET", "chats/" + cid)
if c:
c = dict(json.loads(c))
c["messages"] = await load_messages(cid, 5, 0)
c["unread"] = await get_unread_counter(cid, auth.user_id)
with local_session() as session:
c["members"] = []
for uid in c["users"]:
a = session.query(User).where(User.id == uid).first()
if a:
c["members"].append(
{
"id": a.id,
"slug": a.slug,
"userpic": a.userpic,
"name": a.name,
"lastSeen": a.lastSeen,
"online": a.id in onliners,
}
)
chats.append(c)
return {"chats": chats, "error": None}
@query.field("loadMessagesBy")
@login_required
async def load_messages_by(_, info, by, limit: int = 10, offset: int = 0):
"""load :limit messages of :chat_id with :offset"""
auth: AuthCredentials = info.context["request"].auth
userchats = await redis.execute("SMEMBERS", "chats_by_user/" + str(auth.user_id))
userchats = [c.decode("utf-8") for c in userchats]
# print('[inbox] userchats: %r' % userchats)
if userchats:
# print('[inbox] loading messages by...')
messages = []
by_chat = by.get("chat")
if by_chat in userchats:
chat = await redis.execute("GET", f"chats/{by_chat}")
# print(chat)
if not chat:
return {"messages": [], "error": "chat not exist"}
# everyone's messages in filtered chat
messages = await load_messages(by_chat, limit, offset)
return {"messages": sorted(list(messages), key=lambda m: m["createdAt"]), "error": None}
else:
return {"error": "Cannot access messages of this chat"}
@query.field("loadRecipients")
async def load_recipients(_, info, limit=50, offset=0):
chat_users = []
auth: AuthCredentials = info.context["request"].auth
onliners = await redis.execute("SMEMBERS", "users-online")
if not onliners:
onliners = []
try:
chat_users += await followed_authors(auth.user_id)
limit = limit - len(chat_users)
except Exception:
pass
with local_session() as session:
chat_users += session.query(User).where(User.emailConfirmed).limit(limit).offset(offset)
members = []
for a in chat_users:
members.append(
{
"id": a.id,
"slug": a.slug,
"userpic": a.userpic,
"name": a.name,
"lastSeen": a.lastSeen,
"online": a.id in onliners,
}
)
return {"members": members, "error": None}


@@ -1,129 +0,0 @@
import json
from datetime import datetime, timezone
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.redis import redis
from base.resolvers import mutation
from services.following import FollowingManager, FollowingResult
@mutation.field("createMessage")
@login_required
async def create_message(_, info, chat: str, body: str, replyTo=None):
"""create message with :body for :chat_id replying to :replyTo optionally"""
auth: AuthCredentials = info.context["request"].auth
chat = await redis.execute("GET", f"chats/{chat}")
if not chat:
return {"error": "chat is not exist"}
else:
chat_dict = dict(json.loads(chat))
message_id = await redis.execute("GET", f"chats/{chat_dict['id']}/next_message_id")
message_id = int(message_id)
new_message = {
"chatId": chat_dict["id"],
"id": message_id,
"author": auth.user_id,
"body": body,
"createdAt": int(datetime.now(tz=timezone.utc).timestamp()),
}
if replyTo:
new_message["replyTo"] = replyTo
chat_dict["updatedAt"] = new_message["createdAt"]
await redis.execute("SET", f"chats/{chat_dict['id']}", json.dumps(chat))
print(f"[inbox] creating message {new_message}")
await redis.execute(
"SET", f"chats/{chat_dict['id']}/messages/{message_id}", json.dumps(new_message)
)
await redis.execute("LPUSH", f"chats/{chat_dict['id']}/message_ids", str(message_id))
await redis.execute("SET", f"chats/{chat_dict['id']}/next_message_id", str(message_id + 1))
users = chat_dict["users"]
for user_slug in users:
await redis.execute(
"LPUSH", f"chats/{chat_dict['id']}/unread/{user_slug}", str(message_id)
)
result = FollowingResult("NEW", "chat", new_message)
await FollowingManager.push("chat", result)
return {"message": new_message, "error": None}
@mutation.field("updateMessage")
@login_required
async def update_message(_, info, chat_id: str, message_id: int, body: str):
auth: AuthCredentials = info.context["request"].auth
chat = await redis.execute("GET", f"chats/{chat_id}")
if not chat:
return {"error": "chat not exist"}
message = await redis.execute("GET", f"chats/{chat_id}/messages/{message_id}")
if not message:
return {"error": "message not exist"}
message = json.loads(message)
if message["author"] != auth.user_id:
return {"error": "access denied"}
message["body"] = body
message["updatedAt"] = int(datetime.now(tz=timezone.utc).timestamp())
await redis.execute("SET", f"chats/{chat_id}/messages/{message_id}", json.dumps(message))
result = FollowingResult("UPDATED", "chat", message)
await FollowingManager.push("chat", result)
return {"message": message, "error": None}
@mutation.field("deleteMessage")
@login_required
async def delete_message(_, info, chat_id: str, message_id: int):
auth: AuthCredentials = info.context["request"].auth
chat = await redis.execute("GET", f"chats/{chat_id}")
if not chat:
return {"error": "chat not exist"}
chat = json.loads(chat)
message = await redis.execute("GET", f"chats/{chat_id}/messages/{str(message_id)}")
if not message:
return {"error": "message not exist"}
message = json.loads(message)
if message["author"] != auth.user_id:
return {"error": "access denied"}
await redis.execute("LREM", f"chats/{chat_id}/message_ids", 0, str(message_id))
await redis.execute("DEL", f"chats/{chat_id}/messages/{str(message_id)}")
users = chat["users"]
for user_id in users:
await redis.execute("LREM", f"chats/{chat_id}/unread/{user_id}", 0, str(message_id))
result = FollowingResult("DELETED", "chat", message)
await FollowingManager.push("chat", result)
return {}
@mutation.field("markAsRead")
@login_required
async def mark_as_read(_, info, chat_id: str, messages: list[int]):
auth: AuthCredentials = info.context["request"].auth
chat = await redis.execute("GET", f"chats/{chat_id}")
if not chat:
return {"error": "chat not exist"}
chat = json.loads(chat)
users = set(chat["users"])
if auth.user_id not in users:
return {"error": "access denied"}
for message_id in messages:
await redis.execute("LREM", f"chats/{chat_id}/unread/{auth.user_id}", 0, str(message_id))
return {"error": None}


@@ -1,96 +0,0 @@
import json
from datetime import datetime, timedelta, timezone
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.orm import local_session
from base.redis import redis
from base.resolvers import query
from orm.user import AuthorFollower, User
from resolvers.inbox.load import load_messages
@query.field("searchRecipients")
@login_required
async def search_recipients(_, info, query: str, limit: int = 50, offset: int = 0):
result = []
# TODO: maybe redis scan?
auth: AuthCredentials = info.context["request"].auth
talk_before = await redis.execute("GET", f"/chats_by_user/{auth.user_id}")
if talk_before:
talk_before = list(json.loads(talk_before))[offset : offset + limit]
for chat_id in talk_before:
members = await redis.execute("GET", f"/chats/{chat_id}/users")
if members:
members = list(json.loads(members))
for member in members:
if member.startswith(query):
if member not in result:
result.append(member)
more_amount = limit - len(result)
with local_session() as session:
# followings
result += (
session.query(AuthorFollower.author)
.join(User, User.id == AuthorFollower.follower)
.where(User.slug.startswith(query))
.offset(offset + len(result))
.limit(more_amount)
)
more_amount = limit
# followers
result += (
session.query(AuthorFollower.follower)
.join(User, User.id == AuthorFollower.author)
.where(User.slug.startswith(query))
.offset(offset + len(result))
.limit(offset + len(result) + limit)
)
return {"members": list(result), "error": None}
@query.field("searchMessages")
@login_required
async def search_user_chats(_, info, by, limit: int = 50, offset: int = 0):
auth: AuthCredentials = info.context["request"].auth
user_id = auth.user_id
cids = set()
cids |= set(await redis.execute("SMEMBERS", "chats_by_user/" + str(user_id)))
messages = []
by_author = by.get("author")
if by_author:
# include every chat the author participates in
cids |= set(await redis.execute("SMEMBERS", f"chats_by_user/{by_author}"))
for c in cids:
c = c.decode("utf-8")
chat_messages = await load_messages(c, limit, offset)
if by_author:
# keep only the author's messages
chat_messages = [m for m in chat_messages if m["author"] == by_author]
messages.extend(chat_messages)
body_like = by.get("body")
if body_like:
# search in all messages in all user's chats
for c in cids:
# FIXME: use redis scan here
c = c.decode("utf-8")
mmm = await load_messages(c, limit, offset)
for m in mmm:
if body_like in m["body"]:
messages.add(m)
else:
# search in chat's messages
messages.extend(filter(lambda m: body_like in m["body"], list(messages)))
days = by.get("days")
if days:
messages.extend(
filter(
list(messages),
key=lambda m: (
datetime.now(tz=timezone.utc) - int(m["createdAt"]) < timedelta(days=by["days"])
),
)
)
return {"messages": messages, "error": None}


@@ -1,10 +0,0 @@
from base.redis import redis
async def get_unread_counter(chat_id: str, user_id: int):
try:
unread = await redis.execute("LLEN", f"chats/{chat_id}/unread/{user_id}")
return unread or 0
except Exception:
return 0


@@ -1,89 +0,0 @@
from sqlalchemy import and_, desc, select, update
from auth.authenticate import login_required
from auth.credentials import AuthCredentials
from base.orm import local_session
from base.resolvers import mutation, query
from orm import Notification
@query.field("loadNotifications")
@login_required
async def load_notifications(_, info, params=None):
if params is None:
params = {}
auth: AuthCredentials = info.context["request"].auth
user_id = auth.user_id
limit = params.get("limit", 50)
offset = params.get("offset", 0)
q = (
select(Notification)
.where(Notification.user == user_id)
.order_by(desc(Notification.createdAt))
.limit(limit)
.offset(offset)
)
notifications = []
with local_session() as session:
total_count = session.query(Notification).where(Notification.user == user_id).count()
total_unread_count = (
session.query(Notification)
.where(and_(Notification.user == user_id, Notification.seen == False)) # noqa: E712
.count()
)
for [notification] in session.execute(q):
notification.type = notification.type.name
notifications.append(notification)
return {
"notifications": notifications,
"totalCount": total_count,
"totalUnreadCount": total_unread_count,
}
@mutation.field("markNotificationAsRead")
@login_required
async def mark_notification_as_read(_, info, notification_id: int):
auth: AuthCredentials = info.context["request"].auth
user_id = auth.user_id
with local_session() as session:
notification = (
session.query(Notification)
.where(and_(Notification.id == notification_id, Notification.user == user_id))
.one()
)
notification.seen = True
session.commit()
return {}
@mutation.field("markAllNotificationsAsRead")
@login_required
async def mark_all_notifications_as_read(_, info):
auth: AuthCredentials = info.context["request"].auth
user_id = auth.user_id
statement = (
update(Notification)
.where(and_(Notification.user == user_id, Notification.seen == False)) # noqa: E712
.values(seen=True)
)
with local_session() as session:
try:
session.execute(statement)
session.commit()
except Exception as e:
session.rollback()
print(f"[mark_all_notifications_as_read] error: {str(e)}")
return {}

316
resolvers/notifier.py Normal file

@@ -0,0 +1,316 @@
import time
from typing import List, Tuple
import orjson
from sqlalchemy import and_, select
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import aliased
from sqlalchemy.sql import not_
from orm.author import Author
from orm.notification import (
Notification,
NotificationAction,
NotificationEntity,
NotificationSeen,
)
from orm.shout import Shout
from services.auth import login_required
from services.db import local_session
from services.schema import mutation, query
from utils.logger import root_logger as logger
def query_notifications(author_id: int, after: int = 0) -> Tuple[int, int, List[Tuple[Notification, bool]]]:
notification_seen_alias = aliased(NotificationSeen)
q = select(Notification, notification_seen_alias.viewer.label("seen")).outerjoin(
NotificationSeen,
and_(
NotificationSeen.viewer == author_id,
NotificationSeen.notification == Notification.id,
),
)
if after:
q = q.filter(Notification.created_at > after)
q = q.group_by(notification_seen_alias.notification, Notification.created_at)
with local_session() as session:
total = (
session.query(Notification)
.filter(
and_(
Notification.action == NotificationAction.CREATE.value,
Notification.created_at > after,
)
)
.count()
)
unread = (
session.query(Notification)
.filter(
and_(
Notification.action == NotificationAction.CREATE.value,
Notification.created_at > after,
not_(Notification.seen),
)
)
.count()
)
notifications_result = session.execute(q)
notifications = []
for n, seen in notifications_result:
notifications.append((n, seen))
return total, unread, notifications
def group_notification(thread, authors=None, shout=None, reactions=None, entity="follower", action="follow"):
reactions = reactions or []
authors = authors or []
return {
"thread": thread,
"authors": authors,
"updated_at": int(time.time()),
"shout": shout,
"reactions": reactions,
"entity": entity,
"action": action,
}
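For orientation, the group produced by group_notification for a single follow event (values illustrative):

example = group_notification(
    "followers",
    authors=[{"id": 1, "name": "Ann", "slug": "ann"}],
    entity="follower",
    action="follow",
)
# -> {"thread": "followers", "authors": [{...}], "updated_at": <now>,
#     "shout": None, "reactions": [], "entity": "follower", "action": "follow"}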
def get_notifications_grouped(author_id: int, after: int = 0, limit: int = 10, offset: int = 0):
"""
Retrieves notifications for a given author.
Args:
author_id (int): The ID of the author for whom notifications are retrieved.
after (int, optional): If provided, only notifications created after this timestamp are considered.
limit (int, optional): The maximum number of groups to retrieve.
offset (int, optional): Pagination offset.
Returns:
Dict[str, NotificationGroup], int, int: A dictionary where keys are thread IDs
and values are NotificationGroup objects, unread and total amounts.
This function queries the database to retrieve notifications for the specified author, considering optional filters.
The result is a dictionary where each key is a thread ID, and the corresponding value is a NotificationGroup
containing information about the notifications within that thread.
NotificationGroup structure:
{
entity: str, # Type of entity (e.g., 'reaction', 'shout', 'follower').
updated_at: int, # Timestamp of the latest update in the thread.
shout: Optional[NotificationShout]
reactions: List[int], # List of reaction ids within the thread.
authors: List[NotificationAuthor], # List of authors involved in the thread.
}
"""
total, unread, notifications = query_notifications(author_id, after)
groups_by_thread = {}
groups_amount = 0
for notification, seen in notifications:
if (groups_amount + offset) >= limit:
break
payload = orjson.loads(str(notification.payload))
if str(notification.entity) == NotificationEntity.SHOUT.value:
shout = payload
shout_id = shout.get("id")
author_id = shout.get("created_by")
thread_id = f"shout-{shout_id}"
with local_session() as session:
author = session.query(Author).filter(Author.id == author_id).first()
shout = session.query(Shout).filter(Shout.id == shout_id).first()
if author and shout:
author = author.dict()
shout = shout.dict()
group = group_notification(
thread_id,
shout=shout,
authors=[author],
action=str(notification.action),
entity=str(notification.entity),
)
groups_by_thread[thread_id] = group
groups_amount += 1
elif str(notification.entity) == NotificationEntity.REACTION.value:
reaction = payload
if not isinstance(reaction, dict):
raise ValueError("reaction data is not consistent")
shout_id = reaction.get("shout")
author_id = reaction.get("created_by", 0)
if shout_id and author_id:
with local_session() as session:
author = session.query(Author).filter(Author.id == author_id).first()
shout = session.query(Shout).filter(Shout.id == shout_id).first()
if shout and author:
author = author.dict()
shout = shout.dict()
reply_id = reaction.get("reply_to")
thread_id = f"shout-{shout_id}"
if reply_id and reaction.get("kind", "").lower() == "comment":
thread_id += f"{reply_id}"
existing_group = groups_by_thread.get(thread_id)
if existing_group:
existing_group["seen"] = False
existing_group["authors"].append(author_id)
existing_group["reactions"] = existing_group["reactions"] or []
existing_group["reactions"].append(reaction)
groups_by_thread[thread_id] = existing_group
else:
group = group_notification(
thread_id,
authors=[author],
shout=shout,
reactions=[reaction],
entity=str(notification.entity),
action=str(notification.action),
)
if group:
groups_by_thread[thread_id] = group
groups_amount += 1
elif str(notification.entity) == "follower":
thread_id = "followers"
follower = payload  # the payload is already parsed above
group = groups_by_thread.get(thread_id)
if group:
if str(notification.action) == "follow":
group["authors"].append(follower)
elif str(notification.action) == "unfollow":
follower_id = follower.get("id")
for author in group["authors"]:
if author.get("id") == follower_id:
group["authors"].remove(author)
break
else:
group = group_notification(
thread_id,
authors=[follower],
entity=str(notification.entity),
action=str(notification.action),
)
groups_amount += 1
groups_by_thread[thread_id] = group
return groups_by_thread, unread, total
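A usage sketch: fetch the first page of grouped notifications for an author and order the threads by recency, mirroring what load_notifications does below (author_id=1 is illustrative):

groups, unread, total = get_notifications_grouped(author_id=1, after=0, limit=10)
threads = sorted(groups.values(), key=lambda g: g["updated_at"], reverse=True)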
@query.field("load_notifications")
@login_required
async def load_notifications(_, info, after: int, limit: int = 50, offset=0):
author_dict = info.context.get("author")
author_id = author_dict.get("id")
error = None
total = 0
unread = 0
notifications = []
try:
if author_id:
groups, unread, total = get_notifications_grouped(author_id, after, limit)
notifications = sorted(groups.values(), key=lambda group: group["updated_at"], reverse=True)
except Exception as e:
error = e
logger.error(e)
return {
"notifications": notifications,
"total": total,
"unread": unread,
"error": error,
}
@mutation.field("notification_mark_seen")
@login_required
async def notification_mark_seen(_, info, notification_id: int):
author_id = info.context.get("author", {}).get("id")
if author_id:
with local_session() as session:
try:
ns = NotificationSeen(notification=notification_id, viewer=author_id)
session.add(ns)
session.commit()
except SQLAlchemyError as e:
session.rollback()
logger.error(f"seen mutation failed: {e}")
return {"error": "cant mark as read"}
return {"error": None}
@mutation.field("notifications_seen_after")
@login_required
async def notifications_seen_after(_, info, after: int):
# TODO: use latest loaded notification_id as input offset parameter
error = None
try:
author_id = info.context.get("author", {}).get("id")
if author_id:
with local_session() as session:
nnn = session.query(Notification).filter(and_(Notification.created_at > after)).all()
for n in nnn:
try:
ns = NotificationSeen(notification=n.id, viewer=author_id)
session.add(ns)
session.commit()
except SQLAlchemyError:
session.rollback()
except Exception as e:
print(e)
error = "cant mark as read"
return {"error": error}
@mutation.field("notifications_seen_thread")
@login_required
async def notifications_seen_thread(_, info, thread: str, after: int):
error = None
author_id = info.context.get("author", {}).get("id")
if author_id:
[shout_id, reply_to_id] = thread.split(":")
with local_session() as session:
# TODO: handle new follower and new shout notifications
new_reaction_notifications = (
session.query(Notification)
.filter(
Notification.action == "create",
Notification.entity == "reaction",
Notification.created_at > after,
)
.all()
)
removed_reaction_notifications = (
session.query(Notification)
.filter(
Notification.action == "delete",
Notification.entity == "reaction",
Notification.created_at > after,
)
.all()
)
exclude = set()
for nr in removed_reaction_notifications:
reaction = orjson.loads(str(nr.payload))
reaction_id = reaction.get("id")
exclude.add(reaction_id)
for n in new_reaction_notifications:
reaction = orjson.loads(str(n.payload))
reaction_id = reaction.get("id")
if (
reaction_id not in exclude
and reaction.get("shout") == shout_id
and reaction.get("reply_to") == reply_to_id
):
try:
ns = NotificationSeen(notification=n.id, viewer=author_id)
session.add(ns)
session.commit()
except Exception as e:
logger.warning(e)
session.rollback()
else:
error = "You are not logged in"
return {"error": error}

49
resolvers/proposals.py Normal file

@@ -0,0 +1,49 @@
from sqlalchemy import and_
from orm.rating import is_negative, is_positive
from orm.reaction import Reaction, ReactionKind
from orm.shout import Shout
from services.db import local_session
from utils.diff import apply_diff, get_diff
def handle_proposing(kind: ReactionKind, reply_to: int, shout_id: int):
with local_session() as session:
if is_positive(kind):
replied_reaction = (
session.query(Reaction).filter(Reaction.id == reply_to, Reaction.shout == shout_id).first()
)
if replied_reaction and replied_reaction.kind is ReactionKind.PROPOSE.value and replied_reaction.quote:
# patch all the proposals' quotes
proposals = (
session.query(Reaction)
.filter(
and_(
Reaction.shout == shout_id,
Reaction.kind == ReactionKind.PROPOSE.value,
)
)
.all()
)
# patch shout's body
shout = session.query(Shout).filter(Shout.id == shout_id).first()
body = replied_reaction.quote
Shout.update(shout, {"body": body})
session.add(shout)
session.commit()
# the reaction contains a quote -> update all proposals
# for the corresponding Shout
for proposal in proposals:
if proposal.quote:
proposal_diff = get_diff(shout.body, proposal.quote)
proposal_dict = proposal.dict()
proposal_dict["quote"] = apply_diff(replied_reaction.quote, proposal_diff)
Reaction.update(proposal, proposal_dict)
session.add(proposal)
if is_negative(kind):
# TODO: rejection logic
pass
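The core of handle_proposing is a diff round-trip: each open proposal's quote was captured against the old body, so once the accepted quote replaces the body, the stored diff is re-applied to rebase the quote. A sketch with the names used above; the apply_diff/get_diff signatures are assumed from their call sites:

diff = get_diff(shout.body, proposal.quote)          # how the quote deviates from the old body
rebased = apply_diff(replied_reaction.quote, diff)   # the same deviation on top of the new text
proposal_dict = proposal.dict()
proposal_dict["quote"] = rebased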

337
resolvers/rating.py Normal file

@@ -0,0 +1,337 @@
from sqlalchemy import and_, case, func, select, true
from sqlalchemy.orm import aliased
from orm.author import Author, AuthorRating
from orm.reaction import Reaction, ReactionKind
from orm.shout import Shout
from services.auth import login_required
from services.db import local_session
from services.schema import mutation, query
from utils.logger import root_logger as logger
@query.field("get_my_rates_comments")
@login_required
async def get_my_rates_comments(_, info, comments: list[int]) -> list[dict]:
"""
Get the user's reactions to comments
Args:
info: Request context
comments: List of comment IDs
Returns:
list[dict]: List of dicts with the user's reactions to comments
Each dict contains:
- comment_id: comment ID
- my_rate: reaction kind (LIKE/DISLIKE)
"""
author_dict = info.context.get("author") if info.context else None
author_id = author_dict.get("id") if author_dict else None
if not author_id:
return []  # return an empty list instead of an error dict
# Subquery for the current user's reactions
rated_query = (
select(Reaction.id.label("comment_id"), Reaction.kind.label("my_rate"))
.where(
and_(
Reaction.reply_to.in_(comments),
Reaction.created_by == author_id,
Reaction.deleted_at.is_(None),
Reaction.kind.in_([ReactionKind.LIKE.value, ReactionKind.DISLIKE.value]),
)
)
.order_by(Reaction.shout, Reaction.created_at.desc())
.distinct(Reaction.shout)
)
with local_session() as session:
comments_result = session.execute(rated_query).all()
return [{"comment_id": row.comment_id, "my_rate": row.my_rate} for row in comments_result]
@query.field("get_my_rates_shouts")
@login_required
async def get_my_rates_shouts(_, info, shouts):
"""
Get the user's reactions to publications
"""
author_dict = info.context.get("author") if info.context else None
author_id = author_dict.get("id") if author_dict else None
if not author_id:
return []
with local_session() as session:
try:
stmt = (
select(Reaction)
.where(
and_(
Reaction.shout.in_(shouts),
Reaction.reply_to.is_(None),
Reaction.created_by == author_id,
Reaction.deleted_at.is_(None),
Reaction.kind.in_([ReactionKind.LIKE.value, ReactionKind.DISLIKE.value]),
)
)
.order_by(Reaction.shout, Reaction.created_at.desc())
.distinct(Reaction.shout)
)
result = session.execute(stmt).all()
return [
{
"shout_id": row[0].shout, # Получаем shout_id из объекта Reaction
"my_rate": row[0].kind, # Получаем kind (my_rate) из объекта Reaction
}
for row in result
]
except Exception as e:
logger.error(f"Error in get_my_rates_shouts: {e}")
return []
@mutation.field("rate_author")
@login_required
async def rate_author(_, info, rated_slug, value):
info.context["user_id"]
rater_id = info.context.get("author", {}).get("id")
with local_session() as session:
rater_id = int(rater_id)
rated_author = session.query(Author).filter(Author.slug == rated_slug).first()
if rater_id and rated_author:
rating: AuthorRating = (
session.query(AuthorRating)
.filter(
and_(
AuthorRating.rater == rater_id,
AuthorRating.author == rated_author.id,
)
)
.first()
)
if rating:
rating.plus = value > 0
session.add(rating)
session.commit()
return {}
else:
try:
rating = AuthorRating(rater=rater_id, author=rated_author.id, plus=value > 0)
session.add(rating)
session.commit()
except Exception as err:
return {"error": err}
return {}
def count_author_comments_rating(session, author_id) -> int:
replied_alias = aliased(Reaction)
replies_likes = (
session.query(replied_alias)
.join(Reaction, replied_alias.id == Reaction.reply_to)
.where(
and_(
replied_alias.created_by == author_id,
replied_alias.kind == ReactionKind.COMMENT.value,
)
)
.filter(replied_alias.kind == ReactionKind.LIKE.value)
.count()
) or 0
replies_dislikes = (
session.query(replied_alias)
.join(Reaction, replied_alias.id == Reaction.reply_to)
.where(
and_(
replied_alias.created_by == author_id,
replied_alias.kind == ReactionKind.COMMENT.value,
)
)
.filter(replied_alias.kind == ReactionKind.DISLIKE.value)
.count()
) or 0
return replies_likes - replies_dislikes
def count_author_shouts_rating(session, author_id) -> int:
shouts_likes = (
session.query(Reaction, Shout)
.join(Shout, Shout.id == Reaction.shout)
.filter(
and_(
Shout.authors.any(id=author_id),
Reaction.kind == ReactionKind.LIKE.value,
)
)
.count()
or 0
)
shouts_dislikes = (
session.query(Reaction, Shout)
.join(Shout, Shout.id == Reaction.shout)
.filter(
and_(
Shout.authors.any(id=author_id),
Reaction.kind == ReactionKind.DISLIKE.value,
)
)
.count()
or 0
)
return shouts_likes - shouts_dislikes
def get_author_rating_old(session, author: Author):
likes_count = (
session.query(AuthorRating).filter(and_(AuthorRating.author == author.id, AuthorRating.plus.is_(True))).count()
)
dislikes_count = (
session.query(AuthorRating)
.filter(and_(AuthorRating.author == author.id, AuthorRating.plus.is_not(True)))
.count()
)
return likes_count - dislikes_count
def get_author_rating_shouts(session, author: Author) -> int:
q = (
select(
func.coalesce(
func.sum(
case(
(Reaction.kind == ReactionKind.LIKE.value, 1),
(Reaction.kind == ReactionKind.DISLIKE.value, -1),
else_=0,
)
),
0,
).label("shouts_rating")
)
.select_from(Reaction)
.outerjoin(Shout, Shout.authors.any(id=author.id))
.outerjoin(
Reaction,
and_(
Reaction.reply_to.is_(None),
Reaction.shout == Shout.id,
Reaction.deleted_at.is_(None),
),
)
)
result = session.execute(q).scalar()
return result
def get_author_rating_comments(session, author: Author) -> int:
replied_comment = aliased(Reaction)
q = (
select(
func.coalesce(
func.sum(
case(
(Reaction.kind == ReactionKind.LIKE.value, 1),
(Reaction.kind == ReactionKind.DISLIKE.value, -1),
else_=0,
)
),
0,
).label("shouts_rating")
)
.select_from(Reaction)
.outerjoin(
Reaction,
and_(
replied_comment.kind == ReactionKind.COMMENT.value,
replied_comment.created_by == author.id,
Reaction.kind.in_([ReactionKind.LIKE.value, ReactionKind.DISLIKE.value]),
Reaction.reply_to == replied_comment.id,
Reaction.deleted_at.is_(None),
),
)
)
result = session.execute(q).scalar()
return result
def add_author_rating_columns(q, group_list):
# NOTE: method is not used
# old karma
q = q.outerjoin(AuthorRating, AuthorRating.author == Author.id)
q = q.add_columns(func.sum(case((AuthorRating.plus == true(), 1), else_=-1)).label("rating"))
# by shouts rating
shout_reaction = aliased(Reaction)
shouts_rating_subq = (
select(
Author.id,
func.coalesce(
func.sum(
case(
(shout_reaction.kind == ReactionKind.LIKE.value, 1),
(shout_reaction.kind == ReactionKind.DISLIKE.value, -1),
else_=0,
)
),
0,
).label("shouts_rating"),
)
.select_from(shout_reaction)
.outerjoin(Shout, Shout.authors.any(id=Author.id))
.outerjoin(
shout_reaction,
and_(
shout_reaction.reply_to.is_(None),
shout_reaction.shout == Shout.id,
shout_reaction.deleted_at.is_(None),
),
)
.group_by(Author.id)
.subquery()
)
q = q.outerjoin(shouts_rating_subq, Author.id == shouts_rating_subq.c.id)
q = q.add_columns(shouts_rating_subq.c.shouts_rating)
group_list = [shouts_rating_subq.c.shouts_rating]
# by comments
replied_comment = aliased(Reaction)
reaction_2 = aliased(Reaction)
comments_subq = (
select(
Author.id,
func.coalesce(
func.sum(
case(
(reaction_2.kind == ReactionKind.LIKE.value, 1),
(reaction_2.kind == ReactionKind.DISLIKE.value, -1),
else_=0,
)
),
0,
).label("comments_rating"),
)
.select_from(reaction_2)
.outerjoin(
replied_comment,
and_(
replied_comment.kind == ReactionKind.COMMENT.value,
replied_comment.created_by == Author.id,
reaction_2.kind.in_([ReactionKind.LIKE.value, ReactionKind.DISLIKE.value]),
reaction_2.reply_to == replied_comment.id,
reaction_2.deleted_at.is_(None),
),
)
.group_by(Author.id)
.subquery()
)
q = q.outerjoin(comments_subq, Author.id == comments_subq.c.id)
q = q.add_columns(comments_subq.c.comments_rating)
group_list.extend([comments_subq.c.comments_rating])
return q, group_list

830
resolvers/reaction.py Normal file

@@ -0,0 +1,830 @@
import time
from sqlalchemy import and_, asc, case, desc, func, select
from sqlalchemy.orm import aliased
from orm.author import Author
from orm.rating import PROPOSAL_REACTIONS, RATING_REACTIONS, is_negative, is_positive
from orm.reaction import Reaction, ReactionKind
from orm.shout import Shout, ShoutAuthor
from resolvers.follower import follow
from resolvers.proposals import handle_proposing
from resolvers.stat import update_author_stat
from services.auth import add_user_role, login_required
from services.db import local_session
from services.notify import notify_reaction
from services.schema import mutation, query
from utils.logger import root_logger as logger
def query_reactions():
"""
Base query for fetching reactions with associated authors and shouts.
:return: Base query.
"""
return (
select(
Reaction,
Author,
Shout,
)
.select_from(Reaction)
.join(Author, Reaction.created_by == Author.id)
.join(Shout, Reaction.shout == Shout.id)
)
def add_reaction_stat_columns(q):
"""
Add statistical columns to a reaction query.
:param q: SQL query for reactions.
:return: Query with added statistics columns.
"""
aliased_reaction = aliased(Reaction)
# Join reactions and add statistical columns
q = q.outerjoin(
aliased_reaction,
and_(
aliased_reaction.reply_to == Reaction.id,
aliased_reaction.deleted_at.is_(None),
),
).add_columns(
# Count unique comments
func.coalesce(
func.count(aliased_reaction.id).filter(aliased_reaction.kind == ReactionKind.COMMENT.value), 0
).label("comments_stat"),
# Calculate rating as the difference between likes and dislikes
func.sum(
case(
(aliased_reaction.kind == ReactionKind.LIKE.value, 1),
(aliased_reaction.kind == ReactionKind.DISLIKE.value, -1),
else_=0,
)
).label("rating_stat"),
)
return q
def get_reactions_with_stat(q, limit=10, offset=0):
"""
Execute the reaction query and retrieve reactions with statistics.
:param q: Query with reactions and statistics.
:param limit: Number of reactions to load.
:param offset: Pagination offset.
:return: List of reactions as dictionaries.
>>> get_reactions_with_stat(q, 10, 0)  # doctest: +SKIP
[{'id': 1, 'body': 'Comment text', 'stat': {'rating': 5, 'comments_count': 3}, ...}]
"""
q = q.limit(limit).offset(offset)
reactions = []
with local_session() as session:
result_rows = session.execute(q)
for reaction, author, shout, comments_count, rating_stat in result_rows:
# Skip reactions with a missing shout or author
if not shout or not author:
logger.error(f"Skipped reaction with a missing shout or author: {reaction.dict()}")
continue
# Convert the Reaction to a dict for key-based access
reaction_dict = reaction.dict()
reaction_dict["created_by"] = author.dict()
reaction_dict["shout"] = shout.dict()
reaction_dict["stat"] = {"rating": rating_stat, "comments_count": comments_count}
reactions.append(reaction_dict)
return reactions
def is_featured_author(session, author_id) -> bool:
"""
Check if an author has at least one non-deleted featured article.
:param session: Database session.
:param author_id: Author ID.
:return: True if the author has a featured article, else False.
"""
return session.query(
session.query(Shout)
.where(Shout.authors.any(id=author_id))
.filter(Shout.featured_at.is_not(None), Shout.deleted_at.is_(None))
.exists()
).scalar()
def check_to_feature(session, approver_id, reaction) -> bool:
"""
Make a shout featured if it receives more than 4 votes from authors.
:param session: Database session.
:param approver_id: Approver author ID.
:param reaction: Reaction object.
:return: True if shout should be featured, else False.
"""
if not reaction.reply_to and is_positive(reaction.kind):
# Check whether the post already has more than 20% dislikes;
# if so, it must not be featured regardless of the number of likes
if check_to_unfeature(session, reaction):
return False
# Collect every author who gave a like
author_approvers = set()
reacted_readers = (
session.query(Reaction.created_by)
.filter(
Reaction.shout == reaction.shout,
is_positive(Reaction.kind),
# ratings (LIKE, DISLIKE) are hard-deleted, so no deleted_at filter is needed
)
.distinct()
.all()
)
# Include the current approver
approver = session.query(Author).filter(Author.id == approver_id).first()
if approver and is_featured_author(session, approver_id):
author_approvers.add(approver_id)
# Check whether the reacting readers have featured publications of their own
for (reader_id,) in reacted_readers:
if is_featured_author(session, reader_id):
author_approvers.add(reader_id)
# A publication becomes featured once it has more than 4 likes from authors
logger.debug(f"Shout {reaction.shout} has {len(author_approvers)} likes from authors")
return len(author_approvers) > 4
return False
def check_to_unfeature(session, reaction) -> bool:
"""
Unfeature a shout if 20% of reactions are negative.
:param session: Database session.
:param reaction: Reaction object.
:return: True if shout should be unfeatured, else False.
"""
if not reaction.reply_to:
# Check the dislike ratio even if the current reaction is not a dislike
total_reactions = (
session.query(Reaction)
.filter(
Reaction.shout == reaction.shout,
Reaction.reply_to.is_(None),
Reaction.kind.in_(RATING_REACTIONS),
# ratings are hard-deleted, so no deleted_at filter is needed
)
.count()
)
negative_reactions = (
session.query(Reaction)
.filter(
Reaction.shout == reaction.shout,
is_negative(Reaction.kind),
Reaction.reply_to.is_(None),
# ratings are hard-deleted, so no deleted_at filter is needed
)
.count()
)
# Do negative reactions make up 20% or more of all reactions?
negative_ratio = negative_reactions / total_reactions if total_reactions > 0 else 0
logger.debug(
f"Shout {reaction.shout}: {negative_reactions}/{total_reactions} negative reactions ({negative_ratio:.2%})"
)
return total_reactions > 0 and negative_ratio >= 0.2
return False
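A worked example of the 20% rule above: with 10 top-level ratings of which 2 are dislikes, negative_ratio = 2 / 10 = 0.2, so the shout is unfeatured; with 1 dislike out of 10 (ratio 0.1) it keeps its featured status.

assert 2 / 10 >= 0.2        # unfeatured
assert not (1 / 10 >= 0.2)  # stays featured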
async def set_featured(session, shout_id):
"""
Feature a shout and update the author's role.
:param session: Database session.
:param shout_id: Shout ID.
"""
s = session.query(Shout).filter(Shout.id == shout_id).first()
if s:
current_time = int(time.time())
s.featured_at = current_time
session.commit()
author = session.query(Author).filter(Author.id == s.created_by).first()
if author:
await add_user_role(str(author.user))
session.add(s)
session.commit()
def set_unfeatured(session, shout_id):
"""
Unfeature a shout.
:param session: Database session.
:param shout_id: Shout ID.
"""
session.query(Shout).filter(Shout.id == shout_id).update({"featured_at": None})
session.commit()
async def _create_reaction(session, shout_id: int, is_author: bool, author_id: int, reaction) -> dict:
"""
Create a new reaction and perform related actions such as updating counters and notification.
:param session: Database session.
:param shout_id: Shout ID.
:param is_author: Flag indicating if the user is the author of the shout.
:param author_id: Author ID.
:param reaction: Dictionary with reaction data.
:return: Dictionary with created reaction data.
"""
r = Reaction(**reaction)
session.add(r)
session.commit()
rdict = r.dict()
# Update author stat for comments
if r.kind == ReactionKind.COMMENT.value:
update_author_stat(author_id)
# Handle proposal
if r.reply_to and r.kind in PROPOSAL_REACTIONS and is_author:
handle_proposing(r.kind, r.reply_to, shout_id)
# Handle rating
if r.kind in RATING_REACTIONS:
# Check the unfeature condition first (dislikes take priority)
if check_to_unfeature(session, r):
set_unfeatured(session, shout_id)
logger.info(f"Shout {shout_id} lost its featured status due to a high share of dislikes")
# Only if it was not unfeatured, check the feature condition
elif check_to_feature(session, author_id, r):
await set_featured(session, shout_id)
logger.info(f"Shout {shout_id} became featured thanks to likes from authors")
# Notify creation
await notify_reaction(rdict, "create")
return rdict
def prepare_new_rating(reaction: dict, shout_id: int, session, author_id: int):
"""
Check for the possibility of rating a shout.
:param reaction: Dictionary with reaction data.
:param shout_id: Shout ID.
:param session: Database session.
:param author_id: Author ID.
:return: Dictionary with error or None.
"""
kind = reaction.get("kind")
opposite_kind = ReactionKind.DISLIKE.value if is_positive(kind) else ReactionKind.LIKE.value
existing_ratings = (
session.query(Reaction)
.filter(
Reaction.shout == shout_id,
Reaction.created_by == author_id,
Reaction.kind.in_(RATING_REACTIONS),
Reaction.deleted_at.is_(None),
)
.all()
)
for r in existing_ratings:
if r.kind == kind:
return {"error": "You can't rate the same thing twice"}
if r.kind == opposite_kind:
return {"error": "Remove opposite vote first"}
if shout_id in [r.shout for r in existing_ratings]:
return {"error": "You can't rate your own thing"}
return
@mutation.field("create_reaction")
@login_required
async def create_reaction(_, info, reaction):
"""
Create a new reaction through a GraphQL request.
:param info: GraphQL context info.
:param reaction: Dictionary with reaction data.
:return: Dictionary with created reaction data or error.
"""
reaction_input = reaction
author_dict = info.context.get("author", {})
author_id = author_dict.get("id")
shout_id = int(reaction_input.get("shout", "0"))
logger.debug(f"Creating reaction with data: {reaction_input}")
logger.debug(f"Author ID: {author_id}, Shout ID: {shout_id}")
if not author_id:
return {"error": "Author ID is required to create a reaction."}
if not shout_id:
return {"error": "Shout ID is required to create a reaction."}
try:
with local_session() as session:
author_rows = session.query(ShoutAuthor.author).filter(ShoutAuthor.shout == shout_id).all()
authors = [a for (a,) in author_rows]
is_author = int(author_id) in authors
reaction_input["created_by"] = author_id
kind = reaction_input.get("kind")
# handle ratings
if kind in RATING_REACTIONS:
logger.debug(f"creating rating reaction: {kind}")
error_result = prepare_new_rating(reaction_input, shout_id, session, author_id)
if error_result:
logger.error(f"Rating preparation error: {error_result}")
return error_result
# handle all reactions
rdict = await _create_reaction(session, shout_id, is_author, author_id, reaction_input)
logger.debug(f"Created reaction result: {rdict}")
# follow if liked
if kind == ReactionKind.LIKE.value:
try:
follow(None, info, "shout", shout_id=shout_id)
except Exception:
pass
shout = session.query(Shout).filter(Shout.id == shout_id).first()
if not shout:
return {"error": "Shout not found"}
rdict["shout"] = shout.dict()
rdict["created_by"] = author_dict
return {"reaction": rdict}
except Exception as e:
import traceback
traceback.print_exc()
logger.error(f"{type(e).__name__}: {e}")
return {"error": "Cannot create reaction."}
@mutation.field("update_reaction")
@login_required
async def update_reaction(_, info, reaction):
"""
Update an existing reaction through a GraphQL request.
:param info: GraphQL context info.
:param reaction: Dictionary with reaction data.
:return: Dictionary with updated reaction data or error.
"""
user_id = info.context.get("user_id")
roles = info.context.get("roles")
rid = reaction.get("id")
if not rid or not user_id or not roles:
return {"error": "Invalid input data"}
del reaction["id"]
with local_session() as session:
try:
reaction_query = query_reactions().filter(Reaction.id == rid)
reaction_query = add_reaction_stat_columns(reaction_query)
reaction_query = reaction_query.group_by(Reaction.id, Author.id, Shout.id)
result = session.execute(reaction_query).unique().first()
if result:
r, author, _shout, comments_count, rating_stat = result
if not r or not author:
return {"error": "Invalid reaction ID or unauthorized"}
if r.created_by != author.id and "editor" not in roles:
return {"error": "Access denied"}
# Update reaction
r.body = reaction.get("body", r.body)
r.updated_at = int(time.time())
Reaction.update(r, reaction)
session.add(r)
session.commit()
r.stat = {
"comments_count": comments_count,
"rating": rating_stat,
}
await notify_reaction(r.dict(), "update")
return {"reaction": r}
except Exception as e:
logger.error(f"{type(e).__name__}: {e}")
return {"error": "Cannot update reaction"}
@mutation.field("delete_reaction")
@login_required
async def delete_reaction(_, info, reaction_id: int):
"""
Delete an existing reaction through a GraphQL request.
:param info: GraphQL context info.
:param reaction_id: Reaction ID to delete.
:return: Dictionary with deleted reaction data or error.
"""
user_id = info.context.get("user_id")
author_id = info.context.get("author", {}).get("id")
roles = info.context.get("roles", [])
if not user_id:
return {"error": "Unauthorized"}
with local_session() as session:
try:
author = session.query(Author).filter(Author.user == user_id).one()
r = session.query(Reaction).filter(Reaction.id == reaction_id).one()
if r.created_by != author_id and "editor" not in roles:
return {"error": "Access denied"}
if r.kind == ReactionKind.COMMENT.value:
r.deleted_at = int(time.time())
update_author_stat(author.id)
session.add(r)
session.commit()
elif r.kind == ReactionKind.PROPOSE.value:
r.deleted_at = int(time.time())
session.add(r)
session.commit()
# TODO: add more reaction types here
else:
logger.debug(f"{user_id} user removing his #{reaction_id} reaction")
session.delete(r)
session.commit()
if check_to_unfeature(session, r):
set_unfeatured(session, r.shout)
reaction_dict = r.dict()
await notify_reaction(reaction_dict, "delete")
return {"error": None, "reaction": reaction_dict}
except Exception as e:
logger.error(f"{type(e).__name__}: {e}")
return {"error": "Cannot delete reaction"}
def apply_reaction_filters(by, q):
"""
Apply filters to a reaction query.
:param by: Dictionary with filter parameters.
:param q: SQL query.
:return: Query with applied filters.
"""
shout_slug = by.get("shout")
if shout_slug:
q = q.filter(Shout.slug == shout_slug)
shout_id = by.get("shout_id")
if shout_id:
q = q.filter(Shout.id == shout_id)
shouts = by.get("shouts")
if shouts:
q = q.filter(Shout.slug.in_(shouts))
created_by = by.get("created_by", by.get("author_id"))
if created_by:
q = q.filter(Author.id == created_by)
author_slug = by.get("author")
if author_slug:
q = q.filter(Author.slug == author_slug)
topic = by.get("topic")
if isinstance(topic, int):
q = q.filter(Shout.topics.any(id=topic))
kinds = by.get("kinds")
if isinstance(kinds, list):
q = q.filter(Reaction.kind.in_(kinds))
if by.get("reply_to"):
q = q.filter(Reaction.reply_to == by.get("reply_to"))
by_search = by.get("search", "")
if len(by_search) > 2:
q = q.filter(Reaction.body.ilike(f"%{by_search}%"))
after = by.get("after")
if isinstance(after, int):
q = q.filter(Reaction.created_at > after)
return q
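# Usage sketch (hypothetical filter dict): any combination of the supported
# keys may be passed; keys that are absent simply add no filter:
#
#     q = query_reactions()
#     q = apply_reaction_filters(
#         {"shout": "my-slug", "kinds": [ReactionKind.COMMENT.value], "after": 1700000000}, q
#     )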
@query.field("load_reactions_by")
async def load_reactions_by(_, _info, by, limit=50, offset=0):
"""
Load reactions based on specified parameters.
:param info: GraphQL context info.
:param by: Filter parameters.
:param limit: Number of reactions to load.
:param offset: Pagination offset.
:return: List of reactions.
"""
q = query_reactions()
# Add statistics and apply filters
q = add_reaction_stat_columns(q)
q = apply_reaction_filters(by, q)
# Include reactions with deleted_at for building comment trees
# q = q.where(Reaction.deleted_at.is_(None))
# Group and sort
q = q.group_by(Reaction.id, Author.id, Shout.id)
order_stat = by.get("sort", "").lower()
order_by_stmt = desc(Reaction.created_at)
if order_stat == "oldest":
order_by_stmt = asc(Reaction.created_at)
elif order_stat.endswith("like"):
order_by_stmt = desc("rating_stat")
q = q.order_by(order_by_stmt)
# Retrieve and return reactions
return get_reactions_with_stat(q, limit, offset)
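# Accepted sort values, per the branches above: "oldest" sorts by ascending
# created_at; any value ending in "like" sorts by descending rating_stat;
# everything else falls back to newest-first. Sketch (hypothetical slug):
#
#     reactions = await load_reactions_by(None, None, {"shout": "my-slug", "sort": "like"}, limit=10)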
@query.field("load_shout_ratings")
async def load_shout_ratings(_, info, shout: int, limit=100, offset=0):
"""
Load ratings for a specified shout with pagination.
:param info: GraphQL context info.
:param shout: Shout ID.
:param limit: Number of reactions to load.
:param offset: Pagination offset.
:return: List of reactions.
"""
q = query_reactions()
# Filter, group, sort, limit, offset
q = q.filter(
and_(
Reaction.deleted_at.is_(None),
Reaction.shout == shout,
Reaction.kind.in_(RATING_REACTIONS),
)
)
q = q.group_by(Reaction.id, Author.id, Shout.id)
q = q.order_by(desc(Reaction.created_at))
# Retrieve and return reactions
return get_reactions_with_stat(q, limit, offset)
@query.field("load_shout_comments")
async def load_shout_comments(_, info, shout: int, limit=50, offset=0):
"""
Load comments for a specified shout with pagination and statistics.
:param info: GraphQL context info.
:param shout: Shout ID.
:param limit: Number of comments to load.
:param offset: Pagination offset.
:return: List of reactions.
"""
q = query_reactions()
q = add_reaction_stat_columns(q)
# Filter, group, sort, limit, offset
q = q.filter(
and_(
Reaction.deleted_at.is_(None),
Reaction.shout == shout,
Reaction.body.is_not(None),
)
)
q = q.group_by(Reaction.id, Author.id, Shout.id)
q = q.order_by(desc(Reaction.created_at))
# Retrieve and return reactions
return get_reactions_with_stat(q, limit, offset)
@query.field("load_comment_ratings")
async def load_comment_ratings(_, info, comment: int, limit=50, offset=0):
"""
Load ratings for a specified comment with pagination.
:param info: GraphQL context info.
:param comment: Comment ID.
:param limit: Number of ratings to load.
:param offset: Pagination offset.
:return: List of ratings.
"""
q = query_reactions()
# Filter, group, sort, limit, offset
q = q.filter(
and_(
Reaction.deleted_at.is_(None),
Reaction.reply_to == comment,
Reaction.kind.in_(RATING_REACTIONS),
)
)
q = q.group_by(Reaction.id, Author.id, Shout.id)
q = q.order_by(desc(Reaction.created_at))
# Retrieve and return reactions
return get_reactions_with_stat(q, limit, offset)
@query.field("load_comments_branch")
async def load_comments_branch(
_,
_info,
shout: int,
parent_id: int | None = None,
limit=10,
offset=0,
sort="newest",
children_limit=3,
children_offset=0,
):
"""
Load hierarchical comments with pagination for both root and child comments.
:param info: GraphQL context info.
:param shout: Shout ID.
:param parent_id: Parent comment ID (None for root comments).
:param limit: Number of comments to load.
:param offset: Pagination offset.
:param sort: Sort order ('newest', 'oldest', 'like').
:param children_limit: Maximum number of child comments.
:param children_offset: Offset for child comments.
:return: List of comments with their children.
"""
# Base query
q = query_reactions()
q = add_reaction_stat_columns(q)
# Filter by shout and kind (comments only)
q = q.filter(
and_(
Reaction.deleted_at.is_(None),
Reaction.shout == shout,
Reaction.kind == ReactionKind.COMMENT.value,
)
)
# Filter by parent ID
if parent_id is None:
# Root comments only
q = q.filter(Reaction.reply_to.is_(None))
else:
# Only direct replies to the given comment
q = q.filter(Reaction.reply_to == parent_id)
# Group and sort
q = q.group_by(Reaction.id, Author.id, Shout.id)
# Resolve the sort order
order_by_stmt = None
if sort.lower() == "oldest":
order_by_stmt = asc(Reaction.created_at)
elif sort.lower() == "like":
order_by_stmt = desc("rating_stat")
else:  # "newest" by default
order_by_stmt = desc(Reaction.created_at)
q = q.order_by(order_by_stmt)
# Run the query to fetch the comments
comments = get_reactions_with_stat(q, limit, offset)
# If comments were found, load their children and reply counts
if comments:
# Load the number of replies for each comment
await load_replies_count(comments)
# Load the child comments
await load_first_replies(comments, children_limit, children_offset, sort)
return comments
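# Usage sketch (hypothetical ids): fetch the second page of root comments,
# each with up to 3 children preloaded:
#
#     branch = await load_comments_branch(
#         None, None, shout=42, parent_id=None,
#         limit=10, offset=10, sort="newest",
#         children_limit=3, children_offset=0,
#     )
#     branch[0]["first_replies"]           # children, filled by load_first_replies
#     branch[0]["stat"]["comments_count"]  # reply count, filled by load_replies_count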
async def load_replies_count(comments):
"""
Load the reply count for a list of comments and update their stat.comments_count field.
:param comments: List of comments for which reply counts should be loaded.
"""
if not comments:
return
comment_ids = [comment["id"] for comment in comments]
# Query to count replies per parent comment
q = (
select(Reaction.reply_to.label("parent_id"), func.count().label("count"))
.where(
and_(
Reaction.reply_to.in_(comment_ids),
Reaction.deleted_at.is_(None),
Reaction.kind == ReactionKind.COMMENT.value,
)
)
.group_by(Reaction.reply_to)
)
# Run the query
with local_session() as session:
result = session.execute(q).fetchall()
# Build a {parent_id: count} mapping
replies_count = {row[0]: row[1] for row in result}
# Write the values back into the comments
for comment in comments:
if "stat" not in comment:
comment["stat"] = {}
# Update the comment counter in stat
comment["stat"]["comments_count"] = replies_count.get(comment["id"], 0)
async def load_first_replies(comments, limit, offset, sort="newest"):
"""
Load the first N replies for each comment.
:param comments: List of comments for which replies should be loaded.
:param limit: Maximum number of replies per comment.
:param offset: Pagination offset for child comments.
:param sort: Sort order for replies.
"""
if not comments or limit <= 0:
return
# Collect comment IDs
comment_ids = [comment["id"] for comment in comments]
# Base query for loading replies
q = query_reactions()
q = add_reaction_stat_columns(q)
# Filter: only replies to the given comments
q = q.filter(
and_(
Reaction.reply_to.in_(comment_ids),
Reaction.deleted_at.is_(None),
Reaction.kind == ReactionKind.COMMENT.value,
)
)
# Grouping
q = q.group_by(Reaction.id, Author.id, Shout.id)
# Resolve the sort order
order_by_stmt = None
if sort.lower() == "oldest":
order_by_stmt = asc(Reaction.created_at)
elif sort.lower() == "like":
order_by_stmt = desc("rating_stat")
else:  # "newest" by default
order_by_stmt = desc(Reaction.created_at)
q = q.order_by(order_by_stmt, Reaction.reply_to)
# Run the query: fetch replies for all parents at once, capped at 100 rows
# in total; per-parent offset/limit slicing happens below
replies = get_reactions_with_stat(q, limit=100, offset=0)
# Group replies by parent ID
replies_by_parent = {}
for reply in replies:
parent_id = reply.get("reply_to")
if parent_id not in replies_by_parent:
replies_by_parent[parent_id] = []
replies_by_parent[parent_id].append(reply)
# Attach replies to their comments, honoring the offset and limit per parent
for comment in comments:
comment_id = comment["id"]
if comment_id in replies_by_parent:
parent_replies = replies_by_parent[comment_id]
# Apply offset and limit
comment["first_replies"] = parent_replies[offset : offset + limit]
else:
comment["first_replies"] = []
# Load reply counts for the child comments
all_replies = [reply for replies in replies_by_parent.values() for reply in replies]
if all_replies:
await load_replies_count(all_replies)
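# Worked example of the per-parent slicing above: with offset=1 and limit=2,
# a parent whose sorted replies are [r1, r2, r3, r4] ends up with
# first_replies == [r2, r3], while the count query still reports all 4 replies.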
