forked from didirus/AstralRinth
Technical review queue (#4775)
* chore: fix typo in status message
* feat(labrinth): overhaul malware scanner report storage and routes
* chore: address some review comments
* feat: add Delphi to Docker Compose `with-delphi` profile
* chore: fix unused import Clippy lint
* feat(labrinth/delphi): use PAT token authorization with project read scopes
* chore: expose file IDs in version queries
* fix: accept null decompiled source payloads from Delphi
* tweak(labrinth): expose base62 file IDs more consistently for Delphi
* feat(labrinth/delphi): support new Delphi report severity field
* chore(labrinth): run `cargo sqlx prepare` to fix Docker build errors
* tweak: add route for fetching Delphi issue type schema, abstract Labrinth away from issue types
* chore: run `cargo sqlx prepare`
* chore: fix typo on frontend generated state file message
* feat: update to use new Delphi issue schema
* wip: tech review endpoints
* wip: add ToSchema for dependent types
* wip: report issues return
* wip
* wip: returning more data
* wip
* Fix up db query
* Delphi configuration to talk to Labrinth
* Get Delphi working with Labrinth
* Add Delphi dummy fixture
* Better Delphi logging
* Improve utoipa for tech review routes
* Add more sorting options for tech review queue
* Oops join
* New routes for fetching issues and reports
* Fix which kind of ID is returned in tech review endpoints
* Deduplicate tech review report rows
* Reduce info sent for projects
* Fetch more thread info
* Address PR comments
* fix ci
* fix postgres version mismatch
* fix version creation
* Implement routes
* fix up tech review
* Allow adding a moderation comment to Delphi rejections
* fix up rebase
* exclude rejected projects from tech review
* add status change msg to tech review thread
* cargo sqlx prepare
* also ignore withheld projects
* More filtering on issue search
* wip: report routes
* Fix up for build
* cargo sqlx prepare
* fix thread message privacy
* New tech review search route
* submit route
* details have statuses now
* add default to drid status
* dedup issue details
* fix sqlx query on empty files
* fixes
* Dedupe issue detail statuses and message on entering tech rev
* Fix qa issues
* Fix qa issues
* fix review comments
* typos
* fix ci
* feat: tech review frontend (#4781)
  * feat: batch scan alert
  * feat: layout
  * feat: introduce surface variables
  * fix: theme selector
  * feat: rough draft of tech review card
  * feat: tab switcher
  * feat: batch scan btn
  * feat: api-client module for tech review
  * draft: impl
  * feat: auto icons
  * fix: layout issues
  * feat: fixes to code blocks + flag labels
  * feat: temp remove mock data
  * fix: search sort types
  * fix: intl & lint
  * chore: re-enable mock data
  * fix: flag badges + auto open first issue in file tab
  * feat: update for new routes
  * fix: more qa issues
  * feat: lazy load sources
  * fix: re-enable auth middleware
  * feat: impl threads
  * fix: lint & severity
  * feat: download btn + switch to using NavTabs with new local mode option
  * feat: re-add toplevel btns
  * feat: reports page consistency
  * fix: consistency on project queue
  * fix: icons + sizing
  * fix: colors and gaps
  * fix: impl endpoints
  * feat: load all flags on file tab
  * feat: thread generics changes
  * feat: more qa
  * feat: fix collapse
  * fix: qa
  * feat: msg modal
  * fix: ISO import
  * feat: qa fixes
  * fix: empty state basic
  * fix: collapsible region
  * fix: collapse thread by default
  * feat: rough draft of new process/flow
  * fix labrinth build
  * fix thread message privacy
  * New tech review search route
  * feat: qa fixes
  * feat: QA changes
  * fix: verdict on detail not whole issue
  * fix: lint + intl
  * fix: lint
  * fix: thread message for tech rev verdict
  * feat: use anim frames
  * fix: exports + typecheck
  * polish: qa changes
  * feat: qa
  * feat: qa polish
  * feat: fix malic modal
  * fix: lint
  * fix: qa + lint
  * fix: pagination
  * fix: lint
  * fix: qa
  * intl extract
  * fix ci

Signed-off-by: Calum H. <contact@cal.engineer>
Co-authored-by: Alejandro González <me@alegon.dev>
Co-authored-by: aecsocket <aecsocket@tutanota.com>

Signed-off-by: Calum H. <contact@cal.engineer>
Co-authored-by: Alejandro González <me@alegon.dev>
Co-authored-by: Calum H. <contact@cal.engineer>
@@ -129,7 +129,7 @@ PYRO_API_KEY=none
 BREX_API_URL=https://platform.brexapis.com/v2/
 BREX_API_KEY=none

-DELPHI_URL=none
+DELPHI_URL=http://labrinth-delphi:59999
 DELPHI_SLACK_WEBHOOK=none

 AVALARA_1099_API_URL=https://www.track1099.com/api
34
apps/labrinth/.sqlx/query-0ed2e6e3149352d12a673fddc50f9530c311eef084abb6fce35de5f37d79bcea.json
generated
Normal file
@@ -0,0 +1,34 @@
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT\n version_id AS \"version_id: crate::database::models::DBVersionId\",\n versions.mod_id AS \"project_id: crate::database::models::DBProjectId\",\n files.url AS \"url\"\n FROM files INNER JOIN versions ON files.version_id = versions.id\n WHERE files.id = $1\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "version_id: crate::database::models::DBVersionId",
        "type_info": "Int8"
      },
      {
        "ordinal": 1,
        "name": "project_id: crate::database::models::DBProjectId",
        "type_info": "Int8"
      },
      {
        "ordinal": 2,
        "name": "url",
        "type_info": "Varchar"
      }
    ],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": [
      false,
      false,
      false
    ]
  },
  "hash": "0ed2e6e3149352d12a673fddc50f9530c311eef084abb6fce35de5f37d79bcea"
}
28
apps/labrinth/.sqlx/query-2d9e36c76a1e214c53d9dc2aa3debe1d03998be169a306b63a0ca1beaa07397f.json
generated
Normal file
@@ -0,0 +1,28 @@
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT\n EXISTS(\n SELECT 1 FROM delphi_issue_details_with_statuses didws\n WHERE didws.project_id = $1 AND didws.status = 'pending'\n ) AS \"pending_issue_details_exist!\",\n t.id AS \"thread_id: DBThreadId\"\n FROM mods m\n INNER JOIN threads t ON t.mod_id = $1\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "pending_issue_details_exist!",
        "type_info": "Bool"
      },
      {
        "ordinal": 1,
        "name": "thread_id: DBThreadId",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": [
      null,
      false
    ]
  },
  "hash": "2d9e36c76a1e214c53d9dc2aa3debe1d03998be169a306b63a0ca1beaa07397f"
}
15
apps/labrinth/.sqlx/query-3240e4b5abc9850b5d3c09fafcac71674941487c15be1e8ce0ebc78e7c26b34d.json
generated
Normal file
@@ -0,0 +1,15 @@
{
  "db_name": "PostgreSQL",
  "query": "\n UPDATE mods\n SET status = $1\n FROM delphi_report_issues dri\n INNER JOIN delphi_reports dr ON dr.id = dri.report_id\n INNER JOIN files f ON f.id = dr.file_id\n INNER JOIN versions v ON v.id = f.version_id\n INNER JOIN mods m ON v.mod_id = m.id\n WHERE dri.id = $2\n ",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        "Varchar",
        "Int8"
      ]
    },
    "nullable": []
  },
  "hash": "3240e4b5abc9850b5d3c09fafcac71674941487c15be1e8ce0ebc78e7c26b34d"
}
@@ -1,6 +1,6 @@
 {
   "db_name": "PostgreSQL",
-  "query": "\n SELECT id FROM mods\n WHERE status = $1\n ORDER BY queued ASC\n OFFSET $3\n LIMIT $2\n ",
+  "query": "\n INSERT INTO delphi_report_issues (report_id, issue_type)\n VALUES ($1, $2)\n RETURNING id\n ",
   "describe": {
     "columns": [
       {
@@ -11,14 +11,13 @@
     ],
     "parameters": {
       "Left": [
-        "Text",
         "Int8",
-        "Int8"
+        "Text"
       ]
     },
     "nullable": [
       false
     ]
   },
-  "hash": "ccb0315ff52ea4402f53508334a7288fc9f8e77ffd7bce665441ff682384cbf9"
+  "hash": "33f26ce7e262d7c5707d05fe926390683636bbde53a51ee61fa18ef49cea8c3a"
 }
29
apps/labrinth/.sqlx/query-3473715e4ff6efb6707f73e8ddf19ef7bcbb341c7ffea3d13acd250bb20e6d07.json
generated
Normal file
@@ -0,0 +1,29 @@
{
  "db_name": "PostgreSQL",
  "query": "\n UPDATE mods\n SET status = $1\n FROM mods m\n INNER JOIN threads t ON t.mod_id = m.id\n WHERE m.id = $2\n RETURNING\n t.id AS \"thread_id: DBThreadId\",\n (SELECT status FROM mods WHERE id = m.id) AS \"old_status!\"\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "thread_id: DBThreadId",
        "type_info": "Int8"
      },
      {
        "ordinal": 1,
        "name": "old_status!",
        "type_info": "Varchar"
      }
    ],
    "parameters": {
      "Left": [
        "Varchar",
        "Int8"
      ]
    },
    "nullable": [
      false,
      null
    ]
  },
  "hash": "3473715e4ff6efb6707f73e8ddf19ef7bcbb341c7ffea3d13acd250bb20e6d07"
}
49
apps/labrinth/.sqlx/query-3961aa17ce3219c057c398dca0ed3aaeb30a8da4721959fdee99cf649a8b29e3.json
generated
Normal file
@@ -0,0 +1,49 @@
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT\n project_id AS \"project_id: DBProjectId\",\n project_thread_id AS \"project_thread_id: DBThreadId\",\n report AS \"report!: sqlx::types::Json<FileReport>\"\n FROM (\n SELECT DISTINCT ON (dr.id)\n dr.id AS report_id,\n dr.created AS report_created,\n dr.severity AS report_severity,\n m.id AS project_id,\n t.id AS project_thread_id,\n\n to_jsonb(dr)\n || jsonb_build_object(\n 'file_id', to_base62(f.id),\n 'version_id', to_base62(v.id),\n 'project_id', to_base62(v.mod_id),\n 'file_name', f.filename,\n 'file_size', f.size,\n 'flag_reason', 'delphi',\n 'download_url', f.url,\n -- TODO: replace with `json_array` in Postgres 16\n 'issues', (\n SELECT json_agg(\n to_jsonb(dri)\n || jsonb_build_object(\n -- TODO: replace with `json_array` in Postgres 16\n 'details', (\n SELECT json_agg(\n jsonb_build_object(\n 'id', drid.id,\n 'issue_id', drid.issue_id,\n 'key', drid.key,\n 'file_path', drid.file_path,\n -- ignore `decompiled_source`\n 'data', drid.data,\n 'severity', drid.severity\n )\n )\n FROM delphi_report_issue_details drid\n WHERE drid.issue_id = dri.id\n )\n )\n )\n FROM delphi_report_issues dri\n WHERE dri.report_id = dr.id\n )\n ) AS report\n FROM delphi_reports dr\n INNER JOIN files f ON f.id = dr.file_id\n INNER JOIN versions v ON v.id = f.version_id\n INNER JOIN mods m ON m.id = v.mod_id\n INNER JOIN threads t ON t.mod_id = m.id\n\n -- filtering\n LEFT JOIN mods_categories mc ON mc.joining_mod_id = m.id\n LEFT JOIN categories c ON c.id = mc.joining_category_id\n WHERE\n -- project type\n (cardinality($4::int[]) = 0 OR c.project_type = ANY($4::int[]))\n AND dr.status = $5\n ) t\n\n -- sorting\n ORDER BY\n CASE WHEN $3 = 'created_asc' THEN t.report_created ELSE TO_TIMESTAMP(0) END ASC,\n CASE WHEN $3 = 'created_desc' THEN t.report_created ELSE TO_TIMESTAMP(0) END DESC,\n CASE WHEN $3 = 'severity_asc' THEN t.report_severity ELSE 'low'::delphi_severity END ASC,\n CASE WHEN $3 = 'severity_desc' THEN t.report_severity ELSE 'low'::delphi_severity END DESC\n\n -- pagination\n LIMIT $1\n OFFSET $2\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "project_id: DBProjectId",
        "type_info": "Int8"
      },
      {
        "ordinal": 1,
        "name": "project_thread_id: DBThreadId",
        "type_info": "Int8"
      },
      {
        "ordinal": 2,
        "name": "report!: sqlx::types::Json<FileReport>",
        "type_info": "Jsonb"
      }
    ],
    "parameters": {
      "Left": [
        "Int8",
        "Int8",
        "Text",
        "Int4Array",
        {
          "Custom": {
            "name": "delphi_report_issue_status",
            "kind": {
              "Enum": [
                "pending",
                "safe",
                "unsafe"
              ]
            }
          }
        }
      ]
    },
    "nullable": [
      false,
      false,
      null
    ]
  },
  "hash": "3961aa17ce3219c057c398dca0ed3aaeb30a8da4721959fdee99cf649a8b29e3"
}
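The queries above expose IDs through a `to_base62` SQL helper, matching the commit "expose base62 file IDs more consistently for Delphi". As a rough illustration of what base62 encoding of an ID looks like, here is a minimal Python sketch; it assumes the conventional `0-9A-Za-z` digit order, and Labrinth's actual `to_base62` alphabet may differ.

```python
# Minimal base62 encoder. ASSUMPTION: conventional 0-9A-Za-z digit order;
# Labrinth's to_base62 may use a different alphabet.
ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

def to_base62(n: int) -> str:
    """Encode a non-negative integer as a base62 string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        n, rem = divmod(n, 62)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))
```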
22
apps/labrinth/.sqlx/query-3e2804a3443239104b2d8b095941fe1472402338e0f0bb323b6147d2a0cc4eca.json
generated
Normal file
@@ -0,0 +1,22 @@
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT DISTINCT ON (dr.id)\n to_jsonb(dr)\n || jsonb_build_object(\n 'file_id', to_base62(f.id),\n 'version_id', to_base62(v.id),\n 'project_id', to_base62(v.mod_id),\n 'file_name', f.filename,\n 'file_size', f.size,\n 'flag_reason', 'delphi',\n 'download_url', f.url,\n -- TODO: replace with `json_array` in Postgres 16\n 'issues', (\n SELECT json_agg(\n to_jsonb(dri)\n || jsonb_build_object(\n -- TODO: replace with `json_array` in Postgres 16\n 'details', (\n SELECT json_agg(to_jsonb(drid))\n FROM delphi_report_issue_details drid\n WHERE drid.issue_id = dri.id\n )\n )\n )\n FROM delphi_report_issues dri\n WHERE\n dri.report_id = dr.id\n -- see delphi.rs todo comment\n AND dri.issue_type != '__dummy'\n )\n ) AS \"data!: sqlx::types::Json<FileReport>\"\n FROM delphi_reports dr\n INNER JOIN files f ON f.id = dr.file_id\n INNER JOIN versions v ON v.id = f.version_id\n WHERE dr.id = $1\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "data!: sqlx::types::Json<FileReport>",
        "type_info": "Jsonb"
      }
    ],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": [
      null
    ]
  },
  "hash": "3e2804a3443239104b2d8b095941fe1472402338e0f0bb323b6147d2a0cc4eca"
}
22
apps/labrinth/.sqlx/query-52ef6d02f8d533fc4e4ceb141d07a2eb115dc88da24735fffeca3eb1c269ad53.json
generated
Normal file
@@ -0,0 +1,22 @@
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT\n didws.id AS \"issue_detail_id!\"\n FROM mods m\n INNER JOIN versions v ON v.mod_id = m.id\n INNER JOIN files f ON f.version_id = v.id\n INNER JOIN delphi_reports dr ON dr.file_id = f.id\n INNER JOIN delphi_report_issues dri ON dri.report_id = dr.id\n INNER JOIN delphi_issue_details_with_statuses didws ON didws.issue_id = dri.id\n WHERE\n m.id = $1\n AND didws.status = 'pending'\n -- see delphi.rs todo comment\n AND dri.issue_type != '__dummy'\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "issue_detail_id!",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": [
      true
    ]
  },
  "hash": "52ef6d02f8d533fc4e4ceb141d07a2eb115dc88da24735fffeca3eb1c269ad53"
}
15
apps/labrinth/.sqlx/query-6cf1862b3c197d42f9183dcbbd3d07b7d42c37e089403961ee16be0f99958ea0.json
generated
Normal file
@@ -0,0 +1,15 @@
{
  "db_name": "PostgreSQL",
  "query": "\n UPDATE mods\n SET status = $1\n FROM delphi_reports dr\n INNER JOIN files f ON f.id = dr.file_id\n INNER JOIN versions v ON v.id = f.version_id\n INNER JOIN mods m ON v.mod_id = m.id\n WHERE dr.id = $2\n ",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        "Varchar",
        "Int8"
      ]
    },
    "nullable": []
  },
  "hash": "6cf1862b3c197d42f9183dcbbd3d07b7d42c37e089403961ee16be0f99958ea0"
}
37
apps/labrinth/.sqlx/query-6f5ec5cee9fc0007d11b4707b4442917689c31af7dd9a6baea4dbde99dc1a08e.json
generated
Normal file
File diff suppressed because one or more lines are too long
22
apps/labrinth/.sqlx/query-7d1f49699e242f3e002afee9bf466b6696052ac6d5ebe131b9e7242104f700af.json
generated
Normal file
@@ -0,0 +1,22 @@
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT\n to_jsonb(dri)\n || jsonb_build_object(\n -- TODO: replace with `json_array` in Postgres 16\n 'details', (\n SELECT json_agg(to_jsonb(drid))\n FROM delphi_report_issue_details drid\n WHERE drid.issue_id = dri.id\n )\n ) AS \"data!: sqlx::types::Json<FileIssue>\"\n FROM delphi_report_issues dri\n WHERE dri.id = $1\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "data!: sqlx::types::Json<FileIssue>",
        "type_info": "Jsonb"
      }
    ],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": [
      null
    ]
  },
  "hash": "7d1f49699e242f3e002afee9bf466b6696052ac6d5ebe131b9e7242104f700af"
}
14
apps/labrinth/.sqlx/query-9ab1f07c2968b5d445752c1480345c1fa3af3a899b232482aab9cc44b9336063.json
generated
Normal file
@@ -0,0 +1,14 @@
{
  "db_name": "PostgreSQL",
  "query": "\n DELETE FROM delphi_report_issue_details drid\n WHERE issue_id IN (\n SELECT dri.id\n FROM mods m\n INNER JOIN versions v ON v.mod_id = m.id\n INNER JOIN files f ON f.version_id = v.id\n INNER JOIN delphi_reports dr ON dr.file_id = f.id\n INNER JOIN delphi_report_issues dri ON dri.report_id = dr.id\n WHERE m.id = $1 AND dri.issue_type = '__dummy'\n )\n ",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": []
  },
  "hash": "9ab1f07c2968b5d445752c1480345c1fa3af3a899b232482aab9cc44b9336063"
}
26
apps/labrinth/.sqlx/query-b1df83f4592701f8aa03f6d16bac9e2bd27ac9a87987eafd79b06f1c4ecdb659.json
generated
Normal file
@@ -0,0 +1,26 @@
{
  "db_name": "PostgreSQL",
  "query": "\n UPDATE delphi_report_issues\n SET status = $1\n WHERE id = $2\n ",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        {
          "Custom": {
            "name": "delphi_report_issue_status",
            "kind": {
              "Enum": [
                "pending",
                "safe",
                "unsafe"
              ]
            }
          }
        },
        "Int8"
      ]
    },
    "nullable": []
  },
  "hash": "b1df83f4592701f8aa03f6d16bac9e2bd27ac9a87987eafd79b06f1c4ecdb659"
}
39
apps/labrinth/.sqlx/query-b65094517546487e43b65a76aa38efd9e422151b683d9897a071ee0c4bac1cd4.json
generated
Normal file
@@ -0,0 +1,39 @@
{
  "db_name": "PostgreSQL",
  "query": "\n INSERT INTO delphi_report_issue_details (issue_id, key, file_path, decompiled_source, data, severity)\n VALUES ($1, $2, $3, $4, $5, $6)\n RETURNING id\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "id",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Int8",
        "Text",
        "Text",
        "Text",
        "Jsonb",
        {
          "Custom": {
            "name": "delphi_severity",
            "kind": {
              "Enum": [
                "low",
                "medium",
                "high",
                "severe"
              ]
            }
          }
        }
      ]
    },
    "nullable": [
      false
    ]
  },
  "hash": "b65094517546487e43b65a76aa38efd9e422151b683d9897a071ee0c4bac1cd4"
}
14
apps/labrinth/.sqlx/query-c7c72cf1f98cbc2b647ab840bdfadf1de8aaf214b32a2aab299a0d87fd2dc453.json
generated
Normal file
@@ -0,0 +1,14 @@
{
  "db_name": "PostgreSQL",
  "query": "DELETE FROM delphi_report_issue_details WHERE issue_id = $1",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": []
  },
  "hash": "c7c72cf1f98cbc2b647ab840bdfadf1de8aaf214b32a2aab299a0d87fd2dc453"
}
28
apps/labrinth/.sqlx/query-cd630ba950611b387fb5b04999a061d930ff06a8a928ff1cea6a723bb37c1b75.json
generated
Normal file
@@ -0,0 +1,28 @@
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT\n f.url,\n COUNT(dr.id) AS \"report_count!\"\n FROM files f\n LEFT JOIN delphi_reports dr ON dr.file_id = f.id\n WHERE f.id = $1\n GROUP BY f.url\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "url",
        "type_info": "Varchar"
      },
      {
        "ordinal": 1,
        "name": "report_count!",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": [
      false,
      null
    ]
  },
  "hash": "cd630ba950611b387fb5b04999a061d930ff06a8a928ff1cea6a723bb37c1b75"
}
26
apps/labrinth/.sqlx/query-cfe6c9e2abba8e9c1cd7aa799a6a95f2732f1a7611ea6f7ce49cd7e077761ebf.json
generated
Normal file
@@ -0,0 +1,26 @@
{
  "db_name": "PostgreSQL",
  "query": "\n INSERT INTO delphi_issue_detail_verdicts (\n project_id,\n detail_key,\n verdict\n )\n SELECT\n didws.project_id,\n didws.key,\n $1\n FROM delphi_issue_details_with_statuses didws\n INNER JOIN delphi_report_issues dri ON dri.id = didws.issue_id\n WHERE\n didws.id = $2\n -- see delphi.rs todo comment\n AND dri.issue_type != '__dummy'\n ",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        {
          "Custom": {
            "name": "delphi_report_issue_status",
            "kind": {
              "Enum": [
                "pending",
                "safe",
                "unsafe"
              ]
            }
          }
        },
        "Int8"
      ]
    },
    "nullable": []
  },
  "hash": "cfe6c9e2abba8e9c1cd7aa799a6a95f2732f1a7611ea6f7ce49cd7e077761ebf"
}
24
apps/labrinth/.sqlx/query-d30290c1b55d9fb0939d122a96f350233d40ad81ac2d16481a0e9b32424a999d.json
generated
Normal file
@@ -0,0 +1,24 @@
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT id\n FROM (\n SELECT DISTINCT ON (m.id)\n m.id,\n m.queued\n FROM mods m\n\n -- exclude projects in tech review queue\n LEFT JOIN delphi_issue_details_with_statuses didws\n ON didws.project_id = m.id AND didws.status = 'pending'\n\n WHERE\n m.status = $1\n AND didws.status IS NULL\n\n GROUP BY m.id\n ) t\n\n ORDER BY queued ASC\n OFFSET $3\n LIMIT $2\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "id",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Text",
        "Int8",
        "Int8"
      ]
    },
    "nullable": [
      false
    ]
  },
  "hash": "d30290c1b55d9fb0939d122a96f350233d40ad81ac2d16481a0e9b32424a999d"
}
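The queue query above drops any project that still has a pending issue detail, then pages the remainder with OFFSET/LIMIT, oldest-queued first. A small Python sketch of that filter-then-paginate shape, using hypothetical in-memory rows in place of the `mods` table and the statuses view:

```python
# Hypothetical stand-ins for the mods table and the
# delphi_issue_details_with_statuses view; illustration only.
mods = [
    {"id": 1, "status": "processing", "queued": 10},
    {"id": 2, "status": "processing", "queued": 5},
    {"id": 3, "status": "approved", "queued": 1},
]
pending_details = {1}  # project IDs with at least one 'pending' detail

def queue_page(status: str, limit: int, offset: int) -> list[int]:
    """Projects in `status`, minus those under tech review, oldest first."""
    eligible = [
        m for m in mods
        if m["status"] == status and m["id"] not in pending_details
    ]
    eligible.sort(key=lambda m: m["queued"])  # ORDER BY queued ASC
    return [m["id"] for m in eligible[offset:offset + limit]]
```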
37
apps/labrinth/.sqlx/query-f2054ae7dcc89b21ed6b2f04526de1e7cddd68ac956143bef994104280a8dc07.json
generated
Normal file
@@ -0,0 +1,37 @@
{
  "db_name": "PostgreSQL",
  "query": "\n INSERT INTO delphi_reports (file_id, delphi_version, artifact_url, severity)\n VALUES ($1, $2, $3, $4)\n ON CONFLICT (file_id, delphi_version) DO UPDATE SET\n delphi_version = $2, artifact_url = $3, created = CURRENT_TIMESTAMP, severity = $4\n RETURNING id\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "id",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Int8",
        "Int4",
        "Varchar",
        {
          "Custom": {
            "name": "delphi_severity",
            "kind": {
              "Enum": [
                "low",
                "medium",
                "high",
                "severe"
              ]
            }
          }
        }
      ]
    },
    "nullable": [
      false
    ]
  },
  "hash": "f2054ae7dcc89b21ed6b2f04526de1e7cddd68ac956143bef994104280a8dc07"
}
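The INSERT above relies on `ON CONFLICT (file_id, delphi_version) DO UPDATE`, so re-scanning the same file with the same Delphi version refreshes the existing report row instead of creating a duplicate. A rough Python sketch of that upsert behavior over a dict keyed by the unique pair (names here are illustrative, not Labrinth code):

```python
# reports maps the unique (file_id, delphi_version) pair to a report row.
reports: dict[tuple[int, int], dict] = {}
next_id = 1

def upsert_report(file_id: int, delphi_version: int,
                  artifact_url: str, severity: str) -> int:
    """Insert a report, or refresh the row sharing (file_id, delphi_version)."""
    global next_id
    key = (file_id, delphi_version)
    if key in reports:
        # DO UPDATE branch: same row id, refreshed payload
        reports[key].update(artifact_url=artifact_url, severity=severity)
    else:
        reports[key] = {"id": next_id, "artifact_url": artifact_url,
                        "severity": severity}
        next_id += 1
    return reports[key]["id"]
```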
22
apps/labrinth/.sqlx/query-f6432d7a3c67e058c0e9da42f23ea29fa063b416c18dc857132127db95ff17f3.json
generated
Normal file
@@ -0,0 +1,22 @@
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT t.id AS \"thread_id: DBThreadId\"\n FROM mods m\n INNER JOIN threads t ON t.mod_id = m.id\n WHERE m.id = $1\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "thread_id: DBThreadId",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": [
      false
    ]
  },
  "hash": "f6432d7a3c67e058c0e9da42f23ea29fa063b416c18dc857132127db95ff17f3"
}
20
apps/labrinth/.sqlx/query-fe571872262fe7d119b4b6eb1e55d818fde0499d8e5a08e9e22bee42014877f3.json
generated
Normal file
@@ -0,0 +1,20 @@
{
  "db_name": "PostgreSQL",
  "query": "SELECT MAX(delphi_version) FROM delphi_reports",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "max",
        "type_info": "Int4"
      }
    ],
    "parameters": {
      "Left": []
    },
    "nullable": [
      null
    ]
  },
  "hash": "fe571872262fe7d119b4b6eb1e55d818fde0499d8e5a08e9e22bee42014877f3"
}
90
apps/labrinth/fixtures/delphi-report-2025-11-15.sql
Normal file
File diff suppressed because one or more lines are too long
47
apps/labrinth/migrations/20250810155316_delphi-reports.sql
Normal file
@@ -0,0 +1,47 @@
CREATE TYPE delphi_severity AS ENUM ('low', 'medium', 'high', 'severe');

CREATE TYPE delphi_report_issue_status AS ENUM ('pending', 'safe', 'unsafe');

-- A Delphi analysis report for a project version
CREATE TABLE delphi_reports (
    id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
    file_id BIGINT REFERENCES files (id)
        ON DELETE SET NULL
        ON UPDATE CASCADE,
    delphi_version INTEGER NOT NULL,
    artifact_url VARCHAR(2048) NOT NULL,
    created TIMESTAMPTZ DEFAULT CURRENT_TIMESTAMP NOT NULL,
    severity DELPHI_SEVERITY NOT NULL,
    UNIQUE (file_id, delphi_version)
);
CREATE INDEX delphi_version ON delphi_reports (delphi_version);

-- An issue found in a Delphi report. Every issue belongs to a report,
-- and a report can have zero, one, or more issues attached to it
CREATE TABLE delphi_report_issues (
    id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
    report_id BIGINT NOT NULL REFERENCES delphi_reports (id)
        ON DELETE CASCADE
        ON UPDATE CASCADE,
    issue_type TEXT NOT NULL,
    status DELPHI_REPORT_ISSUE_STATUS NOT NULL,
    UNIQUE (report_id, issue_type)
);
CREATE INDEX delphi_report_issue_by_status_and_type ON delphi_report_issues (status, issue_type);

-- The details of a Delphi report issue, which contain data about a
-- Java class affected by it. Every Delphi report issue details object
-- belongs to a specific issue, and an issue can have zero, one, or
-- more details attached to it. (Some issues may be artifact-wide,
-- or otherwise not really specific to any particular class.)
CREATE TABLE delphi_report_issue_details (
    id BIGINT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
    issue_id BIGINT NOT NULL REFERENCES delphi_report_issues (id)
        ON DELETE CASCADE
        ON UPDATE CASCADE,
    key TEXT NOT NULL,
    file_path TEXT NOT NULL,
    decompiled_source TEXT,
    data JSONB NOT NULL,
    severity DELPHI_SEVERITY NOT NULL
);
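The `delphi_severity` enum declares its labels in ascending order (`low` < `medium` < `high` < `severe`); Postgres orders enum values by declaration order, which is what lets the tech review queue sort reports by severity with plain `ASC`/`DESC`. A sketch of that ordering in Python, for illustration only:

```python
from enum import IntEnum

class DelphiSeverity(IntEnum):
    """Mirrors the delphi_severity ENUM label order: low < medium < high < severe."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    SEVERE = 4

# Sorting by severity DESC surfaces 'severe' reports first,
# just as ORDER BY report_severity DESC does in the queue query.
reports = [DelphiSeverity.MEDIUM, DelphiSeverity.SEVERE, DelphiSeverity.LOW]
by_severity = sorted(reports, reverse=True)
```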
@@ -0,0 +1,2 @@
ALTER TABLE delphi_reports
    ADD COLUMN status delphi_report_issue_status NOT NULL DEFAULT 'pending';
@@ -0,0 +1,8 @@
ALTER TABLE delphi_reports
    DROP COLUMN status;

ALTER TABLE delphi_report_issues
    DROP COLUMN status;

ALTER TABLE delphi_report_issue_details
    ADD COLUMN status DELPHI_REPORT_ISSUE_STATUS NOT NULL DEFAULT 'pending';
@@ -0,0 +1,26 @@
|
||||
ALTER TABLE delphi_report_issue_details
|
||||
DROP COLUMN status;
|
||||
|
||||
CREATE TABLE delphi_issue_detail_verdicts (
|
||||
project_id BIGINT REFERENCES mods(id)
|
||||
ON DELETE SET NULL
|
||||
ON UPDATE CASCADE,
|
||||
detail_key TEXT NOT NULL,
|
||||
verdict delphi_report_issue_status NOT NULL,
|
||||
PRIMARY KEY (project_id, detail_key)
|
||||
);
|
||||
|
||||
CREATE VIEW delphi_issue_details_with_statuses AS
|
||||
SELECT
|
||||
drid.*,
|
||||
m.id AS project_id,
|
||||
COALESCE(didv.verdict, 'pending') AS status
|
||||
FROM delphi_report_issue_details drid
|
||||
INNER JOIN delphi_report_issues dri ON dri.id = drid.issue_id
|
||||
INNER JOIN delphi_reports dr ON dr.id = dri.report_id
|
||||
INNER JOIN files f ON f.id = dr.file_id
|
||||
INNER JOIN versions v ON v.id = f.version_id
|
||||
INNER JOIN mods m ON m.id = v.mod_id
|
||||
LEFT JOIN delphi_issue_detail_verdicts didv
|
||||
ON m.id = didv.project_id
|
||||
AND drid.key = didv.detail_key;
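Reviewer note: the `COALESCE(didv.verdict, 'pending')` in the view above is what gives every issue detail an effective status even when no moderator verdict row joins in. A minimal sketch of that fallback rule in plain Rust (names are illustrative, not from the codebase):

```rust
// Mirrors COALESCE(didv.verdict, 'pending') from the view: an issue
// detail with no recorded verdict is treated as still pending review.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Status {
    Pending,
    Safe,
    Unsafe,
}

// `verdict` stands in for the (possibly absent) row from
// delphi_issue_detail_verdicts for this (project_id, detail_key) pair.
pub fn effective_status(verdict: Option<Status>) -> Status {
    verdict.unwrap_or(Status::Pending)
}
```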
@@ -113,7 +113,15 @@ impl AuthenticationError {
}

#[derive(
    Serialize, Deserialize, Default, Eq, PartialEq, Clone, Copy, Debug,
    Debug,
    Clone,
    Copy,
    PartialEq,
    Eq,
    Default,
    Serialize,
    Deserialize,
    utoipa::ToSchema,
)]
#[serde(rename_all = "lowercase")]
pub enum AuthProvider {

apps/labrinth/src/database/models/delphi_report_item.rs (new file, 266 lines)
@@ -0,0 +1,266 @@
use std::{
    collections::HashMap,
    fmt::{self, Display, Formatter},
};

use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use sqlx::types::Json;

use crate::database::models::{
    DBFileId, DBProjectId, DatabaseError, DelphiReportId,
    DelphiReportIssueDetailsId, DelphiReportIssueId,
};

/// A Delphi malware analysis report for a project version file.
///
/// Malware analysis reports usually belong to a specific project file,
/// but they can get orphaned if the versions they belong to are deleted.
/// Thus, deleting versions does not delete these reports.
#[derive(Serialize)]
pub struct DBDelphiReport {
    pub id: DelphiReportId,
    pub file_id: Option<DBFileId>,
    /// A sequential, monotonically increasing version number for the
    /// Delphi version that generated this report.
    pub delphi_version: i32,
    pub artifact_url: String,
    pub created: DateTime<Utc>,
    pub severity: DelphiSeverity,
}

impl DBDelphiReport {
    pub async fn upsert(
        &self,
        transaction: &mut sqlx::Transaction<'_, sqlx::Postgres>,
    ) -> Result<DelphiReportId, DatabaseError> {
        Ok(DelphiReportId(sqlx::query_scalar!(
            "
            INSERT INTO delphi_reports (file_id, delphi_version, artifact_url, severity)
            VALUES ($1, $2, $3, $4)
            ON CONFLICT (file_id, delphi_version) DO UPDATE SET
                delphi_version = $2, artifact_url = $3, created = CURRENT_TIMESTAMP, severity = $4
            RETURNING id
            ",
            self.file_id as Option<DBFileId>,
            self.delphi_version,
            self.artifact_url,
            self.severity as DelphiSeverity,
        )
        .fetch_one(&mut **transaction)
        .await?))
    }
}

/// A severity level reported by Delphi.
#[derive(
    Deserialize,
    Serialize,
    Debug,
    Clone,
    Copy,
    PartialEq,
    Eq,
    Hash,
    sqlx::Type,
    utoipa::ToSchema,
)]
// The canonical serialized form of this enum is the snake_case representation.
// We add `alias`es so we can deserialize it from how Delphi sends it,
// which follows the Java conventions of `SCREAMING_SNAKE_CASE`.
#[serde(rename_all = "snake_case")]
#[sqlx(type_name = "delphi_severity", rename_all = "snake_case")]
pub enum DelphiSeverity {
    #[serde(alias = "LOW")]
    Low,
    #[serde(alias = "MEDIUM")]
    Medium,
    #[serde(alias = "HIGH")]
    High,
    #[serde(alias = "SEVERE")]
    Severe,
}
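Reviewer note: the `alias` attributes above make one canonical snake_case form coexist with Delphi's Java-style `SCREAMING_SNAKE_CASE` input. A dependency-free, hand-rolled stand-in showing the same acceptance rule (illustrative only; the real code relies on serde):

```rust
// Illustrative stand-in for the serde `alias` behavior: both the
// canonical snake_case form and Delphi's SCREAMING_SNAKE_CASE parse.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Severity {
    Low,
    Medium,
    High,
    Severe,
}

pub fn parse_severity(s: &str) -> Option<Severity> {
    match s {
        "low" | "LOW" => Some(Severity::Low),
        "medium" | "MEDIUM" => Some(Severity::Medium),
        "high" | "HIGH" => Some(Severity::High),
        "severe" | "SEVERE" => Some(Severity::Severe),
        _ => None,
    }
}
```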

/// An issue found in a Delphi report. Every issue belongs to a report,
/// and a report can have zero, one, or more issues attached to it.
#[derive(Deserialize, Serialize)]
pub struct DBDelphiReportIssue {
    pub id: DelphiReportIssueId,
    pub report_id: DelphiReportId,
    pub issue_type: String,
}

/// A status a Delphi report issue can have.
#[derive(
    Deserialize,
    Serialize,
    Debug,
    Clone,
    Copy,
    PartialEq,
    Eq,
    Hash,
    sqlx::Type,
    utoipa::ToSchema,
)]
#[serde(rename_all = "snake_case")]
#[sqlx(type_name = "delphi_report_issue_status", rename_all = "snake_case")]
pub enum DelphiStatus {
    /// The issue is pending review by the moderation team.
    Pending,
    /// The issue has been rejected (i.e., reviewed as a false positive).
    /// The affected artifact has thus been verified to be clean, other issues
    /// with it notwithstanding.
    Safe,
    /// The issue has been approved (i.e., reviewed as a valid, true positive).
    /// The affected artifact has thus been verified to be potentially malicious.
    Unsafe,
}

impl Display for DelphiStatus {
    fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
        self.serialize(f)
    }
}

/// What verdict a moderator can give to a project flagged for technical review.
#[derive(
    Deserialize,
    Serialize,
    Debug,
    Clone,
    Copy,
    PartialEq,
    Eq,
    Hash,
    sqlx::Type,
    utoipa::ToSchema,
)]
#[serde(rename_all = "snake_case")]
pub enum DelphiVerdict {
    /// The issue has been rejected (i.e., reviewed as a false positive).
    /// The affected artifact has thus been verified to be clean, other issues
    /// with it notwithstanding.
    Safe,
    /// The issue has been approved (i.e., reviewed as a valid, true positive).
    /// The affected artifact has thus been verified to be potentially malicious.
    Unsafe,
}

/// An order in which Delphi report issues can be sorted during queries.
#[derive(Deserialize, Serialize, Debug, Clone, Copy, PartialEq, Eq, Hash)]
#[serde(rename_all = "snake_case")]
pub enum DelphiReportListOrder {
    CreatedAsc,
    CreatedDesc,
    PendingStatusFirst,
    SeverityAsc,
    SeverityDesc,
}

impl Display for DelphiReportListOrder {
    fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
        self.serialize(f)
    }
}
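Reviewer note: `SeverityAsc`/`SeverityDesc` assume severities have a total order from `Low` up to `Severe`. The actual ordering happens in SQL; a conceptual sketch of the same idea in Rust, where derived `Ord` follows declaration order (this enum is a stand-in, not the crate's):

```rust
// Declaration order gives Low < Medium < High < Severe via derived Ord,
// matching the intent of SeverityAsc/SeverityDesc.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
pub enum SeverityRank {
    Low,
    Medium,
    High,
    Severe,
}

// SeverityDesc: most severe issues first.
pub fn sort_desc(mut severities: Vec<SeverityRank>) -> Vec<SeverityRank> {
    severities.sort_by(|a, b| b.cmp(a));
    severities
}
```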

/// A result returned from a Delphi report issue query, slightly
/// denormalized with related entity information for ease of
/// consumption by clients.
#[derive(Serialize)]
pub struct DelphiReportIssueResult {
    pub issue: DBDelphiReportIssue,
    pub report: DBDelphiReport,
    pub details: Vec<ReportIssueDetail>,
    pub project_id: Option<DBProjectId>,
    pub project_published: Option<DateTime<Utc>>,
}

impl DBDelphiReportIssue {
    pub async fn insert(
        &self,
        transaction: &mut sqlx::Transaction<'_, sqlx::Postgres>,
    ) -> Result<DelphiReportIssueId, DatabaseError> {
        Ok(DelphiReportIssueId(
            sqlx::query_scalar!(
                "
                INSERT INTO delphi_report_issues (report_id, issue_type)
                VALUES ($1, $2)
                RETURNING id
                ",
                self.report_id as DelphiReportId,
                self.issue_type,
            )
            .fetch_one(&mut **transaction)
            .await?,
        ))
    }
}

/// The details of a Delphi report issue, which contain data about a
/// Java class affected by it. Every Delphi report issue details object
/// belongs to a specific issue, and an issue can have zero, one, or
/// more details attached to it. (Some issues may be artifact-wide,
/// or otherwise not really specific to any particular class.)
#[derive(
    Debug, Clone, Deserialize, Serialize, utoipa::ToSchema, sqlx::FromRow,
)]
pub struct ReportIssueDetail {
    /// ID of this issue detail.
    pub id: DelphiReportIssueDetailsId,
    /// ID of the issue this detail belongs to.
    pub issue_id: DelphiReportIssueId,
    /// Opaque identifier for where this issue detail is located, relative to
    /// the file scanned.
    ///
    /// This acts as a stable identifier for an issue detail, even across
    /// different versions of the same file.
    pub key: String,
    /// Name of the Java class path in which this issue was found.
    pub file_path: String,
    /// Decompiled, pretty-printed source of the Java class.
    pub decompiled_source: Option<String>,
    /// Extra detail-specific info for this detail.
    #[sqlx(json)]
    pub data: HashMap<String, serde_json::Value>,
    /// How important is this issue, as flagged by Delphi?
    pub severity: DelphiSeverity,
    /// Has this issue detail been marked as safe or unsafe?
    pub status: DelphiStatus,
}

impl ReportIssueDetail {
    pub async fn insert(
        &self,
        transaction: &mut sqlx::Transaction<'_, sqlx::Postgres>,
    ) -> Result<DelphiReportIssueDetailsId, DatabaseError> {
        Ok(DelphiReportIssueDetailsId(sqlx::query_scalar!(
            "
            INSERT INTO delphi_report_issue_details (issue_id, key, file_path, decompiled_source, data, severity)
            VALUES ($1, $2, $3, $4, $5, $6)
            RETURNING id
            ",
            self.issue_id as DelphiReportIssueId,
            self.key,
            self.file_path,
            self.decompiled_source,
            sqlx::types::Json(&self.data) as Json<&HashMap<String, serde_json::Value>>,
            self.severity as DelphiSeverity,
        )
        .fetch_one(&mut **transaction)
        .await?))
    }

    pub async fn remove_all_by_issue_id(
        issue_id: DelphiReportIssueId,
        transaction: &mut sqlx::Transaction<'_, sqlx::Postgres>,
    ) -> Result<u64, DatabaseError> {
        Ok(sqlx::query!(
            "DELETE FROM delphi_report_issue_details WHERE issue_id = $1",
            issue_id as DelphiReportIssueId,
        )
        .execute(&mut **transaction)
        .await?
        .rows_affected())
    }
}
@@ -94,7 +94,7 @@ macro_rules! generate_bulk_ids {

macro_rules! impl_db_id_interface {
    ($id_struct:ident, $db_id_struct:ident, $(, generator: $generator_function:ident @ $db_table:expr, $(bulk_generator: $bulk_generator_function:ident,)?)?) => {
        #[derive(Copy, Clone, Debug, Type, Serialize, Deserialize, PartialEq, Eq, Hash)]
        #[derive(Copy, Clone, Debug, Type, Serialize, Deserialize, PartialEq, Eq, Hash, utoipa::ToSchema)]
        #[sqlx(transparent)]
        pub struct $db_id_struct(pub i64);

@@ -140,8 +140,8 @@ macro_rules! db_id_interface {
    };
}

macro_rules! short_id_type {
    ($name:ident) => {
macro_rules! id_type {
    ($name:ident as $type:ty) => {
        #[derive(
            Copy,
            Clone,
@@ -152,9 +152,10 @@ macro_rules! short_id_type {
            Eq,
            PartialEq,
            Hash,
            utoipa::ToSchema,
        )]
        #[sqlx(transparent)]
        pub struct $name(pub i32);
        pub struct $name(pub $type);
    };
}

@@ -268,14 +269,17 @@ db_id_interface!(
    generator: generate_affiliate_code_id @ "affiliate_codes",
);

short_id_type!(CategoryId);
short_id_type!(GameId);
short_id_type!(LinkPlatformId);
short_id_type!(LoaderFieldEnumId);
short_id_type!(LoaderFieldEnumValueId);
short_id_type!(LoaderFieldId);
short_id_type!(LoaderId);
short_id_type!(NotificationActionId);
short_id_type!(ProjectTypeId);
short_id_type!(ReportTypeId);
short_id_type!(StatusId);
id_type!(CategoryId as i32);
id_type!(GameId as i32);
id_type!(LinkPlatformId as i32);
id_type!(LoaderFieldEnumId as i32);
id_type!(LoaderFieldEnumValueId as i32);
id_type!(LoaderFieldId as i32);
id_type!(LoaderId as i32);
id_type!(NotificationActionId as i32);
id_type!(ProjectTypeId as i32);
id_type!(ReportTypeId as i32);
id_type!(StatusId as i32);
id_type!(DelphiReportId as i64);
id_type!(DelphiReportIssueId as i64);
id_type!(DelphiReportIssueDetailsId as i64);
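Reviewer note: the diff above widens the old `short_id_type!` (hard-coded `i32`) into `id_type!`, which takes the integer width, so the new `i64` Delphi IDs reuse the same definition. A standalone, simplified sketch of that macro shape (demo names, fewer derives than the real one):

```rust
// Simplified sketch of the widened macro: the caller picks the integer
// width, so i32 "short" IDs and i64 Delphi IDs share one definition.
macro_rules! id_type {
    ($name:ident as $type:ty) => {
        #[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
        pub struct $name(pub $type);
    };
}

id_type!(DemoCategoryId as i32);
id_type!(DemoDelphiReportId as i64);
```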

@@ -4,6 +4,7 @@ pub mod affiliate_code_item;
pub mod categories;
pub mod charge_item;
pub mod collection_item;
pub mod delphi_report_item;
pub mod flow_item;
pub mod friend_item;
pub mod ids;

@@ -11,7 +11,7 @@ pub struct ThreadBuilder {
    pub report_id: Option<DBReportId>,
}

#[derive(Clone, Serialize)]
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
pub struct DBThread {
    pub id: DBThreadId,

@@ -30,7 +30,7 @@ pub struct ThreadMessageBuilder {
    pub hide_identity: bool,
}

#[derive(Serialize, Deserialize, Clone)]
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
pub struct DBThreadMessage {
    pub id: DBThreadMessageId,
    pub thread_id: DBThreadId,

@@ -6,6 +6,7 @@ use crate::database::models::loader_fields::{
};
use crate::database::redis::RedisPool;
use crate::models::projects::{FileType, VersionStatus};
use crate::routes::internal::delphi::DelphiRunParameters;
use chrono::{DateTime, Utc};
use dashmap::{DashMap, DashSet};
use futures::TryStreamExt;

@@ -164,6 +165,17 @@ impl VersionFileBuilder {
            .await?;
        }

        if let Err(err) = crate::routes::internal::delphi::run(
            &mut **transaction,
            DelphiRunParameters {
                file_id: file_id.into(),
            },
        )
        .await
        {
            tracing::error!("Error submitting new file to Delphi: {err}");
        }

        Ok(file_id)
    }
}
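Reviewer note: the hunk above deliberately logs and swallows Delphi submission failures so a scanner outage cannot fail version creation; the builder still returns `Ok(file_id)`. The shape of that decision, reduced to plain Rust (the error type and log line are stand-ins):

```rust
// Fire-and-report: a failed scan submission becomes a log line instead
// of an error bubbling up to the caller.
pub fn delphi_submit_outcome(result: Result<(), String>) -> Option<String> {
    result
        .err()
        .map(|err| format!("Error submitting new file to Delphi: {err}"))
}
```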

@@ -27,7 +27,9 @@ impl FileHost for MockHost {
        file_publicity: FileHostPublicity,
        file_bytes: Bytes,
    ) -> Result<UploadFileData, FileHostingError> {
        let path = get_file_path(file_name, file_publicity);
        let file_name = urlencoding::decode(file_name)
            .map_err(|_| FileHostingError::InvalidFilename)?;
        let path = get_file_path(&file_name, file_publicity);
        std::fs::create_dir_all(
            path.parent().ok_or(FileHostingError::InvalidFilename)?,
        )?;

@@ -21,6 +21,7 @@ use std::sync::Arc;
use tracing::{Instrument, error, info, info_span};
use tracing_actix_web::TracingLogger;
use utoipa::OpenApi;
use utoipa::openapi::security::{ApiKey, ApiKeyValue, SecurityScheme};
use utoipa_actix_web::AppExt;
use utoipa_swagger_ui::SwaggerUi;

@@ -262,9 +263,23 @@ async fn main() -> std::io::Result<()> {
}

#[derive(utoipa::OpenApi)]
#[openapi(info(title = "Labrinth"))]
#[openapi(info(title = "Labrinth"), modifiers(&SecurityAddon))]
struct ApiDoc;

struct SecurityAddon;

impl utoipa::Modify for SecurityAddon {
    fn modify(&self, openapi: &mut utoipa::openapi::OpenApi) {
        let components = openapi.components.as_mut().unwrap();
        components.add_security_scheme(
            "bearer_auth",
            SecurityScheme::ApiKey(ApiKey::Header(ApiKeyValue::new(
                "authorization",
            ))),
        );
    }
}

fn log_error(err: &actix_web::Error) {
    if err.as_response_error().status_code().is_client_error() {
        tracing::debug!(

@@ -94,6 +94,32 @@ impl From<crate::models::v3::threads::MessageBody> for LegacyMessageBody {
                new_status,
                old_status,
            },
            crate::models::v3::threads::MessageBody::TechReview { verdict } => {
                LegacyMessageBody::Text {
                    body: format!(
                        "(legacy) Reviewed technical report and gave verdict {verdict:?}"
                    ),
                    private: true,
                    replying_to: None,
                    associated_images: Vec::new(),
                }
            }
            crate::models::v3::threads::MessageBody::TechReviewEntered => {
                LegacyMessageBody::Text {
                    body: "(legacy) Entered technical review".into(),
                    private: true,
                    replying_to: None,
                    associated_images: Vec::new(),
                }
            }
            crate::models::v3::threads::MessageBody::TechReviewExitFileDeleted => {
                LegacyMessageBody::Text {
                    body: "(legacy) Exited technical review because file was deleted".into(),
                    private: true,
                    replying_to: None,
                    associated_images: Vec::new(),
                }
            }
            crate::models::v3::threads::MessageBody::ThreadClosure => {
                LegacyMessageBody::ThreadClosure
            }

@@ -5,7 +5,7 @@ use crate::database::models::loader_fields::VersionField;
use crate::database::models::project_item::{LinkUrl, ProjectQueryResult};
use crate::database::models::version_item::VersionQueryResult;
use crate::models::ids::{
    OrganizationId, ProjectId, TeamId, ThreadId, VersionId,
    FileId, OrganizationId, ProjectId, TeamId, ThreadId, VersionId,
};
use ariadne::ids::UserId;
use chrono::{DateTime, Utc};

@@ -731,6 +731,7 @@ impl From<VersionQueryResult> for Version {
            .files
            .into_iter()
            .map(|f| VersionFile {
                id: Some(FileId(f.id.0 as u64)),
                url: f.url,
                filename: f.filename,
                hashes: f.hashes,

@@ -855,6 +856,10 @@ impl VersionStatus {
/// A single project file, with a url for the file and the file's hash
#[derive(Serialize, Deserialize, Clone)]
pub struct VersionFile {
    /// The ID of the file. Every file has an ID once created, but it
    /// is not known until it indeed has been created.
    #[serde(default, skip_serializing_if = "Option::is_none")]
    pub id: Option<FileId>,
    /// A map of hashes of the file. The key is the hashing algorithm
    /// and the value is the string version of the hash.
    pub hashes: std::collections::HashMap<String, String>,

@@ -1,3 +1,4 @@
use crate::database::models::delphi_report_item::DelphiVerdict;
use crate::models::ids::{
    ImageId, ProjectId, ReportId, ThreadId, ThreadMessageId,
};
@@ -7,7 +8,7 @@ use ariadne::ids::UserId;
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize)]
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
pub struct Thread {
    pub id: ThreadId,
    #[serde(rename = "type")]

@@ -18,7 +19,7 @@ pub struct Thread {
    pub members: Vec<User>,
}

#[derive(Serialize, Deserialize)]
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
pub struct ThreadMessage {
    pub id: ThreadMessageId,
    pub author_id: Option<UserId>,

@@ -27,7 +28,7 @@ pub struct ThreadMessage {
    pub hide_identity: bool,
}

#[derive(Serialize, Deserialize, Clone)]
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
#[serde(tag = "type", rename_all = "snake_case")]
pub enum MessageBody {
    Text {

@@ -42,6 +43,11 @@ pub enum MessageBody {
        new_status: ProjectStatus,
        old_status: ProjectStatus,
    },
    TechReview {
        verdict: DelphiVerdict,
    },
    TechReviewEntered,
    TechReviewExitFileDeleted,
    ThreadClosure,
    ThreadReopen,
    Deleted {

@@ -50,7 +56,23 @@ pub enum MessageBody {
    },
}

#[derive(Serialize, Deserialize, Eq, PartialEq, Copy, Clone)]
impl MessageBody {
    pub fn is_private(&self) -> bool {
        match self {
            Self::Text { private, .. } | Self::Deleted { private } => *private,
            Self::TechReview { .. }
            | Self::TechReviewEntered
            | Self::TechReviewExitFileDeleted => true,
            Self::StatusChange { .. }
            | Self::ThreadClosure
            | Self::ThreadReopen => false,
        }
    }
}
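Reviewer note: centralizing privacy in `is_private` lets the thread filter collapse to one predicate (see the `.filter(|x| user.role.is_mod() || !x.body.is_private())` hunk below): a reader sees a message iff they are a moderator or the body is not private. That rule in miniature, with a stand-in message type:

```rust
// `Msg` stands in for ThreadMessage; only the privacy bit matters here.
pub struct Msg {
    pub private: bool,
}

// A message survives the filter iff the viewer is a moderator or the
// message is not private.
pub fn visible_count(is_mod: bool, msgs: &[Msg]) -> usize {
    msgs.iter().filter(|m| is_mod || !m.private).count()
}
```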

#[derive(
    Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize, utoipa::ToSchema,
)]
#[serde(rename_all = "snake_case")]
pub enum ThreadType {
    Report,

@@ -100,16 +122,7 @@ impl Thread {
            messages: data
                .messages
                .into_iter()
                .filter(|x| {
                    if let MessageBody::Text { private, .. } = x.body {
                        !private || user.role.is_mod()
                    } else if let MessageBody::Deleted { private, .. } = x.body
                    {
                        !private || user.role.is_mod()
                    } else {
                        true
                    }
                })
                .filter(|x| user.role.is_mod() || !x.body.is_private())
                .map(|x| ThreadMessage::from(x, user))
                .collect(),
            members: users,

@@ -8,7 +8,7 @@ use serde::{Deserialize, Serialize};
pub const DELETED_USER: UserId = UserId(127155982985829);

bitflags::bitflags! {
    #[derive(Copy, Clone, Debug)]
    #[derive(Debug, Clone, Copy)]
    pub struct Badges: u64 {
        const MIDAS = 1 << 0;
        const EARLY_MODPACK_ADOPTER = 1 << 1;

@@ -21,6 +21,23 @@ bitflags::bitflags! {
    }
}

impl utoipa::PartialSchema for Badges {
    fn schema() -> utoipa::openapi::RefOr<utoipa::openapi::schema::Schema> {
        u64::schema()
    }
}

impl utoipa::ToSchema for Badges {
    fn schemas(
        schemas: &mut Vec<(
            String,
            utoipa::openapi::RefOr<utoipa::openapi::schema::Schema>,
        )>,
    ) {
        u64::schemas(schemas);
    }
}
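Reviewer note: `Badges` is a `u64` bitmask on the wire, which is why the utoipa impls above just delegate to `u64`'s schema. A dependency-free sketch of the same representation (flag values copied from the bitflags block; `has_badge` is an illustrative helper, not the crate's API):

```rust
// Badges serialize as a raw u64 bitmask; each badge is one bit.
pub const MIDAS: u64 = 1 << 0;
pub const EARLY_MODPACK_ADOPTER: u64 = 1 << 1;

pub fn has_badge(badges: u64, badge: u64) -> bool {
    badges & badge != 0
}
```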

bitflags_serde_impl!(Badges, u64);

impl Default for Badges {
@@ -29,7 +46,7 @@ impl Default for Badges {
    }
}

#[derive(Serialize, Deserialize, Clone, Debug)]
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
pub struct User {
    pub id: UserId,
    pub username: String,

@@ -52,7 +69,7 @@ pub struct User {
    pub github_id: Option<u64>,
}

#[derive(Serialize, Deserialize, Clone, Debug)]
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
pub struct UserPayoutData {
    pub paypal_address: Option<String>,
    pub paypal_country: Option<String>,

@@ -137,7 +154,9 @@ impl User {
    }
}

#[derive(Serialize, Deserialize, PartialEq, Eq, Clone, Debug)]
#[derive(
    Debug, Clone, PartialEq, Eq, Hash, Serialize, Deserialize, utoipa::ToSchema,
)]
#[serde(rename_all = "lowercase")]
pub enum Role {
    Developer,

@@ -1,12 +1,9 @@
use crate::auth::validate::get_user_record_from_bearer_token;
use crate::database::models::thread_item::ThreadMessageBuilder;
use crate::database::redis::RedisPool;
use crate::models::analytics::Download;
use crate::models::ids::ProjectId;
use crate::models::pats::Scopes;
use crate::models::threads::MessageBody;
use crate::queue::analytics::AnalyticsQueue;
use crate::queue::moderation::AUTOMOD_ID;
use crate::queue::session::AuthQueue;
use crate::routes::ApiError;
use crate::search::SearchConfig;
@@ -17,17 +14,14 @@ use modrinth_maxmind::MaxMind;
use serde::Deserialize;
use sqlx::PgPool;
use std::collections::HashMap;
use std::fmt::Write;
use std::net::Ipv4Addr;
use std::sync::Arc;
use tracing::info;

pub fn config(cfg: &mut web::ServiceConfig) {
    cfg.service(
        web::scope("admin")
            .service(count_download)
            .service(force_reindex)
            .service(delphi_result_ingest),
            .service(force_reindex),
    );
}

@@ -163,98 +157,3 @@ pub async fn force_reindex(
    index_projects(pool.as_ref().clone(), redis.clone(), &config).await?;
    Ok(HttpResponse::NoContent().finish())
}

#[derive(Deserialize)]
pub struct DelphiIngest {
    pub url: String,
    pub project_id: crate::models::ids::ProjectId,
    pub version_id: crate::models::ids::VersionId,
    pub issues: HashMap<String, HashMap<String, String>>,
}

#[post("/_delphi", guard = "admin_key_guard")]
pub async fn delphi_result_ingest(
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    body: web::Json<DelphiIngest>,
) -> Result<HttpResponse, ApiError> {
    if body.issues.is_empty() {
        info!("No issues found for file {}", body.url);
        return Ok(HttpResponse::NoContent().finish());
    }

    let webhook_url = dotenvy::var("DELPHI_SLACK_WEBHOOK")?;

    let project = crate::database::models::DBProject::get_id(
        body.project_id.into(),
        &**pool,
        &redis,
    )
    .await?
    .ok_or_else(|| {
        ApiError::InvalidInput(format!(
            "Project {} does not exist",
            body.project_id
        ))
    })?;

    let mut header = format!("Suspicious traces found at {}", body.url);

    for (issue, trace) in &body.issues {
        for (path, code) in trace {
            write!(
                &mut header,
                "\n issue {issue} found at file {path}: \n ```\n{code}\n```"
            )
            .unwrap();
        }
    }

    crate::util::webhook::send_slack_project_webhook(
        body.project_id,
        &pool,
        &redis,
        webhook_url,
        Some(header),
    )
    .await
    .ok();

    let mut thread_header = format!(
        "Suspicious traces found at [version {}](https://modrinth.com/project/{}/version/{})",
        body.version_id, body.project_id, body.version_id
    );

    for (issue, trace) in &body.issues {
        for path in trace.keys() {
            write!(
                &mut thread_header,
                "\n\n- issue {issue} found at file {path}"
            )
            .unwrap();
        }

        if trace.is_empty() {
            write!(&mut thread_header, "\n\n- issue {issue} found").unwrap();
        }
    }

    let mut transaction = pool.begin().await?;
    ThreadMessageBuilder {
        author_id: Some(crate::database::models::DBUserId(AUTOMOD_ID)),
        body: MessageBody::Text {
            body: thread_header,
            private: true,
            replying_to: None,
            associated_images: vec![],
        },
        thread_id: project.thread_id,
        hide_identity: false,
    }
    .insert(&mut transaction)
    .await?;

    transaction.commit().await?;

    Ok(HttpResponse::NoContent().finish())
}

apps/labrinth/src/routes/internal/delphi.rs (new file, 423 lines)
@@ -0,0 +1,423 @@
use std::{collections::HashMap, fmt::Write, sync::LazyLock, time::Instant};

use actix_web::{HttpRequest, HttpResponse, get, post, web};
use chrono::{DateTime, Utc};
use eyre::eyre;
use reqwest::header::{HeaderMap, HeaderValue, USER_AGENT};
use serde::Deserialize;
use sqlx::PgPool;
use tokio::sync::Mutex;
use tracing::info;

use crate::{
    auth::check_is_moderator_from_headers,
    database::{
        models::{
            DBFileId, DBProjectId, DBThreadId, DelphiReportId,
            DelphiReportIssueDetailsId, DelphiReportIssueId,
            delphi_report_item::{
                DBDelphiReport, DBDelphiReportIssue, DelphiSeverity,
                DelphiStatus, ReportIssueDetail,
            },
            thread_item::ThreadMessageBuilder,
        },
        redis::RedisPool,
    },
    models::{
        ids::{ProjectId, VersionId},
        pats::Scopes,
        threads::MessageBody,
    },
    queue::session::AuthQueue,
    routes::ApiError,
    util::{error::Context, guards::admin_key_guard},
};

pub fn config(cfg: &mut web::ServiceConfig) {
    cfg.service(
        web::scope("delphi")
            .service(ingest_report)
            .service(_run)
            .service(version)
            .service(issue_type_schema),
    );
}

static DELPHI_CLIENT: LazyLock<reqwest::Client> = LazyLock::new(|| {
    reqwest::Client::builder()
        .default_headers({
            HeaderMap::from_iter([(
                USER_AGENT,
                HeaderValue::from_static(concat!(
                    "Labrinth/",
                    env!("COMPILATION_DATE")
                )),
            )])
        })
        .build()
        .unwrap()
});

#[derive(Deserialize)]
struct DelphiReportIssueDetails {
    pub file: String,
    pub key: String,
    pub data: HashMap<String, serde_json::Value>,
    pub severity: DelphiSeverity,
}

#[derive(Deserialize)]
struct DelphiReport {
    pub url: String,
    pub project_id: crate::models::ids::ProjectId,
    #[serde(rename = "version_id")]
    pub version_id: crate::models::ids::VersionId,
    pub file_id: crate::models::ids::FileId,
    /// A sequential, monotonically increasing version number for the
    /// Delphi version that generated this report.
    pub delphi_version: i32,
    pub issues: HashMap<String, Vec<DelphiReportIssueDetails>>,
    pub severity: DelphiSeverity,
    /// Map of [`DelphiReportIssueDetails::file`] to the decompiled Java source
    /// code.
    pub decompiled_sources: HashMap<String, Option<String>>,
}

impl DelphiReport {
    async fn send_to_slack(
        &self,
        pool: &PgPool,
        redis: &RedisPool,
    ) -> Result<(), ApiError> {
        let webhook_url = dotenvy::var("DELPHI_SLACK_WEBHOOK")?;

        let mut message_header =
            format!("⚠️ Suspicious traces found at {}", self.url);

        for (issue, trace) in &self.issues {
            for DelphiReportIssueDetails { file, .. } in trace {
                let decompiled_source =
                    self.decompiled_sources.get(file).and_then(|o| o.as_ref());

                write!(
                    &mut message_header,
                    "\n issue {issue} found at class `{file}`:\n```\n{}\n```",
                    decompiled_source.as_ref().map_or(
                        "No decompiled source available",
                        |decompiled_source| &**decompiled_source
                    )
                )
                .ok();
            }
        }

        crate::util::webhook::send_slack_project_webhook(
            self.project_id,
            pool,
            redis,
            webhook_url,
            Some(message_header),
        )
        .await
    }
}
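Reviewer note: `decompiled_sources` maps each class to an `Option<String>` (the earlier "accept null decompiled source payloads" commit), and `send_to_slack` renders a placeholder for missing ones via `map_or`. The same fallback in isolation:

```rust
// Missing decompiled sources render as a fixed placeholder rather than
// failing message construction.
pub fn source_snippet(source: Option<&str>) -> &str {
    source.map_or("No decompiled source available", |s| s)
}
```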

#[derive(Deserialize)]
pub struct DelphiRunParameters {
    pub file_id: crate::models::ids::FileId,
}

#[post("ingest", guard = "admin_key_guard")]
async fn ingest_report(
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    web::Json(report): web::Json<serde_json::Value>,
) -> Result<(), ApiError> {
    // Treat this as an internal error, since it's not a bad request from the
    // client's side - it's *our* fault for handling the Delphi schema wrong.
    // This could happen if Delphi updates and Labrinth doesn't.
    let report = serde_json::from_value::<DelphiReport>(report.clone())
        .wrap_internal_err_with(|| {
            eyre!(
                "Delphi sent a response which does not match our schema\n\n{}",
                serde_json::to_string_pretty(&report).unwrap()
            )
        })?;

    ingest_report_deserialized(pool, redis, report).await
}

#[tracing::instrument(
    level = "info",
    skip_all,
    fields(
        %report.url,
        %report.file_id,
        %report.project_id,
        %report.version_id,
    )
)]
async fn ingest_report_deserialized(
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    report: DelphiReport,
) -> Result<(), ApiError> {
    if report.issues.is_empty() {
        info!("No issues found for file");
        return Ok(());
    }

    report.send_to_slack(&pool, &redis).await.ok();

    let mut transaction = pool.begin().await?;

    let report_id = DBDelphiReport {
        id: DelphiReportId(0), // This will be set by the database
        file_id: Some(DBFileId(report.file_id.0 as i64)),
        delphi_version: report.delphi_version,
        artifact_url: report.url.clone(),
        created: DateTime::<Utc>::MIN_UTC, // This will be set by the database
        severity: report.severity,
    }
    .upsert(&mut transaction)
    .await?;

    info!(
        num_issues = %report.issues.len(),
        "Delphi found issues in file",
    );

    let record = sqlx::query!(
        r#"
        SELECT
            EXISTS(
                SELECT 1 FROM delphi_issue_details_with_statuses didws
                WHERE didws.project_id = $1 AND didws.status = 'pending'
            ) AS "pending_issue_details_exist!",
            t.id AS "thread_id: DBThreadId"
        FROM mods m
        INNER JOIN threads t ON t.mod_id = $1
        "#,
        DBProjectId::from(report.project_id) as _,
    )
    .fetch_one(&mut *transaction)
    .await
    .wrap_internal_err("failed to check if pending issue details exist")?;

    if record.pending_issue_details_exist {
        info!(
            "File's project already has pending issue details; not entering tech review queue"
        );
    } else {
        info!("File's project is entering tech review queue");

        ThreadMessageBuilder {
            author_id: None,
            body: MessageBody::TechReviewEntered,
            thread_id: record.thread_id,
            hide_identity: false,
        }
        .insert(&mut transaction)
        .await
        .wrap_internal_err("failed to add entering tech review message")?;
    }

    // TODO: Currently, the way we determine if an issue is in tech review or not
    // is if it has any issue details which are pending.
|
||||
// If you mark all issue details are safe or not safe - even if you don't
|
||||
// submit the final report - the project will be taken out of tech review
|
||||
// queue, and into moderation queue.
|
||||
//
|
||||
// This is undesirable, but we can't rework the database schema to fix it
|
||||
// right now. As a hack, we add a dummy report issue which blocks the
|
||||
// project from exiting the tech review queue.
|
||||
{
|
||||
let dummy_issue_id = DBDelphiReportIssue {
|
||||
id: DelphiReportIssueId(0), // This will be set by the database
|
||||
report_id,
|
||||
issue_type: "__dummy".into(),
|
||||
}
|
||||
.insert(&mut transaction)
|
||||
.await?;
|
||||
|
||||
ReportIssueDetail {
|
||||
id: DelphiReportIssueDetailsId(0), // This will be set by the database
|
||||
issue_id: dummy_issue_id,
|
||||
key: "".into(),
|
||||
file_path: "".into(),
|
||||
decompiled_source: None,
|
||||
data: HashMap::new(),
|
||||
severity: DelphiSeverity::Low,
|
||||
status: DelphiStatus::Pending,
|
||||
}
|
||||
.insert(&mut transaction)
|
||||
.await?;
|
||||
}
|
||||
|
||||
for (issue_type, issue_details) in report.issues {
|
||||
let issue_id = DBDelphiReportIssue {
|
||||
id: DelphiReportIssueId(0), // This will be set by the database
|
||||
report_id,
|
||||
issue_type,
|
||||
}
|
||||
.insert(&mut transaction)
|
||||
.await?;
|
||||
|
||||
// This is required to handle the case where the same Delphi version is re-run on the same file
|
||||
ReportIssueDetail::remove_all_by_issue_id(issue_id, &mut transaction)
|
||||
.await?;
|
||||
|
||||
for issue_detail in issue_details {
|
||||
let decompiled_source =
|
||||
report.decompiled_sources.get(&issue_detail.file);
|
||||
|
||||
ReportIssueDetail {
|
||||
id: DelphiReportIssueDetailsId(0), // This will be set by the database
|
||||
issue_id,
|
||||
key: issue_detail.key,
|
||||
file_path: issue_detail.file,
|
||||
decompiled_source: decompiled_source.cloned().flatten(),
|
||||
data: issue_detail.data,
|
||||
severity: issue_detail.severity,
|
||||
status: DelphiStatus::Pending,
|
||||
}
|
||||
.insert(&mut transaction)
|
||||
.await?;
|
||||
}
|
||||
}
|
||||
|
||||
transaction.commit().await?;
|
||||
|
||||
Ok(())
|
||||
}
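The queue-membership rule above is worth isolating: a project stays in the tech review queue while any of its issue details is still pending, and the `__dummy` sentinel detail inserted at ingest stays pending so the project cannot leave the queue before the report is submitted. A minimal std-only sketch of that rule; `DetailStatus` and `in_tech_review_queue` are hypothetical names, not part of Labrinth:

```rust
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum DetailStatus {
    Pending,
    Safe,
    Unsafe,
}

// A project sits in the tech review queue while any of its issue details
// is still pending; resolving every detail removes it from the queue.
fn in_tech_review_queue(details: &[DetailStatus]) -> bool {
    details.iter().any(|&s| s == DetailStatus::Pending)
}

fn main() {
    // The sentinel detail stays pending, so the project remains queued
    // even if a moderator resolves every real detail.
    let with_sentinel = [DetailStatus::Safe, DetailStatus::Pending];
    assert!(in_tech_review_queue(&with_sentinel));
    assert!(!in_tech_review_queue(&[
        DetailStatus::Safe,
        DetailStatus::Unsafe
    ]));
}
```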

pub async fn run(
    exec: impl sqlx::Executor<'_, Database = sqlx::Postgres>,
    run_parameters: DelphiRunParameters,
) -> Result<HttpResponse, ApiError> {
    let file_data = sqlx::query!(
        r#"
        SELECT
            version_id AS "version_id: crate::database::models::DBVersionId",
            versions.mod_id AS "project_id: crate::database::models::DBProjectId",
            files.url AS "url"
        FROM files INNER JOIN versions ON files.version_id = versions.id
        WHERE files.id = $1
        "#,
        run_parameters.file_id.0 as i64
    )
    .fetch_one(exec)
    .await?;

    tracing::debug!(
        "Running Delphi for project {}, version {}, file {}",
        file_data.project_id.0,
        file_data.version_id.0,
        run_parameters.file_id.0
    );

    DELPHI_CLIENT
        .post(dotenvy::var("DELPHI_URL")?)
        .json(&serde_json::json!({
            "url": file_data.url,
            "project_id": ProjectId(file_data.project_id.0 as u64),
            "version_id": VersionId(file_data.version_id.0 as u64),
            "file_id": run_parameters.file_id,
        }))
        .send()
        .await
        .and_then(|res| res.error_for_status())
        .map_err(ApiError::Delphi)?;

    Ok(HttpResponse::NoContent().finish())
}

#[post("run")]
async fn _run(
    req: HttpRequest,
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    session_queue: web::Data<AuthQueue>,
    run_parameters: web::Query<DelphiRunParameters>,
) -> Result<HttpResponse, ApiError> {
    check_is_moderator_from_headers(
        &req,
        &**pool,
        &redis,
        &session_queue,
        Scopes::PROJECT_READ,
    )
    .await?;

    run(&**pool, run_parameters.into_inner()).await
}

#[get("version")]
async fn version(
    req: HttpRequest,
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    session_queue: web::Data<AuthQueue>,
) -> Result<HttpResponse, ApiError> {
    check_is_moderator_from_headers(
        &req,
        &**pool,
        &redis,
        &session_queue,
        Scopes::PROJECT_READ,
    )
    .await?;

    Ok(HttpResponse::Ok().json(
        sqlx::query_scalar!("SELECT MAX(delphi_version) FROM delphi_reports")
            .fetch_one(&**pool)
            .await?,
    ))
}

#[get("issue_type/schema")]
async fn issue_type_schema(
    req: HttpRequest,
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    session_queue: web::Data<AuthQueue>,
) -> Result<HttpResponse, ApiError> {
    check_is_moderator_from_headers(
        &req,
        &**pool,
        &redis,
        &session_queue,
        Scopes::PROJECT_READ,
    )
    .await?;

    // This route is expected to be called often by the frontend, and Delphi is
    // not necessarily built to scale beyond malware analysis, so cache the
    // result of its quasi-constant-valued schema route to alleviate the load
    // on it.

    static CACHED_ISSUE_TYPE_SCHEMA: Mutex<
        Option<(serde_json::Map<String, serde_json::Value>, Instant)>,
    > = Mutex::const_new(None);

    match &mut *CACHED_ISSUE_TYPE_SCHEMA.lock().await {
        Some((schema, last_fetch)) if last_fetch.elapsed().as_secs() < 60 => {
            Ok(HttpResponse::Ok().json(schema))
        }
        cache_entry => Ok(HttpResponse::Ok().json(
            &cache_entry
                .insert((
                    DELPHI_CLIENT
                        .get(format!("{}/schema", dotenvy::var("DELPHI_URL")?))
                        .send()
                        .await
                        .and_then(|res| res.error_for_status())
                        .map_err(ApiError::Delphi)?
                        .json::<serde_json::Map<String, serde_json::Value>>()
                        .await
                        .map_err(ApiError::Delphi)?,
                    Instant::now(),
                ))
                .0,
        )),
    }
}
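The schema route's in-process cache reduces to a small, reusable pattern: keep the last value with its fetch time behind a mutex, and refetch only once it is older than the TTL. A std-only sketch of that pattern, swapping the route's async `tokio::sync::Mutex` and HTTP call for `std::sync::Mutex` and a stub fetch (`fetch_schema_from_upstream` is a hypothetical stand-in for the Delphi request):

```rust
use std::sync::Mutex;
use std::time::{Duration, Instant};

// Hypothetical stand-in for the upstream fetch; the real route calls
// DELPHI_CLIENT over HTTP and deserializes JSON.
fn fetch_schema_from_upstream() -> String {
    r#"{"issue_types":[]}"#.to_string()
}

// Last fetched value plus the time it was fetched.
static CACHE: Mutex<Option<(String, Instant)>> = Mutex::new(None);
const TTL: Duration = Duration::from_secs(60);

fn cached_schema() -> String {
    let mut guard = CACHE.lock().unwrap();
    match &*guard {
        // Fresh enough: serve from the cache without touching upstream.
        Some((schema, fetched)) if fetched.elapsed() < TTL => schema.clone(),
        // Missing or stale: refetch and repopulate the cache.
        _ => {
            let schema = fetch_schema_from_upstream();
            *guard = Some((schema.clone(), Instant::now()));
            schema
        }
    }
}

fn main() {
    let first = cached_schema(); // populates the cache
    let second = cached_schema(); // served from the cache within the TTL
    assert_eq!(first, second);
}
```

Note that, as in the route itself, a failed refetch leaves the stale entry untouched only if the error returns before the cache slot is overwritten.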

@@ -1,6 +1,7 @@
 pub(crate) mod admin;
 pub mod affiliate;
 pub mod billing;
+pub mod delphi;
 pub mod external_notifications;
 pub mod flows;
 pub mod gdpr;
@@ -31,7 +32,8 @@ pub fn config(cfg: &mut actix_web::web::ServiceConfig) {
             .configure(statuses::config)
             .configure(medal::config)
             .configure(external_notifications::config)
-            .configure(mural::config),
+            .configure(mural::config)
+            .configure(delphi::config),
     );
 }

@@ -1,8 +1,7 @@
 use super::ApiError;
 use crate::database;
-use crate::database::models::{DBOrganization, DBTeamId, DBTeamMember, DBUser};
 use crate::database::redis::RedisPool;
-use crate::models::ids::{OrganizationId, TeamId};
+use crate::models::ids::OrganizationId;
 use crate::models::projects::{Project, ProjectStatus};
 use crate::queue::moderation::{ApprovalType, IdentifiedFile, MissingMetadata};
 use crate::queue::session::AuthQueue;
@@ -10,15 +9,22 @@ use crate::util::error::Context;
 use crate::{auth::check_is_moderator_from_headers, models::pats::Scopes};
 use actix_web::{HttpRequest, get, post, web};
 use ariadne::ids::{UserId, random_base62};
 use eyre::eyre;
+use ownership::get_projects_ownership;
 use serde::{Deserialize, Serialize};
 use sqlx::PgPool;
 use std::collections::HashMap;
 
+mod ownership;
+mod tech_review;
+
 pub fn config(cfg: &mut utoipa_actix_web::service_config::ServiceConfig) {
     cfg.service(get_projects)
         .service(get_project_meta)
-        .service(set_project_meta);
+        .service(set_project_meta)
+        .service(
+            utoipa_actix_web::scope("/tech-review")
+                .configure(tech_review::config),
+        );
 }
@@ -47,7 +53,7 @@ pub struct FetchedProject {
 }
 
 /// Fetched information on who owns a project.
-#[derive(Debug, Serialize, Deserialize, utoipa::ToSchema)]
+#[derive(Debug, Serialize, Deserialize, utoipa::ToSchema, Clone)]
 #[serde(tag = "kind", rename_all = "snake_case")]
 pub enum Ownership {
     /// Project is owned by a team, and this is the team owner.
@@ -105,8 +111,24 @@ pub async fn get_projects_internal(
 
     let project_ids = sqlx::query!(
         "
-        SELECT id FROM mods
-        WHERE status = $1
+        SELECT id
+        FROM (
+            SELECT DISTINCT ON (m.id)
+                m.id,
+                m.queued
+            FROM mods m
+
+            -- exclude projects in tech review queue
+            LEFT JOIN delphi_issue_details_with_statuses didws
+                ON didws.project_id = m.id AND didws.status = 'pending'
+
+            WHERE
+                m.status = $1
+                AND didws.status IS NULL
+
+            GROUP BY m.id
+        ) t
+
        ORDER BY queued ASC
        OFFSET $3
        LIMIT $2
@@ -129,73 +151,20 @@ pub async fn get_projects_internal(
         .map(crate::models::projects::Project::from)
         .collect::<Vec<_>>();
 
-    let team_ids = projects
-        .iter()
-        .map(|project| project.team_id)
-        .map(DBTeamId::from)
-        .collect::<Vec<_>>();
-    let org_ids = projects
-        .iter()
-        .filter_map(|project| project.organization)
-        .collect::<Vec<_>>();
-
-    let team_members =
-        DBTeamMember::get_from_team_full_many(&team_ids, &**pool, &redis)
-            .await
-            .wrap_internal_err("failed to fetch team members")?;
-    let users = DBUser::get_many_ids(
-        &team_members
-            .iter()
-            .map(|member| member.user_id)
-            .collect::<Vec<_>>(),
-        &**pool,
-        &redis,
-    )
-    .await
-    .wrap_internal_err("failed to fetch user data of team members")?;
-    let orgs = DBOrganization::get_many(&org_ids, &**pool, &redis)
+    let ownerships = get_projects_ownership(&projects, &pool, &redis)
         .await
-        .wrap_internal_err("failed to fetch organizations")?;
+        .wrap_internal_err("failed to fetch project ownerships")?;
 
-    let map_project = |project: Project| -> Result<FetchedProject, ApiError> {
-        let project_id = project.id;
-        let ownership = if let Some(org_id) = project.organization {
-            let org = orgs
-                .iter()
-                .find(|org| OrganizationId::from(org.id) == org_id)
-                .wrap_internal_err_with(|| {
-                    eyre!(
-                        "project {project_id} is owned by an invalid organization {org_id}"
-                    )
-                })?;
-
-            Ownership::Organization {
-                id: OrganizationId::from(org.id),
-                name: org.name.clone(),
-                icon_url: org.icon_url.clone(),
-            }
-        } else {
-            let team_id = project.team_id;
-            let team_owner = team_members.iter().find(|member| TeamId::from(member.team_id) == team_id && member.is_owner)
-                .wrap_internal_err_with(|| eyre!("project {project_id} is owned by a team {team_id} which has no valid owner"))?;
-            let team_owner_id = team_owner.user_id;
-            let user = users.iter().find(|user| user.id == team_owner_id)
-                .wrap_internal_err_with(|| eyre!("project {project_id} is owned by a team {team_id} which has owner {} which does not exist", UserId::from(team_owner_id)))?;
-
-            Ownership::User {
-                id: UserId::from(user.id),
-                name: user.username.clone(),
-                icon_url: user.avatar_url.clone(),
-            }
-        };
-
-        Ok(FetchedProject { ownership, project })
-    };
+    let map_project =
+        |(project, ownership): (Project, Ownership)| -> FetchedProject {
+            FetchedProject { ownership, project }
+        };
 
     let projects = projects
         .into_iter()
+        .zip(ownerships)
         .map(map_project)
-        .collect::<Result<Vec<_>, _>>()?;
+        .collect::<Vec<_>>();
 
     Ok(web::Json(projects))
 }
84
apps/labrinth/src/routes/internal/moderation/ownership.rs
Normal file
@@ -0,0 +1,84 @@
use crate::database::models::{DBOrganization, DBTeamId, DBTeamMember, DBUser};
use crate::database::redis::RedisPool;
use crate::models::ids::OrganizationId;
use crate::routes::internal::moderation::Ownership;
use crate::util::error::Context;
use ariadne::ids::UserId;
use eyre::eyre;
use sqlx::PgPool;

/// Fetches ownership information for multiple projects efficiently.
pub async fn get_projects_ownership(
    projects: &[crate::models::projects::Project],
    pool: &PgPool,
    redis: &RedisPool,
) -> Result<Vec<Ownership>, crate::routes::ApiError> {
    let team_ids = projects
        .iter()
        .map(|project| project.team_id)
        .map(DBTeamId::from)
        .collect::<Vec<_>>();
    let org_ids = projects
        .iter()
        .filter_map(|project| project.organization)
        .collect::<Vec<_>>();

    let team_members =
        DBTeamMember::get_from_team_full_many(&team_ids, pool, redis)
            .await
            .wrap_internal_err("failed to fetch team members")?;
    let users = DBUser::get_many_ids(
        &team_members
            .iter()
            .map(|member| member.user_id)
            .collect::<Vec<_>>(),
        pool,
        redis,
    )
    .await
    .wrap_internal_err("failed to fetch user data of team members")?;
    let orgs = DBOrganization::get_many(&org_ids, pool, redis)
        .await
        .wrap_internal_err("failed to fetch organizations")?;

    let mut ownerships = Vec::with_capacity(projects.len());

    for project in projects {
        let project_id = project.id;
        let ownership = if let Some(org_id) = project.organization {
            let org = orgs
                .iter()
                .find(|org| OrganizationId::from(org.id) == org_id)
                .wrap_internal_err_with(|| {
                    eyre!(
                        "project {project_id} is owned by an invalid organization {org_id}"
                    )
                })?;

            Ownership::Organization {
                id: OrganizationId::from(org.id),
                name: org.name.clone(),
                icon_url: org.icon_url.clone(),
            }
        } else {
            let team_id = project.team_id;
            let team_owner = team_members
                .iter()
                .find(|member| {
                    crate::models::ids::TeamId::from(member.team_id) == team_id
                        && member.is_owner
                })
                .wrap_internal_err_with(|| eyre!("project {project_id} is owned by a team {team_id} which has no valid owner"))?;
            let team_owner_id = team_owner.user_id;
            let user = users
                .iter()
                .find(|user| user.id == team_owner_id)
                .wrap_internal_err_with(|| eyre!("project {project_id} is owned by a team {team_id} which has owner {} which does not exist", UserId::from(team_owner_id)))?;

            Ownership::User {
                id: ariadne::ids::UserId::from(user.id),
                name: user.username.clone(),
                icon_url: user.avatar_url.clone(),
            }
        };

        ownerships.push(ownership);
    }

    Ok(ownerships)
}
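`get_projects_ownership` follows a batch-then-resolve shape: gather every referenced ID for the page of projects, issue one bulk lookup instead of one query per project, then resolve each project against the in-memory results. A minimal std-only sketch of that shape, with hypothetical `Org`/`Project` types and a `get_many_orgs` stub standing in for the database models:

```rust
use std::collections::HashMap;

#[derive(Clone, Debug, PartialEq)]
struct Org {
    id: u64,
    name: String,
}

struct Project {
    id: u64,
    organization: Option<u64>,
}

// Batch-fetch stand-in: the real code calls DBOrganization::get_many once.
fn get_many_orgs(ids: &[u64]) -> Vec<Org> {
    ids.iter()
        .map(|&id| Org { id, name: format!("org-{id}") })
        .collect()
}

fn resolve_ownership(projects: &[Project]) -> Vec<Option<Org>> {
    // 1) Collect every org ID referenced by the page of projects.
    let org_ids: Vec<u64> =
        projects.iter().filter_map(|p| p.organization).collect();
    // 2) One batched query instead of N per-project queries.
    let orgs: HashMap<u64, Org> = get_many_orgs(&org_ids)
        .into_iter()
        .map(|o| (o.id, o))
        .collect();
    // 3) Resolve each project against the in-memory map.
    projects
        .iter()
        .map(|p| p.organization.and_then(|id| orgs.get(&id).cloned()))
        .collect()
}

fn main() {
    let projects = [
        Project { id: 1, organization: Some(10) },
        Project { id: 2, organization: None },
    ];
    let resolved = resolve_ownership(&projects);
    assert_eq!(
        resolved[0].as_ref().map(|o| o.name.as_str()),
        Some("org-10")
    );
    assert!(resolved[1].is_none());
}
```

The trade-off is the usual one: two or three bulk round-trips with an in-memory join, rather than a round-trip per row.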
894
apps/labrinth/src/routes/internal/moderation/tech_review.rs
Normal file
@@ -0,0 +1,894 @@
use std::{collections::HashMap, fmt};

use actix_web::{HttpRequest, get, patch, post, put, web};
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use sqlx::PgPool;
use tokio_stream::StreamExt;

use super::ownership::get_projects_ownership;
use crate::{
    auth::check_is_moderator_from_headers,
    database::{
        DBProject,
        models::{
            DBFileId, DBProjectId, DBThread, DBThreadId, DBUser,
            DelphiReportId, DelphiReportIssueDetailsId, DelphiReportIssueId,
            ProjectTypeId,
            delphi_report_item::{
                DBDelphiReport, DelphiSeverity, DelphiStatus, DelphiVerdict,
                ReportIssueDetail,
            },
            thread_item::ThreadMessageBuilder,
        },
        redis::RedisPool,
    },
    models::{
        ids::{FileId, ProjectId, ThreadId, VersionId},
        pats::Scopes,
        projects::{Project, ProjectStatus},
        threads::{MessageBody, Thread},
    },
    queue::session::AuthQueue,
    routes::{ApiError, internal::moderation::Ownership},
    util::error::Context,
};
use eyre::eyre;

pub fn config(cfg: &mut utoipa_actix_web::service_config::ServiceConfig) {
    cfg.service(search_projects)
        .service(get_report)
        .service(get_issue)
        .service(submit_report)
        .service(update_issue_detail)
        .service(add_report);
}

/// Arguments for searching project technical reviews.
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
pub struct SearchProjects {
    #[serde(default = "default_limit")]
    #[schema(default = 20)]
    pub limit: u64,
    #[serde(default)]
    #[schema(default = 0)]
    pub page: u64,
    #[serde(default)]
    pub filter: SearchProjectsFilter,
    #[serde(default = "default_sort_by")]
    pub sort_by: SearchProjectsSort,
}

fn default_limit() -> u64 {
    20
}

fn default_sort_by() -> SearchProjectsSort {
    SearchProjectsSort::CreatedAsc
}

#[derive(Debug, Clone, Default, Serialize, Deserialize, utoipa::ToSchema)]
pub struct SearchProjectsFilter {
    pub project_type: Vec<ProjectTypeId>,
}

#[derive(
    Debug,
    Clone,
    Copy,
    PartialEq,
    Eq,
    Hash,
    Serialize,
    Deserialize,
    utoipa::ToSchema,
)]
#[serde(rename_all = "snake_case")]
pub enum SearchProjectsSort {
    CreatedAsc,
    CreatedDesc,
    SeverityAsc,
    SeverityDesc,
}

impl fmt::Display for SearchProjectsSort {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        let s = serde_json::to_value(*self).unwrap();
        let s = s.as_str().unwrap();
        write!(f, "{s}")
    }
}

#[derive(Debug, Serialize, Deserialize, utoipa::ToSchema)]
pub struct FileReport {
    /// ID of this report.
    pub report_id: DelphiReportId,
    /// ID of the file that was scanned.
    pub file_id: FileId,
    /// When the report for this file was created.
    pub created: DateTime<Utc>,
    /// Why this project was flagged.
    pub flag_reason: FlagReason,
    /// According to this report, how likely is the project to be malicious?
    pub severity: DelphiSeverity,
    /// Name of the flagged file.
    pub file_name: String,
    /// Size of the flagged file, in bytes.
    pub file_size: i32,
    /// URL to download the flagged file.
    pub download_url: String,
    /// What issues appeared in the file.
    #[serde(default)]
    pub issues: Vec<FileIssue>,
}

/// Issue raised by Delphi in a flagged file.
///
/// The issue is scoped to the JAR, not any specific class, but issues can be
/// raised because they appeared in a class - see [`FileIssueDetails`].
#[derive(Debug, Serialize, Deserialize, utoipa::ToSchema)]
pub struct FileIssue {
    /// ID of the issue.
    pub id: DelphiReportIssueId,
    /// ID of the report this issue is a part of.
    pub report_id: DelphiReportId,
    /// Delphi-determined kind of issue that this is, e.g. `OBFUSCATED_NAMES`.
    ///
    /// Labrinth does not know the full set of kinds of issues, so this is kept
    /// as a string.
    pub issue_type: String,
    /// Details of why this issue might have been raised, such as what file it
    /// was found in.
    #[serde(default)]
    pub details: Vec<ReportIssueDetail>,
}

/// Why a project was flagged for technical review.
#[derive(
    Debug,
    Clone,
    Copy,
    PartialEq,
    Eq,
    Hash,
    Serialize,
    Deserialize,
    utoipa::ToSchema,
)]
#[serde(rename_all = "snake_case")]
pub enum FlagReason {
    /// Delphi anti-malware scanner flagged a file in the project.
    Delphi,
}

/// Get info on an issue in a Delphi report.
#[utoipa::path(
    security(("bearer_auth" = [])),
    responses((status = OK, body = inline(FileIssue)))
)]
#[get("/issue/{issue_id}")]
async fn get_issue(
    req: HttpRequest,
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    session_queue: web::Data<AuthQueue>,
    path: web::Path<(DelphiReportIssueId,)>,
) -> Result<web::Json<FileIssue>, ApiError> {
    check_is_moderator_from_headers(
        &req,
        &**pool,
        &redis,
        &session_queue,
        Scopes::PROJECT_READ,
    )
    .await?;

    let (issue_id,) = path.into_inner();
    let row = sqlx::query!(
        r#"
        SELECT
            to_jsonb(dri)
            || jsonb_build_object(
                -- TODO: replace with `json_array` in Postgres 16
                'details', (
                    SELECT json_agg(to_jsonb(drid))
                    FROM delphi_report_issue_details drid
                    WHERE drid.issue_id = dri.id
                )
            ) AS "data!: sqlx::types::Json<FileIssue>"
        FROM delphi_report_issues dri
        WHERE dri.id = $1
        "#,
        issue_id as DelphiReportIssueId,
    )
    .fetch_optional(&**pool)
    .await
    .wrap_internal_err("failed to fetch issue from database")?
    .ok_or(ApiError::NotFound)?;

    Ok(web::Json(row.data.0))
}

/// Get info on a specific report for a project.
#[utoipa::path(
    security(("bearer_auth" = [])),
    responses((status = OK, body = inline(FileReport)))
)]
#[get("/report/{id}")]
async fn get_report(
    req: HttpRequest,
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    session_queue: web::Data<AuthQueue>,
    path: web::Path<(DelphiReportId,)>,
) -> Result<web::Json<FileReport>, ApiError> {
    check_is_moderator_from_headers(
        &req,
        &**pool,
        &redis,
        &session_queue,
        Scopes::PROJECT_READ,
    )
    .await?;

    let (report_id,) = path.into_inner();

    let row = sqlx::query!(
        r#"
        SELECT DISTINCT ON (dr.id)
            to_jsonb(dr)
            || jsonb_build_object(
                'file_id', to_base62(f.id),
                'version_id', to_base62(v.id),
                'project_id', to_base62(v.mod_id),
                'file_name', f.filename,
                'file_size', f.size,
                'flag_reason', 'delphi',
                'download_url', f.url,
                -- TODO: replace with `json_array` in Postgres 16
                'issues', (
                    SELECT json_agg(
                        to_jsonb(dri)
                        || jsonb_build_object(
                            -- TODO: replace with `json_array` in Postgres 16
                            'details', (
                                SELECT json_agg(to_jsonb(drid))
                                FROM delphi_report_issue_details drid
                                WHERE drid.issue_id = dri.id
                            )
                        )
                    )
                    FROM delphi_report_issues dri
                    WHERE
                        dri.report_id = dr.id
                        -- see delphi.rs todo comment
                        AND dri.issue_type != '__dummy'
                )
            ) AS "data!: sqlx::types::Json<FileReport>"
        FROM delphi_reports dr
        INNER JOIN files f ON f.id = dr.file_id
        INNER JOIN versions v ON v.id = f.version_id
        WHERE dr.id = $1
        "#,
        report_id as DelphiReportId,
    )
    .fetch_optional(&**pool)
    .await
    .wrap_internal_err("failed to fetch report from database")?
    .ok_or(ApiError::NotFound)?;

    Ok(web::Json(row.data.0))
}

/// See [`search_projects`].
#[derive(Debug, Serialize, Deserialize, utoipa::ToSchema)]
pub struct SearchResponse {
    /// List of reported projects returned, and their report data.
    pub project_reports: Vec<ProjectReport>,
    /// Fetched project information for projects in the returned reports.
    pub projects: HashMap<ProjectId, ProjectModerationInfo>,
    /// Fetched moderation threads for projects in the returned reports.
    pub threads: HashMap<ThreadId, Thread>,
    /// Fetched owner information for projects.
    pub ownership: HashMap<ProjectId, Ownership>,
}

/// Single project's reports from a search response.
#[derive(Debug, Serialize, Deserialize, utoipa::ToSchema)]
pub struct ProjectReport {
    /// ID of the project this report is for.
    pub project_id: ProjectId,
    /// Highest severity of any report of any file of any version under this
    /// project.
    pub max_severity: Option<DelphiSeverity>,
    /// Reports for this project's versions.
    #[serde(default)]
    pub versions: Vec<VersionReport>,
}

/// Single project version's reports from a search response.
#[derive(Debug, Serialize, Deserialize, utoipa::ToSchema)]
pub struct VersionReport {
    /// ID of the project version this report is for.
    pub version_id: VersionId,
    /// Reports for this version's files.
    #[serde(default)]
    pub files: Vec<FileReport>,
}

/// Limited set of project information returned by [`search_projects`].
#[derive(Debug, Serialize, Deserialize, utoipa::ToSchema)]
pub struct ProjectModerationInfo {
    /// Project ID.
    pub id: ProjectId,
    /// Project moderation thread ID.
    pub thread_id: ThreadId,
    /// Project name.
    pub name: String,
    /// The aggregated project types of the versions of this project.
    #[serde(default)]
    pub project_types: Vec<String>,
    /// The URL of the project's icon.
    pub icon_url: Option<String>,
}

/// Searches all projects which are awaiting technical review.
#[utoipa::path(
    security(("bearer_auth" = [])),
    responses((status = OK, body = inline(Vec<SearchResponse>)))
)]
#[post("/search")]
async fn search_projects(
    req: HttpRequest,
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    session_queue: web::Data<AuthQueue>,
    search_req: web::Json<SearchProjects>,
) -> Result<web::Json<SearchResponse>, ApiError> {
    let user = check_is_moderator_from_headers(
        &req,
        &**pool,
        &redis,
        &session_queue,
        Scopes::PROJECT_READ,
    )
    .await?;

    let sort_by = search_req.sort_by.to_string();
    // Cap the page size so a single request cannot fetch unbounded data.
    let limit = search_req.limit.min(50);
    let offset = limit.saturating_mul(search_req.page);

    let limit =
        i64::try_from(limit).wrap_request_err("limit cannot fit into `i64`")?;
    let offset = i64::try_from(offset)
        .wrap_request_err("offset cannot fit into `i64`")?;

    let mut project_reports = Vec::<ProjectReport>::new();
    let mut project_ids = Vec::<DBProjectId>::new();
    let mut thread_ids = Vec::<DBThreadId>::new();

    let mut rows = sqlx::query!(
        r#"
        SELECT
            project_id AS "project_id: DBProjectId",
            project_thread_id AS "project_thread_id: DBThreadId",
            report AS "report!: sqlx::types::Json<ProjectReport>"
        FROM (
            SELECT DISTINCT ON (m.id)
                m.id AS project_id,
                t.id AS project_thread_id,
                MAX(dr.severity) AS severity,
                MIN(dr.created) AS earliest_report_created,
                MAX(dr.created) AS latest_report_created,

                jsonb_build_object(
                    'project_id', to_base62(m.id),
                    'max_severity', MAX(dr.severity),
                    -- TODO: replace with `json_array` in Postgres 16
                    'versions', (
                        SELECT coalesce(jsonb_agg(jsonb_build_object(
                            'version_id', to_base62(v.id),
                            -- TODO: replace with `json_array` in Postgres 16
                            'files', (
                                SELECT coalesce(jsonb_agg(jsonb_build_object(
                                    'report_id', dr.id,
                                    'file_id', to_base62(f.id),
                                    'created', dr.created,
                                    'flag_reason', 'delphi',
                                    'severity', dr.severity,
                                    'file_name', f.filename,
                                    'file_size', f.size,
                                    'download_url', f.url,
                                    -- TODO: replace with `json_array` in Postgres 16
                                    'issues', (
                                        SELECT coalesce(jsonb_agg(
                                            to_jsonb(dri)
                                            || jsonb_build_object(
                                                -- TODO: replace with `json_array` in Postgres 16
                                                'details', (
                                                    SELECT coalesce(jsonb_agg(
                                                        jsonb_build_object(
                                                            'id', didws.id,
                                                            'issue_id', didws.issue_id,
                                                            'key', didws.key,
                                                            'file_path', didws.file_path,
                                                            -- ignore `decompiled_source`
                                                            'data', didws.data,
                                                            'severity', didws.severity,
                                                            'status', didws.status
                                                        )
                                                    ), '[]'::jsonb)
                                                    FROM delphi_issue_details_with_statuses didws
                                                    WHERE didws.issue_id = dri.id
                                                )
                                            )
                                        ), '[]'::jsonb)
                                        FROM delphi_report_issues dri
                                        WHERE
                                            dri.report_id = dr.id
                                            -- see delphi.rs todo comment
                                            AND dri.issue_type != '__dummy'
                                    )
                                )), '[]'::jsonb)
                                FROM delphi_reports dr
                                WHERE dr.file_id = f.id
                            )
                        )), '[]'::jsonb)
                        FROM versions v
                        INNER JOIN files f ON f.version_id = v.id
                        WHERE v.mod_id = m.id
                    )
                ) AS report
            FROM mods m
            INNER JOIN threads t ON t.mod_id = m.id
            INNER JOIN versions v ON v.mod_id = m.id
            INNER JOIN files f ON f.version_id = v.id

            -- only return projects with at least 1 pending issue detail
            INNER JOIN delphi_reports dr ON dr.file_id = f.id
            INNER JOIN delphi_issue_details_with_statuses didws
                ON didws.project_id = m.id AND didws.status = 'pending'

            -- filtering
            LEFT JOIN mods_categories mc ON mc.joining_mod_id = m.id
            LEFT JOIN categories c ON c.id = mc.joining_category_id
            WHERE
                -- project type
                (cardinality($4::int[]) = 0 OR c.project_type = ANY($4::int[]))
                AND m.status NOT IN ('draft', 'rejected', 'withheld')

            GROUP BY m.id, t.id
        ) t

        -- sorting
        ORDER BY
            CASE WHEN $3 = 'created_asc' THEN t.earliest_report_created ELSE TO_TIMESTAMP(0) END ASC,
            CASE WHEN $3 = 'created_desc' THEN t.latest_report_created ELSE TO_TIMESTAMP(0) END DESC,
            CASE WHEN $3 = 'severity_asc' THEN t.severity ELSE 'low'::delphi_severity END ASC,
            CASE WHEN $3 = 'severity_desc' THEN t.severity ELSE 'low'::delphi_severity END DESC

        -- pagination
        LIMIT $1
        OFFSET $2
        "#,
        limit,
        offset,
        &sort_by,
        &search_req
            .filter
            .project_type
            .iter()
            .map(|ty| ty.0)
            .collect::<Vec<_>>(),
    )
    .fetch(&**pool);

    while let Some(row) = rows
        .next()
        .await
        .transpose()
        .wrap_internal_err("failed to fetch reports")?
    {
        project_reports.push(row.report.0);
        project_ids.push(row.project_id);
        thread_ids.push(row.project_thread_id);
    }

    let projects = DBProject::get_many_ids(&project_ids, &**pool, &redis)
        .await
        .wrap_internal_err("failed to fetch projects")?
        .into_iter()
        .map(|project| {
            (ProjectId::from(project.inner.id), Project::from(project))
        })
        .collect::<HashMap<_, _>>();
    let db_threads = DBThread::get_many(&thread_ids, &**pool)
        .await
        .wrap_internal_err("failed to fetch threads")?;
    let thread_author_ids = db_threads
        .iter()
        .flat_map(|thread| {
            thread
                .messages
                .iter()
                .filter_map(|message| message.author_id)
        })
        .collect::<Vec<_>>();
    let thread_authors =
        DBUser::get_many_ids(&thread_author_ids, &**pool, &redis)
|
||||
.await
|
||||
.wrap_internal_err("failed to fetch thread authors")?
|
||||
.into_iter()
|
||||
.map(From::from)
|
||||
.collect::<Vec<_>>();
|
||||
let threads = db_threads
|
||||
.into_iter()
|
||||
.map(|thread| {
|
||||
let thread = Thread::from(thread, thread_authors.clone(), &user);
|
||||
(thread.id, thread)
|
||||
})
|
||||
.collect::<HashMap<_, _>>();
|
||||
|
||||
let project_list: Vec<Project> = projects.values().cloned().collect();
|
||||
|
||||
let ownerships = get_projects_ownership(&project_list, &pool, &redis)
|
||||
.await
|
||||
.wrap_internal_err("failed to fetch project ownerships")?;
|
||||
let ownership = projects
|
||||
.keys()
|
||||
.copied()
|
||||
.zip(ownerships)
|
||||
.collect::<HashMap<_, _>>();
|
||||
|
||||
Ok(web::Json(SearchResponse {
|
||||
project_reports,
|
||||
projects: projects
|
||||
.into_iter()
|
||||
.map(|(id, project)| {
|
||||
(
|
||||
id,
|
||||
ProjectModerationInfo {
|
||||
id,
|
||||
thread_id: project.thread_id,
|
||||
name: project.name,
|
||||
project_types: project.project_types,
|
||||
icon_url: project.icon_url,
|
||||
},
|
||||
)
|
||||
})
|
||||
.collect(),
|
||||
threads,
|
||||
ownership,
|
||||
}))
|
||||
}
|
||||
|
||||
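The search query above multiplexes four sort orders through a single prepared statement: each `CASE WHEN $3 = … ELSE <constant> END` collapses to a neutral constant when its key is not selected, so only the chosen branch actually influences the ordering. A minimal in-memory sketch of the same idea, using hypothetical `Row`/`Severity` types rather than the real labrinth models:

```rust
// Hypothetical stand-ins for the queried columns.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum Severity {
    Low,
    Medium,
    High,
}

#[derive(Debug, Clone)]
struct Row {
    created: i64,
    severity: Severity,
}

// Mirrors the ORDER BY CASE trick: one entry point, four sort modes,
// unknown keys leave the (stable) order untouched.
fn sort_rows(rows: &mut [Row], sort_by: &str) {
    rows.sort_by(|a, b| match sort_by {
        "created_asc" => a.created.cmp(&b.created),
        "created_desc" => b.created.cmp(&a.created),
        "severity_asc" => a.severity.cmp(&b.severity),
        "severity_desc" => b.severity.cmp(&a.severity),
        _ => std::cmp::Ordering::Equal,
    });
}
```

In SQL the "unselected branch" must still type-check, which is why the query pins the `ELSE` arms to `TO_TIMESTAMP(0)` and `'low'::delphi_severity` instead of `NULL`.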
/// See [`submit_report`].
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
pub struct SubmitReport {
    /// Does the moderator think this report shows that the project is safe or
    /// unsafe?
    pub verdict: DelphiVerdict,
    /// Moderator message to send to the thread when rejecting the project.
    pub message: Option<String>,
}

/// Submits a verdict for a project based on its technical reports.
///
/// Before this is called, all issue details for this project's reports must
/// have been marked as either safe or unsafe. Otherwise, this will error with
/// [`ApiError::TechReviewDetailsWithNoVerdict`], providing the issue detail
/// IDs which are still unmarked.
#[utoipa::path(
    security(("bearer_auth" = [])),
    responses((status = NO_CONTENT))
)]
#[post("/submit/{project_id}")]
async fn submit_report(
    req: HttpRequest,
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    session_queue: web::Data<AuthQueue>,
    web::Json(submit_report): web::Json<SubmitReport>,
    path: web::Path<(ProjectId,)>,
) -> Result<(), ApiError> {
    let user = check_is_moderator_from_headers(
        &req,
        &**pool,
        &redis,
        &session_queue,
        Scopes::PROJECT_WRITE,
    )
    .await?;
    let (project_id,) = path.into_inner();
    let project_id = DBProjectId::from(project_id);

    let mut txn = pool
        .begin()
        .await
        .wrap_internal_err("failed to begin transaction")?;

    let pending_issue_details = sqlx::query!(
        r#"
        SELECT
            didws.id AS "issue_detail_id!"
        FROM mods m
        INNER JOIN versions v ON v.mod_id = m.id
        INNER JOIN files f ON f.version_id = v.id
        INNER JOIN delphi_reports dr ON dr.file_id = f.id
        INNER JOIN delphi_report_issues dri ON dri.report_id = dr.id
        INNER JOIN delphi_issue_details_with_statuses didws ON didws.issue_id = dri.id
        WHERE
            m.id = $1
            AND didws.status = 'pending'
            -- see delphi.rs todo comment
            AND dri.issue_type != '__dummy'
        "#,
        project_id as _,
    )
    .fetch_all(&mut *txn)
    .await
    .wrap_internal_err("failed to fetch pending issues")?;

    if !pending_issue_details.is_empty() {
        return Err(ApiError::TechReviewDetailsWithNoVerdict {
            details: pending_issue_details
                .into_iter()
                .map(|record| {
                    DelphiReportIssueDetailsId(record.issue_detail_id)
                })
                .collect(),
        });
    }

    sqlx::query!(
        "
        DELETE FROM delphi_report_issue_details drid
        WHERE issue_id IN (
            SELECT dri.id
            FROM mods m
            INNER JOIN versions v ON v.mod_id = m.id
            INNER JOIN files f ON f.version_id = v.id
            INNER JOIN delphi_reports dr ON dr.file_id = f.id
            INNER JOIN delphi_report_issues dri ON dri.report_id = dr.id
            WHERE m.id = $1 AND dri.issue_type = '__dummy'
        )
        ",
        project_id as _,
    )
    .execute(&mut *txn)
    .await
    .wrap_internal_err("failed to delete dummy issue")?;

    let record = sqlx::query!(
        r#"
        SELECT t.id AS "thread_id: DBThreadId"
        FROM mods m
        INNER JOIN threads t ON t.mod_id = m.id
        WHERE m.id = $1
        "#,
        project_id as _,
    )
    .fetch_one(&mut *txn)
    .await
    .wrap_internal_err("failed to fetch project thread")?;

    if let Some(body) = submit_report.message {
        ThreadMessageBuilder {
            author_id: Some(user.id.into()),
            body: MessageBody::Text {
                body,
                private: true,
                replying_to: None,
                associated_images: Vec::new(),
            },
            thread_id: record.thread_id,
            hide_identity: user.role.is_mod(),
        }
        .insert(&mut txn)
        .await
        .wrap_internal_err("failed to add moderator message")?;
    }

    let verdict = submit_report.verdict;
    ThreadMessageBuilder {
        author_id: Some(user.id.into()),
        body: MessageBody::TechReview { verdict },
        thread_id: record.thread_id,
        hide_identity: user.role.is_mod(),
    }
    .insert(&mut txn)
    .await
    .wrap_internal_err("failed to add tech review message")?;

    if verdict == DelphiVerdict::Unsafe {
        let record = sqlx::query!(
            r#"
            UPDATE mods
            SET status = $1
            FROM mods m
            INNER JOIN threads t ON t.mod_id = m.id
            WHERE mods.id = m.id AND m.id = $2
            RETURNING
                t.id AS "thread_id: DBThreadId",
                (SELECT status FROM mods WHERE id = m.id) AS "old_status!"
            "#,
            ProjectStatus::Rejected.as_str(),
            project_id as _,
        )
        .fetch_one(&mut *txn)
        .await
        .wrap_internal_err("failed to mark project as rejected")?;

        ThreadMessageBuilder {
            author_id: Some(user.id.into()),
            body: MessageBody::StatusChange {
                new_status: ProjectStatus::Rejected,
                old_status: ProjectStatus::from_string(&record.old_status),
            },
            thread_id: record.thread_id,
            hide_identity: user.role.is_mod(),
        }
        .insert(&mut txn)
        .await
        .wrap_internal_err("failed to add status change message")?;

        DBProject::clear_cache(project_id, None, None, &redis)
            .await
            .wrap_internal_err("failed to clear project cache")?;
    }

    txn.commit()
        .await
        .wrap_internal_err("failed to commit transaction")?;

    Ok(())
}

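`submit_report` above refuses to accept a verdict while any issue detail is still `pending`, returning the offending detail IDs in the error body. A stripped-down sketch of that gate, with a hypothetical `SubmitError` standing in for `ApiError::TechReviewDetailsWithNoVerdict`:

```rust
// Simplified stand-in for ApiError::TechReviewDetailsWithNoVerdict.
#[derive(Debug, PartialEq)]
enum SubmitError {
    DetailsWithNoVerdict(Vec<u64>),
}

// Submission is only allowed once every detail has a safe/unsafe verdict;
// any detail still pending aborts the submit with the unreviewed IDs,
// so the frontend can link straight to what remains to be triaged.
fn check_all_reviewed(pending_detail_ids: Vec<u64>) -> Result<(), SubmitError> {
    if pending_detail_ids.is_empty() {
        Ok(())
    } else {
        Err(SubmitError::DetailsWithNoVerdict(pending_detail_ids))
    }
}
```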
/// See [`update_issue`].
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
pub struct UpdateIssue {
    /// What the moderator has decided the outcome of this issue is.
    pub verdict: DelphiVerdict,
}

/// Updates the state of a technical review issue detail.
///
/// This will not automatically reject the project for malware, but just flag
/// this issue with a verdict.
#[utoipa::path(
    security(("bearer_auth" = [])),
    responses((status = NO_CONTENT))
)]
#[patch("/issue-detail/{id}")]
async fn update_issue_detail(
    req: HttpRequest,
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    session_queue: web::Data<AuthQueue>,
    update_req: web::Json<UpdateIssue>,
    path: web::Path<(DelphiReportIssueDetailsId,)>,
) -> Result<(), ApiError> {
    check_is_moderator_from_headers(
        &req,
        &**pool,
        &redis,
        &session_queue,
        Scopes::PROJECT_WRITE,
    )
    .await?;
    let (issue_detail_id,) = path.into_inner();

    let mut txn = pool
        .begin()
        .await
        .wrap_internal_err("failed to start transaction")?;

    let status = match update_req.verdict {
        DelphiVerdict::Safe => DelphiStatus::Safe,
        DelphiVerdict::Unsafe => DelphiStatus::Unsafe,
    };
    let results = sqlx::query!(
        r#"
        INSERT INTO delphi_issue_detail_verdicts (
            project_id,
            detail_key,
            verdict
        )
        SELECT
            didws.project_id,
            didws.key,
            $1
        FROM delphi_issue_details_with_statuses didws
        INNER JOIN delphi_report_issues dri ON dri.id = didws.issue_id
        WHERE
            didws.id = $2
            -- see delphi.rs todo comment
            AND dri.issue_type != '__dummy'
        "#,
        status as _,
        issue_detail_id as _,
    )
    .execute(&mut *txn)
    .await
    .wrap_internal_err("failed to update issue detail")?;
    if results.rows_affected() == 0 {
        return Err(ApiError::Request(eyre!("issue detail does not exist")));
    }

    txn.commit()
        .await
        .wrap_internal_err("failed to commit transaction")?;

    Ok(())
}

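`update_issue_detail` records the moderator's call by translating the two-valued request verdict into the three-valued detail status, which additionally includes `Pending`. A sketch of that mapping with hypothetical local enums mirroring `DelphiVerdict`/`DelphiStatus`:

```rust
// Local stand-ins for the DelphiVerdict / DelphiStatus types.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Verdict {
    Safe,
    Unsafe,
}

#[derive(Debug, Clone, Copy, PartialEq)]
#[allow(dead_code)]
enum Status {
    Pending,
    Safe,
    Unsafe,
}

// A moderator verdict never maps back to Pending; only a fresh Delphi
// report leaves details in that state, which is what keeps the project
// in the review queue.
fn status_for(verdict: Verdict) -> Status {
    match verdict {
        Verdict::Safe => Status::Safe,
        Verdict::Unsafe => Status::Unsafe,
    }
}
```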
/// See [`add_report`].
#[derive(Debug, Clone, Serialize, Deserialize, utoipa::ToSchema)]
pub struct AddReport {
    pub file_id: FileId,
}

/// Adds a file to the technical review queue by adding an empty report, if one
/// does not already exist for it.
#[utoipa::path]
#[put("/report")]
async fn add_report(
    req: HttpRequest,
    pool: web::Data<PgPool>,
    redis: web::Data<RedisPool>,
    session_queue: web::Data<AuthQueue>,
    web::Json(add_report): web::Json<AddReport>,
) -> Result<web::Json<DelphiReportId>, ApiError> {
    check_is_moderator_from_headers(
        &req,
        &**pool,
        &redis,
        &session_queue,
        Scopes::PROJECT_WRITE,
    )
    .await?;
    let file_id = add_report.file_id;

    let mut txn = pool
        .begin()
        .await
        .wrap_internal_err("failed to begin transaction")?;

    let record = sqlx::query!(
        r#"
        SELECT
            f.url,
            COUNT(dr.id) AS "report_count!"
        FROM files f
        LEFT JOIN delphi_reports dr ON dr.file_id = f.id
        WHERE f.id = $1
        GROUP BY f.url
        "#,
        DBFileId::from(file_id) as _,
    )
    .fetch_one(&mut *txn)
    .await
    .wrap_internal_err("failed to fetch file")?;

    if record.report_count > 0 {
        return Err(ApiError::Request(eyre!("file already has reports")));
    }

    let report_id = DBDelphiReport {
        id: DelphiReportId(0),
        file_id: Some(file_id.into()),
        delphi_version: -1, // TODO
        artifact_url: record.url,
        created: Utc::now(),
        severity: DelphiSeverity::Low, // TODO
    }
    .upsert(&mut txn)
    .await
    .wrap_internal_err("failed to insert report")?;

    txn.commit()
        .await
        .wrap_internal_err("failed to commit transaction")?;

    Ok(web::Json(report_id))
}
@@ -1,3 +1,4 @@
+use crate::database::models::DelphiReportIssueDetailsId;
 use crate::file_hosting::FileHostingError;
 use crate::routes::analytics::{page_view_ingest, playtime_ingest};
 use crate::util::cors::default_cors;
@@ -7,6 +8,7 @@ use actix_files::Files;
 use actix_web::http::StatusCode;
 use actix_web::{HttpResponse, web};
 use futures::FutureExt;
+use serde_json::json;
 
 pub mod internal;
 pub mod v2;
@@ -161,8 +163,14 @@ pub enum ApiError {
     RateLimitError(u128, u32),
     #[error("Error while interacting with payment processor: {0}")]
     Stripe(#[from] stripe::StripeError),
+    #[error("Error while interacting with Delphi: {0}")]
+    Delphi(reqwest::Error),
     #[error(transparent)]
     Mural(#[from] Box<muralpay::ApiError>),
+    #[error("report still has {} issue details with no verdict", details.len())]
+    TechReviewDetailsWithNoVerdict {
+        details: Vec<DelphiReportIssueDetailsId>,
+    },
 }
 
 impl ApiError {
@@ -203,7 +211,11 @@ impl ApiError {
             Self::Stripe(..) => "stripe_error",
             Self::TaxProcessor(..) => "tax_processor_error",
             Self::Slack(..) => "slack_error",
+            Self::Delphi(..) => "delphi_error",
             Self::Mural(..) => "mural_error",
+            Self::TechReviewDetailsWithNoVerdict { .. } => {
+                "tech_review_issues_with_no_verdict"
+            }
         },
         description: match self {
             Self::Internal(e) => format!("{e:#?}"),
@@ -213,6 +225,13 @@ impl ApiError {
         },
         details: match self {
             Self::Mural(err) => serde_json::to_value(err.clone()).ok(),
+            Self::TechReviewDetailsWithNoVerdict { details } => {
+                let details = serde_json::to_value(details)
+                    .expect("details should never fail to serialize");
+                Some(json!({
+                    "issue_details": details
+                }))
+            }
             _ => None,
         },
     }
@@ -256,7 +275,11 @@ impl actix_web::ResponseError for ApiError {
             Self::Stripe(..) => StatusCode::FAILED_DEPENDENCY,
             Self::TaxProcessor(..) => StatusCode::INTERNAL_SERVER_ERROR,
             Self::Slack(..) => StatusCode::INTERNAL_SERVER_ERROR,
+            Self::Delphi(..) => StatusCode::INTERNAL_SERVER_ERROR,
             Self::Mural(..) => StatusCode::BAD_REQUEST,
+            Self::TechReviewDetailsWithNoVerdict { .. } => {
+                StatusCode::BAD_REQUEST
+            }
         }
     }
 
@@ -413,9 +413,6 @@ async fn project_create_inner(
     session_queue: &AuthQueue,
     project_id: ProjectId,
 ) -> Result<HttpResponse, CreateError> {
-    // The base URL for files uploaded to S3
-    let cdn_url = dotenvy::var("CDN_URL")?;
-
     // The currently logged in user
     let (_, current_user) = get_user_from_headers(
         &req,
@@ -651,7 +648,6 @@ async fn project_create_inner(
         uploaded_files,
         &mut created_version.files,
         &mut created_version.dependencies,
-        &cdn_url,
         &content_disposition,
         project_id,
         created_version.version_id.into(),
@@ -380,7 +380,25 @@ pub async fn thread_send_message(
     .await?
     .1;
 
-    let string: database::models::DBThreadId = info.into_inner().0.into();
+    thread_send_message_internal(
+        &user,
+        info.into_inner().0,
+        &pool,
+        new_message.into_inner(),
+        &redis,
+    )
+    .await?;
+    Ok(HttpResponse::NoContent().finish())
+}
+
+pub async fn thread_send_message_internal(
+    user: &User,
+    thread_id: ThreadId,
+    pool: &PgPool,
+    new_message: NewThreadMessage,
+    redis: &RedisPool,
+) -> Result<(), ApiError> {
+    let string: database::models::DBThreadId = thread_id.into();
 
     let is_private: bool;
 
@@ -406,7 +424,7 @@ pub async fn thread_send_message(
         if let Some(replying_to) = replying_to {
             let thread_message = database::models::DBThreadMessage::get(
                 (*replying_to).into(),
-                &**pool,
+                pool,
             )
             .await?;
 
@@ -431,10 +449,10 @@ pub async fn thread_send_message(
         ));
     }
 
-    let result = database::models::DBThread::get(string, &**pool).await?;
+    let result = database::models::DBThread::get(string, pool).await?;
 
     if let Some(thread) = result {
-        if !is_authorized_thread(&thread, &user, &pool).await? {
+        if !is_authorized_thread(&thread, user, pool).await? {
             return Err(ApiError::NotFound);
         }
 
@@ -450,10 +468,9 @@ pub async fn thread_send_message(
         .await?;
 
         if let Some(project_id) = thread.project_id {
-            let project = database::models::DBProject::get_id(
-                project_id, &**pool, &redis,
-            )
-            .await?;
+            let project =
+                database::models::DBProject::get_id(project_id, pool, redis)
+                    .await?;
 
             if let Some(project) = project
                 && project.inner.status != ProjectStatus::Processing
@@ -463,8 +480,8 @@ pub async fn thread_send_message(
                 let members =
                     database::models::DBTeamMember::get_from_team_full(
                         project.inner.team_id,
-                        &**pool,
-                        &redis,
+                        pool,
+                        redis,
                     )
                     .await?;
 
@@ -479,7 +496,7 @@ pub async fn thread_send_message(
                     .insert_many(
                         members.iter().map(|x| x.user_id).collect(),
                         &mut transaction,
-                        &redis,
+                        redis,
                     )
                     .await?;
 
@@ -491,15 +508,14 @@ pub async fn thread_send_message(
                     .insert_many(
                         members.iter().map(|x| x.user_id).collect(),
                         &mut transaction,
-                        &redis,
+                        redis,
                     )
                     .await?;
             }
         } else if let Some(report_id) = thread.report_id {
-            let report = database::models::report_item::DBReport::get(
-                report_id, &**pool,
-            )
-            .await?;
+            let report =
+                database::models::report_item::DBReport::get(report_id, pool)
+                    .await?;
 
             if let Some(report) = report {
                 if report.closed && !user.role.is_mod() {
@@ -517,7 +533,7 @@ pub async fn thread_send_message(
                         report_id: Some(report.id.into()),
                     },
                 }
-                .insert(report.reporter, &mut transaction, &redis)
+                .insert(report.reporter, &mut transaction, redis)
                 .await?;
             }
         }
@@ -531,7 +547,7 @@ pub async fn thread_send_message(
                 if let Some(db_image) = image_item::DBImage::get(
                     (*image_id).into(),
                     &mut *transaction,
-                    &redis,
+                    redis,
                 )
                 .await?
                 {
@@ -558,7 +574,7 @@ pub async fn thread_send_message(
                     .execute(&mut *transaction)
                     .await?;
 
-                    image_item::DBImage::clear_cache(image.id.into(), &redis)
+                    image_item::DBImage::clear_cache(image.id.into(), redis)
                         .await?;
                 } else {
                     return Err(ApiError::InvalidInput(format!(
@@ -570,7 +586,7 @@ pub async fn thread_send_message(
 
         transaction.commit().await?;
 
-        Ok(HttpResponse::NoContent().body(""))
+        Ok(())
     } else {
         Err(ApiError::NotFound)
     }
@@ -630,14 +646,7 @@ pub async fn message_delete(
         .await?;
     }
 
-    let private = if let MessageBody::Text { private, .. } = thread.body {
-        private
-    } else if let MessageBody::Deleted { private, .. } = thread.body {
-        private
-    } else {
-        false
-    };
-
+    let private = thread.body.is_private();
     database::models::DBThreadMessage::remove_full(
         thread.id,
         private,
@@ -38,7 +38,6 @@ use sha1::Digest;
 use sqlx::postgres::PgPool;
 use std::collections::{HashMap, HashSet};
 use std::sync::Arc;
-use tracing::error;
 use validator::Validate;
 
 fn default_requested_status() -> VersionStatus {
@@ -158,8 +157,6 @@ async fn version_create_inner(
     session_queue: &AuthQueue,
     moderation_queue: &AutomatedModerationQueue,
 ) -> Result<HttpResponse, CreateError> {
-    let cdn_url = dotenvy::var("CDN_URL")?;
-
     let mut initial_version_data = None;
     let mut version_builder = None;
     let mut selected_loaders = None;
@@ -355,7 +352,6 @@ async fn version_create_inner(
         uploaded_files,
         &mut version.files,
         &mut version.dependencies,
-        &cdn_url,
         &content_disposition,
         version.project_id.into(),
         version.version_id.into(),
@@ -451,6 +447,7 @@ async fn version_create_inner(
         .files
         .iter()
         .map(|file| VersionFile {
+            id: None,
             hashes: file
                 .hashes
                 .iter()
@@ -590,8 +587,6 @@ async fn upload_file_to_version_inner(
     version_id: models::DBVersionId,
     session_queue: &AuthQueue,
 ) -> Result<HttpResponse, CreateError> {
-    let cdn_url = dotenvy::var("CDN_URL")?;
-
     let mut initial_file_data: Option<InitialFileData> = None;
     let mut file_builders: Vec<VersionFileBuilder> = Vec::new();
 
@@ -741,7 +736,6 @@ async fn upload_file_to_version_inner(
         uploaded_files,
         &mut file_builders,
         &mut dependencies,
-        &cdn_url,
         &content_disposition,
         project_id,
         version_id.into(),
@@ -795,7 +789,6 @@ pub async fn upload_file(
     uploaded_files: &mut Vec<UploadedFile>,
     version_files: &mut Vec<VersionFileBuilder>,
     dependencies: &mut Vec<DependencyBuilder>,
-    cdn_url: &str,
     content_disposition: &actix_web::http::header::ContentDisposition,
     project_id: ProjectId,
     version_id: VersionId,
@@ -943,13 +936,11 @@ pub async fn upload_file(
         || total_files_len == 1;
 
     let file_path_encode = format!(
-        "data/{}/versions/{}/{}",
-        project_id,
-        version_id,
+        "data/{project_id}/versions/{version_id}/{}",
         urlencoding::encode(file_name)
     );
     let file_path =
-        format!("data/{}/versions/{}/{}", project_id, version_id, &file_name);
+        format!("data/{project_id}/versions/{version_id}/{file_name}");
 
     let upload_data = file_host
         .upload_file(content_type, &file_path, FileHostPublicity::Public, data)
@@ -980,33 +971,9 @@ pub async fn upload_file(
         return Err(CreateError::InvalidInput(msg.to_string()));
     }
 
-    let url = format!("{cdn_url}/{file_path_encode}");
-
-    let client = reqwest::Client::new();
-    let delphi_url = dotenvy::var("DELPHI_URL")?;
-    match client
-        .post(delphi_url)
-        .json(&serde_json::json!({
-            "url": url,
-            "project_id": project_id,
-            "version_id": version_id,
-        }))
-        .send()
-        .await
-    {
-        Ok(res) => {
-            if !res.status().is_success() {
-                error!("Failed to upload file to Delphi: {url}");
-            }
-        }
-        Err(e) => {
-            error!("Failed to upload file to Delphi: {url}: {e}");
-        }
-    }
-
     version_files.push(VersionFileBuilder {
         filename: file_name.to_string(),
-        url: format!("{cdn_url}/{file_path_encode}"),
+        url: format!("{}/{file_path_encode}", dotenvy::var("CDN_URL")?),
         hashes: vec![
             models::version_item::HashBuilder {
                 algorithm: "sha1".to_string(),