Compare commits


167 Commits

Author SHA1 Message Date
24c756a9a8 🗑️ Remove gateway, which got replaced by the turbine one 2025-12-13 19:49:56 +08:00
7ecb64742f ♻️ Updated discovery resolver 2025-12-13 19:28:24 +08:00
3a7140f0a6 ♻️ Update service discovery code 2025-12-13 18:52:55 +08:00
42082fbefa 🔨 Reconfigured to use new discovery 2025-12-13 17:38:49 +08:00
bc3d030a1e New service discovery system 2025-12-13 14:23:28 +08:00
8642737a07 Configurable post page 2025-12-12 00:10:57 +08:00
8181938aaf Managed mode page will render with layout 2025-12-11 22:25:40 +08:00
922afc2239 🐛 Fix realm query 2025-12-10 22:59:18 +08:00
a071bd2738 Publication site global config data structure 2025-12-10 19:33:00 +08:00
43945fc524 🐛 Fix discovery realms order incorrect 2025-12-07 14:28:41 +08:00
e477429a35 👔 Increase the chance of other types of activities showing up
🗑️ Remove debug include in timeline
2025-12-06 21:12:08 +08:00
fe3a057185 👔 Discovery realms will show desc by member count 2025-12-06 21:10:08 +08:00
ad3c104c5c Proper trace for auth session 2025-12-04 00:38:44 +08:00
2020d625aa 🗃️ Add migration of add sticker pack icon 2025-12-04 00:27:09 +08:00
f471c5635d Post article thumbnail 2025-12-04 00:26:54 +08:00
eaeaa28c60 Sticker icon 2025-12-04 00:19:36 +08:00
ee5c7cb7ce 🐛 Fix get device API 2025-12-03 23:29:31 +08:00
33abf12e41 🐛 Fix Pass service swagger docs duplicate schema name causing 500 2025-12-03 22:46:47 +08:00
4a71f92ef0 ♻️ Updated auth challenges and device API to fit new design 2025-12-03 22:43:35 +08:00
4faa1a4b64 🐛 Fix message pack cache serialize issue in sticker 2025-12-03 22:09:56 +08:00
e49a1ec49a Push token clean up when invalid 2025-12-03 21:42:18 +08:00
a88f42b26a Rolling back to old logic to provide mock device id in websocket gateway 2025-12-03 21:30:29 +08:00
c45be62331 Support switching from JSON to MessagePack in cache during runtime 2025-12-03 21:27:26 +08:00
c8228e0c8e Use JSON to serialize cache 2025-12-03 01:47:57 +08:00
c642c6d646 Resend self activation email API 2025-12-03 01:17:39 +08:00
270c211cb8 ♻️ Refactored to make a simpler auth session system 2025-12-03 00:38:28 +08:00
74c8f3490d 🐛 Fix the message pack serializer 2025-12-03 00:38:12 +08:00
b364edc74b Use Json Serializer in cache again 2025-12-02 22:59:43 +08:00
9addf38677 🐛 Enable contractless serialization in cache to fix message pack serializer 2025-12-02 22:51:12 +08:00
a02ed10434 🐛 Fix use wrong DI type in cache service 2025-12-02 22:45:30 +08:00
aca28f9318 ♻️ Refactored the cache service 2025-12-02 22:38:47 +08:00
c2f72993b7 🐛 Fix app snapshot not included in release 2025-12-02 21:52:24 +08:00
158cc75c5b 💥 Simplified permission node system and data structure 2025-12-02 21:42:26 +08:00
fa2f53ff7a 🐛 Fix file reference created with wrong date 2025-12-02 21:03:57 +08:00
2cce5ebf80 Use affiliation spell for registration 2025-12-02 00:54:57 +08:00
13b2e46ecc Affiliation spell CRUD 2025-12-01 23:33:48 +08:00
cbd68c9ae6 Proper site manager send file method 2025-12-01 22:55:20 +08:00
b99b61e0f9 🐛 Fix chat backward compatibility 2025-11-30 21:33:39 +08:00
94f4e68120 Timeout prevent send message logic 2025-11-30 21:13:54 +08:00
d5510f7e4d Chat timeout APIs
🐛 Fix member listing in chat
2025-11-30 21:08:07 +08:00
c038ab9e3c ♻️ A more robust and simpler chat system 2025-11-30 20:58:48 +08:00
e97719ec84 🗃️ Add missing account id migrations 2025-11-30 20:13:15 +08:00
40b8ea8eb8 🗃️ Bring account id back to chat room 2025-11-30 19:59:30 +08:00
f9b4dd45d7 🐛 Trying to fix relationship bugs 2025-11-30 17:52:19 +08:00
a46de4662c 🐛 Fix gateway 2025-11-30 17:51:27 +08:00
fdd14b860e 🐛 Fix wrong required status of validate account create request 2025-11-30 17:37:34 +08:00
cb62df81e2 👔 Adjust lookup account logic 2025-11-30 17:20:20 +08:00
46717e39a7 Admin delete account endpoint 2025-11-30 17:19:33 +08:00
344ed6e348 Account validation endpoint 2025-11-30 17:16:11 +08:00
a8b62fb0eb Auth via authorized device 2025-11-30 00:00:13 +08:00
00b3087d6a ♻️ Refactored auth service for better security 2025-11-29 18:00:23 +08:00
78f3873a0c 🐛 Fix birthday check in 2025-11-27 22:22:22 +08:00
a7f4173df7 Special birthday check in tips 2025-11-27 21:49:25 +08:00
f51c3c1724 🐛 Fix birthday check in result didn't show up 2025-11-27 21:41:30 +08:00
a92dc7e140 👔 Remove single file 1MB limit in site 2025-11-24 22:54:16 +08:00
c42befed6b ♻️ Refactored notification meta 2025-11-23 13:20:40 +08:00
2b95d58611 All unread messages endpoint 2025-11-23 12:28:57 +08:00
726a752fbb :zap: Pagination in chat sync 2025-11-23 12:07:58 +08:00
2024972832 🐛 Trying to fix Pass service issues 2025-11-23 03:02:51 +08:00
d553ca2ca7 🐛 Dozens of bug fixes in chat 2025-11-23 01:17:15 +08:00
aeef16495f 🐛 Fix sitemap and rss still respond all types of posts 2025-11-22 18:55:29 +08:00
9b26a2a7eb 🐛 Fix replace of markdown conversion 2025-11-22 18:53:48 +08:00
2317033dae 👔 Stop rendering post attachments in article post on hosted pages 2025-11-22 18:24:32 +08:00
fd6e9c9780 🐛 Fix some stupid bugs 2025-11-22 18:22:53 +08:00
af0a2ff493 💄 Enrich post subscription notification 2025-11-22 18:08:11 +08:00
b142a71c32 🐛 Fix publisher member not including publisher in response 2025-11-22 17:57:17 +08:00
27e3cc853a 🐛 Fix post service grpc call made type filter wrong 2025-11-22 17:55:45 +08:00
590519c28f 🐛 Fix index shows all type of posts in managed page 2025-11-22 17:53:52 +08:00
8ccf8100d4 👔 Make listing on the hosted page shows article only 2025-11-22 17:50:19 +08:00
ec21a94921 🐛 Several bug fixes in hosted page 2025-11-22 17:43:52 +08:00
7b7a6c9218 Extend the ability of the hosted page markdown parser 2025-11-22 17:40:17 +08:00
0e44d9c514 🐛 Fix publisher invite controller still use int user id 2025-11-22 17:25:45 +08:00
e449e16d33 🐛 Fix pagination overflow in hosted page 2025-11-22 17:20:36 +08:00
3ce2b36c15 🐛 Fix featured post on hosted page uses wrong order 2025-11-22 17:13:02 +08:00
f7388822e0 🐛 Unable to use random split in open fund 2025-11-22 16:54:29 +08:00
3800dae8b7 SEO optimization on the hosted pages 2025-11-22 16:45:44 +08:00
c62ed191f3 File deploy smart mode 2025-11-22 16:00:30 +08:00
8b77f0e0ad Site management purge files and deploy from zip 2025-11-22 15:50:20 +08:00
2b56c6f1e5 Static site hosting support access directory as index.html 2025-11-22 15:49:29 +08:00
ef02265ccd 💄 Optimize hosted page index 2025-11-22 14:16:40 +08:00
f4505d2ecc 💄 Add titles to the hosted pages 2025-11-22 13:49:26 +08:00
9d2242d331 💄 Hosted page SEO optimization 2025-11-22 13:42:13 +08:00
c806365a81 Render markdown on hosted pages 2025-11-22 13:28:37 +08:00
bd1715c9a3 💄 Optimize hosted post details page 2025-11-22 13:17:34 +08:00
0b0598712e 💄 Updated the hosted site post page 2025-11-22 13:09:57 +08:00
92a4899e7c The posts page basis 2025-11-22 02:33:22 +08:00
bdc8db3091 About page also contains site info 2025-11-22 02:18:57 +08:00
a16da37221 Account about page 2025-11-22 01:47:10 +08:00
70a18b07ff 🐛 Bug fixes in the publication site hosting 2025-11-21 23:36:38 +08:00
98b8d5f33b ♻️ New error page 2025-11-21 23:30:43 +08:00
2a35786204 🐛 Fix self-managed files hosting 2025-11-21 22:27:27 +08:00
7016a0a943 Render self-managed site 2025-11-21 01:55:22 +08:00
cad72502d9 Managed page rendering 2025-11-21 01:41:25 +08:00
226a64df41 💄 Optimize the page rendering in zone 2025-11-21 01:21:40 +08:00
75b8567a28 🐛 Fix file management of the site 2025-11-21 00:40:58 +08:00
3aa5561a07 🐛 Fix hosted sites 2025-11-20 23:47:41 +08:00
c0ebb496fe Site manager 2025-11-20 22:54:24 +08:00
afccb27bd4 Site mode 2025-11-20 22:40:36 +08:00
6ed96780ab 💥 Improvements in the URL of the publication site 2025-11-20 21:29:32 +08:00
8e5cdfbc62 Zone site placeholder 2025-11-19 23:14:22 +08:00
1b774c1de6 ♻️ Moved the site to the Zone project 2025-11-19 22:34:01 +08:00
9b4cbade5c :heavy_plus_sign: Add alpine.js to zone 2025-11-19 22:05:26 +08:00
a52e54f672 🔨 Setup the docker build for tailwindcss 2025-11-19 22:02:26 +08:00
aa48d5e25d 🔨 Setup the tailwindcss and daisyui frontend for the zone 2025-11-19 21:33:14 +08:00
ce18b194a5 🔨 Finish the initial setup of the Zone project 2025-11-19 21:12:07 +08:00
382579a20e 🎉 Initial commit for the Zone project 2025-11-19 21:04:44 +08:00
18d50346a9 👔 Update publication site limits for perk members 2025-11-19 00:48:36 +08:00
ac51bbde6c Publication Sites aka Solian Pages 2025-11-18 23:39:00 +08:00
4ab0dcf1c2 🐛 Fix file reference JSON loop 2025-11-18 21:52:21 +08:00
587066d847 Delete files in batch API 2025-11-18 20:33:02 +08:00
faa375042a New drive api order etc 2025-11-18 18:50:39 +08:00
65b6f3a606 🐛 Fix bugs 2025-11-18 18:40:23 +08:00
fa1a40c637 File references listing endpoint 2025-11-18 01:06:02 +08:00
d43ce7cb11 🗑️ Remove the fast upload endpoint 2025-11-18 00:55:29 +08:00
92b28d830d Drive file name query 2025-11-18 00:48:35 +08:00
1fa6c893a5 🐛 Fix compile errors 2025-11-18 00:34:50 +08:00
ba57becba8 ♻️ Replace the soft delete logic with the new shared one 2025-11-17 23:43:59 +08:00
4280168002 🐛 Try to fix the soft delete filter not working in drive 2025-11-17 23:19:03 +08:00
a172128d84 🐛 Hide wrongly exposed method in FileController 2025-11-17 22:37:10 +08:00
34e78294a1 Unindexed files have a similar filter to the list file API 2025-11-17 22:20:49 +08:00
82afdb3922 🐛 Fix unable to claim fund due to db issue 2025-11-17 01:12:00 +08:00
260b3e7bc6 🐛 Fix receive fund to save db together to prevent concurrent db save 2025-11-17 00:49:10 +08:00
713777cd8a 🐛 Trying to fix "actually affected 0 rows" error 2025-11-17 00:43:12 +08:00
5cd09bc2d0 Open fund total amount of splits 2025-11-17 00:36:15 +08:00
861fc7cafa 🐛 Tried to fix fund claim concurrency issue 2025-11-17 00:18:57 +08:00
6313f15375 Open funds 2025-11-16 23:32:03 +08:00
337cc1be97 👔 Allow to send poll only message 2025-11-16 22:52:43 +08:00
9b4f61fcda Embeddable funds
 Chat message embeddable poll
2025-11-16 21:22:45 +08:00
6252988390 Optimize typing indicator 2025-11-16 20:41:34 +08:00
aace3b48b1 Sharable thought 2025-11-16 20:36:04 +08:00
5a097c7518 🐛 Allow user to implicitly set oidc flow type 2025-11-16 18:30:03 +08:00
ba3be1e3bb 🔊 Add verbose logs for oidc 2025-11-16 17:05:28 +08:00
6fd90c424d ♻️ Refactored oidc onboard flow 2025-11-16 15:05:29 +08:00
a0ac3b5820 Friends overview online filter 2025-11-16 13:31:07 +08:00
076bf347c8 Account friends overview endpoint 2025-11-16 12:29:56 +08:00
788326381f Multi model support 2025-11-16 02:44:44 +08:00
a035b23242 Support multiple models in thought 2025-11-16 01:22:07 +08:00
b29f4fce4d Insight proper payment validation 2025-11-16 01:06:33 +08:00
5418489f77 🐛 Fix function call bug, for real this time 2025-11-16 00:52:02 +08:00
310f2c1497 🐛 Fix function call in chat history issue 2025-11-16 00:34:31 +08:00
0ae8a2cfd4 🐛 Fix function calls in thought 2025-11-15 23:43:22 +08:00
c69256bda6 🐛 Fix some issues in new thought system 2025-11-15 17:11:39 +08:00
80ea44f2cc ♻️ Refactored the think message part 2025-11-15 16:21:26 +08:00
b5f9faa724 ♻️ Refactored the thought Solar Network related plugins 2025-11-15 13:05:58 +08:00
05985e0852 Unindexed files 2025-11-15 02:59:26 +08:00
6814b5690e ⬇️ Downgrade EFCore to 9 from 10 since it's not ready for use 2025-11-15 00:18:34 +08:00
78447de1b6 🔨 Update dockerfile to use dotnet 10 images as base instead of 9 2025-11-14 23:54:16 +08:00
e54dcccad9 🐛 Fix obsolete API call according to https://github.com/aspnet/Announcements/issues/523 2025-11-14 23:53:20 +08:00
429a08930f ♻️ Refactored the server-side versioning by move that logic to Gateway only 2025-11-14 23:49:38 +08:00
b94b288755 ⬆️ Upgrade PgSQL to 10.0.0-rc.2 2025-11-14 23:45:05 +08:00
1c50c2f822 ⬆️ Upgrade dependencies to use dotnet10 version 2025-11-14 23:01:33 +08:00
73700e7cfd Revert "♻️ Proper folder system to index"
This reverts commit 1647aa2f1e.
2025-11-14 22:11:21 +08:00
bd2943345a ⬆️ Upgrade the dotnet framework to 10.0 2025-11-14 22:11:16 +08:00
1647aa2f1e ♻️ Proper folder system to index 2025-11-14 01:03:59 +08:00
b137021b1f File index controller returns folders 2025-11-13 01:32:25 +08:00
ffca94f789 🐛 Fix some issues when creating duplicate indexes, and instant upload not creating an index when triggered 2025-11-13 01:12:13 +08:00
e2b2bdd262 File index 2025-11-12 22:09:13 +08:00
ce715cd6b0 👔 Check in algo v3 2025-11-11 01:03:26 +08:00
f7b3926338 👔 Optimize push notification logic 2025-11-11 00:38:43 +08:00
68cd23d64f 🐛 Fixes in track tasks 2025-11-10 23:58:12 +08:00
db7d994039 🐛 Fix bugs 2025-11-10 02:06:21 +08:00
741ed18ce5 🐛 Fixes for drive task tracking 2025-11-10 01:53:58 +08:00
2bfb50cc71 🐛 Dozens of bug fixes to new task system 2025-11-10 00:14:41 +08:00
db98fa240e ♻️ Merge the upload tasks and common tasks handling 2025-11-09 21:18:13 +08:00
d96937aabc 🐛 Fixes in the upload tasks 2025-11-09 18:49:35 +08:00
dc0be3467f 🚚 Move emails razor templates 2025-11-09 14:08:13 +08:00
6101de741f ♻️ Refactored emails 2025-11-09 14:06:12 +08:00
338 changed files with 47766 additions and 6242 deletions

View File

@@ -27,8 +27,8 @@ jobs:
run: |
files="${{ steps.changed-files.outputs.files }}"
matrix="{\"include\":[]}"
services=("Sphere" "Pass" "Ring" "Drive" "Develop" "Gateway" "Insight")
images=("sphere" "pass" "ring" "drive" "develop" "gateway" "insight")
services=("Sphere" "Pass" "Ring" "Drive" "Develop" "Gateway" "Insight" "Zone")
images=("sphere" "pass" "ring" "drive" "develop" "gateway" "insight" "zone")
changed_services=()
for file in $files; do
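The hunk above only shows the arrays gaining "Zone"/"zone"; the loop body that maps changed files to matrix entries is cut off. A sketch of how such a matrix can be assembled (the file paths and loop body here are assumptions for illustration, not the workflow's actual code):

```shell
# Hypothetical input; in the workflow this comes from the changed-files step.
files="DysonNetwork.Zone/Pages/Index.cshtml DysonNetwork.Drive/AppDatabase.cs"
services=("Sphere" "Pass" "Ring" "Drive" "Develop" "Gateway" "Insight" "Zone")
images=("sphere" "pass" "ring" "drive" "develop" "gateway" "insight" "zone")
changed_services=()
for file in $files; do
  for i in "${!services[@]}"; do
    # A file under DysonNetwork.<Service>/ marks that service's image as changed.
    case "$file" in
      "DysonNetwork.${services[$i]}/"*) changed_services+=("${images[$i]}") ;;
    esac
  done
done
# Deduplicate before building the {"include":[...]} matrix JSON
printf '%s\n' "${changed_services[@]}" | sort -u
```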

View File

@@ -1,71 +0,0 @@
using Microsoft.Extensions.Hosting;
var builder = DistributedApplication.CreateBuilder(args);
var isDev = builder.Environment.IsDevelopment();
var cache = builder.AddRedis("cache");
var queue = builder.AddNats("queue").WithJetStream();
var ringService = builder.AddProject<Projects.DysonNetwork_Ring>("ring");
var passService = builder.AddProject<Projects.DysonNetwork_Pass>("pass")
.WithReference(ringService);
var driveService = builder.AddProject<Projects.DysonNetwork_Drive>("drive")
.WithReference(passService)
.WithReference(ringService);
var sphereService = builder.AddProject<Projects.DysonNetwork_Sphere>("sphere")
.WithReference(passService)
.WithReference(ringService)
.WithReference(driveService);
var developService = builder.AddProject<Projects.DysonNetwork_Develop>("develop")
.WithReference(passService)
.WithReference(ringService)
.WithReference(sphereService);
var insightService = builder.AddProject<Projects.DysonNetwork_Insight>("insight")
.WithReference(passService)
.WithReference(ringService)
.WithReference(sphereService)
.WithReference(developService);
passService.WithReference(developService).WithReference(driveService);
List<IResourceBuilder<ProjectResource>> services =
[ringService, passService, driveService, sphereService, developService, insightService];
for (var idx = 0; idx < services.Count; idx++)
{
var service = services[idx];
service.WithReference(cache).WithReference(queue);
var grpcPort = 7002 + idx;
if (isDev)
{
service.WithEnvironment("GRPC_PORT", grpcPort.ToString());
var httpPort = 8001 + idx;
service.WithEnvironment("HTTP_PORTS", httpPort.ToString());
service.WithHttpEndpoint(httpPort, targetPort: null, isProxied: false, name: "http");
}
else
{
service.WithHttpEndpoint(8080, targetPort: null, isProxied: false, name: "http");
}
service.WithEndpoint(isDev ? grpcPort : 7001, isDev ? null : 7001, "https", name: "grpc", isProxied: false);
}
// Extra double-ended references
ringService.WithReference(passService);
var gateway = builder.AddProject<Projects.DysonNetwork_Gateway>("gateway")
.WithEnvironment("HTTP_PORTS", "5001")
.WithHttpEndpoint(port: 5001, targetPort: null, isProxied: false, name: "http");
foreach (var service in services)
gateway.WithReference(service);
builder.AddDockerComposeEnvironment("docker-compose");
builder.Build().Run();

View File

@@ -1,28 +0,0 @@
<Project Sdk="Microsoft.NET.Sdk">
<Sdk Name="Aspire.AppHost.Sdk" Version="9.5.2" />
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net9.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<UserSecretsId>a68b3195-a00d-40c2-b5ed-d675356b7cde</UserSecretsId>
<RootNamespace>DysonNetwork.Control</RootNamespace>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Aspire.Hosting.AppHost" Version="9.5.2" />
<PackageReference Include="Aspire.Hosting.Docker" Version="9.4.2-preview.1.25428.12" />
<PackageReference Include="Aspire.Hosting.Nats" Version="9.5.2" />
<PackageReference Include="Aspire.Hosting.Redis" Version="9.5.2" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\DysonNetwork.Develop\DysonNetwork.Develop.csproj" />
<ProjectReference Include="..\DysonNetwork.Drive\DysonNetwork.Drive.csproj" />
<ProjectReference Include="..\DysonNetwork.Pass\DysonNetwork.Pass.csproj" />
<ProjectReference Include="..\DysonNetwork.Ring\DysonNetwork.Ring.csproj" />
<ProjectReference Include="..\DysonNetwork.Sphere\DysonNetwork.Sphere.csproj" />
<ProjectReference Include="..\DysonNetwork.Gateway\DysonNetwork.Gateway.csproj" />
<ProjectReference Include="..\DysonNetwork.Insight\DysonNetwork.Insight.csproj" />
</ItemGroup>
</Project>

View File

@@ -1,32 +0,0 @@
{
"$schema": "https://json.schemastore.org/launchsettings.json",
"profiles": {
"https": {
"commandName": "Project",
"dotnetRunMessages": true,
"launchBrowser": true,
"applicationUrl": "https://localhost:17025;http://localhost:15057",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development",
"DOTNET_ENVIRONMENT": "Development",
"ASPIRE_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:21175",
"ASPIRE_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:22189",
"DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:21260",
"DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:22052"
}
},
"http": {
"commandName": "Project",
"dotnetRunMessages": true,
"launchBrowser": true,
"applicationUrl": "http://localhost:15057",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development",
"DOTNET_ENVIRONMENT": "Development",
"ASPIRE_DASHBOARD_OTLP_ENDPOINT_URL": "http://localhost:19163",
"ASPIRE_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:20185",
"DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:22108"
}
}
}
}

View File

@@ -1,11 +0,0 @@
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
}
},
"ConnectionStrings": {
"cache": "localhost:6379"
}
}

View File

@@ -1,3 +1,4 @@
using DysonNetwork.Shared.Data;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Design;
@@ -33,36 +34,15 @@ public class AppDatabase(
public override async Task<int> SaveChangesAsync(CancellationToken cancellationToken = default)
{
var now = SystemClock.Instance.GetCurrentInstant();
foreach (var entry in ChangeTracker.Entries<ModelBase>())
{
switch (entry.State)
{
case EntityState.Added:
entry.Entity.CreatedAt = now;
entry.Entity.UpdatedAt = now;
break;
case EntityState.Modified:
entry.Entity.UpdatedAt = now;
break;
case EntityState.Deleted:
entry.State = EntityState.Modified;
entry.Entity.DeletedAt = now;
break;
case EntityState.Detached:
case EntityState.Unchanged:
default:
break;
}
}
this.ApplyAuditableAndSoftDelete();
return await base.SaveChangesAsync(cancellationToken);
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
base.OnModelCreating(modelBuilder);
modelBuilder.ApplySoftDeleteFilters();
}
}
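The per-context timestamp and soft-delete bookkeeping deleted here is replaced by a single call to `this.ApplyAuditableAndSoftDelete()`. The shared helper itself is not part of this compare view; a minimal sketch of what it plausibly looks like, reconstructed from the deleted switch block (the class name is an assumption):

```csharp
using Microsoft.EntityFrameworkCore;
using NodaTime;

// Hypothetical sketch; the real extension lives in DysonNetwork.Shared
// and is not shown in this diff.
public static class AuditableExtensions
{
    public static void ApplyAuditableAndSoftDelete(this DbContext context)
    {
        var now = SystemClock.Instance.GetCurrentInstant();
        foreach (var entry in context.ChangeTracker.Entries<ModelBase>())
        {
            switch (entry.State)
            {
                case EntityState.Added:
                    entry.Entity.CreatedAt = now;
                    entry.Entity.UpdatedAt = now;
                    break;
                case EntityState.Modified:
                    entry.Entity.UpdatedAt = now;
                    break;
                case EntityState.Deleted:
                    // Turn hard deletes into soft deletes
                    entry.State = EntityState.Modified;
                    entry.Entity.DeletedAt = now;
                    break;
            }
        }
    }
}
```

Centralizing this in `DysonNetwork.Shared` lets every service's `SaveChangesAsync` override shrink to one line, paired with `modelBuilder.ApplySoftDeleteFilters()` on the query side.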

View File

@@ -1,10 +1,10 @@
FROM mcr.microsoft.com/dotnet/aspnet:9.0 AS base
FROM mcr.microsoft.com/dotnet/aspnet:10.0 AS base
USER $APP_UID
WORKDIR /app
EXPOSE 8080
EXPOSE 8081
FROM mcr.microsoft.com/dotnet/sdk:9.0 AS build
FROM mcr.microsoft.com/dotnet/sdk:10.0 AS build
ARG BUILD_CONFIGURATION=Release
WORKDIR /src
COPY ["DysonNetwork.Develop/DysonNetwork.Develop.csproj", "DysonNetwork.Develop/"]

View File

@@ -1,23 +1,19 @@
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net9.0</TargetFramework>
<TargetFramework>net10.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="EFCore.NamingConventions" Version="9.0.0" />
<PackageReference Include="Microsoft.AspNetCore.OpenApi" Version="9.0.10" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.10">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
<PackageReference Include="Microsoft.AspNetCore.OpenApi" Version="10.0.0" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.11">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="NodaTime.Serialization.Protobuf" Version="2.0.2" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="9.0.4"/>
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.NodaTime" Version="9.0.4" />
<PackageReference Include="Swashbuckle.AspNetCore" Version="9.0.6" />
<PackageReference Include="NodaTime" Version="3.2.2"/>
<PackageReference Include="NodaTime.Serialization.SystemTextJson" Version="1.3.0"/>
<PackageReference Include="Grpc.AspNetCore.Server" Version="2.71.0"/>

View File

@@ -69,7 +69,7 @@ public class DeveloperController(
[HttpPost("{name}/enroll")]
[Authorize]
[RequiredPermission("global", "developers.create")]
[AskPermission("developers.create")]
public async Task<ActionResult<SnDeveloper>> EnrollDeveloperProgram(string name)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser) return Unauthorized();

View File

@@ -7,16 +7,15 @@ using Microsoft.EntityFrameworkCore;
var builder = WebApplication.CreateBuilder(args);
builder.AddServiceDefaults();
builder.AddServiceDefaults("develop");
builder.Services.Configure<ServiceRegistrationOptions>(opts => { opts.Name = "develop"; });
builder.ConfigureAppKestrel(builder.Configuration);
builder.Services.AddAppServices(builder.Configuration);
builder.Services.AddAppAuthentication();
builder.Services.AddDysonAuth();
builder.Services.AddSphereService();
builder.Services.AddAccountService();
builder.Services.AddDriveService();
builder.AddSwaggerManifest(
"DysonNetwork.Develop",

View File

@@ -16,7 +16,7 @@ public static class ApplicationConfiguration
app.UseAuthentication();
app.UseAuthorization();
app.UseMiddleware<PermissionMiddleware>();
app.UseMiddleware<RemotePermissionMiddleware>();
app.MapControllers();

View File

@@ -16,9 +16,7 @@ public static class ServiceCollectionExtensions
services.AddLocalization();
services.AddDbContext<AppDatabase>();
services.AddSingleton<IClock>(SystemClock.Instance);
services.AddHttpContextAccessor();
services.AddSingleton<ICacheService, CacheServiceRedis>();
services.AddHttpClient();

View File

@@ -1,22 +1,31 @@
{
"Debug": true,
"BaseUrl": "http://localhost:5071",
"SiteUrl": "https://solian.app",
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
"Debug": true,
"BaseUrl": "http://localhost:5071",
"SiteUrl": "https://solian.app",
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
}
},
"AllowedHosts": "*",
"ConnectionStrings": {
"App": "Host=localhost;Port=5432;Database=dyson_develop;Username=postgres;Password=postgres;Include Error Detail=True;Maximum Pool Size=20;Connection Idle Lifetime=60",
"Registrar": "127.0.0.1:2379",
"Cache": "127.0.0.1:6379",
"Queue": "127.0.0.1:4222"
},
"KnownProxies": [
"127.0.0.1",
"::1"
],
"Swagger": {
"PublicBasePath": "/develop"
},
"Cache": {
"Serializer": "MessagePack"
},
"Etcd": {
"Insecure": true
}
},
"AllowedHosts": "*",
"ConnectionStrings": {
"App": "Host=localhost;Port=5432;Database=dyson_develop;Username=postgres;Password=postgres;Include Error Detail=True;Maximum Pool Size=20;Connection Idle Lifetime=60"
},
"KnownProxies": ["127.0.0.1", "::1"],
"Swagger": {
"PublicBasePath": "/develop"
},
"Etcd": {
"Insecure": true
}
}

View File

@@ -1,14 +1,14 @@
using System.Linq.Expressions;
using System.Reflection;
using DysonNetwork.Drive.Billing;
using DysonNetwork.Drive.Storage;
using DysonNetwork.Drive.Storage.Model;
using DysonNetwork.Shared.Data;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Design;
using Microsoft.EntityFrameworkCore.Query;
using NodaTime;
using Quartz;
using TaskStatus = DysonNetwork.Drive.Storage.Model.TaskStatus;
namespace DysonNetwork.Drive;
@@ -23,7 +23,8 @@ public class AppDatabase(
public DbSet<QuotaRecord> QuotaRecords { get; set; } = null!;
public DbSet<SnCloudFile> Files { get; set; } = null!;
public DbSet<CloudFileReference> FileReferences { get; set; } = null!;
public DbSet<SnCloudFileReference> FileReferences { get; set; } = null!;
public DbSet<SnCloudFileIndex> FileIndexes { get; set; }
public DbSet<PersistentTask> Tasks { get; set; } = null!;
public DbSet<PersistentUploadTask> UploadTasks { get; set; } = null!; // Backward compatibility
@@ -44,61 +45,12 @@ public class AppDatabase(
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
base.OnModelCreating(modelBuilder);
// Apply soft-delete filter only to root entities, not derived types
foreach (var entityType in modelBuilder.Model.GetEntityTypes())
{
if (!typeof(ModelBase).IsAssignableFrom(entityType.ClrType)) continue;
// Skip derived types to avoid filter conflicts
var clrType = entityType.ClrType;
if (clrType.BaseType != typeof(object) &&
typeof(ModelBase).IsAssignableFrom(clrType.BaseType))
{
continue; // Skip derived types
}
var method = typeof(AppDatabase)
.GetMethod(nameof(SetSoftDeleteFilter),
BindingFlags.NonPublic | BindingFlags.Static)!
.MakeGenericMethod(clrType);
method.Invoke(null, [modelBuilder]);
}
}
private static void SetSoftDeleteFilter<TEntity>(ModelBuilder modelBuilder)
where TEntity : ModelBase
{
modelBuilder.Entity<TEntity>().HasQueryFilter(e => e.DeletedAt == null);
modelBuilder.ApplySoftDeleteFilters();
}
public override async Task<int> SaveChangesAsync(CancellationToken cancellationToken = default)
{
var now = SystemClock.Instance.GetCurrentInstant();
foreach (var entry in ChangeTracker.Entries<ModelBase>())
{
switch (entry.State)
{
case EntityState.Added:
entry.Entity.CreatedAt = now;
entry.Entity.UpdatedAt = now;
break;
case EntityState.Modified:
entry.Entity.UpdatedAt = now;
break;
case EntityState.Deleted:
entry.State = EntityState.Modified;
entry.Entity.DeletedAt = now;
break;
case EntityState.Detached:
case EntityState.Unchanged:
default:
break;
}
}
this.ApplyAuditableAndSoftDelete();
return await base.SaveChangesAsync(cancellationToken);
}
}
@@ -150,26 +102,41 @@ public class AppDatabaseRecyclingJob(AppDatabase db, ILogger<AppDatabaseRecyclin
}
}
public class UploadTaskCleanupJob(
public class PersistentTaskCleanupJob(
IServiceProvider serviceProvider,
ILogger<UploadTaskCleanupJob> logger
ILogger<PersistentTaskCleanupJob> logger
) : IJob
{
public async Task Execute(IJobExecutionContext context)
{
logger.LogInformation("Cleaning up stale upload tasks...");
logger.LogInformation("Cleaning up stale persistent tasks...");
// Get the PersistentUploadService from DI
// Get the PersistentTaskService from DI
using var scope = serviceProvider.CreateScope();
var persistentUploadService = scope.ServiceProvider.GetService(typeof(DysonNetwork.Drive.Storage.PersistentUploadService));
var persistentTaskService = scope.ServiceProvider.GetService(typeof(PersistentTaskService));
if (persistentUploadService is DysonNetwork.Drive.Storage.PersistentUploadService service)
if (persistentTaskService is PersistentTaskService service)
{
await service.CleanupStaleTasksAsync();
// Clean up tasks for all users (you might want to add user-specific logic here)
// For now, we'll clean up tasks older than 30 days for all users
var cutoff = SystemClock.Instance.GetCurrentInstant() - Duration.FromDays(30);
var tasksToClean = await service.GetUserTasksAsync(
Guid.Empty, // This would need to be adjusted for multi-user cleanup
status: TaskStatus.Completed | TaskStatus.Failed | TaskStatus.Cancelled | TaskStatus.Expired
);
var cleanedCount = 0;
foreach (var task in tasksToClean.Items.Where(t => t.UpdatedAt < cutoff))
{
await service.CancelTaskAsync(task.TaskId); // Or implement a proper cleanup method
cleanedCount++;
}
logger.LogInformation("Cleaned up {Count} stale persistent tasks", cleanedCount);
}
else
{
logger.LogWarning("PersistentUploadService not found in DI container");
logger.LogWarning("PersistentTaskService not found in DI container");
}
}
}
@@ -187,35 +154,3 @@ public class AppDatabaseFactory : IDesignTimeDbContextFactory<AppDatabase>
return new AppDatabase(optionsBuilder.Options, configuration);
}
}
public static class OptionalQueryExtensions
{
public static IQueryable<T> If<T>(
this IQueryable<T> source,
bool condition,
Func<IQueryable<T>, IQueryable<T>> transform
)
{
return condition ? transform(source) : source;
}
public static IQueryable<T> If<T, TP>(
this IIncludableQueryable<T, TP> source,
bool condition,
Func<IIncludableQueryable<T, TP>, IQueryable<T>> transform
)
where T : class
{
return condition ? transform(source) : source;
}
public static IQueryable<T> If<T, TP>(
this IIncludableQueryable<T, IEnumerable<TP>> source,
bool condition,
Func<IIncludableQueryable<T, IEnumerable<TP>>, IQueryable<T>> transform
)
where T : class
{
return condition ? transform(source) : source;
}
}
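The removed `OptionalQueryExtensions.If` helper applied a query transform only when a condition held, keeping call chains flat instead of branching around the query. A self-contained usage sketch against an in-memory source (the sample data and `evensOnly` flag are made up for illustration):

```csharp
using System;
using System.Linq;

var numbers = new[] { 1, 2, 3, 4, 5 }.AsQueryable();
var evensOnly = true;

// The Where clause is applied only because evensOnly is true.
var result = numbers
    .If(evensOnly, q => q.Where(n => n % 2 == 0))
    .ToArray(); // [2, 4]

public static class OptionalQueryExtensions
{
    // Same shape as the removed helper: apply transform only when condition holds.
    public static IQueryable<T> If<T>(
        this IQueryable<T> source,
        bool condition,
        Func<IQueryable<T>, IQueryable<T>> transform)
        => condition ? transform(source) : source;
}
```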

View File

@@ -1,4 +1,4 @@
FROM mcr.microsoft.com/dotnet/aspnet:9.0 AS base
FROM mcr.microsoft.com/dotnet/aspnet:10.0 AS base
WORKDIR /app
EXPOSE 8080
EXPOSE 8081
@@ -20,7 +20,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
USER $APP_UID
# Stage 2: Build .NET application
FROM mcr.microsoft.com/dotnet/sdk:9.0 AS build
FROM mcr.microsoft.com/dotnet/sdk:10.0 AS build
ARG BUILD_CONFIGURATION=Release
WORKDIR /src
COPY ["DysonNetwork.Drive/DysonNetwork.Drive.csproj", "DysonNetwork.Drive/"]

View File

@@ -1,7 +1,7 @@
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net9.0</TargetFramework>
<TargetFramework>net10.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
@@ -12,22 +12,18 @@
<PackageReference Include="BlurHashSharp.SkiaSharp" Version="1.3.4" />
<PackageReference Include="FFMpegCore" Version="5.4.0" />
<PackageReference Include="Grpc.AspNetCore.Server" Version="2.71.0" />
<PackageReference Include="Microsoft.AspNetCore.OpenApi" Version="9.0.10" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.10">
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
<PackageReference Include="Microsoft.AspNetCore.OpenApi" Version="10.0.0" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.11">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="MimeKit" Version="4.14.0" />
<PackageReference Include="MimeTypes" Version="2.5.2">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Minio" Version="6.0.5" />
<PackageReference Include="Minio" Version="7.0.0" />
<PackageReference Include="Nanoid" Version="3.1.0" />
<PackageReference Include="Nerdbank.GitVersioning" Version="3.8.118">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="NetVips" Version="3.1.0" />
<PackageReference Include="NetVips.Native.linux-x64" Version="8.17.3" />
<PackageReference Include="NetVips.Native.osx-arm64" Version="8.17.3" />
@@ -35,26 +31,14 @@
<PackageReference Include="NodaTime.Serialization.JsonNet" Version="3.2.0" />
<PackageReference Include="NodaTime.Serialization.Protobuf" Version="2.0.2" />
<PackageReference Include="NodaTime.Serialization.SystemTextJson" Version="1.3.0" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="9.0.4" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.Design" Version="1.1.0" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.NodaTime" Version="9.0.4" />
<PackageReference Include="OpenTelemetry.Exporter.OpenTelemetryProtocol" Version="1.13.1" />
<PackageReference Include="OpenTelemetry.Extensions.Hosting" Version="1.13.1" />
<PackageReference Include="OpenTelemetry.Instrumentation.AspNetCore" Version="1.13.0" />
<PackageReference Include="OpenTelemetry.Instrumentation.Http" Version="1.13.0" />
<PackageReference Include="OpenTelemetry.Instrumentation.Runtime" Version="1.13.0" />
<PackageReference Include="Quartz" Version="3.15.1" />
<PackageReference Include="Quartz.AspNetCore" Version="3.15.1" />
<PackageReference Include="Quartz.Extensions.Hosting" Version="3.15.1" />
<PackageReference Include="EFCore.BulkExtensions" Version="9.0.2" />
<PackageReference Include="EFCore.BulkExtensions.PostgreSql" Version="9.0.2" />
<PackageReference Include="EFCore.NamingConventions" Version="9.0.0" />
<!-- Pin SkiaSharp at 2.88.9 because BlurHashSharp requires this specific version -->
<PackageReference Include="SkiaSharp" Version="2.88.9" />
<PackageReference Include="SkiaSharp.NativeAssets.Linux" Version="2.88.9" />
<PackageReference Include="SkiaSharp.NativeAssets.Linux.NoDependencies" Version="2.88.9" />
<PackageReference Include="Swashbuckle.AspNetCore" Version="9.0.6" />
<PackageReference Include="Swashbuckle.AspNetCore.SwaggerUI" Version="9.0.6" />
</ItemGroup>
<ItemGroup>


@@ -0,0 +1,585 @@
using System.ComponentModel.DataAnnotations;
using DysonNetwork.Drive.Storage;
using DysonNetwork.Shared.Auth;
using DysonNetwork.Shared.Http;
using DysonNetwork.Shared.Models;
using DysonNetwork.Shared.Proto;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
namespace DysonNetwork.Drive.Index;
[ApiController]
[Route("/api/index")]
[Authorize]
public class FileIndexController(
FileIndexService fileIndexService,
AppDatabase db,
ILogger<FileIndexController> logger
) : ControllerBase
{
/// <summary>
/// Gets files in a specific path for the current user
/// </summary>
/// <param name="path">The path to browse (defaults to root "/")</param>
/// <param name="query">Optional query to filter files by name</param>
/// <param name="order">The field to order by (date, size, name - defaults to date)</param>
/// <param name="orderDesc">Whether to order in descending order (defaults to true)</param>
/// <returns>List of files in the specified path</returns>
[HttpGet("browse")]
public async Task<IActionResult> BrowseFiles(
[FromQuery] string path = "/",
[FromQuery] string? query = null,
[FromQuery] string order = "date",
[FromQuery] bool orderDesc = true
)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
try
{
var fileIndexes = await fileIndexService.GetByPathAsync(accountId, path);
if (!string.IsNullOrWhiteSpace(query))
{
fileIndexes = fileIndexes
.Where(fi => fi.File.Name.Contains(query, StringComparison.OrdinalIgnoreCase))
.ToList();
}
// Apply sorting
fileIndexes = order.ToLower() switch
{
"name" => orderDesc ? fileIndexes.OrderByDescending(fi => fi.File.Name).ToList()
: fileIndexes.OrderBy(fi => fi.File.Name).ToList(),
"size" => orderDesc ? fileIndexes.OrderByDescending(fi => fi.File.Size).ToList()
: fileIndexes.OrderBy(fi => fi.File.Size).ToList(),
_ => orderDesc ? fileIndexes.OrderByDescending(fi => fi.File.CreatedAt).ToList()
: fileIndexes.OrderBy(fi => fi.File.CreatedAt).ToList()
};
// Get all file indexes for this account to extract child folders
var allFileIndexes = await fileIndexService.GetByAccountIdAsync(accountId);
// Extract unique child folder paths
var childFolders = ExtractChildFolders(allFileIndexes, path);
return Ok(new
{
Path = path,
Files = fileIndexes,
Folders = childFolders,
TotalCount = fileIndexes.Count
});
}
catch (Exception ex)
{
logger.LogError(ex, "Failed to browse files for account {AccountId} at path {Path}", accountId, path);
return new ObjectResult(new ApiError
{
Code = "BROWSE_FAILED",
Message = "Failed to browse files",
Status = 500
}) { StatusCode = 500 };
}
}
/// <summary>
/// Extracts unique child folder paths from all file indexes for a given parent path
/// </summary>
/// <param name="allFileIndexes">All file indexes for the account</param>
/// <param name="parentPath">The parent path to find children for</param>
/// <returns>List of unique child folder names</returns>
private List<string> ExtractChildFolders(List<SnCloudFileIndex> allFileIndexes, string parentPath)
{
var normalizedParentPath = FileIndexService.NormalizePath(parentPath);
var childFolders = new HashSet<string>();
foreach (var index in allFileIndexes)
{
var normalizedIndexPath = FileIndexService.NormalizePath(index.Path);
// Check if this path is a direct child of the parent path
if (normalizedIndexPath.StartsWith(normalizedParentPath) &&
normalizedIndexPath != normalizedParentPath)
{
// Remove the parent path prefix to get the relative path
var relativePath = normalizedIndexPath.Substring(normalizedParentPath.Length);
// Extract the first folder name (direct child)
var firstSlashIndex = relativePath.IndexOf('/');
if (firstSlashIndex > 0)
{
var folderName = relativePath.Substring(0, firstSlashIndex);
childFolders.Add(folderName);
}
}
}
return childFolders.OrderBy(f => f).ToList();
}
/// <summary>
/// Gets all files for the current user (across all paths)
/// </summary>
/// <param name="query">Optional query to filter files by name</param>
/// <param name="order">The field to order by (date, size, name - defaults to date)</param>
/// <param name="orderDesc">Whether to order in descending order (defaults to true)</param>
/// <returns>List of all files for the user</returns>
[HttpGet("all")]
public async Task<IActionResult> GetAllFiles(
[FromQuery] string? query = null,
[FromQuery] string order = "date",
[FromQuery] bool orderDesc = true
)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
try
{
var fileIndexes = await fileIndexService.GetByAccountIdAsync(accountId);
if (!string.IsNullOrWhiteSpace(query))
{
fileIndexes = fileIndexes
.Where(fi => fi.File.Name.Contains(query, StringComparison.OrdinalIgnoreCase))
.ToList();
}
// Apply sorting
fileIndexes = order.ToLower() switch
{
"name" => orderDesc ? fileIndexes.OrderByDescending(fi => fi.File.Name).ToList()
: fileIndexes.OrderBy(fi => fi.File.Name).ToList(),
"size" => orderDesc ? fileIndexes.OrderByDescending(fi => fi.File.Size).ToList()
: fileIndexes.OrderBy(fi => fi.File.Size).ToList(),
_ => orderDesc ? fileIndexes.OrderByDescending(fi => fi.File.CreatedAt).ToList()
: fileIndexes.OrderBy(fi => fi.File.CreatedAt).ToList()
};
return Ok(new
{
Files = fileIndexes,
TotalCount = fileIndexes.Count
});
}
catch (Exception ex)
{
logger.LogError(ex, "Failed to get all files for account {AccountId}", accountId);
return new ObjectResult(new ApiError
{
Code = "GET_ALL_FAILED",
Message = "Failed to get files",
Status = 500
}) { StatusCode = 500 };
}
}
/// <summary>
/// Gets files that have not been indexed for the current user.
/// </summary>
/// <param name="recycled">Whether to show recycled files</param>
/// <param name="offset">The number of files to skip</param>
/// <param name="take">The number of files to return</param>
/// <param name="pool">The pool ID of those files</param>
/// <param name="query">Optional query to filter files by name</param>
/// <param name="order">The field to order by (date, size, name - defaults to date)</param>
/// <param name="orderDesc">Whether to order in descending order (defaults to true)</param>
/// <returns>List of unindexed files</returns>
[HttpGet("unindexed")]
public async Task<IActionResult> GetUnindexedFiles(
[FromQuery] Guid? pool,
[FromQuery] bool recycled = false,
[FromQuery] int offset = 0,
[FromQuery] int take = 20,
[FromQuery] string? query = null,
[FromQuery] string order = "date",
[FromQuery] bool orderDesc = true
)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
try
{
var filesQuery = db.Files
.Where(f => f.AccountId == accountId
&& f.IsMarkedRecycle == recycled
&& !db.FileIndexes.Any(fi => fi.FileId == f.Id && fi.AccountId == accountId)
)
.AsQueryable();
// Apply sorting
filesQuery = order.ToLower() switch
{
"name" => orderDesc ? filesQuery.OrderByDescending(f => f.Name)
: filesQuery.OrderBy(f => f.Name),
"size" => orderDesc ? filesQuery.OrderByDescending(f => f.Size)
: filesQuery.OrderBy(f => f.Size),
_ => orderDesc ? filesQuery.OrderByDescending(f => f.CreatedAt)
: filesQuery.OrderBy(f => f.CreatedAt)
};
if (pool.HasValue) filesQuery = filesQuery.Where(f => f.PoolId == pool);
if (!string.IsNullOrWhiteSpace(query))
{
filesQuery = filesQuery.Where(f => f.Name.Contains(query));
}
var totalCount = await filesQuery.CountAsync();
Response.Headers.Append("X-Total", totalCount.ToString());
var unindexedFiles = await filesQuery
.Skip(offset)
.Take(take)
.ToListAsync();
return Ok(unindexedFiles);
}
catch (Exception ex)
{
logger.LogError(ex, "Failed to get unindexed files for account {AccountId}", accountId);
return new ObjectResult(new ApiError
{
Code = "GET_UNINDEXED_FAILED",
Message = "Failed to get unindexed files",
Status = 500
}) { StatusCode = 500 };
}
}
/// <summary>
/// Moves a file to a new path
/// </summary>
/// <param name="indexId">The file index ID</param>
/// <param name="request">The move request containing the new path</param>
/// <returns>The updated file index</returns>
[HttpPost("move/{indexId}")]
public async Task<IActionResult> MoveFile(Guid indexId, [FromBody] MoveFileRequest request)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
try
{
// Verify ownership
var existingIndex = await db.FileIndexes
.Include(fi => fi.File)
.FirstOrDefaultAsync(fi => fi.Id == indexId && fi.AccountId == accountId);
if (existingIndex == null)
return new ObjectResult(ApiError.NotFound("File index")) { StatusCode = 404 };
var updatedIndex = await fileIndexService.UpdateAsync(indexId, request.NewPath);
if (updatedIndex == null)
return new ObjectResult(ApiError.NotFound("File index")) { StatusCode = 404 };
return Ok(new
{
updatedIndex.FileId,
IndexId = updatedIndex.Id,
OldPath = existingIndex.Path,
NewPath = updatedIndex.Path,
Message = "File moved successfully"
});
}
catch (Exception ex)
{
logger.LogError(ex, "Failed to move file index {IndexId} for account {AccountId}", indexId, accountId);
return new ObjectResult(new ApiError
{
Code = "MOVE_FAILED",
Message = "Failed to move file",
Status = 500
}) { StatusCode = 500 };
}
}
/// <summary>
/// Removes a file index (does not delete the actual file by default)
/// </summary>
/// <param name="indexId">The file index ID</param>
/// <param name="deleteFile">Whether to also delete the actual file data</param>
/// <returns>Success message</returns>
[HttpDelete("remove/{indexId}")]
public async Task<IActionResult> RemoveFileIndex(Guid indexId, [FromQuery] bool deleteFile = false)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
try
{
// Verify ownership
var existingIndex = await db.FileIndexes
.Include(fi => fi.File)
.FirstOrDefaultAsync(fi => fi.Id == indexId && fi.AccountId == accountId);
if (existingIndex == null)
return new ObjectResult(ApiError.NotFound("File index")) { StatusCode = 404 };
var fileId = existingIndex.FileId;
var fileName = existingIndex.File.Name;
var filePath = existingIndex.Path;
// Remove the index
var removed = await fileIndexService.RemoveAsync(indexId);
if (!removed)
return new ObjectResult(ApiError.NotFound("File index")) { StatusCode = 404 };
// If the caller did not request file deletion, we are done
if (!deleteFile)
return Ok(new
{
Message = "File index removed successfully",
FileId = fileId,
FileName = fileName,
Path = filePath,
FileDataDeleted = false
});
try
{
// Check if there are any other indexes for this file
var remainingIndexes = await fileIndexService.GetByFileIdAsync(fileId);
if (remainingIndexes.Count == 0)
{
// No other indexes exist, safe to delete the file
var file = await db.Files.FirstOrDefaultAsync(f => f.Id == fileId);
if (file != null)
{
db.Files.Remove(file);
await db.SaveChangesAsync();
logger.LogInformation("Deleted file {FileId} ({FileName}) as requested", fileId, fileName);
}
}
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to delete file {FileId} while removing index", fileId);
// Continue even if file deletion fails
}
return Ok(new
{
Message = deleteFile
? "File index and file data removed successfully"
: "File index removed successfully",
FileId = fileId,
FileName = fileName,
Path = filePath,
FileDataDeleted = deleteFile
});
}
catch (Exception ex)
{
logger.LogError(ex, "Failed to remove file index {IndexId} for account {AccountId}", indexId, accountId);
return new ObjectResult(new ApiError
{
Code = "REMOVE_FAILED",
Message = "Failed to remove file",
Status = 500
}) { StatusCode = 500 };
}
}
/// <summary>
/// Removes all file indexes in a specific path
/// </summary>
/// <param name="path">The path to clear</param>
/// <param name="deleteFiles">Whether to also delete the actual file data</param>
/// <returns>Success message with count of removed items</returns>
[HttpDelete("clear-path")]
public async Task<IActionResult> ClearPath([FromQuery] string path = "/", [FromQuery] bool deleteFiles = false)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
try
{
// Capture the files in this path before removing their indexes;
// querying after removal would find nothing and skip the orphan cleanup below
var filesInPath = await fileIndexService.GetByPathAsync(accountId, path);
var fileIdsToCheck = filesInPath.Select(fi => fi.FileId).Distinct().ToList();
var removedCount = await fileIndexService.RemoveByPathAsync(accountId, path);
if (!deleteFiles || removedCount <= 0)
return Ok(new
{
Message = deleteFiles
? $"Cleared {removedCount} file indexes from path and deleted orphaned files"
: $"Cleared {removedCount} file indexes from path",
Path = path,
RemovedCount = removedCount,
FilesDeleted = deleteFiles
});
foreach (var fileId in fileIdsToCheck)
{
var remainingIndexes = await fileIndexService.GetByFileIdAsync(fileId);
if (remainingIndexes.Count != 0) continue;
// No other indexes exist, safe to delete the file
var file = await db.Files.FirstOrDefaultAsync(f => f.Id == fileId);
if (file == null) continue;
db.Files.Remove(file);
logger.LogInformation("Deleted orphaned file {FileId} after clearing path {Path}", fileId, path);
}
await db.SaveChangesAsync();
return Ok(new
{
Message = deleteFiles
? $"Cleared {removedCount} file indexes from path and deleted orphaned files"
: $"Cleared {removedCount} file indexes from path",
Path = path,
RemovedCount = removedCount,
FilesDeleted = deleteFiles
});
}
catch (Exception ex)
{
logger.LogError(ex, "Failed to clear path {Path} for account {AccountId}", path, accountId);
return new ObjectResult(new ApiError
{
Code = "CLEAR_PATH_FAILED",
Message = "Failed to clear path",
Status = 500
}) { StatusCode = 500 };
}
}
/// <summary>
/// Creates a new file index (useful for adding existing files to a path)
/// </summary>
/// <param name="request">The create index request</param>
/// <returns>The created file index</returns>
[HttpPost("create")]
public async Task<IActionResult> CreateFileIndex([FromBody] CreateFileIndexRequest request)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
try
{
// Verify the file exists and belongs to the user
var file = await db.Files.FirstOrDefaultAsync(f => f.Id == request.FileId);
if (file == null)
return new ObjectResult(ApiError.NotFound("File")) { StatusCode = 404 };
if (file.AccountId != accountId)
return new ObjectResult(ApiError.Unauthorized(forbidden: true)) { StatusCode = 403 };
// Check if index already exists for this file and path
var existingIndex = await db.FileIndexes
.FirstOrDefaultAsync(fi =>
fi.FileId == request.FileId && fi.Path == request.Path && fi.AccountId == accountId);
if (existingIndex != null)
return new ObjectResult(ApiError.Validation(new Dictionary<string, string[]>
{
{ "fileId", ["File index already exists for this path"] }
})) { StatusCode = 400 };
var fileIndex = await fileIndexService.CreateAsync(request.Path, request.FileId, accountId);
return Ok(new
{
IndexId = fileIndex.Id,
fileIndex.FileId,
fileIndex.Path,
Message = "File index created successfully"
});
}
catch (Exception ex)
{
logger.LogError(ex, "Failed to create file index for file {FileId} at path {Path} for account {AccountId}",
request.FileId, request.Path, accountId);
return new ObjectResult(new ApiError
{
Code = "CREATE_INDEX_FAILED",
Message = "Failed to create file index",
Status = 500
}) { StatusCode = 500 };
}
}
/// <summary>
/// Searches for files by name or metadata
/// </summary>
/// <param name="query">The search query</param>
/// <param name="path">Optional path to limit search to</param>
/// <returns>Matching files</returns>
[HttpGet("search")]
public async Task<IActionResult> SearchFiles([FromQuery] string query, [FromQuery] string? path = null)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
try
{
// Build the query with all conditions at once
var searchTerm = query.ToLower();
var fileIndexes = await db.FileIndexes
.Where(fi => fi.AccountId == accountId)
.Include(fi => fi.File)
.Where(fi =>
(string.IsNullOrEmpty(path) || fi.Path == FileIndexService.NormalizePath(path)) &&
(fi.File.Name.ToLower().Contains(searchTerm) ||
(fi.File.Description != null && fi.File.Description.ToLower().Contains(searchTerm)) ||
(fi.File.MimeType != null && fi.File.MimeType.ToLower().Contains(searchTerm))))
.ToListAsync();
return Ok(new
{
Query = query,
Path = path,
Results = fileIndexes,
TotalCount = fileIndexes.Count
});
}
catch (Exception ex)
{
logger.LogError(ex, "Failed to search files for account {AccountId} with query {Query}", accountId, query);
return new ObjectResult(new ApiError
{
Code = "SEARCH_FAILED",
Message = "Failed to search files",
Status = 500
}) { StatusCode = 500 };
}
}
}
public class MoveFileRequest
{
public string NewPath { get; set; } = null!;
}
public class CreateFileIndexRequest
{
[MaxLength(32)] public string FileId { get; set; } = null!;
public string Path { get; set; } = null!;
}


@@ -0,0 +1,197 @@
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
namespace DysonNetwork.Drive.Index;
public class FileIndexService(AppDatabase db)
{
/// <summary>
/// Creates a new file index entry
/// </summary>
/// <param name="path">The parent folder path with a trailing slash</param>
/// <param name="fileId">The file ID</param>
/// <param name="accountId">The account ID</param>
/// <returns>The created file index</returns>
public async Task<SnCloudFileIndex> CreateAsync(string path, string fileId, Guid accountId)
{
// Ensure a path has a trailing slash and is query-safe
var normalizedPath = NormalizePath(path);
// Check if an index for this file already exists in the same path for this account
var existingFileIndex = await db.FileIndexes
.FirstOrDefaultAsync(fi => fi.AccountId == accountId && fi.Path == normalizedPath && fi.FileId == fileId);
if (existingFileIndex != null)
{
throw new InvalidOperationException(
$"A file with ID '{fileId}' already exists in path '{normalizedPath}' for account '{accountId}'");
}
var fileIndex = new SnCloudFileIndex
{
Path = normalizedPath,
FileId = fileId,
AccountId = accountId
};
db.FileIndexes.Add(fileIndex);
await db.SaveChangesAsync();
return fileIndex;
}
/// <summary>
/// Updates an existing file index entry by removing the old one and creating a new one
/// </summary>
/// <param name="id">The file index ID</param>
/// <param name="newPath">The new parent folder path with trailing slash</param>
/// <returns>The updated file index</returns>
public async Task<SnCloudFileIndex?> UpdateAsync(Guid id, string newPath)
{
var fileIndex = await db.FileIndexes.FindAsync(id);
if (fileIndex == null)
return null;
// Since properties are init-only, we need to remove the old index and create a new one
db.FileIndexes.Remove(fileIndex);
var newFileIndex = new SnCloudFileIndex
{
Path = NormalizePath(newPath),
FileId = fileIndex.FileId,
AccountId = fileIndex.AccountId
};
db.FileIndexes.Add(newFileIndex);
await db.SaveChangesAsync();
return newFileIndex;
}
/// <summary>
/// Removes a file index entry by ID
/// </summary>
/// <param name="id">The file index ID</param>
/// <returns>True if the index was found and removed, false otherwise</returns>
public async Task<bool> RemoveAsync(Guid id)
{
var fileIndex = await db.FileIndexes.FindAsync(id);
if (fileIndex == null)
return false;
db.FileIndexes.Remove(fileIndex);
await db.SaveChangesAsync();
return true;
}
/// <summary>
/// Removes file index entries by file ID
/// </summary>
/// <param name="fileId">The file ID</param>
/// <returns>The number of indexes removed</returns>
public async Task<int> RemoveByFileIdAsync(string fileId)
{
var indexes = await db.FileIndexes
.Where(fi => fi.FileId == fileId)
.ToListAsync();
if (indexes.Count == 0)
return 0;
db.FileIndexes.RemoveRange(indexes);
await db.SaveChangesAsync();
return indexes.Count;
}
/// <summary>
/// Removes file index entries by account ID and path
/// </summary>
/// <param name="accountId">The account ID</param>
/// <param name="path">The parent folder path</param>
/// <returns>The number of indexes removed</returns>
public async Task<int> RemoveByPathAsync(Guid accountId, string path)
{
var normalizedPath = NormalizePath(path);
var indexes = await db.FileIndexes
.Where(fi => fi.AccountId == accountId && fi.Path == normalizedPath)
.ToListAsync();
if (indexes.Count == 0)
return 0;
db.FileIndexes.RemoveRange(indexes);
await db.SaveChangesAsync();
return indexes.Count;
}
/// <summary>
/// Gets file indexes by account ID and path
/// </summary>
/// <param name="accountId">The account ID</param>
/// <param name="path">The parent folder path</param>
/// <returns>List of file indexes</returns>
public async Task<List<SnCloudFileIndex>> GetByPathAsync(Guid accountId, string path)
{
var normalizedPath = NormalizePath(path);
return await db.FileIndexes
.Where(fi => fi.AccountId == accountId && fi.Path == normalizedPath)
.Include(fi => fi.File)
.ToListAsync();
}
/// <summary>
/// Gets file indexes by file ID
/// </summary>
/// <param name="fileId">The file ID</param>
/// <returns>List of file indexes</returns>
public async Task<List<SnCloudFileIndex>> GetByFileIdAsync(string fileId)
{
return await db.FileIndexes
.Where(fi => fi.FileId == fileId)
.Include(fi => fi.File)
.ToListAsync();
}
/// <summary>
/// Gets all file indexes for an account
/// </summary>
/// <param name="accountId">The account ID</param>
/// <returns>List of file indexes</returns>
public async Task<List<SnCloudFileIndex>> GetByAccountIdAsync(Guid accountId)
{
return await db.FileIndexes
.Where(fi => fi.AccountId == accountId)
.Include(fi => fi.File)
.ToListAsync();
}
/// <summary>
/// Normalizes the path to ensure it has a trailing slash and is query-safe
/// </summary>
/// <param name="path">The original path</param>
/// <returns>The normalized path</returns>
public static string NormalizePath(string path)
{
if (string.IsNullOrEmpty(path))
return "/";
// Ensure the path starts with a slash
if (!path.StartsWith('/'))
path = "/" + path;
// Ensure the path ends with a slash (unless it's just the root)
if (path != "/" && !path.EndsWith('/'))
path += "/";
// Make path query-safe by removing problematic characters
// This is a basic implementation - you might want to add more robust validation
path = path.Replace("%", "").Replace("'", "").Replace("\"", "");
return path;
}
}


@@ -0,0 +1,341 @@
# File Indexing System Documentation
## Overview
The File Indexing System provides a hierarchical file organization layer on top of the existing file storage system in DysonNetwork Drive. It allows users to organize their files in folders and paths while maintaining the underlying file storage capabilities.
When calling through the gateway, replace `/api` with `/drive` in the path. The gateway also transforms all argument names into snake_case.
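For example, the browse endpoint described below would be called like this through the gateway (the host is a placeholder for your deployment):

```bash
# Direct service call:  GET /api/index/browse?path=/documents/&orderDesc=true
# Through the gateway:  /api becomes /drive, arguments become snake_case
curl -X GET "https://{gateway-host}/drive/index/browse?path=/documents/&order=date&order_desc=true" \
  -H "Authorization: Bearer {token}"
```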
## Architecture
### Core Components
1. **SnCloudFileIndex Model** - Represents the file-to-path mapping
2. **FileIndexService** - Business logic for file index operations
3. **FileIndexController** - REST API endpoints for file management
4. **FileUploadController Integration** - Automatic index creation during upload
### Database Schema
```sql
-- File Indexes table
CREATE TABLE "FileIndexes" (
"Id" uuid NOT NULL DEFAULT gen_random_uuid(),
"Path" character varying(8192) NOT NULL,
"FileId" character varying(32) NOT NULL,
"AccountId" uuid NOT NULL,
"CreatedAt" timestamp with time zone NOT NULL DEFAULT (now() at time zone 'utc'),
"UpdatedAt" timestamp with time zone NOT NULL DEFAULT (now() at time zone 'utc'),
CONSTRAINT "PK_FileIndexes" PRIMARY KEY ("Id"),
CONSTRAINT "FK_FileIndexes_Files_FileId" FOREIGN KEY ("FileId") REFERENCES "Files" ("Id") ON DELETE CASCADE
);

-- PostgreSQL does not support inline INDEX clauses in CREATE TABLE
CREATE INDEX "IX_FileIndexes_Path_AccountId" ON "FileIndexes" ("Path", "AccountId");
```
## API Endpoints
### Browse Files
**GET** `/api/index/browse?path=/documents/`
Browse files in a specific path.
**Query Parameters:**
- `path` (optional, default: "/") - The path to browse
**Response:**
```json
{
"path": "/documents/",
"files": [
{
"id": "guid",
"path": "/documents/",
"fileId": "guid",
"accountId": "guid",
"createdAt": "2024-01-01T00:00:00Z",
"updatedAt": "2024-01-01T00:00:00Z",
"file": {
"id": "string",
"name": "document.pdf",
"size": 1024,
"mimeType": "application/pdf",
"hash": "sha256-hash",
"uploadedAt": "2024-01-01T00:00:00Z",
"expiredAt": null,
"hasCompression": false,
"hasThumbnail": true,
"isEncrypted": false,
"description": null
}
}
],
"totalCount": 1
}
```
### Get All Files
**GET** `/api/index/all`
Get all files for the current user across all paths.
**Response:**
```json
{
"files": [
// Same structure as browse endpoint
],
"totalCount": 10
}
```
### Move File
**POST** `/api/index/move/{indexId}`
Move a file to a new path.
**Path Parameters:**
- `indexId` - The file index ID
**Request Body:**
```json
{
"newPath": "/archived/"
}
```
**Response:**
```json
{
"fileId": "guid",
"indexId": "guid",
"oldPath": "/documents/",
"newPath": "/archived/",
"message": "File moved successfully"
}
```
### Remove File Index
**DELETE** `/api/index/remove/{indexId}?deleteFile=false`
Remove a file index. Optionally delete the actual file data.
**Path Parameters:**
- `indexId` - The file index ID
**Query Parameters:**
- `deleteFile` (optional, default: false) - Whether to also delete the file data
**Response:**
```json
{
"message": "File index removed successfully",
"fileId": "guid",
"fileName": "document.pdf",
"path": "/documents/",
"fileDataDeleted": false
}
```
### Clear Path
**DELETE** `/api/index/clear-path?path=/temp/&deleteFiles=false`
Remove all file indexes in a specific path.
**Query Parameters:**
- `path` (optional, default: "/") - The path to clear
- `deleteFiles` (optional, default: false) - Whether to also delete orphaned files
**Response:**
```json
{
"message": "Cleared 5 file indexes from path",
"path": "/temp/",
"removedCount": 5,
"filesDeleted": false
}
```
### Create File Index
**POST** `/api/index/create`
Create a new file index for an existing file.
**Request Body:**
```json
{
"fileId": "guid",
"path": "/documents/"
}
```
**Response:**
```json
{
"indexId": "guid",
"fileId": "guid",
"path": "/documents/",
"message": "File index created successfully"
}
```
### Search Files
**GET** `/api/index/search?query=report&path=/documents/`
Search for files by name or metadata.
**Query Parameters:**
- `query` (required) - The search query
- `path` (optional) - Limit search to specific path
**Response:**
```json
{
"query": "report",
"path": "/documents/",
"results": [
// Same structure as browse endpoint
],
"totalCount": 3
}
```
## Path Normalization
The system automatically normalizes paths to ensure consistency:
- **Trailing Slash**: All paths end with `/`
- **Root Path**: User home folder is represented as `/`
- **Query Safety**: Paths are validated to avoid SQL injection
- **Examples**:
  - `/documents/` ✅ (already normalized)
  - `/documents` → `/documents/` (normalized)
  - `/documents/reports/` ✅ (already normalized)
  - `/documents/reports` → `/documents/reports/` (normalized)
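The rules above can be sketched with the `FileIndexService.NormalizePath` helper introduced in this change set:

```csharp
// NormalizePath guarantees a leading and a trailing slash, and strips
// query-unsafe characters (%, ', ") from the path.
FileIndexService.NormalizePath("");                    // "/"
FileIndexService.NormalizePath("documents");           // "/documents/"
FileIndexService.NormalizePath("/documents");          // "/documents/"
FileIndexService.NormalizePath("/documents/reports/"); // "/documents/reports/"
```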
## File Upload Integration
When uploading files with the `FileUploadController`, you can specify a path to automatically create file indexes:
**Create Upload Task Request:**
```json
{
"fileName": "document.pdf",
"fileSize": 1024,
"contentType": "application/pdf",
"hash": "sha256-hash",
"path": "/documents/"
}
```
When the new `path` field is provided, the system automatically creates a file index once the upload completes successfully.
## Service Methods
### FileIndexService
```csharp
public class FileIndexService
{
// Create a new file index
Task<SnCloudFileIndex> CreateAsync(string path, string fileId, Guid accountId);
// Get files by path
Task<List<SnCloudFileIndex>> GetByPathAsync(Guid accountId, string path);
// Get all files for account
Task<List<SnCloudFileIndex>> GetByAccountIdAsync(Guid accountId);
// Get indexes for specific file
Task<List<SnCloudFileIndex>> GetByFileIdAsync(string fileId);
// Move file to new path
Task<SnCloudFileIndex?> UpdateAsync(Guid indexId, string newPath);
// Remove file index
Task<bool> RemoveAsync(Guid indexId);
// Remove all indexes for a file
Task<int> RemoveByFileIdAsync(string fileId);
// Remove all indexes in path
Task<int> RemoveByPathAsync(Guid accountId, string path);
// Normalize path format
public static string NormalizePath(string path);
}
```
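A minimal sketch of using the service from application code, assuming the project's usual scoped DI registration (the registration line is an assumption, not part of this change set):

```csharp
// Program.cs (assumed registration)
builder.Services.AddScoped<FileIndexService>();

// In a handler: place an existing file under /documents/ for an account
var index = await fileIndexService.CreateAsync("/documents/", fileId, accountId);
// CreateAsync normalizes the path and throws InvalidOperationException
// if an index for this file already exists at that path.
```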
## Error Handling
The API returns appropriate HTTP status codes and error messages:
- **400 Bad Request**: Invalid input parameters
- **401 Unauthorized**: User not authenticated
- **403 Forbidden**: User lacks permission
- **404 Not Found**: Resource not found
- **500 Internal Server Error**: Server-side error
**Error Response Format:**
```json
{
"code": "BROWSE_FAILED",
"message": "Failed to browse files",
"status": 500
}
```
## Security Considerations
1. **Ownership Verification**: All operations verify that the user owns the file indexes
2. **Path Validation**: Paths are normalized and validated
3. **Cascade Deletion**: File indexes are automatically removed when files are deleted
4. **Safe File Deletion**: Files are only deleted when no other indexes reference them
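Point 4 is enforced by checking the remaining indexes before touching file data; in sketch form, mirroring the controller logic in this change set:

```csharp
// Delete the underlying file only when no index references it anymore
var remainingIndexes = await fileIndexService.GetByFileIdAsync(fileId);
if (remainingIndexes.Count == 0)
{
    var file = await db.Files.FirstOrDefaultAsync(f => f.Id == fileId);
    if (file != null)
    {
        db.Files.Remove(file);
        await db.SaveChangesAsync();
    }
}
```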
## Usage Examples
### Upload File to Specific Path
```bash
# Create upload task with path
curl -X POST /api/files/upload/create \
-H "Authorization: Bearer {token}" \
-H "Content-Type: application/json" \
-d '{
"fileName": "report.pdf",
"fileSize": 2048,
"contentType": "application/pdf",
"path": "/documents/reports/"
}'
```
### Browse Files
```bash
curl -X GET "/api/index/browse?path=/documents/reports/" \
-H "Authorization: Bearer {token}"
```
### Move File
```bash
curl -X POST "/api/index/move/{indexId}" \
-H "Authorization: Bearer {token}" \
-H "Content-Type: application/json" \
-d '{"newPath": "/archived/"}'
```
### Search Files
```bash
curl -X GET "/api/index/search?query=invoice&path=/documents/" \
-H "Authorization: Bearer {token}"
```
## Best Practices
1. **Use Trailing Slashes**: Always include trailing slashes in paths
2. **Organize Hierarchically**: Use meaningful folder structures
3. **Search Efficiently**: Use the search endpoint instead of client-side filtering
4. **Clean Up**: Use the clear-path endpoint for temporary directories
5. **Monitor Usage**: Check total file counts for quota management
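The "Clean Up" practice might look like the following. The `/api/index/clear-path` route and the `DELETE` verb are assumptions based on the endpoint name mentioned above (only `browse`, `move`, and `search` routes are shown in this document), so verify the actual route before relying on it:

```bash
# Assumed route: a clear-path endpoint that removes every index under a
# temporary directory. Check the exact path and verb in your deployment.
curl -X DELETE "/api/index/clear-path?path=/tmp-uploads/" \
  -H "Authorization: Bearer {token}"
```

Because file data is only deleted when no other index references it, clearing a temporary path is safe even if some of its files are also indexed elsewhere.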
## Integration Notes
- The file indexing system works alongside the existing file storage
- Files can exist in multiple paths (hard links)
- File deletion is optional and only removes data when safe
- The system maintains referential integrity between files and indexes

View File

@@ -0,0 +1,632 @@
// <auto-generated />
using System;
using System.Collections.Generic;
using DysonNetwork.Drive;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace DysonNetwork.Drive.Migrations
{
[DbContext(typeof(AppDatabase))]
[Migration("20251112135535_AddFileIndex")]
partial class AddFileIndex
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.10")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("DysonNetwork.Drive.Billing.QuotaRecord", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<string>("Description")
.IsRequired()
.HasColumnType("text")
.HasColumnName("description");
b.Property<Instant?>("ExpiredAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("expired_at");
b.Property<string>("Name")
.IsRequired()
.HasColumnType("text")
.HasColumnName("name");
b.Property<long>("Quota")
.HasColumnType("bigint")
.HasColumnName("quota");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_quota_records");
b.ToTable("quota_records", (string)null);
});
modelBuilder.Entity("DysonNetwork.Drive.Storage.Model.PersistentTask", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<Instant?>("CompletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("completed_at");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<string>("Description")
.HasMaxLength(1024)
.HasColumnType("character varying(1024)")
.HasColumnName("description");
b.Property<string>("Discriminator")
.IsRequired()
.HasMaxLength(21)
.HasColumnType("character varying(21)")
.HasColumnName("discriminator");
b.Property<string>("ErrorMessage")
.HasMaxLength(1024)
.HasColumnType("character varying(1024)")
.HasColumnName("error_message");
b.Property<long?>("EstimatedDurationSeconds")
.HasColumnType("bigint")
.HasColumnName("estimated_duration_seconds");
b.Property<Instant?>("ExpiredAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("expired_at");
b.Property<Instant>("LastActivity")
.HasColumnType("timestamp with time zone")
.HasColumnName("last_activity");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("character varying(256)")
.HasColumnName("name");
b.Property<Dictionary<string, object>>("Parameters")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("parameters");
b.Property<int>("Priority")
.HasColumnType("integer")
.HasColumnName("priority");
b.Property<double>("Progress")
.HasColumnType("double precision")
.HasColumnName("progress");
b.Property<Dictionary<string, object>>("Results")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("results");
b.Property<Instant?>("StartedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("started_at");
b.Property<int>("Status")
.HasColumnType("integer")
.HasColumnName("status");
b.Property<string>("TaskId")
.IsRequired()
.HasMaxLength(64)
.HasColumnType("character varying(64)")
.HasColumnName("task_id");
b.Property<int>("Type")
.HasColumnType("integer")
.HasColumnName("type");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_tasks");
b.ToTable("tasks", (string)null);
b.HasDiscriminator().HasValue("PersistentTask");
b.UseTphMappingStrategy();
});
modelBuilder.Entity("DysonNetwork.Shared.Models.CloudFileReference", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<Instant?>("ExpiredAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("expired_at");
b.Property<string>("FileId")
.IsRequired()
.HasMaxLength(32)
.HasColumnType("character varying(32)")
.HasColumnName("file_id");
b.Property<string>("ResourceId")
.IsRequired()
.HasMaxLength(1024)
.HasColumnType("character varying(1024)")
.HasColumnName("resource_id");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.Property<string>("Usage")
.IsRequired()
.HasMaxLength(1024)
.HasColumnType("character varying(1024)")
.HasColumnName("usage");
b.HasKey("Id")
.HasName("pk_file_references");
b.HasIndex("FileId")
.HasDatabaseName("ix_file_references_file_id");
b.ToTable("file_references", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.FilePool", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid?>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<BillingConfig>("BillingConfig")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("billing_config");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<string>("Description")
.IsRequired()
.HasMaxLength(8192)
.HasColumnType("character varying(8192)")
.HasColumnName("description");
b.Property<bool>("IsHidden")
.HasColumnType("boolean")
.HasColumnName("is_hidden");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(1024)
.HasColumnType("character varying(1024)")
.HasColumnName("name");
b.Property<PolicyConfig>("PolicyConfig")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("policy_config");
b.Property<RemoteStorageConfig>("StorageConfig")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("storage_config");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_pools");
b.ToTable("pools", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnCloudFile", b =>
{
b.Property<string>("Id")
.HasMaxLength(32)
.HasColumnType("character varying(32)")
.HasColumnName("id");
b.Property<Guid>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<Guid?>("BundleId")
.HasColumnType("uuid")
.HasColumnName("bundle_id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<string>("Description")
.HasMaxLength(4096)
.HasColumnType("character varying(4096)")
.HasColumnName("description");
b.Property<Instant?>("ExpiredAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("expired_at");
b.Property<Dictionary<string, object>>("FileMeta")
.HasColumnType("jsonb")
.HasColumnName("file_meta");
b.Property<bool>("HasCompression")
.HasColumnType("boolean")
.HasColumnName("has_compression");
b.Property<bool>("HasThumbnail")
.HasColumnType("boolean")
.HasColumnName("has_thumbnail");
b.Property<string>("Hash")
.HasMaxLength(256)
.HasColumnType("character varying(256)")
.HasColumnName("hash");
b.Property<bool>("IsEncrypted")
.HasColumnType("boolean")
.HasColumnName("is_encrypted");
b.Property<bool>("IsMarkedRecycle")
.HasColumnType("boolean")
.HasColumnName("is_marked_recycle");
b.Property<string>("MimeType")
.HasMaxLength(256)
.HasColumnType("character varying(256)")
.HasColumnName("mime_type");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(1024)
.HasColumnType("character varying(1024)")
.HasColumnName("name");
b.Property<Guid?>("PoolId")
.HasColumnType("uuid")
.HasColumnName("pool_id");
b.Property<List<ContentSensitiveMark>>("SensitiveMarks")
.HasColumnType("jsonb")
.HasColumnName("sensitive_marks");
b.Property<long>("Size")
.HasColumnType("bigint")
.HasColumnName("size");
b.Property<string>("StorageId")
.HasMaxLength(32)
.HasColumnType("character varying(32)")
.HasColumnName("storage_id");
b.Property<string>("StorageUrl")
.HasMaxLength(4096)
.HasColumnType("character varying(4096)")
.HasColumnName("storage_url");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.Property<Instant?>("UploadedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("uploaded_at");
b.Property<Dictionary<string, object>>("UserMeta")
.HasColumnType("jsonb")
.HasColumnName("user_meta");
b.HasKey("Id")
.HasName("pk_files");
b.HasIndex("BundleId")
.HasDatabaseName("ix_files_bundle_id");
b.HasIndex("PoolId")
.HasDatabaseName("ix_files_pool_id");
b.ToTable("files", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnCloudFileIndex", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<string>("FileId")
.IsRequired()
.HasMaxLength(32)
.HasColumnType("character varying(32)")
.HasColumnName("file_id");
b.Property<string>("Path")
.IsRequired()
.HasMaxLength(8192)
.HasColumnType("character varying(8192)")
.HasColumnName("path");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_file_indexes");
b.HasIndex("FileId")
.HasDatabaseName("ix_file_indexes_file_id");
b.HasIndex("Path", "AccountId")
.HasDatabaseName("ix_file_indexes_path_account_id");
b.ToTable("file_indexes", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnFileBundle", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<string>("Description")
.HasMaxLength(8192)
.HasColumnType("character varying(8192)")
.HasColumnName("description");
b.Property<Instant?>("ExpiredAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("expired_at");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(1024)
.HasColumnType("character varying(1024)")
.HasColumnName("name");
b.Property<string>("Passcode")
.HasMaxLength(256)
.HasColumnType("character varying(256)")
.HasColumnName("passcode");
b.Property<string>("Slug")
.IsRequired()
.HasMaxLength(1024)
.HasColumnType("character varying(1024)")
.HasColumnName("slug");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_bundles");
b.HasIndex("Slug")
.IsUnique()
.HasDatabaseName("ix_bundles_slug");
b.ToTable("bundles", (string)null);
});
modelBuilder.Entity("DysonNetwork.Drive.Storage.Model.PersistentUploadTask", b =>
{
b.HasBaseType("DysonNetwork.Drive.Storage.Model.PersistentTask");
b.Property<Guid?>("BundleId")
.HasColumnType("uuid")
.HasColumnName("bundle_id");
b.Property<long>("ChunkSize")
.HasColumnType("bigint")
.HasColumnName("chunk_size");
b.Property<int>("ChunksCount")
.HasColumnType("integer")
.HasColumnName("chunks_count");
b.Property<int>("ChunksUploaded")
.HasColumnType("integer")
.HasColumnName("chunks_uploaded");
b.Property<string>("ContentType")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("character varying(128)")
.HasColumnName("content_type");
b.Property<string>("EncryptPassword")
.HasMaxLength(256)
.HasColumnType("character varying(256)")
.HasColumnName("encrypt_password");
b.Property<string>("FileName")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("character varying(256)")
.HasColumnName("file_name");
b.Property<long>("FileSize")
.HasColumnType("bigint")
.HasColumnName("file_size");
b.Property<string>("Hash")
.IsRequired()
.HasColumnType("text")
.HasColumnName("hash");
b.Property<string>("Path")
.HasColumnType("text")
.HasColumnName("path");
b.Property<Guid>("PoolId")
.HasColumnType("uuid")
.HasColumnName("pool_id");
b.PrimitiveCollection<List<int>>("UploadedChunks")
.IsRequired()
.HasColumnType("integer[]")
.HasColumnName("uploaded_chunks");
b.HasDiscriminator().HasValue("PersistentUploadTask");
});
modelBuilder.Entity("DysonNetwork.Shared.Models.CloudFileReference", b =>
{
b.HasOne("DysonNetwork.Shared.Models.SnCloudFile", "File")
.WithMany("References")
.HasForeignKey("FileId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired()
.HasConstraintName("fk_file_references_files_file_id");
b.Navigation("File");
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnCloudFile", b =>
{
b.HasOne("DysonNetwork.Shared.Models.SnFileBundle", "Bundle")
.WithMany("Files")
.HasForeignKey("BundleId")
.HasConstraintName("fk_files_bundles_bundle_id");
b.HasOne("DysonNetwork.Shared.Models.FilePool", "Pool")
.WithMany()
.HasForeignKey("PoolId")
.HasConstraintName("fk_files_pools_pool_id");
b.Navigation("Bundle");
b.Navigation("Pool");
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnCloudFileIndex", b =>
{
b.HasOne("DysonNetwork.Shared.Models.SnCloudFile", "File")
.WithMany("FileIndexes")
.HasForeignKey("FileId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired()
.HasConstraintName("fk_file_indexes_files_file_id");
b.Navigation("File");
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnCloudFile", b =>
{
b.Navigation("FileIndexes");
b.Navigation("References");
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnFileBundle", b =>
{
b.Navigation("Files");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,66 @@
using System;
using Microsoft.EntityFrameworkCore.Migrations;
using NodaTime;
#nullable disable
namespace DysonNetwork.Drive.Migrations
{
/// <inheritdoc />
public partial class AddFileIndex : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.AddColumn<string>(
name: "path",
table: "tasks",
type: "text",
nullable: true);
migrationBuilder.CreateTable(
name: "file_indexes",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
path = table.Column<string>(type: "character varying(8192)", maxLength: 8192, nullable: false),
file_id = table.Column<string>(type: "character varying(32)", maxLength: 32, nullable: false),
account_id = table.Column<Guid>(type: "uuid", nullable: false),
created_at = table.Column<Instant>(type: "timestamp with time zone", nullable: false),
updated_at = table.Column<Instant>(type: "timestamp with time zone", nullable: false),
deleted_at = table.Column<Instant>(type: "timestamp with time zone", nullable: true)
},
constraints: table =>
{
table.PrimaryKey("pk_file_indexes", x => x.id);
table.ForeignKey(
name: "fk_file_indexes_files_file_id",
column: x => x.file_id,
principalTable: "files",
principalColumn: "id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateIndex(
name: "ix_file_indexes_file_id",
table: "file_indexes",
column: "file_id");
migrationBuilder.CreateIndex(
name: "ix_file_indexes_path_account_id",
table: "file_indexes",
columns: new[] { "path", "account_id" });
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable(
name: "file_indexes");
migrationBuilder.DropColumn(
name: "path",
table: "tasks");
}
}
}

View File

@@ -179,7 +179,7 @@ namespace DysonNetwork.Drive.Migrations
b.UseTphMappingStrategy();
});
modelBuilder.Entity("DysonNetwork.Shared.Models.CloudFileReference", b =>
modelBuilder.Entity("DysonNetwork.Shared.Models.SnCloudFileReference", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
@@ -403,6 +403,53 @@ namespace DysonNetwork.Drive.Migrations
b.ToTable("files", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnCloudFileIndex", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<string>("FileId")
.IsRequired()
.HasMaxLength(32)
.HasColumnType("character varying(32)")
.HasColumnName("file_id");
b.Property<string>("Path")
.IsRequired()
.HasMaxLength(8192)
.HasColumnType("character varying(8192)")
.HasColumnName("path");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_file_indexes");
b.HasIndex("FileId")
.HasDatabaseName("ix_file_indexes_file_id");
b.HasIndex("Path", "AccountId")
.HasDatabaseName("ix_file_indexes_path_account_id");
b.ToTable("file_indexes", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnFileBundle", b =>
{
b.Property<Guid>("Id")
@@ -508,6 +555,10 @@ namespace DysonNetwork.Drive.Migrations
.HasColumnType("text")
.HasColumnName("hash");
b.Property<string>("Path")
.HasColumnType("text")
.HasColumnName("path");
b.Property<Guid>("PoolId")
.HasColumnType("uuid")
.HasColumnName("pool_id");
@@ -520,7 +571,7 @@ namespace DysonNetwork.Drive.Migrations
b.HasDiscriminator().HasValue("PersistentUploadTask");
});
modelBuilder.Entity("DysonNetwork.Shared.Models.CloudFileReference", b =>
modelBuilder.Entity("DysonNetwork.Shared.Models.SnCloudFileReference", b =>
{
b.HasOne("DysonNetwork.Shared.Models.SnCloudFile", "File")
.WithMany("References")
@@ -549,8 +600,22 @@ namespace DysonNetwork.Drive.Migrations
b.Navigation("Pool");
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnCloudFileIndex", b =>
{
b.HasOne("DysonNetwork.Shared.Models.SnCloudFile", "File")
.WithMany("FileIndexes")
.HasForeignKey("FileId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired()
.HasConstraintName("fk_file_indexes_files_file_id");
b.Navigation("File");
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnCloudFile", b =>
{
b.Navigation("FileIndexes");
b.Navigation("References");
});

View File

@@ -7,7 +7,9 @@ using Microsoft.EntityFrameworkCore;
var builder = WebApplication.CreateBuilder(args);
builder.AddServiceDefaults();
builder.AddServiceDefaults("drive");
builder.Services.Configure<ServiceRegistrationOptions>(opts => { opts.Name = "drive"; });
// Configure Kestrel and server options
builder.ConfigureAppKestrel(builder.Configuration, maxRequestBodySize: long.MaxValue);
@@ -17,7 +19,6 @@ builder.ConfigureAppKestrel(builder.Configuration, maxRequestBodySize: long.MaxV
builder.Services.AddAppServices(builder.Configuration);
builder.Services.AddAppAuthentication();
builder.Services.AddDysonAuth();
builder.Services.AddAccountService();
builder.Services.AddAppFlushHandlers();
builder.Services.AddAppBusinessServices();

View File

@@ -1,5 +1,7 @@
using System.Text.Json;
using DysonNetwork.Drive.Storage;
using DysonNetwork.Drive.Storage.Model;
using DysonNetwork.Shared.Models;
using DysonNetwork.Shared.Proto;
using DysonNetwork.Shared.Stream;
using FFMpegCore;
@@ -29,16 +31,15 @@ public class BroadcastEventHandler(
[".gif", ".apng", ".avif"];
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
var js = nats.CreateJetStreamContext();
await js.EnsureStreamCreated("account_events", [AccountDeletedEvent.Type]);
var accountEventConsumer = await js.CreateOrUpdateConsumerAsync("account_events",
new ConsumerConfig("drive_account_deleted_handler"), cancellationToken: stoppingToken);
await js.EnsureStreamCreated("file_events", [FileUploadedEvent.Type]);
var fileUploadedConsumer = await js.CreateOrUpdateConsumerAsync("file_events",
new ConsumerConfig("drive_file_uploaded_handler") { MaxDeliver = 3 }, cancellationToken: stoppingToken);
@@ -53,13 +54,14 @@ public class BroadcastEventHandler(
{
await foreach (var msg in consumer.ConsumeAsync<byte[]>(cancellationToken: stoppingToken))
{
var payload = JsonSerializer.Deserialize<FileUploadedEventPayload>(msg.Data, GrpcTypeHelper.SerializerOptions);
var payload =
JsonSerializer.Deserialize<FileUploadedEventPayload>(msg.Data, GrpcTypeHelper.SerializerOptions);
if (payload == null)
{
await msg.AckAsync(cancellationToken: stoppingToken);
continue;
}
try
{
await ProcessAndUploadInBackgroundAsync(
@@ -129,8 +131,8 @@ public class BroadcastEventHandler(
}
}
}
private async Task ProcessAndUploadInBackgroundAsync(
private async Task ProcessAndUploadInBackgroundAsync(
string fileId,
Guid remoteId,
string storageId,
@@ -142,6 +144,7 @@ public class BroadcastEventHandler(
using var scope = serviceProvider.CreateScope();
var fs = scope.ServiceProvider.GetRequiredService<FileService>();
var scopedDb = scope.ServiceProvider.GetRequiredService<AppDatabase>();
var persistentTaskService = scope.ServiceProvider.GetRequiredService<PersistentTaskService>();
var pool = await fs.GetPoolAsync(remoteId);
if (pool is null) return;
@@ -155,6 +158,11 @@ public class BroadcastEventHandler(
var fileToUpdate = await scopedDb.Files.AsNoTracking().FirstAsync(f => f.Id == fileId);
// Find the upload task associated with this file
var uploadTask = await scopedDb.Tasks
.OfType<PersistentUploadTask>()
.FirstOrDefaultAsync(t => t.FileName == fileToUpdate.Name && t.FileSize == fileToUpdate.Size);
if (fileToUpdate.IsEncrypted)
{
uploads.Add((processingFilePath, string.Empty, contentType, false));
@@ -293,5 +301,51 @@ public class BroadcastEventHandler(
}
await fs._PurgeCacheAsync(fileId);
// Complete the upload task if found
if (uploadTask != null)
{
await persistentTaskService.MarkTaskCompletedAsync(uploadTask.TaskId, new Dictionary<string, object?>
{
{ "FileId", fileId },
{ "FileName", fileToUpdate.Name },
{ "FileInfo", fileToUpdate },
{ "FileSize", fileToUpdate.Size },
{ "MimeType", newMimeType },
{ "HasCompression", hasCompression },
{ "HasThumbnail", hasThumbnail }
});
// Send push notification for large files (>5MB) that took longer to process
if (fileToUpdate.Size > 5 * 1024 * 1024) // 5MB threshold
await SendLargeFileProcessingCompleteNotificationAsync(uploadTask, fileToUpdate);
}
}
}
private async Task SendLargeFileProcessingCompleteNotificationAsync(PersistentUploadTask task, SnCloudFile file)
{
try
{
var ringService = serviceProvider.GetRequiredService<RingService.RingServiceClient>();
var pushNotification = new PushNotification
{
Topic = "drive.tasks.upload",
Title = "File Processing Complete",
Subtitle = file.Name,
Body = $"Your file '{file.Name}' has finished processing and is now available.",
IsSavable = true
};
await ringService.SendPushNotificationToUserAsync(new SendPushNotificationToUserRequest
{
UserId = task.AccountId.ToString(),
Notification = pushNotification
});
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to send large file processing notification for task {TaskId}", task.TaskId);
}
}
}

View File

@@ -22,6 +22,13 @@ public static class ScheduledJobsConfiguration
.ForJob(cloudFileUnusedRecyclingJob)
.WithIdentity("CloudFileUnusedRecyclingTrigger")
.WithCronSchedule("0 0 0 * * ?"));
var persistentTaskCleanupJob = new JobKey("PersistentTaskCleanup");
q.AddJob<PersistentTaskCleanupJob>(opts => opts.WithIdentity(persistentTaskCleanupJob));
q.AddTrigger(opts => opts
.ForJob(persistentTaskCleanupJob)
.WithIdentity("PersistentTaskCleanupTrigger")
.WithCronSchedule("0 0 2 * * ?")); // Run daily at 2 AM
});
services.AddQuartzHostedService(q => q.WaitForJobsToComplete = true);

View File

@@ -1,5 +1,6 @@
using System.Text.Json;
using System.Text.Json.Serialization;
using DysonNetwork.Drive.Index;
using DysonNetwork.Shared.Cache;
using NodaTime;
using NodaTime.Serialization.SystemTextJson;
@@ -11,9 +12,7 @@ public static class ServiceCollectionExtensions
public static IServiceCollection AddAppServices(this IServiceCollection services, IConfiguration configuration)
{
services.AddDbContext<AppDatabase>(); // Assuming you'll have an AppDatabase
services.AddSingleton<IClock>(SystemClock.Instance);
services.AddHttpContextAccessor();
services.AddSingleton<ICacheService, CacheServiceRedis>(); // Uncomment if you have CacheServiceRedis
services.AddHttpClient();
@@ -55,11 +54,13 @@ public static class ServiceCollectionExtensions
{
services.AddScoped<Storage.FileService>();
services.AddScoped<Storage.FileReferenceService>();
services.AddScoped<Storage.PersistentTaskService>();
services.AddScoped<FileIndexService>();
services.AddScoped<Billing.UsageService>();
services.AddScoped<Billing.QuotaService>();
services.AddHostedService<BroadcastEventHandler>();
return services;
}
}
}

View File

@@ -14,7 +14,7 @@ public class CloudFileUnusedRecyclingJob(
public async Task Execute(IJobExecutionContext context)
{
logger.LogInformation("Cleaning tus cloud files...");
var storePath = configuration["Tus:StorePath"];
var storePath = configuration["Storage:Uploads"];
if (Directory.Exists(storePath))
{
var oneHourAgo = SystemClock.Instance.GetCurrentInstant() - Duration.FromHours(1);
@@ -39,6 +39,7 @@ public class CloudFileUnusedRecyclingJob(
var processedCount = 0;
var markedCount = 0;
var totalFiles = await db.Files
.Where(f => f.FileIndexes.Count == 0)
.Where(f => f.PoolId.HasValue && recyclablePools.Contains(f.PoolId.Value))
.Where(f => !f.IsMarkedRecycle)
.CountAsync();

View File

@@ -1,4 +1,3 @@
using DysonNetwork.Drive.Billing;
using DysonNetwork.Shared.Auth;
using DysonNetwork.Shared.Models;
using DysonNetwork.Shared.Proto;
@@ -14,9 +13,9 @@ namespace DysonNetwork.Drive.Storage;
public class FileController(
AppDatabase db,
FileService fs,
QuotaService qs,
IConfiguration configuration,
IWebHostEnvironment env
IWebHostEnvironment env,
FileReferenceService fileReferenceService
) : ControllerBase
{
[HttpGet("{id}")]
@@ -63,30 +62,31 @@ public class FileController(
return null;
}
private async Task<ActionResult> ServeLocalFile(SnCloudFile file)
private Task<ActionResult> ServeLocalFile(SnCloudFile file)
{
// Try temp storage first
var tempFilePath = Path.Combine(Path.GetTempPath(), file.Id);
if (System.IO.File.Exists(tempFilePath))
{
if (file.IsEncrypted)
return StatusCode(StatusCodes.Status403Forbidden, "Encrypted files cannot be accessed before they are processed and stored.");
return Task.FromResult<ActionResult>(StatusCode(StatusCodes.Status403Forbidden,
"Encrypted files cannot be accessed before they are processed and stored."));
return PhysicalFile(tempFilePath, file.MimeType ?? "application/octet-stream", file.Name, enableRangeProcessing: true);
return Task.FromResult<ActionResult>(PhysicalFile(tempFilePath, file.MimeType ?? "application/octet-stream",
file.Name, enableRangeProcessing: true));
}
// Fallback for tus uploads
var tusStorePath = configuration.GetValue<string>("Tus:StorePath");
if (!string.IsNullOrEmpty(tusStorePath))
{
var tusFilePath = Path.Combine(env.ContentRootPath, tusStorePath, file.Id);
if (System.IO.File.Exists(tusFilePath))
{
return PhysicalFile(tusFilePath, file.MimeType ?? "application/octet-stream", file.Name, enableRangeProcessing: true);
}
}
return StatusCode(StatusCodes.Status400BadRequest, "File is being processed. Please try again later.");
var tusStorePath = configuration.GetValue<string>("Storage:Uploads");
if (string.IsNullOrEmpty(tusStorePath))
return Task.FromResult<ActionResult>(StatusCode(StatusCodes.Status400BadRequest,
"File is being processed. Please try again later."));
var tusFilePath = Path.Combine(env.ContentRootPath, tusStorePath, file.Id);
return System.IO.File.Exists(tusFilePath)
? Task.FromResult<ActionResult>(PhysicalFile(tusFilePath, file.MimeType ?? "application/octet-stream",
file.Name, enableRangeProcessing: true))
: Task.FromResult<ActionResult>(StatusCode(StatusCodes.Status400BadRequest,
"File is being processed. Please try again later."));
}
private async Task<ActionResult> ServeRemoteFile(
@@ -99,7 +99,8 @@ public class FileController(
)
{
if (!file.PoolId.HasValue)
return StatusCode(StatusCodes.Status500InternalServerError, "File is in an inconsistent state: uploaded but no pool ID.");
return StatusCode(StatusCodes.Status500InternalServerError,
"File is in an inconsistent state: uploaded but no pool ID.");
var pool = await fs.GetPoolAsync(file.PoolId.Value);
if (pool is null)
@@ -148,15 +149,10 @@ public class FileController(
return Redirect(BuildProxyUrl(dest.ImageProxy, fileName));
}
if (dest.AccessProxy is not null)
{
return Redirect(BuildProxyUrl(dest.AccessProxy, fileName));
}
return null;
return dest.AccessProxy is not null ? Redirect(BuildProxyUrl(dest.AccessProxy, fileName)) : null;
}
private string BuildProxyUrl(string proxyUrl, string fileName)
private static string BuildProxyUrl(string proxyUrl, string fileName)
{
var baseUri = new Uri(proxyUrl.EndsWith('/') ? proxyUrl : $"{proxyUrl}/");
var fullUri = new Uri(baseUri, fileName);
@@ -189,7 +185,7 @@ public class FileController(
return Redirect(openUrl);
}
private Dictionary<string, string> BuildSignedUrlHeaders(
private static Dictionary<string, string> BuildSignedUrlHeaders(
SnCloudFile file,
string? fileExtension,
string? overrideMimeType,
@@ -234,6 +230,21 @@ public class FileController(
return file;
}
[HttpGet("{id}/references")]
public async Task<ActionResult<List<Shared.Models.SnCloudFileReference>>> GetFileReferences(string id)
{
var file = await fs.GetFileAsync(id);
if (file is null) return NotFound("File not found.");
// Check if user has access to the file
var accessResult = await ValidateFileAccess(file, null);
if (accessResult is not null) return accessResult;
// Get references using the injected FileReferenceService
var references = await fileReferenceService.GetReferencesAsync(id);
return Ok(references);
}
[Authorize]
[HttpPatch("{id}/name")]
public async Task<ActionResult<SnCloudFile>> UpdateFileName(string id, [FromBody] string name)
@@ -281,25 +292,40 @@ public class FileController(
[FromQuery] Guid? pool,
[FromQuery] bool recycled = false,
[FromQuery] int offset = 0,
[FromQuery] int take = 20
[FromQuery] int take = 20,
[FromQuery] string? query = null,
[FromQuery] string order = "date",
[FromQuery] bool orderDesc = true
)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser) return Unauthorized();
var accountId = Guid.Parse(currentUser.Id);
var query = db.Files
var filesQuery = db.Files
.Where(e => e.IsMarkedRecycle == recycled)
.Where(e => e.AccountId == accountId)
.Include(e => e.Pool)
.OrderByDescending(e => e.CreatedAt)
.AsQueryable();
-if (pool.HasValue) query = query.Where(e => e.PoolId == pool);
+if (pool.HasValue) filesQuery = filesQuery.Where(e => e.PoolId == pool);
-var total = await query.CountAsync();
+if (!string.IsNullOrWhiteSpace(query))
+{
+filesQuery = filesQuery.Where(e => e.Name.Contains(query));
+}
+filesQuery = order.ToLower() switch
+{
+"date" => orderDesc ? filesQuery.OrderByDescending(e => e.CreatedAt) : filesQuery.OrderBy(e => e.CreatedAt),
+"size" => orderDesc ? filesQuery.OrderByDescending(e => e.Size) : filesQuery.OrderBy(e => e.Size),
+"name" => orderDesc ? filesQuery.OrderByDescending(e => e.Name) : filesQuery.OrderBy(e => e.Name),
+_ => filesQuery.OrderByDescending(e => e.CreatedAt)
+};
+var total = await filesQuery.CountAsync();
Response.Headers.Append("X-Total", total.ToString());
-var files = await query
+var files = await filesQuery
.Skip(offset)
.Take(take)
.ToListAsync();
@@ -307,9 +333,25 @@ public class FileController(
return Ok(files);
}
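The new `order`/`orderDesc` query parameters dispatch to one of three sort keys and fall back to newest-first for unknown values. A minimal sketch of that dispatch logic (TypeScript here purely for illustration; the `FileRow` shape is a hypothetical stand-in for the `SnCloudFile` entity):

```typescript
interface FileRow {
  name: string;
  size: number;
  createdAt: number; // epoch millis as a stand-in for the NodaTime Instant
}

// Mirrors the C# switch expression: pick a sort key by `order`,
// flip the direction with `orderDesc`.
function sortFiles(files: FileRow[], order: string, orderDesc: boolean): FileRow[] {
  const key = order.toLowerCase();
  const cmp = (a: FileRow, b: FileRow): number => {
    let d: number;
    switch (key) {
      case "size": d = a.size - b.size; break;
      case "name": d = a.name.localeCompare(b.name); break;
      default:     d = a.createdAt - b.createdAt; // "date" and anything unrecognized
    }
    return orderDesc ? -d : d;
  };
  return [...files].sort(cmp);
}
```

One difference worth noting: in the C# version the `_` arm always orders descending, regardless of `orderDesc`.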
public class FileBatchDeletionRequest
{
public List<string> FileIds { get; set; } = [];
}
[Authorize]
[HttpPost("batches/delete")]
public async Task<ActionResult> DeleteFileBatch([FromBody] FileBatchDeletionRequest request)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser) return Unauthorized();
var userId = Guid.Parse(currentUser.Id);
var count = await fs.DeleteAccountFileBatchAsync(userId, request.FileIds);
return Ok(new { Count = count });
}
[Authorize]
[HttpDelete("{id}")]
-public async Task<ActionResult> DeleteFile(string id)
+public async Task<ActionResult<SnCloudFile>> DeleteFile(string id)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser) return Unauthorized();
var userId = Guid.Parse(currentUser.Id);
@@ -321,9 +363,9 @@ public class FileController(
if (file is null) return NotFound();
await fs.DeleteFileDataAsync(file, force: true);
-await fs.DeleteFileAsync(file);
+await fs.DeleteFileAsync(file, skipData: true);
-return NoContent();
+return Ok(file);
}
[Authorize]
@@ -339,116 +381,10 @@ public class FileController(
[Authorize]
[HttpDelete("recycle")]
-[RequiredPermission("maintenance", "files.delete.recycle")]
+[AskPermission("files.delete.recycle")]
public async Task<ActionResult> DeleteAllRecycledFiles()
{
var count = await fs.DeleteAllRecycledFilesAsync();
return Ok(new { Count = count });
}
public class CreateFastFileRequest
{
public string Name { get; set; } = null!;
public long Size { get; set; }
public string Hash { get; set; } = null!;
public string? MimeType { get; set; }
public string? Description { get; set; }
public Dictionary<string, object?>? UserMeta { get; set; }
public Dictionary<string, object?>? FileMeta { get; set; }
public List<Shared.Models.ContentSensitiveMark>? SensitiveMarks { get; set; }
public Guid PoolId { get; set; }
}
[Authorize]
[HttpPost("fast")]
[RequiredPermission("global", "files.create")]
public async Task<ActionResult<SnCloudFile>> CreateFastFile([FromBody] CreateFastFileRequest request)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser) return Unauthorized();
var accountId = Guid.Parse(currentUser.Id);
var pool = await db.Pools.FirstOrDefaultAsync(p => p.Id == request.PoolId);
if (pool is null) return BadRequest();
if (!currentUser.IsSuperuser && pool.AccountId != accountId)
return StatusCode(403, "You don't have permission to create files in this pool.");
if (!pool.PolicyConfig.EnableFastUpload)
return StatusCode(
403,
"This pool does not allow fast upload"
);
if (pool.PolicyConfig.RequirePrivilege > 0)
{
if (currentUser.PerkSubscription is null)
{
return StatusCode(
403,
"You need to have joined the Stellar Program to use this pool"
);
}
var privilege =
PerkSubscriptionPrivilege.GetPrivilegeFromIdentifier(currentUser.PerkSubscription.Identifier);
if (privilege < pool.PolicyConfig.RequirePrivilege)
{
return StatusCode(
403,
$"You need Stellar Program tier {pool.PolicyConfig.RequirePrivilege} to use this pool, you are tier {privilege}"
);
}
}
if (request.Size > pool.PolicyConfig.MaxFileSize)
{
return StatusCode(
403,
$"File size {request.Size} is larger than the pool's maximum file size {pool.PolicyConfig.MaxFileSize}"
);
}
var (ok, billableUnit, quota) = await qs.IsFileAcceptable(
accountId,
pool.BillingConfig.CostMultiplier ?? 1.0,
request.Size
);
if (!ok)
{
return StatusCode(
403,
$"File size {billableUnit} is larger than the user's quota {quota}"
);
}
await using var transaction = await db.Database.BeginTransactionAsync();
try
{
var file = new SnCloudFile
{
Name = request.Name,
Size = request.Size,
Hash = request.Hash,
MimeType = request.MimeType,
Description = request.Description,
AccountId = accountId,
UserMeta = request.UserMeta,
FileMeta = request.FileMeta,
SensitiveMarks = request.SensitiveMarks,
PoolId = request.PoolId
};
db.Files.Add(file);
await db.SaveChangesAsync();
await fs._PurgeCacheAsync(file.Id);
await transaction.CommitAsync();
file.FastUploadLink = await fs.CreateFastUploadLinkAsync(file);
return file;
}
catch (Exception)
{
await transaction.RollbackAsync();
throw;
}
}
}
}


@@ -1,4 +1,5 @@
using DysonNetwork.Shared.Cache;
using DysonNetwork.Shared.Data;
using DysonNetwork.Shared.Models;
using EFCore.BulkExtensions;
using Microsoft.EntityFrameworkCore;
@@ -20,7 +21,7 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
/// <param name="expiredAt">Optional expiration time for the file</param>
/// <param name="duration">Optional duration after which the file expires (alternative to expiredAt)</param>
/// <returns>The created file reference</returns>
-public async Task<CloudFileReference> CreateReferenceAsync(
+public async Task<SnCloudFileReference> CreateReferenceAsync(
string fileId,
string usage,
string resourceId,
@@ -33,7 +34,7 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
if (duration.HasValue)
finalExpiration = SystemClock.Instance.GetCurrentInstant() + duration.Value;
-var reference = new CloudFileReference
+var reference = new SnCloudFileReference
{
FileId = fileId,
Usage = usage,
@@ -49,7 +50,7 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
return reference;
}
-public async Task<List<CloudFileReference>> CreateReferencesAsync(
+public async Task<List<SnCloudFileReference>> CreateReferencesAsync(
List<string> fileId,
string usage,
string resourceId,
@@ -57,12 +58,15 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
Duration? duration = null
)
{
-var data = fileId.Select(id => new CloudFileReference
+var now = SystemClock.Instance.GetCurrentInstant();
+var data = fileId.Select(id => new SnCloudFileReference
{
FileId = id,
Usage = usage,
ResourceId = resourceId,
-ExpiredAt = expiredAt ?? SystemClock.Instance.GetCurrentInstant() + duration
+ExpiredAt = expiredAt ?? now + duration,
+CreatedAt = now,
+UpdatedAt = now
}).ToList();
await db.BulkInsertAsync(data);
return data;
@@ -73,11 +77,11 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
/// </summary>
/// <param name="fileId">The ID of the file</param>
/// <returns>A list of all references to the file</returns>
-public async Task<List<CloudFileReference>> GetReferencesAsync(string fileId)
+public async Task<List<SnCloudFileReference>> GetReferencesAsync(string fileId)
{
var cacheKey = $"{CacheKeyPrefix}list:{fileId}";
-var cachedReferences = await cache.GetAsync<List<CloudFileReference>>(cacheKey);
+var cachedReferences = await cache.GetAsync<List<SnCloudFileReference>>(cacheKey);
if (cachedReferences is not null)
return cachedReferences;
@@ -90,17 +94,17 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
return references;
}
-public async Task<Dictionary<string, List<CloudFileReference>>> GetReferencesAsync(IEnumerable<string> fileIds)
+public async Task<Dictionary<string, List<SnCloudFileReference>>> GetReferencesAsync(IEnumerable<string> fileIds)
{
var fileIdList = fileIds.ToList();
-var result = new Dictionary<string, List<CloudFileReference>>();
+var result = new Dictionary<string, List<SnCloudFileReference>>();
// Check cache for each file ID
var uncachedFileIds = new List<string>();
foreach (var fileId in fileIdList)
{
var cacheKey = $"{CacheKeyPrefix}list:{fileId}";
-var cachedReferences = await cache.GetAsync<List<CloudFileReference>>(cacheKey);
+var cachedReferences = await cache.GetAsync<List<SnCloudFileReference>>(cacheKey);
if (cachedReferences is not null)
{
result[fileId] = cachedReferences;
@@ -158,11 +162,11 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
/// </summary>
/// <param name="resourceId">The ID of the resource</param>
/// <returns>A list of file references associated with the resource</returns>
-public async Task<List<CloudFileReference>> GetResourceReferencesAsync(string resourceId)
+public async Task<List<SnCloudFileReference>> GetResourceReferencesAsync(string resourceId)
{
var cacheKey = $"{CacheKeyPrefix}resource:{resourceId}";
-var cachedReferences = await cache.GetAsync<List<CloudFileReference>>(cacheKey);
+var cachedReferences = await cache.GetAsync<List<SnCloudFileReference>>(cacheKey);
if (cachedReferences is not null)
return cachedReferences;
@@ -180,11 +184,11 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
/// </summary>
/// <param name="usage">The usage context</param>
/// <returns>A list of file references with the specified usage</returns>
-public async Task<List<CloudFileReference>> GetUsageReferencesAsync(string usage)
+public async Task<List<SnCloudFileReference>> GetUsageReferencesAsync(string usage)
{
var cacheKey = $"{CacheKeyPrefix}usage:{usage}";
-var cachedReferences = await cache.GetAsync<List<CloudFileReference>>(cacheKey);
+var cachedReferences = await cache.GetAsync<List<SnCloudFileReference>>(cacheKey);
if (cachedReferences is not null)
return cachedReferences;
@@ -306,7 +310,7 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
/// <param name="expiredAt">Optional expiration time for newly added files</param>
/// <param name="duration">Optional duration after which newly added files expire</param>
/// <returns>A list of the updated file references</returns>
-public async Task<List<CloudFileReference>> UpdateResourceFilesAsync(
+public async Task<List<SnCloudFileReference>> UpdateResourceFilesAsync(
string resourceId,
IEnumerable<string>? newFileIds,
string usage,
@@ -314,7 +318,7 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
Duration? duration = null)
{
if (newFileIds == null)
-return new List<CloudFileReference>();
+return new List<SnCloudFileReference>();
var existingReferences = await db.FileReferences
.Where(r => r.ResourceId == resourceId && r.Usage == usage)
@@ -332,7 +336,7 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
// Files to add
var toAdd = newFileIdsList
.Where(id => !existingFileIds.Contains(id))
-.Select(id => new CloudFileReference
+.Select(id => new SnCloudFileReference
{
FileId = id,
Usage = usage,
@@ -484,7 +488,7 @@ public class FileReferenceService(AppDatabase db, FileService fileService, ICach
/// <param name="resourceId">The resource ID</param>
/// <param name="usageType">The usage type</param>
/// <returns>List of file references</returns>
-public async Task<List<CloudFileReference>> GetResourceReferencesAsync(string resourceId, string usageType)
+public async Task<List<SnCloudFileReference>> GetResourceReferencesAsync(string resourceId, string usageType)
{
return await db.FileReferences
.Where(r => r.ResourceId == resourceId && r.Usage == usageType)


@@ -103,7 +103,8 @@ public class FileService(
var bundle = await ValidateAndGetBundleAsync(fileBundleId, accountId);
var finalExpiredAt = CalculateFinalExpiration(expiredAt, pool, bundle);
-var (managedTempPath, fileSize, finalContentType) = await PrepareFileAsync(fileId, filePath, fileName, contentType);
+var (managedTempPath, fileSize, finalContentType) =
+    await PrepareFileAsync(fileId, filePath, fileName, contentType);
var file = CreateFileObject(fileId, fileName, finalContentType, fileSize, finalExpiredAt, bundle, accountId);
@@ -112,7 +113,8 @@ public class FileService(
await ExtractMetadataAsync(file, managedTempPath);
}
-var (processingPath, isTempFile) = await ProcessEncryptionAsync(fileId, managedTempPath, encryptPassword, pool, file);
+var (processingPath, isTempFile) =
+    await ProcessEncryptionAsync(fileId, managedTempPath, encryptPassword, pool, file);
file.Hash = await HashFileAsync(processingPath);
@@ -126,8 +128,7 @@ public class FileService(
private async Task<FilePool> ValidateAndGetPoolAsync(string filePool)
{
var pool = await GetPoolAsync(Guid.Parse(filePool));
-if (pool is null) throw new InvalidOperationException("Pool not found");
-return pool;
+return pool ?? throw new InvalidOperationException("Pool not found: " + filePool);
}
private async Task<SnFileBundle?> ValidateAndGetBundleAsync(string? fileBundleId, Guid accountId)
@@ -135,12 +136,10 @@ public class FileService(
if (fileBundleId is null) return null;
var bundle = await GetBundleAsync(Guid.Parse(fileBundleId), accountId);
-if (bundle is null) throw new InvalidOperationException("Bundle not found");
-return bundle;
+return bundle ?? throw new InvalidOperationException("Bundle not found: " + fileBundleId);
}
-private Instant? CalculateFinalExpiration(Instant? expiredAt, FilePool pool, SnFileBundle? bundle)
+private static Instant? CalculateFinalExpiration(Instant? expiredAt, FilePool pool, SnFileBundle? bundle)
{
var finalExpiredAt = expiredAt;
@@ -234,7 +233,8 @@ public class FileService(
file.StorageId ??= file.Id;
}
-private async Task PublishFileUploadedEventAsync(SnCloudFile file, FilePool pool, string processingPath, bool isTempFile)
+private async Task PublishFileUploadedEventAsync(SnCloudFile file, FilePool pool, string processingPath,
+    bool isTempFile)
{
var js = nats.CreateJetStreamContext();
await js.PublishAsync(
@@ -474,13 +474,14 @@ public class FileService(
return await db.Files.AsNoTracking().FirstAsync(f => f.Id == file.Id);
}
-public async Task DeleteFileAsync(SnCloudFile file)
+public async Task DeleteFileAsync(SnCloudFile file, bool skipData = false)
{
db.Remove(file);
await db.SaveChangesAsync();
await _PurgeCacheAsync(file.Id);
-await DeleteFileDataAsync(file);
+if (!skipData)
+    await DeleteFileDataAsync(file);
}
public async Task DeleteFileDataAsync(SnCloudFile file, bool force = false)
@@ -663,9 +664,12 @@ public class FileService(
}
}
-return [.. references
-.Select(r => cachedFiles.GetValueOrDefault(r.Id))
-.Where(f => f != null)];
+return
+[
+    .. references
+        .Select(r => cachedFiles.GetValueOrDefault(r.Id))
+        .Where(f => f != null)
+];
}
public async Task<int> GetReferenceCountAsync(string fileId)
@@ -714,6 +718,21 @@ public class FileService(
return count;
}
public async Task<int> DeleteAccountFileBatchAsync(Guid accountId, List<string> fileIds)
{
var files = await db.Files
.Where(f => f.AccountId == accountId && fileIds.Contains(f.Id))
.ToListAsync();
var count = files.Count;
var tasks = files.Select(f => DeleteFileDataAsync(f, true));
await Task.WhenAll(tasks);
var fileIdsList = files.Select(f => f.Id).ToList();
await _PurgeCacheRangeAsync(fileIdsList);
db.RemoveRange(files);
await db.SaveChangesAsync();
return count;
}
public async Task<int> DeletePoolRecycledFilesAsync(Guid poolId)
{
var files = await db.Files


@@ -1,6 +1,6 @@
using System.ComponentModel.DataAnnotations;
using System.Text.Json;
-using DysonNetwork.Drive.Billing;
+using DysonNetwork.Drive.Index;
using DysonNetwork.Drive.Storage.Model;
using DysonNetwork.Shared.Auth;
using DysonNetwork.Shared.Http;
@@ -24,7 +24,9 @@ public class FileUploadController(
AppDatabase db,
PermissionService.PermissionServiceClient permission,
QuotaService quotaService,
-PersistentUploadService persistentUploadService
+PersistentTaskService persistentTaskService,
+FileIndexService fileIndexService,
+ILogger<FileUploadController> logger
)
: ControllerBase
{
@@ -36,8 +38,7 @@ public class FileUploadController(
[HttpPost("create")]
public async Task<IActionResult> CreateUploadTask([FromBody] CreateUploadTaskRequest request)
{
-var currentUser = HttpContext.Items["CurrentUser"] as Account;
-if (currentUser is null)
+if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var permissionCheck = await ValidateUserPermissions(currentUser);
@@ -60,10 +61,32 @@ public class FileUploadController(
EnsureTempDirectoryExists();
var accountId = Guid.Parse(currentUser.Id);
// Check if a file with the same hash already exists
var existingFile = await db.Files.FirstOrDefaultAsync(f => f.Hash == request.Hash);
if (existingFile != null)
{
// Create the file index if a path is provided, even for existing files
if (string.IsNullOrEmpty(request.Path))
return Ok(new CreateUploadTaskResponse
{
FileExists = true,
File = existingFile
});
try
{
await fileIndexService.CreateAsync(request.Path, existingFile.Id, accountId);
logger.LogInformation("Created file index for existing file {FileId} at path {Path}",
existingFile.Id, request.Path);
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to create file index for existing file {FileId} at path {Path}",
existingFile.Id, request.Path);
// Don't fail the request if index creation fails, just log it
}
return Ok(new CreateUploadTaskResponse
{
FileExists = true,
@@ -71,11 +94,10 @@ public class FileUploadController(
});
}
-var accountId = Guid.Parse(currentUser.Id);
var taskId = await Nanoid.GenerateAsync();
// Create persistent upload task
-var persistentTask = await persistentUploadService.CreateUploadTaskAsync(taskId, request, accountId);
+var persistentTask = await persistentTaskService.CreateUploadTaskAsync(taskId, request, accountId);
return Ok(new CreateUploadTaskResponse
{
@@ -91,36 +113,38 @@ public class FileUploadController(
if (currentUser.IsSuperuser) return null;
var allowed = await permission.HasPermissionAsync(new HasPermissionRequest
-{ Actor = $"user:{currentUser.Id}", Area = "global", Key = "files.create" });
+{ Actor = currentUser.Id, Key = "files.create" });
-return allowed.HasPermission ? null :
-new ObjectResult(ApiError.Unauthorized(forbidden: true)) { StatusCode = 403 };
+return allowed.HasPermission
+    ? null
+    : new ObjectResult(ApiError.Unauthorized(forbidden: true)) { StatusCode = 403 };
}
-private async Task<IActionResult?> ValidatePoolAccess(Account currentUser, FilePool pool, CreateUploadTaskRequest request)
+private Task<IActionResult?> ValidatePoolAccess(Account currentUser, FilePool pool, CreateUploadTaskRequest request)
{
-if (pool.PolicyConfig.RequirePrivilege <= 0) return null;
+if (pool.PolicyConfig.RequirePrivilege <= 0) return Task.FromResult<IActionResult?>(null);
-var privilege = currentUser.PerkSubscription is null ? 0 :
-PerkSubscriptionPrivilege.GetPrivilegeFromIdentifier(currentUser.PerkSubscription.Identifier);
+var privilege = currentUser.PerkSubscription is null
+    ? 0
+    : PerkSubscriptionPrivilege.GetPrivilegeFromIdentifier(currentUser.PerkSubscription.Identifier);
if (privilege < pool.PolicyConfig.RequirePrivilege)
{
-return new ObjectResult(ApiError.Unauthorized(
-$"You need Stellar Program tier {pool.PolicyConfig.RequirePrivilege} to use pool {pool.Name}, you are tier {privilege}",
-forbidden: true))
-{ StatusCode = 403 };
+return Task.FromResult<IActionResult?>(new ObjectResult(ApiError.Unauthorized(
+    $"You need Stellar Program tier {pool.PolicyConfig.RequirePrivilege} to use pool {pool.Name}, you are tier {privilege}",
+    forbidden: true))
+{ StatusCode = 403 });
}
-return null;
+return Task.FromResult<IActionResult?>(null);
}
-private IActionResult? ValidatePoolPolicy(PolicyConfig policy, CreateUploadTaskRequest request)
+private static IActionResult? ValidatePoolPolicy(PolicyConfig policy, CreateUploadTaskRequest request)
{
if (!policy.AllowEncryption && !string.IsNullOrEmpty(request.EncryptPassword))
{
return new ObjectResult(ApiError.Unauthorized("File encryption is not allowed in this pool", true))
{ StatusCode = 403 };
{ StatusCode = 403 };
}
if (policy.AcceptTypes is { Count: > 0 })
@@ -128,36 +152,35 @@ public class FileUploadController(
if (string.IsNullOrEmpty(request.ContentType))
{
return new ObjectResult(ApiError.Validation(new Dictionary<string, string[]>
{
{ "contentType", new[] { "Content type is required by the pool's policy" } }
}))
{ StatusCode = 400 };
{
{ "contentType", new[] { "Content type is required by the pool's policy" } }
}))
{ StatusCode = 400 };
}
var foundMatch = policy.AcceptTypes.Any(acceptType =>
{
-if (acceptType.EndsWith("/*", StringComparison.OrdinalIgnoreCase))
-{
-var type = acceptType[..^2];
-return request.ContentType.StartsWith($"{type}/", StringComparison.OrdinalIgnoreCase);
-}
-return acceptType.Equals(request.ContentType, StringComparison.OrdinalIgnoreCase);
+if (!acceptType.EndsWith("/*", StringComparison.OrdinalIgnoreCase))
+    return acceptType.Equals(request.ContentType, StringComparison.OrdinalIgnoreCase);
+var type = acceptType[..^2];
+return request.ContentType.StartsWith($"{type}/", StringComparison.OrdinalIgnoreCase);
});
if (!foundMatch)
{
return new ObjectResult(
-ApiError.Unauthorized($"Content type {request.ContentType} is not allowed by the pool's policy", true))
-{ StatusCode = 403 };
+ApiError.Unauthorized($"Content type {request.ContentType} is not allowed by the pool's policy",
+    true))
+{ StatusCode = 403 };
}
}
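The restructured accept-type check keeps the same semantics: an entry like `image/*` matches any subtype of that major type, while any other entry must match the content type exactly and case-insensitively. A sketch of that predicate (TypeScript for illustration only; the C# version uses `StringComparison.OrdinalIgnoreCase` for the same effect):

```typescript
// True when contentType satisfies at least one of the pool's accept entries.
// "type/*" entries match on the major type; all others require an exact,
// case-insensitive match -- same check order as the rewritten C#.
function matchesAcceptTypes(acceptTypes: string[], contentType: string): boolean {
  const ct = contentType.toLowerCase();
  return acceptTypes.some((acceptType) => {
    const at = acceptType.toLowerCase();
    if (!at.endsWith("/*")) return at === ct;
    const major = at.slice(0, -2); // strip the "/*" suffix
    return ct.startsWith(major + "/");
  });
}
```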
if (policy.MaxFileSize is not null && request.FileSize > policy.MaxFileSize)
{
return new ObjectResult(ApiError.Unauthorized(
-$"File size {request.FileSize} is larger than the pool's maximum file size {policy.MaxFileSize}", true))
-{ StatusCode = 403 };
+$"File size {request.FileSize} is larger than the pool's maximum file size {policy.MaxFileSize}",
+    true))
+{ StatusCode = 403 };
}
return null;
@@ -174,8 +197,9 @@ public class FileUploadController(
if (!ok)
{
return new ObjectResult(
-ApiError.Unauthorized($"File size {billableUnit} MiB exceeds the user's quota {quota} MiB", true))
-{ StatusCode = 403 };
+ApiError.Unauthorized($"File size {billableUnit} MiB exceeds the user's quota {quota} MiB",
+    true))
+{ StatusCode = 403 };
}
return null;
@@ -189,41 +213,12 @@ public class FileUploadController(
}
}
-private async Task<(string taskId, UploadTask task)> CreateUploadTaskInternal(CreateUploadTaskRequest request)
-{
-var taskId = await Nanoid.GenerateAsync();
-var taskPath = Path.Combine(_tempPath, taskId);
-Directory.CreateDirectory(taskPath);
-var chunkSize = request.ChunkSize ?? DefaultChunkSize;
-var chunksCount = (int)Math.Ceiling((double)request.FileSize / chunkSize);
-var task = new UploadTask
-{
-TaskId = taskId,
-FileName = request.FileName,
-FileSize = request.FileSize,
-ContentType = request.ContentType,
-ChunkSize = chunkSize,
-ChunksCount = chunksCount,
-PoolId = request.PoolId.Value,
-BundleId = request.BundleId,
-EncryptPassword = request.EncryptPassword,
-ExpiredAt = request.ExpiredAt,
-Hash = request.Hash,
-};
-await System.IO.File.WriteAllTextAsync(Path.Combine(taskPath, "task.json"), JsonSerializer.Serialize(task));
-return (taskId, task);
-}
public class UploadChunkRequest
{
-[Required]
-public IFormFile Chunk { get; set; } = null!;
+[Required] public IFormFile Chunk { get; set; } = null!;
}
-[HttpPost("chunk/{taskId}/{chunkIndex}")]
+[HttpPost("chunk/{taskId}/{chunkIndex:int}")]
[RequestSizeLimit(DefaultChunkSize + 1024 * 1024)] // 6MB to be safe
[RequestFormLimits(MultipartBodyLengthLimit = DefaultChunkSize + 1024 * 1024)]
public async Task<IActionResult> UploadChunk(string taskId, int chunkIndex, [FromForm] UploadChunkRequest request)
@@ -231,7 +226,7 @@ public class FileUploadController(
var chunk = request.Chunk;
// Check if chunk is already uploaded (resumable upload)
-if (await persistentUploadService.IsChunkUploadedAsync(taskId, chunkIndex))
+if (await persistentTaskService.IsChunkUploadedAsync(taskId, chunkIndex))
{
return Ok(new { message = "Chunk already uploaded" });
}
@@ -247,7 +242,7 @@ public class FileUploadController(
await chunk.CopyToAsync(stream);
// Update persistent task progress
-await persistentUploadService.UpdateChunkProgressAsync(taskId, chunkIndex);
+await persistentTaskService.UpdateChunkProgressAsync(taskId, chunkIndex);
return Ok();
}
@@ -256,7 +251,7 @@ public class FileUploadController(
public async Task<IActionResult> CompleteUpload(string taskId)
{
// Get persistent task
-var persistentTask = await persistentUploadService.GetUploadTaskAsync(taskId);
+var persistentTask = await persistentTaskService.GetUploadTaskAsync(taskId);
if (persistentTask is null)
return new ObjectResult(ApiError.NotFound("Upload task")) { StatusCode = 404 };
@@ -276,7 +271,7 @@ public class FileUploadController(
try
{
-await MergeChunks(taskPath, mergedFilePath, persistentTask.ChunksCount);
+await MergeChunks(taskId, taskPath, mergedFilePath, persistentTask.ChunksCount, persistentTaskService);
var fileId = await Nanoid.GenerateAsync();
var cloudFile = await fileService.ProcessNewFileAsync(
@@ -291,28 +286,50 @@ public class FileUploadController(
persistentTask.ExpiredAt
);
-// Mark task as completed
-await persistentUploadService.MarkTaskCompletedAsync(taskId);
// Create the file index if a path is provided
if (!string.IsNullOrEmpty(persistentTask.Path))
{
try
{
var accountId = Guid.Parse(currentUser.Id);
await fileIndexService.CreateAsync(persistentTask.Path, fileId, accountId);
logger.LogInformation("Created file index for file {FileId} at path {Path}", fileId,
persistentTask.Path);
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to create file index for file {FileId} at path {Path}", fileId,
persistentTask.Path);
// Don't fail the upload if index creation fails, just log it
}
}
-// Send completion notification
-await persistentUploadService.SendUploadCompletedNotificationAsync(persistentTask, fileId);
+// Update the task status to "processing" - background processing is now happening
+await persistentTaskService.UpdateTaskProgressAsync(taskId, 0.95, "Processing file in background...");
+// Send upload completion notification (a file is uploaded, but processing continues)
+await persistentTaskService.SendUploadCompletedNotificationAsync(persistentTask, fileId);
return Ok(cloudFile);
}
catch (Exception ex)
{
// Log the actual exception for debugging
logger.LogError(ex, "Failed to complete upload for task {TaskId}. Error: {ErrorMessage}", taskId,
ex.Message);
// Mark task as failed
-await persistentUploadService.MarkTaskFailedAsync(taskId);
+await persistentTaskService.MarkTaskFailedAsync(taskId);
// Send failure notification
-await persistentUploadService.SendUploadFailedNotificationAsync(persistentTask, ex.Message);
+await persistentTaskService.SendUploadFailedNotificationAsync(persistentTask, ex.Message);
await CleanupTempFiles(taskPath, mergedFilePath);
return new ObjectResult(new ApiError
{
Code = "UPLOAD_FAILED",
-Message = "Failed to complete file upload.",
+Message = $"Failed to complete file upload: {ex.Message}",
Status = 500
}) { StatusCode = 500 };
}
@@ -323,24 +340,39 @@ public class FileUploadController(
}
}
-private async Task MergeChunks(string taskPath, string mergedFilePath, int chunksCount)
+private static async Task MergeChunks(
+    string taskId,
+    string taskPath,
+    string mergedFilePath,
+    int chunksCount,
+    PersistentTaskService persistentTaskService)
{
await using var mergedStream = new FileStream(mergedFilePath, FileMode.Create);
const double baseProgress = 0.8; // Merging starts at 80% (chunk uploads account for the first 80%)
const double remainingProgress = 0.15; // Remaining 15% progress distributed across chunks
var progressPerChunk = remainingProgress / chunksCount;
for (var i = 0; i < chunksCount; i++)
{
-var chunkPath = Path.Combine(taskPath, $"{i}.chunk");
+var chunkPath = Path.Combine(taskPath, i + ".chunk");
if (!System.IO.File.Exists(chunkPath))
-{
-throw new InvalidOperationException($"Chunk {i} is missing.");
-}
+throw new InvalidOperationException("Chunk " + i + " is missing.");
await using var chunkStream = new FileStream(chunkPath, FileMode.Open);
await chunkStream.CopyToAsync(mergedStream);
// Update progress after each chunk is merged
var currentProgress = baseProgress + progressPerChunk * (i + 1);
await persistentTaskService.UpdateTaskProgressAsync(
taskId,
currentProgress,
"Merging chunks... (" + (i + 1) + "/" + chunksCount + ")"
);
}
}
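The merge phase reports progress linearly from 80% to 95%: each merged chunk adds `0.15 / chunksCount`, so the final chunk lands exactly on 0.95, matching the `UpdateTaskProgressAsync(taskId, 0.95, ...)` call in `CompleteUpload`. A sketch of the arithmetic (TypeScript for illustration):

```typescript
const BASE_PROGRESS = 0.8;        // chunk upload phase ends at 80%
const MERGE_PROGRESS_SPAN = 0.15; // merging occupies the 80%..95% range

// Progress value reported after chunk i (0-based) of chunksCount is merged.
function mergeProgress(i: number, chunksCount: number): number {
  return BASE_PROGRESS + (MERGE_PROGRESS_SPAN / chunksCount) * (i + 1);
}
```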
-private async Task CleanupTempFiles(string taskPath, string mergedFilePath)
+private static Task CleanupTempFiles(string taskPath, string mergedFilePath)
{
try
{
@@ -354,6 +386,8 @@ public class FileUploadController(
{
// Ignore cleanup errors to avoid masking the original exception
}
return Task.CompletedTask;
}
// New endpoints for resumable uploads
@@ -372,7 +406,8 @@ public class FileUploadController(
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
-var tasks = await persistentUploadService.GetUserTasksAsync(accountId, status, sortBy, sortDescending, offset, limit);
+var tasks = await persistentTaskService.GetUserUploadTasksAsync(accountId, status, sortBy, sortDescending,
+    offset, limit);
Response.Headers.Append("X-Total", tasks.TotalCount.ToString());
@@ -390,7 +425,7 @@ public class FileUploadController(
t.LastActivity,
t.CreatedAt,
t.UpdatedAt,
-UploadedChunks = t.UploadedChunks,
+t.UploadedChunks,
Pool = new { t.PoolId, Name = "Pool Name" }, // Could be expanded to include pool details
Bundle = t.BundleId.HasValue ? new { t.BundleId } : null
}));
@@ -403,7 +438,7 @@ public class FileUploadController(
if (currentUser is null)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
-var task = await persistentUploadService.GetUploadTaskAsync(taskId);
+var task = await persistentTaskService.GetUploadTaskAsync(taskId);
if (task is null)
return new ObjectResult(ApiError.NotFound("Upload task")) { StatusCode = 404 };
@@ -411,7 +446,7 @@ public class FileUploadController(
if (task.AccountId != Guid.Parse(currentUser.Id))
return new ObjectResult(ApiError.Unauthorized(forbidden: true)) { StatusCode = 403 };
-var progress = await persistentUploadService.GetUploadProgressAsync(taskId);
+var progress = await persistentTaskService.GetUploadProgressAsync(taskId);
return Ok(new
{
@@ -434,7 +469,7 @@ public class FileUploadController(
if (currentUser is null)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
-var task = await persistentUploadService.GetUploadTaskAsync(taskId);
+var task = await persistentTaskService.GetUploadTaskAsync(taskId);
if (task is null)
return new ObjectResult(ApiError.NotFound("Upload task")) { StatusCode = 404 };
@@ -458,7 +493,7 @@ public class FileUploadController(
task.ChunkSize,
task.ChunksCount,
task.ChunksUploaded,
-UploadedChunks = task.UploadedChunks,
+task.UploadedChunks,
Progress = task.ChunksCount > 0 ? (double)task.ChunksUploaded / task.ChunksCount * 100 : 0
});
}
@@ -470,7 +505,7 @@ public class FileUploadController(
if (currentUser is null)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
-var task = await persistentUploadService.GetUploadTaskAsync(taskId);
+var task = await persistentTaskService.GetUploadTaskAsync(taskId);
if (task is null)
return new ObjectResult(ApiError.NotFound("Upload task")) { StatusCode = 404 };
@@ -479,7 +514,7 @@ public class FileUploadController(
return new ObjectResult(ApiError.Unauthorized(forbidden: true)) { StatusCode = 403 };
// Mark as failed (cancelled)
-await persistentUploadService.MarkTaskFailedAsync(taskId);
+await persistentTaskService.MarkTaskFailedAsync(taskId);
// Clean up temp files
var taskPath = Path.Combine(_tempPath, taskId);
@@ -496,18 +531,18 @@ public class FileUploadController(
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
-var stats = await persistentUploadService.GetUserUploadStatsAsync(accountId);
+var stats = await persistentTaskService.GetUserUploadStatsAsync(accountId);
return Ok(new
{
-TotalTasks = stats.TotalTasks,
-InProgressTasks = stats.InProgressTasks,
-CompletedTasks = stats.CompletedTasks,
-FailedTasks = stats.FailedTasks,
-ExpiredTasks = stats.ExpiredTasks,
-TotalUploadedBytes = stats.TotalUploadedBytes,
-AverageProgress = stats.AverageProgress,
-RecentActivity = stats.RecentActivity
+stats.TotalTasks,
+stats.InProgressTasks,
+stats.CompletedTasks,
+stats.FailedTasks,
+stats.ExpiredTasks,
+stats.TotalUploadedBytes,
+stats.AverageProgress,
+stats.RecentActivity
});
}
@@ -519,7 +554,7 @@ public class FileUploadController(
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
var cleanedCount = await persistentUploadService.CleanupUserFailedTasksAsync(accountId);
var cleanedCount = await persistentTaskService.CleanupUserFailedTasksAsync(accountId);
return Ok(new { message = $"Cleaned up {cleanedCount} failed tasks" });
}
@@ -532,7 +567,7 @@ public class FileUploadController(
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var accountId = Guid.Parse(currentUser.Id);
var tasks = await persistentUploadService.GetRecentUserTasksAsync(accountId, limit);
var tasks = await persistentTaskService.GetRecentUserTasksAsync(accountId, limit);
return Ok(tasks.Select(t => new
{
@@ -554,7 +589,7 @@ public class FileUploadController(
if (currentUser is null)
return new ObjectResult(ApiError.Unauthorized()) { StatusCode = 401 };
var task = await persistentUploadService.GetUploadTaskAsync(taskId);
var task = await persistentTaskService.GetUploadTaskAsync(taskId);
if (task is null)
return new ObjectResult(ApiError.NotFound("Upload task")) { StatusCode = 404 };
@@ -586,28 +621,32 @@ public class FileUploadController(
task.UpdatedAt,
task.ExpiredAt,
task.Hash,
UploadedChunks = task.UploadedChunks
task.UploadedChunks
},
Pool = pool != null ? new
{
pool.Id,
pool.Name,
pool.Description
} : null,
Bundle = bundle != null ? new
{
bundle.Id,
bundle.Name,
bundle.Description
} : null,
Pool = pool != null
? new
{
pool.Id,
pool.Name,
pool.Description
}
: null,
Bundle = bundle != null
? new
{
bundle.Id,
bundle.Name,
bundle.Description
}
: null,
EstimatedTimeRemaining = CalculateEstimatedTime(task),
UploadSpeed = CalculateUploadSpeed(task)
});
}
private string? CalculateEstimatedTime(PersistentUploadTask task)
private static string? CalculateEstimatedTime(PersistentUploadTask task)
{
if (task.Status != Model.TaskStatus.InProgress || task.ChunksUploaded == 0)
if (task.Status != TaskStatus.InProgress || task.ChunksUploaded == 0)
return null;
var elapsed = NodaTime.SystemClock.Instance.GetCurrentInstant() - task.CreatedAt;
@@ -620,27 +659,29 @@ public class FileUploadController(
var remainingSeconds = remainingChunks / chunksPerSecond;
if (remainingSeconds < 60)
return $"{remainingSeconds:F0} seconds";
if (remainingSeconds < 3600)
return $"{remainingSeconds / 60:F0} minutes";
return $"{remainingSeconds / 3600:F1} hours";
return remainingSeconds switch
{
< 60 => $"{remainingSeconds:F0} seconds",
< 3600 => $"{remainingSeconds / 60:F0} minutes",
_ => $"{remainingSeconds / 3600:F1} hours"
};
}
private string? CalculateUploadSpeed(PersistentUploadTask task)
private static string? CalculateUploadSpeed(PersistentUploadTask task)
{
if (task.ChunksUploaded == 0)
return null;
var elapsed = NodaTime.SystemClock.Instance.GetCurrentInstant() - task.CreatedAt;
var elapsed = SystemClock.Instance.GetCurrentInstant() - task.CreatedAt;
var elapsedSeconds = elapsed.TotalSeconds;
var bytesUploaded = (long)task.ChunksUploaded * task.ChunkSize;
var bytesUploaded = task.ChunksUploaded * task.ChunkSize;
var bytesPerSecond = bytesUploaded / elapsedSeconds;
if (bytesPerSecond < 1024)
return $"{bytesPerSecond:F0} B/s";
if (bytesPerSecond < 1024 * 1024)
return $"{bytesPerSecond / 1024:F0} KB/s";
return $"{bytesPerSecond / (1024 * 1024):F1} MB/s";
return bytesPerSecond switch
{
< 1024 => $"{bytesPerSecond:F0} B/s",
< 1024 * 1024 => $"{bytesPerSecond / 1024:F0} KB/s",
_ => $"{bytesPerSecond / (1024 * 1024):F1} MB/s"
};
}
}
}
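The duration/speed refactor in this hunk replaces if/return chains with relational patterns in switch expressions. A standalone sketch (helper names are illustrative, not from the codebase):

```csharp
using System;

// Relational patterns in a switch expression, mirroring the refactored
// CalculateEstimatedTime / CalculateUploadSpeed formatting logic.
static string FormatDuration(double seconds) => seconds switch
{
    < 60 => $"{seconds:F0} seconds",
    < 3600 => $"{seconds / 60:F0} minutes",
    _ => $"{seconds / 3600:F1} hours"
};

static string FormatSpeed(double bytesPerSecond) => bytesPerSecond switch
{
    < 1024 => $"{bytesPerSecond:F0} B/s",
    < 1024 * 1024 => $"{bytesPerSecond / 1024:F0} KB/s",
    _ => $"{bytesPerSecond / (1024 * 1024):F1} MB/s"
};

Console.WriteLine(FormatSpeed(3_500_000)); // "3.3 MB/s"
```

The switch form also makes the two methods safe to mark `static`, as the diff does, since all state flows through the parameter.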

View File

@@ -1,10 +1,88 @@
using DysonNetwork.Shared.Models;
using DysonNetwork.Shared.Proto;
using Google.Protobuf.Collections;
using NodaTime;
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
using System.Text.Json;
namespace DysonNetwork.Drive.Storage.Model;
// File Upload Task Parameters
public class FileUploadParameters
{
public string FileName { get; set; } = string.Empty;
public long FileSize { get; set; }
public string ContentType { get; set; } = string.Empty;
public long ChunkSize { get; set; } = 5242880L;
public int ChunksCount { get; set; }
public int ChunksUploaded { get; set; }
public Guid PoolId { get; set; }
public Guid? BundleId { get; set; }
public string? EncryptPassword { get; set; }
public string Hash { get; set; } = string.Empty;
public List<int> UploadedChunks { get; set; } = [];
public string? Path { get; set; }
}
// File Move Task Parameters
public class FileMoveParameters
{
public List<string> FileIds { get; set; } = [];
public Guid TargetPoolId { get; set; }
public Guid? TargetBundleId { get; set; }
public int FilesProcessed { get; set; }
}
// File Compression Task Parameters
public class FileCompressParameters
{
public List<string> FileIds { get; set; } = [];
public string CompressionFormat { get; set; } = "zip";
public int CompressionLevel { get; set; } = 6;
public string? OutputFileName { get; set; }
public int FilesProcessed { get; set; }
public string? ResultFileId { get; set; }
}
// Bulk Operation Task Parameters
public class BulkOperationParameters
{
public string OperationType { get; set; } = string.Empty;
public List<string> TargetIds { get; set; } = [];
public Dictionary<string, object?> OperationParameters { get; set; } = new();
public int ItemsProcessed { get; set; }
public Dictionary<string, object?>? OperationResults { get; set; }
}
// Storage Migration Task Parameters
public class StorageMigrationParameters
{
public Guid SourcePoolId { get; set; }
public Guid TargetPoolId { get; set; }
public List<string> FileIds { get; set; } = new();
public bool PreserveOriginals { get; set; } = true;
public long TotalBytesToTransfer { get; set; }
public long BytesTransferred { get; set; }
public int FilesMigrated { get; set; }
}
// Helper class for parameter operations using GrpcTypeHelper
public static class ParameterHelper
{
public static T? Typed<T>(Dictionary<string, object?> parameters)
{
var rawParams = GrpcTypeHelper.ConvertObjectToByteString(parameters);
return GrpcTypeHelper.ConvertByteStringToObject<T>(rawParams);
}
public static Dictionary<string, object?> Untyped<T>(T parameters)
{
var rawParams = GrpcTypeHelper.ConvertObjectToByteString(parameters);
return GrpcTypeHelper.ConvertByteStringToObject<Dictionary<string, object?>>(rawParams) ?? [];
}
}
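`ParameterHelper` round-trips through `GrpcTypeHelper`'s byte-string conversion. The same typed/untyped shape can be sketched with `System.Text.Json` as a stand-in (an assumption: the real helper controls key casing and encoding itself):

```csharp
using System.Collections.Generic;
using System.Text.Json;

// Hypothetical stand-in for ParameterHelper: serialize one view,
// deserialize into the other. Camel-case options mirror the
// "fileName"/"chunkSize" keys used elsewhere in this file.
public static class JsonParameterHelper
{
    private static readonly JsonSerializerOptions Options = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        PropertyNameCaseInsensitive = true
    };

    public static T? Typed<T>(Dictionary<string, object?> parameters) =>
        JsonSerializer.Deserialize<T>(
            JsonSerializer.Serialize(parameters, Options), Options);

    public static Dictionary<string, object?> Untyped<T>(T parameters) =>
        JsonSerializer.Deserialize<Dictionary<string, object?>>(
            JsonSerializer.Serialize(parameters, Options), Options) ?? new();
}
```

Note the cost implied by this design: every `Typed`/`Untyped` call is a full serialize-plus-deserialize of the parameter bag, which is why batching several field changes through one `TypedParameters` get/set pair is cheaper than setting convenience properties one by one.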
public class CreateUploadTaskRequest
{
public string Hash { get; set; } = null!;
@@ -16,6 +94,7 @@ public class CreateUploadTaskRequest
public string? EncryptPassword { get; set; }
public Instant? ExpiredAt { get; set; }
public long? ChunkSize { get; set; }
public string? Path { get; set; }
}
public class CreateUploadTaskResponse
@@ -46,14 +125,11 @@ public class PersistentTask : ModelBase
{
public Guid Id { get; set; } = Guid.NewGuid();
[MaxLength(64)]
public string TaskId { get; set; } = null!;
[MaxLength(64)] public string TaskId { get; set; } = null!;
[MaxLength(256)]
public string Name { get; set; } = null!;
[MaxLength(256)] public string Name { get; set; } = null!;
[MaxLength(1024)]
public string? Description { get; set; }
[MaxLength(1024)] public string? Description { get; set; }
public TaskType Type { get; set; }
@@ -65,15 +141,12 @@ public class PersistentTask : ModelBase
public double Progress { get; set; }
// Task-specific parameters stored as JSON
[Column(TypeName = "jsonb")]
public Dictionary<string, object?> Parameters { get; set; } = new();
[Column(TypeName = "jsonb")] public Dictionary<string, object?> Parameters { get; set; } = new();
// Task results/output stored as JSON
[Column(TypeName = "jsonb")]
public Dictionary<string, object?> Results { get; set; } = new();
[Column(TypeName = "jsonb")] public Dictionary<string, object?> Results { get; set; } = new();
[MaxLength(1024)]
public string? ErrorMessage { get; set; }
[MaxLength(1024)] public string? ErrorMessage { get; set; }
public Instant? StartedAt { get; set; }
public Instant? CompletedAt { get; set; }
@@ -97,82 +170,149 @@ public class PersistentUploadTask : PersistentTask
Name = "File Upload";
}
// Convenience properties using typed parameters
[NotMapped]
public FileUploadParameters TypedParameters
{
get => ParameterHelper.Typed<FileUploadParameters>(Parameters)!;
set => Parameters = ParameterHelper.Untyped(value);
}
[MaxLength(256)]
public string FileName
{
get => Parameters.GetValueOrDefault("fileName") as string ?? string.Empty;
set => Parameters["fileName"] = value;
get => TypedParameters.FileName;
set
{
var parameters = TypedParameters;
parameters.FileName = value;
TypedParameters = parameters;
}
}
public long FileSize
{
get => Convert.ToInt64(Parameters.GetValueOrDefault("fileSize") ?? 0L);
set => Parameters["fileSize"] = value;
get => TypedParameters.FileSize;
set
{
var parameters = TypedParameters;
parameters.FileSize = value;
TypedParameters = parameters;
}
}
[MaxLength(128)]
public string ContentType
{
get => Parameters.GetValueOrDefault("contentType") as string ?? string.Empty;
set => Parameters["contentType"] = value;
get => TypedParameters.ContentType;
set
{
var parameters = TypedParameters;
parameters.ContentType = value;
TypedParameters = parameters;
}
}
public long ChunkSize
{
get => Convert.ToInt64(Parameters.GetValueOrDefault("chunkSize") ?? 5242880L);
set => Parameters["chunkSize"] = value;
get => TypedParameters.ChunkSize;
set
{
var parameters = TypedParameters;
parameters.ChunkSize = value;
TypedParameters = parameters;
}
}
public int ChunksCount
{
get => Convert.ToInt32(Parameters.GetValueOrDefault("chunksCount") ?? 0);
set => Parameters["chunksCount"] = value;
get => TypedParameters.ChunksCount;
set
{
var parameters = TypedParameters;
parameters.ChunksCount = value;
TypedParameters = parameters;
}
}
public int ChunksUploaded
{
get => Convert.ToInt32(Parameters.GetValueOrDefault("chunksUploaded") ?? 0);
get => TypedParameters.ChunksUploaded;
set
{
Parameters["chunksUploaded"] = value;
var parameters = TypedParameters;
parameters.ChunksUploaded = value;
TypedParameters = parameters;
Progress = ChunksCount > 0 ? (double)value / ChunksCount * 100 : 0;
}
}
public Guid PoolId
{
get => Guid.Parse(Parameters.GetValueOrDefault("poolId") as string ?? Guid.Empty.ToString());
set => Parameters["poolId"] = value.ToString();
get => TypedParameters.PoolId;
set
{
var parameters = TypedParameters;
parameters.PoolId = value;
TypedParameters = parameters;
}
}
public Guid? BundleId
{
get
get => TypedParameters.BundleId;
set
{
var bundleIdStr = Parameters.GetValueOrDefault("bundleId") as string;
return string.IsNullOrEmpty(bundleIdStr) ? null : Guid.Parse(bundleIdStr);
var parameters = TypedParameters;
parameters.BundleId = value;
TypedParameters = parameters;
}
set => Parameters["bundleId"] = value?.ToString();
}
[MaxLength(256)]
public string? EncryptPassword
{
get => Parameters.GetValueOrDefault("encryptPassword") as string;
set => Parameters["encryptPassword"] = value;
get => TypedParameters.EncryptPassword;
set
{
var parameters = TypedParameters;
parameters.EncryptPassword = value;
TypedParameters = parameters;
}
}
public string Hash
{
get => Parameters.GetValueOrDefault("hash") as string ?? string.Empty;
set => Parameters["hash"] = value;
get => TypedParameters.Hash;
set
{
var parameters = TypedParameters;
parameters.Hash = value;
TypedParameters = parameters;
}
}
// JSON array of uploaded chunk indices for resumability
public List<int> UploadedChunks
{
get => Parameters.GetValueOrDefault("uploadedChunks") as List<int> ?? [];
set => Parameters["uploadedChunks"] = value;
get => TypedParameters.UploadedChunks;
set
{
var parameters = TypedParameters;
parameters.UploadedChunks = value;
TypedParameters = parameters;
}
}
public string? Path
{
get => TypedParameters.Path;
set
{
var parameters = TypedParameters;
parameters.Path = value;
TypedParameters = parameters;
}
}
}
@@ -190,6 +330,7 @@ public enum TaskType
Custom
}
[Flags]
public enum TaskStatus
{
Pending,
@@ -210,34 +351,54 @@ public class FileMoveTask : PersistentTask
Name = "Move Files";
}
// Convenience properties using typed parameters
public FileMoveParameters TypedParameters
{
get => ParameterHelper.Typed<FileMoveParameters>(Parameters)!;
set => Parameters = ParameterHelper.Untyped(value);
}
public List<string> FileIds
{
get => Parameters.GetValueOrDefault("fileIds") as List<string> ?? [];
set => Parameters["fileIds"] = value;
get => TypedParameters.FileIds;
set
{
var parameters = TypedParameters;
parameters.FileIds = value;
TypedParameters = parameters;
}
}
public Guid TargetPoolId
{
get => Guid.Parse(Parameters.GetValueOrDefault("targetPoolId") as string ?? Guid.Empty.ToString());
set => Parameters["targetPoolId"] = value.ToString();
get => TypedParameters.TargetPoolId;
set
{
var parameters = TypedParameters;
parameters.TargetPoolId = value;
TypedParameters = parameters;
}
}
public Guid? TargetBundleId
{
get
get => TypedParameters.TargetBundleId;
set
{
var bundleIdStr = Parameters.GetValueOrDefault("targetBundleId") as string;
return string.IsNullOrEmpty(bundleIdStr) ? null : Guid.Parse(bundleIdStr);
var parameters = TypedParameters;
parameters.TargetBundleId = value;
TypedParameters = parameters;
}
set => Parameters["targetBundleId"] = value?.ToString();
}
public int FilesProcessed
{
get => Convert.ToInt32(Parameters.GetValueOrDefault("filesProcessed") ?? 0);
get => TypedParameters.FilesProcessed;
set
{
Parameters["filesProcessed"] = value;
var parameters = TypedParameters;
parameters.FilesProcessed = value;
TypedParameters = parameters;
Progress = FileIds.Count > 0 ? (double)value / FileIds.Count * 100 : 0;
}
}
@@ -252,45 +413,79 @@ public class FileCompressTask : PersistentTask
Name = "Compress Files";
}
// Convenience properties using typed parameters
public FileCompressParameters TypedParameters
{
get => ParameterHelper.Typed<FileCompressParameters>(Parameters)!;
set => Parameters = ParameterHelper.Untyped(value);
}
public List<string> FileIds
{
get => Parameters.GetValueOrDefault("fileIds") as List<string> ?? [];
set => Parameters["fileIds"] = value;
get => TypedParameters.FileIds;
set
{
var parameters = TypedParameters;
parameters.FileIds = value;
TypedParameters = parameters;
}
}
[MaxLength(32)]
public string CompressionFormat
{
get => Parameters.GetValueOrDefault("compressionFormat") as string ?? "zip";
set => Parameters["compressionFormat"] = value;
get => TypedParameters.CompressionFormat;
set
{
var parameters = TypedParameters;
parameters.CompressionFormat = value;
TypedParameters = parameters;
}
}
public int CompressionLevel
{
get => Convert.ToInt32(Parameters.GetValueOrDefault("compressionLevel") ?? 6);
set => Parameters["compressionLevel"] = value;
get => TypedParameters.CompressionLevel;
set
{
var parameters = TypedParameters;
parameters.CompressionLevel = value;
TypedParameters = parameters;
}
}
public string? OutputFileName
{
get => Parameters.GetValueOrDefault("outputFileName") as string;
set => Parameters["outputFileName"] = value;
get => TypedParameters.OutputFileName;
set
{
var parameters = TypedParameters;
parameters.OutputFileName = value;
TypedParameters = parameters;
}
}
public int FilesProcessed
{
get => Convert.ToInt32(Parameters.GetValueOrDefault("filesProcessed") ?? 0);
get => TypedParameters.FilesProcessed;
set
{
Parameters["filesProcessed"] = value;
var parameters = TypedParameters;
parameters.FilesProcessed = value;
TypedParameters = parameters;
Progress = FileIds.Count > 0 ? (double)value / FileIds.Count * 100 : 0;
}
}
public string? ResultFileId
{
get => Results.GetValueOrDefault("resultFileId") as string;
set => Results["resultFileId"] = value;
get => TypedParameters.ResultFileId;
set
{
var parameters = TypedParameters;
parameters.ResultFileId = value;
TypedParameters = parameters;
}
}
}
@@ -303,41 +498,70 @@ public class BulkOperationTask : PersistentTask
Name = "Bulk Operation";
}
// Convenience properties using typed parameters
public BulkOperationParameters TypedParameters
{
get => ParameterHelper.Typed<BulkOperationParameters>(Parameters)!;
set => Parameters = ParameterHelper.Untyped(value);
}
[MaxLength(128)]
public string OperationType
{
get => Parameters.GetValueOrDefault("operationType") as string ?? string.Empty;
set => Parameters["operationType"] = value;
get => TypedParameters.OperationType;
set
{
var parameters = TypedParameters;
parameters.OperationType = value;
TypedParameters = parameters;
}
}
public List<string> TargetIds
{
get => Parameters.GetValueOrDefault("targetIds") as List<string> ?? [];
set => Parameters["targetIds"] = value;
get => TypedParameters.TargetIds;
set
{
var parameters = TypedParameters;
parameters.TargetIds = value;
TypedParameters = parameters;
}
}
[Column(TypeName = "jsonb")]
public Dictionary<string, object?> OperationParameters
{
get => Parameters.GetValueOrDefault("operationParameters") as Dictionary<string, object?> ?? new();
set => Parameters["operationParameters"] = value;
get => TypedParameters.OperationParameters;
set
{
var parameters = TypedParameters;
parameters.OperationParameters = value;
TypedParameters = parameters;
}
}
public int ItemsProcessed
{
get => Convert.ToInt32(Parameters.GetValueOrDefault("itemsProcessed") ?? 0);
get => TypedParameters.ItemsProcessed;
set
{
Parameters["itemsProcessed"] = value;
var parameters = TypedParameters;
parameters.ItemsProcessed = value;
TypedParameters = parameters;
Progress = TargetIds.Count > 0 ? (double)value / TargetIds.Count * 100 : 0;
}
}
[Column(TypeName = "jsonb")]
public Dictionary<string, object?> OperationResults
public Dictionary<string, object?>? OperationResults
{
get => Results.GetValueOrDefault("operationResults") as Dictionary<string, object?> ?? new();
set => Results["operationResults"] = value;
get => TypedParameters.OperationResults;
set
{
var parameters = TypedParameters;
parameters.OperationResults = value;
TypedParameters = parameters;
}
}
}
@@ -350,50 +574,89 @@ public class StorageMigrationTask : PersistentTask
Name = "Storage Migration";
}
// Convenience properties using typed parameters
public StorageMigrationParameters TypedParameters
{
get => ParameterHelper.Typed<StorageMigrationParameters>(Parameters)!;
set => Parameters = ParameterHelper.Untyped(value);
}
public Guid SourcePoolId
{
get => Guid.Parse(Parameters.GetValueOrDefault("sourcePoolId") as string ?? Guid.Empty.ToString());
set => Parameters["sourcePoolId"] = value.ToString();
get => TypedParameters.SourcePoolId;
set
{
var parameters = TypedParameters;
parameters.SourcePoolId = value;
TypedParameters = parameters;
}
}
public Guid TargetPoolId
{
get => Guid.Parse(Parameters.GetValueOrDefault("targetPoolId") as string ?? Guid.Empty.ToString());
set => Parameters["targetPoolId"] = value.ToString();
get => TypedParameters.TargetPoolId;
set
{
var parameters = TypedParameters;
parameters.TargetPoolId = value;
TypedParameters = parameters;
}
}
public List<string> FileIds
{
get => Parameters.GetValueOrDefault("fileIds") as List<string> ?? [];
set => Parameters["fileIds"] = value;
get => TypedParameters.FileIds;
set
{
var parameters = TypedParameters;
parameters.FileIds = value;
TypedParameters = parameters;
}
}
public bool PreserveOriginals
{
get => Convert.ToBoolean(Parameters.GetValueOrDefault("preserveOriginals") ?? true);
set => Parameters["preserveOriginals"] = value;
get => TypedParameters.PreserveOriginals;
set
{
var parameters = TypedParameters;
parameters.PreserveOriginals = value;
TypedParameters = parameters;
}
}
public long TotalBytesToTransfer
{
get => Convert.ToInt64(Parameters.GetValueOrDefault("totalBytesToTransfer") ?? 0L);
set => Parameters["totalBytesToTransfer"] = value;
get => TypedParameters.TotalBytesToTransfer;
set
{
var parameters = TypedParameters;
parameters.TotalBytesToTransfer = value;
TypedParameters = parameters;
}
}
public long BytesTransferred
{
get => Convert.ToInt64(Parameters.GetValueOrDefault("bytesTransferred") ?? 0L);
get => TypedParameters.BytesTransferred;
set
{
Parameters["bytesTransferred"] = value;
var parameters = TypedParameters;
parameters.BytesTransferred = value;
TypedParameters = parameters;
Progress = TotalBytesToTransfer > 0 ? (double)value / TotalBytesToTransfer * 100 : 0;
}
}
public int FilesMigrated
{
get => Convert.ToInt32(Parameters.GetValueOrDefault("filesMigrated") ?? 0);
set => Parameters["filesMigrated"] = value;
get => TypedParameters.FilesMigrated;
set
{
var parameters = TypedParameters;
parameters.FilesMigrated = value;
TypedParameters = parameters;
}
}
}
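The progress-coupled setters above (`ChunksUploaded`, `FilesProcessed`, `BytesTransferred`) all recompute `Progress` with the same formula; extracted as a pure function:

```csharp
// Shared progress formula: percentage of completed units, guarding
// against division by zero when the total is not yet known.
static double PercentComplete(long done, long total) =>
    total > 0 ? (double)done / total * 100 : 0;
```

For example, `PercentComplete(5, 10)` yields `50`, and `PercentComplete(0, 0)` yields `0` rather than `NaN`.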

View File

@@ -26,7 +26,7 @@ public class PersistentTaskService(
/// </summary>
public async Task<T> CreateTaskAsync<T>(T task) where T : PersistentTask
{
task.TaskId = NanoidDotNet.Nanoid.Generate();
task.TaskId = await Nanoid.GenerateAsync();
var now = SystemClock.Instance.GetCurrentInstant();
task.CreatedAt = now;
task.UpdatedAt = now;
@@ -45,7 +45,7 @@ public class PersistentTaskService(
/// <summary>
/// Gets a task by ID
/// </summary>
public async Task<T?> GetTaskAsync<T>(string taskId) where T : PersistentTask
private async Task<T?> GetTaskAsync<T>(string taskId) where T : PersistentTask
{
var cacheKey = $"{CacheKeyPrefix}{taskId}";
var cachedTask = await cache.GetAsync<T>(cacheKey);
@@ -55,13 +55,9 @@ public class PersistentTaskService(
var task = await db.Tasks
.FirstOrDefaultAsync(t => t.TaskId == taskId);
if (task is T typedTask)
{
await SetCacheAsync(typedTask);
return typedTask;
}
return null;
if (task is not T typedTask) return null;
await SetCacheAsync(typedTask);
return typedTask;
}
/// <summary>
@@ -73,20 +69,35 @@ public class PersistentTaskService(
if (task is null) return;
var previousProgress = task.Progress;
task.Progress = Math.Clamp(progress, 0, 100);
task.LastActivity = SystemClock.Instance.GetCurrentInstant();
task.UpdatedAt = task.LastActivity;
var delta = progress - previousProgress;
var clampedProgress = Math.Clamp(progress, 0, 1.0);
var now = SystemClock.Instance.GetCurrentInstant();
// Update the in-memory task
task.Progress = clampedProgress;
task.LastActivity = now;
task.UpdatedAt = now;
if (statusMessage is not null)
{
task.Description = statusMessage;
}
await db.SaveChangesAsync();
await SetCacheAsync(task);
// Send progress update notification
await SendTaskProgressUpdateAsync(task, task.Progress, previousProgress);
// Only persist when the task is (nearly) complete or progress advanced by more than 5 points
// Use ExecuteUpdateAsync for better performance - update only the fields we need
if (Math.Abs(progress - 1) < 0.1 || delta * 100 > 5)
{
await db.Tasks
.Where(t => t.TaskId == taskId)
.ExecuteUpdateAsync(setters => setters
.SetProperty(t => t.Progress, clampedProgress)
.SetProperty(t => t.LastActivity, now)
.SetProperty(t => t.UpdatedAt, now)
.SetProperty(t => t.Description, t => statusMessage ?? t.Description)
);
}
}
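The write-throttle condition in `UpdateTaskProgressAsync` can be read as a pure predicate: persist when progress is within 0.1 of complete, or when it advanced by more than 5 points on the incoming scale. Thresholds below are copied from the diff:

```csharp
using System;

// Throttle predicate for database writes during progress updates.
// progress/previousProgress use the raw incoming scale from the caller.
static bool ShouldPersist(double progress, double previousProgress)
{
    var delta = progress - previousProgress;
    return Math.Abs(progress - 1) < 0.1 || delta * 100 > 5;
}
```

This keeps per-chunk updates cheap (cache-only) while still flushing milestones and completion to the database via `ExecuteUpdateAsync`.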
/// <summary>
@@ -98,24 +109,38 @@ public class PersistentTaskService(
if (task is null) return;
var now = SystemClock.Instance.GetCurrentInstant();
task.Status = TaskStatus.Completed;
task.Progress = 100;
task.CompletedAt = now;
task.LastActivity = now;
task.UpdatedAt = now;
if (results is not null)
// Use ExecuteUpdateAsync for better performance - update only the fields we need
var updatedRows = await db.Tasks
.Where(t => t.TaskId == taskId)
.ExecuteUpdateAsync(setters => setters
.SetProperty(t => t.Status, TaskStatus.Completed)
.SetProperty(t => t.Progress, 1.0)
.SetProperty(t => t.CompletedAt, now)
.SetProperty(t => t.LastActivity, now)
.SetProperty(t => t.UpdatedAt, now)
);
if (updatedRows > 0)
{
foreach (var (key, value) in results)
// Update the in-memory task with results for the completion notification
task.Status = TaskStatus.Completed;
task.Progress = 1.0;
task.CompletedAt = now;
task.LastActivity = now;
task.UpdatedAt = now;
if (results is not null)
{
task.Results[key] = value;
foreach (var (key, value) in results)
{
task.Results[key] = value;
}
}
await RemoveCacheAsync(taskId);
await SendTaskCompletedNotificationAsync(task);
}
await db.SaveChangesAsync();
await RemoveCacheAsync(taskId);
await SendTaskCompletedNotificationAsync(task);
}
/// <summary>
@@ -126,15 +151,30 @@ public class PersistentTaskService(
var task = await GetTaskAsync<PersistentTask>(taskId);
if (task is null) return;
task.Status = TaskStatus.Failed;
task.ErrorMessage = errorMessage ?? "Task failed due to an unknown error";
task.LastActivity = SystemClock.Instance.GetCurrentInstant();
task.UpdatedAt = task.LastActivity;
var now = SystemClock.Instance.GetCurrentInstant();
var errorMsg = errorMessage ?? "Task failed due to an unknown error";
await db.SaveChangesAsync();
await RemoveCacheAsync(taskId);
// Use ExecuteUpdateAsync for better performance - update only the fields we need
var updatedRows = await db.Tasks
.Where(t => t.TaskId == taskId)
.ExecuteUpdateAsync(setters => setters
.SetProperty(t => t.Status, TaskStatus.Failed)
.SetProperty(t => t.ErrorMessage, errorMsg)
.SetProperty(t => t.LastActivity, now)
.SetProperty(t => t.UpdatedAt, now)
);
await SendTaskFailedNotificationAsync(task);
if (updatedRows > 0)
{
// Update the in-memory task for the failure notification
task.Status = TaskStatus.Failed;
task.ErrorMessage = errorMsg;
task.LastActivity = now;
task.UpdatedAt = now;
await RemoveCacheAsync(taskId);
await SendTaskFailedNotificationAsync(task);
}
}
/// <summary>
@@ -233,17 +273,17 @@ public class PersistentTaskService(
? query.OrderByDescending(t => t.Progress)
: query.OrderBy(t => t.Progress);
break;
case "createdat":
case "created":
orderedQuery = sortDescending
? query.OrderByDescending(t => t.CreatedAt)
: query.OrderBy(t => t.CreatedAt);
break;
case "updatedat":
case "updated":
orderedQuery = sortDescending
? query.OrderByDescending(t => t.UpdatedAt)
: query.OrderBy(t => t.UpdatedAt);
break;
case "lastactivity":
case "activity":
default:
orderedQuery = sortDescending
? query.OrderByDescending(t => t.LastActivity)
@@ -281,20 +321,20 @@ public class PersistentTaskService(
ExpiredTasks = tasks.Count(t => t.Status == TaskStatus.Expired),
AverageProgress = tasks.Any(t => t.Status == TaskStatus.InProgress || t.Status == TaskStatus.Paused)
? tasks.Where(t => t.Status == TaskStatus.InProgress || t.Status == TaskStatus.Paused)
.Average(t => t.Progress)
.Average(t => t.Progress)
: 0,
RecentActivity = tasks.OrderByDescending(t => t.LastActivity)
.Take(10)
.Select(t => new TaskActivity
{
TaskId = t.TaskId,
Name = t.Name,
Type = t.Type,
Status = t.Status,
Progress = t.Progress,
LastActivity = t.LastActivity
})
.ToList()
.Take(10)
.Select(t => new TaskActivity
{
TaskId = t.TaskId,
Name = t.Name,
Type = t.Type,
Status = t.Status,
Progress = t.Progress,
LastActivity = t.LastActivity
})
.ToList()
};
return stats;
@@ -314,11 +354,11 @@ public class PersistentTaskService(
var oldTasks = await db.Tasks
.Where(t => t.AccountId == accountId &&
(t.Status == TaskStatus.Completed ||
t.Status == TaskStatus.Failed ||
t.Status == TaskStatus.Cancelled ||
t.Status == TaskStatus.Expired) &&
t.UpdatedAt < cutoff)
(t.Status == TaskStatus.Completed ||
t.Status == TaskStatus.Failed ||
t.Status == TaskStatus.Cancelled ||
t.Status == TaskStatus.Expired) &&
t.UpdatedAt < cutoff)
.ToListAsync();
db.Tasks.RemoveRange(oldTasks);
@@ -344,13 +384,14 @@ public class PersistentTaskService(
TaskId = task.TaskId,
Name = task.Name,
Type = task.Type.ToString(),
CreatedAt = task.CreatedAt.ToString("O", null)
Parameters = task.Parameters,
CreatedAt = task.CreatedAt.ToString()
};
var packet = new WebSocketPacket
{
Type = "task.created",
Data = Google.Protobuf.ByteString.CopyFromUtf8(System.Text.Json.JsonSerializer.Serialize(data))
Data = GrpcTypeHelper.ConvertObjectToByteString(data)
};
await ringService.PushWebSocketPacketAsync(new PushWebSocketPacketRequest
@@ -369,10 +410,6 @@ public class PersistentTaskService(
{
try
{
// Only send significant progress updates (every 5% or major milestones)
if (Math.Abs(newProgress - previousProgress) < 5 && newProgress < 100 && newProgress > 0)
return;
var data = new TaskProgressData
{
TaskId = task.TaskId,
@@ -380,13 +417,13 @@ public class PersistentTaskService(
Type = task.Type.ToString(),
Progress = newProgress,
Status = task.Status.ToString(),
LastActivity = task.LastActivity.ToString("O", null)
LastActivity = task.LastActivity.ToString()
};
var packet = new WebSocketPacket
{
Type = "task.progress",
Data = Google.Protobuf.ByteString.CopyFromUtf8(System.Text.Json.JsonSerializer.Serialize(data))
Data = GrpcTypeHelper.ConvertObjectToByteString(data)
};
await ringService.PushWebSocketPacketAsync(new PushWebSocketPacketRequest
@@ -410,7 +447,7 @@ public class PersistentTaskService(
TaskId = task.TaskId,
Name = task.Name,
Type = task.Type.ToString(),
CompletedAt = task.CompletedAt?.ToString("O", null) ?? task.UpdatedAt.ToString("O", null),
CompletedAt = task.CompletedAt?.ToString() ?? task.UpdatedAt.ToString(),
Results = task.Results
};
@@ -418,7 +455,7 @@ public class PersistentTaskService(
var wsPacket = new WebSocketPacket
{
Type = "task.completed",
Data = Google.Protobuf.ByteString.CopyFromUtf8(System.Text.Json.JsonSerializer.Serialize(data))
Data = GrpcTypeHelper.ConvertObjectToByteString(data)
};
await ringService.PushWebSocketPacketAsync(new PushWebSocketPacketRequest
@@ -426,22 +463,6 @@ public class PersistentTaskService(
UserId = task.AccountId.ToString(),
Packet = wsPacket
});
// Push notification
var pushNotification = new PushNotification
{
Topic = "task",
Title = "Task Completed",
Subtitle = task.Name,
Body = $"Your {task.Type.ToString().ToLower()} task has completed successfully.",
IsSavable = true
};
await ringService.SendPushNotificationToUserAsync(new SendPushNotificationToUserRequest
{
UserId = task.AccountId.ToString(),
Notification = pushNotification
});
}
catch (Exception ex)
{
@@ -458,7 +479,7 @@ public class PersistentTaskService(
TaskId = task.TaskId,
Name = task.Name,
Type = task.Type.ToString(),
FailedAt = task.UpdatedAt.ToString("O", null),
FailedAt = task.UpdatedAt.ToString(),
ErrorMessage = task.ErrorMessage ?? "Task failed due to an unknown error"
};
@@ -466,7 +487,7 @@ public class PersistentTaskService(
var wsPacket = new WebSocketPacket
{
Type = "task.failed",
Data = Google.Protobuf.ByteString.CopyFromUtf8(System.Text.Json.JsonSerializer.Serialize(data))
Data = GrpcTypeHelper.ConvertObjectToByteString(data)
};
await ringService.PushWebSocketPacketAsync(new PushWebSocketPacketRequest
@@ -478,8 +499,8 @@ public class PersistentTaskService(
// Push notification
var pushNotification = new PushNotification
{
Topic = "task",
Title = "Task Failed",
Topic = "drive.tasks",
Title = "Drive Task Failed",
Subtitle = task.Name,
Body = $"Your {task.Type.ToString().ToLower()} task has failed.",
IsSavable = true
@@ -504,6 +525,8 @@ public class PersistentTaskService(
private async Task SetCacheAsync(PersistentTask task)
{
var cacheKey = $"{CacheKeyPrefix}{task.TaskId}";
// Cache the entire task object directly, Parameters dictionary included
await cache.SetAsync(cacheKey, task, CacheDuration);
}
@@ -514,6 +537,489 @@ public class PersistentTaskService(
}
#endregion
#region Upload-Specific Methods
/// <summary>
/// Gets the first available pool ID, or creates a default one if none exist
/// </summary>
private async Task<Guid> GetFirstAvailablePoolIdAsync()
{
// Try to get the first available pool
var firstPool = await db.Pools
.Where(p => p.PolicyConfig.PublicUsable)
.OrderBy(p => p.CreatedAt)
.FirstOrDefaultAsync();
if (firstPool != null)
{
return firstPool.Id;
}
// If no pools exist, create a default one
logger.LogWarning("No pools found in database. Creating default pool...");
var defaultPoolId = Guid.NewGuid();
var defaultPool = new DysonNetwork.Shared.Models.FilePool
{
Id = defaultPoolId,
Name = "Default Storage Pool",
Description = "Automatically created default storage pool",
StorageConfig = new DysonNetwork.Shared.Models.RemoteStorageConfig
{
Region = "auto",
Bucket = "solar-network-development",
Endpoint = "localhost:9000",
SecretId = "littlesheep",
SecretKey = "password",
EnableSigned = true,
EnableSsl = false
},
BillingConfig = new DysonNetwork.Shared.Models.BillingConfig
{
CostMultiplier = 1.0
},
PolicyConfig = new DysonNetwork.Shared.Models.PolicyConfig
{
EnableFastUpload = true,
EnableRecycle = true,
PublicUsable = true,
AllowEncryption = true,
AllowAnonymous = true,
AcceptTypes = new List<string> { "*/*" },
MaxFileSize = 1024L * 1024 * 1024 * 10, // 10GB
RequirePrivilege = 0
},
IsHidden = false,
AccountId = null,
CreatedAt = SystemClock.Instance.GetCurrentInstant(),
UpdatedAt = SystemClock.Instance.GetCurrentInstant()
};
db.Pools.Add(defaultPool);
await db.SaveChangesAsync();
logger.LogInformation("Created default pool with ID: {PoolId}", defaultPoolId);
return defaultPoolId;
}
/// <summary>
/// Creates a new persistent upload task
/// </summary>
public async Task<PersistentUploadTask> CreateUploadTaskAsync(
string taskId,
CreateUploadTaskRequest request,
Guid accountId
)
{
var chunkSize = request.ChunkSize ?? 1024 * 1024 * 5; // 5MB default
var chunksCount = (int)Math.Ceiling((double)request.FileSize / chunkSize);
// If the second chunk is too small (less than 1MB), merge it with the first chunk
if (chunksCount == 2 && (request.FileSize - chunkSize) < 1024 * 1024)
{
chunksCount = 1;
chunkSize = request.FileSize;
}
// Use the default pool if no pool is specified, or find the first available pool
var poolId = request.PoolId ?? await GetFirstAvailablePoolIdAsync();
var uploadTask = new PersistentUploadTask
{
TaskId = taskId,
FileName = request.FileName,
FileSize = request.FileSize,
ContentType = request.ContentType,
ChunkSize = chunkSize,
ChunksCount = chunksCount,
ChunksUploaded = 0,
PoolId = poolId,
BundleId = request.BundleId,
EncryptPassword = request.EncryptPassword,
ExpiredAt = request.ExpiredAt,
Hash = request.Hash,
Path = request.Path,
AccountId = accountId,
Status = TaskStatus.InProgress,
UploadedChunks = [],
LastActivity = SystemClock.Instance.GetCurrentInstant()
};
db.Tasks.Add(uploadTask);
await db.SaveChangesAsync();
await SetCacheAsync(uploadTask);
await SendTaskCreatedNotificationAsync(uploadTask);
return uploadTask;
}
/// <summary>
/// Gets an existing upload task by ID
/// </summary>
public async Task<PersistentUploadTask?> GetUploadTaskAsync(string taskId)
{
var cacheKey = $"{CacheKeyPrefix}{taskId}";
var cachedTask = await cache.GetAsync<PersistentUploadTask>(cacheKey);
if (cachedTask is not null)
return cachedTask;
var task = await db.Tasks
.OfType<PersistentUploadTask>()
.FirstOrDefaultAsync(t => t.TaskId == taskId && t.Status == TaskStatus.InProgress);
if (task is not null)
await SetCacheAsync(task);
return task;
}
/// <summary>
/// Updates chunk upload progress
/// </summary>
public async Task UpdateChunkProgressAsync(string taskId, int chunkIndex)
{
var task = await GetUploadTaskAsync(taskId);
if (task is null) return;
if (!task.UploadedChunks.Contains(chunkIndex))
{
var previousProgress = task.ChunksCount > 0 ? (double)task.ChunksUploaded / task.ChunksCount * 100 : 0;
// Get current parameters and update them directly
var parameters = task.TypedParameters;
if (!parameters.UploadedChunks.Contains(chunkIndex))
{
parameters.UploadedChunks.Add(chunkIndex);
parameters.ChunksUploaded = parameters.UploadedChunks.Count;
var now = SystemClock.Instance.GetCurrentInstant();
// Use ExecuteUpdateAsync to update the Parameters dictionary directly
var updatedRows = await db.Tasks
.OfType<PersistentUploadTask>()
.Where(t => t.TaskId == taskId)
.ExecuteUpdateAsync(setters => setters
.SetProperty(t => t.Parameters, ParameterHelper.Untyped(parameters))
.SetProperty(t => t.LastActivity, now)
.SetProperty(t => t.UpdatedAt, now)
);
if (updatedRows > 0)
{
// Update the cached task
task.UploadedChunks.Add(chunkIndex);
task.ChunksUploaded = task.UploadedChunks.Count;
task.LastActivity = now;
task.UpdatedAt = now;
await SetCacheAsync(task);
// Send real-time progress update
var newProgress = task.ChunksCount > 0 ? (double)task.ChunksUploaded / task.ChunksCount * 100 : 0;
await SendUploadProgressUpdateAsync(task, newProgress, previousProgress);
}
}
}
}
/// <summary>
/// Checks if a chunk has already been uploaded
/// </summary>
public async Task<bool> IsChunkUploadedAsync(string taskId, int chunkIndex)
{
var task = await GetUploadTaskAsync(taskId);
return task?.UploadedChunks.Contains(chunkIndex) ?? false;
}
/// <summary>
/// Gets upload progress as percentage
/// </summary>
public async Task<double> GetUploadProgressAsync(string taskId)
{
var task = await GetUploadTaskAsync(taskId);
if (task is null || task.ChunksCount == 0) return 0;
return (double)task.ChunksUploaded / task.ChunksCount * 100;
}
/// <summary>
/// Gets user upload tasks with filtering and pagination
/// </summary>
public async Task<(List<PersistentUploadTask> Items, int TotalCount)> GetUserUploadTasksAsync(
Guid accountId,
UploadTaskStatus? status = null,
string? sortBy = "lastActivity",
bool sortDescending = true,
int offset = 0,
int limit = 50
)
{
var query = db.Tasks.OfType<PersistentUploadTask>().Where(t => t.AccountId == accountId);
// Apply status filter
if (status.HasValue)
{
query = query.Where(t => t.Status == (TaskStatus)status.Value);
}
// Get total count
var totalCount = await query.CountAsync();
// Apply sorting
IOrderedQueryable<PersistentUploadTask> orderedQuery;
switch (sortBy?.ToLower())
{
case "filename":
orderedQuery = sortDescending
? query.OrderByDescending(t => t.FileName)
: query.OrderBy(t => t.FileName);
break;
case "filesize":
orderedQuery = sortDescending
? query.OrderByDescending(t => t.FileSize)
: query.OrderBy(t => t.FileSize);
break;
case "created":
orderedQuery = sortDescending
? query.OrderByDescending(t => t.CreatedAt)
: query.OrderBy(t => t.CreatedAt);
break;
case "updated":
orderedQuery = sortDescending
? query.OrderByDescending(t => t.UpdatedAt)
: query.OrderBy(t => t.UpdatedAt);
break;
case "activity":
default:
orderedQuery = sortDescending
? query.OrderByDescending(t => t.LastActivity)
: query.OrderBy(t => t.LastActivity);
break;
}
// Apply pagination
var items = await orderedQuery
.Skip(offset)
.Take(limit)
.ToListAsync();
return (items, totalCount);
}
/// <summary>
/// Gets upload statistics for a user
/// </summary>
public async Task<UserUploadStats> GetUserUploadStatsAsync(Guid accountId)
{
var tasks = await db.Tasks
.OfType<PersistentUploadTask>()
.Where(t => t.AccountId == accountId)
.ToListAsync();
var stats = new UserUploadStats
{
TotalTasks = tasks.Count,
InProgressTasks = tasks.Count(t => t.Status == TaskStatus.InProgress),
CompletedTasks = tasks.Count(t => t.Status == TaskStatus.Completed),
FailedTasks = tasks.Count(t => t.Status == TaskStatus.Failed),
ExpiredTasks = tasks.Count(t => t.Status == TaskStatus.Expired),
TotalUploadedBytes = tasks.Sum(t => (long)t.ChunksUploaded * t.ChunkSize),
AverageProgress = tasks.Any(t => t.Status == TaskStatus.InProgress)
? tasks.Where(t => t.Status == TaskStatus.InProgress)
.Average(t => t.ChunksCount > 0 ? (double)t.ChunksUploaded / t.ChunksCount * 100 : 0)
: 0,
RecentActivity = tasks.OrderByDescending(t => t.LastActivity)
.Take(5)
.Select(t => new RecentActivity
{
TaskId = t.TaskId,
FileName = t.FileName,
Status = (UploadTaskStatus)t.Status,
LastActivity = t.LastActivity,
Progress = t.ChunksCount > 0 ? (double)t.ChunksUploaded / t.ChunksCount * 100 : 0
})
.ToList()
};
return stats;
}
/// <summary>
/// Cleans up failed tasks for a user
/// </summary>
public async Task<int> CleanupUserFailedTasksAsync(Guid accountId)
{
var failedTasks = await db.Tasks
.OfType<PersistentUploadTask>()
.Where(t => t.AccountId == accountId &&
(t.Status == TaskStatus.Failed || t.Status == TaskStatus.Expired))
.ToListAsync();
foreach (var task in failedTasks)
{
await RemoveCacheAsync(task.TaskId);
// Clean up temp files
var taskPath = Path.Combine(Path.GetTempPath(), "multipart-uploads", task.TaskId);
if (!Directory.Exists(taskPath)) continue;
try
{
Directory.Delete(taskPath, true);
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to cleanup temp files for task {TaskId}", task.TaskId);
}
}
db.Tasks.RemoveRange(failedTasks);
await db.SaveChangesAsync();
return failedTasks.Count;
}
/// <summary>
/// Gets recent tasks for a user
/// </summary>
public async Task<List<PersistentUploadTask>> GetRecentUserTasksAsync(Guid accountId, int limit = 10)
{
return await db.Tasks
.OfType<PersistentUploadTask>()
.Where(t => t.AccountId == accountId)
.OrderByDescending(t => t.LastActivity)
.Take(limit)
.ToListAsync();
}
/// <summary>
/// Sends upload completion notification
/// </summary>
public async Task SendUploadCompletedNotificationAsync(PersistentUploadTask task, string fileId)
{
try
{
var completionData = new UploadCompletionData
{
TaskId = task.TaskId,
FileId = fileId,
FileName = task.FileName,
FileSize = task.FileSize,
CompletedAt = SystemClock.Instance.GetCurrentInstant().ToString()
};
// Send WebSocket notification
var wsPacket = new WebSocketPacket
{
Type = "upload.completed",
Data = GrpcTypeHelper.ConvertObjectToByteString(completionData)
};
await ringService.PushWebSocketPacketAsync(new PushWebSocketPacketRequest
{
UserId = task.AccountId.ToString(),
Packet = wsPacket
});
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to send upload completion notification for task {TaskId}", task.TaskId);
}
}
/// <summary>
/// Sends upload failure notification
/// </summary>
public async Task SendUploadFailedNotificationAsync(PersistentUploadTask task, string? errorMessage = null)
{
try
{
var failureData = new UploadFailureData
{
TaskId = task.TaskId,
FileName = task.FileName,
FileSize = task.FileSize,
FailedAt = SystemClock.Instance.GetCurrentInstant().ToString(),
ErrorMessage = errorMessage ?? "Upload failed due to an unknown error"
};
// Send WebSocket notification
var wsPacket = new WebSocketPacket
{
Type = "upload.failed",
Data = GrpcTypeHelper.ConvertObjectToByteString(failureData)
};
await ringService.PushWebSocketPacketAsync(new PushWebSocketPacketRequest
{
UserId = task.AccountId.ToString(),
Packet = wsPacket
});
// Send push notification
var pushNotification = new PushNotification
{
Topic = "drive.tasks.upload",
Title = "Upload Failed",
Subtitle = task.FileName,
Body = $"Your file '{task.FileName}' upload has failed. You can try again.",
IsSavable = true
};
await ringService.SendPushNotificationToUserAsync(new SendPushNotificationToUserRequest
{
UserId = task.AccountId.ToString(),
Notification = pushNotification
});
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to send upload failure notification for task {TaskId}", task.TaskId);
}
}
/// <summary>
/// Sends real-time upload progress update via WebSocket
/// </summary>
private async Task SendUploadProgressUpdateAsync(PersistentUploadTask task, double newProgress,
double previousProgress)
{
try
{
// Only send significant progress updates (every 5% or major milestones)
if (Math.Abs(newProgress - previousProgress) < 5 && newProgress < 100)
return;
var progressData = new UploadProgressData
{
TaskId = task.TaskId,
FileName = task.FileName,
FileSize = task.FileSize,
ChunksUploaded = task.ChunksUploaded,
ChunksTotal = task.ChunksCount,
Progress = newProgress,
Status = task.Status.ToString(),
LastActivity = task.LastActivity.ToString()
};
var packet = new WebSocketPacket
{
Type = "upload.progress",
Data = GrpcTypeHelper.ConvertObjectToByteString(progressData)
};
await ringService.PushWebSocketPacketAsync(new PushWebSocketPacketRequest
{
UserId = task.AccountId.ToString(),
Packet = packet
});
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to send upload progress update for task {TaskId}", task.TaskId);
}
}
#endregion
}
#region Data Transfer Objects
@@ -524,6 +1030,7 @@ public class TaskCreatedData
public string Name { get; set; } = null!;
public string Type { get; set; } = null!;
public string CreatedAt { get; set; } = null!;
public Dictionary<string, object?>? Parameters { get; set; }
}
public class TaskProgressData
@@ -579,3 +1086,58 @@ public class TaskActivity
}
#endregion
#region Upload-Specific Data Transfer Objects
public class UploadProgressData
{
public string TaskId { get; set; } = null!;
public string FileName { get; set; } = null!;
public long FileSize { get; set; }
public int ChunksUploaded { get; set; }
public int ChunksTotal { get; set; }
public double Progress { get; set; }
public string Status { get; set; } = null!;
public string LastActivity { get; set; } = null!;
}
public class UploadCompletionData
{
public string TaskId { get; set; } = null!;
public string FileId { get; set; } = null!;
public string FileName { get; set; } = null!;
public long FileSize { get; set; }
public string CompletedAt { get; set; } = null!;
}
public class UploadFailureData
{
public string TaskId { get; set; } = null!;
public string FileName { get; set; } = null!;
public long FileSize { get; set; }
public string FailedAt { get; set; } = null!;
public string ErrorMessage { get; set; } = null!;
}
public class UserUploadStats
{
public int TotalTasks { get; set; }
public int InProgressTasks { get; set; }
public int CompletedTasks { get; set; }
public int FailedTasks { get; set; }
public int ExpiredTasks { get; set; }
public long TotalUploadedBytes { get; set; }
public double AverageProgress { get; set; }
public List<RecentActivity> RecentActivity { get; set; } = new();
}
public class RecentActivity
{
public string TaskId { get; set; } = null!;
public string FileName { get; set; } = null!;
public UploadTaskStatus Status { get; set; }
public Instant LastActivity { get; set; }
public double Progress { get; set; }
}
#endregion


@@ -1,567 +0,0 @@
using DysonNetwork.Drive.Storage.Model;
using DysonNetwork.Shared.Cache;
using DysonNetwork.Shared.Proto;
using Microsoft.EntityFrameworkCore;
using NodaTime;
using System.Text.Json;
using TaskStatus = DysonNetwork.Drive.Storage.Model.TaskStatus;
namespace DysonNetwork.Drive.Storage;
public class PersistentUploadService(
AppDatabase db,
ICacheService cache,
ILogger<PersistentUploadService> logger,
RingService.RingServiceClient ringService
)
{
private const string CacheKeyPrefix = "upload:task:";
private static readonly TimeSpan CacheDuration = TimeSpan.FromMinutes(30);
/// <summary>
/// Creates a new persistent upload task
/// </summary>
public async Task<PersistentUploadTask> CreateUploadTaskAsync(
string taskId,
CreateUploadTaskRequest request,
Guid accountId
)
{
var chunkSize = request.ChunkSize ?? 1024 * 1024 * 5; // 5MB default
var chunksCount = (int)Math.Ceiling((double)request.FileSize / chunkSize);
var uploadTask = new PersistentUploadTask
{
TaskId = taskId,
FileName = request.FileName,
FileSize = request.FileSize,
ContentType = request.ContentType,
ChunkSize = chunkSize,
ChunksCount = chunksCount,
ChunksUploaded = 0,
PoolId = request.PoolId.Value,
BundleId = request.BundleId,
EncryptPassword = request.EncryptPassword,
ExpiredAt = request.ExpiredAt,
Hash = request.Hash,
AccountId = accountId,
Status = Model.TaskStatus.InProgress,
UploadedChunks = new List<int>(),
LastActivity = SystemClock.Instance.GetCurrentInstant()
};
db.UploadTasks.Add(uploadTask);
await db.SaveChangesAsync();
await SetCacheAsync(uploadTask);
return uploadTask;
}
/// <summary>
/// Gets an existing upload task by ID
/// </summary>
public async Task<PersistentUploadTask?> GetUploadTaskAsync(string taskId)
{
var cacheKey = $"{CacheKeyPrefix}{taskId}";
var cachedTask = await cache.GetAsync<PersistentUploadTask>(cacheKey);
if (cachedTask is not null)
return cachedTask;
var task = await db.Tasks
.OfType<PersistentUploadTask>()
.FirstOrDefaultAsync(t => t.TaskId == taskId && t.Status == TaskStatus.InProgress);
if (task is not null)
await SetCacheAsync(task);
return task;
}
/// <summary>
/// Updates chunk upload progress
/// </summary>
public async Task UpdateChunkProgressAsync(string taskId, int chunkIndex)
{
var task = await GetUploadTaskAsync(taskId);
if (task is null) return;
if (!task.UploadedChunks.Contains(chunkIndex))
{
var previousProgress = task.ChunksCount > 0 ? (double)task.ChunksUploaded / task.ChunksCount * 100 : 0;
task.UploadedChunks.Add(chunkIndex);
task.ChunksUploaded = task.UploadedChunks.Count;
task.LastActivity = SystemClock.Instance.GetCurrentInstant();
await db.SaveChangesAsync();
await SetCacheAsync(task);
// Send real-time progress update
var newProgress = task.ChunksCount > 0 ? (double)task.ChunksUploaded / task.ChunksCount * 100 : 0;
await SendUploadProgressUpdateAsync(task, newProgress, previousProgress);
}
}
/// <summary>
/// Marks an upload task as completed
/// </summary>
public async Task MarkTaskCompletedAsync(string taskId)
{
var task = await GetUploadTaskAsync(taskId);
if (task is null) return;
task.Status = Model.TaskStatus.Completed;
task.LastActivity = SystemClock.Instance.GetCurrentInstant();
await db.SaveChangesAsync();
await RemoveCacheAsync(taskId);
}
/// <summary>
/// Marks an upload task as failed
/// </summary>
public async Task MarkTaskFailedAsync(string taskId)
{
var task = await GetUploadTaskAsync(taskId);
if (task is null) return;
task.Status = Model.TaskStatus.Failed;
task.LastActivity = SystemClock.Instance.GetCurrentInstant();
await db.SaveChangesAsync();
await RemoveCacheAsync(taskId);
}
/// <summary>
/// Gets all resumable tasks for an account
/// </summary>
public async Task<List<PersistentUploadTask>> GetResumableTasksAsync(Guid accountId)
{
return await db.Tasks
.OfType<PersistentUploadTask>()
.Where(t => t.AccountId == accountId &&
t.Status == Model.TaskStatus.InProgress &&
t.LastActivity > SystemClock.Instance.GetCurrentInstant() - Duration.FromHours(24))
.OrderByDescending(t => t.LastActivity)
.ToListAsync();
}
/// <summary>
/// Gets user tasks with filtering and pagination
/// </summary>
public async Task<(List<PersistentUploadTask> Items, int TotalCount)> GetUserTasksAsync(
Guid accountId,
UploadTaskStatus? status = null,
string? sortBy = "lastActivity",
bool sortDescending = true,
int offset = 0,
int limit = 50
)
{
var query = db.Tasks.OfType<PersistentUploadTask>().Where(t => t.AccountId == accountId);
// Apply status filter
if (status.HasValue)
{
query = query.Where(t => t.Status == (TaskStatus)status.Value);
}
// Get total count
var totalCount = await query.CountAsync();
// Apply sorting
IOrderedQueryable<PersistentUploadTask> orderedQuery;
switch (sortBy?.ToLower())
{
case "filename":
orderedQuery = sortDescending
? query.OrderByDescending(t => t.FileName)
: query.OrderBy(t => t.FileName);
break;
case "filesize":
orderedQuery = sortDescending
? query.OrderByDescending(t => t.FileSize)
: query.OrderBy(t => t.FileSize);
break;
case "createdat":
orderedQuery = sortDescending
? query.OrderByDescending(t => t.CreatedAt)
: query.OrderBy(t => t.CreatedAt);
break;
case "updatedat":
orderedQuery = sortDescending
? query.OrderByDescending(t => t.UpdatedAt)
: query.OrderBy(t => t.UpdatedAt);
break;
case "lastactivity":
default:
orderedQuery = sortDescending
? query.OrderByDescending(t => t.LastActivity)
: query.OrderBy(t => t.LastActivity);
break;
}
// Apply pagination
var items = await orderedQuery
.Skip(offset)
.Take(limit)
.ToListAsync();
return (items, totalCount);
}
/// <summary>
/// Checks if a chunk has already been uploaded
/// </summary>
public async Task<bool> IsChunkUploadedAsync(string taskId, int chunkIndex)
{
var task = await GetUploadTaskAsync(taskId);
return task?.UploadedChunks.Contains(chunkIndex) ?? false;
}
/// <summary>
/// Cleans up expired/stale upload tasks
/// </summary>
public async Task CleanupStaleTasksAsync()
{
var now = SystemClock.Instance.GetCurrentInstant();
var staleThreshold = now - Duration.FromHours(24); // 24 hours
var staleTasks = await db.Tasks
.OfType<PersistentUploadTask>()
.Where(t => t.Status == Model.TaskStatus.InProgress &&
t.LastActivity < staleThreshold)
.ToListAsync();
foreach (var task in staleTasks)
{
task.Status = Model.TaskStatus.Expired;
await RemoveCacheAsync(task.TaskId);
// Clean up temp files
var taskPath = Path.Combine(Path.GetTempPath(), "multipart-uploads", task.TaskId);
if (Directory.Exists(taskPath))
{
try
{
Directory.Delete(taskPath, true);
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to cleanup temp files for task {TaskId}", task.TaskId);
}
}
}
await db.SaveChangesAsync();
if (staleTasks.Any())
{
logger.LogInformation("Cleaned up {Count} stale upload tasks", staleTasks.Count);
}
}
/// <summary>
/// Gets upload progress as percentage
/// </summary>
public async Task<double> GetUploadProgressAsync(string taskId)
{
var task = await GetUploadTaskAsync(taskId);
if (task is null || task.ChunksCount == 0) return 0;
return (double)task.ChunksUploaded / task.ChunksCount * 100;
}
private async Task SetCacheAsync(PersistentUploadTask task)
{
var cacheKey = $"{CacheKeyPrefix}{task.TaskId}";
await cache.SetAsync(cacheKey, task, CacheDuration);
}
private async Task RemoveCacheAsync(string taskId)
{
var cacheKey = $"{CacheKeyPrefix}{taskId}";
await cache.RemoveAsync(cacheKey);
}
/// <summary>
/// Gets upload statistics for a user
/// </summary>
public async Task<UserUploadStats> GetUserUploadStatsAsync(Guid accountId)
{
var tasks = await db.Tasks
.OfType<PersistentUploadTask>()
.Where(t => t.AccountId == accountId)
.ToListAsync();
var stats = new UserUploadStats
{
TotalTasks = tasks.Count,
InProgressTasks = tasks.Count(t => t.Status == Model.TaskStatus.InProgress),
CompletedTasks = tasks.Count(t => t.Status == Model.TaskStatus.Completed),
FailedTasks = tasks.Count(t => t.Status == Model.TaskStatus.Failed),
ExpiredTasks = tasks.Count(t => t.Status == Model.TaskStatus.Expired),
TotalUploadedBytes = tasks.Sum(t => (long)t.ChunksUploaded * t.ChunkSize),
AverageProgress = tasks.Any(t => t.Status == Model.TaskStatus.InProgress)
? tasks.Where(t => t.Status == Model.TaskStatus.InProgress)
.Average(t => t.ChunksCount > 0 ? (double)t.ChunksUploaded / t.ChunksCount * 100 : 0)
: 0,
RecentActivity = tasks.OrderByDescending(t => t.LastActivity)
.Take(5)
.Select(t => new RecentActivity
{
TaskId = t.TaskId,
FileName = t.FileName,
Status = (UploadTaskStatus)t.Status,
LastActivity = t.LastActivity,
Progress = t.ChunksCount > 0 ? (double)t.ChunksUploaded / t.ChunksCount * 100 : 0
})
.ToList()
};
return stats;
}
/// <summary>
/// Cleans up failed tasks for a user
/// </summary>
public async Task<int> CleanupUserFailedTasksAsync(Guid accountId)
{
var failedTasks = await db.Tasks
.OfType<PersistentUploadTask>()
.Where(t => t.AccountId == accountId &&
(t.Status == Model.TaskStatus.Failed || t.Status == Model.TaskStatus.Expired))
.ToListAsync();
foreach (var task in failedTasks)
{
await RemoveCacheAsync(task.TaskId);
// Clean up temp files
var taskPath = Path.Combine(Path.GetTempPath(), "multipart-uploads", task.TaskId);
if (Directory.Exists(taskPath))
{
try
{
Directory.Delete(taskPath, true);
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to cleanup temp files for task {TaskId}", task.TaskId);
}
}
}
db.Tasks.RemoveRange(failedTasks);
await db.SaveChangesAsync();
return failedTasks.Count;
}
/// <summary>
/// Gets recent tasks for a user
/// </summary>
public async Task<List<PersistentUploadTask>> GetRecentUserTasksAsync(Guid accountId, int limit = 10)
{
return await db.Tasks
.OfType<PersistentUploadTask>()
.Where(t => t.AccountId == accountId)
.OrderByDescending(t => t.LastActivity)
.Take(limit)
.ToListAsync();
}
/// <summary>
/// Sends real-time upload progress update via WebSocket
/// </summary>
private async Task SendUploadProgressUpdateAsync(PersistentUploadTask task, double newProgress, double previousProgress)
{
try
{
// Only send significant progress updates (every 5% or major milestones)
if (Math.Abs(newProgress - previousProgress) < 5 && newProgress < 100)
return;
var progressData = new UploadProgressData
{
TaskId = task.TaskId,
FileName = task.FileName,
FileSize = task.FileSize,
ChunksUploaded = task.ChunksUploaded,
ChunksTotal = task.ChunksCount,
Progress = newProgress,
Status = task.Status.ToString(),
LastActivity = task.LastActivity.ToString("O", null)
};
var packet = new WebSocketPacket
{
Type = "upload.progress",
Data = Google.Protobuf.ByteString.CopyFromUtf8(System.Text.Json.JsonSerializer.Serialize(progressData))
};
await ringService.PushWebSocketPacketAsync(new PushWebSocketPacketRequest
{
UserId = task.AccountId.ToString(),
Packet = packet
});
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to send upload progress update for task {TaskId}", task.TaskId);
}
}
/// <summary>
/// Sends upload completion notification
/// </summary>
public async Task SendUploadCompletedNotificationAsync(PersistentUploadTask task, string fileId)
{
try
{
var completionData = new UploadCompletionData
{
TaskId = task.TaskId,
FileId = fileId,
FileName = task.FileName,
FileSize = task.FileSize,
CompletedAt = SystemClock.Instance.GetCurrentInstant().ToString("O", null)
};
// Send WebSocket notification
var wsPacket = new WebSocketPacket
{
Type = "upload.completed",
Data = Google.Protobuf.ByteString.CopyFromUtf8(System.Text.Json.JsonSerializer.Serialize(completionData))
};
await ringService.PushWebSocketPacketAsync(new PushWebSocketPacketRequest
{
UserId = task.AccountId.ToString(),
Packet = wsPacket
});
// Send push notification
var pushNotification = new PushNotification
{
Topic = "upload",
Title = "Upload Completed",
Subtitle = task.FileName,
Body = $"Your file '{task.FileName}' has been uploaded successfully.",
IsSavable = true
};
await ringService.SendPushNotificationToUserAsync(new SendPushNotificationToUserRequest
{
UserId = task.AccountId.ToString(),
Notification = pushNotification
});
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to send upload completion notification for task {TaskId}", task.TaskId);
}
}
/// <summary>
/// Sends upload failure notification
/// </summary>
public async Task SendUploadFailedNotificationAsync(PersistentUploadTask task, string? errorMessage = null)
{
try
{
var failureData = new UploadFailureData
{
TaskId = task.TaskId,
FileName = task.FileName,
FileSize = task.FileSize,
FailedAt = SystemClock.Instance.GetCurrentInstant().ToString("O", null),
ErrorMessage = errorMessage ?? "Upload failed due to an unknown error"
};
// Send WebSocket notification
var wsPacket = new WebSocketPacket
{
Type = "upload.failed",
Data = Google.Protobuf.ByteString.CopyFromUtf8(System.Text.Json.JsonSerializer.Serialize(failureData))
};
await ringService.PushWebSocketPacketAsync(new PushWebSocketPacketRequest
{
UserId = task.AccountId.ToString(),
Packet = wsPacket
});
// Send push notification
var pushNotification = new PushNotification
{
Topic = "upload",
Title = "Upload Failed",
Subtitle = task.FileName,
Body = $"Your file '{task.FileName}' upload has failed. You can try again.",
IsSavable = true
};
await ringService.SendPushNotificationToUserAsync(new SendPushNotificationToUserRequest
{
UserId = task.AccountId.ToString(),
Notification = pushNotification
});
}
catch (Exception ex)
{
logger.LogWarning(ex, "Failed to send upload failure notification for task {TaskId}", task.TaskId);
}
}
}
public class UploadProgressData
{
public string TaskId { get; set; } = null!;
public string FileName { get; set; } = null!;
public long FileSize { get; set; }
public int ChunksUploaded { get; set; }
public int ChunksTotal { get; set; }
public double Progress { get; set; }
public string Status { get; set; } = null!;
public string LastActivity { get; set; } = null!;
}
public class UploadCompletionData
{
public string TaskId { get; set; } = null!;
public string FileId { get; set; } = null!;
public string FileName { get; set; } = null!;
public long FileSize { get; set; }
public string CompletedAt { get; set; } = null!;
}
public class UploadFailureData
{
public string TaskId { get; set; } = null!;
public string FileName { get; set; } = null!;
public long FileSize { get; set; }
public string FailedAt { get; set; } = null!;
public string ErrorMessage { get; set; } = null!;
}
public class UserUploadStats
{
public int TotalTasks { get; set; }
public int InProgressTasks { get; set; }
public int CompletedTasks { get; set; }
public int FailedTasks { get; set; }
public int ExpiredTasks { get; set; }
public long TotalUploadedBytes { get; set; }
public double AverageProgress { get; set; }
public List<RecentActivity> RecentActivity { get; set; } = new();
}
public class RecentActivity
{
public string TaskId { get; set; } = null!;
public string FileName { get; set; } = null!;
public UploadTaskStatus Status { get; set; }
public Instant LastActivity { get; set; }
public double Progress { get; set; }
}


@@ -1,33 +1,35 @@
# DysonNetwork Drive - Persistent/Resumable Upload System
# DysonNetwork Drive - Persistent Task System
A comprehensive, production-ready file upload system with resumable uploads, real-time progress tracking, and dynamic notifications powered by RingService.
A comprehensive, production-ready generic task system with support for file uploads, background operations, real-time progress tracking, and dynamic notifications powered by RingService.
When accessing the API through the Gateway, replace the `/api` prefix with `/drive`.
Real-time messages are delivered through the WebSocket gateway.
## 🚀 Features
### Core Upload Features
### Core Task Features
- **Generic Task System**: Support for various background operations beyond file uploads
- **Resumable Uploads**: Pause and resume uploads across app restarts
- **Chunked Uploads**: Efficient large file handling with configurable chunk sizes
- **Progress Persistence**: Upload state survives server restarts and network interruptions
- **Progress Persistence**: Task state survives server restarts and network interruptions
- **Duplicate Detection**: Automatic detection of already uploaded files via hash checking
- **Quota Management**: Integration with user quota and billing systems
- **Pool-based Storage**: Support for multiple storage pools with different policies
### Real-Time Features
- **Live Progress Updates**: WebSocket-based real-time progress tracking
- **Completion Notifications**: Instant notifications when uploads complete
- **Failure Alerts**: Immediate notification of upload failures with error details
- **Live Progress Updates**: WebSocket-based real-time progress tracking for all task types
- **Task Lifecycle Notifications**: Instant notifications for task creation, progress, completion, and failure
- **Failure Alerts**: Immediate notification of task failures with error details
- **Push Notifications**: Cross-platform push notifications for mobile/desktop
- **Smart Throttling**: Optimized update frequency to prevent network spam
### Management Features
- **Task Listing**: Comprehensive API for listing and filtering upload tasks
- **Task Statistics**: Detailed analytics and usage statistics
- **Task Listing**: Comprehensive API for listing and filtering all task types
- **Task Statistics**: Detailed analytics and usage statistics for all operations
- **Cleanup Operations**: Automatic and manual cleanup of failed/stale tasks
- **Ownership Verification**: Secure access control for all operations
- **Detailed Task Info**: Rich metadata including speed calculations and ETAs
- **Detailed Task Info**: Rich metadata including progress, parameters, and results
- **Task Lifecycle Management**: Full control over task states (pause, resume, cancel)
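The smart-throttling rule is visible in the service's `SendUploadProgressUpdateAsync`: updates are dropped when the delta is under 5% unless the task just hit 100%. A client consuming progress events can mirror the same check; a minimal sketch (function name is illustrative, not part of the API):

```typescript
// Decide whether a progress update is significant enough to surface.
// Mirrors the server rule: skip deltas under 5% unless progress reached 100%.
function shouldEmitProgress(newProgress: number, previousProgress: number): boolean {
  return Math.abs(newProgress - previousProgress) >= 5 || newProgress >= 100;
}
```

This keeps UI updates and network traffic proportional to meaningful progress rather than per-chunk noise.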
## 📋 Table of Contents
@@ -93,18 +95,29 @@ Creates a new resumable upload task.
**Request Body:**
```json
{
"fileName": "string", // Required: Name of the file
"fileSize": "long", // Required: Size in bytes
"contentType": "string", // Required: MIME type
"poolId": "uuid", // Optional: Storage pool ID
"bundleId": "uuid", // Optional: File bundle ID
"chunkSize": "long", // Optional: Chunk size (default: 5MB)
"encryptPassword": "string", // Optional: Encryption password
"expiredAt": "datetime", // Optional: Expiration date
"hash": "string" // Required: File hash for deduplication
"fileName": "string",
"fileSize": "long",
"contentType": "string",
"poolId": "uuid",
"bundleId": "uuid",
"chunkSize": "long",
"encryptPassword": "string",
"expiredAt": "datetime",
"hash": "string"
}
```
**Field Descriptions:**
- `fileName`: Required - Name of the file
- `fileSize`: Required - Size in bytes
- `contentType`: Required - MIME type
- `poolId`: Optional - Storage pool ID
- `bundleId`: Optional - File bundle ID
- `chunkSize`: Optional - Chunk size (default: 5MB)
- `encryptPassword`: Optional - Encryption password
- `expiredAt`: Optional - Expiration date
- `hash`: Required - File hash for deduplication
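The field list above can be assembled into a typed request on the client. A minimal sketch, assuming only the fields documented above (the `CreateUploadTaskRequest` name and the `chunkCount` helper are illustrative, not part of the API):

```typescript
// Shape of the create-upload-task request body, following the field
// descriptions above. Optional fields may simply be omitted.
interface CreateUploadTaskRequest {
  fileName: string;      // required: name of the file
  fileSize: number;      // required: size in bytes
  contentType: string;   // required: MIME type
  hash: string;          // required: file hash for deduplication
  poolId?: string;
  bundleId?: string;
  chunkSize?: number;    // optional: defaults to 5 MB per the docs
  encryptPassword?: string;
  expiredAt?: string;
}

const DEFAULT_CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB default chunk size

// How many chunks the upload will be split into, given the documented default.
function chunkCount(req: CreateUploadTaskRequest): number {
  const size = req.chunkSize ?? DEFAULT_CHUNK_SIZE;
  if (size <= 0 || req.fileSize < 0) throw new Error("invalid sizes");
  return Math.ceil(req.fileSize / size);
}
```

For example, a 10 MiB file with the default chunk size yields two chunks.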
**Response:**
```json
{
@@ -175,7 +188,7 @@ Gets upload statistics for the current user.
"expiredTasks": 1,
"totalUploadedBytes": 5368709120,
"averageProgress": 67.5,
"recentActivity": []
}
```
@@ -187,56 +200,73 @@ Gets the most recent upload tasks.
## 🔌 WebSocket Events
The system sends real-time updates via WebSocket using RingService. Connect to the WebSocket endpoint and listen for task-related events.
### Event Types
#### `task.created`
Sent when a new task is created.
```json
{
"type": "task.created",
"data": {
"taskId": "task123",
"name": "Upload File",
"type": "FileUpload",
"createdAt": "2025-11-09T02:00:00Z"
}
}
```
#### `task.progress`
Sent when task progress changes significantly (every 5% or major milestones).
```json
{
"type": "task.progress",
"data": {
"taskId": "task123",
"name": "Upload File",
"type": "FileUpload",
"progress": 67.5,
"status": "InProgress",
"lastActivity": "2025-11-09T02:05:00Z"
}
}
```
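The "significantly" rule above (every 5% or at major milestones) can be sketched as a small predicate. This is an assumption about the throttling logic inferred from the description, not the actual server implementation:

```typescript
// Decide whether a progress update is "significant" enough to notify:
// emit when progress crosses a 5% step, or hits the 0%/100% milestones.
function shouldNotify(lastSent: number, current: number): boolean {
  if (current >= 100 || current <= 0) return true; // milestones always notify
  return Math.floor(current / 5) > Math.floor(lastSent / 5); // crossed a 5% step
}
```

So a move from 62% to 64% stays silent, while 62% to 66% crosses the 65% step and fires an event.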
#### `task.completed`
Sent when a task completes successfully.
```json
{
"type": "task.completed",
"data": {
"taskId": "task123",
"name": "Upload File",
"type": "FileUpload",
"completedAt": "2025-11-09T02:10:00Z",
"results": {
"fileId": "file456",
"fileName": "document.pdf",
"fileSize": 10485760
}
}
}
```
#### `task.failed`
Sent when a task fails.
```json
{
"type": "task.failed",
"data": {
"taskId": "task123",
"name": "Upload File",
"type": "FileUpload",
"failedAt": "2025-11-09T02:15:00Z",
"errorMessage": "File processing failed: invalid format"
}
}
@@ -256,18 +286,18 @@ ws.onopen = () => {
}));
};
// Handle task events
ws.onmessage = (event) => {
const packet = JSON.parse(event.data);
switch (packet.type) {
case 'task.progress':
updateProgressBar(packet.data);
break;
case 'task.completed':
showSuccessNotification(packet.data);
break;
case 'task.failed':
showErrorNotification(packet.data);
break;
}
@@ -282,6 +312,10 @@ function updateProgressBar(data) {
}
```
### Note on Upload-Specific Notifications
The system also includes upload-specific notifications (`upload.progress`, `upload.completed`, `upload.failed`) for backward compatibility. However, for new implementations, it's recommended to use the generic task notifications as they provide the same functionality with less object allocation overhead. Since users are typically in the foreground during upload operations, the generic task notifications provide sufficient progress visibility.
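Clients that still receive the legacy upload-specific events can fold them onto the generic handlers with a small lookup. A sketch, assuming the event type strings documented above:

```typescript
// Map legacy upload-specific event types onto the generic task event types,
// so one set of handlers covers both event families.
const LEGACY_MAP: Record<string, string> = {
  "upload.progress": "task.progress",
  "upload.completed": "task.completed",
  "upload.failed": "task.failed",
};

// Generic task events (and any unknown types) pass through unchanged.
function normalizeEventType(type: string): string {
  return LEGACY_MAP[type] ?? type;
}
```

Calling `normalizeEventType(packet.type)` before the `switch` in the handler above lets a single `task.*` branch serve both families.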
## 🗄️ Database Schema
### `upload_tasks` Table
@@ -348,7 +382,7 @@ UPLOAD_CACHE_DURATION_MINUTES=30
```csharp
// In Program.cs or Startup.cs
builder.Services.AddScoped<PersistentTaskService>();
builder.Services.AddSingleton<RingService.RingServiceClient>(sp => {
// Configure gRPC client for RingService
var channel = GrpcChannel.ForAddress("https://ring-service:50051");
@@ -754,7 +788,7 @@ public class PersistentTaskService(
### Real-Time Task Notifications
All task operations send WebSocket notifications via RingService using the shared `GrpcTypeHelper` for consistent JSON serialization:
#### Task Created
```json
@@ -867,6 +901,36 @@ Tasks support multiple statuses:
- **Cancelled**: Manually cancelled
- **Expired**: Timed out or expired
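The pause, resume, and cancel operations imply a small state machine over these statuses. The transition rules below are inferred from the status list and the lifecycle methods, not from documented server behavior:

```typescript
// Task statuses as listed above.
type TaskStatus = "InProgress" | "Paused" | "Completed" | "Failed" | "Cancelled" | "Expired";

const TERMINAL: TaskStatus[] = ["Completed", "Failed", "Cancelled", "Expired"];

// Apply a lifecycle operation, rejecting transitions that make no sense:
// pause only from InProgress, resume only from Paused, cancel from any
// non-terminal state.
function applyOperation(status: TaskStatus, op: "pause" | "resume" | "cancel"): TaskStatus {
  if (op === "cancel") {
    if (TERMINAL.includes(status)) throw new Error(`cannot cancel a ${status} task`);
    return "Cancelled";
  }
  const [from, to]: [TaskStatus, TaskStatus] =
    op === "pause" ? ["InProgress", "Paused"] : ["Paused", "InProgress"];
  if (status !== from) throw new Error(`cannot ${op} a ${status} task`);
  return to;
}
```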
### Available Service Methods
Based on the `PersistentTaskService` implementation, the following methods are available:
#### Core Task Operations
- `CreateTaskAsync<T>(T task)`: Creates any type of persistent task
- `GetTaskAsync<T>(string taskId)`: Retrieves a task by ID with caching
- `UpdateTaskProgressAsync(string taskId, double progress, string? statusMessage)`: Updates task progress with automatic notifications
- `MarkTaskCompletedAsync(string taskId, Dictionary<string, object?>? results)`: Marks task as completed with optional results
- `MarkTaskFailedAsync(string taskId, string? errorMessage)`: Marks task as failed with error message
- `PauseTaskAsync(string taskId)`: Pauses an in-progress task
- `ResumeTaskAsync(string taskId)`: Resumes a paused task
- `CancelTaskAsync(string taskId)`: Cancels a task
#### Task Querying & Statistics
- `GetUserTasksAsync()`: Gets tasks for a user with filtering and pagination
- `GetUserTaskStatsAsync(Guid accountId)`: Gets comprehensive task statistics
- `CleanupOldTasksAsync(Guid accountId, Duration maxAge)`: Cleans up old completed/failed tasks
#### Upload-Specific Operations
- `CreateUploadTaskAsync()`: Creates a new persistent upload task
- `GetUploadTaskAsync(string taskId)`: Gets an existing upload task
- `UpdateChunkProgressAsync(string taskId, int chunkIndex)`: Updates chunk upload progress
- `IsChunkUploadedAsync(string taskId, int chunkIndex)`: Checks if a chunk has been uploaded
- `GetUploadProgressAsync(string taskId)`: Gets upload progress as percentage
- `GetUserUploadTasksAsync()`: Gets user upload tasks with filtering
- `GetUserUploadStatsAsync(Guid accountId)`: Gets upload statistics for a user
- `CleanupUserFailedTasksAsync(Guid accountId)`: Cleans up failed upload tasks
- `GetRecentUserTasksAsync(Guid accountId, int limit)`: Gets recent upload tasks
### Priority System
Tasks can be assigned priorities (0-100, higher = more important) to control execution order in background processing.
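A background worker applying this rule would pick the highest-priority pending task first. A minimal sketch of that selection, with a tie-break on creation time (the `QueuedTask` field names are illustrative, not the actual schema):

```typescript
// Illustrative queue entry: priority is 0-100, higher runs first.
interface QueuedTask { id: string; priority: number; createdAt: number; }

// Pick the next task: highest priority first, oldest first on ties.
// Sorts a copy so the caller's queue order is untouched.
function nextTask(queue: QueuedTask[]): QueuedTask | undefined {
  return [...queue].sort(
    (a, b) => b.priority - a.priority || a.createdAt - b.createdAt
  )[0];
}
```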

View File

@@ -1,20 +0,0 @@
using DysonNetwork.Shared.Data;
using Microsoft.AspNetCore.Mvc;
namespace DysonNetwork.Drive;
[ApiController]
[Route("/api/version")]
public class VersionController : ControllerBase
{
[HttpGet]
public IActionResult Get()
{
return Ok(new AppVersion
{
Version = ThisAssembly.AssemblyVersion,
Commit = ThisAssembly.GitCommitId,
UpdateDate = ThisAssembly.GitCommitDate
});
}
}

View File

@@ -1,121 +1,124 @@
{
"Debug": true,
"BaseUrl": "http://localhost:5090",
"GatewayUrl": "http://localhost:5094",
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
}
},
"AllowedHosts": "*",
"ConnectionStrings": {
"App": "Host=localhost;Port=5432;Database=dyson_drive;Username=postgres;Password=postgres;Include Error Detail=True;Maximum Pool Size=20;Connection Idle Lifetime=60"
},
"Authentication": {
"Schemes": {
"Bearer": {
"ValidAudiences": [
"http://localhost:5071",
"https://localhost:7099"
],
"ValidIssuer": "solar-network"
}
}
},
"AuthToken": {
"PublicKeyPath": "Keys/PublicKey.pem",
"PrivateKeyPath": "Keys/PrivateKey.pem"
},
"Tus": {
"StorePath": "Uploads"
},
"Storage": {
"Uploads": "Uploads",
"PreferredRemote": "2adceae3-981a-4564-9b8d-5d71a211c873",
"Remote": [
{
"Id": "minio",
"Label": "Minio",
"Region": "auto",
"Bucket": "solar-network-development",
"Endpoint": "localhost:9000",
"SecretId": "littlesheep",
"SecretKey": "password",
"EnabledSigned": true,
"EnableSsl": false
},
{
"Id": "cloudflare",
"Label": "Cloudflare R2",
"Region": "auto",
"Bucket": "solar-network",
"Endpoint": "0a70a6d1b7128888c823359d0008f4e1.r2.cloudflarestorage.com",
"SecretId": "8ff5d06c7b1639829d60bc6838a542e6",
"SecretKey": "fd58158c5201be16d1872c9209d9cf199421dae3c2f9972f94b2305976580d67",
"EnableSigned": true,
"EnableSsl": true
}
"Debug": true,
"BaseUrl": "http://localhost:5090",
"GatewayUrl": "http://localhost:5094",
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
}
},
"AllowedHosts": "*",
"ConnectionStrings": {
"App": "Host=localhost;Port=5432;Database=dyson_drive;Username=postgres;Password=postgres;Include Error Detail=True;Maximum Pool Size=20;Connection Idle Lifetime=60",
"Registrar": "127.0.0.1:2379",
"Cache": "127.0.0.1:6379",
"Queue": "127.0.0.1:4222"
},
"Authentication": {
"Schemes": {
"Bearer": {
"ValidAudiences": [
"http://localhost:5071",
"https://localhost:7099"
],
"ValidIssuer": "solar-network"
}
}
},
"AuthToken": {
"PublicKeyPath": "Keys/PublicKey.pem",
"PrivateKeyPath": "Keys/PrivateKey.pem"
},
"Storage": {
"Uploads": "Uploads",
"PreferredRemote": "c53136a6-9152-4ecb-9f88-43c41438c23e",
"Remote": [
{
"Id": "minio",
"Label": "Minio",
"Region": "auto",
"Bucket": "solar-network-development",
"Endpoint": "localhost:9000",
"SecretId": "littlesheep",
"SecretKey": "password",
"EnabledSigned": true,
"EnableSsl": false
},
{
"Id": "cloudflare",
"Label": "Cloudflare R2",
"Region": "auto",
"Bucket": "solar-network",
"Endpoint": "0a70a6d1b7128888c823359d0008f4e1.r2.cloudflarestorage.com",
"SecretId": "8ff5d06c7b1639829d60bc6838a542e6",
"SecretKey": "fd58158c5201be16d1872c9209d9cf199421dae3c2f9972f94b2305976580d67",
"EnableSigned": true,
"EnableSsl": true
}
]
},
"Captcha": {
"Provider": "cloudflare",
"ApiKey": "0x4AAAAAABCDUdOujj4feOb_",
"ApiSecret": "0x4AAAAAABCDUWABiJQweqlB7tYq-IqIm8U"
},
"Notifications": {
"Topic": "dev.solsynth.solian",
"Endpoint": "http://localhost:8088"
},
"Email": {
"Server": "smtp4dev.orb.local",
"Port": 25,
"UseSsl": false,
"Username": "no-reply@mail.solsynth.dev",
"Password": "password",
"FromAddress": "no-reply@mail.solsynth.dev",
"FromName": "Alphabot",
"SubjectPrefix": "Solar Network"
},
"RealtimeChat": {
"Endpoint": "https://solar-network-im44o8gq.livekit.cloud",
"ApiKey": "APIs6TiL8wj3A4j",
"ApiSecret": "SffxRneIwTnlHPtEf3zicmmv3LUEl7xXael4PvWZrEhE"
},
"GeoIp": {
"DatabasePath": "./Keys/GeoLite2-City.mmdb"
},
"Oidc": {
"Google": {
"ClientId": "961776991058-963m1qin2vtp8fv693b5fdrab5hmpl89.apps.googleusercontent.com",
"ClientSecret": ""
},
"Apple": {
"ClientId": "dev.solsynth.solian",
"TeamId": "W7HPZ53V6B",
"KeyId": "B668YP4KBG",
"PrivateKeyPath": "./Keys/Solarpass.p8"
},
"Microsoft": {
"ClientId": "YOUR_MICROSOFT_CLIENT_ID",
"ClientSecret": "YOUR_MICROSOFT_CLIENT_SECRET",
"DiscoveryEndpoint": "YOUR_MICROSOFT_DISCOVERY_ENDPOINT"
}
},
"Payment": {
"Auth": {
"Afdian": "<token here>"
},
"Subscriptions": {
"Afdian": {
"7d17aae23c9611f0b5705254001e7c00": "solian.stellar.primary",
"7dfae4743c9611f0b3a55254001e7c00": "solian.stellar.nova",
"141713ee3d6211f085b352540025c377": "solian.stellar.supernova"
}
}
},
"Cache": {
"Serializer": "MessagePack"
},
"KnownProxies": [
"127.0.0.1",
"::1"
]
},
"Captcha": {
"Provider": "cloudflare",
"ApiKey": "0x4AAAAAABCDUdOujj4feOb_",
"ApiSecret": "0x4AAAAAABCDUWABiJQweqlB7tYq-IqIm8U"
},
"Notifications": {
"Topic": "dev.solsynth.solian",
"Endpoint": "http://localhost:8088"
},
"Email": {
"Server": "smtp4dev.orb.local",
"Port": 25,
"UseSsl": false,
"Username": "no-reply@mail.solsynth.dev",
"Password": "password",
"FromAddress": "no-reply@mail.solsynth.dev",
"FromName": "Alphabot",
"SubjectPrefix": "Solar Network"
},
"RealtimeChat": {
"Endpoint": "https://solar-network-im44o8gq.livekit.cloud",
"ApiKey": "APIs6TiL8wj3A4j",
"ApiSecret": "SffxRneIwTnlHPtEf3zicmmv3LUEl7xXael4PvWZrEhE"
},
"GeoIp": {
"DatabasePath": "./Keys/GeoLite2-City.mmdb"
},
"Oidc": {
"Google": {
"ClientId": "961776991058-963m1qin2vtp8fv693b5fdrab5hmpl89.apps.googleusercontent.com",
"ClientSecret": ""
},
"Apple": {
"ClientId": "dev.solsynth.solian",
"TeamId": "W7HPZ53V6B",
"KeyId": "B668YP4KBG",
"PrivateKeyPath": "./Keys/Solarpass.p8"
},
"Microsoft": {
"ClientId": "YOUR_MICROSOFT_CLIENT_ID",
"ClientSecret": "YOUR_MICROSOFT_CLIENT_SECRET",
"DiscoveryEndpoint": "YOUR_MICROSOFT_DISCOVERY_ENDPOINT"
}
},
"Payment": {
"Auth": {
"Afdian": "<token here>"
},
"Subscriptions": {
"Afdian": {
"7d17aae23c9611f0b5705254001e7c00": "solian.stellar.primary",
"7dfae4743c9611f0b3a55254001e7c00": "solian.stellar.nova",
"141713ee3d6211f085b352540025c377": "solian.stellar.supernova"
}
}
},
"KnownProxies": [
"127.0.0.1",
"::1"
]
}

View File

@@ -1,7 +0,0 @@
{
"version": "1.0",
"publicReleaseRefSpec": ["^refs/heads/main$"],
"cloudBuild": {
"setVersionVariables": true
}
}

View File

@@ -1,12 +0,0 @@
using Microsoft.AspNetCore.Mvc;
[ApiController]
[Route("config")]
public class ConfigurationController(IConfiguration configuration) : ControllerBase
{
[HttpGet]
public IActionResult Get() => Ok(configuration.GetSection("Client").Get<Dictionary<string, object>>());
[HttpGet("site")]
public IActionResult GetSiteUrl() => Ok(configuration["SiteUrl"]);
}

View File

@@ -1,23 +0,0 @@
FROM mcr.microsoft.com/dotnet/aspnet:9.0 AS base
USER $APP_UID
WORKDIR /app
EXPOSE 8080
EXPOSE 8081
FROM mcr.microsoft.com/dotnet/sdk:9.0 AS build
ARG BUILD_CONFIGURATION=Release
WORKDIR /src
COPY ["DysonNetwork.Gateway/DysonNetwork.Gateway.csproj", "DysonNetwork.Gateway/"]
RUN dotnet restore "DysonNetwork.Gateway/DysonNetwork.Gateway.csproj"
COPY . .
WORKDIR "/src/DysonNetwork.Gateway"
RUN dotnet build "./DysonNetwork.Gateway.csproj" -c $BUILD_CONFIGURATION -o /app/build
FROM build AS publish
ARG BUILD_CONFIGURATION=Release
RUN dotnet publish "./DysonNetwork.Gateway.csproj" -c $BUILD_CONFIGURATION -o /app/publish /p:UseAppHost=false
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "DysonNetwork.Gateway.dll"]

View File

@@ -1,18 +0,0 @@
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net9.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.ServiceDiscovery.Yarp" Version="9.5.2" />
<PackageReference Include="Yarp.ReverseProxy" Version="2.3.0" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\DysonNetwork.Shared\DysonNetwork.Shared.csproj" />
</ItemGroup>
</Project>

View File

@@ -1,168 +0,0 @@
using System.Threading.RateLimiting;
using DysonNetwork.Shared.Http;
using Yarp.ReverseProxy.Configuration;
using Microsoft.AspNetCore.HttpOverrides;
var builder = WebApplication.CreateBuilder(args);
builder.AddServiceDefaults();
builder.ConfigureAppKestrel(builder.Configuration, maxRequestBodySize: long.MaxValue, enableGrpc: false);
builder.Services.AddCors(options =>
{
options.AddDefaultPolicy(
policy =>
{
policy.SetIsOriginAllowed(origin => true)
.AllowAnyMethod()
.AllowAnyHeader()
.AllowCredentials()
.WithExposedHeaders("X-Total");
});
});
builder.Services.AddRateLimiter(options =>
{
options.AddPolicy("fixed", context =>
{
var ip = context.Connection.RemoteIpAddress?.ToString() ?? "unknown";
return RateLimitPartition.GetFixedWindowLimiter(
partitionKey: ip,
factory: _ => new FixedWindowRateLimiterOptions
{
PermitLimit = 120, // 120 requests...
Window = TimeSpan.FromMinutes(1), // ...per minute per IP
QueueProcessingOrder = QueueProcessingOrder.OldestFirst,
QueueLimit = 10 // allow short bursts instead of instant 503s
});
});
options.OnRejected = async (context, token) =>
{
// Log the rejected IP
var logger = context.HttpContext.RequestServices
.GetRequiredService<ILoggerFactory>()
.CreateLogger("RateLimiter");
var ip = context.HttpContext.Connection.RemoteIpAddress?.ToString() ?? "unknown";
logger.LogWarning("Rate limit exceeded for IP: {IP}", ip);
// Respond to the client
context.HttpContext.Response.StatusCode = StatusCodes.Status429TooManyRequests;
await context.HttpContext.Response.WriteAsync(
"Rate limit exceeded. Try again later.", token);
};
});
var serviceNames = new[] { "ring", "pass", "drive", "sphere", "develop", "insight" };
var specialRoutes = new[]
{
new RouteConfig
{
RouteId = "ring-ws",
ClusterId = "ring",
Match = new RouteMatch { Path = "/ws" }
},
new RouteConfig
{
RouteId = "pass-openid",
ClusterId = "pass",
Match = new RouteMatch { Path = "/.well-known/openid-configuration" }
},
new RouteConfig
{
RouteId = "pass-jwks",
ClusterId = "pass",
Match = new RouteMatch { Path = "/.well-known/jwks" }
},
new RouteConfig
{
RouteId = "drive-tus",
ClusterId = "drive",
Match = new RouteMatch { Path = "/api/tus" }
}
};
var apiRoutes = serviceNames.Select(serviceName =>
{
var apiPath = serviceName switch
{
_ => $"/{serviceName}"
};
return new RouteConfig
{
RouteId = $"{serviceName}-api",
ClusterId = serviceName,
Match = new RouteMatch { Path = $"{apiPath}/{{**catch-all}}" },
Transforms =
[
new Dictionary<string, string> { { "PathRemovePrefix", apiPath } },
new Dictionary<string, string> { { "PathPrefix", "/api" } }
]
};
});
var swaggerRoutes = serviceNames.Select(serviceName => new RouteConfig
{
RouteId = $"{serviceName}-swagger",
ClusterId = serviceName,
Match = new RouteMatch { Path = $"/swagger/{serviceName}/{{**catch-all}}" },
Transforms =
[
new Dictionary<string, string> { { "PathRemovePrefix", $"/swagger/{serviceName}" } },
new Dictionary<string, string> { { "PathPrefix", "/swagger" } }
]
});
var routes = specialRoutes.Concat(apiRoutes).Concat(swaggerRoutes).ToArray();
var clusters = serviceNames.Select(serviceName => new ClusterConfig
{
ClusterId = serviceName,
HealthCheck = new HealthCheckConfig
{
Active = new ActiveHealthCheckConfig
{
Enabled = true,
Interval = TimeSpan.FromSeconds(10),
Timeout = TimeSpan.FromSeconds(5),
Path = "/health"
},
Passive = new()
{
Enabled = true
}
},
Destinations = new Dictionary<string, DestinationConfig>
{
{ "destination1", new DestinationConfig { Address = $"http://{serviceName}" } }
}
}).ToArray();
builder.Services
.AddReverseProxy()
.LoadFromMemory(routes, clusters)
.AddServiceDiscoveryDestinationResolver();
builder.Services.AddControllers();
var app = builder.Build();
var forwardedHeadersOptions = new ForwardedHeadersOptions
{
ForwardedHeaders = ForwardedHeaders.All
};
forwardedHeadersOptions.KnownNetworks.Clear();
forwardedHeadersOptions.KnownProxies.Clear();
app.UseForwardedHeaders(forwardedHeadersOptions);
app.UseCors();
app.MapReverseProxy().RequireRateLimiting("fixed");
app.MapControllers();
app.Run();

View File

@@ -1,13 +0,0 @@
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
}
},
"AllowedHosts": "*",
"SiteUrl": "http://localhost:3000",
"Client": {
"SomeSetting": "SomeValue"
}
}

View File

@@ -1,3 +1,4 @@
using DysonNetwork.Shared.Data;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Design;
@@ -12,6 +13,7 @@ public class AppDatabase(
{
public DbSet<SnThinkingSequence> ThinkingSequences { get; set; }
public DbSet<SnThinkingThought> ThinkingThoughts { get; set; }
public DbSet<SnUnpaidAccount> UnpaidAccounts { get; set; }
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
@@ -28,36 +30,15 @@ public class AppDatabase(
public override async Task<int> SaveChangesAsync(CancellationToken cancellationToken = default)
{
var now = SystemClock.Instance.GetCurrentInstant();
foreach (var entry in ChangeTracker.Entries<ModelBase>())
{
switch (entry.State)
{
case EntityState.Added:
entry.Entity.CreatedAt = now;
entry.Entity.UpdatedAt = now;
break;
case EntityState.Modified:
entry.Entity.UpdatedAt = now;
break;
case EntityState.Deleted:
entry.State = EntityState.Modified;
entry.Entity.DeletedAt = now;
break;
case EntityState.Detached:
case EntityState.Unchanged:
default:
break;
}
}
this.ApplyAuditableAndSoftDelete();
return await base.SaveChangesAsync(cancellationToken);
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
base.OnModelCreating(modelBuilder);
modelBuilder.ApplySoftDeleteFilters();
}
}

View File

@@ -1,21 +1,42 @@
using DysonNetwork.Insight.Thought;
using DysonNetwork.Shared.Auth;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Microsoft.EntityFrameworkCore;
using DysonNetwork.Shared.Proto;
namespace DysonNetwork.Insight.Controllers;
[ApiController]
[Route("/api/billing")]
public class BillingController(ThoughtService thoughtService, ILogger<BillingController> logger) : ControllerBase
[Route("api/billing")]
public class BillingController(AppDatabase db, ThoughtService thoughtService, ILogger<BillingController> logger)
: ControllerBase
{
[HttpPost("settle")]
[Authorize]
[RequiredPermission("maintenance", "insight.billing.settle")]
public async Task<IActionResult> ProcessTokenBilling()
[HttpGet("status")]
public async Task<IActionResult> GetBillingStatus()
{
await thoughtService.SettleThoughtBills(logger);
return Ok();
if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return Unauthorized();
var accountId = Guid.Parse(currentUser.Id);
var isMarked = await db.UnpaidAccounts.AnyAsync(u => u.AccountId == accountId);
return Ok(isMarked ? new { status = "unpaid" } : new { status = "ok" });
}
}
[HttpPost("retry")]
public async Task<IActionResult> RetryBilling()
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser)
return Unauthorized();
var accountId = Guid.Parse(currentUser.Id);
var (success, cost) = await thoughtService.RetryBillingForAccountAsync(accountId, logger);
if (success)
{
return Ok(cost > 0
? new { message = $"Billing retry successful. Billed {cost} points." }
: new { message = "No outstanding payment found." });
}
return BadRequest(new { message = "Billing retry failed. Please check your balance and try again." });
}
}

View File

@@ -1,12 +1,12 @@
#See https://aka.ms/customizecontainer to learn how to customize your debug container and how Visual Studio uses this Dockerfile to build your images for faster debugging.
FROM mcr.microsoft.com/dotnet/aspnet:9.0 AS base
FROM mcr.microsoft.com/dotnet/aspnet:10.0 AS base
USER app
WORKDIR /app
EXPOSE 8080
EXPOSE 8081
FROM mcr.microsoft.com/dotnet/sdk:9.0 AS build
FROM mcr.microsoft.com/dotnet/sdk:10.0 AS build
ARG BUILD_CONFIGURATION=Release
WORKDIR /src
COPY ["DysonNetwork.Insight/DysonNetwork.Insight.csproj", "DysonNetwork.Insight/"]

View File

@@ -1,22 +1,19 @@
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net9.0</TargetFramework>
<TargetFramework>net10.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="EFCore.NamingConventions" Version="9.0.0" />
<PackageReference Include="Microsoft.AspNetCore.OpenApi" Version="9.0.10" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.10">
<PackageReference Include="Microsoft.AspNetCore.OpenApi" Version="10.0.0" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.11">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Microsoft.SemanticKernel" Version="1.66.0" />
<PackageReference Include="Microsoft.SemanticKernel" Version="1.67.1" />
<PackageReference Include="Microsoft.SemanticKernel.Connectors.Ollama" Version="1.66.0-alpha" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="9.0.4" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.NodaTime" Version="9.0.4" />
<PackageReference Include="Microsoft.SemanticKernel.Plugins.Web" Version="1.66.0-alpha" />
<PackageReference Include="Quartz" Version="3.15.1" />
<PackageReference Include="Quartz.AspNetCore" Version="3.15.1" />

View File

@@ -69,11 +69,6 @@ namespace DysonNetwork.Insight.Migrations
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<List<SnThinkingChunk>>("Chunks")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("chunks");
b.Property<string>("Content")
.HasColumnType("text")
.HasColumnName("content");

View File

@@ -12,21 +12,13 @@ namespace DysonNetwork.Insight.Migrations
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.AddColumn<List<SnThinkingChunk>>(
name: "chunks",
table: "thinking_thoughts",
type: "jsonb",
nullable: false,
defaultValue: new List<SnThinkingChunk>()
);
// The chunk type has been removed, so this migration is now a no-op
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropColumn(
name: "chunks",
table: "thinking_thoughts");
// The chunk type has been removed, so this migration is now a no-op
}
}
}

View File

@@ -77,11 +77,6 @@ namespace DysonNetwork.Insight.Migrations
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<List<SnThinkingChunk>>("Chunks")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("chunks");
b.Property<string>("Content")
.HasColumnType("text")
.HasColumnName("content");

View File

@@ -0,0 +1,142 @@
// <auto-generated />
using System;
using System.Collections.Generic;
using DysonNetwork.Insight;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace DysonNetwork.Insight.Migrations
{
[DbContext(typeof(AppDatabase))]
[Migration("20251115084746_RefactorThoughtMessage")]
partial class RefactorThoughtMessage
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingSequence", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<long>("PaidToken")
.HasColumnType("bigint")
.HasColumnName("paid_token");
b.Property<string>("Topic")
.HasMaxLength(4096)
.HasColumnType("character varying(4096)")
.HasColumnName("topic");
b.Property<long>("TotalToken")
.HasColumnType("bigint")
.HasColumnName("total_token");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_thinking_sequences");
b.ToTable("thinking_sequences", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingThought", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<List<SnCloudFileReferenceObject>>("Files")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("files");
b.Property<string>("ModelName")
.HasMaxLength(4096)
.HasColumnType("character varying(4096)")
.HasColumnName("model_name");
b.Property<List<SnThinkingMessagePart>>("Parts")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("parts");
b.Property<int>("Role")
.HasColumnType("integer")
.HasColumnName("role");
b.Property<Guid>("SequenceId")
.HasColumnType("uuid")
.HasColumnName("sequence_id");
b.Property<long>("TokenCount")
.HasColumnType("bigint")
.HasColumnName("token_count");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_thinking_thoughts");
b.HasIndex("SequenceId")
.HasDatabaseName("ix_thinking_thoughts_sequence_id");
b.ToTable("thinking_thoughts", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingThought", b =>
{
b.HasOne("DysonNetwork.Shared.Models.SnThinkingSequence", "Sequence")
.WithMany()
.HasForeignKey("SequenceId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired()
.HasConstraintName("fk_thinking_thoughts_thinking_sequences_sequence_id");
b.Navigation("Sequence");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,30 @@
using System.Collections.Generic;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace DysonNetwork.Insight.Migrations
{
/// <inheritdoc />
public partial class RefactorThoughtMessage : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.AddColumn<List<SnThinkingMessagePart>>(
name: "parts",
table: "thinking_thoughts",
type: "jsonb",
nullable: false);
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropColumn(
name: "parts",
table: "thinking_thoughts");
}
}
}

View File

@@ -0,0 +1,142 @@
// <auto-generated />
using System;
using System.Collections.Generic;
using DysonNetwork.Insight;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace DysonNetwork.Insight.Migrations
{
[DbContext(typeof(AppDatabase))]
[Migration("20251115162347_UpdatedFunctionCallModels")]
partial class UpdatedFunctionCallModels
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingSequence", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<long>("PaidToken")
.HasColumnType("bigint")
.HasColumnName("paid_token");
b.Property<string>("Topic")
.HasMaxLength(4096)
.HasColumnType("character varying(4096)")
.HasColumnName("topic");
b.Property<long>("TotalToken")
.HasColumnType("bigint")
.HasColumnName("total_token");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_thinking_sequences");
b.ToTable("thinking_sequences", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingThought", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<List<SnCloudFileReferenceObject>>("Files")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("files");
b.Property<string>("ModelName")
.HasMaxLength(4096)
.HasColumnType("character varying(4096)")
.HasColumnName("model_name");
b.Property<List<SnThinkingMessagePart>>("Parts")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("parts");
b.Property<int>("Role")
.HasColumnType("integer")
.HasColumnName("role");
b.Property<Guid>("SequenceId")
.HasColumnType("uuid")
.HasColumnName("sequence_id");
b.Property<long>("TokenCount")
.HasColumnType("bigint")
.HasColumnName("token_count");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_thinking_thoughts");
b.HasIndex("SequenceId")
.HasDatabaseName("ix_thinking_thoughts_sequence_id");
b.ToTable("thinking_thoughts", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingThought", b =>
{
b.HasOne("DysonNetwork.Shared.Models.SnThinkingSequence", "Sequence")
.WithMany()
.HasForeignKey("SequenceId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired()
.HasConstraintName("fk_thinking_thoughts_thinking_sequences_sequence_id");
b.Navigation("Sequence");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,22 @@
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace DysonNetwork.Insight.Migrations
{
/// <inheritdoc />
public partial class UpdatedFunctionCallModels : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
}
}
}

View File

@@ -0,0 +1,159 @@
// <auto-generated />
using System;
using System.Collections.Generic;
using DysonNetwork.Insight;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace DysonNetwork.Insight.Migrations
{
[DbContext(typeof(AppDatabase))]
[Migration("20251115165833_AddUnpaidAccounts")]
partial class AddUnpaidAccounts
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingSequence", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<long>("PaidToken")
.HasColumnType("bigint")
.HasColumnName("paid_token");
b.Property<string>("Topic")
.HasMaxLength(4096)
.HasColumnType("character varying(4096)")
.HasColumnName("topic");
b.Property<long>("TotalToken")
.HasColumnType("bigint")
.HasColumnName("total_token");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_thinking_sequences");
b.ToTable("thinking_sequences", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingThought", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<List<SnCloudFileReferenceObject>>("Files")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("files");
b.Property<string>("ModelName")
.HasMaxLength(4096)
.HasColumnType("character varying(4096)")
.HasColumnName("model_name");
b.Property<List<SnThinkingMessagePart>>("Parts")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("parts");
b.Property<int>("Role")
.HasColumnType("integer")
.HasColumnName("role");
b.Property<Guid>("SequenceId")
.HasColumnType("uuid")
.HasColumnName("sequence_id");
b.Property<long>("TokenCount")
.HasColumnType("bigint")
.HasColumnName("token_count");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_thinking_thoughts");
b.HasIndex("SequenceId")
.HasDatabaseName("ix_thinking_thoughts_sequence_id");
b.ToTable("thinking_thoughts", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnUnpaidAccount", b =>
{
b.Property<Guid>("AccountId")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<DateTime>("MarkedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("marked_at");
b.HasKey("AccountId")
.HasName("pk_unpaid_accounts");
b.ToTable("unpaid_accounts", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingThought", b =>
{
b.HasOne("DysonNetwork.Shared.Models.SnThinkingSequence", "Sequence")
.WithMany()
.HasForeignKey("SequenceId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired()
.HasConstraintName("fk_thinking_thoughts_thinking_sequences_sequence_id");
b.Navigation("Sequence");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,34 @@
using System;
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace DysonNetwork.Insight.Migrations
{
/// <inheritdoc />
public partial class AddUnpaidAccounts : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.CreateTable(
name: "unpaid_accounts",
columns: table => new
{
account_id = table.Column<Guid>(type: "uuid", nullable: false),
marked_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("pk_unpaid_accounts", x => x.account_id);
});
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable(
name: "unpaid_accounts");
}
}
}

View File

@@ -0,0 +1,163 @@
// <auto-generated />
using System;
using System.Collections.Generic;
using DysonNetwork.Insight;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace DysonNetwork.Insight.Migrations
{
[DbContext(typeof(AppDatabase))]
[Migration("20251116123552_SharableThought")]
partial class SharableThought
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingSequence", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid>("AccountId")
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<bool>("IsPublic")
.HasColumnType("boolean")
.HasColumnName("is_public");
b.Property<long>("PaidToken")
.HasColumnType("bigint")
.HasColumnName("paid_token");
b.Property<string>("Topic")
.HasMaxLength(4096)
.HasColumnType("character varying(4096)")
.HasColumnName("topic");
b.Property<long>("TotalToken")
.HasColumnType("bigint")
.HasColumnName("total_token");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_thinking_sequences");
b.ToTable("thinking_sequences", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingThought", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
b.Property<Instant?>("DeletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<List<SnCloudFileReferenceObject>>("Files")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("files");
b.Property<string>("ModelName")
.HasMaxLength(4096)
.HasColumnType("character varying(4096)")
.HasColumnName("model_name");
b.Property<List<SnThinkingMessagePart>>("Parts")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("parts");
b.Property<int>("Role")
.HasColumnType("integer")
.HasColumnName("role");
b.Property<Guid>("SequenceId")
.HasColumnType("uuid")
.HasColumnName("sequence_id");
b.Property<long>("TokenCount")
.HasColumnType("bigint")
.HasColumnName("token_count");
b.Property<Instant>("UpdatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at");
b.HasKey("Id")
.HasName("pk_thinking_thoughts");
b.HasIndex("SequenceId")
.HasDatabaseName("ix_thinking_thoughts_sequence_id");
b.ToTable("thinking_thoughts", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnUnpaidAccount", b =>
{
b.Property<Guid>("AccountId")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<DateTime>("MarkedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("marked_at");
b.HasKey("AccountId")
.HasName("pk_unpaid_accounts");
b.ToTable("unpaid_accounts", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingThought", b =>
{
b.HasOne("DysonNetwork.Shared.Models.SnThinkingSequence", "Sequence")
.WithMany()
.HasForeignKey("SequenceId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired()
.HasConstraintName("fk_thinking_thoughts_thinking_sequences_sequence_id");
b.Navigation("Sequence");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,29 @@
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace DysonNetwork.Insight.Migrations
{
/// <inheritdoc />
public partial class SharableThought : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.AddColumn<bool>(
name: "is_public",
table: "thinking_sequences",
type: "boolean",
nullable: false,
defaultValue: false);
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropColumn(
name: "is_public",
table: "thinking_sequences");
}
}
}

View File

@@ -20,7 +20,7 @@ namespace DysonNetwork.Insight.Migrations
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.10")
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
@@ -44,6 +44,10 @@ namespace DysonNetwork.Insight.Migrations
.HasColumnType("timestamp with time zone")
.HasColumnName("deleted_at");
b.Property<bool>("IsPublic")
.HasColumnType("boolean")
.HasColumnName("is_public");
b.Property<long>("PaidToken")
.HasColumnType("bigint")
.HasColumnName("paid_token");
@@ -74,15 +78,6 @@ namespace DysonNetwork.Insight.Migrations
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<List<SnThinkingChunk>>("Chunks")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("chunks");
b.Property<string>("Content")
.HasColumnType("text")
.HasColumnName("content");
b.Property<Instant>("CreatedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at");
@@ -101,6 +96,11 @@ namespace DysonNetwork.Insight.Migrations
.HasColumnType("character varying(4096)")
.HasColumnName("model_name");
b.Property<List<SnThinkingMessagePart>>("Parts")
.IsRequired()
.HasColumnType("jsonb")
.HasColumnName("parts");
b.Property<int>("Role")
.HasColumnType("integer")
.HasColumnName("role");
@@ -126,6 +126,23 @@ namespace DysonNetwork.Insight.Migrations
b.ToTable("thinking_thoughts", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnUnpaidAccount", b =>
{
b.Property<Guid>("AccountId")
.ValueGeneratedOnAdd()
.HasColumnType("uuid")
.HasColumnName("account_id");
b.Property<DateTime>("MarkedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("marked_at");
b.HasKey("AccountId")
.HasName("pk_unpaid_accounts");
b.ToTable("unpaid_accounts", (string)null);
});
modelBuilder.Entity("DysonNetwork.Shared.Models.SnThinkingThought", b =>
{
b.HasOne("DysonNetwork.Shared.Models.SnThinkingSequence", "Sequence")

View File

@@ -7,7 +7,9 @@ using Microsoft.EntityFrameworkCore;
var builder = WebApplication.CreateBuilder(args);
builder.AddServiceDefaults();
builder.Services.Configure<ServiceRegistrationOptions>(opts => { opts.Name = "insight"; });
builder.AddServiceDefaults("insight");
builder.ConfigureAppKestrel(builder.Configuration);
@@ -19,8 +21,6 @@ builder.Services.AddAppBusinessServices();
builder.Services.AddAppScheduledJobs();
builder.Services.AddDysonAuth();
builder.Services.AddAccountService();
builder.Services.AddSphereService();
builder.Services.AddThinkingServices(builder.Configuration);
builder.AddSwaggerManifest(

View File

@@ -1,3 +1,4 @@
using DysonNetwork.Insight.Thought;
using Quartz;
namespace DysonNetwork.Insight.Startup;

View File

@@ -2,6 +2,7 @@ using System.Text.Json;
using System.Text.Json.Serialization;
using DysonNetwork.Insight.Thought;
using DysonNetwork.Shared.Cache;
using DysonNetwork.Shared.Registry;
using Microsoft.SemanticKernel;
using NodaTime;
using NodaTime.Serialization.SystemTextJson;
@@ -13,9 +14,7 @@ public static class ServiceCollectionExtensions
public static IServiceCollection AddAppServices(this IServiceCollection services)
{
services.AddDbContext<AppDatabase>();
services.AddSingleton<IClock>(SystemClock.Instance);
services.AddHttpContextAccessor();
services.AddSingleton<ICacheService, CacheServiceRedis>();
services.AddHttpClient();
@@ -66,14 +65,6 @@ public static class ServiceCollectionExtensions
services.AddSingleton<ThoughtProvider>();
services.AddScoped<ThoughtService>();
// Add gRPC clients for ThoughtService
services.AddGrpcClient<Shared.Proto.PaymentService.PaymentServiceClient>(o => o.Address = new Uri("https://_grpc.pass"))
.ConfigurePrimaryHttpMessageHandler(_ => new HttpClientHandler()
{ ServerCertificateCustomValidationCallback = (_, _, _, _) => true });
services.AddGrpcClient<Shared.Proto.WalletService.WalletServiceClient>(o => o.Address = new Uri("https://_grpc.pass"))
.ConfigurePrimaryHttpMessageHandler(_ => new HttpClientHandler()
{ ServerCertificateCustomValidationCallback = (_, _, _, _) => true });
return services;
}
}

View File

@@ -0,0 +1,155 @@
# Client-Side Guide: Handling the New Message Structure
This document outlines how to update your client application to support the new rich message structure for the thinking/chat feature. The backend now sends structured messages that can include plain text, function calls, and function results, allowing for a more interactive and transparent user experience.
When consumed through the gateway, all response fields are serialized in snake_case.
## 1. Data Models
When you receive a complete message (a "thought"), it will be in the form of an `SnThinkingThought` object. The core of this object is the `Parts` array, which contains the different components of the message.
Here are the primary data models you will be working with, represented here in a TypeScript-like format for clarity:
```typescript
// The main message object from the assistant or user
interface SnThinkingThought {
  id: string;
  parts: SnThinkingMessagePart[];
  role: 'Assistant' /* serialized as 0 */ | 'User' /* serialized as 1 */;
  createdAt: string; // ISO 8601 date string
  // ... other metadata
}

// A single part of a message
interface SnThinkingMessagePart {
  type: ThinkingMessagePartType;
  text?: string;
  functionCall?: SnFunctionCall;
  functionResult?: SnFunctionResult;
}

// Enum for the different part types
enum ThinkingMessagePartType {
  Text = 0,
  FunctionCall = 1,
  FunctionResult = 2,
}

// Represents a function/tool call made by the assistant
interface SnFunctionCall {
  id: string;
  name: string;
  pluginName?: string; // the plugin the function belongs to
  arguments: string; // A JSON string of the arguments
}

// Represents the result of a function call
interface SnFunctionResult {
  callId: string; // The ID of the corresponding function call
  functionName?: string;
  pluginName?: string;
  result: any; // The data returned by the function
  isError: boolean;
}
```
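As a concrete illustration of these shapes, an assistant thought that called one tool before answering could look like the value below. All concrete values (IDs, text) are invented for illustration; the `search_posts` tool name and its `contentQuery` argument come from the post plugin, and only the structure follows the interfaces above.

```typescript
// Local copies of the shapes above so this example stands alone; all
// concrete values below are invented for illustration.
enum ThinkingMessagePartType { Text = 0, FunctionCall = 1, FunctionResult = 2 }

interface Part {
  type: ThinkingMessagePartType;
  text?: string;
  functionCall?: { id: string; name: string; arguments: string };
  functionResult?: { callId: string; result: any; isError: boolean };
}

// An assistant message that used one tool before answering.
const exampleParts: Part[] = [
  {
    type: ThinkingMessagePartType.FunctionCall,
    functionCall: { id: 'call-1', name: 'search_posts', arguments: '{"contentQuery":"hello"}' },
  },
  {
    type: ThinkingMessagePartType.FunctionResult,
    functionResult: { callId: 'call-1', result: [], isError: false },
  },
  { type: ThinkingMessagePartType.Text, text: 'I found no matching posts.' },
];

// `arguments` arrives as a JSON string and must be parsed before use.
const toolArgs = JSON.parse(exampleParts[0].functionCall!.arguments);
```

Note that `parts` is ordered: tool calls and their results precede the text that depends on them, which is what makes the in-order rendering in section 3 work.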
## 2. Handling the SSE Stream
The response is streamed using Server-Sent Events (SSE). Your client should listen to this stream and process events as they arrive to build the UI in real-time.
The stream sends different types of messages, identified by a `type` field in the JSON payload.
| Event Type | `data` Payload | Client-Side Action |
| ------------------------ | -------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------- |
| `text` | `{ "type": "text", "data": "some text" }` | Append the text content to the current message being displayed. This is the most common event. |
| `function_call_update` | `{ "type": "function_call_update", "data": { ... } }` | This provides real-time updates as the AI decides on a function call. You can use this to show an advanced "thinking" state, but it's optional. The key events to handle are `function_call` and `function_result`. |
| `function_call` | `{ "type": "function_call", "data": SnFunctionCall }` | The AI has committed to using a tool. Display a "Using tool..." indicator. You can show the `name` of the tool for more clarity. |
| `function_result` | `{ "type": "function_result", "data": SnFunctionResult }` | The tool has finished running. You can hide the "thinking" indicator for this tool and optionally display a summary of the result. |
| `topic` | `{ "type": "topic", "data": "A new topic" }` | If this is the first message in a new conversation, this event provides the auto-generated topic title. Update your UI accordingly. |
| `thought` | `{ "type": "thought", "data": SnThinkingThought }` | This is the **final event** in the stream. It contains the complete, persisted message object with all its `Parts`. Use this final object to replace the incrementally built message in your state to ensure consistency. |
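The dispatch logic in the table above can be sketched as a small reducer over parsed event payloads. `StreamState` and `applyEvent` are illustrative names, not part of the API; a real client would match `function_result` events to calls by `callId` rather than simply popping the most recent pending tool.

```typescript
// Incremental state for the message currently being streamed.
interface StreamState {
  text: string;
  pendingTools: string[]; // names of tools currently shown as "thinking"
  topic?: string;
  final?: unknown; // the complete SnThinkingThought from the terminal `thought` event
}

// Apply one parsed SSE payload ({ type, data }) to the state.
function applyEvent(state: StreamState, event: { type: string; data: any }): StreamState {
  switch (event.type) {
    case 'text':
      // Most common event: append streamed text to the message being built.
      return { ...state, text: state.text + event.data };
    case 'function_call':
      // The AI committed to a tool; show a "Using tool..." indicator.
      return { ...state, pendingTools: [...state.pendingTools, event.data.name] };
    case 'function_result':
      // Tool finished; pop the most recent pending tool (match by callId in practice).
      return { ...state, pendingTools: state.pendingTools.slice(0, -1) };
    case 'topic':
      // Auto-generated title for a new conversation.
      return { ...state, topic: event.data };
    case 'thought':
      // Terminal event: replace the incrementally built message with the persisted one.
      return { ...state, final: event.data };
    default:
      // Ignore optional events such as function_call_update.
      return state;
  }
}
```

Keeping the reducer pure makes it trivial to unit-test the stream handling without a live SSE connection.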
## 3. Rendering a Message from `SnThinkingThought`
Once you have the final `SnThinkingThought` object (either from the `thought` event in the stream or by fetching conversation history), you can render it by iterating through the `parts` array.
### Pseudocode for Rendering
```typescript
function renderThought(thought: SnThinkingThought) {
  const messageContainer = document.createElement('div');
  messageContainer.className = `message message-role-${thought.role}`;

  // User messages are simple and will only have one text part
  if (thought.role === 'User') {
    const textPart = thought.parts[0];
    messageContainer.innerText = textPart.text ?? '';
    return messageContainer;
  }

  // Assistant messages can have multiple parts
  let textBuffer = '';
  thought.parts.forEach(part => {
    switch (part.type) {
      case ThinkingMessagePartType.Text:
        // Buffer text to combine consecutive text parts
        textBuffer += part.text;
        break;
      case ThinkingMessagePartType.FunctionCall:
        // First, render any buffered text
        if (textBuffer) {
          messageContainer.appendChild(renderText(textBuffer));
          textBuffer = '';
        }
        // Then, render the function call UI component
        messageContainer.appendChild(renderFunctionCall(part.functionCall!));
        break;
      case ThinkingMessagePartType.FunctionResult:
        // Render buffered text
        if (textBuffer) {
          messageContainer.appendChild(renderText(textBuffer));
          textBuffer = '';
        }
        // Then, render the function result UI component
        messageContainer.appendChild(renderFunctionResult(part.functionResult!));
        break;
    }
  });

  // Render any remaining text at the end
  if (textBuffer) {
    messageContainer.appendChild(renderText(textBuffer));
  }
  return messageContainer;
}

// Helper functions to create UI components
function renderText(text: string) {
  const p = document.createElement('p');
  p.innerText = text;
  return p;
}

function renderFunctionCall(functionCall: SnFunctionCall) {
  const el = document.createElement('div');
  el.className = 'function-call-indicator';
  el.innerHTML = `<i>Using tool: <strong>${functionCall.name}</strong>...</i>`;
  // You could add a button to show functionCall.arguments
  return el;
}

function renderFunctionResult(functionResult: SnFunctionResult) {
  const el = document.createElement('div');
  el.className = 'function-result-indicator';
  if (functionResult.isError) {
    el.classList.add('error');
    el.innerText = 'An error occurred while using the tool.';
  } else {
    el.innerText = 'Tool finished.';
  }
  // You could expand this to show a summary of functionResult.result
  return el;
}
```
This approach ensures that text and tool-use indicators are rendered inline and in the correct order, providing a clear and accurate representation of the assistant's actions.

View File

@@ -0,0 +1,29 @@
using DysonNetwork.Shared.Models;
using DysonNetwork.Shared.Proto;
using Microsoft.IdentityModel.Tokens;
using Microsoft.SemanticKernel;
namespace DysonNetwork.Insight.Thought.Plugins;
public class SnAccountKernelPlugin(
AccountService.AccountServiceClient accountClient
)
{
[KernelFunction("get_account")]
public async Task<SnAccount?> GetAccount(string userId)
{
var request = new GetAccountRequest { Id = userId };
var response = await accountClient.GetAccountAsync(request);
if (response is null) return null;
return SnAccount.FromProtoValue(response);
}
[KernelFunction("get_account_by_name")]
public async Task<SnAccount?> GetAccountByName(string username)
{
var request = new LookupAccountBatchRequest();
request.Names.Add(username);
var response = await accountClient.LookupAccountBatchAsync(request);
return response.Accounts.IsNullOrEmpty() ? null : SnAccount.FromProtoValue(response.Accounts[0]);
}
}

View File

@@ -0,0 +1,98 @@
using System.ComponentModel;
using DysonNetwork.Shared.Models;
using DysonNetwork.Shared.Proto;
using Microsoft.SemanticKernel;
using NodaTime;
using NodaTime.Serialization.Protobuf;
using NodaTime.Text;
namespace DysonNetwork.Insight.Thought.Plugins;
public class SnPostKernelPlugin(
PostService.PostServiceClient postClient
)
{
[KernelFunction("get_post")]
public async Task<SnPost?> GetPost(string postId)
{
var request = new GetPostRequest { Id = postId };
var response = await postClient.GetPostAsync(request);
return response is null ? null : SnPost.FromProtoValue(response);
}
[KernelFunction("search_posts")]
[Description("Perform a full-text search in all Solar Network posts.")]
public async Task<List<SnPost>> SearchPostsContent(string contentQuery, int pageSize = 10, int page = 1)
{
var request = new SearchPostsRequest
{
Query = contentQuery,
PageSize = pageSize,
PageToken = ((page - 1) * pageSize).ToString()
};
var response = await postClient.SearchPostsAsync(request);
return response.Posts.Select(SnPost.FromProtoValue).ToList();
}
public class KernelPostListResult
{
public List<SnPost> Posts { get; set; } = [];
public int TotalCount { get; set; }
}
[KernelFunction("list_posts")]
[Description("List all posts on the Solar Network without filters, orderBy can be date or popularity")]
public async Task<KernelPostListResult> ListPosts(
string orderBy = "date",
bool orderDesc = true,
int pageSize = 10,
int page = 1
)
{
var request = new ListPostsRequest
{
OrderBy = orderBy,
OrderDesc = orderDesc,
PageSize = pageSize,
PageToken = ((page - 1) * pageSize).ToString()
};
var response = await postClient.ListPostsAsync(request);
return new KernelPostListResult
{
Posts = response.Posts.Select(SnPost.FromProtoValue).ToList(),
TotalCount = response.TotalSize,
};
}
[KernelFunction("list_posts_within_time")]
[Description(
"List posts in a period of time, the time requires ISO-8601 format, one of the start and end must be provided.")]
public async Task<KernelPostListResult> ListPostsWithinTime(
string? beforeTime,
string? afterTime,
int pageSize = 10,
int page = 1
)
{
var pattern = InstantPattern.General;
Instant? before = !string.IsNullOrWhiteSpace(beforeTime)
? pattern.Parse(beforeTime).TryGetValue(default, out var beforeValue) ? beforeValue : null
: null;
Instant? after = !string.IsNullOrWhiteSpace(afterTime)
? pattern.Parse(afterTime).TryGetValue(default, out var afterValue) ? afterValue : null
: null;
var request = new ListPostsRequest
{
After = after?.ToTimestamp(),
Before = before?.ToTimestamp(),
PageSize = pageSize,
PageToken = ((page - 1) * pageSize).ToString()
};
var response = await postClient.ListPostsAsync(request);
return new KernelPostListResult
{
Posts = response.Posts.Select(SnPost.FromProtoValue).ToList(),
TotalCount = response.TotalSize,
};
}
}

View File

@@ -1,15 +1,13 @@
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Diagnostics.CodeAnalysis;
using System.IO;
using System.Text;
using System.Text.Json;
using DysonNetwork.Shared.Auth;
using DysonNetwork.Shared.Models;
using DysonNetwork.Shared.Proto;
using Microsoft.AspNetCore.Mvc;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;
namespace DysonNetwork.Insight.Thought;
@@ -22,12 +20,50 @@ public class ThoughtController(ThoughtProvider provider, ThoughtService service)
public class StreamThinkingRequest
{
[Required] public string UserMessage { get; set; } = null!;
public string? ServiceId { get; set; }
public Guid? SequenceId { get; set; }
public List<string>? AttachedPosts { get; set; }
public List<string>? AttachedPosts { get; set; } = [];
public List<Dictionary<string, dynamic>>? AttachedMessages { get; set; }
public List<string> AcceptProposals { get; set; } = [];
}
public class UpdateSharingRequest
{
public bool IsPublic { get; set; }
}
public class ThoughtServiceInfo
{
public string ServiceId { get; set; } = null!;
public double BillingMultiplier { get; set; }
public int PerkLevel { get; set; }
}
public class ThoughtServicesResponse
{
public string DefaultService { get; set; } = null!;
public IEnumerable<ThoughtServiceInfo> Services { get; set; } = null!;
}
[HttpGet("services")]
[ProducesResponseType(StatusCodes.Status200OK)]
public ActionResult<ThoughtServicesResponse> GetAvailableServices()
{
var services = provider.GetAvailableServicesInfo()
.Select(s => new ThoughtServiceInfo
{
ServiceId = s.ServiceId,
BillingMultiplier = s.BillingMultiplier,
PerkLevel = s.PerkLevel
});
return Ok(new ThoughtServicesResponse
{
DefaultService = provider.GetDefaultServiceId(),
Services = services
});
}
[HttpPost]
[Experimental("SKEXP0110")]
public async Task<ActionResult> Think([FromBody] StreamThinkingRequest request)
@@ -38,6 +74,25 @@ public class ThoughtController(ThoughtProvider provider, ThoughtService service)
if (request.AcceptProposals.Any(e => !AvailableProposals.Contains(e)))
return BadRequest("Request contains unavailable proposal");
var serviceId = provider.GetServiceId(request.ServiceId);
var serviceInfo = provider.GetServiceInfo(serviceId);
if (serviceInfo is null)
{
return BadRequest("Service not found or configured.");
}
if (serviceInfo.PerkLevel > 0 && !currentUser.IsSuperuser)
if (currentUser.PerkSubscription is null ||
PerkSubscriptionPrivilege.GetPrivilegeFromIdentifier(currentUser.PerkSubscription.Identifier) <
serviceInfo.PerkLevel)
return StatusCode(403, "Not enough perk level");
var kernel = provider.GetKernel(request.ServiceId);
if (kernel is null)
{
return BadRequest("Service not found or configured.");
}
// Generate a topic if creating a new sequence
string? topic = null;
if (!request.SequenceId.HasValue)
@@ -49,7 +104,13 @@ public class ThoughtController(ThoughtProvider provider, ThoughtService service)
);
summaryHistory.AddUserMessage(request.UserMessage);
var summaryResult = await provider.Kernel
var summaryKernel = provider.GetKernel(); // Get default kernel
if (summaryKernel is null)
{
return BadRequest("Default service not found or configured.");
}
var summaryResult = await summaryKernel
.GetRequiredService<IChatCompletionService>()
.GetChatMessageContentAsync(summaryHistory);
@@ -61,7 +122,13 @@ public class ThoughtController(ThoughtProvider provider, ThoughtService service)
if (sequence == null) return Forbid(); // or NotFound
// Save user thought
await service.SaveThoughtAsync(sequence, request.UserMessage, ThinkingThoughtRole.User);
await service.SaveThoughtAsync(sequence, [
new SnThinkingMessagePart
{
Type = ThinkingMessagePartType.Text,
Text = request.UserMessage
}
], ThinkingThoughtRole.User);
// Build chat history
var chatHistory = new ChatHistory(
@@ -108,19 +175,71 @@ public class ThoughtController(ThoughtProvider provider, ThoughtService service)
// Add previous thoughts (excluding the current user thought, which is the first one since descending)
var previousThoughts = await service.GetPreviousThoughtsAsync(sequence);
var count = previousThoughts.Count;
for (var i = 1; i < count; i++) // skip first (the newest, current user)
for (var i = count - 1; i >= 1; i--) // skip first (the newest, current user)
{
var thought = previousThoughts[i];
switch (thought.Role)
var textContent = new StringBuilder();
var functionCalls = new List<FunctionCallContent>();
var functionResults = new List<FunctionResultContent>();
foreach (var part in thought.Parts)
{
case ThinkingThoughtRole.User:
chatHistory.AddUserMessage(thought.Content ?? "");
break;
case ThinkingThoughtRole.Assistant:
chatHistory.AddAssistantMessage(thought.Content ?? "");
break;
default:
throw new ArgumentOutOfRangeException();
switch (part.Type)
{
case ThinkingMessagePartType.Text:
textContent.Append(part.Text);
break;
case ThinkingMessagePartType.FunctionCall:
var arguments = !string.IsNullOrEmpty(part.FunctionCall!.Arguments)
? JsonSerializer.Deserialize<Dictionary<string, object?>>(part.FunctionCall!.Arguments)
: null;
var kernelArgs = arguments is not null ? new KernelArguments(arguments) : null;
functionCalls.Add(new FunctionCallContent(
functionName: part.FunctionCall!.Name,
pluginName: part.FunctionCall.PluginName,
id: part.FunctionCall.Id,
arguments: kernelArgs
));
break;
case ThinkingMessagePartType.FunctionResult:
var resultObject = part.FunctionResult!.Result;
var resultString = resultObject as string ?? JsonSerializer.Serialize(resultObject);
functionResults.Add(new FunctionResultContent(
callId: part.FunctionResult.CallId,
functionName: part.FunctionResult.FunctionName,
pluginName: part.FunctionResult.PluginName,
result: resultString
));
break;
default:
throw new ArgumentOutOfRangeException();
}
}
if (thought.Role == ThinkingThoughtRole.User)
{
chatHistory.AddUserMessage(textContent.ToString());
}
else
{
var assistantMessage = new ChatMessageContent(AuthorRole.Assistant, textContent.ToString());
if (functionCalls.Count > 0)
{
assistantMessage.Items = [];
foreach (var fc in functionCalls)
{
assistantMessage.Items.Add(fc);
}
}
chatHistory.Add(assistantMessage);
if (functionResults.Count <= 0) continue;
foreach (var fr in functionResults)
{
chatHistory.Add(fr.ToChatMessage());
}
}
}
@@ -130,75 +249,118 @@ public class ThoughtController(ThoughtProvider provider, ThoughtService service)
Response.Headers.Append("Content-Type", "text/event-stream");
Response.StatusCode = 200;
var kernel = provider.Kernel;
var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();
var executionSettings = provider.CreatePromptExecutionSettings(request.ServiceId);
// Kick off streaming generation
var accumulatedContent = new StringBuilder();
var thinkingChunks = new List<SnThinkingChunk>();
await foreach (var chunk in chatCompletionService.GetStreamingChatMessageContentsAsync(
chatHistory,
provider.CreatePromptExecutionSettings(),
kernel: kernel
))
var assistantParts = new List<SnThinkingMessagePart>();
while (true)
{
// Process each item in the chunk for detailed streaming
foreach (var item in chunk.Items)
var textContentBuilder = new StringBuilder();
AuthorRole? authorRole = null;
var functionCallBuilder = new FunctionCallContentBuilder();
await foreach (
var streamingContent in chatCompletionService.GetStreamingChatMessageContentsAsync(
chatHistory, executionSettings, kernel)
)
{
var streamingChunk = item switch
authorRole ??= streamingContent.Role;
if (streamingContent.Content is not null)
{
StreamingTextContent textContent => new SnThinkingChunk
{ Type = StreamingContentType.Text, Data = new() { ["text"] = textContent.Text ?? "" } },
StreamingReasoningContent reasoningContent => new SnThinkingChunk
{
Type = StreamingContentType.Reasoning, Data = new() { ["text"] = reasoningContent.Text }
},
StreamingFunctionCallUpdateContent functionCall => string.IsNullOrEmpty(functionCall.CallId)
? null
: new SnThinkingChunk
{
Type = StreamingContentType.FunctionCall,
Data = JsonSerializer.Deserialize<Dictionary<string, object>>(
JsonSerializer.Serialize(functionCall)) ?? new Dictionary<string, object>()
},
_ => new SnThinkingChunk
{
Type = StreamingContentType.Unknown, Data = new() { ["data"] = JsonSerializer.Serialize(item) }
}
};
if (streamingChunk == null) continue;
textContentBuilder.Append(streamingContent.Content);
var messageJson = JsonSerializer.Serialize(new
{ type = "text", data = streamingContent.Content });
await Response.Body.WriteAsync(Encoding.UTF8.GetBytes($"data: {messageJson}\n\n"));
await Response.Body.FlushAsync();
}
thinkingChunks.Add(streamingChunk);
var messageJson = item switch
{
StreamingTextContent textContent =>
JsonSerializer.Serialize(new { type = "text", data = textContent.Text ?? "" }),
StreamingReasoningContent reasoningContent =>
JsonSerializer.Serialize(new { type = "reasoning", data = reasoningContent.Text }),
StreamingFunctionCallUpdateContent functionCall =>
JsonSerializer.Serialize(new { type = "function_call", data = functionCall }),
_ =>
JsonSerializer.Serialize(new { type = "unknown", data = item })
};
// Write a structured JSON message to the HTTP response as SSE
var messageBytes = Encoding.UTF8.GetBytes($"data: {messageJson}\n\n");
await Response.Body.WriteAsync(messageBytes);
await Response.Body.FlushAsync();
functionCallBuilder.Append(streamingContent);
}
// Accumulate content for saving (only text content)
accumulatedContent.Append(chunk.Content ?? "");
var finalMessageText = textContentBuilder.ToString();
if (!string.IsNullOrEmpty(finalMessageText))
{
assistantParts.Add(new SnThinkingMessagePart
{ Type = ThinkingMessagePartType.Text, Text = finalMessageText });
}
var functionCalls = functionCallBuilder.Build()
.Where(fc => !string.IsNullOrEmpty(fc.Id)).ToList();
if (functionCalls.Count == 0)
break;
var assistantMessage = new ChatMessageContent(
authorRole ?? AuthorRole.Assistant,
string.IsNullOrEmpty(finalMessageText) ? null : finalMessageText
);
foreach (var functionCall in functionCalls)
{
assistantMessage.Items.Add(functionCall);
}
chatHistory.Add(assistantMessage);
foreach (var functionCall in functionCalls)
{
var part = new SnThinkingMessagePart
{
Type = ThinkingMessagePartType.FunctionCall,
FunctionCall = new SnFunctionCall
{
Id = functionCall.Id!,
PluginName = functionCall.PluginName,
Name = functionCall.FunctionName,
Arguments = JsonSerializer.Serialize(functionCall.Arguments)
}
};
assistantParts.Add(part);
var messageJson = JsonSerializer.Serialize(new { type = "function_call", data = part.FunctionCall });
await Response.Body.WriteAsync(Encoding.UTF8.GetBytes($"data: {messageJson}\n\n"));
await Response.Body.FlushAsync();
FunctionResultContent resultContent;
try
{
resultContent = await functionCall.InvokeAsync(kernel);
}
catch (Exception ex)
{
resultContent = new FunctionResultContent(functionCall.Id!, ex.Message);
}
chatHistory.Add(resultContent.ToChatMessage());
var resultPart = new SnThinkingMessagePart
{
Type = ThinkingMessagePartType.FunctionResult,
FunctionResult = new SnFunctionResult
{
CallId = resultContent.CallId!,
PluginName = resultContent.PluginName,
FunctionName = resultContent.FunctionName,
Result = resultContent.Result!,
IsError = resultContent.Result is Exception
}
};
assistantParts.Add(resultPart);
var resultMessageJson =
JsonSerializer.Serialize(new { type = "function_result", data = resultPart.FunctionResult });
await Response.Body.WriteAsync(Encoding.UTF8.GetBytes($"data: {resultMessageJson}\n\n"));
await Response.Body.FlushAsync();
}
}
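Every chunk the loop above emits is written as a Server-Sent Events frame of the form `data: {json}\n\n`. A minimal, dependency-free sketch of that framing (helper name hypothetical, not part of this diff):

```csharp
using System;
using System.Text;
using System.Text.Json;

static class SseFraming
{
    // Serialize a typed payload and wrap it in an SSE `data:` frame,
    // matching the `data: {json}\n\n` pattern the controller writes above.
    public static byte[] BuildFrame(string type, object data)
    {
        var json = JsonSerializer.Serialize(new { type, data });
        return Encoding.UTF8.GetBytes($"data: {json}\n\n");
    }
}
```

The trailing blank line is what marks the end of one event for `EventSource`-style clients, which is why every write in the controller ends with `\n\n` before flushing.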
// Save assistant thought
var savedThought = await service.SaveThoughtAsync(
sequence,
accumulatedContent.ToString(),
assistantParts,
ThinkingThoughtRole.Assistant,
thinkingChunks,
provider.ModelDefault
serviceId
);
// Write the topic if it was newly set, then the thought object as JSON to the stream
@@ -209,7 +371,6 @@ public class ThoughtController(ThoughtProvider provider, ThoughtService service)
{
var topicJson = JsonSerializer.Serialize(new { type = "topic", data = sequence.Topic ?? "" });
await streamBuilder.WriteAsync(Encoding.UTF8.GetBytes($"topic: {topicJson}\n\n"));
savedThought.Sequence.Topic = topic;
}
var thoughtJson = JsonSerializer.Serialize(new { type = "thought", data = savedThought },
@@ -250,6 +411,25 @@ public class ThoughtController(ThoughtProvider provider, ThoughtService service)
return Ok(sequences);
}
[HttpPatch("sequences/{sequenceId:guid}/sharing")]
[ProducesResponseType(StatusCodes.Status204NoContent)]
[ProducesResponseType(StatusCodes.Status404NotFound)]
[ProducesResponseType(StatusCodes.Status403Forbidden)]
public async Task<ActionResult> UpdateSequenceSharing(Guid sequenceId, [FromBody] UpdateSharingRequest request)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser) return Unauthorized();
var accountId = Guid.Parse(currentUser.Id);
var sequence = await service.GetSequenceAsync(sequenceId);
if (sequence == null) return NotFound();
if (sequence.AccountId != accountId) return Forbid();
sequence.IsPublic = request.IsPublic;
await service.UpdateSequenceAsync(sequence);
return NoContent();
}
/// <summary>
/// Retrieves the thoughts in a specific thinking sequence.
/// </summary>
@@ -262,12 +442,18 @@ public class ThoughtController(ThoughtProvider provider, ThoughtService service)
[ProducesResponseType(StatusCodes.Status404NotFound)]
public async Task<ActionResult<List<SnThinkingThought>>> GetSequenceThoughts(Guid sequenceId)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser) return Unauthorized();
var accountId = Guid.Parse(currentUser.Id);
var sequence = await service.GetOrCreateSequenceAsync(accountId, sequenceId);
var sequence = await service.GetSequenceAsync(sequenceId);
if (sequence == null) return NotFound();
if (!sequence.IsPublic)
{
if (HttpContext.Items["CurrentUser"] is not Account currentUser) return Unauthorized();
var accountId = Guid.Parse(currentUser.Id);
if (sequence.AccountId != accountId)
return StatusCode(403);
}
var thoughts = await service.GetPreviousThoughtsAsync(sequence);
return Ok(thoughts);


@@ -1,158 +1,126 @@
using System.ClientModel;
using System.Diagnostics.CodeAnalysis;
using System.Text.Json;
using DysonNetwork.Shared.Models;
using DysonNetwork.Insight.Thought.Plugins;
using DysonNetwork.Shared.Proto;
using DysonNetwork.Shared.Registry;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.Ollama;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using OpenAI;
using PostType = DysonNetwork.Shared.Proto.PostType;
using Microsoft.SemanticKernel.Plugins.Web;
using Microsoft.SemanticKernel.Plugins.Web.Bing;
using Microsoft.SemanticKernel.Plugins.Web.Google;
using NodaTime.Serialization.Protobuf;
using NodaTime.Text;
namespace DysonNetwork.Insight.Thought;
public class ThoughtServiceModel
{
public string ServiceId { get; set; } = null!;
public string? Provider { get; set; }
public string? Model { get; set; }
public double BillingMultiplier { get; set; }
public int PerkLevel { get; set; }
}
public class ThoughtProvider
{
private readonly PostService.PostServiceClient _postClient;
private readonly AccountService.AccountServiceClient _accountClient;
private readonly IConfiguration _configuration;
private readonly ILogger<ThoughtProvider> _logger;
public Kernel Kernel { get; }
private string? ModelProviderType { get; set; }
public string? ModelDefault { get; set; }
private readonly Dictionary<string, Kernel> _kernels = new();
private readonly Dictionary<string, string> _serviceProviders = new();
private readonly Dictionary<string, ThoughtServiceModel> _serviceModels = new();
private readonly string _defaultServiceId;
[Experimental("SKEXP0050")]
public ThoughtProvider(
IConfiguration configuration,
PostService.PostServiceClient postServiceClient,
AccountService.AccountServiceClient accountServiceClient,
ILogger<ThoughtProvider> logger
AccountService.AccountServiceClient accountServiceClient
)
{
_logger = logger;
_postClient = postServiceClient;
_accountClient = accountServiceClient;
_configuration = configuration;
Kernel = InitializeThinkingProvider(configuration);
InitializeHelperFunctions();
var cfg = configuration.GetSection("Thinking");
_defaultServiceId = cfg.GetValue<string>("DefaultService")!;
var services = cfg.GetSection("Services").GetChildren();
foreach (var service in services)
{
var serviceId = service.Key;
var serviceModel = new ThoughtServiceModel
{
ServiceId = serviceId,
Provider = service.GetValue<string>("Provider"),
Model = service.GetValue<string>("Model"),
BillingMultiplier = service.GetValue<double>("BillingMultiplier", 1.0),
PerkLevel = service.GetValue<int>("PerkLevel", 0)
};
_serviceModels[serviceId] = serviceModel;
var providerType = service.GetValue<string>("Provider")?.ToLower();
if (providerType is null) continue;
var kernel = InitializeThinkingService(service);
InitializeHelperFunctions(kernel);
_kernels[serviceId] = kernel;
_serviceProviders[serviceId] = providerType;
}
}
private Kernel InitializeThinkingProvider(IConfiguration configuration)
private Kernel InitializeThinkingService(IConfigurationSection serviceConfig)
{
var cfg = configuration.GetSection("Thinking");
ModelProviderType = cfg.GetValue<string>("Provider")?.ToLower();
ModelDefault = cfg.GetValue<string>("Model");
var endpoint = cfg.GetValue<string>("Endpoint");
var apiKey = cfg.GetValue<string>("ApiKey");
var providerType = serviceConfig.GetValue<string>("Provider")?.ToLower();
var model = serviceConfig.GetValue<string>("Model");
var endpoint = serviceConfig.GetValue<string>("Endpoint");
var apiKey = serviceConfig.GetValue<string>("ApiKey");
var builder = Kernel.CreateBuilder();
switch (ModelProviderType)
switch (providerType)
{
case "ollama":
builder.AddOllamaChatCompletion(ModelDefault!, new Uri(endpoint ?? "http://localhost:11434/api"));
builder.AddOllamaChatCompletion(
model!,
new Uri(endpoint ?? "http://localhost:11434/api")
);
break;
case "deepseek":
var client = new OpenAIClient(
new ApiKeyCredential(apiKey!),
new OpenAIClientOptions { Endpoint = new Uri(endpoint ?? "https://api.deepseek.com/v1") }
);
builder.AddOpenAIChatCompletion(ModelDefault!, client);
builder.AddOpenAIChatCompletion(model!, client);
break;
default:
throw new IndexOutOfRangeException("Unknown thinking provider: " + ModelProviderType);
throw new IndexOutOfRangeException("Unknown thinking provider: " + providerType);
}
// Add gRPC clients for Thought Plugins
builder.Services.AddServiceDiscoveryCore();
builder.Services.AddServiceDiscovery();
builder.Services.AddAccountService();
builder.Services.AddSphereService();
builder.Plugins.AddFromObject(new SnAccountKernelPlugin(_accountClient));
builder.Plugins.AddFromObject(new SnPostKernelPlugin(_postClient));
return builder.Build();
}
[Experimental("SKEXP0050")]
private void InitializeHelperFunctions()
private void InitializeHelperFunctions(Kernel kernel)
{
// Add Solar Network tools plugin
Kernel.ImportPluginFromFunctions("solar_network", [
KernelFunctionFactory.CreateFromMethod(async (string userId) =>
{
var request = new GetAccountRequest { Id = userId };
var response = await _accountClient.GetAccountAsync(request);
return JsonSerializer.Serialize(response, GrpcTypeHelper.SerializerOptions);
}, "get_user", "Get a user profile from the Solar Network."),
KernelFunctionFactory.CreateFromMethod(async (string postId) =>
{
var request = new GetPostRequest { Id = postId };
var response = await _postClient.GetPostAsync(request);
return JsonSerializer.Serialize(response, GrpcTypeHelper.SerializerOptions);
}, "get_post", "Get a single post by ID from the Solar Network."),
KernelFunctionFactory.CreateFromMethod(async (string query) =>
{
var request = new SearchPostsRequest { Query = query, PageSize = 10 };
var response = await _postClient.SearchPostsAsync(request);
return JsonSerializer.Serialize(response.Posts, GrpcTypeHelper.SerializerOptions);
}, "search_posts",
"Search posts by query from the Solar Network. The input query will be used to search the title, description and body content"),
KernelFunctionFactory.CreateFromMethod(async (
string? orderBy = null,
string? afterIso = null,
string? beforeIso = null
) =>
{
_logger.LogInformation("Begin building request to list post from sphere...");
var request = new ListPostsRequest
{
PageSize = 20,
OrderBy = orderBy,
};
if (!string.IsNullOrEmpty(afterIso))
try
{
request.After = InstantPattern.General.Parse(afterIso).Value.ToTimestamp();
}
catch (Exception)
{
_logger.LogWarning("Invalid afterIso format: {AfterIso}", afterIso);
}
if (!string.IsNullOrEmpty(beforeIso))
try
{
request.Before = InstantPattern.General.Parse(beforeIso).Value.ToTimestamp();
}
catch (Exception)
{
_logger.LogWarning("Invalid beforeIso format: {BeforeIso}", beforeIso);
}
_logger.LogInformation("Request built, {Request}", request);
var response = await _postClient.ListPostsAsync(request);
var data = response.Posts.Select(SnPost.FromProtoValue);
_logger.LogInformation("Sphere service returned posts: {Posts}", data);
return JsonSerializer.Serialize(data, GrpcTypeHelper.SerializerOptions);
}, "list_posts",
"Get posts from the Solar Network.\n" +
"Parameters:\n" +
"orderBy (optional, string: order by published date, accepts asc or desc)\n" +
"afterIso (optional, string: ISO date for posts after this date)\n" +
"beforeIso (optional, string: ISO date for posts before this date)"
)
]);
// Add web search plugins if configured
var bingApiKey = _configuration.GetValue<string>("Thinking:BingApiKey");
if (!string.IsNullOrEmpty(bingApiKey))
{
var bingConnector = new BingConnector(bingApiKey);
var bing = new WebSearchEnginePlugin(bingConnector);
Kernel.ImportPluginFromObject(bing, "bing");
kernel.ImportPluginFromObject(bing, "bing");
}
var googleApiKey = _configuration.GetValue<string>("Thinking:GoogleApiKey");
@@ -163,36 +131,58 @@ public class ThoughtProvider
apiKey: googleApiKey,
searchEngineId: googleCx);
var google = new WebSearchEnginePlugin(googleConnector);
Kernel.ImportPluginFromObject(google, "google");
kernel.ImportPluginFromObject(google, "google");
}
}
public PromptExecutionSettings CreatePromptExecutionSettings()
public Kernel? GetKernel(string? serviceId = null)
{
switch (ModelProviderType)
serviceId ??= _defaultServiceId;
return _kernels.GetValueOrDefault(serviceId);
}
public string GetServiceId(string? serviceId = null)
{
return serviceId ?? _defaultServiceId;
}
public IEnumerable<string> GetAvailableServices()
{
return _kernels.Keys;
}
public IEnumerable<ThoughtServiceModel> GetAvailableServicesInfo()
{
return _serviceModels.Values;
}
public ThoughtServiceModel? GetServiceInfo(string? serviceId)
{
serviceId ??= _defaultServiceId;
return _serviceModels.GetValueOrDefault(serviceId);
}
public string GetDefaultServiceId()
{
return _defaultServiceId;
}
public PromptExecutionSettings CreatePromptExecutionSettings(string? serviceId = null)
{
serviceId ??= _defaultServiceId;
var providerType = _serviceProviders.GetValueOrDefault(serviceId);
return providerType switch
{
case "ollama":
return new OllamaPromptExecutionSettings
{
FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(
options: new FunctionChoiceBehaviorOptions
{
AllowParallelCalls = true,
AllowConcurrentInvocation = true
})
};
case "deepseek":
return new OpenAIPromptExecutionSettings
{
FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(
options: new FunctionChoiceBehaviorOptions
{
AllowParallelCalls = true,
AllowConcurrentInvocation = true
})
};
default:
throw new InvalidOperationException("Unknown provider: " + ModelProviderType);
}
"ollama" => new OllamaPromptExecutionSettings
{
FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(autoInvoke: false)
},
"deepseek" => new OpenAIPromptExecutionSettings
{
FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(autoInvoke: false), ModelId = serviceId
},
_ => throw new InvalidOperationException("Unknown provider for service: " + serviceId)
};
}
}
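The provider above keeps one kernel per configured service id and falls back to a configured default when the caller passes `null`. In isolation, that lookup pattern looks roughly like this (a sketch with hypothetical names, not the actual `ThoughtProvider`):

```csharp
using System.Collections.Generic;

class ServiceRegistry<T>
{
    private readonly Dictionary<string, T> _services = new();
    private readonly string _defaultId;

    public ServiceRegistry(string defaultId) => _defaultId = defaultId;

    public void Register(string id, T service) => _services[id] = service;

    // Mirrors GetKernel: a null serviceId resolves to the default,
    // and unknown ids yield default(T) rather than throwing.
    public T? Resolve(string? id = null) =>
        _services.GetValueOrDefault(id ?? _defaultId);
}
```

Returning `null` for unknown service ids (instead of throwing) pushes the "service not found" decision up to the controller, which can then respond with a proper 4xx instead of a 500.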


@@ -11,8 +11,7 @@ namespace DysonNetwork.Insight.Thought;
public class ThoughtService(
AppDatabase db,
ICacheService cache,
PaymentService.PaymentServiceClient paymentService,
WalletService.WalletServiceClient walletService
PaymentService.PaymentServiceClient paymentService
)
{
public async Task<SnThinkingSequence?> GetOrCreateSequenceAsync(
@@ -37,40 +36,52 @@ public class ThoughtService(
}
}
public async Task<SnThinkingSequence?> GetSequenceAsync(Guid sequenceId)
{
return await db.ThinkingSequences.FindAsync(sequenceId);
}
public async Task UpdateSequenceAsync(SnThinkingSequence sequence)
{
db.ThinkingSequences.Update(sequence);
await db.SaveChangesAsync();
}
public async Task<SnThinkingThought> SaveThoughtAsync(
SnThinkingSequence sequence,
string content,
List<SnThinkingMessagePart> parts,
ThinkingThoughtRole role,
List<SnThinkingChunk>? chunks = null,
string? model = null
)
{
// Approximate token count (1 token ≈ 4 characters for GPT-like models)
var tokenCount = content?.Length / 4 ?? 0;
var totalChars = parts.Sum(part =>
((part.Type == ThinkingMessagePartType.Text ? part.Text?.Length : 0) ?? 0) +
((part.Type == ThinkingMessagePartType.FunctionCall ? part.FunctionCall?.Arguments.Length : 0) ?? 0)
);
var tokenCount = totalChars / 4;
var thought = new SnThinkingThought
{
SequenceId = sequence.Id,
Content = content,
Parts = parts,
Role = role,
TokenCount = tokenCount,
ModelName = model,
Chunks = chunks ?? new List<SnThinkingChunk>(),
};
db.ThinkingThoughts.Add(thought);
// Update sequence total tokens only for assistant responses
if (role == ThinkingThoughtRole.Assistant)
sequence.TotalToken += tokenCount;
await db.SaveChangesAsync();
// Invalidate cache for this sequence's thoughts
await cache.RemoveGroupAsync($"sequence:{sequence.Id}");
return thought;
}
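`SaveThoughtAsync` estimates token usage as total characters divided by four (the rough "1 token ≈ 4 characters" heuristic for GPT-like models), summing text parts and serialized function-call arguments. Pulled out as a standalone helper (name hypothetical), the arithmetic is simply:

```csharp
static class TokenEstimate
{
    // Rough heuristic: ~1 token per 4 characters for GPT-like models,
    // counting both text content and function-call argument JSON.
    public static int Estimate(int textChars, int functionArgChars) =>
        (textChars + functionArgChars) / 4;
}
```

Integer division rounds down, so very short thoughts can be counted as zero tokens; that is acceptable here because the estimate only feeds aggregate billing, not per-message limits.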
public async Task<List<SnThinkingThought>> GetPreviousThoughtsAsync(SnThinkingSequence sequence)
{
var cacheKey = $"thoughts:{sequence.Id}";
@@ -133,6 +144,13 @@ public class ThoughtService(
foreach (var accountGroup in groupedByAccount)
{
var accountId = accountGroup.Key;
if (await db.UnpaidAccounts.AnyAsync(u => u.AccountId == accountId))
{
logger.LogWarning("Skipping billing for marked account {accountId}", accountId);
continue;
}
var totalUnpaidTokens = accountGroup.Sum(s => s.TotalToken - s.PaidToken);
var cost = (long)Math.Ceiling(totalUnpaidTokens / 10.0);
@@ -166,9 +184,86 @@ public class ThoughtService(
catch (Exception ex)
{
logger.LogError(ex, "Error billing for account {accountId}", accountId);
if (!await db.UnpaidAccounts.AnyAsync(u => u.AccountId == accountId))
{
db.UnpaidAccounts.Add(new SnUnpaidAccount { AccountId = accountId, MarkedAt = DateTime.UtcNow });
}
}
}
await db.SaveChangesAsync();
}
public async Task<(bool success, long cost)> RetryBillingForAccountAsync(Guid accountId, ILogger logger)
{
var isMarked = await db.UnpaidAccounts.FirstOrDefaultAsync(u => u.AccountId == accountId);
if (isMarked == null)
{
logger.LogInformation("Account {accountId} is not marked for unpaid bills.", accountId);
return (true, 0);
}
var sequences = await db
.ThinkingSequences.Where(s => s.AccountId == accountId && s.PaidToken < s.TotalToken)
.ToListAsync();
if (!sequences.Any())
{
logger.LogInformation("No unpaid sequences found for account {accountId}. Unmarking.", accountId);
db.UnpaidAccounts.Remove(isMarked);
await db.SaveChangesAsync();
return (true, 0);
}
var totalUnpaidTokens = sequences.Sum(s => s.TotalToken - s.PaidToken);
var cost = (long)Math.Ceiling(totalUnpaidTokens / 10.0);
if (cost == 0)
{
logger.LogInformation("Unpaid tokens for {accountId} resulted in zero cost. Marking as paid and unmarking.", accountId);
foreach (var sequence in sequences)
{
sequence.PaidToken = sequence.TotalToken;
}
db.UnpaidAccounts.Remove(isMarked);
await db.SaveChangesAsync();
return (true, 0);
}
try
{
var date = DateTime.Now.ToString("yyyy-MM-dd");
await paymentService.CreateTransactionWithAccountAsync(
new CreateTransactionWithAccountRequest
{
PayerAccountId = accountId.ToString(),
Currency = WalletCurrency.SourcePoint,
Amount = cost.ToString(),
Remarks = $"Wage for SN-chan on {date} (Retry)",
Type = TransactionType.System,
}
);
foreach (var sequence in sequences)
{
sequence.PaidToken = sequence.TotalToken;
}
db.UnpaidAccounts.Remove(isMarked);
logger.LogInformation(
"Successfully billed {cost} points for account {accountId} on retry.",
cost,
accountId
);
await db.SaveChangesAsync();
return (true, cost);
}
catch (Exception ex)
{
logger.LogError(ex, "Error retrying billing for account {accountId}", accountId);
return (false, cost);
}
}
}
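Both billing paths above charge `ceil(unpaidTokens / 10)` source points. A standalone sketch of that cost rule (hypothetical helper, mirroring the `Math.Ceiling` expression in the service):

```csharp
using System;

static class TokenBilling
{
    // One source point per 10 tokens, rounded up, as in
    // (long)Math.Ceiling(totalUnpaidTokens / 10.0) above.
    public static long Cost(long unpaidTokens) =>
        (long)Math.Ceiling(unpaidTokens / 10.0);
}
```

Rounding up means any nonzero unpaid balance costs at least one point; the `cost == 0` branch in `RetryBillingForAccountAsync` can therefore only be reached when there are literally zero unpaid tokens.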


@@ -1,7 +1,6 @@
using DysonNetwork.Insight.Thought;
using Quartz;
namespace DysonNetwork.Insight.Startup;
namespace DysonNetwork.Insight.Thought;
public class TokenBillingJob(ThoughtService thoughtService, ILogger<TokenBillingJob> logger) : IJob
{


@@ -10,7 +10,10 @@
},
"AllowedHosts": "*",
"ConnectionStrings": {
"App": "Host=localhost;Port=5432;Database=dyson_insight;Username=postgres;Password=postgres;Include Error Detail=True;Maximum Pool Size=20;Connection Idle Lifetime=60"
"App": "Host=localhost;Port=5432;Database=dyson_insight;Username=postgres;Password=postgres;Include Error Detail=True;Maximum Pool Size=20;Connection Idle Lifetime=60",
"Registrar": "127.0.0.1:2379",
"Cache": "127.0.0.1:6379",
"Queue": "127.0.0.1:4222"
},
"KnownProxies": [
"127.0.0.1",
@@ -19,9 +22,26 @@
"Etcd": {
"Insecure": true
},
"Cache": {
"Serializer": "MessagePack"
},
"Thinking": {
"Provider": "deepseek",
"Model": "deepseek-chat",
"ApiKey": "sk-bd20f6a2e9fa40b98c46899baa0e9f09"
"DefaultService": "deepseek-chat",
"Services": {
"deepseek-chat": {
"Provider": "deepseek",
"Model": "deepseek-chat",
"ApiKey": "sk-",
"BillingMultiplier": 1.0,
"PerkLevel": 0
},
"deepseek-reasoner": {
"Provider": "deepseek",
"Model": "deepseek-reasoner",
"ApiKey": "sk-",
"BillingMultiplier": 1.5,
"PerkLevel": 1
}
}
}
}
}


@@ -1,9 +1,11 @@
using System.ComponentModel.DataAnnotations;
using DysonNetwork.Pass.Affiliation;
using DysonNetwork.Pass.Auth;
using DysonNetwork.Pass.Credit;
using DysonNetwork.Pass.Permission;
using DysonNetwork.Pass.Wallet;
using DysonNetwork.Shared.GeoIp;
using DysonNetwork.Shared.Auth;
using DysonNetwork.Shared.Geometry;
using DysonNetwork.Shared.Http;
using DysonNetwork.Shared.Models;
using Microsoft.AspNetCore.Authorization;
@@ -22,7 +24,8 @@ public class AccountController(
SubscriptionService subscriptions,
AccountEventService events,
SocialCreditService socialCreditService,
GeoIpService geo
AffiliationSpellService ars,
GeoService geo
) : ControllerBase
{
[HttpGet("{name}")]
@@ -34,7 +37,7 @@ public class AccountController(
.Include(e => e.Badges)
.Include(e => e.Profile)
.Include(e => e.Contacts.Where(c => c.IsPublic))
.Where(a => a.Name == name)
.Where(a => EF.Functions.Like(a.Name, name))
.FirstOrDefaultAsync();
if (account is null) return NotFound(ApiError.NotFound(name, traceId: HttpContext.TraceIdentifier));
@@ -103,6 +106,52 @@ public class AccountController(
[MaxLength(32)] public string Language { get; set; } = "en-us";
[Required] public string CaptchaToken { get; set; } = string.Empty;
public string? AffiliationSpell { get; set; }
}
public class AccountCreateValidateRequest
{
[MinLength(2)]
[MaxLength(256)]
[RegularExpression(@"^[A-Za-z0-9_-]+$",
ErrorMessage = "Name can only contain letters, numbers, underscores, and hyphens.")
]
public string? Name { get; set; }
[EmailAddress]
[RegularExpression(@"^[^+]+@[^@]+\.[^@]+$", ErrorMessage = "Email address cannot contain '+' symbol.")]
[MaxLength(1024)]
public string? Email { get; set; }
public string? AffiliationSpell { get; set; }
}
[HttpPost("validate")]
[ProducesResponseType(StatusCodes.Status200OK)]
[ProducesResponseType(StatusCodes.Status404NotFound)]
public async Task<ActionResult<string>> ValidateCreateAccountRequest(
[FromBody] AccountCreateValidateRequest request)
{
if (request.Name is not null)
{
if (await accounts.CheckAccountNameHasTaken(request.Name))
return BadRequest("Account name has already been taken.");
}
if (request.Email is not null)
{
if (await accounts.CheckEmailHasBeenUsed(request.Email))
return BadRequest("Email has already been used.");
}
if (request.AffiliationSpell is not null)
{
if (!await ars.CheckAffiliationSpellHasTaken(request.AffiliationSpell))
return BadRequest("No affiliation spell has been found.");
}
return Ok("Everything seems good.");
}
[HttpPost]
@@ -271,10 +320,21 @@ public class AccountController(
[HttpPost("credits/validate")]
[Authorize]
[RequiredPermission("maintenance", "credits.validate.perform")]
[AskPermission("credits.validate.perform")]
public async Task<IActionResult> PerformSocialCreditValidation()
{
await socialCreditService.ValidateSocialCredits();
return Ok();
}
}
[HttpDelete("{name}")]
[Authorize]
[AskPermission("accounts.deletion")]
public async Task<IActionResult> AdminDeleteAccount(string name)
{
var account = await accounts.LookupAccount(name);
if (account is null) return NotFound();
await accounts.DeleteAccount(account);
return Ok();
}
}


@@ -1,6 +1,7 @@
using System.ComponentModel.DataAnnotations;
using DysonNetwork.Pass.Permission;
using DysonNetwork.Pass.Wallet;
using DysonNetwork.Shared.Auth;
using DysonNetwork.Shared.Http;
using DysonNetwork.Shared.Models;
using DysonNetwork.Shared.Proto;
@@ -82,7 +83,7 @@ public class AccountCurrentController(
[MaxLength(4096)] public string? Bio { get; set; }
public Shared.Models.UsernameColor? UsernameColor { get; set; }
public Instant? Birthday { get; set; }
public List<ProfileLink>? Links { get; set; }
public List<SnProfileLink>? Links { get; set; }
[MaxLength(32)] public string? PictureId { get; set; }
[MaxLength(32)] public string? BackgroundId { get; set; }
@@ -194,7 +195,7 @@ public class AccountCurrentController(
}
[HttpPatch("statuses")]
[RequiredPermission("global", "accounts.statuses.update")]
[AskPermission("accounts.statuses.update")]
public async Task<ActionResult<SnAccountStatus>> UpdateStatus([FromBody] AccountController.StatusRequest request)
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser) return Unauthorized();
@@ -228,7 +229,7 @@ public class AccountCurrentController(
}
[HttpPost("statuses")]
[RequiredPermission("global", "accounts.statuses.create")]
[AskPermission("accounts.statuses.create")]
public async Task<ActionResult<SnAccountStatus>> CreateStatus([FromBody] AccountController.StatusRequest request)
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser) return Unauthorized();
@@ -559,7 +560,7 @@ public class AccountCurrentController(
[HttpGet("devices")]
[Authorize]
public async Task<ActionResult<List<SnAuthClientWithChallenge>>> GetDevices()
public async Task<ActionResult<List<SnAuthClientWithSessions>>> GetDevices()
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser ||
HttpContext.Items["CurrentSession"] is not SnAuthSession currentSession) return Unauthorized();
@@ -570,18 +571,41 @@ public class AccountCurrentController(
.Where(device => device.AccountId == currentUser.Id)
.ToListAsync();
var challengeDevices = devices.Select(SnAuthClientWithChallenge.FromClient).ToList();
var deviceIds = challengeDevices.Select(x => x.Id).ToList();
var sessionDevices = devices.ConvertAll(SnAuthClientWithSessions.FromClient);
var clientIds = sessionDevices.Select(x => x.Id).ToList();
var authChallenges = await db.AuthChallenges
.Where(c => c.ClientId != null && deviceIds.Contains(c.ClientId.Value))
.GroupBy(c => c.ClientId)
.ToDictionaryAsync(c => c.Key!.Value, c => c.ToList());
foreach (var challengeDevice in challengeDevices)
if (authChallenges.TryGetValue(challengeDevice.Id, out var challenge))
challengeDevice.Challenges = challenge;
var authSessions = await db.AuthSessions
.Where(c => c.ClientId != null && clientIds.Contains(c.ClientId.Value))
.GroupBy(c => c.ClientId!.Value)
.ToDictionaryAsync(c => c.Key, c => c.ToList());
foreach (var dev in sessionDevices)
if (authSessions.TryGetValue(dev.Id, out var challenge))
dev.Sessions = challenge;
return Ok(challengeDevices);
return Ok(sessionDevices);
}
[HttpGet("challenges")]
[Authorize]
public async Task<ActionResult<List<SnAuthChallenge>>> GetChallenges(
[FromQuery] int take = 20,
[FromQuery] int offset = 0
)
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser) return Unauthorized();
var query = db.AuthChallenges
.Where(challenge => challenge.AccountId == currentUser.Id)
.OrderByDescending(c => c.CreatedAt);
var total = await query.CountAsync();
Response.Headers.Append("X-Total", total.ToString());
var challenges = await query
.Skip(offset)
.Take(take)
.ToListAsync();
return Ok(challenges);
}
[HttpGet("sessions")]
@@ -595,8 +619,8 @@ public class AccountCurrentController(
HttpContext.Items["CurrentSession"] is not SnAuthSession currentSession) return Unauthorized();
var query = db.AuthSessions
.OrderByDescending(x => x.LastGrantedAt)
.Include(session => session.Account)
.Include(session => session.Challenge)
.Where(session => session.Account.Id == currentUser.Id);
var total = await query.CountAsync();
@@ -604,7 +628,6 @@ public class AccountCurrentController(
Response.Headers.Append("X-Auth-Session", currentSession.Id.ToString());
var sessions = await query
.OrderByDescending(x => x.LastGrantedAt)
.Skip(offset)
.Take(take)
.ToListAsync();
@@ -688,7 +711,7 @@ public class AccountCurrentController(
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser ||
HttpContext.Items["CurrentSession"] is not SnAuthSession currentSession) return Unauthorized();
var device = await db.AuthClients.FirstOrDefaultAsync(d => d.Id == currentSession.Challenge.ClientId);
var device = await db.AuthClients.FirstOrDefaultAsync(d => d.Id == currentSession.ClientId);
if (device is null) return NotFound();
try


@@ -137,7 +137,7 @@ public class AccountEventService(
}
}
if (cacheMissUserIds.Count != 0)
if (cacheMissUserIds.Count == 0) return results;
{
var now = SystemClock.Instance.GetCurrentInstant();
var statusesFromDb = await db.AccountStatuses
@@ -160,7 +160,7 @@ public class AccountEventService(
}
var usersWithoutStatus = cacheMissUserIds.Except(foundUserIds).ToList();
if (usersWithoutStatus.Any())
if (usersWithoutStatus.Count == 0) return results;
{
foreach (var userId in usersWithoutStatus)
{
@@ -313,46 +313,84 @@ public class AccountEventService(
CultureInfo.CurrentCulture = cultureInfo;
CultureInfo.CurrentUICulture = cultureInfo;
// Generate 2 positive tips
var positiveIndices = Enumerable.Range(1, FortuneTipCount)
.OrderBy(_ => Random.Next())
.Take(2)
.ToList();
var tips = positiveIndices.Select(index => new CheckInFortuneTip
{
IsPositive = true,
Title = localizer[$"FortuneTipPositiveTitle_{index}"].Value,
Content = localizer[$"FortuneTipPositiveContent_{index}"].Value
}).ToList();
// Generate 2 negative tips
var negativeIndices = Enumerable.Range(1, FortuneTipCount)
.Except(positiveIndices)
.OrderBy(_ => Random.Next())
.Take(2)
.ToList();
tips.AddRange(negativeIndices.Select(index => new CheckInFortuneTip
{
IsPositive = false,
Title = localizer[$"FortuneTipNegativeTitle_{index}"].Value,
Content = localizer[$"FortuneTipNegativeContent_{index}"].Value
}));
// The 5 is specialized, keep it alone.
var sum = 0;
var maxLevel = Enum.GetValues<CheckInResultLevel>().Length - 1;
for (var i = 0; i < 5; i++)
sum += Random.Next(maxLevel);
var checkInLevel = (CheckInResultLevel)(sum / 5);
var accountBirthday = await db.AccountProfiles
var accountProfile = await db.AccountProfiles
.Where(x => x.AccountId == user.Id)
.Select(x => x.Birthday)
.Select(x => new { x.Birthday, x.TimeZone })
.FirstOrDefaultAsync();
var accountBirthday = accountProfile?.Birthday;
var now = SystemClock.Instance.GetCurrentInstant().InUtc().Date;
if (accountBirthday.HasValue && accountBirthday.Value.InUtc().Date == now)
var now = SystemClock.Instance.GetCurrentInstant();
var userTimeZone = DateTimeZone.Utc;
if (!string.IsNullOrEmpty(accountProfile?.TimeZone))
{
userTimeZone = DateTimeZoneProviders.Tzdb.GetZoneOrNull(accountProfile.TimeZone) ?? DateTimeZone.Utc;
}
var todayInUserTz = now.InZone(userTimeZone).Date;
var birthdayDate = accountBirthday?.InZone(userTimeZone).Date;
var isBirthday = birthdayDate.HasValue &&
birthdayDate.Value.Month == todayInUserTz.Month &&
birthdayDate.Value.Day == todayInUserTz.Day;
List<CheckInFortuneTip> tips;
CheckInResultLevel checkInLevel;
if (isBirthday)
{
// Skip random logic and tips generation for birthday
checkInLevel = CheckInResultLevel.Special;
tips = [
new CheckInFortuneTip()
{
IsPositive = true,
Title = localizer["FortuneTipSpecialTitle_Birthday"].Value,
Content = localizer["FortuneTipSpecialContent_Birthday", user.Nick].Value,
}
];
}
else
{
// Generate 2 positive tips
var positiveIndices = Enumerable.Range(1, FortuneTipCount)
.OrderBy(_ => Random.Next())
.Take(2)
.ToList();
tips = positiveIndices.Select(index => new CheckInFortuneTip
{
IsPositive = true,
Title = localizer[$"FortuneTipPositiveTitle_{index}"].Value,
Content = localizer[$"FortuneTipPositiveContent_{index}"].Value
}).ToList();
// Generate 2 negative tips
var negativeIndices = Enumerable.Range(1, FortuneTipCount)
.Except(positiveIndices)
.OrderBy(_ => Random.Next())
.Take(2)
.ToList();
tips.AddRange(negativeIndices.Select(index => new CheckInFortuneTip
{
IsPositive = false,
Title = localizer[$"FortuneTipNegativeTitle_{index}"].Value,
Content = localizer[$"FortuneTipNegativeContent_{index}"].Value
}));
// Level 5 (Special) is reserved for birthdays and excluded from the random weights.
// Use weighted random distribution to make all levels reasonably achievable
// Weights: Worst: 10%, Worse: 20%, Normal: 40%, Better: 20%, Best: 10%
var randomValue = Random.Next(100);
checkInLevel = randomValue switch
{
< 10 => CheckInResultLevel.Worst, // 0-9: 10% chance
< 30 => CheckInResultLevel.Worse, // 10-29: 20% chance
< 70 => CheckInResultLevel.Normal, // 30-69: 40% chance
< 90 => CheckInResultLevel.Better, // 70-89: 20% chance
_ => CheckInResultLevel.Best // 90-99: 10% chance
};
}
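The `switch` over `Random.Next(100)` above is an inverse-CDF lookup over cumulative weights (10/20/40/20/10%). The same mapping, sketched in Python with assumed level names taken from the enum:

```python
import random

# Cumulative thresholds for the 10/20/40/20/10 % weighting above.
LEVELS = [(10, "Worst"), (30, "Worse"), (70, "Normal"), (90, "Better"), (100, "Best")]

def check_in_level(value: int) -> str:
    """Map a uniform draw in [0, 100) onto a level via cumulative thresholds."""
    for threshold, level in LEVELS:
        if value < threshold:
            return level
    raise ValueError("value must be in [0, 100)")

def draw_level(rng: random.Random) -> str:
    return check_in_level(rng.randrange(100))
```

Unlike the old sum-of-five-draws scheme, which clustered results around the middle levels, this makes every level's probability explicit and easy to tune.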
var result = new SnCheckInResult
{
@@ -472,6 +510,54 @@ public class AccountEventService(
return activities;
}
public async Task<Dictionary<Guid, List<SnPresenceActivity>>> GetActiveActivitiesBatch(List<Guid> userIds)
{
var results = new Dictionary<Guid, List<SnPresenceActivity>>();
var cacheMissUserIds = new List<Guid>();
// Try to get activities from cache first
foreach (var userId in userIds)
{
var cacheKey = $"{ActivityCacheKey}{userId}";
var cachedActivities = await cache.GetAsync<List<SnPresenceActivity>>(cacheKey);
if (cachedActivities != null)
{
results[userId] = cachedActivities;
}
else
{
cacheMissUserIds.Add(userId);
}
}
// If all activities were found in cache, return early
if (cacheMissUserIds.Count == 0) return results;
// Fetch remaining activities from database in a single query
var now = SystemClock.Instance.GetCurrentInstant();
var activitiesFromDb = await db.PresenceActivities
.Where(e => cacheMissUserIds.Contains(e.AccountId) && e.LeaseExpiresAt > now && e.DeletedAt == null)
.ToListAsync();
// Group activities by user ID and update cache
var activitiesByUser = activitiesFromDb
.GroupBy(a => a.AccountId)
.ToDictionary(g => g.Key, g => g.ToList());
foreach (var userId in cacheMissUserIds)
{
var userActivities = activitiesByUser.GetValueOrDefault(userId, new List<SnPresenceActivity>());
results[userId] = userActivities;
// Update cache for this user
var cacheKey = $"{ActivityCacheKey}{userId}";
await cache.SetWithGroupsAsync(cacheKey, userActivities, [$"{AccountService.AccountCachePrefix}{userId}"],
TimeSpan.FromMinutes(1));
}
return results;
}
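`GetActiveActivitiesBatch` follows a cache-aside batch pattern: probe the cache per user, fetch every miss in a single database query, then backfill the cache — including empty lists, so users with no activities are negatively cached instead of hitting the database on every call. A compact sketch (Python for illustration; `fetch_from_db` and the row shape are hypothetical):

```python
def get_activities_batch(user_ids, cache, fetch_from_db):
    """Cache-aside batch read: per-user cache probe, one DB query for all misses,
    then backfill the cache (empty lists included, as negative cache entries)."""
    results, misses = {}, []
    for uid in user_ids:
        cached = cache.get(uid)
        if cached is not None:
            results[uid] = cached
        else:
            misses.append(uid)
    if not misses:
        return results  # everything was cached; no DB round-trip
    rows = fetch_from_db(misses)  # single query covering every miss
    by_user = {}
    for row in rows:
        by_user.setdefault(row["account_id"], []).append(row)
    for uid in misses:
        activities = by_user.get(uid, [])
        results[uid] = activities
        cache[uid] = activities  # backfill; the real code also sets a short TTL
    return results
```

The short (one-minute) TTL in the real code bounds how stale a presence list can get while still collapsing bursts of friend-list reads into one query.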
public async Task<(List<SnPresenceActivity>, int)> GetAllActivities(Guid userId, int offset = 0, int take = 20)
{
var query = db.PresenceActivities

View File

@@ -1,8 +1,11 @@
using System.Globalization;
using DysonNetwork.Pass.Affiliation;
using DysonNetwork.Pass.Auth.OpenId;
using DysonNetwork.Pass.Localization;
using DysonNetwork.Pass.Mailer;
using DysonNetwork.Pass.Resources.Emails;
using DysonNetwork.Shared.Cache;
using DysonNetwork.Shared.Data;
using DysonNetwork.Shared.Models;
using DysonNetwork.Shared.Proto;
using DysonNetwork.Shared.Stream;
@@ -22,6 +25,7 @@ public class AccountService(
FileService.FileServiceClient files,
FileReferenceService.FileReferenceServiceClient fileRefs,
AccountUsernameService uname,
AffiliationSpellService ars,
EmailService mailer,
RingService.RingServiceClient pusher,
IStringLocalizer<NotificationResource> localizer,
@@ -52,11 +56,13 @@ public class AccountService(
public async Task<SnAccount?> LookupAccount(string probe)
{
var account = await db.Accounts.Where(a => a.Name == probe).FirstOrDefaultAsync();
var account = await db.Accounts.Where(a => EF.Functions.ILike(a.Name, probe)).FirstOrDefaultAsync();
if (account is not null) return account;
var contact = await db.AccountContacts
.Where(c => c.Content == probe)
.Where(c => c.Type == Shared.Models.AccountContactType.Email ||
c.Type == Shared.Models.AccountContactType.PhoneNumber)
.Where(c => EF.Functions.ILike(c.Content, probe))
.Include(c => c.Account)
.FirstOrDefaultAsync();
return contact?.Account;
@@ -79,6 +85,17 @@ public class AccountService(
return profile?.Level;
}
public async Task<bool> CheckAccountNameHasTaken(string name)
{
return await db.Accounts.AnyAsync(a => EF.Functions.ILike(a.Name, name));
}
public async Task<bool> CheckEmailHasBeenUsed(string email)
{
return await db.AccountContacts.AnyAsync(c =>
c.Type == Shared.Models.AccountContactType.Email && EF.Functions.ILike(c.Content, email));
}
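`EF.Functions.ILike` translates to PostgreSQL's `ILIKE`; with a wildcard-free probe it behaves like a case-insensitive equality test, which is what makes `LookupAccount` and the two `Check…` helpers above tolerant of casing. Roughly, in Python (a simplification: real `ILIKE` also treats `%` and `_` as wildcards):

```python
def ilike_equals(stored: str, probe: str) -> bool:
    """Approximates ILIKE for wildcard-free probes: case-insensitive equality."""
    return stored.casefold() == probe.casefold()

def name_taken(names, probe) -> bool:
    # Mirrors CheckAccountNameHasTaken: any stored name matching, ignoring case.
    return any(ilike_equals(n, probe) for n in names)
```

One caveat worth knowing: a probe containing `%` or `_` would match more than intended under real `ILIKE`, so user-supplied probes may need escaping.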
public async Task<SnAccount> CreateAccount(
string name,
string nick,
@@ -86,12 +103,12 @@ public class AccountService(
string? password,
string language = "en-US",
string region = "en",
string? affiliationSpell = null,
bool isEmailVerified = false,
bool isActivated = false
)
{
var dupeNameCount = await db.Accounts.Where(a => a.Name == name).CountAsync();
if (dupeNameCount > 0)
if (await CheckAccountNameHasTaken(name))
throw new InvalidOperationException("Account name has already been taken.");
var dupeEmailCount = await db.AccountContacts
@@ -99,7 +116,7 @@ public class AccountService(
).CountAsync();
if (dupeEmailCount > 0)
throw new InvalidOperationException("Account email has already been used.");
var account = new SnAccount
{
Name = name,
@@ -108,7 +125,7 @@ public class AccountService(
Region = region,
Contacts =
[
new()
new SnAccountContact
{
Type = Shared.Models.AccountContactType.Email,
Content = email,
@@ -130,6 +147,9 @@ public class AccountService(
Profile = new SnAccountProfile()
};
if (affiliationSpell is not null)
await ars.CreateAffiliationResult(affiliationSpell, $"account:{account.Id}");
if (isActivated)
{
account.ActivatedAt = SystemClock.Instance.GetCurrentInstant();
@@ -138,7 +158,7 @@ public class AccountService(
{
db.PermissionGroupMembers.Add(new SnPermissionGroupMember
{
Actor = $"user:{account.Id}",
Actor = account.Id.ToString(),
Group = defaultGroup
});
}
@@ -179,10 +199,7 @@ public class AccountService(
displayName,
userInfo.Email,
null,
"en-US",
"en",
userInfo.EmailVerified,
userInfo.EmailVerified
isEmailVerified: userInfo.EmailVerified
);
}
@@ -272,7 +289,8 @@ public class AccountService(
return isExists;
}
public async Task<SnAccountAuthFactor?> CreateAuthFactor(SnAccount account, Shared.Models.AccountAuthFactorType type, string? secret)
public async Task<SnAccountAuthFactor?> CreateAuthFactor(SnAccount account,
Shared.Models.AccountAuthFactorType type, string? secret)
{
SnAccountAuthFactor? factor = null;
switch (type)
@@ -350,7 +368,8 @@ public class AccountService(
public async Task<SnAccountAuthFactor> EnableAuthFactor(SnAccountAuthFactor factor, string? code)
{
if (factor.EnabledAt is not null) throw new ArgumentException("The factor has been enabled.");
if (factor.Type is Shared.Models.AccountAuthFactorType.Password or Shared.Models.AccountAuthFactorType.TimedCode)
if (factor.Type is Shared.Models.AccountAuthFactorType.Password
or Shared.Models.AccountAuthFactorType.TimedCode)
{
if (code is null || !factor.VerifyPassword(code))
throw new InvalidOperationException(
@@ -447,10 +466,10 @@ public class AccountService(
}
await mailer
.SendTemplatedEmailAsync<Emails.VerificationEmail, VerificationEmailModel>(
.SendTemplatedEmailAsync<FactorCodeEmail, VerificationEmailModel>(
account.Nick,
contact.Content,
emailLocalizer["EmailCodeTitle"],
emailLocalizer["CodeEmailTitle"],
new VerificationEmailModel
{
Name = account.Name,
@@ -506,9 +525,7 @@ public class AccountService(
private async Task<bool> IsDeviceActive(Guid id)
{
return await db.AuthSessions
.Include(s => s.Challenge)
.AnyAsync(s => s.Challenge.ClientId == id);
return await db.AuthSessions.AnyAsync(s => s.ClientId == id);
}
public async Task<SnAuthClient> UpdateDeviceName(SnAccount account, string deviceId, string label)
@@ -527,8 +544,7 @@ public class AccountService(
public async Task DeleteSession(SnAccount account, Guid sessionId)
{
var session = await db.AuthSessions
.Include(s => s.Challenge)
.ThenInclude(s => s.Client)
.Include(s => s.Client)
.Where(s => s.Id == sessionId && s.AccountId == account.Id)
.FirstOrDefaultAsync();
if (session is null) throw new InvalidOperationException("Session was not found.");
@@ -537,11 +553,11 @@ public class AccountService(
db.AuthSessions.Remove(session);
await db.SaveChangesAsync();
if (session.Challenge.ClientId.HasValue)
if (session.ClientId.HasValue)
{
if (!await IsDeviceActive(session.Challenge.ClientId.Value))
if (!await IsDeviceActive(session.ClientId.Value))
await pusher.UnsubscribePushNotificationsAsync(new UnsubscribePushNotificationsRequest()
{ DeviceId = session.Challenge.Client!.DeviceId }
{ DeviceId = session.Client!.DeviceId }
);
}
@@ -562,15 +578,13 @@ public class AccountService(
);
var sessions = await db.AuthSessions
.Include(s => s.Challenge)
.Where(s => s.Challenge.ClientId == device.Id && s.AccountId == account.Id)
.Where(s => s.ClientId == device.Id && s.AccountId == account.Id)
.ToListAsync();
// The current session should be included in the sessions list
var now = SystemClock.Instance.GetCurrentInstant();
await db.AuthSessions
.Include(s => s.Challenge)
.Where(s => s.Challenge.ClientId == device.Id)
.Where(s => s.ClientId == device.Id)
.ExecuteUpdateAsync(p => p.SetProperty(s => s.DeletedAt, s => now));
db.AuthClients.Remove(device);
@@ -580,7 +594,8 @@ public class AccountService(
await cache.RemoveAsync($"{AuthService.AuthCachePrefix}{item.Id}");
}
public async Task<SnAccountContact> CreateContactMethod(SnAccount account, Shared.Models.AccountContactType type, string content)
public async Task<SnAccountContact> CreateContactMethod(SnAccount account, Shared.Models.AccountContactType type,
string content)
{
var isExists = await db.AccountContacts
.Where(x => x.AccountId == account.Id && x.Type == type && x.Content == content)
@@ -642,7 +657,8 @@ public class AccountService(
}
}
public async Task<SnAccountContact> SetContactMethodPublic(SnAccount account, SnAccountContact contact, bool isPublic)
public async Task<SnAccountContact> SetContactMethodPublic(SnAccount account, SnAccountContact contact,
bool isPublic)
{
contact.IsPublic = isPublic;
db.AccountContacts.Update(contact);

View File

@@ -24,15 +24,16 @@ public class AccountServiceGrpc(
public override async Task<Shared.Proto.Account> GetAccount(GetAccountRequest request, ServerCallContext context)
{
if (!Guid.TryParse(request.Id, out var accountId))
throw new RpcException(new Grpc.Core.Status(StatusCode.InvalidArgument, "Invalid account ID format"));
throw new RpcException(new Status(StatusCode.InvalidArgument, "Invalid account ID format"));
var account = await _db.Accounts
.AsNoTracking()
.Include(a => a.Profile)
.Include(a => a.Contacts.Where(c => c.IsPublic))
.FirstOrDefaultAsync(a => a.Id == accountId);
if (account == null)
throw new RpcException(new Grpc.Core.Status(StatusCode.NotFound, $"Account {request.Id} not found"));
throw new RpcException(new Status(StatusCode.NotFound, $"Account {request.Id} not found"));
var perk = await subscriptions.GetPerkSubscriptionAsync(account.Id);
account.PerkSubscription = perk?.ToReference();

View File

@@ -1,10 +1,10 @@
using DysonNetwork.Shared.Cache;
using DysonNetwork.Shared.GeoIp;
using DysonNetwork.Shared.Geometry;
using DysonNetwork.Shared.Models;
namespace DysonNetwork.Pass.Account;
public class ActionLogService(GeoIpService geo, FlushBufferService fbs)
public class ActionLogService(GeoService geo, FlushBufferService fbs)
{
public void CreateActionLog(Guid accountId, string action, Dictionary<string, object> meta)
{

View File

@@ -0,0 +1,55 @@
using DysonNetwork.Shared.Models;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
namespace DysonNetwork.Pass.Account;
[ApiController]
[Route("/api/friends")]
public class FriendsController(AppDatabase db, RelationshipService rels, AccountEventService events) : ControllerBase
{
public class FriendOverviewItem
{
public SnAccount Account { get; set; } = null!;
public SnAccountStatus Status { get; set; } = null!;
public List<SnPresenceActivity> Activities { get; set; } = [];
}
[HttpGet("overview")]
[Authorize]
public async Task<ActionResult<List<FriendOverviewItem>>> GetOverview([FromQuery] bool includeOffline = false)
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser) return Unauthorized();
var friendIds = await rels.ListAccountFriends(currentUser);
// EF Core's DbContext is not thread-safe, so run the three queries
// sequentially rather than concurrently with Task.WhenAll.
var accounts = await db.Accounts
.Where(a => friendIds.Contains(a.Id))
.Include(a => a.Profile)
.ToListAsync();
var statuses = await events.GetStatuses(friendIds);
var activities = await events.GetActiveActivitiesBatch(friendIds);
var result = (from account in accounts
let status = statuses.GetValueOrDefault(account.Id)
where includeOffline || status is { IsOnline: true }
let accountActivities = activities.GetValueOrDefault(account.Id, new List<SnPresenceActivity>())
select new FriendOverviewItem
{
Account = account, Status = status ?? new SnAccountStatus { AccountId = account.Id },
Activities = accountActivities
}).ToList();
return Ok(result);
}
}

View File

@@ -1,3 +1,5 @@
using DysonNetwork.Shared.Models;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
@@ -7,17 +9,31 @@ namespace DysonNetwork.Pass.Account;
[Route("/api/spells")]
public class MagicSpellController(AppDatabase db, MagicSpellService sp) : ControllerBase
{
[HttpPost("activation/resend")]
[Authorize]
public async Task<ActionResult> ResendActivationMagicSpell()
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser) return Unauthorized();
var spell = await db.MagicSpells.FirstOrDefaultAsync(s =>
s.Type == MagicSpellType.AccountActivation && s.AccountId == currentUser.Id);
if (spell is null) return BadRequest("Unable to find activation magic spell.");
await sp.NotifyMagicSpell(spell, true);
return Ok();
}
[HttpPost("{spellId:guid}/resend")]
public async Task<ActionResult> ResendMagicSpell(Guid spellId)
{
var spell = await db.MagicSpells.FirstOrDefaultAsync(x => x.Id == spellId);
if (spell is null)
return NotFound();
await sp.NotifyMagicSpell(spell, true);
return Ok();
}
[HttpGet("{spellWord}")]
public async Task<ActionResult> GetMagicSpell(string spellWord)
{
@@ -38,7 +54,8 @@ public class MagicSpellController(AppDatabase db, MagicSpellService sp) : Contro
}
[HttpPost("{spellWord}/apply")]
public async Task<ActionResult> ApplyMagicSpell([FromRoute] string spellWord, [FromBody] MagicSpellApplyRequest? request)
public async Task<ActionResult> ApplyMagicSpell([FromRoute] string spellWord,
[FromBody] MagicSpellApplyRequest? request)
{
var word = Uri.UnescapeDataString(spellWord);
var spell = await db.MagicSpells
@@ -59,6 +76,7 @@ public class MagicSpellController(AppDatabase db, MagicSpellService sp) : Contro
{
return BadRequest(ex.Message);
}
return Ok();
}
}

View File

@@ -1,7 +1,7 @@
using System.Security.Cryptography;
using System.Text.Json;
using DysonNetwork.Pass.Emails;
using DysonNetwork.Pass.Mailer;
using DysonNetwork.Pass.Resources.Emails;
using DysonNetwork.Shared.Cache;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
@@ -26,6 +26,7 @@ public class MagicSpellService(
Dictionary<string, object> meta,
Instant? expiredAt = null,
Instant? affectedAt = null,
string? code = null,
bool preventRepeat = false
)
{
@@ -41,7 +42,7 @@ public class MagicSpellService(
return existingSpell;
}
var spellWord = _GenerateRandomString(128);
var spellWord = code ?? _GenerateRandomString(128);
var spell = new SnMagicSpell
{
Spell = spellWord,
@@ -94,10 +95,10 @@ public class MagicSpellService(
switch (spell.Type)
{
case MagicSpellType.AccountActivation:
await email.SendTemplatedEmailAsync<LandingEmail, LandingEmailModel>(
await email.SendTemplatedEmailAsync<RegistrationConfirmEmail, LandingEmailModel>(
contact.Account.Nick,
contact.Content,
localizer["EmailLandingTitle"],
localizer["RegConfirmTitle"],
new LandingEmailModel
{
Name = contact.Account.Name,
@@ -109,7 +110,7 @@ public class MagicSpellService(
await email.SendTemplatedEmailAsync<AccountDeletionEmail, AccountDeletionEmailModel>(
contact.Account.Nick,
contact.Content,
localizer["EmailAccountDeletionTitle"],
localizer["AccountDeletionTitle"],
new AccountDeletionEmailModel
{
Name = contact.Account.Name,
@@ -121,7 +122,7 @@ public class MagicSpellService(
await email.SendTemplatedEmailAsync<PasswordResetEmail, PasswordResetEmailModel>(
contact.Account.Nick,
contact.Content,
localizer["EmailPasswordResetTitle"],
localizer["PasswordResetTitle"],
new PasswordResetEmailModel
{
Name = contact.Account.Name,
@@ -135,7 +136,7 @@ public class MagicSpellService(
await email.SendTemplatedEmailAsync<ContactVerificationEmail, ContactVerificationEmailModel>(
contact.Account.Nick,
contactMethod!,
localizer["EmailContactVerificationTitle"],
localizer["ContactVerificationTitle"],
new ContactVerificationEmailModel
{
Name = contact.Account.Name,
@@ -193,7 +194,7 @@ public class MagicSpellService(
{
db.PermissionGroupMembers.Add(new SnPermissionGroupMember
{
Actor = $"user:{account.Id}",
Actor = account.Id.ToString(),
Group = defaultGroup
});
}

View File

@@ -17,12 +17,18 @@ public class RelationshipService(
{
private const string UserFriendsCacheKeyPrefix = "accounts:friends:";
private const string UserBlockedCacheKeyPrefix = "accounts:blocked:";
private static readonly TimeSpan CacheExpiration = TimeSpan.FromHours(1);
public async Task<bool> HasExistingRelationship(Guid accountId, Guid relatedId)
{
if (accountId == Guid.Empty || relatedId == Guid.Empty)
throw new ArgumentException("Account IDs cannot be empty.");
if (accountId == relatedId)
return false; // Prevent self-relationships
var count = await db.AccountRelationships
.Where(r => (r.AccountId == accountId && r.RelatedId == relatedId) ||
(r.AccountId == relatedId && r.AccountId == accountId))
(r.AccountId == relatedId && r.RelatedId == accountId))
.CountAsync();
return count > 0;
}
@@ -34,6 +40,9 @@ public class RelationshipService(
bool ignoreExpired = false
)
{
if (accountId == Guid.Empty || relatedId == Guid.Empty)
throw new ArgumentException("Account IDs cannot be empty.");
var now = Instant.FromDateTimeUtc(DateTime.UtcNow);
var queries = db.AccountRelationships.AsQueryable()
.Where(r => r.AccountId == accountId && r.RelatedId == relatedId);
@@ -61,7 +70,7 @@ public class RelationshipService(
db.AccountRelationships.Add(relationship);
await db.SaveChangesAsync();
await PurgeRelationshipCache(sender.Id, target.Id);
await PurgeRelationshipCache(sender.Id, target.Id, status);
return relationship;
}
@@ -80,7 +89,7 @@ public class RelationshipService(
db.Remove(relationship);
await db.SaveChangesAsync();
await PurgeRelationshipCache(sender.Id, target.Id);
await PurgeRelationshipCache(sender.Id, target.Id, RelationshipStatus.Blocked);
return relationship;
}
@@ -114,19 +123,24 @@ public class RelationshipService(
}
});
await PurgeRelationshipCache(sender.Id, target.Id, RelationshipStatus.Pending);
return relationship;
}
public async Task DeleteFriendRequest(Guid accountId, Guid relatedId)
{
var relationship = await GetRelationship(accountId, relatedId, RelationshipStatus.Pending);
if (relationship is null) throw new ArgumentException("Friend request was not found.");
if (accountId == Guid.Empty || relatedId == Guid.Empty)
throw new ArgumentException("Account IDs cannot be empty.");
await db.AccountRelationships
var affectedRows = await db.AccountRelationships
.Where(r => r.AccountId == accountId && r.RelatedId == relatedId && r.Status == RelationshipStatus.Pending)
.ExecuteDeleteAsync();
await PurgeRelationshipCache(relationship.AccountId, relationship.RelatedId);
if (affectedRows == 0)
throw new ArgumentException("Friend request was not found.");
await PurgeRelationshipCache(accountId, relatedId, RelationshipStatus.Pending);
}
public async Task<SnAccountRelationship> AcceptFriendRelationship(
@@ -155,7 +169,7 @@ public class RelationshipService(
await db.SaveChangesAsync();
await PurgeRelationshipCache(relationship.AccountId, relationship.RelatedId);
await PurgeRelationshipCache(relationship.AccountId, relationship.RelatedId, RelationshipStatus.Friends, status);
return relationshipBackward;
}
@@ -165,11 +179,12 @@ public class RelationshipService(
var relationship = await GetRelationship(accountId, relatedId);
if (relationship is null) throw new ArgumentException("There is no relationship between you and the user.");
if (relationship.Status == status) return relationship;
var oldStatus = relationship.Status;
relationship.Status = status;
db.Update(relationship);
await db.SaveChangesAsync();
await PurgeRelationshipCache(accountId, relatedId);
await PurgeRelationshipCache(accountId, relatedId, oldStatus, status);
return relationship;
}
@@ -181,21 +196,7 @@ public class RelationshipService(
public async Task<List<Guid>> ListAccountFriends(Guid accountId)
{
var cacheKey = $"{UserFriendsCacheKeyPrefix}{accountId}";
var friends = await cache.GetAsync<List<Guid>>(cacheKey);
if (friends == null)
{
friends = await db.AccountRelationships
.Where(r => r.RelatedId == accountId)
.Where(r => r.Status == RelationshipStatus.Friends)
.Select(r => r.AccountId)
.ToListAsync();
await cache.SetAsync(cacheKey, friends, TimeSpan.FromHours(1));
}
return friends ?? [];
return await GetCachedRelationships(accountId, RelationshipStatus.Friends, UserFriendsCacheKeyPrefix);
}
public async Task<List<Guid>> ListAccountBlocked(SnAccount account)
@@ -205,21 +206,7 @@ public class RelationshipService(
public async Task<List<Guid>> ListAccountBlocked(Guid accountId)
{
var cacheKey = $"{UserBlockedCacheKeyPrefix}{accountId}";
var blocked = await cache.GetAsync<List<Guid>>(cacheKey);
if (blocked == null)
{
blocked = await db.AccountRelationships
.Where(r => r.RelatedId == accountId)
.Where(r => r.Status == RelationshipStatus.Blocked)
.Select(r => r.AccountId)
.ToListAsync();
await cache.SetAsync(cacheKey, blocked, TimeSpan.FromHours(1));
}
return blocked ?? [];
return await GetCachedRelationships(accountId, RelationshipStatus.Blocked, UserBlockedCacheKeyPrefix);
}
public async Task<bool> HasRelationshipWithStatus(Guid accountId, Guid relatedId,
@@ -229,11 +216,52 @@ public class RelationshipService(
return relationship is not null;
}
private async Task PurgeRelationshipCache(Guid accountId, Guid relatedId)
private async Task<List<Guid>> GetCachedRelationships(Guid accountId, RelationshipStatus status, string cachePrefix)
{
await cache.RemoveAsync($"{UserFriendsCacheKeyPrefix}{accountId}");
await cache.RemoveAsync($"{UserFriendsCacheKeyPrefix}{relatedId}");
await cache.RemoveAsync($"{UserBlockedCacheKeyPrefix}{accountId}");
await cache.RemoveAsync($"{UserBlockedCacheKeyPrefix}{relatedId}");
if (accountId == Guid.Empty)
throw new ArgumentException("Account ID cannot be empty.");
var cacheKey = $"{cachePrefix}{accountId}";
var relationships = await cache.GetAsync<List<Guid>>(cacheKey);
if (relationships == null)
{
var now = Instant.FromDateTimeUtc(DateTime.UtcNow);
relationships = await db.AccountRelationships
.Where(r => r.RelatedId == accountId)
.Where(r => r.Status == status)
.Where(r => r.ExpiredAt == null || r.ExpiredAt > now)
.Select(r => r.AccountId)
.ToListAsync();
await cache.SetAsync(cacheKey, relationships, CacheExpiration);
}
return relationships ?? new List<Guid>();
}
}
private async Task PurgeRelationshipCache(Guid accountId, Guid relatedId, params RelationshipStatus[] statuses)
{
if (statuses.Length == 0)
{
statuses = Enum.GetValues<RelationshipStatus>();
}
var keysToRemove = new List<string>();
if (statuses.Contains(RelationshipStatus.Friends) || statuses.Contains(RelationshipStatus.Pending))
{
keysToRemove.Add($"{UserFriendsCacheKeyPrefix}{accountId}");
keysToRemove.Add($"{UserFriendsCacheKeyPrefix}{relatedId}");
}
if (statuses.Contains(RelationshipStatus.Blocked))
{
keysToRemove.Add($"{UserBlockedCacheKeyPrefix}{accountId}");
keysToRemove.Add($"{UserBlockedCacheKeyPrefix}{relatedId}");
}
var removeTasks = keysToRemove.Select(key => cache.RemoveAsync(key));
await Task.WhenAll(removeTasks);
}
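The new `PurgeRelationshipCache` only evicts the cache entries a given status transition can actually affect: `Friends` and `Pending` transitions touch the friends cache, `Blocked` touches the blocked cache, and calling it with no statuses purges everything. The key-selection rule, sketched in Python:

```python
FRIENDS_PREFIX = "accounts:friends:"
BLOCKED_PREFIX = "accounts:blocked:"

def purge_keys(account_id, related_id, statuses):
    """Select only the cache keys a status transition can invalidate.
    An empty status list means 'purge all', matching the C# fallback."""
    if not statuses:
        statuses = ["Friends", "Pending", "Blocked"]
    keys = []
    if {"Friends", "Pending"} & set(statuses):
        keys += [f"{FRIENDS_PREFIX}{account_id}", f"{FRIENDS_PREFIX}{related_id}"]
    if "Blocked" in statuses:
        keys += [f"{BLOCKED_PREFIX}{account_id}", f"{BLOCKED_PREFIX}{related_id}"]
    return keys
```

Compared with the old version, which unconditionally removed all four keys, this keeps warm friend lists intact when only a block is added or removed.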
}

View File

@@ -0,0 +1,134 @@
using System.ComponentModel.DataAnnotations;
using DysonNetwork.Shared.Models;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
namespace DysonNetwork.Pass.Affiliation;
[ApiController]
[Route("/api/affiliations")]
public class AffiliationSpellController(AppDatabase db, AffiliationSpellService ars) : ControllerBase
{
public class CreateAffiliationSpellRequest
{
[MaxLength(1024)] public string? Spell { get; set; }
}
[HttpPost]
[Authorize]
public async Task<ActionResult<SnAffiliationSpell>> CreateSpell([FromBody] CreateAffiliationSpellRequest request)
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser) return Unauthorized();
try
{
var spell = await ars.CreateAffiliationSpell(currentUser.Id, request.Spell);
return Ok(spell);
}
catch (InvalidOperationException e)
{
return BadRequest(e.Message);
}
}
[HttpGet]
[Authorize]
public async Task<ActionResult<List<SnAffiliationSpell>>> ListCreatedSpells(
[FromQuery(Name = "order")] string orderBy = "date",
[FromQuery(Name = "desc")] bool orderDesc = false,
[FromQuery] int take = 10,
[FromQuery] int offset = 0
)
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser) return Unauthorized();
var queryable = db.AffiliationSpells
.Where(s => s.AccountId == currentUser.Id)
.AsQueryable();
queryable = orderBy switch
{
"usage" => orderDesc
? queryable.OrderByDescending(q => q.Results.Count)
: queryable.OrderBy(q => q.Results.Count),
_ => orderDesc
? queryable.OrderByDescending(q => q.CreatedAt)
: queryable.OrderBy(q => q.CreatedAt)
};
var totalCount = await queryable.CountAsync();
Response.Headers["X-Total"] = totalCount.ToString();
var spells = await queryable
.Skip(offset)
.Take(take)
.ToListAsync();
return Ok(spells);
}
[HttpGet("{id:guid}")]
[Authorize]
public async Task<ActionResult<SnAffiliationSpell>> GetSpell([FromRoute] Guid id)
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser) return Unauthorized();
var spell = await db.AffiliationSpells
.Where(s => s.AccountId == currentUser.Id)
.Where(s => s.Id == id)
.FirstOrDefaultAsync();
if (spell is null) return NotFound();
return Ok(spell);
}
[HttpGet("{id:guid}/results")]
[Authorize]
public async Task<ActionResult<List<SnAffiliationResult>>> ListResults(
[FromRoute] Guid id,
[FromQuery(Name = "desc")] bool orderDesc = false,
[FromQuery] int take = 10,
[FromQuery] int offset = 0
)
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser) return Unauthorized();
var queryable = db.AffiliationResults
.Include(r => r.Spell)
.Where(r => r.Spell.AccountId == currentUser.Id)
.Where(r => r.SpellId == id)
.AsQueryable();
// Order by creation date
queryable = orderDesc
? queryable.OrderByDescending(r => r.CreatedAt)
: queryable.OrderBy(r => r.CreatedAt);
var totalCount = await queryable.CountAsync();
Response.Headers["X-Total"] = totalCount.ToString();
var results = await queryable
.Skip(offset)
.Take(take)
.ToListAsync();
return Ok(results);
}
[HttpDelete("{id:guid}")]
[Authorize]
public async Task<ActionResult> DeleteSpell([FromRoute] Guid id)
{
if (HttpContext.Items["CurrentUser"] is not SnAccount currentUser) return Unauthorized();
var spell = await db.AffiliationSpells
.Where(s => s.AccountId == currentUser.Id)
.Where(s => s.Id == id)
.FirstOrDefaultAsync();
if (spell is null) return NotFound();
db.AffiliationSpells.Remove(spell);
await db.SaveChangesAsync();
return Ok();
}
}

View File

@@ -0,0 +1,62 @@
using System.Security.Cryptography;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
namespace DysonNetwork.Pass.Affiliation;
public class AffiliationSpellService(AppDatabase db)
{
public async Task<SnAffiliationSpell> CreateAffiliationSpell(Guid accountId, string? spellWord)
{
spellWord ??= _GenerateRandomString(8);
if (await CheckAffiliationSpellHasTaken(spellWord))
throw new InvalidOperationException("The spell has been taken.");
var spell = new SnAffiliationSpell
{
AccountId = accountId,
Spell = spellWord
};
db.AffiliationSpells.Add(spell);
await db.SaveChangesAsync();
return spell;
}
public async Task<SnAffiliationResult> CreateAffiliationResult(string spellWord, string resourceId)
{
var spell =
await db.AffiliationSpells.FirstOrDefaultAsync(a => a.Spell == spellWord);
if (spell is null) throw new InvalidOperationException("The spell was not found.");
var result = new SnAffiliationResult
{
Spell = spell,
ResourceIdentifier = resourceId
};
db.AffiliationResults.Add(result);
await db.SaveChangesAsync();
return result;
}
public async Task<bool> CheckAffiliationSpellHasTaken(string spellWord)
{
return await db.AffiliationSpells.AnyAsync(s => s.Spell == spellWord);
}
private static string _GenerateRandomString(int length)
{
const string chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
var result = new char[length];
using var rng = RandomNumberGenerator.Create();
for (var i = 0; i < length; i++)
{
var bytes = new byte[1];
rng.GetBytes(bytes);
result[i] = chars[bytes[0] % chars.Length];
}
return new string(result);
}
}
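A subtlety in `_GenerateRandomString`: `bytes[0] % chars.Length` carries a small modulo bias whenever 256 is not a multiple of the alphabet size (here 256 % 36 = 4, so the first four characters are marginally more likely). For short, non-secret affiliation codes this is harmless, but the bias-free version uses rejection sampling. A sketch in Python:

```python
import secrets

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

def random_string(length: int) -> str:
    # Reject bytes >= the largest multiple of len(ALPHABET) below 256,
    # so every character is exactly equally likely.
    limit = 256 - (256 % len(ALPHABET))  # 252 for a 36-character alphabet
    out = []
    while len(out) < length:
        b = secrets.randbits(8)
        if b < limit:
            out.append(ALPHABET[b % len(ALPHABET)])
    return "".join(out)
```

Rejection discards roughly 4/256 of draws, a negligible cost for perfectly uniform output.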

View File

@@ -1,8 +1,8 @@
using System.Linq.Expressions;
using System.Reflection;
using System.Text.Json;
using System.Text.Json.Serialization;
using DysonNetwork.Pass.Permission;
using DysonNetwork.Shared.Data;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Design;
@@ -61,6 +61,9 @@ public class AppDatabase(
public DbSet<SnLottery> Lotteries { get; set; } = null!;
public DbSet<SnLotteryRecord> LotteryRecords { get; set; } = null!;
public DbSet<SnAffiliationSpell> AffiliationSpells { get; set; } = null!;
public DbSet<SnAffiliationResult> AffiliationResults { get; set; } = null!;
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
optionsBuilder.UseNpgsql(
@@ -100,7 +103,7 @@ public class AppDatabase(
"stickers.packs.create",
"stickers.create"
}.Select(permission =>
PermissionService.NewPermissionNode("group:default", "global", permission, true))
PermissionService.NewPermissionNode("group:default", permission, true))
.ToList()
});
await context.SaveChangesAsync(cancellationToken);
@@ -143,51 +146,12 @@ public class AppDatabase(
.HasForeignKey(pm => pm.RealmId)
.OnDelete(DeleteBehavior.Cascade);
// Automatically apply soft-delete filter to all entities inheriting ModelBase
foreach (var entityType in modelBuilder.Model.GetEntityTypes())
{
if (!typeof(ModelBase).IsAssignableFrom(entityType.ClrType)) continue;
var method = typeof(AppDatabase)
.GetMethod(nameof(SetSoftDeleteFilter),
BindingFlags.NonPublic | BindingFlags.Static)!
.MakeGenericMethod(entityType.ClrType);
method.Invoke(null, [modelBuilder]);
}
}
private static void SetSoftDeleteFilter<TEntity>(ModelBuilder modelBuilder)
where TEntity : ModelBase
{
modelBuilder.Entity<TEntity>().HasQueryFilter(e => e.DeletedAt == null);
modelBuilder.ApplySoftDeleteFilters();
}
public override async Task<int> SaveChangesAsync(CancellationToken cancellationToken = default)
{
var now = SystemClock.Instance.GetCurrentInstant();
foreach (var entry in ChangeTracker.Entries<ModelBase>())
{
switch (entry.State)
{
case EntityState.Added:
entry.Entity.CreatedAt = now;
entry.Entity.UpdatedAt = now;
break;
case EntityState.Modified:
entry.Entity.UpdatedAt = now;
break;
case EntityState.Deleted:
entry.State = EntityState.Modified;
entry.Entity.DeletedAt = now;
break;
case EntityState.Detached:
case EntityState.Unchanged:
default:
break;
}
}
this.ApplyAuditableAndSoftDelete();
return await base.SaveChangesAsync(cancellationToken);
}
}
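The replaced `SaveChangesAsync` body above shows the audit/soft-delete pass that `ApplyAuditableAndSoftDelete` now centralizes: stamp `CreatedAt`/`UpdatedAt`, and convert hard deletes into soft deletes. A minimal Python sketch of that change-tracker pass (class and field names are illustrative):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum
from typing import Optional

class State(Enum):
    ADDED = 1
    MODIFIED = 2
    DELETED = 3

@dataclass
class Entry:
    state: State
    created_at: Optional[datetime] = None
    updated_at: Optional[datetime] = None
    deleted_at: Optional[datetime] = None

def apply_auditable_and_soft_delete(entries):
    """Sketch of the pass the EF Core override performs over tracked entries."""
    now = datetime.now(timezone.utc)
    for e in entries:
        if e.state is State.ADDED:
            e.created_at = now
            e.updated_at = now
        elif e.state is State.MODIFIED:
            e.updated_at = now
        elif e.state is State.DELETED:
            # Turn the delete into an update that sets DeletedAt; the
            # soft-delete query filter then hides the row from queries.
            e.state = State.MODIFIED
            e.deleted_at = now
    return entries
```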
@@ -266,34 +230,3 @@ public class AppDatabaseFactory : IDesignTimeDbContextFactory<AppDatabase>
}
}
public static class OptionalQueryExtensions
{
public static IQueryable<T> If<T>(
this IQueryable<T> source,
bool condition,
Func<IQueryable<T>, IQueryable<T>> transform
)
{
return condition ? transform(source) : source;
}
public static IQueryable<T> If<T, TP>(
this IIncludableQueryable<T, TP> source,
bool condition,
Func<IIncludableQueryable<T, TP>, IQueryable<T>> transform
)
where T : class
{
return condition ? transform(source) : source;
}
public static IQueryable<T> If<T, TP>(
this IIncludableQueryable<T, IEnumerable<TP>> source,
bool condition,
Func<IIncludableQueryable<T, IEnumerable<TP>>, IQueryable<T>> transform
)
where T : class
{
return condition ? transform(source) : source;
}
}
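The `If` extension above applies a query transform only when a condition holds, keeping optional filters in a flat fluent chain. The idea in a one-function Python sketch (name `query_if` is illustrative):

```python
from typing import Callable, TypeVar

Q = TypeVar("Q")

def query_if(source: Q, condition: bool, transform: Callable[[Q], Q]) -> Q:
    """Sketch of OptionalQueryExtensions.If: apply `transform` only when
    `condition` is true, otherwise pass the source through unchanged."""
    return transform(source) if condition else source
```

This keeps call sites like `query_if(rows, want_even, lambda r: [x for x in r if x % 2 == 0])` chainable instead of branching with if/else around each optional filter.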

View File

@@ -70,7 +70,7 @@ public class DysonTokenAuthHandler(
};
// Add scopes as claims
session.Challenge?.Scopes.ForEach(scope => claims.Add(new Claim("scope", scope)));
session.Scopes.ForEach(scope => claims.Add(new Claim("scope", scope)));
// Add superuser claim if applicable
if (session.Account.IsSuperuser)
@@ -117,16 +117,17 @@ public class DysonTokenAuthHandler(
{
if (authHeader.StartsWith("Bearer ", StringComparison.OrdinalIgnoreCase))
{
var token = authHeader["Bearer ".Length..].Trim();
var parts = token.Split('.');
var tokenText = authHeader["Bearer ".Length..].Trim();
var parts = tokenText.Split('.');
return new TokenInfo
{
Token = token,
Token = tokenText,
Type = parts.Length == 3 ? TokenType.OidcKey : TokenType.AuthKey
};
}
else if (authHeader.StartsWith("AtField ", StringComparison.OrdinalIgnoreCase))
if (authHeader.StartsWith("AtField ", StringComparison.OrdinalIgnoreCase))
{
return new TokenInfo
{
@@ -134,7 +135,8 @@ public class DysonTokenAuthHandler(
Type = TokenType.AuthKey
};
}
else if (authHeader.StartsWith("AkField ", StringComparison.OrdinalIgnoreCase))
if (authHeader.StartsWith("AkField ", StringComparison.OrdinalIgnoreCase))
{
return new TokenInfo
{

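The updated `Bearer` branch above classifies a token by its shape: three dot-separated segments means a JWT-style OIDC key, anything else is treated as a compact auth key. A Python sketch of that classification (function name is illustrative):

```python
from enum import Enum

class TokenType(Enum):
    AUTH_KEY = "auth"
    OIDC_KEY = "oidc"

def classify_bearer(auth_header: str):
    """Sketch of the Bearer branch: strip the scheme case-insensitively,
    then pick the token type from the number of '.'-separated segments
    (a JWT is header.payload.signature, i.e. exactly three)."""
    prefix = "bearer "
    if not auth_header.lower().startswith(prefix):
        return None
    token_text = auth_header[len(prefix):].strip()
    parts = token_text.split(".")
    kind = TokenType.OIDC_KEY if len(parts) == 3 else TokenType.AUTH_KEY
    return token_text, kind
```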
View File

@@ -3,7 +3,7 @@ using Microsoft.AspNetCore.Mvc;
using NodaTime;
using Microsoft.EntityFrameworkCore;
using DysonNetwork.Pass.Localization;
using DysonNetwork.Shared.GeoIp;
using DysonNetwork.Shared.Geometry;
using DysonNetwork.Shared.Proto;
using Microsoft.Extensions.Localization;
using AccountService = DysonNetwork.Pass.Account.AccountService;
@@ -18,7 +18,7 @@ public class AuthController(
AppDatabase db,
AccountService accounts,
AuthService auth,
GeoIpService geo,
GeoService geo,
ActionLogService als,
RingService.RingServiceClient pusher,
IConfiguration configuration,
@@ -30,12 +30,12 @@ public class AuthController(
public class ChallengeRequest
{
[Required] public ClientPlatform Platform { get; set; }
[Required] public Shared.Models.ClientPlatform Platform { get; set; }
[Required] [MaxLength(256)] public string Account { get; set; } = null!;
[Required] [MaxLength(512)] public string DeviceId { get; set; } = null!;
[MaxLength(1024)] public string? DeviceName { get; set; }
public List<string> Audiences { get; set; } = new();
public List<string> Scopes { get; set; } = new();
public List<string> Audiences { get; set; } = [];
public List<string> Scopes { get; set; } = [];
}
[HttpPost("challenge")]
@@ -61,9 +61,6 @@ public class AuthController(
request.DeviceName ??= userAgent;
var device =
await auth.GetOrCreateDeviceAsync(account.Id, request.DeviceId, request.DeviceName, request.Platform);
// Trying to pick up challenges from the same IP address and user agent
var existingChallenge = await db.AuthChallenges
.Where(e => e.AccountId == account.Id)
@@ -71,15 +68,9 @@ public class AuthController(
.Where(e => e.UserAgent == userAgent)
.Where(e => e.StepRemain > 0)
.Where(e => e.ExpiredAt != null && now < e.ExpiredAt)
.Where(e => e.Type == Shared.Models.ChallengeType.Login)
.Where(e => e.ClientId == device.Id)
.Where(e => e.DeviceId == request.DeviceId)
.FirstOrDefaultAsync();
if (existingChallenge is not null)
{
var existingSession = await db.AuthSessions.Where(e => e.ChallengeId == existingChallenge.Id)
.FirstOrDefaultAsync();
if (existingSession is null) return existingChallenge;
}
if (existingChallenge is not null) return existingChallenge;
var challenge = new SnAuthChallenge
{
@@ -90,7 +81,9 @@ public class AuthController(
IpAddress = ipAddress,
UserAgent = userAgent,
Location = geo.GetPointFromIp(ipAddress),
ClientId = device.Id,
DeviceId = request.DeviceId,
DeviceName = request.DeviceName,
Platform = request.Platform,
AccountId = account.Id
}.Normalize();
@@ -112,14 +105,11 @@ public class AuthController(
.ThenInclude(e => e.Profile)
.FirstOrDefaultAsync(e => e.Id == id);
if (challenge is null)
{
logger.LogWarning("GetChallenge: challenge not found (challengeId={ChallengeId}, ip={IpAddress})",
id, HttpContext.Connection.RemoteIpAddress?.ToString());
return NotFound("Auth challenge was not found.");
}
if (challenge is not null) return challenge;
logger.LogWarning("GetChallenge: challenge not found (challengeId={ChallengeId}, ip={IpAddress})",
id, HttpContext.Connection.RemoteIpAddress?.ToString());
return NotFound("Auth challenge was not found.");
return challenge;
}
[HttpGet("challenge/{id:guid}/factors")]
@@ -176,7 +166,6 @@ public class AuthController(
{
var challenge = await db.AuthChallenges
.Include(e => e.Account)
.Include(authChallenge => authChallenge.Client)
.FirstOrDefaultAsync(e => e.Id == id);
if (challenge is null) return NotFound("Auth challenge was not found.");
@@ -218,7 +207,7 @@ public class AuthController(
throw new ArgumentException("Invalid password.");
}
}
catch (Exception ex)
catch (Exception)
{
challenge.FailedAttempts++;
db.Update(challenge);
@@ -231,8 +220,11 @@ public class AuthController(
);
await db.SaveChangesAsync();
logger.LogWarning("DoChallenge: authentication failure (challengeId={ChallengeId}, factorId={FactorId}, accountId={AccountId}, failedAttempts={FailedAttempts}, factorType={FactorType}, ip={IpAddress}, uaLength={UaLength})",
challenge.Id, factor.Id, challenge.AccountId, challenge.FailedAttempts, factor.Type, HttpContext.Connection.RemoteIpAddress?.ToString(), (HttpContext.Request.Headers.UserAgent.ToString() ?? "").Length);
logger.LogWarning(
"DoChallenge: authentication failure (challengeId={ChallengeId}, factorId={FactorId}, accountId={AccountId}, failedAttempts={FailedAttempts}, factorType={FactorType}, ip={IpAddress}, uaLength={UaLength})",
challenge.Id, factor.Id, challenge.AccountId, challenge.FailedAttempts, factor.Type,
HttpContext.Connection.RemoteIpAddress?.ToString(),
HttpContext.Request.Headers.UserAgent.ToString().Length);
return BadRequest("Invalid password.");
}
@@ -242,11 +234,11 @@ public class AuthController(
AccountService.SetCultureInfo(challenge.Account);
await pusher.SendPushNotificationToUserAsync(new SendPushNotificationToUserRequest
{
Notification = new PushNotification()
Notification = new PushNotification
{
Topic = "auth.login",
Title = localizer["NewLoginTitle"],
Body = localizer["NewLoginBody", challenge.Client?.DeviceName ?? "unknown",
Body = localizer["NewLoginBody", challenge.DeviceName ?? "unknown",
challenge.IpAddress ?? "unknown"],
IsSavable = true
},
@@ -277,6 +269,14 @@ public class AuthController(
public string Token { get; set; } = string.Empty;
}
public class NewSessionRequest
{
[Required] [MaxLength(512)] public string DeviceId { get; set; } = null!;
[MaxLength(1024)] public string? DeviceName { get; set; }
[Required] public Shared.Models.ClientPlatform Platform { get; set; }
public Instant? ExpiredAt { get; set; }
}
[HttpPost("token")]
public async Task<ActionResult<TokenExchangeResponse>> ExchangeToken([FromBody] TokenExchangeRequest request)
{
@@ -327,4 +327,35 @@ public class AuthController(
});
return Ok();
}
}
[HttpPost("login/session")]
[Microsoft.AspNetCore.Authorization.Authorize] // Use full namespace to avoid ambiguity with DysonNetwork.Pass.Permission.Authorize
public async Task<ActionResult<TokenExchangeResponse>> LoginFromSession([FromBody] NewSessionRequest request)
{
if (HttpContext.Items["CurrentUser"] is not SnAccount ||
HttpContext.Items["CurrentSession"] is not SnAuthSession currentSession)
return Unauthorized();
var newSession = await auth.CreateSessionFromParentAsync(
currentSession,
request.DeviceId,
request.DeviceName,
request.Platform,
request.ExpiredAt
);
var tk = auth.CreateToken(newSession);
// Set cookie using HttpContext, similar to CreateSessionAndIssueToken
HttpContext.Response.Cookies.Append(AuthConstants.CookieTokenName, tk, new CookieOptions
{
HttpOnly = true,
Secure = true,
SameSite = SameSiteMode.Lax,
Domain = _cookieDomain,
Expires = request.ExpiredAt?.ToDateTimeOffset() ?? DateTime.UtcNow.AddYears(20)
});
return Ok(new TokenExchangeResponse { Token = tk });
}
}

View File

@@ -2,6 +2,8 @@ using System.Security.Cryptography;
using System.Text.Json;
using System.Text.Json.Serialization;
using DysonNetwork.Shared.Cache;
using DysonNetwork.Shared.Data;
using DysonNetwork.Shared.Geometry;
using DysonNetwork.Shared.Models;
using Microsoft.EntityFrameworkCore;
using NodaTime;
@@ -13,7 +15,8 @@ public class AuthService(
IConfiguration config,
IHttpClientFactory httpClientFactory,
IHttpContextAccessor httpContextAccessor,
ICacheService cache
ICacheService cache,
GeoService geo
)
{
private HttpContext HttpContext => httpContextAccessor.HttpContext!;
@@ -30,7 +33,7 @@ public class AuthService(
{
// 1) Find out how many authentication factors the account has enabled.
var enabledFactors = await db.AccountAuthFactors
.Where(f => f.AccountId == account.Id)
.Where(f => f.AccountId == account.Id && f.Type != AccountAuthFactorType.PinCode)
.Where(f => f.EnabledAt != null)
.ToListAsync();
var maxSteps = enabledFactors.Count;
@@ -41,13 +44,18 @@ public class AuthService(
// 2) Get login context from recent sessions
var recentSessions = await db.AuthSessions
.Include(s => s.Challenge)
.Where(s => s.AccountId == account.Id)
.Where(s => s.LastGrantedAt != null)
.OrderByDescending(s => s.LastGrantedAt)
.Take(10)
.ToListAsync();
var recentChallengeIds =
recentSessions
.Where(s => s.ChallengeId != null)
.Select(s => s.ChallengeId!.Value).ToList();
var recentChallenges = await db.AuthChallenges.Where(c => recentChallengeIds.Contains(c.Id)).ToListAsync();
var ipAddress = request.HttpContext.Connection.RemoteIpAddress?.ToString();
var userAgent = request.Headers.UserAgent.ToString();
@@ -59,14 +67,14 @@ public class AuthService(
else
{
// Check if IP has been used before
var ipPreviouslyUsed = recentSessions.Any(s => s.Challenge?.IpAddress == ipAddress);
var ipPreviouslyUsed = recentChallenges.Any(c => c.IpAddress == ipAddress);
if (!ipPreviouslyUsed)
{
riskScore += 8;
}
// Check geographical distance for last known location
var lastKnownIp = recentSessions.FirstOrDefault(s => !string.IsNullOrWhiteSpace(s.Challenge?.IpAddress))?.Challenge?.IpAddress;
var lastKnownIp = recentChallenges.FirstOrDefault(c => !string.IsNullOrWhiteSpace(c.IpAddress))?.IpAddress;
if (!string.IsNullOrWhiteSpace(lastKnownIp) && lastKnownIp != ipAddress)
{
riskScore += 6;
@@ -80,9 +88,9 @@ public class AuthService(
}
else
{
var uaPreviouslyUsed = recentSessions.Any(s =>
!string.IsNullOrWhiteSpace(s.Challenge?.UserAgent) &&
string.Equals(s.Challenge.UserAgent, userAgent, StringComparison.OrdinalIgnoreCase));
var uaPreviouslyUsed = recentChallenges.Any(c =>
!string.IsNullOrWhiteSpace(c.UserAgent) &&
string.Equals(c.UserAgent, userAgent, StringComparison.OrdinalIgnoreCase));
if (!uaPreviouslyUsed)
{
@@ -156,7 +164,7 @@ public class AuthService(
// 8) Device Trust Assessment
var trustedDeviceIds = recentSessions
.Where(s => s.CreatedAt > now.Minus(Duration.FromDays(30))) // Trust devices from last 30 days
.Select(s => s.Challenge?.ClientId)
.Select(s => s.ClientId)
.Where(id => id.HasValue)
.Distinct()
.ToList();
@@ -180,29 +188,28 @@ public class AuthService(
return totalRequiredSteps;
}
public async Task<SnAuthSession> CreateSessionForOidcAsync(SnAccount account, Instant time,
Guid? customAppId = null)
public async Task<SnAuthSession> CreateSessionForOidcAsync(
SnAccount account,
Instant time,
Guid? customAppId = null,
SnAuthSession? parentSession = null
)
{
var challenge = new SnAuthChallenge
{
AccountId = account.Id,
IpAddress = HttpContext.Connection.RemoteIpAddress?.ToString(),
UserAgent = HttpContext.Request.Headers.UserAgent,
StepRemain = 1,
StepTotal = 1,
Type = customAppId is not null ? ChallengeType.OAuth : ChallengeType.Oidc
};
var ipAddr = HttpContext.Connection.RemoteIpAddress?.ToString();
var geoLocation = ipAddr is not null ? geo.GetPointFromIp(ipAddr) : null;
var session = new SnAuthSession
{
AccountId = account.Id,
CreatedAt = time,
LastGrantedAt = time,
Challenge = challenge,
AppId = customAppId
IpAddress = ipAddr,
UserAgent = HttpContext.Request.Headers.UserAgent,
Location = geoLocation,
AppId = customAppId,
ParentSessionId = parentSession?.Id,
Type = customAppId is not null ? SessionType.OAuth : SessionType.Oidc,
};
db.AuthChallenges.Add(challenge);
db.AuthSessions.Add(session);
await db.SaveChangesAsync();
@@ -216,7 +223,8 @@ public class AuthService(
ClientPlatform platform = ClientPlatform.Unidentified
)
{
var device = await db.AuthClients.FirstOrDefaultAsync(d => d.DeviceId == deviceId && d.AccountId == accountId);
var device = await db.AuthClients
.FirstOrDefaultAsync(d => d.DeviceId == deviceId && d.AccountId == accountId);
if (device is not null) return device;
device = new SnAuthClient
{
@@ -287,35 +295,71 @@ public class AuthService(
/// <summary>
/// Immediately revoke a session by setting expiry to now and clearing from cache
/// This provides immediate invalidation of tokens and sessions
/// This provides immediate invalidation of tokens and sessions, including all child sessions recursively.
/// </summary>
/// <param name="sessionId">Session ID to revoke</param>
/// <returns>True if session was found and revoked, false otherwise</returns>
public async Task<bool> RevokeSessionAsync(Guid sessionId)
{
var session = await db.AuthSessions.FirstOrDefaultAsync(s => s.Id == sessionId);
if (session == null)
var sessionsToRevokeIds = new HashSet<Guid>();
await CollectSessionsToRevoke(sessionId, sessionsToRevokeIds);
if (sessionsToRevokeIds.Count == 0)
{
return false;
}
// Set expiry to now (immediate invalidation)
var now = SystemClock.Instance.GetCurrentInstant();
session.ExpiredAt = now;
db.AuthSessions.Update(session);
var accountIdsToClearCache = new HashSet<Guid>();
// Clear from cache immediately
var cacheKey = $"{AuthCachePrefix}{session.Id}";
await cache.RemoveAsync(cacheKey);
// Fetch all sessions to be revoked in one go
var sessions = await db.AuthSessions
.Where(s => sessionsToRevokeIds.Contains(s.Id))
.ToListAsync();
// Clear account-level cache groups that include this session
await cache.RemoveAsync($"{AuthCachePrefix}{session.AccountId}");
foreach (var session in sessions)
{
session.ExpiredAt = now;
accountIdsToClearCache.Add(session.AccountId);
// Clear from cache immediately for each session
await cache.RemoveAsync($"{AuthCachePrefix}{session.Id}");
}
db.AuthSessions.UpdateRange(sessions);
await db.SaveChangesAsync();
// Clear account-level cache groups
foreach (var accountId in accountIdsToClearCache)
{
await cache.RemoveAsync($"{AuthCachePrefix}{accountId}");
}
return true;
}
/// <summary>
/// Recursively collects all session IDs that need to be revoked, starting from a given session.
/// </summary>
/// <param name="currentSessionId">The session ID to start collecting from.</param>
/// <param name="sessionsToRevoke">A HashSet to store the IDs of all sessions to be revoked.</param>
private async Task CollectSessionsToRevoke(Guid currentSessionId, HashSet<Guid> sessionsToRevoke)
{
if (!sessionsToRevoke.Add(currentSessionId))
return; // Already processed this session
// Find direct children
var childSessions = await db.AuthSessions
.Where(s => s.ParentSessionId == currentSessionId)
.Select(s => s.Id)
.ToListAsync();
foreach (var childId in childSessions)
{
await CollectSessionsToRevoke(childId, sessionsToRevoke);
}
}
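`CollectSessionsToRevoke` above walks the parent→child session tree, guarding against cycles by adding each id before recursing. The same traversal in Python, with the database lookup on `ParentSessionId` replaced by an in-memory dict (an assumption for the sketch):

```python
def collect_sessions_to_revoke(session_id, children_by_parent, acc=None):
    """Sketch of CollectSessionsToRevoke: gather a session and all of its
    descendants. The add-before-recurse check mirrors the C# HashSet.Add
    guard, so cycles in the parent links cannot loop forever."""
    if acc is None:
        acc = set()
    if session_id in acc:
        return acc  # already processed this session
    acc.add(session_id)
    for child in children_by_parent.get(session_id, []):
        collect_sessions_to_revoke(child, children_by_parent, acc)
    return acc
```

Revoking the root of a chain therefore invalidates every session derived from it, which is the point of the recursive change.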
/// <summary>
/// Revoke all sessions for an account (logout everywhere)
/// </summary>
@@ -374,10 +418,12 @@ public class AuthService(
if (challenge.StepRemain != 0)
throw new ArgumentException("Challenge not yet completed.");
var hasSession = await db.AuthSessions
.AnyAsync(e => e.ChallengeId == challenge.Id);
if (hasSession)
throw new ArgumentException("Session already exists for this challenge.");
var device = await GetOrCreateDeviceAsync(
challenge.AccountId,
challenge.DeviceId,
challenge.DeviceName,
challenge.Platform
);
var now = SystemClock.Instance.GetCurrentInstant();
var session = new SnAuthSession
@@ -385,7 +431,13 @@ public class AuthService(
LastGrantedAt = now,
ExpiredAt = now.Plus(Duration.FromDays(7)),
AccountId = challenge.AccountId,
ChallengeId = challenge.Id
IpAddress = challenge.IpAddress,
UserAgent = challenge.UserAgent,
Location = challenge.Location,
Scopes = challenge.Scopes,
Audiences = challenge.Audiences,
ChallengeId = challenge.Id,
ClientId = device.Id,
};
db.AuthSessions.Add(session);
@@ -408,7 +460,7 @@ public class AuthService(
return tk;
}
private string CreateCompactToken(Guid sessionId, RSA rsa)
private static string CreateCompactToken(Guid sessionId, RSA rsa)
{
// Create the payload: just the session ID
var payloadBytes = sessionId.ToByteArray();
@@ -499,7 +551,8 @@ public class AuthService(
return key;
}
public async Task<SnApiKey> CreateApiKey(Guid accountId, string label, Instant? expiredAt = null)
public async Task<SnApiKey> CreateApiKey(Guid accountId, string label, Instant? expiredAt = null,
SnAuthSession? parentSession = null)
{
var key = new SnApiKey
{
@@ -508,7 +561,8 @@ public class AuthService(
Session = new SnAuthSession
{
AccountId = accountId,
ExpiredAt = expiredAt
ExpiredAt = expiredAt,
ParentSessionId = parentSession?.Id
},
};
@@ -614,4 +668,47 @@ public class AuthService(
return Convert.FromBase64String(padded);
}
}
/// <summary>
/// Creates a new session derived from an existing parent session.
/// </summary>
/// <param name="parentSession">The existing session from which the new session is derived.</param>
/// <param name="deviceId">The ID of the device for the new session.</param>
/// <param name="deviceName">The name of the device for the new session.</param>
/// <param name="platform">The platform of the device for the new session.</param>
/// <param name="expiredAt">Optional: The expiration time for the new session.</param>
/// <returns>The newly created SnAuthSession.</returns>
public async Task<SnAuthSession> CreateSessionFromParentAsync(
SnAuthSession parentSession,
string deviceId,
string? deviceName,
ClientPlatform platform,
Instant? expiredAt = null
)
{
var device = await GetOrCreateDeviceAsync(parentSession.AccountId, deviceId, deviceName, platform);
var ipAddress = HttpContext.Connection.RemoteIpAddress?.ToString();
var userAgent = HttpContext.Request.Headers.UserAgent.ToString();
var geoLocation = ipAddress is not null ? geo.GetPointFromIp(ipAddress) : null;
var now = SystemClock.Instance.GetCurrentInstant();
var session = new SnAuthSession
{
IpAddress = ipAddress,
UserAgent = userAgent,
Location = geoLocation,
AccountId = parentSession.AccountId,
CreatedAt = now,
LastGrantedAt = now,
ExpiredAt = expiredAt,
ParentSessionId = parentSession.Id,
ClientId = device.Id,
};
db.AuthSessions.Add(session);
await db.SaveChangesAsync();
return session;
}
}

View File

@@ -306,7 +306,7 @@ public class OidcProviderController(
HttpContext.Items["CurrentSession"] is not SnAuthSession currentSession) return Unauthorized();
// Get requested scopes from the token
var scopes = currentSession.Challenge?.Scopes ?? [];
var scopes = currentSession.Scopes;
var userInfo = new Dictionary<string, object>
{

View File

@@ -5,7 +5,8 @@ namespace DysonNetwork.Pass.Auth.OidcProvider.Models;
public class AuthorizationCodeInfo
{
public Guid ClientId { get; set; }
public Guid AccountId { get; set; }
public Guid? AccountId { get; set; }
public ExternalUserInfo? ExternalUserInfo { get; set; }
public string RedirectUri { get; set; } = string.Empty;
public List<string> Scopes { get; set; } = new();
public string? CodeChallenge { get; set; }

View File

@@ -0,0 +1,9 @@
namespace DysonNetwork.Pass.Auth.OidcProvider.Models;
public class ExternalUserInfo
{
public string Provider { get; set; } = null!;
public string UserId { get; set; } = null!;
public string? Email { get; set; }
public string? Name { get; set; }
}

View File

@@ -5,7 +5,7 @@ namespace DysonNetwork.Pass.Auth.OidcProvider.Responses;
public class TokenResponse
{
[JsonPropertyName("access_token")]
public string AccessToken { get; set; } = null!;
public string? AccessToken { get; set; } = null!;
[JsonPropertyName("expires_in")]
public int ExpiresIn { get; set; }
@@ -22,4 +22,7 @@ public class TokenResponse
[JsonPropertyName("id_token")]
public string? IdToken { get; set; }
[JsonPropertyName("onboarding_token")]
public string? OnboardingToken { get; set; }
}

View File

@@ -72,7 +72,6 @@ public class OidcProviderService(
var now = SystemClock.Instance.GetCurrentInstant();
var queryable = db.AuthSessions
.Include(s => s.Challenge)
.AsQueryable();
if (withAccount)
queryable = queryable
@@ -85,8 +84,7 @@ public class OidcProviderService(
.Where(s => s.AccountId == accountId &&
s.AppId == clientId &&
(s.ExpiredAt == null || s.ExpiredAt > now) &&
s.Challenge != null &&
s.Challenge.Type == Shared.Models.ChallengeType.OAuth)
s.Type == Shared.Models.SessionType.OAuth)
.OrderByDescending(s => s.CreatedAt)
.FirstOrDefaultAsync();
}
@@ -257,18 +255,15 @@ public class OidcProviderService(
}
private async Task<(SnAuthSession session, string? nonce, List<string>? scopes)> HandleAuthorizationCodeFlowAsync(
string authorizationCode,
Guid clientId,
string? redirectUri,
string? codeVerifier
AuthorizationCodeInfo authCode,
Guid clientId
)
{
var authCode = await ValidateAuthorizationCodeAsync(authorizationCode, clientId, redirectUri, codeVerifier);
if (authCode == null)
throw new InvalidOperationException("Invalid authorization code");
if (authCode.AccountId == null)
throw new InvalidOperationException("Invalid authorization code, account id is missing.");
// Load the session for the user
var existingSession = await FindValidSessionAsync(authCode.AccountId, clientId, withAccount: true);
var existingSession = await FindValidSessionAsync(authCode.AccountId.Value, clientId, withAccount: true);
SnAuthSession session;
if (existingSession == null)
@@ -315,31 +310,124 @@ public class OidcProviderService(
var client = await FindClientByIdAsync(clientId) ?? throw new InvalidOperationException("Client not found");
var (session, nonce, scopes) = authorizationCode != null
? await HandleAuthorizationCodeFlowAsync(authorizationCode, clientId, redirectUri, codeVerifier)
: sessionId.HasValue
? await HandleRefreshTokenFlowAsync(sessionId.Value)
: throw new InvalidOperationException("Either authorization code or session ID must be provided");
if (authorizationCode != null)
{
var authCode = await ValidateAuthorizationCodeAsync(authorizationCode, clientId, redirectUri, codeVerifier);
if (authCode == null)
{
throw new InvalidOperationException("Invalid authorization code");
}
if (authCode.AccountId.HasValue)
{
var (session, nonce, scopes) = await HandleAuthorizationCodeFlowAsync(authCode, clientId);
var clock = SystemClock.Instance;
var now = clock.GetCurrentInstant();
var expiresIn = (int)_options.AccessTokenLifetime.TotalSeconds;
var expiresAt = now.Plus(Duration.FromSeconds(expiresIn));
// Generate tokens
var accessToken = GenerateJwtToken(client, session, expiresAt, scopes);
var idToken = GenerateIdToken(client, session, nonce, scopes);
var refreshToken = GenerateRefreshToken(session);
return new TokenResponse
{
AccessToken = accessToken,
IdToken = idToken,
ExpiresIn = expiresIn,
TokenType = "Bearer",
RefreshToken = refreshToken,
Scope = scopes != null ? string.Join(" ", scopes) : null
};
}
if (authCode.ExternalUserInfo != null)
{
var onboardingToken = GenerateOnboardingToken(client, authCode.ExternalUserInfo, authCode.Nonce, authCode.Scopes);
return new TokenResponse
{
OnboardingToken = onboardingToken,
TokenType = "Onboarding"
};
}
throw new InvalidOperationException("Invalid authorization code state.");
}
if (sessionId.HasValue)
{
var (session, nonce, scopes) = await HandleRefreshTokenFlowAsync(sessionId.Value);
var clock = SystemClock.Instance;
var now = clock.GetCurrentInstant();
var expiresIn = (int)_options.AccessTokenLifetime.TotalSeconds;
var expiresAt = now.Plus(Duration.FromSeconds(expiresIn));
var accessToken = GenerateJwtToken(client, session, expiresAt, scopes);
var idToken = GenerateIdToken(client, session, nonce, scopes);
var refreshToken = GenerateRefreshToken(session);
return new TokenResponse
{
AccessToken = accessToken,
IdToken = idToken,
ExpiresIn = expiresIn,
TokenType = "Bearer",
RefreshToken = refreshToken,
Scope = scopes != null ? string.Join(" ", scopes) : null
};
}
throw new InvalidOperationException("Either authorization code or session ID must be provided");
}
private string GenerateOnboardingToken(CustomApp client, ExternalUserInfo externalUserInfo, string? nonce,
List<string> scopes)
{
var tokenHandler = new JwtSecurityTokenHandler();
var clock = SystemClock.Instance;
var now = clock.GetCurrentInstant();
var expiresIn = (int)_options.AccessTokenLifetime.TotalSeconds;
var expiresAt = now.Plus(Duration.FromSeconds(expiresIn));
// Generate tokens
var accessToken = GenerateJwtToken(client, session, expiresAt, scopes);
var idToken = GenerateIdToken(client, session, nonce, scopes);
var refreshToken = GenerateRefreshToken(session);
return new TokenResponse
var claims = new List<Claim>
{
AccessToken = accessToken,
IdToken = idToken,
ExpiresIn = expiresIn,
TokenType = "Bearer",
RefreshToken = refreshToken,
Scope = scopes != null ? string.Join(" ", scopes) : null
new(JwtRegisteredClaimNames.Iss, _options.IssuerUri),
new(JwtRegisteredClaimNames.Aud, client.Slug),
new(JwtRegisteredClaimNames.Iat, now.ToUnixTimeSeconds().ToString(), ClaimValueTypes.Integer64),
new(JwtRegisteredClaimNames.Exp,
now.Plus(Duration.FromMinutes(15)).ToUnixTimeSeconds()
.ToString(), ClaimValueTypes.Integer64),
new("provider", externalUserInfo.Provider),
new("provider_user_id", externalUserInfo.UserId)
};
if (!string.IsNullOrEmpty(externalUserInfo.Email))
{
claims.Add(new Claim(JwtRegisteredClaimNames.Email, externalUserInfo.Email));
}
if (!string.IsNullOrEmpty(externalUserInfo.Name))
{
claims.Add(new Claim("name", externalUserInfo.Name));
}
if (!string.IsNullOrEmpty(nonce))
{
claims.Add(new Claim("nonce", nonce));
}
var tokenDescriptor = new SecurityTokenDescriptor
{
Subject = new ClaimsIdentity(claims),
Issuer = _options.IssuerUri,
Audience = client.Slug,
Expires = now.Plus(Duration.FromMinutes(15)).ToDateTimeUtc(),
NotBefore = now.ToDateTimeUtc(),
SigningCredentials = new SigningCredentials(
new RsaSecurityKey(_options.GetRsaPrivateKey()),
SecurityAlgorithms.RsaSha256
)
};
var token = tokenHandler.CreateToken(tokenDescriptor);
return tokenHandler.WriteToken(token);
}
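`GenerateOnboardingToken` above issues a short-lived (15-minute) JWT carrying the external provider identity. A stdlib-only Python sketch of the same claim set; it signs with HS256 instead of the service's RSA key purely so the example stays self-contained:

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_onboarding_token(secret: bytes, issuer: str, audience: str,
                          provider: str, provider_user_id: str,
                          email=None, ttl: int = 15 * 60) -> str:
    """Sketch of GenerateOnboardingToken: same claims (iss, aud, iat, exp,
    provider, provider_user_id, optional email), HS256 instead of RS256."""
    now = int(time.time())
    claims = {
        "iss": issuer, "aud": audience,
        "iat": now, "exp": now + ttl,  # 15-minute lifetime
        "provider": provider, "provider_user_id": provider_user_id,
    }
    if email:
        claims["email"] = email
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (b64url(json.dumps(header).encode()) + "." +
                     b64url(json.dumps(claims).encode()))
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)
```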
private string GenerateJwtToken(
@@ -421,7 +509,6 @@ public class OidcProviderService(
{
return await db.AuthSessions
.Include(s => s.Account)
.Include(s => s.Challenge)
.FirstOrDefaultAsync(s => s.Id == sessionId);
}
@@ -440,12 +527,6 @@ public class OidcProviderService(
string? nonce = null
)
{
// Generate a random code
var clock = SystemClock.Instance;
var code = GenerateRandomString(32);
var now = clock.GetCurrentInstant();
// Create the authorization code info
var authCodeInfo = new AuthorizationCodeInfo
{
ClientId = clientId,
@@ -455,17 +536,47 @@ public class OidcProviderService(
CodeChallenge = codeChallenge,
CodeChallengeMethod = codeChallengeMethod,
Nonce = nonce,
CreatedAt = now
CreatedAt = SystemClock.Instance.GetCurrentInstant()
};
// Store the code with its metadata in the cache
return await StoreAuthorizationCode(authCodeInfo);
}
public async Task<string> GenerateAuthorizationCodeAsync(
Guid clientId,
ExternalUserInfo externalUserInfo,
string redirectUri,
IEnumerable<string> scopes,
string? codeChallenge = null,
string? codeChallengeMethod = null,
string? nonce = null
)
{
var authCodeInfo = new AuthorizationCodeInfo
{
ClientId = clientId,
ExternalUserInfo = externalUserInfo,
RedirectUri = redirectUri,
Scopes = scopes.ToList(),
CodeChallenge = codeChallenge,
CodeChallengeMethod = codeChallengeMethod,
Nonce = nonce,
CreatedAt = SystemClock.Instance.GetCurrentInstant()
};
return await StoreAuthorizationCode(authCodeInfo);
}
private async Task<string> StoreAuthorizationCode(AuthorizationCodeInfo authCodeInfo)
{
var code = GenerateRandomString(32);
var cacheKey = $"{CacheKeyPrefixAuthCode}{code}";
await cache.SetAsync(cacheKey, authCodeInfo, _options.AuthorizationCodeLifetime);
logger.LogInformation("Generated authorization code for client {ClientId} and user {UserId}", clientId, userId);
logger.LogInformation("Generated authorization code for client {ClientId}", authCodeInfo.ClientId);
return code;
}
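`StoreAuthorizationCode` above mints a random code and caches the metadata under it; `ValidateAuthorizationCodeAsync` (below) later looks it up and, per PKCE, checks the stored S256 `code_challenge` against the client's `code_verifier`. A Python sketch with a plain dict standing in for the cache (an assumption; the service uses its distributed cache):

```python
import base64
import hashlib
import secrets
import time

def store_authorization_code(cache: dict, info: dict, lifetime: int = 300) -> str:
    """Sketch of StoreAuthorizationCode: random code keys the metadata,
    stored with an expiry timestamp."""
    code = secrets.token_urlsafe(32)
    cache["authcode:" + code] = (info, time.time() + lifetime)
    return code

def validate_authorization_code(cache: dict, code: str, client_id,
                                code_verifier=None):
    """Sketch of the validation side: single-use pop, expiry and client
    checks, then PKCE S256 verification when a challenge was stored
    (challenge == base64url(sha256(verifier)) without padding)."""
    entry = cache.pop("authcode:" + code, None)  # single use
    if entry is None:
        return None
    info, expires = entry
    if time.time() > expires or info["client_id"] != client_id:
        return None
    challenge = info.get("code_challenge")
    if challenge and info.get("code_challenge_method") == "S256":
        digest = hashlib.sha256((code_verifier or "").encode()).digest()
        derived = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
        if not secrets.compare_digest(derived, challenge):
            return None
    return info
```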
private async Task<AuthorizationCodeInfo?> ValidateAuthorizationCodeAsync(
string code,
Guid clientId,

View File

@@ -3,6 +3,7 @@ using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
using DysonNetwork.Shared.Cache;
using Microsoft.AspNetCore.WebUtilities;
using NodaTime;
using DysonNetwork.Shared.Models;
@@ -17,7 +18,8 @@ public class ConnectionController(
AccountService accounts,
AuthService auth,
ICacheService cache,
IConfiguration configuration
IConfiguration configuration,
ILogger<ConnectionController> logger
) : ControllerBase
{
private const string StateCachePrefix = "oidc-state:";
@@ -152,8 +154,13 @@ public class ConnectionController(
{
var stateValue = await cache.GetAsync<string>(stateKey);
if (string.IsNullOrEmpty(stateValue) || !OidcState.TryParse(stateValue, out oidcState) || oidcState == null)
{
logger.LogWarning("Invalid or expired OIDC state: {State}", callbackData.State);
return BadRequest("Invalid or expired state parameter");
}
}
logger.LogInformation("OIDC callback for provider {Provider} with state {State} and flow {FlowType}", provider, callbackData.State, oidcState.FlowType);
// Remove the state from cache to prevent replay attacks
await cache.RemoveAsync(stateKey);
@@ -166,19 +173,24 @@ public class ConnectionController(
{
callbackData.State = oidcState.DeviceId;
}
return await HandleManualConnection(provider, oidcService, callbackData, oidcState.AccountId.Value);
}
else if (oidcState.FlowType == OidcFlowType.Login)
if (oidcState.FlowType == OidcFlowType.Login)
{
// Login/Registration flow
if (!string.IsNullOrEmpty(oidcState.DeviceId))
{
callbackData.State = oidcState.DeviceId;
}
// Store return URL if provided
if (string.IsNullOrEmpty(oidcState.ReturnUrl) || oidcState.ReturnUrl == "/")
{
logger.LogInformation("No returnUrl provided in OIDC state, will use default.");
return await HandleLoginOrRegistration(provider, oidcService, callbackData);
}
logger.LogInformation("Storing returnUrl {ReturnUrl} for state {State}", oidcState.ReturnUrl, callbackData.State);
var returnUrlKey = $"{ReturnUrlCachePrefix}{callbackData.State}";
await cache.SetAsync(returnUrlKey, oidcState.ReturnUrl, StateExpiration);
@@ -204,6 +216,7 @@ public class ConnectionController(
}
catch (Exception ex)
{
logger.LogError(ex, "Error processing OIDC callback for provider {Provider} during connection flow", provider);
return BadRequest($"Error processing {provider} authentication: {ex.Message}");
}
@@ -268,8 +281,9 @@ public class ConnectionController(
{
await db.SaveChangesAsync();
}
catch (DbUpdateException)
catch (DbUpdateException ex)
{
logger.LogError(ex, "Failed to save OIDC connection for provider {Provider}", provider);
return StatusCode(500, $"Failed to save {provider} connection. Please try again.");
}
@@ -279,8 +293,10 @@ public class ConnectionController(
await cache.RemoveAsync(returnUrlKey);
var siteUrl = configuration["SiteUrl"];
return Redirect(string.IsNullOrEmpty(returnUrl) ? siteUrl + "/auth/callback" : returnUrl);
var redirectUrl = string.IsNullOrEmpty(returnUrl) ? siteUrl + "/auth/callback" : returnUrl;
logger.LogInformation("Redirecting after OIDC connection to {RedirectUrl}", redirectUrl);
return Redirect(redirectUrl);
}
private async Task<IActionResult> HandleLoginOrRegistration(
@@ -296,6 +312,7 @@ public class ConnectionController(
}
catch (Exception ex)
{
logger.LogError(ex, "Error processing OIDC callback for provider {Provider} during login/registration flow", provider);
return BadRequest($"Error processing callback: {ex.Message}");
}
@@ -303,12 +320,21 @@ public class ConnectionController(
{
return BadRequest($"Email or user ID is missing from {provider}'s response");
}
// Retrieve and clean up the return URL
var returnUrlKey = $"{ReturnUrlCachePrefix}{callbackData.State}";
var returnUrl = await cache.GetAsync<string>(returnUrlKey);
await cache.RemoveAsync(returnUrlKey);
var siteUrl = configuration["SiteUrl"];
var redirectBaseUrl = string.IsNullOrEmpty(returnUrl) ? siteUrl + "/auth/callback" : returnUrl;
var connection = await db.AccountConnections
.Include(c => c.Account)
.FirstOrDefaultAsync(c => c.Provider == provider && c.ProvidedIdentifier == userInfo.UserId);
var clock = SystemClock.Instance;
if (connection != null)
{
// Login existing user
@@ -316,12 +342,21 @@ public class ConnectionController(
callbackData.State.Split('|').FirstOrDefault() :
string.Empty;
var challenge = await oidcService.CreateChallengeForUserAsync(
if (HttpContext.Items["CurrentSession"] is not SnAuthSession parentSession) parentSession = null;
var session = await oidcService.CreateSessionForUserAsync(
userInfo,
connection.Account,
HttpContext,
deviceId ?? string.Empty);
return Redirect($"/auth/callback?challenge={challenge.Id}");
deviceId ?? string.Empty,
null,
ClientPlatform.Web,
parentSession);
var token = auth.CreateToken(session);
var redirectUrl = QueryHelpers.AddQueryString(redirectBaseUrl, "token", token);
logger.LogInformation("OIDC login successful for user {UserId}. Redirecting to {RedirectUrl}", connection.AccountId, redirectUrl);
return Redirect(redirectUrl);
}
// Register new user
@@ -345,9 +380,9 @@ public class ConnectionController(
var loginSession = await auth.CreateSessionForOidcAsync(account, clock.GetCurrentInstant());
var loginToken = auth.CreateToken(loginSession);
var siteUrl = configuration["SiteUrl"];
return Redirect(siteUrl + $"/auth/callback?token={loginToken}");
var finalRedirectUrl = QueryHelpers.AddQueryString(redirectBaseUrl, "token", loginToken);
logger.LogInformation("OIDC registration successful for new user {UserId}. Redirecting to {RedirectUrl}", account.Id, finalRedirectUrl);
return Redirect(finalRedirectUrl);
}
private static async Task<OidcCallbackData> ExtractCallbackData(HttpRequest request)


@@ -14,7 +14,9 @@ public class OidcController(
IServiceProvider serviceProvider,
AppDatabase db,
AccountService accounts,
ICacheService cache
AuthService auth,
ICacheService cache,
ILogger<OidcController> logger
)
: ControllerBase
{
@@ -25,15 +27,17 @@ public class OidcController(
public async Task<ActionResult> OidcLogin(
[FromRoute] string provider,
[FromQuery] string? returnUrl = "/",
[FromHeader(Name = "X-Device-Id")] string? deviceId = null
[FromQuery] string? deviceId = null,
[FromQuery] string? flow = null
)
{
logger.LogInformation("OIDC login request for provider {Provider} with returnUrl {ReturnUrl}, deviceId {DeviceId} and flow {Flow}", provider, returnUrl, deviceId, flow);
try
{
var oidcService = GetOidcService(provider);
// If the user is already authenticated, treat as an account connection request
if (HttpContext.Items["CurrentUser"] is SnAccount currentUser)
if (flow != "login" && HttpContext.Items["CurrentUser"] is SnAccount currentUser)
{
var state = Guid.NewGuid().ToString();
var nonce = Guid.NewGuid().ToString();
@@ -41,6 +45,7 @@ public class OidcController(
// Create and store connection state
var oidcState = OidcState.ForConnection(currentUser.Id, provider, nonce, deviceId);
await cache.SetAsync($"{StateCachePrefix}{state}", oidcState, StateExpiration);
logger.LogInformation("OIDC connection flow started for user {UserId} with state {State}", currentUser.Id, state);
// The state parameter sent to the provider is the GUID key for the cache.
var authUrl = await oidcService.GetAuthorizationUrlAsync(state, nonce);
@@ -54,12 +59,14 @@ public class OidcController(
// Create login state with return URL and device ID
var oidcState = OidcState.ForLogin(returnUrl ?? "/", deviceId);
await cache.SetAsync($"{StateCachePrefix}{state}", oidcState, StateExpiration);
logger.LogInformation("OIDC login flow started with state {State} and returnUrl {ReturnUrl}", state, oidcState.ReturnUrl);
var authUrl = await oidcService.GetAuthorizationUrlAsync(state, nonce);
return Redirect(authUrl);
}
}
catch (Exception ex)
{
logger.LogError(ex, "Error initiating OIDC flow for provider {Provider}", provider);
return BadRequest($"Error initiating OpenID Connect flow: {ex.Message}");
}
}
@@ -69,7 +76,7 @@ public class OidcController(
/// Handles Apple authentication directly from mobile apps
/// </summary>
[HttpPost("apple/mobile")]
public async Task<ActionResult<SnAuthChallenge>> AppleMobileLogin(
public async Task<ActionResult<AuthController.TokenExchangeResponse>> AppleMobileLogin(
[FromBody] AppleMobileSignInRequest request
)
{
@@ -92,16 +99,21 @@ public class OidcController(
// Find or create user account using existing logic
var account = await FindOrCreateAccount(userInfo, "apple");
if (HttpContext.Items["CurrentSession"] is not SnAuthSession parentSession) parentSession = null;
// Create session using the OIDC service
var challenge = await appleService.CreateChallengeForUserAsync(
var session = await appleService.CreateSessionForUserAsync(
userInfo,
account,
HttpContext,
request.DeviceId,
request.DeviceName
request.DeviceName,
ClientPlatform.Ios,
parentSession
);
return Ok(challenge);
var token = auth.CreateToken(session);
return Ok(new AuthController.TokenExchangeResponse { Token = token });
}
catch (SecurityTokenValidationException ex)
{


@@ -1,4 +1,3 @@
using System;
using System.IdentityModel.Tokens.Jwt;
using System.Security.Cryptography;
using System.Text;
@@ -250,15 +249,17 @@ public abstract class OidcService(
}
/// <summary>
/// Creates a challenge and session for an authenticated user
/// Creates a session for an authenticated user
/// Also creates or updates the account connection
/// </summary>
public async Task<SnAuthChallenge> CreateChallengeForUserAsync(
public async Task<SnAuthSession> CreateSessionForUserAsync(
OidcUserInfo userInfo,
SnAccount account,
HttpContext request,
string deviceId,
string? deviceName = null
string? deviceName = null,
ClientPlatform platform = ClientPlatform.Web,
SnAuthSession? parentSession = null
)
{
// Create or update the account connection
@@ -282,28 +283,24 @@ public abstract class OidcService(
await Db.AccountConnections.AddAsync(connection);
}
// Create a challenge that's already completed
// Create a session directly
var now = SystemClock.Instance.GetCurrentInstant();
var device = await auth.GetOrCreateDeviceAsync(account.Id, deviceId, deviceName, ClientPlatform.Ios);
var challenge = new SnAuthChallenge
{
ExpiredAt = now.Plus(Duration.FromHours(1)),
StepTotal = await auth.DetectChallengeRisk(request.Request, account),
Type = ChallengeType.Oidc,
Audiences = [ProviderName],
Scopes = ["*"],
AccountId = account.Id,
ClientId = device.Id,
IpAddress = request.Connection.RemoteIpAddress?.ToString() ?? null,
UserAgent = request.Request.Headers.UserAgent,
};
challenge.StepRemain--;
if (challenge.StepRemain < 0) challenge.StepRemain = 0;
var device = await auth.GetOrCreateDeviceAsync(account.Id, deviceId, deviceName, platform);
await Db.AuthChallenges.AddAsync(challenge);
var session = new SnAuthSession
{
AccountId = account.Id,
CreatedAt = now,
LastGrantedAt = now,
ParentSessionId = parentSession?.Id,
ClientId = device.Id,
ExpiredAt = now.Plus(Duration.FromDays(30))
};
await Db.AuthSessions.AddAsync(session);
await Db.SaveChangesAsync();
return challenge;
return session;
}
}


@@ -77,7 +77,7 @@ public class TokenAuthService(
"AuthenticateTokenAsync: success via cache (sessionId={SessionId}, accountId={AccountId}, scopes={ScopeCount}, expiresAt={ExpiresAt})",
sessionId,
session.AccountId,
session.Challenge?.Scopes.Count,
session.Scopes.Count,
session.ExpiredAt
);
return (true, session, null);
@@ -87,8 +87,7 @@ public class TokenAuthService(
session = await db.AuthSessions
.AsNoTracking()
.Include(e => e.Challenge)
.ThenInclude(e => e.Client)
.Include(e => e.Client)
.Include(e => e.Account)
.ThenInclude(e => e.Profile)
.FirstOrDefaultAsync(s => s.Id == sessionId);
@@ -110,11 +109,11 @@ public class TokenAuthService(
"AuthenticateTokenAsync: DB session loaded (sessionId={SessionId}, accountId={AccountId}, clientId={ClientId}, appId={AppId}, scopes={ScopeCount}, ip={Ip}, uaLen={UaLen})",
sessionId,
session.AccountId,
session.Challenge?.ClientId,
session.ClientId,
session.AppId,
session.Challenge?.Scopes.Count,
session.Challenge?.IpAddress,
(session.Challenge?.UserAgent ?? string.Empty).Length
session.Scopes.Count,
session.IpAddress,
(session.UserAgent ?? string.Empty).Length
);
logger.LogDebug("AuthenticateTokenAsync: enriching account with subscription (accountId={AccountId})", session.AccountId);
@@ -143,7 +142,7 @@ public class TokenAuthService(
"AuthenticateTokenAsync: success via DB (sessionId={SessionId}, accountId={AccountId}, clientId={ClientId})",
sessionId,
session.AccountId,
session.Challenge?.ClientId
session.ClientId
);
return (true, session, null);
}


@@ -1,12 +1,12 @@
# Stage 1: Base runtime image
FROM mcr.microsoft.com/dotnet/aspnet:9.0 AS base
FROM mcr.microsoft.com/dotnet/aspnet:10.0 AS base
USER $APP_UID
WORKDIR /app
EXPOSE 8080
EXPOSE 8081
# Stage 2: Build .NET application
FROM mcr.microsoft.com/dotnet/sdk:9.0 AS build
FROM mcr.microsoft.com/dotnet/sdk:10.0 AS build
WORKDIR /src
# Copy .csproj and restore as distinct layers


@@ -1,48 +1,32 @@
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net9.0</TargetFramework>
<TargetFramework>net10.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Grpc.AspNetCore.Server" Version="2.71.0" />
<PackageReference Include="Microsoft.AspNetCore.OpenApi" Version="9.0.10" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.10">
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
<PrivateAssets>all</PrivateAssets>
<PackageReference Include="Microsoft.AspNetCore.OpenApi" Version="10.0.0" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.11">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Nager.Holiday" Version="1.0.1" />
<PackageReference Include="Nerdbank.GitVersioning" Version="3.8.118">
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
<PrivateAssets>all</PrivateAssets>
</PackageReference>
<PackageReference Include="NodaTime" Version="3.2.2" />
<PackageReference Include="NodaTime.Serialization.JsonNet" Version="3.2.0" />
<PackageReference Include="NodaTime.Serialization.Protobuf" Version="2.0.2" />
<PackageReference Include="NodaTime.Serialization.SystemTextJson" Version="1.3.0" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="9.0.4" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.Design" Version="1.1.0" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.NodaTime" Version="9.0.4" />
<PackageReference Include="OpenGraph-Net" Version="4.0.1" />
<PackageReference Include="OpenTelemetry.Exporter.OpenTelemetryProtocol" Version="1.13.1" />
<PackageReference Include="OpenTelemetry.Extensions.Hosting" Version="1.13.1" />
<PackageReference Include="OpenTelemetry.Instrumentation.AspNetCore" Version="1.13.0" />
<PackageReference Include="OpenTelemetry.Instrumentation.Http" Version="1.13.0" />
<PackageReference Include="OpenTelemetry.Instrumentation.Runtime" Version="1.13.0" />
<PackageReference Include="Otp.NET" Version="1.4.0" />
<PackageReference Include="Quartz" Version="3.15.1" />
<PackageReference Include="Quartz.AspNetCore" Version="3.15.1" />
<PackageReference Include="Quartz.Extensions.Hosting" Version="3.15.1" />
<PackageReference Include="BCrypt.Net-Next" Version="4.0.3" />
<PackageReference Include="EFCore.BulkExtensions" Version="9.0.2" />
<PackageReference Include="EFCore.BulkExtensions.PostgreSql" Version="9.0.2" />
<PackageReference Include="EFCore.NamingConventions" Version="9.0.0" />
<PackageReference Include="SpotifyAPI.Web" Version="7.2.1" />
<PackageReference Include="SteamWebAPI2" Version="4.4.1" />
<PackageReference Include="Swashbuckle.AspNetCore" Version="9.0.6" />
<PackageReference Include="Swashbuckle.AspNetCore.SwaggerUI" Version="9.0.6" />
<PackageReference Include="SteamWebAPI2" Version="5.0.0" />
</ItemGroup>
<ItemGroup>
@@ -131,4 +115,21 @@
<LastGenOutput>SharedResource.Designer.cs</LastGenOutput>
</EmbeddedResource>
</ItemGroup>
<ItemGroup>
<_ContentIncludedByDefault Remove="Emails\AccountDeletionEmail.razor" />
<_ContentIncludedByDefault Remove="Emails\ContactVerificationEmail.razor" />
<_ContentIncludedByDefault Remove="Emails\EmailLayout.razor" />
<_ContentIncludedByDefault Remove="Emails\FactorCodeEmail.razor" />
<_ContentIncludedByDefault Remove="Emails\PasswordResetEmail.razor" />
<_ContentIncludedByDefault Remove="Emails\RegistrationConfirmEmail.razor" />
</ItemGroup>
<ItemGroup>
<AdditionalFiles Include="Resources\Emails\AccountDeletionEmail.razor" />
<AdditionalFiles Include="Resources\Emails\ContactVerificationEmail.razor" />
<AdditionalFiles Include="Resources\Emails\EmailLayout.razor" />
<AdditionalFiles Include="Resources\Emails\PasswordResetEmail.razor" />
<AdditionalFiles Include="Resources\Emails\RegistrationConfirmEmail.razor" />
</ItemGroup>
</Project>


@@ -1,42 +0,0 @@
@using DysonNetwork.Pass.Localization
@using Microsoft.Extensions.Localization
<EmailLayout>
<tr>
<td class="wrapper">
<p class="font-bold">@(Localizer["AccountDeletionHeader"])</p>
<p>@(Localizer["AccountDeletionPara1"]) @@@Name,</p>
<p>@(Localizer["AccountDeletionPara2"])</p>
<p>@(Localizer["AccountDeletionPara3"])</p>
<table role="presentation" border="0" cellpadding="0" cellspacing="0" class="btn btn-primary">
<tbody>
<tr>
<td align="left">
<table role="presentation" border="0" cellpadding="0" cellspacing="0">
<tbody>
<tr>
<td>
<a href="@Link" target="_blank">
@(Localizer["AccountDeletionButton"])
</a>
</td>
</tr>
</tbody>
</table>
</td>
</tr>
</tbody>
</table>
<p>@(Localizer["AccountDeletionPara4"])</p>
</td>
</tr>
</EmailLayout>
@code {
[Parameter] public required string Name { get; set; }
[Parameter] public required string Link { get; set; }
[Inject] IStringLocalizer<EmailResource> Localizer { get; set; } = null!;
}


@@ -1,43 +0,0 @@
@using DysonNetwork.Pass.Localization
@using Microsoft.Extensions.Localization
@using EmailResource = DysonNetwork.Pass.Localization.EmailResource
<EmailLayout>
<tr>
<td class="wrapper">
<p class="font-bold">@(Localizer["ContactVerificationHeader"])</p>
<p>@(Localizer["ContactVerificationPara1"]) @Name,</p>
<p>@(Localizer["ContactVerificationPara2"])</p>
<table role="presentation" border="0" cellpadding="0" cellspacing="0" class="btn btn-primary">
<tbody>
<tr>
<td align="left">
<table role="presentation" border="0" cellpadding="0" cellspacing="0">
<tbody>
<tr>
<td>
<a href="@Link" target="_blank">
@(Localizer["ContactVerificationButton"])
</a>
</td>
</tr>
</tbody>
</table>
</td>
</tr>
</tbody>
</table>
<p>@(Localizer["ContactVerificationPara3"])</p>
<p>@(Localizer["ContactVerificationPara4"])</p>
</td>
</tr>
</EmailLayout>
@code {
[Parameter] public required string Name { get; set; }
[Parameter] public required string Link { get; set; }
[Inject] IStringLocalizer<EmailResource> Localizer { get; set; } = null!;
}


@@ -1,337 +0,0 @@
@inherits LayoutComponentBase
<!doctype html>
<html lang="en">
<head>
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<style media="all" type="text/css">
body {
font-family: Helvetica, sans-serif;
-webkit-font-smoothing: antialiased;
font-size: 16px;
line-height: 1.3;
-ms-text-size-adjust: 100%;
-webkit-text-size-adjust: 100%;
}
table {
border-collapse: separate;
mso-table-lspace: 0pt;
mso-table-rspace: 0pt;
width: 100%;
}
table td {
font-family: Helvetica, sans-serif;
font-size: 16px;
vertical-align: top;
}
body {
background-color: #f4f5f6;
margin: 0;
padding: 0;
}
.body {
background-color: #f4f5f6;
width: 100%;
}
.container {
margin: 0 auto !important;
max-width: 600px;
padding: 0;
padding-top: 24px;
width: 600px;
}
.content {
box-sizing: border-box;
display: block;
margin: 0 auto;
max-width: 600px;
padding: 0;
}
.main {
background: #ffffff;
border: 1px solid #eaebed;
border-radius: 16px;
width: 100%;
}
.wrapper {
box-sizing: border-box;
padding: 24px;
}
.footer {
clear: both;
padding-top: 24px;
text-align: center;
width: 100%;
}
.footer td,
.footer p,
.footer span,
.footer a {
color: #9a9ea6;
font-size: 16px;
text-align: center;
}
p {
font-family: Helvetica, sans-serif;
font-size: 16px;
font-weight: normal;
margin: 0;
margin-bottom: 16px;
}
a {
color: #0867ec;
text-decoration: underline;
}
.btn {
box-sizing: border-box;
min-width: 100% !important;
width: 100%;
}
.btn > tbody > tr > td {
padding-bottom: 16px;
}
.btn table {
width: auto;
}
.btn table td {
background-color: #ffffff;
border-radius: 4px;
text-align: center;
}
.btn a {
background-color: #ffffff;
border: solid 2px #0867ec;
border-radius: 4px;
box-sizing: border-box;
color: #0867ec;
cursor: pointer;
display: inline-block;
font-size: 16px;
font-weight: bold;
margin: 0;
padding: 12px 24px;
text-decoration: none;
text-transform: capitalize;
}
.btn-primary table td {
background-color: #0867ec;
}
.btn-primary a {
background-color: #0867ec;
border-color: #0867ec;
color: #ffffff;
}
.font-bold {
font-weight: bold;
}
.verification-code
{
font-family: "Courier New", Courier, monospace;
font-size: 24px;
letter-spacing: 0.5em;
}
@@media all {
.btn-primary table td:hover {
background-color: #ec0867 !important;
}
.btn-primary a:hover {
background-color: #ec0867 !important;
border-color: #ec0867 !important;
}
}
.last {
margin-bottom: 0;
}
.first {
margin-top: 0;
}
.align-center {
text-align: center;
}
.align-right {
text-align: right;
}
.align-left {
text-align: left;
}
.text-link {
color: #0867ec !important;
text-decoration: underline !important;
}
.clear {
clear: both;
}
.mt0 {
margin-top: 0;
}
.mb0 {
margin-bottom: 0;
}
.preheader {
color: transparent;
display: none;
height: 0;
max-height: 0;
max-width: 0;
opacity: 0;
overflow: hidden;
mso-hide: all;
visibility: hidden;
width: 0;
}
.powered-by a {
text-decoration: none;
}
@@media only screen and (max-width: 640px) {
.main p,
.main td,
.main span {
font-size: 16px !important;
}
.wrapper {
padding: 8px !important;
}
.content {
padding: 0 !important;
}
.container {
padding: 0 !important;
padding-top: 8px !important;
width: 100% !important;
}
.main {
border-left-width: 0 !important;
border-radius: 0 !important;
border-right-width: 0 !important;
}
.btn table {
max-width: 100% !important;
width: 100% !important;
}
.btn a {
font-size: 16px !important;
max-width: 100% !important;
width: 100% !important;
}
}
@@media all {
.ExternalClass {
width: 100%;
}
.ExternalClass,
.ExternalClass p,
.ExternalClass span,
.ExternalClass font,
.ExternalClass td,
.ExternalClass div {
line-height: 100%;
}
.apple-link a {
color: inherit !important;
font-family: inherit !important;
font-size: inherit !important;
font-weight: inherit !important;
line-height: inherit !important;
text-decoration: none !important;
}
#MessageViewBody a {
color: inherit;
text-decoration: none;
font-size: inherit;
font-family: inherit;
font-weight: inherit;
line-height: inherit;
}
}
</style>
</head>
<body>
<table role="presentation" border="0" cellpadding="0" cellspacing="0" class="body">
<tr>
<td>&nbsp;</td>
<td class="container">
<div class="content">
<!-- START CENTERED WHITE CONTAINER -->
<table role="presentation" border="0" cellpadding="0" cellspacing="0" class="main">
<!-- START MAIN CONTENT AREA -->
@ChildContent
<!-- END MAIN CONTENT AREA -->
</table>
<!-- START FOOTER -->
<div class="footer">
<table role="presentation" border="0" cellpadding="0" cellspacing="0">
<tr>
<td class="content-block">
<span class="apple-link">Solar Network</span>
<br> Solsynth LLC © @(DateTime.Now.Year)
</td>
</tr>
<tr>
<td class="content-block powered-by">
Powered by <a href="https://github.com/solsynth/dysonnetwork">Dyson Network</a>
</td>
</tr>
</table>
</div>
<!-- END FOOTER -->
<!-- END CENTERED WHITE CONTAINER --></div>
</td>
<td>&nbsp;</td>
</tr>
</table>
</body>
</html>
@code {
[Parameter] public RenderFragment? ChildContent { get; set; }
}

Some files were not shown because too many files have changed in this diff.