Compare commits

235 Commits
master ... test

Author SHA1 Message Date
fc2aa31d77 Merge branch 'master' into test 2025-03-13 15:11:54 +01:00
29166b0b39 refactor(navigation): fix merge 2025-02-20 15:20:13 +01:00
3949a60828 refactor(migration): fix merge 2025-02-20 15:04:57 +01:00
7bdd9b648a refactor(settings): fix merge 2025-02-20 15:03:36 +01:00
51164abc54 refactor(settings): add USERAUTH_MODE env var option to override user-auth.protocol 2025-02-20 15:03:21 +01:00
d6607b4432 refactor(model): fix User merge 2025-02-20 15:02:35 +01:00
988dddc0a2 Merge branch '145-build-system-rewrite' into test 2025-02-20 13:34:46 +01:00
4eb28c3c5b Merge branch 'master' into test 2025-02-20 11:01:15 +01:00
cac16b4e1c chore(release): v27.4.59-test-h0.0.18 2025-02-19 13:52:25 +01:00
dfa7746ff6 build: update stack.yaml.lock 2025-02-19 13:36:43 +01:00
be250afce5 chore(release): v27.4.59-test-h0.0.17 2025-02-19 12:31:38 +01:00
81278d92a1 Merge branch '145-build-system-rewrite' into test 2025-02-19 12:29:54 +01:00
Sarah Vaupel 6693bbe166 chore(release): 28.1.1 2024-04-22 10:44:24 +02:00
Sarah Vaupel 1696135096 Merge branch '128-remove-nodejs' into test 2024-04-22 10:42:50 +02:00
c764182a6d bugfix(gitlab-ci): refined pattern matching for not matching manifest.json 2024-04-21 05:34:52 +02:00
Sarah Vaupel a112ef2eca chore(release): 28.1.0 2024-04-19 00:34:33 +02:00
Sarah Vaupel 7e28517b82 Merge branch 'test' into 55-oauth2-single-sign-on 2024-04-19 00:34:07 +02:00
Sarah Vaupel 3080ab995a chore(release): 28.0.10 2024-04-19 00:32:24 +02:00
Sarah Vaupel 7a510b315d fix(auth): use appsettings for azure tenant id; refactor azure lookup url methods 2024-04-18 22:27:51 +02:00
Sarah Vaupel dc701e5c49 chore: fix tests 2024-04-18 02:07:04 +02:00
233e9ca92f chore(gitlab-ci): Add debug print to container sanitation. 2024-04-18 01:29:05 +02:00
Sarah Vaupel e1a25cdd31 feat(middleware): allow Cross Origin Resource Sharing (CORS) 2024-04-17 02:52:11 +02:00
Sarah Vaupel 5be23c0d52 refactor: move makeMiddleware and dependencies to separate module; refactor Application imports 2024-04-17 01:30:22 +02:00
Sarah Vaupel de8cf11d4d chore(release): 28.0.9 2024-04-08 19:06:07 +02:00
Sarah Vaupel 666a50e163 chore(release): 28.0.8 2024-04-07 01:56:07 +02:00
0bd256cb09 Merge branch '128-remove-nodejs' into 'test'
chore(gitlab-ci): query nodejs roots in nix store if nix store delete fails,...

See merge request fradrive/fradrive!30
2024-04-06 23:45:52 +00:00
0d46802862 chore(gitlab-ci): query nodejs roots in nix store if nix store delete fails,... 2024-04-06 23:45:51 +00:00
b190e25c88 chore(release): 28.0.0 2024-03-21 17:48:37 +01:00
8290f9dd23 chore(changelog): add entry for new OAuth2 support 2024-03-21 17:48:24 +01:00
22e57dc075 chore(oauth2): fix DEVELOPMENT imports 2024-03-21 17:40:29 +01:00
aa6406f949 Merge branch 'oauth2' into 'test'
Implement OAuth2 (AzureADv2) support

See merge request fradrive/fradrive!29
2024-03-21 16:00:37 +00:00
663ad01740 chore(oauth2): remove unused imports and defs 2024-03-21 14:14:33 +01:00
619c5975aa chore(oauth2): remove unused import 2024-03-21 14:06:15 +01:00
795c707a1f chore(oauth2): remove unused loadPlugin function 2024-03-21 09:16:43 +01:00
0599ec2512 chore(oauth2): fix type 2024-03-21 00:27:43 +01:00
b1cb45ac7e chore(oauth2): fix !develop syntax contd 2024-03-20 18:24:39 +01:00
274c86a820 chore(oauth2): fix conf constructors in !develop 2024-03-20 15:56:30 +01:00
3119dff6fe Merge branch 'test' into oauth2 2024-03-19 22:51:37 +01:00
f7f3532b30 Merge branch 'master' into test 2024-03-19 22:47:59 +01:00
1dd83af6aa chore(oauth2): fix syntax 2024-03-19 22:45:04 +01:00
cea64da34d chore(oauth2): downgrade yesod-auth-oauth2 to v0.6.3.4 2024-03-19 16:27:57 +01:00
cba9cadb41 chore: update backend dependency sources 2024-03-19 15:11:19 +01:00
9428bc05cc chore: revert to previous flake inputs 2024-03-18 15:28:02 +01:00
94d45c1f17 chore: update stack.yaml.lock 2024-03-18 12:54:30 +01:00
8be3e2ea78 chore: use previous oauth2 lib 2024-03-18 12:54:05 +01:00
7e33d9e5de chore: update stack(-flake).yaml (fix fork urls, add inputs, revert to previous oauth2 lib) 2024-03-18 12:53:37 +01:00
923166b592 chore: update package.yaml 2024-03-18 12:51:19 +01:00
dbfd3657a0 chore(flake): remove redundant inputs 2024-03-18 12:50:59 +01:00
Sarah Vaupel 864175284d Merge branch 'master' into test 2024-03-15 10:44:43 +01:00
a4eda81436 chore: work on flakey oauth2 yesod plugin input for v0.7.2 specifically 2024-03-15 10:40:07 +01:00
4db44733ca chore: fix haskell inputs 2024-03-15 10:12:33 +01:00
1fc43a8727 chore: update flake 2024-03-14 22:14:53 +01:00
6cd1d829b6 chore(nix): fix backend build target 2024-03-14 21:59:15 +01:00
85dc1fa0b5 chore: depromote debug logErrorS calls 2024-03-14 19:26:16 +01:00
2aa64f7360 feat(sso): redirect to login when auto-sign-on is enabled and user is not authenticated 2024-03-14 19:20:37 +01:00
f3da2ac630 chore(sso): add bare auto-sign-out setting 2024-03-14 14:07:17 +01:00
d44b903b3e chore: fix tests 2024-03-14 13:07:22 +01:00
c4501f1d08 chore: hlint 2024-03-14 13:06:58 +01:00
560d1adf5f chore(sso): disable sso by default (i.e. for develop) 2024-03-14 12:47:04 +01:00
acd6a3c11c chore: hlint 2024-03-14 12:42:10 +01:00
2787bde8da Merge branch '142-userdata-oauth-mode' into 'oauth2'
Resolve "Benutzerdaten-Abfrage: Mehrere Modi & OAuth-Modus"

See merge request fradrive/fradrive!26
2024-03-13 16:24:04 +00:00
6b82c26268 chore(migration): fix oauth2 migration contd 2024-03-13 12:24:25 +01:00
770c2f3182 chore(migration): fix oauth2 migration 2024-03-13 10:20:10 +01:00
843e6dbba2 chore(migration): add oauth2 migration 2024-03-12 18:09:18 +01:00
3607a9da6d Merge branch 'oauth2' into 142-userdata-oauth-mode 2024-03-12 15:08:20 +01:00
608bea5199 Merge branch '139-single-sign-on-sso-routing-anpassen' into 'oauth2'
Resolve "Single sign-on (SSO): Routing anpassen"

See merge request fradrive/fradrive!28
2024-03-11 14:49:41 +00:00
07dd91665c chore: fix auth plugin refs 2024-03-11 15:20:24 +01:00
5662a2d1f1 chore: fix merge oopsie contd 2024-03-11 15:09:33 +01:00
72938e41ba chore: fix merge oopsie 2024-03-11 15:07:50 +01:00
Sarah Vaupel cf6ae898c4 Merge branch '139-single-sign-on-sso-routing-anpassen' into 142-userdata-oauth-mode 2024-03-11 14:50:07 +01:00
05acba8cbe chore(foundation): ditch redirectToReferrer in favour of SSOut 2024-03-11 14:30:44 +01:00
9856272734 chore(login): do not login via modal 2024-03-11 14:23:35 +01:00
504490f593 chore(admin): switch to generic Aeson Value for oauth response parsing 2024-03-11 11:09:59 +01:00
David Mosbach 4c109538ee chore(auth): new 'Account' section 2024-03-10 22:15:20 +00:00
David Mosbach 1e5c4df163 chore(auth): fix single sign out redirect route 2024-03-10 19:43:54 +00:00
e1ebd528b8 chore(auth): use available sources in AuthIsExternal access pred 2024-03-08 21:16:16 +01:00
708320e067 chore(auth): change user identification to UserIdent for ExternalUser entries 2024-03-08 20:04:19 +01:00
51298ba726 chore: make fetch and upsert results Maybe 2024-03-08 19:05:58 +01:00
96e3eb613d chore(admin): merge external-user handlers (ldap, oauth2) 2024-03-08 12:10:26 +01:00
a2903da109 refactor(auth): UserConversionException -> DecodeUserException 2024-03-08 10:40:49 +01:00
c9fa627651 chore(admin): generalize admin ldap handler for all source types (TODO: rename) 2024-03-08 09:56:54 +01:00
969cc4df63 chore(jobs): use userLookupAndUpsert for synchronise user job 2024-03-08 09:56:27 +01:00
2480efc345 chore: userLookupAndUpsert contd 2024-03-08 09:55:51 +01:00
8c4ec00c35 chore(ldap): ldapSearch for arbitrary number of results 2024-03-08 09:54:30 +01:00
78a8442d07 chore(auth): userLookupAndUpsert 2024-03-07 23:24:41 +01:00
95803db3a0 chore(auth): fix fetchUserData 2024-03-07 15:32:07 +01:00
d71ff014ea chore(ldap): derive more json instances 2024-03-07 15:30:48 +01:00
aca5a79de2 chore(auth): implement fetchUserData, generalized version of azureUser and ldapUser 2024-03-07 05:38:39 +01:00
4feb05a02e chore(foundation): tweak UpsertUserData fields 2024-03-07 05:37:27 +01:00
77a9100b2e chore(auth): refactor; add util function 2024-03-07 05:36:03 +01:00
David Mosbach b947037ea2 feat(auth): implemented single sign out 2024-03-07 03:31:17 +00:00
David Mosbach d88acf4634 chore(auth): updated mock server 2024-03-06 04:26:47 +00:00
David Mosbach fbe0e37d28 feat(auth): oidc based sso for auth protected routes 2024-03-05 23:57:10 +00:00
bb03d28b7d chore(auth): actually use user-auth config for determining auth plugins to load 2024-03-03 06:16:53 +01:00
2196e89208 chore(settings): define more sane default values in settings.yml 2024-03-03 04:36:18 +01:00
4ff51c8f6f chore: add TODOs and debug logs 2024-03-03 04:35:39 +01:00
434eed2217 chore(auth): do not authenticate against external sources on dummy login 2024-03-01 20:42:51 +01:00
f88e527fe4 chore(model): remigrate ExternalAuth -> ExternalUser for more general data lookup; redefine lastSync timestamp semantics contd 2024-03-01 12:03:38 +01:00
40fe8ecfc6 chore(model): remigrate ExternalAuth -> ExternalUser for more general data lookup; redefine lastSync timestamp semantics 2024-03-01 10:47:52 +01:00
13502d704e refactor(auth): add missing TODOs, remove debris 2024-02-29 22:16:11 +01:00
d1e1f25162 chore(login): use correct auth plugin identifiers for comparison in login template 2024-02-29 17:52:31 +01:00
ac5bca2fcd chore(ldap): use separate source-id for ldap instance identification 2024-02-28 15:50:47 +01:00
064645d1b3 refactor(ldap): move orphan instance 2024-02-28 12:00:06 +01:00
956c85a9f3 chore(migration): remove old ldap-primary-key index 2024-02-28 11:05:01 +01:00
David Mosbach bee135ab48 chore(auth): connect azure user lookup 2024-02-22 18:56:03 +00:00
42ecc91c22 chore(test): update test database 2024-02-21 07:19:37 +01:00
a37d4b369a chore(application): rename conf constructors 2024-02-21 07:14:18 +01:00
039b1234c5 chore(sap): generalize ldap-cutoff over configured ldap sources 2024-02-21 07:13:51 +01:00
87b3214c84 chore(lms): fix password in fake user 2024-02-21 07:13:00 +01:00
ad937cda8c chore(users): remove ldap-specific columns in admin users page 2024-02-21 07:12:29 +01:00
899071e4d6 chore(users): remove eppn support 2024-02-21 07:11:59 +01:00
55bf8c0355 chore: add forgotten audPassword 2024-02-21 07:11:22 +01:00
b4a8ccf9cc chore(admin): tweak ldap view 2024-02-21 07:10:19 +01:00
76d3c57658 chore(messages): add and tweak auth messages 2024-02-21 07:09:18 +01:00
2490f8e69f chore(users): add password to user data for addNewUser 2024-02-21 07:08:56 +01:00
6cd0152636 refactor(jobs): use new user sync job name 2024-02-21 07:07:54 +01:00
19433fdc56 chore(profile): better auth info on profile page 2024-02-21 07:05:57 +01:00
012c75db21 chore(pwhash): reintroduce digest computation 2024-02-21 02:32:15 +01:00
71e2d6827e chore(model): rename userLastLogin->userLastAuthentication for less migration woes 2024-02-21 02:06:00 +01:00
41b14f1ece chore(model): replace auth-source model tables with AuthSourceIdent jsonified unique ids 2024-02-21 02:02:58 +01:00
a2e01e74af chore(notifications): reimplement authmode-update notification to support new login modes 2024-02-20 01:33:34 +01:00
8a353c357f chore(users): tweak assimilateUsers for new config 2024-02-20 00:38:46 +01:00
9bf7033eac chore(guess-user): remove eppn lookup 2024-02-20 00:13:55 +01:00
0a01490aa7 chore(auth): use ldap external auth in health reports 2024-02-20 00:09:31 +01:00
115452035d refactor(jobs): SynchroniseUserdb -> SynchroniseUsers 2024-02-20 00:05:56 +01:00
b8e7ee2b3d chore(users): remove old auth kind digesting 2024-02-19 23:49:17 +01:00
3d1908d71a chore(users): tweak addNewUser to conform to new model 2024-02-19 23:48:33 +01:00
ed54b666ec chore: add todos 2024-02-19 23:46:45 +01:00
a1d8dc2e7e chore(auth): migrate password hash back to User model 2024-02-19 02:24:31 +01:00
David Mosbach 956464659e feat(auth): link to sso test from dev login widget 2024-02-19 00:52:15 +00:00
9a5c487b2c chore(auth): switch back to AuthId UniWorX == UserId 2024-02-19 01:44:58 +01:00
bcfcbd5c9b chore(auth): fix redundant imports 2024-02-18 18:43:44 +01:00
96038a4f22 chore(auth): fix azure exception handler 2024-02-18 18:42:22 +01:00
5c4042e5f3 chore(oauth2): fix query function exports 2024-02-18 18:41:29 +01:00
c9f1bc4047 Merge branch 'oauth2' into 142-userdata-oauth-mode 2024-02-18 18:29:24 +01:00
bf13473954 chore(auth): rewrote authenticate (still WIP) 2024-02-18 05:06:23 +01:00
a0e7b2f96c chore(auth): work on authenticate 2024-02-16 03:25:36 +01:00
848890d3cd chore(auth): add more data to user upsert mode 2024-02-16 02:28:15 +01:00
f8bf02df2b chore(ldap): move and add more instances 2024-02-16 02:26:24 +01:00
1489c27121 Merge branch '140-admin-handler-fur-oauth-response-inspection' into 'oauth2'
Resolve "Admin-Handler für OAuth Response Inspection"

See merge request fradrive/fradrive!24
2024-02-15 16:22:12 +00:00
0c5f4cb430 refactor(settings): use better settings type names for user-auth 2024-02-14 02:02:42 +01:00
9597663881 chore(ldap): add more Ldap instances 2024-02-13 22:44:47 +01:00
7ed5e7a326 chore(model): use more specific (new)types for ldap model 2024-02-13 22:44:30 +01:00
1180ef6fd0 chore(ldap): add Ldap.Scope instances 2024-02-13 19:01:49 +01:00
2c3292cadf chore(model): add authentication source models 2024-02-13 18:22:00 +01:00
7803b753cb refactor(model): migrate auth models and model types to models/auth.model 2024-02-13 17:38:22 +01:00
David Mosbach bbeebc641e chore(auth): new port offset calculation 2024-02-12 15:06:30 +00:00
42c97924ec chore: remove debris 2024-02-11 17:41:22 +01:00
29fc201294 chore(auth): authenticate against new InternalAuthHash in internal login AuthPlugin 2024-02-11 17:40:46 +01:00
938423b832 chore(auth): AuthTagLDAP -> AuthTagExternal, AuthTagPWHash -> AuthTagInternal 2024-02-11 17:39:42 +01:00
54f2430b3e chore(model)!: separate user authentication data from User table; add ExternalAuth and InternalAuth models 2024-02-11 17:36:57 +01:00
2e47df00b9 refactor(model): rename module Model.Types.Security -> Model.Types.Auth 2024-02-11 01:44:18 +01:00
223ae0f2f8 refactor(messages): rename campus error messages 2024-02-10 16:34:37 +01:00
cc8bd19f85 refactor(ldap): CampusUserError -> LdapUserError 2024-02-10 00:27:36 +01:00
David Mosbach 3f5a22c85d chore(auth): update oauth2 mock server 2024-02-09 17:38:35 +00:00
12fe58fc81 chore(model)!: move user authentication data to new ExternalUser model 2024-02-09 18:17:43 +01:00
David Mosbach fafa25a7b5 chore(auth): auto start oauth2 mock server in develop 2024-02-03 21:10:24 +00:00
David Mosbach d4cfce317d feat(auth): formatted output of user queries 2024-02-03 20:48:32 +00:00
ac045fdc70 chore(auth): oauth2MockServer->azureMockServer 2024-02-01 20:53:55 +01:00
a85a5be4cd chore(auth): mockPluginName->apAzureMock 2024-02-01 20:51:31 +01:00
1d7b46b4a4 chore(npm): remove oauth2-mock-server 2024-02-01 12:20:47 +01:00
David Mosbach 453034100b feat(auth): admin handler can query user data 2024-01-31 14:32:49 +00:00
9c608070ae chore(db-fill): add missing user fields contd 2024-01-30 22:08:55 +01:00
aa81de74a4 chore(db-fill): add missing user fields 2024-01-30 22:02:48 +01:00
d9ed893b52 chore(application): fix ldapPool setup 2024-01-30 21:54:46 +01:00
dfa774f655 chore(users): campusUser->ldapUser 2024-01-30 21:54:20 +01:00
608d8a3661 chore(users): add missing azure id field for UsersAdd 2024-01-30 21:53:58 +01:00
3c4e6b62fb chore: fix constructor names 2024-01-30 21:53:30 +01:00
f39de71c02 chore(jobs): upsertAzureUser on synchronise user job with azure config 2024-01-30 21:52:30 +01:00
24dbaf36bc chore(form): add uuidField 2024-01-30 21:51:25 +01:00
43bf25a5bd chore(azure): implement azureUser variant 2024-01-30 21:50:56 +01:00
f4b8417deb chore(messages): add admin azure message 2024-01-30 21:50:19 +01:00
c8350722a4 chore(ldap): migrate more campusUser usages 2024-01-30 14:01:54 +01:00
af09e02801 chore(lms): add missing user fields for fake user 2024-01-30 13:52:33 +01:00
8e2a98c12b chore(foundation): fix ldap auth and user lookup 2024-01-30 11:42:45 +01:00
1cdb20eb60 chore(ldap): fix user lookup types 2024-01-30 11:20:44 +01:00
David Mosbach c8fa509ace feat(auth): tokens can be stored & refreshed 2024-01-30 05:06:06 +00:00
David Mosbach 5a023a9e32 chore(auth): added function for user queries to auth servers 2024-01-29 21:34:39 +00:00
David Mosbach 2763d2012a chore(auth): provide oauth2 test users yaml 2024-01-29 00:45:43 +00:00
264aaab24c chore: campus->ldap 2024-01-28 20:05:52 +01:00
c65dc04e8f chore: add missing AuthAzure case 2024-01-28 20:05:28 +01:00
a1ba004efa chore(messages): add message for Azure auth kind 2024-01-28 18:37:59 +01:00
514bca5257 chore: rename setting 2024-01-28 18:37:28 +01:00
9cbc35c263 chore(users): add azure id to AddUserData 2024-01-28 18:32:36 +01:00
84d7890ae4 chore(auth): oauth2User->azureUser 2024-01-28 18:32:14 +01:00
aa893062f1 chore(ldap): refactor ldapLogin type 2024-01-28 18:16:10 +01:00
d4a3459adf chore: user sources 2024-01-28 18:06:30 +01:00
David Mosbach 8acfc1d10c feat(auth): integrated oauth2 mock server 2024-01-28 12:53:00 +00:00
e9bbeffd7e chore(auth): campusLogin->ldapLogin 2024-01-28 12:45:59 +01:00
7e3e772055 chore(foundation): use multifunctional authenticate 2024-01-28 12:45:44 +01:00
471982d245 chore(application): reimplement ldapPool startup 2024-01-26 23:32:45 +01:00
3eec9ef8df refactor(jobs): ldap->userdb messages 2024-01-26 23:32:10 +01:00
ff5b31929e refactor(jobs): ldap->userdb 2024-01-26 23:31:13 +01:00
12bb8b7145 chore(foundation): loosen tight ldap<>failover coupling, move campusUser to ldapUser 2024-01-26 23:29:50 +01:00
2e005a90f2 chore(foundation): remove failover from ldap pool conf 2024-01-26 23:27:52 +01:00
843ac60aae chore(auth): oauth2->azure 2024-01-26 23:27:13 +01:00
a42ccb0faa chore(auth): campus->ldap 2024-01-26 23:26:53 +01:00
c929d42ebd chore(foundation): rename auth exceptions 2024-01-26 23:26:00 +01:00
4051d1e11b chore(settings): refactor userdb config structure 2024-01-26 23:24:40 +01:00
71af64dc28 chore(model): add AuthAzure 2024-01-26 23:22:58 +01:00
74f044919c chore(model): add azure primary key 2024-01-26 23:21:33 +01:00
9dc6ec461c chore(settings): simplify/flatten userdb config settings 2024-01-23 02:59:25 +01:00
1f31fe8cf2 chore(settings): add support for multiple modes for userdb 2024-01-23 02:16:06 +01:00
d56c9c3c31 Merge branch 'oauth2' into 142-userdata-oauth-mode 2024-01-22 10:36:43 +01:00
55ed01cb40 chore: improve settings, rename old ldap settings 2024-01-19 23:23:23 +01:00
Sarah Vaupel 9f299c854c chore(settings)!: rename userdb app settings 2024-01-19 14:53:00 +01:00
Sarah Vaupel 35902daff6 chore(settings): add default value for oauth2 scopes in yaml 2024-01-13 01:19:58 +01:00
Sarah Vaupel 31f657a15f chore(settings): fix oauth2 config json parsers 2024-01-13 01:14:54 +01:00
Sarah Vaupel 7946e046e2 chore(settings): update settings.yml 2024-01-13 00:42:25 +01:00
Sarah Vaupel 7ca12d064d refactor(settings): enhance field names 2024-01-13 00:40:57 +01:00
Sarah Vaupel 5e85eae825 refactor(settings): move ResourcePool, Ldap and OAuth2 settings to separate modules 2024-01-12 23:24:58 +01:00
Sarah Vaupel 3e9e90ed86 chore(settings): restructure Settings.hs; add OAuthConf to AppSettings 2024-01-12 17:14:42 +01:00
David Mosbach a67697d159 chore(admin): added oauth2 handling widget 2023-12-18 02:58:14 +00:00
David Mosbach ce8aa849f8 chore(admin): oauth2 admin form identifiers 2023-12-18 00:56:50 +00:00
5c4f742745 chore(admin): add basic admin route stub and navigation for response inspection 2023-12-13 16:36:52 +00:00
7b7b82cba3 Merge branch 'oauth2' into 140-admin-handler-fur-oauth-response-inspection 2023-12-13 14:52:32 +00:00
David Mosbach cf89722c7f chore(auth): enabled ldap lookup for oauth2 creds 2023-12-04 00:32:01 +00:00
David Mosbach 44d082f8b9 feat(auth): added azure & mock server to login widget 2023-12-03 23:23:44 +00:00
David Mosbach 9b9370fed0 feat(auth): WIP authorization function 2023-12-03 15:06:39 +00:00
David Mosbach 2351388826 feat(auth): WIP support for OAuth2 2023-12-03 03:49:20 +00:00
aa41004c39 chore(release): 27.4.49 2023-11-09 10:21:10 +00:00
29df39f3b5 Merge branch 'fradrive/company' into test 2023-11-08 17:03:01 +00:00
de005691f1 chore(release): 27.4.48 2023-11-03 16:59:25 +00:00
050516c0bc Merge branch 'fradrive/company' into test 2023-11-03 16:58:31 +00:00
e63c8751eb Merge branch 'master' into test 2023-11-03 15:36:04 +00:00
2a4158303e chore(release): 27.4.47 2023-10-27 23:49:40 +00:00
1797d4eb9b Merge branch 'fradrive/company' into test 2023-10-27 16:39:18 +00:00
307cda543e chore(release): 27.4.46 2023-10-26 17:14:40 +00:00
de19073e11 Merge branch 'fradrive/company' into test 2023-10-26 17:14:08 +00:00
18af65da10 Merge branch 'master' into test 2023-10-26 08:12:52 +00:00
45048ce62d Merge branch 'fradrive/company' into test 2023-10-24 16:15:32 +00:00
bc4594bea2 fix(build): comment planned model changes 2023-10-23 08:02:03 +00:00
e4883c62d0 chore(test): ensure test branch uses different filenames and idents 2023-10-20 16:49:08 +00:00
6e5a58aa37 Merge branch 'fradrive/company' into test 2023-10-20 16:46:30 +00:00
d495a31ad8 chore(qualifications): thoughts on the prerequisite modelling 2023-09-25 06:48:49 +00:00
1310 changed files with 24321 additions and 21211 deletions

3 .babelrc.license Normal file

@@ -0,0 +1,3 @@
SPDX-FileCopyrightText: 2022 Felix Hamann <felix.hamann@campus.lmu.de>,Sarah Vaupel <sarah.vaupel@ifi.lmu.de>,Sarah Vaupel <vaupel.sarah@campus.lmu.de>
SPDX-License-Identifier: AGPL-3.0-or-later

3 .eslintrc.json.license Normal file

@@ -0,0 +1,3 @@
SPDX-FileCopyrightText: 2022 Gregor Kleen <gregor.kleen@ifi.lmu.de>,Sarah Vaupel <sarah.vaupel@ifi.lmu.de>,Sarah Vaupel <vaupel.sarah@campus.lmu.de>
SPDX-License-Identifier: AGPL-3.0-or-later

21 .gitignore vendored

@@ -2,10 +2,8 @@
dist*
develop
node_modules/
.npm/
.node_repl_history
**/assets/icons
**/assets/favicons
assets/icons
assets/favicons
bin/
assets/fonts/
*.hi
@@ -40,21 +38,22 @@ uniworx.nix
.kateproject
src/Handler/Assist.bak
src/Handler/Course.SnapCustom.hs
frontend/src/env.sass
*.orig
/instance
backend/instance
.stack-work-*
.stack-work.lock
.directory
tags
test.log
*.dump-splices
/.stack-work.lock
/.npmrc
/.npm/
/config/manifest.json
tunnel.log
static
well-known
.well-known-cache
manifest.json
/static
/well-known
/.well-known-cache
/.nix-well-known
/**/tmp-*
/testdata/bigAlloc_*.csv
@@ -68,4 +67,4 @@ manifest.json
**/result-*
.develop.cmd
/.vscode
backend/.ghc/ghci_history
.ghc/ghci_history

272 .gitlab-ci/sanitize-docker.pl Executable file

@@ -0,0 +1,272 @@
#!/usr/bin/env perl
use strict;
use warnings;
use Data::Dumper;
print "Sanitize script for node removal from container.\n";
system("pwd");
{
my @l = (".","..");
for(1..8) {
push @l, (("../" x $_)."..")
}
for(@l) {
my $cmd = "ls -ld $_";
print "running: $cmd\n";
system $cmd;
}
}
my $tmpdir = "tmp-sanitize";
die "Has already run, abort" if -e $tmpdir;
mkdir $tmpdir;
chmodWrap(0755, $tmpdir);
chdir($tmpdir);
system("ln -s ../uniworx.tar.gz .");
system("tar xzvf uniworx.tar.gz");
chmodWrap(0755, '.'); # tar can change the rights of '.' if it contains an entry for '.' with other rights
my %truerights = ();
storeRightsMake7(".");
#print "=== Extended rights:\n";
#system("ls -l *");
#resetRights(".");
#print "=== Reset rights:\n";
#system("ls -l *");
sub chmodWrap {
my ($mode, $fn) = @_;
my $tries = 0;
die "file '$fn' does not exist; cannot change its permissions to $mode" unless -e $fn;
RIGHTS: {
chmod($mode, $fn);
my $ismode = (stat($fn))[2];
my $fm = $ismode % 512;
if($fm != $mode) {
if($tries++ > 20) {
die "Problem with file permissions, abort"
}
warn sprintf "File rights were meant to be set, but were not updated properly for file '%s', is %03o but was set to %03o; try again in 1 second", $fn, $fm, $mode;
sleep 1;
redo RIGHTS;
}
}
}
#
sub storeRightsMake7 {
my ($pwd) = @_;
my $dh = undef;
opendir($dh, $pwd) or die "Could not read dir '$pwd', because: $!";
while(my $fn = readdir($dh)) {
next if $fn=~m#^\.\.?$#;
#perl -le 'my $dh = undef;opendir($dh, ".");while(my $fn = readdir($dh)) { my $mode = (stat($fn))[2];my $fm = $mode % 512;my $fmo=sprintf("%03o",$fm);print "$fn -> $fmo" }'
my $fullname = "$pwd/$fn";
my $mode = (stat($fullname))[2];
my $fm = $mode % 512;
#my $fmo = sprintf("%03o",$fm);
$truerights{$fullname} = $fm;
chmodWrap(($fm | 0700), $fullname);
storeRightsMake7($fullname) if -d $fullname;
}
}
sub resetRights {
my ($pwd) = @_;
print "Resetting rights to:\n" if '.' eq $pwd;
print Data::Dumper::Dumper(\%truerights);
my $dh = undef;
opendir($dh, $pwd) or die "Could not read dir '$pwd', because: $!";
while(my $fn = readdir($dh)) {
next if $fn=~m#^\.\.?$#;
#perl -le 'my $dh = undef;opendir($dh, ".");while(my $fn = readdir($dh)) { my $mode = (stat($fn))[2];my $fm = $mode % 512;my $fmo=sprintf("%03o",$fm);print "$fn -> $fmo" }'
my $fullname = "$pwd/$fn";
printf(" set rights of '$fullname' back to %03o\n", $truerights{$fullname});
chmodWrap($truerights{$fullname}, $fullname);
resetRights($fullname) if -d $fullname;
}
}
sub renameWithRights {
my ($from, $to) = @_;
print " rename file '$from' to '$to'\n";
my %oldrights = %truerights;
%truerights = ();
while(my ($k,$v) = each %oldrights) {
$k =~ s#^\./\Q$from\E#./$to#;
$truerights{$k} = $v;
}
#my $rights = $truerights{$from};
#delete $truerights{$from};
rename($from, $to) or die "Could not rename '$from' to '$to', because $!";
my $waittimer = 20;
while(-e $from || not(-e $to) and $waittimer-- > 0) {
sleep 1
}
die "rename file from '$from' to '$to', but it is still there" if -e $from;
die "rename file from '$from' to '$to', but there is no file under the new name" unless -e $to;
#$truerights{$to} = $rights
}
print Data::Dumper::Dumper(\%truerights);
#exit 0;
# Checksums:
# outerjson c27f -- toplevel $outerjson.json, by sha256sum $outerjson.json
# imageid d940 -- top-level directory containing the layer; doc says: Each image's ID is given by the SHA256 hash of its configuration JSON.
# we'll try as configuration "remove nodejs $oldhash"
# or we just use a random number ;)
# layertar fd3d -- doc says: Each image's ID is given by the SHA256 hash of its configuration JSON.
#
##### FOUND
# outerjson c27f64c8de183296ef409baecc27ddac8cd4065aac760b1b512caf482ad782dd -- in manifest.json
# imageid d940253667b5ab47060e8bf537bd5b3e66a2447978f3c784a22b115a262fccbf -- in manifest.json
# imageid d940253667b5ab47060e8bf537bd5b3e66a2447978f3c784a22b115a262fccbf -- as toplevel dirname
# outerjson c27f64c8de183296ef409baecc27ddac8cd4065aac760b1b512caf482ad782dd -- as toplevel filename
# imageid d940253667b5ab47060e8bf537bd5b3e66a2447978f3c784a22b115a262fccbf -- in $layerdir/json
# layertar fd3d3cdf4ece09864ac933aa664eb5f397cf5ca28652125addd689726f8485cd -- in $outerjson.json
#
#
##### COMPUTE
# toplevel
# outerjson c27f64c8de183296ef409baecc27ddac8cd4065aac760b1b512caf482ad782dd $outerjson.json
# b21db3fcc85b23d91067a2a5834e114ca9eec0364742c8680546f040598d8cd9 manifest.json
# 238f234e3a1ddb27a034f4ee1e59735175741e5cc05673b5dd41d9a42bac2ebd uniworx.tar.gz
# in $layerdir/
# 028c1e8d9688b420f7316bb44ce0e26f4712dc21ef93c5af8000c102b1405ad4 json
# layertar fd3d3cdf4ece09864ac933aa664eb5f397cf5ca28652125addd689726f8485cd layer.tar
# d0ff5974b6aa52cf562bea5921840c032a860a91a3512f7fe8f768f6bbe005f6 VERSION
#
#
# sha256sum layer.tar fd3d3cdf4ece09864ac933aa664eb5f397cf5ca28652125addd689726f8485cd
my ($outerjson, $imageid) = ();
{
my $dirh = undef;
opendir($dirh, '.') or die "Could not read dir '.', because: $!";
while(my $fn = readdir($dirh)) {
next if $fn=~m#^\.#;
if($fn=~m#(.{16,})\.json#) { # it shall match on hash sums but not for example on manifest.json
$outerjson = $1;
next
}
if($fn=~m#^[0-9a-f]{64}$#) {
$imageid = $fn
}
}
}
die "Bad archive, could not found expected files and directories" unless defined($outerjson) and defined($imageid);
#system("pwd");
#print "will run: sha256sum $imageid/layer.tar\n";
my $oldLayerdir = qx(sha256sum $imageid/layer.tar);
#print "oldLayerdir is for now $oldLayerdir\n\n";
$oldLayerdir =~ m#^([0-9a-f]{64}).*$# or die "layer.tar not found or sha256sum not installed!";
$oldLayerdir = $1;
# tar --delete --file layer.tar nix/store/cdalbhzm3z4gz07wyg89maprdbjc4yah-nodejs-14.17.0
my $layerContent = qx(tar -tf $imageid/layer.tar);
my @rms = $layerContent=~m#^((?:\./)?nix/store/[a-z0-9]+-(?:nodejs|openjdk|ghc)-[^/]+/)$#gm;
print "rm <<$_>>\n" for @rms;
system("tar --delete --file $imageid/layer.tar '$_'") for @rms;
### Deconstruction finished, now lets put everything together again after fixing the checksums
my $newImageId = qx(echo 'remove nodejs $imageid' | sha256sum);
$newImageId =~ m#^([0-9a-f]{64}).*$# or die "sha256sum not installed!";
$newImageId = $1;
my $newLayerdir = qx(sha256sum $imageid/layer.tar);
$newLayerdir =~ m#^([0-9a-f]{64}).*$# or die "sha256sum not installed!";
$newLayerdir = $1;
# new outerjson is computed later, as we first have to change its content
sub cautionWaiter {
# some file operations give the impression that they are not instant.
# Hence, we wait here a bit to see if that fixes stuff
#sleep 5; # seems not to be the reason
}
sub replaceInFile {
my ($filename, $replacer) = @_;
return unless -e $filename;
my $fh = undef;
open($fh, '<', $filename) or die "Could not read $filename, because: $!";
my $content = join '', <$fh>;
close $fh;
keys %$replacer;
while(my ($k,$v) = each %$replacer) {
$content=~s#\Q$k\E#$v#g;
}
my $wh = undef;
open($wh, '>', $filename) or die "Could not write $filename, because: $!";
print $wh $content;
close $wh;
}
my %replacer = (
$oldLayerdir => $newLayerdir,
$imageid => $newImageId,
);
replaceInFile("$imageid/json", \%replacer);
replaceInFile("$outerjson.json", \%replacer);
cautionWaiter();
my $newOuterjson = qx(sha256sum '$outerjson.json');
$newOuterjson =~ m#^([0-9a-f]{64}).*$# or die "sha256sum not installed!";
$newOuterjson = $1;
cautionWaiter();
renameWithRights("$outerjson.json", "$newOuterjson.json");
$replacer{$outerjson} = $newOuterjson;
replaceInFile("manifest.json", \%replacer);
replaceInFile("repositories", \%replacer);
cautionWaiter();
renameWithRights($imageid, $newImageId);
cautionWaiter();
resetRights(".");
system("find");
unlink("uniworx.tar.gz");
system("tar czvf uniwox-rmnodejs.tar.gz *");
cautionWaiter();
print "Debug output, content of container:\n";
system("tar tzvf uniwox-rmnodejs.tar.gz");
cautionWaiter();
#unlink("../uniworx.tar.gz");
system("cp uniwox-rmnodejs.tar.gz ../uniworx-sanitized.tar.gz");

64 .ports/assign.hs Normal file

@@ -0,0 +1,64 @@
-- SPDX-FileCopyrightText: 2024 David Mosbach <david.mosbach@uniworx.de>
--
-- SPDX-License-Identifier: AGPL-3.0-or-later
{-# Language OverloadedStrings, LambdaCase, TypeApplications #-}
import Data.Text (Text)
import qualified Data.Text as T
import System.Directory
import System.Environment
import System.IO
main :: IO ()
main = getArgs >>= \case
["--assign", offsetFile] -> parseOffsets offsetFile >>= uncurry nextOffset
["--remove", offset] -> removeOffset offset
_ -> fail "unsupported args"
parseOffsets :: FilePath -> IO (Int,Int)
parseOffsets offsetFile = do
user <- T.pack <$> getEnv "USER"
let pred x = "//" `T.isPrefixOf` x || T.null (T.strip x)
tokenise = map (filter (not . pred) . T.lines) . T.split (=='#')
extract = map tail . filter (\u -> not (null u) && user == (T.strip $ head u))
((extract . tokenise . T.pack) <$> readFile offsetFile) >>= \case
[[min,max]] -> return (read $ T.unpack min, read $ T.unpack max)
x -> print x >> fail "malformed offset file"
nextOffset :: Int -> Int -> IO ()
nextOffset min max
| min > max = nextOffset max min
| otherwise = do
home <- getEnv "HOME"
offset <- findFile [home] ".port-offsets" >>= \case
Nothing -> writeFile (home ++ "/.port-offsets") (show min) >> return min
Just path -> do
used <- (map (read @Int) . filter (not . null) . lines) <$> readFile path
o <- next min max used
appendFile path ('\n' : show o)
return o
print offset
where
next :: Int -> Int -> [Int] -> IO Int
next min max used
| min > max = fail "all offsets currently in use"
| min `elem` used = next (min+1) max used
| otherwise = return min
removeOffset :: String -> IO ()
removeOffset offset = do
home <- getEnv "HOME"
findFile [home] ".port-offsets" >>= \case
Nothing -> fail "offset file does not exist"
Just path -> do
remaining <- (filter (/= offset) . lines) <$> readFile path
run <- getEnv "XDG_RUNTIME_DIR"
(tempPath, fh) <- openTempFile run ".port-offsets"
let out = unlines remaining
hPutStr fh $ out
case T.null (T.strip $ T.pack out) of
True -> removeFile path
False -> writeFile path $ out
removeFile tempPath

24 .ports/offsets Normal file

@@ -0,0 +1,24 @@
// SPDX-FileCopyrightText: 2024 David Mosbach <david.mosbach@uniworx.de>
//
// SPDX-License-Identifier: AGPL-3.0-or-later
# gkleen
-1000
-950
# ishka
-949
-899
# jost
-898
-848
# mosbach
-847
-797
# savau
-796
-746
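
For reference, the selection rule that .ports/assign.hs applies to these per-user ranges can be restated as a small pure function. This is an editor's sketch, not code from the change set:

-- Smallest offset in a user's (min, max) block that is not already recorded in
-- ~/.port-offsets; mirrors nextOffset/next in .ports/assign.hs above.
pickOffset :: Int -> Int -> [Int] -> Maybe Int
pickOffset lo hi used = case [o | o <- [lo .. hi], o `notElem` used] of
  (o:_) -> Just o
  []    -> Nothing

-- e.g. for the "# savau" block: pickOffset (-796) (-746) [-796, -795] == Just (-794)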

File diff suppressed because it is too large

480 Makefile

@@ -1,59 +1,91 @@
export SHELL=bash
export CLEAN_DEPENDENCIES ?= false
export CLEAN_IMAGES ?= false
# MAKE=make -f Makefile-loggingsymbols
# MAKE=make -d
export ENTRYPOINT ?= bash
# System information
export CPU_CORES = $(shell cat /proc/cpuinfo | grep '^processor' | wc -l)
export CONTAINER_COMMAND ?= podman
export CONTAINER_BGRUN ?= $(CONTAINER_COMMAND) run -dit --network=host --replace
export CONTAINER_FGRUN ?= $(CONTAINER_COMMAND) run -it --network=host --replace
export IMAGE_REGISTRY = docker.io
export MEMCACHED_IMAGE = $(IMAGE_REGISTRY)/memcached:latest
export MINIO_IMAGE = $(IMAGE_REGISTRY)/minio/minio:latest
export MAILDEV_IMAGE = $(IMAGE_REGISTRY)/maildev/maildev:latest # TODO: needs different port than 1025 to avoid conflicts
export IN_CONTAINER ?= false
export IN_CI ?= false
export CONTAINER_FILE
export CONTAINER_IDENT
export CF_PREFIX
export DEVELOP
export CONTAINER_ATTACHED
export CONTAINER_INIT
export CONTAINER_CLEANUP
export PROJECT_DIR=/fradrive
export SERVICE
export SERVICE_VARIANT ?= $(SERVICE)
export JOB
export IMAGE
export SET_IMAGE
export ENTRYPOINT
export EXEC_OPTS
export STACK_CORES = $(shell echo $(($(CPU_CORES)/2)))
export BASE_PORTS
export UNIWORXDB_OPTS ?= -cf
export PROD ?= false
export SRC
ifneq ($(PROD),true)
export --DEVELOPMENT=--flag uniworx:dev
endif
export DATE := $(shell date +'%Y-%m-%dT%H-%M-%S')
export CURR_DEV = $(shell cat develop/.current 2>/dev/null)
export SET_DEVELOP = $(eval DEVELOP=develop/$$(CURR_DEV))
export NEW_DEVELOP = $(eval DEVELOP=develop/$$(DATE))
.PHONY: help
# HELP: print out this help message
help:
docker compose run help
@if [ -z "$$(which perl 2>/dev/null)" ] ; then \
$(CONTAINER_FGRUN) .:/mnt 'debian:12.5' '/mnt/utils/makehelp.pl' '/mnt/Makefile' ; \
else \
utils/makehelp.pl Makefile ; \
fi
.PHONY: clean
# HELP: clean compilation caches
# HELP: stop all running containers and remove all compilation results in the directory (but leave images including dependencies unharmed)
clean:
$(MAKE) clean-frontend CLEAN_DEPENDENCIES=$(CLEAN_DEPENDENCIES) CLEAN_IMAGES=$(CLEAN_IMAGES)
$(MAKE) clean-backend CLEAN_DEPENDENCIES=$(CLEAN_DEPENDENCIES) CLEAN_IMAGES=$(CLEAN_IMAGES)
rm -rf develop
-rm -rf node_modules .npm .cache assets/icons assets/favicons static well-known config/manifest.json frontend/src/env.sass
-rm -rf .stack-work .stack-work.lock
-rm -rf bin .Dockerfile develop
-$(CONTAINER_COMMAND) container prune --force
.PHONY: clean-images
# HELP: stop all running containers and clean all images from local repositories
clean-images:
rm -rf develop
sleep 5
-$(CONTAINER_COMMAND) system prune --all --force --volumes
-$(CONTAINER_COMMAND) image prune --all --force
-$(CONTAINER_COMMAND) volume prune --force
.PHONY: clean-all
# HELP: clean everything, including dependency and image caches
clean-all: CLEAN_DEPENDENCIES = true
clean-all: CLEAN_IMAGES = true
clean-all: clean ;
# HELP: like clean but with full container, image, and volume prune
clean-all: clean
-rm -rf .stack
$(CONTAINER_COMMAND) system reset --force
.PHONY: clean-%
# HELP(clean-$SERVICE): invalidate caches for a given service. Supported services: frontend, backend.
clean-%:
$(MAKE) stop-$*
@$(MAKE) -- --clean-$*
@echo "Cleaned $* build files and binaries."
ifeq ("$(CLEAN_DEPENDENCIES)", "true")
@$(MAKE) -- --clean-$*-deps
@echo "Cleaned $* dependencies."
endif
ifeq ("$(CLEAN_IMAGES)", "true")
$(MAKE) kill-$*
docker compose rm --force --volumes
docker compose down --rmi 'all' --volumes
@echo "Cleaned $* image."
endif
--clean-frontend:
-rm -rf assets/icons assets/favicons
-rm -rf static well-known
--clean-frontend-deps:
-rm -rf frontend/node_modules
-rm -rf frontend/.npm
--clean-backend:
-rm -rf backend/.stack-work
-rm -rf bin/
--clean-backend-deps:
-rm -rf backend/.stack
# TODO: only release when build and tests are passing!!!
.PHONY: release
# HELP: create, commit and push a new release
# TODO: only release when build and tests are passing!!!
release:
VERSION=`./utils/version.pl -changelog CHANGELOG.md -v` ; \
git add CHANGELOG.md ; \
@@ -64,60 +96,346 @@ release:
.PHONY: compile
# HELP: perform full compilation (frontend and backend)
compile: compile-frontend compile-backend ;
.PHONY: compile-%
# HELP(compile-$SERVICE): compile a given service once
compile-%:
docker compose run --remove-orphans --build --no-deps $* make compile
compile:
$(MAKE) compile-frontend
$(MAKE) compile-backend
.PHONY: start
# HELP: start complete development environment with a fresh test database
start: start-postgres start-maildev start-memcached start-minio start-backend
docker compose exec backend make start
start:
$(MAKE) start-postgres
$(MAKE) start-memcached
$(MAKE) start-minio
$(MAKE) start-maildev
$(MAKE) compile-frontend
$(MAKE) compile-uniworxdb
$(MAKE) start-backend
.PHONY: %-backend
%-backend: SERVICE=backend
%-backend: SERVICE_VARIANT=backend
%-backend: IMAGE=localhost/fradrive/backend
%-backend: BASE_PORTS = "DEV_PORT_HTTP=3000" "DEV_PORT_HTTPS=3443"
.PHONY: %-uniworxdb
%-uniworxdb: SERVICE=backend
%-uniworxdb: SERVICE_VARIANT=uniworxdb
%-uniworxdb: IMAGE=localhost/fradrive/backend
.PHONY: %-ghci
%-ghci: SERVICE=backend
%-ghci: SERVICE_VARIANT=ghci
%-ghci: IMAGE=localhost/fradrive/backend
.PHONY: %-hoogle
%-hoogle: SERVICE=backend
%-hoogle: SERVICE_VARIANT=hoogle
%-hoogle: BASE_PORTS = "HOOGLE_PORT=8081"
%-hoogle: IMAGE=localhost/fradrive/backend
--start-hoogle:
HOOGLE_PORT=`cat $(CONTAINER_FILE) | grep 'HOOGLE_PORT=' | sed 's/HOOGLE_PORT=//'` ; \
stack $(STACK_CORES) hoogle -- server --local --port $${HOOGLE_PORT}
.PHONY: %-frontend
%-frontend: SERVICE=frontend
%-frontend: SERVICE_VARIANT=frontend
%-frontend: IMAGE=localhost/fradrive/frontend
.PHONY: %-postgres
%-postgres: SERVICE=postgres
%-postgres: SERVICE_VARIANT=postgres
%-postgres: BASE_PORTS = "PGPORT=5432"
%-postgres: SET_IMAGE=localhost/fradrive/postgres
.PHONY: %-memcached
%-memcached: SERVICE=memcached
%-memcached: SERVICE_VARIANT=memcached
%-memcached: SET_IMAGE=$$(MEMCACHED_IMAGE) --port=`cat $$(CONTAINER_FILE) | grep 'MEMCACHED_PORT=' | sed 's/MEMCACHED_PORT=//'`
%-memcached: BASE_PORTS = "MEMCACHED_PORT=11211"
.PHONY: %-maildev
%-maildev: SERVICE=maildev
%-maildev: SERVICE_VARIANT=maildev
%-maildev: SET_IMAGE=$$(MAILDEV_IMAGE) --port=`cat $$(CONTAINER_FILE) | grep 'MAILDEV_PORT=' | sed 's/MAILDEV_PORT=//'`
%-maildev: BASE_PORTS = "MAILDEV_PORT=1025"
.PHONY: %-release
%-release: PROD=true
%-release: SERVICE=fradrive
%-release: SERVICE_VARIANT=fradrive
%-release: IMAGE=localhost/fradrive/fradrive
.PHONY: %-minio
%-minio: SERVICE=minio
%-minio: SERVICE_VARIANT=minio
%-minio: SET_IMAGE=$$(MINIO_IMAGE) -- server `mktemp` --address=:`cat $$(CONTAINER_FILE) | grep 'UPLOAD_S3_PORT=' | sed 's/UPLOAD_S3_PORT=//'`
%-minio: BASE_PORTS = "UPLOAD_S3_PORT=9000"
.PHONY: start-%
# HELP(start-$SERVICE): start a given service
start-%:
docker compose up -d --build $*
start-%: JOB=start
start-%: CF_PREFIX = start-
start-%: CONTAINER_ATTACHED = false
start-%: --act ;
.PHONY: compile-%
compile-%: JOB=compile
compile-%: CF_PREFIX = compile-
compile-%: CONTAINER_ATTACHED = true
compile-%: --act ;
.PHONY: dependencies-%
dependencies-%: JOB=dependencies
dependencies-%: CF_PREFIX = dependencies-
dependencies-%: CONTAINER_ATTACHED = true
dependencies-%: --act ;
.PHONY: test-%
test-%: JOB=test
test-%: CF_PREFIX = test-
test-%: CONTAINER_ATTACHED = true
test-%: --act ;
.PHONY: lint-%
lint-%: JOB=lint
lint-%: CF_PREFIX = lint-
lint-%: CONTAINER_ATTACHED = true
lint-%: --act ;
.PHONY: shell-%
# HELP(shell-$SERVICE): launch a (bash) shell inside a given service
shell-%:
docker compose run --build --no-deps --entrypoint="$(ENTRYPOINT)" $*
# HELP(shell-$SERVICE): launch (bash) shell inside a new $SERVICE container
shell-%: JOB=shell
shell-%: CF_PREFIX=shell-
shell-%: CONTAINER_ATTACHED=true
shell-%: --act ;
.PHONY: ghci
# HELP: launch ghci instance. Use in combination with SRC to specify the modules to be loaded by ghci: make ghci SRC=src/SomeModule.hs
ghci: ENTRYPOINT=stack ghci $(SRC)
ghci: shell-backend ;
# HELP(ghci): launch new backend instance and enter interactive ghci shell
ghci: shell-ghci;
.PHONY: stop
# HELP: stop all services
stop:
docker compose down
.PHONY: stop-%
# HELP(stop-$SERVICE): stop a given service
stop-%:
docker compose down $*
.PHONY: kill-%
# HELP(kill-$SERVICE): kill a given service the hard way. Use this if the service does not respond to stop.
kill-%:
docker compose kill $*
--act: --develop_containerized;
--develop_%: PORTS = $(foreach PORT,$(BASE_PORTS),$(shell utils/next_free_port.pl $(PORT)))
--develop_%: --ensure-develop
DEVELOP=develop/`cat develop/.current` ; \
CONTAINER_IDENT=$(CF_PREFIX)$(SERVICE_VARIANT) ; \
CONTAINER_FILE=$${DEVELOP}/$${CONTAINER_IDENT} ; \
if [[ -e $${CONTAINER_FILE} ]]; then \
>&2 echo "Another $* service is already running! Use \"make new-develop\" to start a new develop instance despite currently running services." ; \
exit 1 ; \
fi ; \
echo "$(PORTS)" | sed 's/ /\n/g' > $${CONTAINER_FILE} ; \
$(MAKE) -- --$* CONTAINER_FILE=$${CONTAINER_FILE} CONTAINER_IDENT=$${CONTAINER_IDENT} JOB=$(JOB)
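# (Editor's illustration, not part of the Makefile.) After --develop_% has run for,
# say, "start-backend", the container file develop/<DATE>/start-backend holds the
# assigned ports, one per line, and --containerized later appends the container id.
# The port numbers below are hypothetical; they come from utils/next_free_port.pl:
#   DEV_PORT_HTTP=3001
#   DEV_PORT_HTTPS=3444
#   CONTAINER_ID=<id reported by $(CONTAINER_BGRUN)>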
.PHONY: rebuild-%
# HELP(rebuild-{backend,frontend,database,memcached,minio}): force-rebuild a given container image
rebuild-%:
$(MAKE) -- --image-build SERVICE=$* NO_CACHE=--no-cache
--image-build:
ifeq "$(IMAGE)" "localhost/fradrive/$(SERVICE)"
rm -f .Dockerfile
ln -s docker/$(SERVICE)/Dockerfile .Dockerfile
PROJECT_DIR=/fradrive; \
if [ "$(IN_CONTAINER)" == "false" ] ; then \
$(CONTAINER_COMMAND) build $(NO_CACHE) \
-v $(PWD):$${PROJECT_DIR}:rw \
--build-arg PROJECT_DIR=$${PROJECT_DIR} \
--env IN_CONTAINER=true \
--env JOB=$(JOB) \
--tag fradrive/$(SERVICE) \
--file $(PWD)/.Dockerfile ; \
fi
else
:
endif
--containerized: --image-build
DEVELOP=`cat develop/.current` ; \
./utils/watchcontainerrun.sh "$(CONTAINER_COMMAND)" "$(CONTAINER_FILE)" "$(CONTAINER_INIT)" "$(CONTAINER_CLEANUP)" & \
CONTAINER_NAME=fradrive.$(CURR_DEV).$(CONTAINER_IDENT) ; \
if ! [ -z "$(SET_IMAGE)" ] ; \
then \
IMAGE="$(SET_IMAGE)" ; \
else \
IMAGE=$(IMAGE) ; \
MAKECALL="make -- --$(JOB)-$(SERVICE_VARIANT) IN_CONTAINER=true" ; \
fi ; \
CONTAINER_ID=`$(CONTAINER_BGRUN) \
-v $(PWD):$(PROJECT_DIR):rw \
--env IN_CONTAINER=true \
--env CONTAINER_FILE=$(CONTAINER_FILE) \
--env CONTAINER_NAME=$${CONTAINER_NAME} \
--env JOB=$(JOB) \
--env SRC=$(SRC) \
--name $${CONTAINER_NAME} \
$${IMAGE} \
$${MAKECALL} \
` ; \
printf "CONTAINER_ID=$${CONTAINER_ID}" >> "$(CONTAINER_FILE)" ; \
if [[ "true" == "$(CONTAINER_ATTACHED)" ]] ; then \
$(CONTAINER_COMMAND) attach $${CONTAINER_ID} || : ; \
fi
# For Reverse Proxy Problem see: https://groups.google.com/g/yesodweb/c/2EO53kSOuy0/m/Lw6tq2VYat4J
# HELP(start-backend): start development instance
--start-backend: --dependencies-backend
export YESOD_IP_FROM_HEADER=true; \
export DEV_PORT_HTTP=`cat $(CONTAINER_FILE) | grep 'DEV_PORT_HTTP=' | sed 's/DEV_PORT_HTTP=//'`; \
export DEV_PORT_HTTPS=`cat $(CONTAINER_FILE) | grep 'DEV_PORT_HTTPS=' | sed 's/DEV_PORT_HTTPS=//'`; \
export HOST=127.0.0.1 ; \
export PORT=$${PORT:-$${DEV_PORT_HTTP}} ; \
export DETAILED_LOGGING=$${DETAILED_LOGGING:-true} ; \
export LOG_ALL=$${LOG_ALL:-false} ; \
export LOGLEVEL=$${LOGLEVEL:-info} ; \
export DUMMY_LOGIN=$${DUMMY_LOGIN:-true} ; \
export SERVER_SESSION_ACID_FALLBACK=$${SERVER_SESSION_ACID_FALLBACK:-true} ; \
export SERVER_SESSION_COOKIES_SECURE=$${SERVER_SESSION_COOKIES_SECURE:-false} ; \
export COOKIES_SECURE=$${COOKIES_SECURE:-false} ; \
export ALLOW_DEPRECATED=$${ALLOW_DEPRECATED:-true} ; \
export ENCRYPT_ERRORS=$${ENCRYPT_ERRORS:-false} ; \
export RIBBON=$${RIBBON:-$${HOST:-localhost}} ; \
export APPROOT=$${APPROOT:-http://localhost:$${DEV_PORT_HTTP}} ; \
export AVSPASS=$${AVSPASS:-nopasswordset} ; \
stack $(STACK_CORES) exec --local-bin-path $$(pwd)/bin --copy-bins -- yesod devel -p "$${DEV_PORT_HTTP}" -q "$${DEV_PORT_HTTPS}"
# HELP(compile-backend): compile backend binaries
--compile-backend: --dependencies-backend
stack build $(STACK_CORES) --fast --profile --library-profiling --executable-profiling --flag uniworx:-library-only $(--DEVELOPMENT) --local-bin-path $$(pwd)/bin --copy-bins
# HELP(dependencies-backend): (re-)build backend dependencies
--dependencies-backend: #uniworx.cabal
chown -R `id -un`:`id -gn` "$(PROJECT_DIR)"; \
stack install hpack; stack install yesod-bin; \
stack build -j2 --only-dependencies
# HELP(lint-backend): lint backend
--lint-backend:
stack build $(STACK_CORES) --test --fast --flag uniworx:library-only $(--DEVELOPMENT) uniworx:test:hlint
# HELP(test-backend): test backend
--test-backend:
stack build $(STACK_CORES) --test --coverage --fast --flag uniworx:library-only $(--DEVELOPMENT)
# uniworx.cabal:
# stack exec -- hpack --force
# HELP(compile-frontend): compile frontend assets
--compile-frontend: --dependencies-frontend
npm run build
--start-frontend: --compile-frontend;
--dependencies-frontend: node_modules assets esbuild.config.mjs frontend/src/env.sass;
node_modules: package.json package-lock.json
npm install --cache .npm --prefer-offline
package-lock.json: package.json
npm install --cache .npm --prefer-offline
assets: assets/favicons assets/icons;
assets/favicons:
./utils/faviconize.pl assets/favicon.svg long assets/favicons
assets/icons: node_modules assets/icons-src/fontawesome.json
./utils/renamer.pl node_modules/@fortawesome/fontawesome-free/svgs/solid assets/icons-src/fontawesome.json assets/icons/fradrive
./utils/renamer.pl node_modules/@fortawesome/fontawesome-free/svgs/regular assets/icons-src/fontawesome.json assets/icons/fradrive
-cp assets/icons-src/*.svg assets/icons/fradrive
frontend/src/env.sass:
echo "\$$path: '$${PROJECT_DIR}'" > frontend/src/env.sass
static: --dependencies-frontend
npm run build
well-known: static;
--lint-frontend: --compile-frontend
npm run lint
--test-frontend: --compile-frontend
npm run test
# HELP(compile-uniworxdb): clear and fill database. requires running postgres instance (use "make start-postgres" to start one)
# TODO (db-m-$MIGRATION-backend): apply migration (see src/Model/Migration/Definition.hs for list of available migrations)
--compile-uniworxdb: --compile-backend
SERVER_SESSION_ACID_FALLBACK=${SERVER_SESSION_ACID_FALLBACK:-true} ; \
AVSPASS=${AVSPASS:-nopasswordset} ; \
./bin/uniworxdb $(UNIWORXDB_OPTS)
# HELP(shell-ghci): enter ghci shell. Use "make ghci SRC=<MODULE_FILE.hs>" to load specific source modules."
--shell-ghci:
stack ghci -- $(SRC)
# --main-is uniworx:exe:uniworx
# HELP(shell-{backend,frontend,memcached,minio,postgres}): enter (bash) shell inside a new container of a given service
--shell-%:
/bin/bash
# HELP(start-minio): start minio service
.PHONY: status
# HELP: print an overview of currently running services and their health
# HELP: print develop status: running containers, used ports
status:
docker compose ps
.PHONY: top
# HELP: print an overview of the resource usage of the currently running services
top:
docker compose stats
.PHONY: list-projects
# HELP: list all currently running projects on this machine
list-projects:
docker compose ls
@./utils/develop-status.pl -a
.PHONY: log-%
# HELP(log-$SERVICE): follow the output of a given service. Service must be running.
# HELP(log-$(JOB)-$(SERVICE)): inspect output of a given service. The service must be currently running. When a service supports multiple running instances in one develop (e.g. backend), you need to specify the exact instance by its associated file (e.g. backend-1, backend-2, etc.); check the contents of the develop/ directory for a list of running instances.
log-%:
docker compose logs --follow --timestamps $*
DEVELOP=develop/`cat develop/.current` ; \
SEARCH_FILE="$${DEVELOP}/$*" ; \
if [[ ! -e "$${SEARCH_FILE}" ]] ; then \
SEARCH_FILE="$${DEVELOP}/.exited.$*" ; \
fi ; \
if [[ -e "$${SEARCH_FILE}" ]] ; then \
$(CONTAINER_COMMAND) logs --follow `cat "$${SEARCH_FILE}" | grep CONTAINER_ID= | sed 's/^CONTAINER_ID=//'` ; \
else \
>&2 echo "Cannot show log: No develop file found for '$*'" ; \
exit 1 ; \
fi
.PHONY: enter
# HELP: launch (bash) shell inside a currently running container. Use ./enter shell wrapper for more convenient usage, possibly with tab-completion in the future
enter: --ensure-develop
$(MAKE) -- --enter
.PHONY: psql
# HELP: enter psql (postgresql) cli inside a currently running database container
psql: ENTRYPOINT=/usr/bin/psql -d uniworx
psql: EXEC_OPTS=--user postgres
psql: --ensure-develop
$(MAKE) -- --enter CONTAINER_FILE=develop/`cat develop/.current`/start-postgres
--enter:
CONTAINER_ID=`cat $(CONTAINER_FILE) | grep 'CONTAINER_ID=' | sed 's/CONTAINER_ID=//'` ; \
$(CONTAINER_COMMAND) exec -it $(EXEC_OPTS) $${CONTAINER_ID} $(if $(ENTRYPOINT),$(ENTRYPOINT),/bin/bash)
.PHONY: stop
# HELP: stop all currently running develop instances
stop:
rm -rf develop
.PHONY: stop-%
# HELP(stop-SERVICE): stop all currently running develop instances of a given service (i.e. backend,frontend,uniworxdb,hoogle,postgres,...)
# HELP(stop-JOB): stop all currently running develop instances of a given job (i.e. compile,start,test,lint)
stop-compile: CF_PREFIX=compile-
stop-start: CF_PREFIX=start-
stop-test: CF_PREFIX=test-
stop-lint: CF_PREFIX=lint-
stop-%: --stop;
--stop:
$(SET_DEVELOP)
ifdef CF_PREFIX
rm -rf $(DEVELOP)/$(CF_PREFIX)*
endif
ifdef SERVICE_VARIANT
rm -rf $(DEVELOP)/*-$(SERVICE_VARIANT)
endif
.PHONY: new-develop
# HELP: instantiate new development bundle, i.e. create new directory under develop/
new-develop:
$(NEW_DEVELOP)
mkdir -p $(DEVELOP)
$(MAKE) develop/.current
.PHONY: switch-develop
# HELP: switch current develop instance to DEVELOP=...
switch-develop:
if ! [ -e develop/$(DEVELOP) ]; then \
echo "Specified develop $(DEVELOP) does not exist! Not switching." ; \
exit 1 ; \
fi ; \
echo "$(DEVELOP)" > develop/.current
--ensure-develop:
if ! [[ -e develop ]]; then \
$(MAKE) new-develop; \
fi
$(MAKE) develop/.current
$(SET_DEVELOP)
.PHONY: develop/.current
develop/.current:
ls -1 develop | tail -n1 > develop/.current
.PHONY: --%
.SUFFIXES: # Delete all default suffixes
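# (Editor's illustration, not part of the Makefile.) The develop/ bookkeeping used by
# the rules above roughly looks like this; the timestamp is hypothetical:
#   develop/.current                                    name of the active instance, e.g. 2024-04-01T12-00-00
#   develop/2024-04-01T12-00-00/start-postgres          ports and CONTAINER_ID of a running service
#   develop/2024-04-01T12-00-00/.exited.start-backend   kept so that "make log-start-backend" still works after exit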

@@ -1,40 +0,0 @@
ARG FROM_IMG=docker.io/library/debian
ARG FROM_TAG=12.5
FROM ${FROM_IMG}:${FROM_TAG}
ENV LANG=de_DE.UTF-8
# basic dependencies
RUN apt-get -y update && apt-get -y install git
RUN apt-get -y update && apt-get -y install haskell-stack
RUN apt-get -y update && apt-get -y install llvm
RUN apt-get -y update && apt-get install -y --no-install-recommends locales locales-all
# compile-time dependencies
RUN apt-get -y update && apt-get install -y libpq-dev libsodium-dev
RUN apt-get -y update && apt-get -y install g++ libghc-zlib-dev libpq-dev libsodium-dev pkg-config
RUN apt-get -y update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends tzdata
# run-time dependencies for uniworx binary
RUN apt-get -y update && apt-get -y install fonts-roboto
# RUN apt-get -y update && apt-get -y install pdftk
# RUN apt-get -y update && apt-get -y install \
# texlive texlive-latex-recommended texlive-luatex texlive-plain-generic texlive-lang-german texlive-lang-english
RUN apt-get -y update && apt-get -y install texlive
# RUN ls /usr/local/texlive
# RUN chown -hR root /usr/local/texlive/2018
# RUN tlmgr init-usertree
# RUN tlmgr option repository ftp://tug.org/historic/systems/texlive/2018/tlnet-final
# RUN tlmgr update --self --all
ARG PROJECT_DIR=/fradrive
ENV PROJECT_DIR=${PROJECT_DIR}
# RUN mkdir -p "${PROJECT_DIR}"; chmod -R 777 "${PROJECT_DIR}"
WORKDIR ${PROJECT_DIR}
ENV HOME=${PROJECT_DIR}
ENV STACK_ROOT="${PROJECT_DIR}/.stack"
ENV STACK_SRC=""
ENV STACK_ENTRY="ghci ${STACK_SRC}"
ENTRYPOINT stack ${STACK_ENTRY}

@@ -1,51 +0,0 @@
export CPU_CORES = $(shell cat /proc/cpuinfo | grep '^processor' | wc -l)
export STACK_CORES = $(shell echo $(($(CPU_CORES)/2)))
ifeq ($(PROD),true)
export --DEVELOPMENT=--flag uniworx:-dev
else
export --DEVELOPMENT=--flag uniworx:dev
endif
.PHONY: dependencies
dependencies:
stack install hpack; stack install yesod-bin; \
stack build -j2 --only-dependencies
.PHONY: compile
compile: dependencies
stack build $(STACK_CORES) --fast --profile --library-profiling --executable-profiling --flag uniworx:-library-only $(--DEVELOPMENT) --local-bin-path $$(pwd)/bin --copy-bins
.PHONY: lint
lint:
stack build $(STACK_CORES) --test --fast --flag uniworx:library-only $(--DEVELOPMENT) uniworx:test:hlint
.PHONY: test
test:
stack build $(STACK_CORES) --test --coverage --fast --flag uniworx:library-only $(--DEVELOPMENT)
# For Reverse Proxy Problem see: https://groups.google.com/g/yesodweb/c/2EO53kSOuy0/m/Lw6tq2VYat4J
.PHONY: start
start: dependencies
export YESOD_IP_FROM_HEADER=true; \
export DEV_PORT_HTTP=3000; \
export DEV_PORT_HTTPS=3443; \
export HOST=127.0.0.1 ; \
export PORT=$${PORT:-$${DEV_PORT_HTTP}} ; \
export DETAILED_LOGGING=$${DETAILED_LOGGING:-true} ; \
export LOG_ALL=$${LOG_ALL:-false} ; \
export LOGLEVEL=$${LOGLEVEL:-info} ; \
export DUMMY_LOGIN=$${DUMMY_LOGIN:-true} ; \
export SERVER_SESSION_ACID_FALLBACK=$${SERVER_SESSION_ACID_FALLBACK:-true} ; \
export SERVER_SESSION_COOKIES_SECURE=$${SERVER_SESSION_COOKIES_SECURE:-false} ; \
export COOKIES_SECURE=$${COOKIES_SECURE:-false} ; \
export ALLOW_DEPRECATED=$${ALLOW_DEPRECATED:-true} ; \
export ENCRYPT_ERRORS=$${ENCRYPT_ERRORS:-false} ; \
export RIBBON=$${RIBBON:-$${HOST:-localhost}} ; \
export APPROOT=$${APPROOT:-http://localhost:$${DEV_PORT_HTTP}} ; \
export AVSPASS=$${AVSPASS:-nopasswordset} ; \
stack $(STACK_CORES) exec --local-bin-path $$(pwd)/bin --copy-bins -- yesod devel -p "$${DEV_PORT_HTTP}" -q "$${DEV_PORT_HTTPS}"
.PHONY: clean
clean:
rm -rf .stack-work .stack uniworx.cabal .ghc

@@ -1,23 +0,0 @@
-- SPDX-FileCopyrightText: 2022 Gregor Kleen <gregor.kleen@ifi.lmu.de>
--
-- SPDX-License-Identifier: AGPL-3.0-or-later
module Foundation.Types
( UpsertCampusUserMode(..)
, _UpsertCampusUserLoginLdap, _UpsertCampusUserLoginDummy, _UpsertCampusUserLoginOther, _UpsertCampusUserLdapSync, _UpsertCampusUserGuessUser
, _upsertCampusUserIdent
) where
import Import.NoFoundation
data UpsertCampusUserMode
= UpsertCampusUserLoginLdap
| UpsertCampusUserLoginDummy { upsertCampusUserIdent :: UserIdent }
| UpsertCampusUserLoginOther { upsertCampusUserIdent :: UserIdent } -- does not permit a later login
| UpsertCampusUserLdapSync { upsertCampusUserIdent :: UserIdent }
| UpsertCampusUserGuessUser
deriving (Eq, Ord, Read, Show, Generic)
makeLenses_ ''UpsertCampusUserMode
makePrisms ''UpsertCampusUserMode

@@ -1,411 +0,0 @@
-- SPDX-FileCopyrightText: 2022 Gregor Kleen <gregor.kleen@ifi.lmu.de>,Steffen Jost <jost@cip.ifi.lmu.de>,Steffen Jost <jost@tcs.ifi.lmu.de>
--
-- SPDX-License-Identifier: AGPL-3.0-or-later
module Foundation.Yesod.Auth
( authenticate
, ldapLookupAndUpsert
, upsertCampusUser
, decodeUserTest
, CampusUserConversionException(..)
, campusUserFailoverMode, updateUserLanguage
) where
import Import.NoFoundation hiding (authenticate)
import Foundation.Type
import Foundation.Types
import Foundation.I18n
import Handler.Utils.Profile
import Handler.Utils.LdapSystemFunctions
import Handler.Utils.Memcached
import Foundation.Authorization (AuthorizationCacheKey(..))
import Yesod.Auth.Message
import Auth.LDAP
import Auth.PWHash (apHash)
import Auth.Dummy (apDummy)
import qualified Data.CaseInsensitive as CI
import qualified Control.Monad.Catch as C (Handler(..))
import qualified Ldap.Client as Ldap
import qualified Data.Text as Text
import qualified Data.Text.Encoding as Text
import qualified Data.ByteString as ByteString
import qualified Data.Set as Set
import qualified Data.Map as Map
-- import qualified Data.Conduit.Combinators as C
-- import qualified Data.List as List ((\\))
-- import qualified Data.UUID as UUID
-- import Data.ByteArray (convert)
-- import Crypto.Hash (SHAKE128)
-- import qualified Data.Binary as Binary
-- import qualified Database.Esqueleto.Legacy as E
-- import qualified Database.Esqueleto.Utils as E
-- import Crypto.Hash.Conduit (sinkHash)
authenticate :: ( MonadHandler m, HandlerSite m ~ UniWorX
, YesodPersist UniWorX, BackendCompatible SqlBackend (YesodPersistBackend UniWorX)
, YesodAuth UniWorX, UserId ~ AuthId UniWorX
)
=> Creds UniWorX -> m (AuthenticationResult UniWorX)
authenticate creds@Creds{..} = liftHandler . runDB . withReaderT projectBackend $ do
now <- liftIO getCurrentTime
let
uAuth = UniqueAuthentication $ CI.mk credsIdent
upsertMode = creds ^? _upsertCampusUserMode
isDummy = is (_Just . _UpsertCampusUserLoginDummy) upsertMode
isOther = is (_Just . _UpsertCampusUserLoginOther) upsertMode
excRecovery res
| isDummy || isOther
= do
case res of
UserError err -> addMessageI Error err
ServerError err -> addMessage Error $ toHtml err
_other -> return ()
acceptExisting
| otherwise
= return res
excHandlers =
[ C.Handler $ \case
CampusUserNoResult -> do
$logWarnS "LDAP" $ "User lookup failed after successful login for " <> credsIdent
excRecovery . UserError $ IdentifierNotFound credsIdent
CampusUserAmbiguous -> do
$logWarnS "LDAP" $ "Multiple LDAP results for " <> credsIdent
excRecovery . UserError $ IdentifierNotFound credsIdent
err -> do
$logErrorS "LDAP" $ tshow err
mr <- getMessageRender
excRecovery . ServerError $ mr MsgInternalLdapError
, C.Handler $ \(cExc :: CampusUserConversionException) -> do
$logErrorS "LDAP" $ tshow cExc
mr <- getMessageRender
excRecovery . ServerError $ mr cExc
]
acceptExisting :: SqlPersistT (HandlerFor UniWorX) (AuthenticationResult UniWorX)
acceptExisting = do
res <- maybe (UserError $ IdentifierNotFound credsIdent) (Authenticated . entityKey) <$> getBy uAuth
case res of
Authenticated uid
-> associateUserSchoolsByTerms uid
_other
-> return ()
case res of
Authenticated uid
| not isDummy -> res <$ update uid [ UserLastAuthentication =. Just now ]
_other -> return res
$logDebugS "auth" $ tshow Creds{..}
ldapPool' <- getsYesod $ view _appLdapPool
flip catches excHandlers $ case ldapPool' of
Just ldapPool
| Just upsertMode' <- upsertMode -> do
ldapData <- campusUser ldapPool campusUserFailoverMode Creds{..}
$logDebugS "LDAP" $ "Successful LDAP lookup: " <> tshow ldapData
Authenticated . entityKey <$> upsertCampusUser upsertMode' ldapData
_other
-> acceptExisting
data CampusUserConversionException
= CampusUserInvalidIdent
| CampusUserInvalidEmail
| CampusUserInvalidDisplayName
| CampusUserInvalidGivenName
| CampusUserInvalidSurname
| CampusUserInvalidTitle
-- | CampusUserInvalidMatriculation
| CampusUserInvalidFeaturesOfStudy Text
| CampusUserInvalidAssociatedSchools Text
deriving (Eq, Ord, Read, Show, Generic)
deriving anyclass (Exception)
_upsertCampusUserMode :: Traversal' (Creds UniWorX) UpsertCampusUserMode
_upsertCampusUserMode mMode cs@Creds{..}
| credsPlugin == apDummy = setMode <$> mMode (UpsertCampusUserLoginDummy $ CI.mk credsIdent)
| credsPlugin == apLdap = setMode <$> mMode UpsertCampusUserLoginLdap
| otherwise = setMode <$> mMode (UpsertCampusUserLoginOther $ CI.mk credsIdent)
where
setMode UpsertCampusUserLoginLdap
= cs{ credsPlugin = apLdap }
setMode (UpsertCampusUserLoginDummy ident)
= cs{ credsPlugin = apDummy
, credsIdent = CI.original ident
}
setMode (UpsertCampusUserLoginOther ident)
= cs{ credsPlugin = bool defaultOther credsPlugin (credsPlugin /= apDummy && credsPlugin /= apLdap)
, credsIdent = CI.original ident
}
setMode _ = cs
defaultOther = apHash
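-- Worked example (hypothetical credentials): with credsPlugin == apDummy and
-- credsIdent == "jdoe",
--
--   creds ^? _upsertCampusUserMode  -- Just (UpsertCampusUserLoginDummy "jdoe")
--
-- and writing UpsertCampusUserLoginLdap back through the same traversal
-- rewrites credsPlugin to apLdap while leaving credsIdent unchanged.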
ldapLookupAndUpsert :: forall m. (MonadHandler m, HandlerSite m ~ UniWorX, MonadMask m, MonadUnliftIO m) => Text -> SqlPersistT m (Entity User)
ldapLookupAndUpsert ident =
getsYesod (view _appLdapPool) >>= \case
Nothing -> throwM $ CampusUserLdapError $ LdapHostNotResolved "No LDAP configuration in Foundation."
Just ldapPool ->
campusUser'' ldapPool campusUserFailoverMode ident >>= \case
Nothing -> throwM CampusUserNoResult
Just ldapResponse -> upsertCampusUser UpsertCampusUserGuessUser ldapResponse
{- THIS FUNCTION JUST DECODES, BUT IT DOES NOT QUERY LDAP!
upsertCampusUserByCn :: forall m.
( MonadHandler m, HandlerSite m ~ UniWorX
, MonadThrow m
)
=> Text -> SqlPersistT m (Entity User)
upsertCampusUserByCn persNo = upsertCampusUser UpsertCampusUserGuessUser [(ldapPrimaryKey,[Text.encodeUtf8 persNo])]
-}
-- | Upsert User DB according to given LDAP data (does not query LDAP itself)
upsertCampusUser :: forall m.
( MonadHandler m, HandlerSite m ~ UniWorX
, MonadCatch m
)
=> UpsertCampusUserMode -> Ldap.AttrList [] -> SqlPersistT m (Entity User)
upsertCampusUser upsertMode ldapData = do
now <- liftIO getCurrentTime
userDefaultConf <- getsYesod $ view _appUserDefaults
(newUser,userUpdate) <- decodeUser now userDefaultConf upsertMode ldapData
oldUsers <- for (userLdapPrimaryKey newUser) $ \pKey -> selectKeysList [ UserLdapPrimaryKey ==. Just pKey ] []
user@(Entity userId userRec) <- case oldUsers of
Just [oldUserId] -> updateGetEntity oldUserId userUpdate
_other -> upsertBy (UniqueAuthentication (newUser ^. _userIdent)) newUser userUpdate
unless (validDisplayName (newUser ^. _userTitle)
(newUser ^. _userFirstName)
(newUser ^. _userSurname)
(userRec ^. _userDisplayName)) $
update userId [ UserDisplayName =. (newUser ^. _userDisplayName) ] -- update invalid display names only
when (validEmail' (userRec ^. _userEmail)) $ do -- RECALL: userRec already contains basic updates
let emUps = [ UserDisplayEmail =. (newUser ^. _userEmail) | not (validEmail' (userRec ^. _userDisplayEmail)) ]
++ [ UserAuthentication =. AuthLDAP | is _AuthNoLogin (userRec ^. _userAuthentication) ]
update userId emUps -- update already checks whether list is empty
-- Attempt to update ident, too:
unless (validEmail' (userRec ^. _userIdent)) $
void $ maybeCatchAll (update userId [ UserIdent =. (newUser ^. _userEmail) ] >> return (Just ()))
let
userSystemFunctions = determineSystemFunctions . Set.fromList $ map CI.mk userSystemFunctions'
userSystemFunctions' = do
(k, v) <- ldapData
guard $ k == ldapAffiliation
v' <- v
Right str <- return $ Text.decodeUtf8' v'
assertM' (not . Text.null) $ Text.strip str
iforM_ userSystemFunctions $ \func preset -> do
memcachedByInvalidate (AuthCacheSystemFunctionList func) $ Proxy @(Set UserId)
if | preset -> void $ upsert (UserSystemFunction userId func False False) []
| otherwise -> deleteWhere [UserSystemFunctionUser ==. userId, UserSystemFunctionFunction ==. func, UserSystemFunctionIsOptOut ==. False, UserSystemFunctionManual ==. False]
return user
decodeUserTest :: (MonadHandler m, HandlerSite m ~ UniWorX, MonadCatch m)
=> Maybe UserIdent -> Ldap.AttrList [] -> m (Either CampusUserConversionException (User, [Update User]))
decodeUserTest mbIdent ldapData = do
now <- liftIO getCurrentTime
userDefaultConf <- getsYesod $ view _appUserDefaults
let mode = maybe UpsertCampusUserLoginLdap UpsertCampusUserLoginDummy mbIdent
try $ decodeUser now userDefaultConf mode ldapData
decodeUser :: (MonadThrow m) => UTCTime -> UserDefaultConf -> UpsertCampusUserMode -> Ldap.AttrList [] -> m (User,_)
decodeUser now UserDefaultConf{..} upsertMode ldapData = do
let
userTelephone = decodeLdap ldapUserTelephone <&> canonicalPhone
userMobile = decodeLdap ldapUserMobile <&> canonicalPhone
userCompanyPersonalNumber = decodeLdap ldapUserFraportPersonalnummer
userCompanyDepartment = decodeLdap ldapUserFraportAbteilung
userAuthentication
| is _UpsertCampusUserLoginOther upsertMode
= AuthNoLogin -- AuthPWHash (error "Non-LDAP logins should only work for users that are already known")
| otherwise = AuthLDAP
userLastAuthentication = guardOn isLogin now
isLogin = has (_UpsertCampusUserLoginLdap <> _UpsertCampusUserLoginOther . united) upsertMode
userTitle = decodeLdap ldapUserTitle -- CampusUserInvalidTitle
userFirstName = decodeLdap' ldapUserFirstName -- CampusUserInvalidGivenName
userSurname = decodeLdap' ldapUserSurname -- CampusUserInvalidSurname
userDisplayName <- decodeLdap1 ldapUserDisplayName CampusUserInvalidDisplayName <&> fixDisplayName -- do not check LDAP-given userDisplayName
--userDisplayName <- decodeLdap1 ldapUserDisplayName CampusUserInvalidDisplayName >>=
-- (maybeThrow CampusUserInvalidDisplayName . checkDisplayName userTitle userFirstName userSurname)
userIdent <- if
| [bs] <- ldapMap !!! ldapUserPrincipalName
, Right userIdent' <- CI.mk <$> Text.decodeUtf8' bs
, hasn't _upsertCampusUserIdent upsertMode || has (_upsertCampusUserIdent . only userIdent') upsertMode
-> return userIdent'
| Just userIdent' <- upsertMode ^? _upsertCampusUserIdent
-> return userIdent'
| otherwise
-> throwM CampusUserInvalidIdent
userEmail <- if -- TODO: refactor! NOTE: LDAP doesn't know an email for all users; we use userPrincipalName instead; however, validEmail rejects `E<number>@fraport.de` here, which is too strong! Make the email field optional!
| userEmail : _ <- mapMaybe (assertM (elem '@') . either (const Nothing) Just . Text.decodeUtf8') (lookupSome ldapMap $ toList ldapUserEmail)
-> return $ CI.mk userEmail
-- | userEmail : _ <- mapMaybe (assertM validEmail . either (const Nothing) Just . Text.decodeUtf8') (lookupSome ldapMap $ toList ldapUserEmail) -- TOO STRONG, see above!
-- -> return $ CI.mk userEmail
| otherwise
-> throwM CampusUserInvalidEmail
userLdapPrimaryKey <- if
| [bs] <- ldapMap !!! ldapPrimaryKey
, Right userLdapPrimaryKey'' <- Text.decodeUtf8' bs
, Just userLdapPrimaryKey''' <- assertM' (not . Text.null) $ Text.strip userLdapPrimaryKey''
-> return $ Just userLdapPrimaryKey'''
| otherwise
-> return Nothing
let
newUser = User
{ userMaxFavourites = userDefaultMaxFavourites
, userMaxFavouriteTerms = userDefaultMaxFavouriteTerms
, userTheme = userDefaultTheme
, userDateTimeFormat = userDefaultDateTimeFormat
, userDateFormat = userDefaultDateFormat
, userTimeFormat = userDefaultTimeFormat
, userDownloadFiles = userDefaultDownloadFiles
, userWarningDays = userDefaultWarningDays
, userShowSex = userDefaultShowSex
, userSex = Nothing
, userBirthday = Nothing
, userExamOfficeGetSynced = userDefaultExamOfficeGetSynced
, userExamOfficeGetLabels = userDefaultExamOfficeGetLabels
, userNotificationSettings = def
, userLanguages = Nothing
, userCsvOptions = def
, userTokensIssuedAfter = Nothing
, userCreated = now
, userLastLdapSynchronisation = Just now
, userDisplayName = userDisplayName
, userDisplayEmail = userEmail
, userMatrikelnummer = Nothing -- not known from LDAP, must be derived from REST interface to AVS TODO
, userPostAddress = Nothing -- not known from LDAP, must be derived from REST interface to AVS TODO
, userPostLastUpdate = Nothing
, userPinPassword = Nothing -- must be derived via AVS
, userPrefersPostal = userDefaultPrefersPostal
, ..
}
userUpdate =
[ UserLastAuthentication =. Just now | isLogin ] ++
[ UserEmail =. userEmail | validEmail' userEmail ] ++
[
-- UserDisplayName =. userDisplayName -- not updated here, since users are allowed to change their DisplayName; see line 191
UserFirstName =. userFirstName
, UserSurname =. userSurname
, UserLastLdapSynchronisation =. Just now
, UserLdapPrimaryKey =. userLdapPrimaryKey
, UserMobile =. userMobile
, UserTelephone =. userTelephone
, UserCompanyPersonalNumber =. userCompanyPersonalNumber
, UserCompanyDepartment =. userCompanyDepartment
]
return (newUser, userUpdate)
where
ldapMap :: Map.Map Ldap.Attr [Ldap.AttrValue] -- Recall: Ldap.AttrValue == ByteString
ldapMap = Map.fromListWith (++) $ ldapData <&> second (filter (not . ByteString.null))
-- pure helper; just returns Nothing on decoding errors
decodeLdap :: Ldap.Attr -> Maybe Text
decodeLdap attr = listToMaybe . rights $ Text.decodeUtf8' <$> ldapMap !!! attr
decodeLdap' :: Ldap.Attr -> Text
decodeLdap' = fromMaybe "" . decodeLdap
-- accept the first successful decoding or empty; only throw an error if all decodings fail
-- decodeLdap' :: (Exception e) => Ldap.Attr -> e -> m (Maybe Text)
-- decodeLdap' attr err
-- | [] <- vs = return Nothing
-- | (h:_) <- rights vs = return $ Just h
-- | otherwise = throwM err
-- where
-- vs = Text.decodeUtf8' <$> (ldapMap !!! attr)
-- only accepts the first successful decoding, ignoring all others, but failing if there is none
-- decodeLdap1 :: (MonadThrow m, Exception e) => Ldap.Attr -> e -> m Text
decodeLdap1 attr err
| (h:_) <- rights vs = return h
| otherwise = throwM err
where
vs = Text.decodeUtf8' <$> (ldapMap !!! attr)
-- accept and merge one or more successful decodings, ignoring all others
-- decodeLdapN attr err
-- | t@(_:_) <- rights vs
-- = return $ Text.unwords t
-- | otherwise = throwM err
-- where
-- vs = Text.decodeUtf8' <$> (ldapMap !!! attr)
associateUserSchoolsByTerms :: MonadIO m => UserId -> SqlPersistT m ()
associateUserSchoolsByTerms uid = do
sfs <- selectList [StudyFeaturesUser ==. uid] []
forM_ sfs $ \(Entity _ StudyFeatures{..}) -> do
schoolTerms <- selectList [SchoolTermsTerms ==. studyFeaturesField] []
forM_ schoolTerms $ \(Entity _ SchoolTerms{..}) ->
void $ insertUnique UserSchool
{ userSchoolUser = uid
, userSchoolSchool = schoolTermsSchool
, userSchoolIsOptOut = False
}
updateUserLanguage :: ( MonadHandler m, HandlerSite m ~ UniWorX
, YesodAuth UniWorX
, UserId ~ AuthId UniWorX
)
=> Maybe Lang -> SqlPersistT m (Maybe Lang)
updateUserLanguage (Just lang) = do
unless (lang `elem` appLanguages) $
invalidArgs ["Unsupported language"]
muid <- maybeAuthId
for_ muid $ \uid -> do
langs <- languages
update uid [ UserLanguages =. Just (Languages $ lang : nubOrd (filter ((&&) <$> (`elem` appLanguages) <*> (/= lang)) langs)) ]
setRegisteredCookie CookieLang lang
return $ Just lang
updateUserLanguage Nothing = runMaybeT $ do
uid <- MaybeT maybeAuthId
User{..} <- MaybeT $ get uid
setLangs <- toList . selectLanguages appLanguages <$> languages
highPrioSetLangs <- toList . selectLanguages appLanguages <$> highPrioRequestedLangs
let userLanguages' = toList . selectLanguages appLanguages <$> userLanguages ^? _Just . _Wrapped
lang <- case (userLanguages', setLangs, highPrioSetLangs) of
(_, _, hpl : _)
-> lift $ hpl <$ update uid [ UserLanguages =. Just (Languages highPrioSetLangs) ]
(Just (l : _), _, _)
-> return l
(Nothing, l : _, _)
-> lift $ l <$ update uid [ UserLanguages =. Just (Languages setLangs) ]
(Just [], l : _, _)
-> return l
(_, [], _)
-> mzero
setRegisteredCookie CookieLang lang
return lang
campusUserFailoverMode :: FailoverMode
campusUserFailoverMode = FailoverUnlimited
embedRenderMessage ''UniWorX ''CampusUserConversionException id

View File

@ -1,53 +0,0 @@
-- SPDX-FileCopyrightText: 2022-2025 Sarah Vaupel <sarah.vaupel@uniworx.systems>, Gregor Kleen <gregor.kleen@ifi.lmu.de>, Sarah Vaupel <sarah.vaupel@ifi.lmu.de>
--
-- SPDX-License-Identifier: AGPL-3.0-or-later
module Foundation.Yesod.StaticContent
( addStaticContent
) where
import Import.NoFoundation hiding (addStaticContent)
import Foundation.Type
import qualified Database.Memcached.Binary.IO as Memcached
import qualified Data.ByteString.Lazy as Lazy
import qualified Data.ByteString.Base64.URL as Base64 (encodeUnpadded)
import Data.ByteArray (convert)
import Crypto.Hash (SHAKE256)
import Crypto.Hash.Conduit (sinkHash)
import Data.Bits (Bits(zeroBits))
import qualified Data.Conduit.Combinators as C
addStaticContent :: Text
-> Text
-> Lazy.ByteString
-> HandlerFor UniWorX (Maybe (Either Text (Route UniWorX, [(Text, Text)])))
addStaticContent ext _mime content = do
UniWorX{appWidgetMemcached, appSettings'} <- getYesod
for ((,) <$> appWidgetMemcached <*> appWidgetMemcachedConf appSettings') $ \(mConn, WidgetMemcachedConf{ widgetMemcachedConf = MemcachedConf { memcachedExpiry }, widgetMemcachedBaseUrl }) -> do
let expiry = maybe 0 ceiling memcachedExpiry
touch = liftIO $ Memcached.touch expiry (encodeUtf8 $ pack fileName) mConn
addItem = liftIO $ Memcached.add zeroBits expiry (encodeUtf8 $ pack fileName) content mConn
absoluteLink = unpack widgetMemcachedBaseUrl </> fileName
catchIf Memcached.isKeyNotFound touch . const $
handleIf Memcached.isKeyExists (const $ return ()) addItem
return . Left $ pack absoluteLink
where
-- Generate a unique filename based on the content itself; this is used
-- for deduplication, so a collision-resistant hash function is required
--
-- SHA-3 (SHAKE256) seemed to be a future-proof choice
--
-- The hash length is 144 bits; originally chosen (rather than MD5's 128 bits)
-- to avoid padding after base64 conversion, now kept for backwards compatibility
fileName = (<.> unpack ext)
. unpack
. decodeUtf8
. Base64.encodeUnpadded
. (convert :: Digest (SHAKE256 144) -> ByteString)
. runConduitPure
$ C.sourceLazy content .| sinkHash
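-- Note on the scheme above: since the key depends only on the content,
-- re-rendering identical widget content yields the same memcached key, so the
-- touch-then-add sequence merely refreshes the expiry instead of storing a
-- duplicate. A 144-bit SHAKE256 digest encodes to exactly 24 unpadded
-- base64url characters, so keys have the (hypothetical) shape
-- "<24 url-safe characters>.css".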

View File

@ -1,69 +0,0 @@
-- SPDX-FileCopyrightText: 2022 Steffen Jost <jost@tcs.ifi.lmu.de>
--
-- SPDX-License-Identifier: AGPL-3.0-or-later
module Handler.Admin.Ldap
( getAdminLdapR
, postAdminLdapR
) where
import Import
-- import qualified Control.Monad.State.Class as State
-- import Data.Aeson (encode)
import qualified Data.CaseInsensitive as CI
import qualified Data.Text as Text
import qualified Data.Text.Encoding as Text
-- import qualified Data.Set as Set
import Foundation.Yesod.Auth (decodeUserTest,ldapLookupAndUpsert,campusUserFailoverMode,CampusUserConversionException())
import Handler.Utils
import qualified Ldap.Client as Ldap
import Auth.LDAP
getAdminLdapR, postAdminLdapR :: Handler Html
getAdminLdapR = postAdminLdapR
postAdminLdapR = do
((presult, pwidget), penctype) <- runFormPost $ identifyForm ("adminLdapLookup"::Text) $ \html ->
flip (renderAForm FormStandard) html $ areq textField (fslI MsgAdminUserIdent) Nothing
let procFormPerson :: Text -> Handler (Maybe (Ldap.AttrList []))
procFormPerson lid = do
ldapPool' <- getsYesod $ view _appLdapPool
case ldapPool' of
Nothing -> addMessage Error (text2Html "LDAP Configuration missing.") >> return Nothing
Just ldapPool -> do
addMessage Info $ text2Html "Input for LDAP test received."
ldapData <- campusUser'' ldapPool campusUserFailoverMode lid
decodedErr <- decodeUserTest (pure $ CI.mk lid) $ concat ldapData
whenIsLeft decodedErr $ addMessageI Error
return ldapData
mbLdapData <- formResultMaybe presult procFormPerson
((uresult, uwidget), uenctype) <- runFormPost $ identifyForm ("adminLdapUpsert"::Text) $ \html ->
flip (renderAForm FormStandard) html $ areq textField (fslI MsgAdminUserIdent) Nothing
let procFormUpsert :: Text -> Handler (Maybe (Either CampusUserConversionException (Entity User)))
procFormUpsert lid = pure <$> runDB (try $ ldapLookupAndUpsert lid)
mbLdapUpsert <- formResultMaybe uresult procFormUpsert
actionUrl <- fromMaybe AdminLdapR <$> getCurrentRoute
siteLayoutMsg MsgMenuLdap $ do
setTitleI MsgMenuLdap
let personForm = wrapForm pwidget def
{ formAction = Just $ SomeRoute actionUrl
, formEncoding = penctype
}
upsertForm = wrapForm uwidget def
{ formAction = Just $ SomeRoute actionUrl
, formEncoding = uenctype
}
presentUtf8 lv = Text.intercalate ", " (either tshow id . Text.decodeUtf8' <$> lv)
presentLatin1 lv = Text.intercalate ", " ( Text.decodeLatin1 <$> lv)
-- TODO: use i18nWidgetFile instead if this is to become permanent
$(widgetFile "ldap")

View File

@ -1,69 +0,0 @@
-- SPDX-FileCopyrightText: 2022 Gregor Kleen <gregor.kleen@ifi.lmu.de>
--
-- SPDX-License-Identifier: AGPL-3.0-or-later
module Jobs.Handler.SynchroniseLdap
( dispatchJobSynchroniseLdap
, dispatchJobSynchroniseLdapUser
, dispatchJobSynchroniseLdapAll
, SynchroniseLdapException(..)
) where
import Import
import qualified Data.CaseInsensitive as CI
import qualified Data.Conduit.List as C
import Auth.LDAP
import Foundation.Yesod.Auth (CampusUserConversionException, upsertCampusUser)
import Jobs.Queue
data SynchroniseLdapException
= SynchroniseLdapNoLdap
deriving (Eq, Ord, Enum, Bounded, Read, Show, Generic)
instance Exception SynchroniseLdapException
dispatchJobSynchroniseLdap :: Natural -> Natural -> Natural -> JobHandler UniWorX
dispatchJobSynchroniseLdap numIterations epoch iteration
= JobHandlerAtomic . runConduit $
readUsers .| filterIteration .| sinkDBJobs
where
readUsers :: ConduitT () UserId (YesodJobDB UniWorX) ()
readUsers = selectKeys [] []
filterIteration :: ConduitT UserId Job (YesodJobDB UniWorX) ()
filterIteration = C.mapMaybeM $ \userId -> runMaybeT $ do
let
userIteration, currentIteration :: Integer
userIteration = toInteger (hash epoch `hashWithSalt` userId) `mod` toInteger numIterations
currentIteration = toInteger iteration `mod` toInteger numIterations
$logDebugS "SynchroniseLdap" [st|User ##{tshow (fromSqlKey userId)}: LDAP sync on #{tshow userIteration}/#{tshow numIterations}, now #{tshow currentIteration}|]
guard $ userIteration == currentIteration
return $ JobSynchroniseLdapUser userId
dispatchJobSynchroniseLdapUser :: UserId -> JobHandler UniWorX
dispatchJobSynchroniseLdapUser jUser = JobHandlerException $ do
UniWorX{..} <- getYesod
case appLdapPool of
Just ldapPool ->
runDB . void . runMaybeT . handleExc $ do
user@User{userIdent,userLdapPrimaryKey} <- MaybeT $ get jUser
let upsertIdent = maybe userIdent CI.mk userLdapPrimaryKey
$logInfoS "SynchroniseLdap" [st|Synchronising #{upsertIdent}|]
reTestAfter <- getsYesod $ view _appLdapReTestFailover
ldapAttrs <- MaybeT $ campusUserReTest' ldapPool ((>= reTestAfter) . realToFrac) FailoverUnlimited user
void . lift $ upsertCampusUser (UpsertCampusUserLdapSync upsertIdent) ldapAttrs
Nothing ->
throwM SynchroniseLdapNoLdap
where
handleExc :: MaybeT DB a -> MaybeT DB a
handleExc
= catchMPlus (Proxy @CampusUserException)
. catchMPlus (Proxy @CampusUserConversionException)
dispatchJobSynchroniseLdapAll :: JobHandler UniWorX
dispatchJobSynchroniseLdapAll = JobHandlerAtomic . runConduit $ selectSource [] [] .| C.mapM_ (queueDBJob . JobSynchroniseLdapUser . entityKey)
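-- A minimal sketch of the bucketing used by filterIteration above (the helper
-- name is hypothetical; hash and hashWithSalt are the same Data.Hashable
-- functions used there): a user is synchronised by the iteration whose index
-- modulo numIterations equals this value.
ldapSyncBucket :: Natural -> Natural -> UserId -> Integer
ldapSyncBucket numIterations epoch userId =
  toInteger (hash epoch `hashWithSalt` userId) `mod` toInteger numIterations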

View File

@ -1,15 +0,0 @@
-- SPDX-FileCopyrightText: 2022 Gregor Kleen <gregor.kleen@ifi.lmu.de>
--
-- SPDX-License-Identifier: AGPL-3.0-or-later
{-# OPTIONS_GHC -fno-warn-orphans #-}
module Ldap.Client.Instances
(
) where
import ClassyPrelude
import Ldap.Client
deriving instance Ord ResultCode

View File

@ -1,113 +0,0 @@
-- SPDX-FileCopyrightText: 2022 Steffen Jost <jost@tcs.ifi.lmu.de>
--
-- SPDX-License-Identifier: AGPL-3.0-or-later
{-# OPTIONS_GHC -fno-warn-unused-top-binds #-}
module Utils.Users
( AuthenticationKind(..)
, AddUserData(..)
, addNewUser, addNewUserDB
) where
import Import
data AuthenticationKind = AuthKindLDAP | AuthKindPWHash | AuthKindNoLogin
deriving (Eq, Ord, Read, Show, Enum, Bounded, Generic, Universe, Finite)
--instance Universe AuthenticationKind
--instance Finite AuthenticationKind
embedRenderMessage ''UniWorX ''AuthenticationKind id
nullaryPathPiece ''AuthenticationKind $ camelToPathPiece' 2
mkAuthMode :: AuthenticationKind -> AuthenticationMode
mkAuthMode AuthKindLDAP = AuthLDAP
mkAuthMode AuthKindPWHash = AuthPWHash ""
mkAuthMode AuthKindNoLogin = AuthNoLogin
{-
classifyAuth :: AuthenticationMode -> AuthenticationKind
classifyAuth AuthLDAP = AuthKindLDAP
classifyAuth AuthPWHash{} = AuthKindPWHash
classifyAuth AuthNoLogin = AuthKindNoLogin
-}
data AddUserData = AddUserData
{ audTitle :: Maybe Text
, audFirstName :: Text
, audSurname :: UserSurname
, audDisplayName :: UserDisplayName
, audDisplayEmail :: UserEmail
, audMatriculation :: Maybe UserMatriculation
, audSex :: Maybe Sex
, audBirthday :: Maybe Day
, audMobile :: Maybe Text
, audTelephone :: Maybe Text
, audFPersonalNumber :: Maybe Text
, audFDepartment :: Maybe Text
, audPostAddress :: Maybe StoredMarkup
, audPrefersPostal :: Bool
, audPinPassword :: Maybe Text
, audEmail :: UserEmail
, audIdent :: UserIdent
, audAuth :: AuthenticationKind
}
-- | Adds a new user to the database; no background jobs are scheduled and no notifications are sent.
-- Note: `Foundation.Yesod.Auth` contains similar code with potentially differing defaults!
addNewUser :: AddUserData -> Handler (Maybe UserId)
addNewUser aud = do
udc <- getsYesod $ view _appUserDefaults
usr <- makeUser udc aud
runDB $ insertUnique usr
-- | Variant of `addNewUser` which allows for rollback via subsequent throws
addNewUserDB :: AddUserData -> DB (Maybe UserId)
addNewUserDB aud = do
udc <- liftHandler $ getsYesod $ view _appUserDefaults
usr <- makeUser udc aud
insertUnique usr
makeUser :: MonadIO m => UserDefaultConf -> AddUserData -> m User
makeUser UserDefaultConf{..} AddUserData{..} = do
now <- liftIO getCurrentTime
return User
{ userIdent = audIdent
, userMaxFavourites = userDefaultMaxFavourites
, userMaxFavouriteTerms = userDefaultMaxFavouriteTerms
, userTheme = userDefaultTheme
, userDateTimeFormat = userDefaultDateTimeFormat
, userDateFormat = userDefaultDateFormat
, userTimeFormat = userDefaultTimeFormat
, userDownloadFiles = userDefaultDownloadFiles
, userWarningDays = userDefaultWarningDays
, userShowSex = userDefaultShowSex
, userExamOfficeGetSynced = userDefaultExamOfficeGetSynced
, userExamOfficeGetLabels = userDefaultExamOfficeGetLabels
, userNotificationSettings = def
, userLanguages = Nothing
, userCsvOptions = def { csvFormat = review csvPreset CsvPresetXlsx }
, userTokensIssuedAfter = Nothing
, userCreated = now
, userLastLdapSynchronisation = Nothing
, userLdapPrimaryKey = audFPersonalNumber
, userLastAuthentication = Nothing
, userEmail = audEmail
, userDisplayName = audDisplayName
, userDisplayEmail = audDisplayEmail
, userFirstName = audFirstName
, userSurname = audSurname
, userTitle = audTitle
, userSex = audSex
, userBirthday = audBirthday
, userMobile = audMobile
, userTelephone = audTelephone
, userCompanyPersonalNumber = audFPersonalNumber
, userCompanyDepartment = audFDepartment
, userPostAddress = audPostAddress
, userPostLastUpdate = Nothing
, userPrefersPostal = audPrefersPostal
, userPinPassword = audPinPassword
, userMatrikelnummer = audMatriculation
, userAuthentication = mkAuthMode audAuth
}
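-- A minimal usage sketch (all values hypothetical; the helper name is not part
-- of this module), assuming OverloadedStrings and IsString instances for the
-- ident, e-mail and name types:
exampleAddUser :: DB (Maybe UserId)
exampleAddUser = addNewUserDB AddUserData
  { audTitle = Nothing
  , audFirstName = "Erika"
  , audSurname = "Mustermann"
  , audDisplayName = "Erika Mustermann"
  , audDisplayEmail = "erika.mustermann@example.com"
  , audMatriculation = Nothing
  , audSex = Nothing
  , audBirthday = Nothing
  , audMobile = Nothing
  , audTelephone = Nothing
  , audFPersonalNumber = Nothing
  , audFDepartment = Nothing
  , audPostAddress = Nothing
  , audPrefersPostal = False
  , audPinPassword = Nothing
  , audEmail = "erika.mustermann@example.com"
  , audIdent = "erika.mustermann"
  , audAuth = AuthKindNoLogin
  }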

View File

@ -1,33 +0,0 @@
$newline never
$# SPDX-FileCopyrightText: 2022 Steffen Jost <jost@tcs.ifi.lmu.de>
$#
$# SPDX-License-Identifier: AGPL-3.0-or-later
<section>
<p>
LDAP Person Search:
^{personForm}
$maybe answers <- mbLdapData
<h1>
Response: #
<dl .deflist>
$forall (lk, lv) <- answers
$with numv <- length lv
<dt>
#{show lk}
$if 1 < numv
\ (#{show numv})
<dd>
UTF8: #{presentUtf8 lv}
&#8212;
Latin: #{presentLatin1 lv}
<section>
<p>
LDAP Upsert user in DB:
^{upsertForm}
$maybe answer <- mbLdapUpsert
<h1>
Response: #
<p>
#{tshow answer}

View File

@ -1,20 +0,0 @@
$newline never
$# SPDX-FileCopyrightText: 2022 Gregor Kleen <gregor.kleen@ifi.lmu.de>
$#
$# SPDX-License-Identifier: AGPL-3.0-or-later
$forall AuthPlugin{apName, apLogin} <- plugins
$if apName == "LDAP"
<section>
<h2>_{MsgLDAPLoginTitle}
^{apLogin toParent}
$elseif apName == "PWHash"
<section>
<h2>_{MsgPWHashLoginTitle}
<p>_{MsgPWHashLoginNote}
^{apLogin toParent}
$elseif apName == "dummy"
<section>
<h2>_{MsgDummyLoginTitle}
^{apLogin toParent}

7
cbt.sh Executable file
View File

@ -0,0 +1,7 @@
#!/usr/bin/env bash
# SPDX-FileCopyrightText: 2022 Sarah Vaupel <vaupel.sarah@campus.lmu.de>
#
# SPDX-License-Identifier: AGPL-3.0-or-later
cbt_tunnels --username "$CBT_USERNAME" --authkey "$CBT_AUTHKEY"

View File

@ -1,86 +1,35 @@
services:
help:
image: docker.io/library/perl:stable
pull_policy: if_not_present
volumes:
- ./utils/makehelp.pl:/mnt/utils/makehelp.pl:ro
- ./Makefile:/tmp/Makefile:ro
command: /mnt/utils/makehelp.pl /tmp/Makefile
frontend:
# image: registry.uniworx.de/fradrive/fradrive/frontend # TODO: reference to current branch required; how to do that here?
# pull_policy: if_not_present
build:
context: ./frontend
dockerfile: ./Dockerfile
dockerfile: ./docker/frontend/Dockerfile
context: .
environment:
- PROJECT_DIR=/fradrive
volumes:
- type: bind
source: ./frontend
target: /fradrive
- ./assets:/fradrive/assets:rw
- ./static:/fradrive/static:rw
- ./well-known:/fradrive/well-known:rw
- &fradrive-mnt .:/tmp/fradrive
backend:
# image: registry.uniworx.de/fradrive/fradrive/backend
# pull_policy: if_not_present
build:
context: ./backend
dockerfile: ./Dockerfile
environment:
PATH: /fradrive/bin:$PATH
dockerfile: ./docker/backend/Dockerfile
context: ./
volumes:
- ./backend:/fradrive
- ./bin:/fradrive/bin
- ./assets:/fradrive/assets:ro
- ./static:/fradrive/static:ro
- ./well-known:/fradrive/well-known:ro
- *fradrive-mnt
depends_on:
- frontend
- postgres
- memcached
- minio
- maildev
ports:
- "3000:3000" # dev http
- "3443:3443" # dev https
- "8081:8081" # hoogle
# links:
# - postgres
# - memcached
# - minio
# - maildev
stdin_open: true
network_mode: host
postgres:
image: docker.io/library/postgres:12
pull_policy: if_not_present
database:
# image: registry.uniworx.de/fradrive/fradrive/database
# pull_policy: if_not_present
build: ./docker/database
ports:
- "5432:5432"
environment:
- POSTGRES_HOST_AUTH_METHOD=trust
volumes:
- ./docker/postgres/pg_hba.conf:/tmp/pg_hba.conf:ro
- ./docker/postgres/postgresql.conf:/tmp/postgresql.conf:ro
- ./docker/postgres/pgconfig.sh:/docker-entrypoint-initdb.d/_pgconfig.sh:ro
- ./docker/postgres/schema.sql:/docker-entrypoint-initdb.d/schema.sql:ro
- "9876:5432"
# privileged: true
memcached:
image: docker.io/library/memcached:latest
pull_policy: if_not_present
ports:
- "11211:11211"
minio:
image: docker.io/minio/minio:latest
pull_policy: if_not_present
command: server `mktemp`
ports:
- "9000:9000"
maildev:
image: docker.io/maildev/maildev:latest
pull_policy: if_not_present
ports:
- "1025-1026:1025"
# driver: local
# driver_opts:
# type: none
# o: bind
# device: ./

View File

@ -24,9 +24,9 @@ mail-from:
email: "_env:MAILFROM_EMAIL:uniworx@localhost"
mail-object-domain: "_env:MAILOBJECT_DOMAIN:localhost"
mail-use-replyto-instead-sender: "_env:MAIL_USES_REPLYTO:true"
mail-reroute-to:
name: "_env:MAIL_REROUTE_TO_NAME:"
email: "_env:MAIL_REROUTE_TO_EMAIL:"
mail-reroute-to:
name: "_env:MAIL_REROUTE_TO_NAME:"
email: "_env:MAIL_REROUTE_TO_EMAIL:"
#mail-verp:
# separator: "_env:VERP_SEPARATOR:+"
# prefix: "_env:VERP_PREFIX:bounce"
@ -45,7 +45,7 @@ legal-external:
imprint: "https://www.fraport.com/de/tools/impressum.html"
data-protection: "https://www.fraport.com/de/konzern/datenschutz.html"
terms-of-use: "https://www.fraport.com/de/tools/disclaimer.html"
payments: "https://www.fraport.com/de/geschaeftsfelder/service/geschaeftspartner/richtlinien-und-zahlungsbedingungen.html"
payments: "https://www.fraport.com/de/geschaeftsfelder/service/geschaeftspartner/richtlinien-und-zahlungsbedingungen.html"
job-workers: "_env:JOB_WORKERS:10"
job-flush-interval: "_env:JOB_FLUSH:30"
@ -66,7 +66,7 @@ keep-unreferenced-files: 86400
health-check-interval:
matching-cluster-config: "_env:HEALTHCHECK_INTERVAL_MATCHING_CLUSTER_CONFIG:600"
http-reachable: "_env:HEALTHCHECK_INTERVAL_HTTP_REACHABLE:600"
ldap-admins: "_env:HEALTHCHECK_INTERVAL_LDAP_ADMINS:600"
ldap-admins: "_env:HEALTHCHECK_INTERVAL_LDAP_ADMINS:600" # TODO: either generalize over all external auth sources, or reimplement with different semantics
smtp-connect: "_env:HEALTHCHECK_INTERVAL_SMTP_CONNECT:600"
widget-memcached: "_env:HEALTHCHECK_INTERVAL_WIDGET_MEMCACHED:600"
active-job-executors: "_env:HEALTHCHECK_INTERVAL_ACTIVE_JOB_EXECUTORS:60"
@ -77,7 +77,7 @@ health-check-http: "_env:HEALTHCHECK_HTTP:true" # Can we assume, that we can rea
health-check-active-job-executors-timeout: "_env:HEALTHCHECK_ACTIVE_JOB_EXECUTORS_TIMEOUT:5"
health-check-active-widget-memcached-timeout: "_env:HEALTHCHECK_ACTIVE_WIDGET_MEMCACHED_TIMEOUT:2"
health-check-smtp-connect-timeout: "_env:HEALTHCHECK_SMTP_CONNECT_TIMEOUT:5"
health-check-ldap-admins-timeout: "_env:HEALTHCHECK_LDAP_ADMINS_TIMEOUT:60"
health-check-ldap-admins-timeout: "_env:HEALTHCHECK_LDAP_ADMINS_TIMEOUT:60" # TODO: either generalize over all external auth sources, or reimplement with different semantics
health-check-http-reachable-timeout: "_env:HEALTHCHECK_HTTP_REACHABLE_TIMEOUT:2"
health-check-matching-cluster-config-timeout: "_env:HEALTHCHECK_MATCHING_CLUSTER_CONFIG_TIMEOUT:2"
@ -126,24 +126,47 @@ database:
database: "_env:PGDATABASE:uniworx"
poolsize: "_env:PGPOOLSIZE:990"
auto-db-migrate: '_env:AUTO_DB_MIGRATE:true'
auto-db-migrate: "_env:AUTO_DB_MIGRATE:true"
ldap:
- host: "_env:LDAPHOST:"
tls: "_env:LDAPTLS:"
port: "_env:LDAPPORT:389"
user: "_env:LDAPUSER:"
pass: "_env:LDAPPASS:"
baseDN: "_env:LDAPBASE:"
scope: "_env:LDAPSCOPE:WholeSubtree"
timeout: "_env:LDAPTIMEOUT:5"
search-timeout: "_env:LDAPSEARCHTIME:5"
pool:
stripes: "_env:LDAPSTRIPES:1"
timeout: "_env:LDAPTIMEOUT:20"
limit: "_env:LDAPLIMIT:10"
# External sources used for user authentication and userdata lookups
user-auth:
# mode: single-source
protocol: "_env:USERAUTH_MODE:azureadv2"
config:
client-id: "_env:AZURECLIENTID:00000000-0000-0000-0000-000000000000"
client-secret: "_env:AZURECLIENTSECRET:''"
tenant-id: "_env:AZURETENANTID:00000000-0000-0000-0000-000000000000"
scopes: "_env:AZURESCOPES:[ID,Profile]"
# protocol: "ldap"
# config:
# host: "_env:LDAPHOST:"
# tls: "_env:LDAPTLS:"
# port: "_env:LDAPPORT:389"
# user: "_env:LDAPUSER:"
# pass: "_env:LDAPPASS:"
# baseDN: "_env:LDAPBASE:"
# scope: "_env:LDAPSCOPE:WholeSubtree"
# timeout: "_env:LDAPTIMEOUT:5"
# search-timeout: "_env:LDAPSEARCHTIME:5"
ldap-re-test-failover: 60
single-sign-on: "_env:OIDC_SSO:false"
# Automatically redirect to SSO route when not signed on
# Note: This forces authentication, so the site becomes inaccessible without external credentials. Only enable this option when it is ensured that every user who should be able to access the site has valid external credentials!
auto-sign-on: "_env:AUTO_SIGN_ON:false"
# TODO: generalize for arbitrary auth protocols
# TODO: maybe use separate pools for external databases?
ldap-pool:
stripes: "_env:LDAPSTRIPES:1"
timeout: "_env:LDAPTIMEOUT:20"
limit: "_env:LDAPLIMIT:10"
# TODO: reintroduce and move into failover settings once failover mode has been reimplemented
# user-retest-failover: 60
# TODO: maybe implement syncWithin and syncInterval per auth source
user-sync-within: "_env:USER_SYNC_WITHIN:1209600" # 14 days in seconds
user-sync-interval: "_env:USER_SYNC_INTERVAL:3600" # every hour
lms-direct:
upload-header: "_env:LMSUPLOADHEADER:true"
@ -166,7 +189,7 @@ avs:
lpr:
host: "_env:LPRHOST:fravm017173.fra.fraport.de"
port: "_env:LPRPORT:515"
queue: "_env:LPRQUEUE:fradrive"
queue: "_env:LPRQUEUE:fradrive"
smtp:
host: "_env:SMTPHOST:"
@ -189,7 +212,7 @@ widget-memcached:
timeout: "_env:WIDGET_MEMCACHED_TIMEOUT:20"
base-url: "_env:WIDGET_MEMCACHED_ROOT:"
expiration: "_env:WIDGET_MEMCACHED_EXPIRATION:3600"
session-memcached:
host: "_env:SESSION_MEMCACHED_HOST:localhost"
port: "_env:SESSION_MEMCACHED_PORT:11211"

61
docker/backend/Dockerfile Normal file
View File

@ -0,0 +1,61 @@
ARG FROM_IMG=docker.io/library/haskell
ARG FROM_TAG=8.10.4
FROM ${FROM_IMG}:${FROM_TAG}
ENV LANG=de_DE.UTF-8
# compile-time dependencies
RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
--mount=type=cache,target=/var/lib/apt,sharing=locked \
apt-get -y update && apt-get install -y libpq-dev libsodium-dev
# RUN apt-get -y update && apt-get -y install llvm
# RUN apt-get -y update && apt-get -y install g++ libghc-zlib-dev libpq-dev libsodium-dev pkg-config
# RUN apt-get -y update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends tzdata
RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
--mount=type=cache,target=/var/lib/apt,sharing=locked \
apt-get -y update && apt-get install -y --no-install-recommends locales locales-all
# run-time dependencies for uniworx binary
# RUN apt-get -y update && apt-get -y install pdftk
# RUN apt-get -y update && apt-get -y install \
# texlive texlive-latex-recommended texlive-luatex texlive-plain-generic texlive-lang-german texlive-lang-english
RUN apt-get -y update && apt-get -y install \
wget \
perl \
xz-utils \
fonts-roboto \
texlive \
texlive-luatex \
texlive-latex-extra \
texlive-fonts-recommended \
texlive-fonts-extra \
&& apt-get clean
# RUN ls /usr/local/texlive
# RUN chown -hR root /usr/local/texlive/2018
ENV PATH="/usr/local/texlive/2018/bin/x86_64-linux:${PATH}"
ENV TEXLIVE_VERSION=2018
RUN tlmgr init-usertree
RUN tlmgr option repository ftp://tug.org/historic/systems/texlive/2018/tlnet-final
RUN tlmgr update --self --all
RUN tlmgr install \
babel \
babel-english \
babel-german \
booktabs \
textpos \
enumitem \
# luatex lualatex luatexbase lualatex-math eurosym \
# above line requires tlmgr to run in -sys mode (~root?! apparently -privileged is missing)
koma-script \
unicode-math \
selnolig
ARG PROJECT_DIR=/fradrive
ENV PROJECT_DIR=${PROJECT_DIR}
# RUN mkdir -p "${PROJECT_DIR}"; chmod -R 777 "${PROJECT_DIR}"
WORKDIR ${PROJECT_DIR}
ENV HOME=${PROJECT_DIR}
ENV STACK_ROOT="${PROJECT_DIR}/.stack"

View File

@ -0,0 +1,9 @@
FROM docker.io/postgres:12
# Allow for connecting to database without password authentication
ENV POSTGRES_HOST_AUTH_METHOD=trust
COPY --chown=postgres:postgres docker/postgres/pg_hba.conf /tmp/pg_hba.conf
COPY --chown=postgres:postgres docker/postgres/postgresql.conf /tmp/postgresql.conf
COPY docker/postgres/pgconfig.sh /docker-entrypoint-initdb.d/_pgconfig.sh
COPY --chown=postgres:postgres docker/postgres/schema.sql /docker-entrypoint-initdb.d/schema.sql

14
docker/postgres/initdb.sh Normal file
View File

@ -0,0 +1,14 @@
#!/bin/bash
# Init and start the postgres daemon
initdb --no-locale
pg_ctl start -w -o "-c listen_addresses='*' -c unix_socket_permissions=0700 -c max_connections=9990 -c shared_preload_libraries=pg_stat_statements -c session_preload_libraries=auto_explain -c auto_explain.log_min_duration=100ms"
POSTGRID=`cat /var/lib/postgresql/data/postmaster.pid | perl -le '<>=~m#(\d+)# and print $1'`
# Create uniworx and uniworx_test database
psql -f /schema.sql postgres
# Wait for postgres daemon to terminate
while [ -e /proc/$POSTGRID ]; do
sleep 0.5;
done

View File

@ -0,0 +1,3 @@
local all all trust
host all all 0.0.0.0/0 trust
host all all ::1/128 trust

6
docker/postgres/pgconfig.sh Executable file
View File

@ -0,0 +1,6 @@
#!/usr/bin/env bash
cat /tmp/pg_hba.conf > /var/lib/postgresql/data/pg_hba.conf
cat /tmp/postgresql.conf > /var/lib/postgresql/data/postgresql.conf
echo "Custom pg_hba.conf and postgresql.conf successfully deployed."

View File

@ -0,0 +1,6 @@
listen_addresses='*'
unix_socket_permissions=0700
max_connections=9990
shared_preload_libraries=pg_stat_statements
session_preload_libraries=auto_explain
auto_explain.log_min_duration=100ms

View File

@ -0,0 +1,5 @@
CREATE USER uniworx WITH SUPERUSER PASSWORD 'uniworx';
CREATE DATABASE uniworx_test;
GRANT ALL ON DATABASE uniworx_test TO uniworx;
CREATE DATABASE uniworx;
GRANT ALL ON DATABASE uniworx TO uniworx;

View File

@ -20,8 +20,8 @@ await esbuild.build({
minify: true,
sourcemap: true,
entryPoints: {
main: './src/main.js',
polyfill: './src/polyfill.js',
main: './frontend/src/main.js',
polyfill: './frontend/src/polyfill.js',
},
outdir: staticDir,
plugins: [
@ -48,20 +48,19 @@ await esbuild.build({
copy({
resolveFrom: 'cwd',
assets: {
from: [ './robots.txt' ],
from: [ './config/robots.txt' ],
to: wellKnownDirs,
},
}),
// ...['de-de-formal','en-eu'].map((lang) => manifestPlugin({
manifestPlugin({
filename: 'manifest.json',
path: '.',
path: 'config',
// metadata: { timestamp: new Date(), module: 'myapp', type: 'esm', },
processOutput(assets) {
const orderAssets = {
main: assets['main'],
polyfill: assets['polyfill'],
icons: { "svg": assets['']['svg'][0] },
...assets
};
return JSON.stringify(orderAssets, null, ' ');
},

View File

@ -1,23 +0,0 @@
.PHONY: all
all: dependencies compile ;
.PHONY: dependencies
dependencies: node_modules assets ;
.PHONY: compile
compile: static well-known ;
node_modules: package.json package-lock.json
npm install --cache .npm --prefer-offline
package-lock.json: package.json
npm install --cache .npm --prefer-offline
static: node_modules assets esbuild.config.mjs jsconfig.json postcss.config.js
echo "$${PROJECT_DIR}"
npm run build
well-known: static ;
assets: assets/favicons assets/icons;
assets/favicons:
./utils/faviconize.pl assets/favicon.svg long assets/favicons
assets/icons: node_modules assets/icons-src/fontawesome.json
./utils/renamer.pl node_modules/@fortawesome/fontawesome-free/svgs/solid assets/icons-src/fontawesome.json assets/icons/fradrive
./utils/renamer.pl node_modules/@fortawesome/fontawesome-free/svgs/regular assets/icons-src/fontawesome.json assets/icons/fradrive
-cp assets/icons-src/*.svg assets/icons/fradrive

19093
frontend/package-lock.json generated

File diff suppressed because it is too large.

View File

@ -3,7 +3,7 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
// SPDX-License-Identifier: LicenseRef-Fraport-Corporate-Design
// @import 'env';
@import 'env';
$ico-width: 15px;
@ -110,7 +110,7 @@ $icons: new,
@each $name in $icons {
.ico-#{$name} {
background-image: url('/fradrive/assets/icons/fradrive/#{$name}.svg');
background-image: url('#{$path}/assets/icons/fradrive/#{$name}.svg');
background-size: contain;
background-repeat: no-repeat;
background-position: center;

View File

@ -301,7 +301,7 @@ export class ExamCorrect {
users: [user],
status: STATUS.LOADING,
};
if (results && results != {}) rowInfo.results = results;
if (results && Object.keys(results).length > 0) rowInfo.results = results;
if (result !== undefined) rowInfo.result = result;
this._addRow(rowInfo);

View File

@ -1,4 +1,4 @@
# SPDX-FileCopyrightText: 2022-25 Gregor Kleen <gregor.kleen@ifi.lmu.de>,Winnie Ros <winnie.ros@campus.lmu.de>,Steffen Jost <s.jost@fraport.de>
# SPDX-FileCopyrightText: 2022-2025 Sarah Vaupel <sarah.vaupel@uniworx.de>,Gregor Kleen <gregor.kleen@ifi.lmu.de>,Winnie Ros <winnie.ros@campus.lmu.de>,Steffen Jost <s.jost@fraport.de>
#
# SPDX-License-Identifier: AGPL-3.0-or-later
@ -150,6 +150,8 @@ InterfaceName: Schnittstelle
InterfaceLastSynch: Zuletzt
InterfaceSubtype: Betreffend
InterfaceWrite: Schreibend
AdminUserPassword: Passwort
InterfaceSuccess: Rückmeldung
InterfaceInfo: Nachricht
InterfaceFreshness: Maximale Zugriffsfrist
@ -161,4 +163,4 @@ IWTActDelete: Entfernen
InterfaceWarningAdded: Schnittstellenwarnungszeit hinzugefügt oder geändert
InterfaceWarningDeleted n@Int: #{pluralDEeN n "Schnittstellenwarnungszeit"} gelöscht
InterfaceWarningDisabledEntirely: Alle Fehler ignorieren
InterfaceWarningDisabledInterval: Keine Zugriffsfrist
InterfaceWarningDisabledInterval: Keine Zugriffsfrist

View File

@ -1,4 +1,4 @@
# SPDX-FileCopyrightText: 2022-25 Sarah Vaupel <sarah.vaupel@ifi.lmu.de>,Winnie Ros <winnie.ros@campus.lmu.de>,Steffen Jost <s.jost@fraport.de>
# SPDX-FileCopyrightText: 2022-2025 Sarah Vaupel <sarah.vaupel@uniworx.de>,Winnie Ros <winnie.ros@campus.lmu.de>,Steffen Jost <s.jost@fraport.de>
#
# SPDX-License-Identifier: AGPL-3.0-or-later
@ -150,6 +150,8 @@ InterfaceName: Interface
InterfaceLastSynch: Last
InterfaceSubtype: Affecting
InterfaceWrite: Write
AdminUserPassword: Password
InterfaceSuccess: Returned
InterfaceInfo: Message
InterfaceFreshness: Maximum usage period
@ -161,4 +163,4 @@ IWTActDelete: Delete
InterfaceWarningAdded: Interface warning time added/changed
InterfaceWarningDeleted n: #{pluralENsN n "interface warning time"} deleted
InterfaceWarningDisabledEntirely: Ignore all errors
InterfaceWarningDisabledInterval: No maximum usage period
InterfaceWarningDisabledInterval: No maximum usage period

View File

@ -1,4 +1,4 @@
# SPDX-FileCopyrightText: 2022 Gregor Kleen <gregor.kleen@ifi.lmu.de>,Sarah Vaupel <sarah.vaupel@ifi.lmu.de>,Steffen Jost <jost@tcs.ifi.lmu.de>,Winnie Ros <winnie.ros@campus.lmu.de>
# SPDX-FileCopyrightText: 2022-2024 David Mosbach <david.mosbach@uniworx.de>, Sarah Vaupel <sarah.vaupel@uniworx.de>, Gregor Kleen <gregor.kleen@ifi.lmu.de>, Sarah Vaupel <sarah.vaupel@ifi.lmu.de>, Steffen Jost <jost@tcs.ifi.lmu.de>, Winnie Ros <winnie.ros@campus.lmu.de>
#
# SPDX-License-Identifier: AGPL-3.0-or-later
@ -72,8 +72,8 @@ UnauthorizedTutorialTutorControl: Ausbilder:innen dürfen diesen Kurs nicht edit
UnauthorizedCourseTutor: Sie sind nicht Ausbilder:in für diese Kursart.
UnauthorizedTutor: Sie sind nicht Ausbilder:in.
UnauthorizedTutorialRegisterGroup: Sie sind bereits in einem Kurs mit derselben Registrierungs-Gruppe eingetragen.
UnauthorizedLDAP: Angegebener Nutzer/Angegebene Nutzerin meldet sich nicht mit Fraport Login an.
UnauthorizedPWHash: Angegebener Nutzer/Angegebene Nutzerin meldet sich nicht mit FRADrive-Kennung an.
UnauthorizedExternal: Angegebene:r Benutzer:in meldet sich nicht über einen aktuell unterstützten externen Login an.
UnauthorizedInternal: Angegebene:r Benutzer:in meldet sich nicht mit FRADrive-Kennung an.
UnauthorizedExternalExamListNotEmpty: Liste von externen Prüfungen ist nicht leer
UnauthorizedExternalExamLecturer: Sie sind nicht als Prüfer:in für diese externe Prüfung eingetragen
UnauthorizedSubmissionSubmissionGroup: Sie sind nicht Mitglied in einer der registrierten Abgabegruppen, die an dieser Abgabe beteiligt sind
@ -102,15 +102,15 @@ LDAPLoginTitle: Fraport Login für interne und externe Nutzer
PWHashLoginTitle: Spezieller Funktionsnutzer Login
PWHashLoginNote: Verwenden Sie dieses Formular nur, wenn Sie explizit dazu aufgefordert wurden. Alle anderen sollten das andere Login Formular verwenden!
DummyLoginTitle: Development-Login
InternalLdapError: Interner Fehler beim Fraport Büko-Login
CampusUserInvalidIdent: Konnte anhand des Fraport Büko-Logins keine eindeutige Identifikation ermitteln
CampusUserInvalidEmail: Konnte anhand des Fraport Büko-Logins keine E-Mail-Addresse ermitteln
CampusUserInvalidDisplayName: Konnte anhand des Fraport Büko-Logins keinen vollen Namen ermitteln
CampusUserInvalidGivenName: Konnte anhand des Fraport Büko-Logins keinen Vornamen ermitteln
CampusUserInvalidSurname: Konnte anhand des Fraport Büko-Logins keinen Nachname ermitteln
CampusUserInvalidTitle: Konnte anhand des Fraport Büko-Logins keinen akademischen Titel ermitteln
CampusUserInvalidFeaturesOfStudy parseErr@Text: Konnte anhand des Fraport Büko-Logins keine Studiengänge ermitteln
CampusUserInvalidAssociatedSchools parseErr@Text: Konnte anhand des Fraport Büko-Logins keine Bereiche ermitteln
InternalLoginError: Interner Fehler beim Login
DecodeUserInvalidIdent: Konnte anhand des Fraport Büko-Logins keine eindeutige Identifikation ermitteln
DecodeUserInvalidEmail: Konnte anhand des Fraport Büko-Logins keine E-Mail-Addresse ermitteln
DecodeUserInvalidDisplayName: Konnte anhand des Fraport Büko-Logins keinen vollen Namen ermitteln
DecodeUserInvalidGivenName: Konnte anhand des Fraport Büko-Logins keinen Vornamen ermitteln
DecodeUserInvalidSurname: Konnte anhand des Fraport Büko-Logins keinen Nachname ermitteln
DecodeUserInvalidTitle: Konnte anhand des Fraport Büko-Logins keinen akademischen Titel ermitteln
DecodeUserInvalidFeaturesOfStudy parseErr@Text: Konnte anhand des Fraport Büko-Logins keine Studiengänge ermitteln
DecodeUserInvalidAssociatedSchools parseErr@Text: Konnte anhand des Fraport Büko-Logins keine Bereiche ermitteln
InvalidCredentialsADNoSuchObject: Benutzereintrag existiert nicht
InvalidCredentialsADLogonFailure: Ungültiges Passwort
InvalidCredentialsADAccountRestriction: Beschränkungen des Fraport Accounts verhindern Login
@ -139,3 +139,6 @@ FormHoneypotNamePlaceholder: Name
FormHoneypotComment: Kommentar
FormHoneypotCommentPlaceholder: Kommentar
FormHoneypotFilled: Bitte füllen Sie keines der versteckten Felder aus
Logout: Abmeldung
SingleSignOut: Abmeldung bei Azure

View File

@ -1,4 +1,4 @@
# SPDX-FileCopyrightText: 2022 Gregor Kleen <gregor.kleen@ifi.lmu.de>,Sarah Vaupel <sarah.vaupel@ifi.lmu.de>,Steffen Jost <jost@tcs.ifi.lmu.de>,Winnie Ros <winnie.ros@campus.lmu.de>
# SPDX-FileCopyrightText: 2022-2024 Sarah Vaupel <sarah.vaupel@uniworx.de>, David Mosbach <david.mosbach@uniworx.de>, Gregor Kleen <gregor.kleen@ifi.lmu.de>, Sarah Vaupel <sarah.vaupel@ifi.lmu.de>, Steffen Jost <jost@tcs.ifi.lmu.de>, Winnie Ros <winnie.ros@campus.lmu.de>
#
# SPDX-License-Identifier: AGPL-3.0-or-later
@ -72,8 +72,8 @@ UnauthorizedTutorialTutorControl: Instructors may not edit this course.
UnauthorizedCourseTutor: You are no instructor for this course.
UnauthorizedTutor: You are no instructor.
UnauthorizedTutorialRegisterGroup: You are already registered for a course with the same registration group.
UnauthorizedLDAP: Specified user does not log in with their Fraport password.
UnauthorizedPWHash: Specified user does not log in with an FRADrive-account.
UnauthorizedExternal: Specified user does not log in with any currently supported external login.
UnauthorizedInternal: Specified user does not log in with a FRADrive-account.
UnauthorizedExternalExamListNotEmpty: List of external exams is not empty
UnauthorizedExternalExamLecturer: You are not an associated person for this external exam
UnauthorizedSubmissionSubmissionGroup: You are not member in any of the submission groups for this submission
@ -103,15 +103,15 @@ LDAPLoginTitle: Fraport login for intern and extern users
PWHashLoginTitle: Special function user login
PWHashLoginNote: Only use this login form if you have received special instructions to do so. All others should use the other login field.
DummyLoginTitle: Development login
InternalLdapError: Internal error during Fraport Büko login
CampusUserInvalidIdent: Could not determine unique identification during Fraport Büko login
CampusUserInvalidEmail: Could not determine email address during Fraport Büko login
CampusUserInvalidDisplayName: Could not determine display name during Fraport Büko login
CampusUserInvalidGivenName: Could not determine given name during Fraport Büko login
CampusUserInvalidSurname: Could not determine surname during Fraport Büko login
CampusUserInvalidTitle: Could not determine title during Fraport Büko login
CampusUserInvalidFeaturesOfStudy parseErr: Could not determine features of study during Fraport Büko login
CampusUserInvalidAssociatedSchools parseErr: Could not determine associated departments during Fraport Büko login
InternalLoginError: Internal error during login
DecodeUserInvalidIdent: Could not determine unique identification during Fraport Büko login
DecodeUserInvalidEmail: Could not determine email address during Fraport Büko login
DecodeUserInvalidDisplayName: Could not determine display name during Fraport Büko login
DecodeUserInvalidGivenName: Could not determine given name during Fraport Büko login
DecodeUserInvalidSurname: Could not determine surname during Fraport Büko login
DecodeUserInvalidTitle: Could not determine title during Fraport Büko login
DecodeUserInvalidFeaturesOfStudy parseErr: Could not determine features of study during Fraport Büko login
DecodeUserInvalidAssociatedSchools parseErr: Could not determine associated departments during Fraport Büko login
InvalidCredentialsADNoSuchObject: User entry does not exist
InvalidCredentialsADLogonFailure: Invalid password
InvalidCredentialsADAccountRestriction: Restrictions on your Fraport account prevent a login
@ -140,3 +140,6 @@ FormHoneypotNamePlaceholder !ident-ok: Name
FormHoneypotComment: Comment
FormHoneypotCommentPlaceholder: Comment
FormHoneypotFilled: Please do not fill in any of the hidden fields
Logout: Logout
SingleSignOut: Azure logout

Some files were not shown because too many files have changed in this diff.