diff --git a/.gitignore b/.gitignore
index 4009bd844..3b672184e 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,61 +1,62 @@
# App artifacts
/_build
/db
/deps
/*.ez
/test/instance
/test/uploads
/.elixir_ls
/test/fixtures/DSCN0010_tmp.jpg
/test/fixtures/test_tmp.txt
/test/fixtures/image_tmp.jpg
/test/tmp/
/doc
/instance
/priv/ssh_keys
# Prevent committing custom emojis
/priv/static/emoji/custom/*
# Generated on crash by the VM
erl_crash.dump
# Files matching config/*.secret.exs pattern contain sensitive
# data and you should not commit them into version control.
#
# Alternatively, you may comment the line below and commit the
# secrets files as long as you replace their contents by environment
# variables.
/config/*.secret.exs
/config/generated_config.exs
/config/runtime.exs
/config/*.env
# Database setup file, some may forget to delete it
/config/setup_db*.psql
.DS_Store
.env
# Editor config
/.vscode/
# Prevent committing docs files
/priv/static/doc/*
docs/generated_config.md
# Code test coverage
/cover
/Elixir.*.coverdata
/coverage.xml
.idea
pleroma.iml
# asdf
.tool-versions
# Editor temp files
-/*~
-/*#
+*~
+*#
+*.swp
diff --git a/CHANGELOG.md b/CHANGELOG.md
index e071fe693..063d51d4c 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,1394 +1,1399 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
+## 2.6.2
+
+### Security
+- MRF StealEmojiPolicy: Sanitize shortcodes (thanks to Hazel K for the report)
+
## 2.6.1
### Changed
- Document maximum supported version of Erlang & Elixir
### Added
- [docs] add frontends management documentation
### Fixed
- TwitterAPI: Return proper error when healthcheck is disabled
- Fix eblurhash and elixir-captcha not using system cflags
## 2.6.0
### Security
- Preload: Make generated JSON HTML-safe. It already was HTML-safe because it only consists of config data that is base64 encoded, but this will keep it safe if that ever changes.
- CommonAPI: Prevent users from accessing media of other users by creating a status with reused attachment ID
- Disable XML entity resolution completely to fix a DoS vulnerability
### Added
- Support for Image activities, namely from Hubzilla
- Add OAuth scope descriptions
- Allow lang attribute in status text
- OnlyMedia Upload Filter
- Implement MRF policy to reject or delist according to emojis
- (hardening) Add no_new_privs=yes to OpenRC service files
- Implement quotes
- Add unified streaming endpoint
### Fixed
- rel="me" was missing its cache
- MediaProxy responses now return a sandbox CSP header
- Filter context activities using Visibility.visible_for_user?
- UploadedMedia: Add missing disposition_type to Content-Disposition
- fix not being able to fetch flash file from remote instance
- Fix abnormal behaviour when refetching a poll
- Allow non-HTTP(s) URIs in "url" fields for compatibility with "FEP-fffd: Proxy Objects"
- Fix opengraph and twitter card meta tags
- ForceMentionsInContent: fix double mentions for Mastodon/Misskey posts
- OEmbed HTML tags are now filtered
- Restrict attachments to uploaded files only
- Fix error 404 when deleting status of a banned user
- Fix config ownership in dockerfile to pass restriction test
- Fix user fetch completely broken if featured collection is not in a supported form
- Correctly handle the situation when a poll has both "anyOf" and "oneOf" but one of them is empty
- Fix handling report from a deactivated user
- Prevent using the .json format to bypass authorized fetch mode
- Fix mentioning punycode domains when using Markdown
- Show more informative errors when profile exceeds char limits
### Removed
- BREAKING: Support for passwords generated with `crypt(3)` (Gnu Social migration artifact)
- remove BBS/SSH feature, replaced by an external bridge.
- Remove a few unused indexes.
- Cleanup OStatus-era user upgrades and ap_enabled indicator
- Deprecate Pleroma's audio scrobbling
## 2.5.5
### Security
- Prevent users from accessing media of other users by creating a status with reused attachment ID
## 2.5.4
### Security
- Fix XML External Entity (XXE) loading vulnerability allowing to fetch arbitrary files from the server's filesystem
## 2.5.3
### Security
- Emoji pack loader sanitizes pack names
- Reduced permissions of config files and directories, distros requiring greater permissions like group-read need to pre-create the directories
## 2.5.2
### Security
- `/proxy` endpoint now sets a Content-Security-Policy (sandbox)
- WebSocket endpoint now respects unauthenticated restrictions for streams of public posts
- OEmbed HTML tags are now filtered
### Changed
- docs: Be more explicit about the level of compatibility of OTP releases
- Set default background worker timeout to 15 minutes
### Fixed
- Atom/RSS formatting (HTML truncation, published, missing summary)
- Remove `static_fe` pipeline for `/users/:nickname/feed`
- Stop oban from retrying if validating errors occur when processing incoming data
- Make sure object refetching as used by already received polls follows MRF rules
### Removed
- BREAKING: Support for passwords generated with `crypt(3)` (Gnu Social migration artifact)
## 2.5.1
### Added
- Allow customizing instance languages
### Fixed
- Security: the upload HTTP endpoint can no longer create directories in the upload dir (internal APIs, like backup, still can)
- The ~ character in URLs in Markdown posts is now handled properly
- Exiftool upload filter will now ignore SVG files
- Fix `block_from_stranger` setting
- Fix rel="me"
- Docker images will now run properly
- Fix improper content being cached in report content
- Notification filter on object content will not operate on objects that inherently have no content
- ZWNJ and double dots in links are parsed properly for Plain-text posts
- OTP releases will work on systems with a newer libcrypt
- Errors from the Exiftool.ReadDescription filter are no longer written into the image description
## 2.5.0 - 2022-12-23
### Removed
- MastoFE
- Quack, the logging backend that pushes to Slack channels
### Changed
- **Breaking:** Elixir >=1.11 is now required (was >= 1.9)
- Allow users to remove their emails if instance does not need email to register
- Uploadfilter `Pleroma.Upload.Filter.Exiftool` has been renamed to `Pleroma.Upload.Filter.Exiftool.StripLocation`
- **Breaking**: `/api/v1/pleroma/backups` endpoints now require `read:backups` scope instead of `read:accounts`
- Updated the recommended pleroma.vcl configuration for Varnish to target Varnish 7.0+
- Set timeout values for Oban queues. The default is infinity and some operations may not time out on their own.
- Delete activities are federated at lowest priority
- CSP now includes wasm-unsafe-eval
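
A minimal sketch of the renamed upload filter above in an upload config; only the module name comes from this changelog, the `filters` key shape is an assumption:
```
# Hypothetical upload config using the renamed filter module
config :pleroma, Pleroma.Upload, filters: [Pleroma.Upload.Filter.Exiftool.StripLocation]
```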
### Added
- `activeMonth` and `activeHalfyear` fields in NodeInfo usage.users object
- Experimental support for Finch. Put `config :tesla, :adapter, {Tesla.Adapter.Finch, name: MyFinch}` in your secrets file to use it. Reverse Proxy will still use Hackney. (A sketch follows this list.)
- `ForceMentionsInPostContent` MRF policy
- PleromaAPI: Add remote follow API endpoint at `POST /api/v1/pleroma/remote_interaction`
- MastoAPI: Add `GET /api/v1/accounts/lookup`
- MastoAPI: Profile Directory support
- MastoAPI: Support v2 Suggestions (handpicked accounts only)
- Ability to log slow Ecto queries by configuring `:pleroma, :telemetry, :slow_queries_logging`
- Added Phoenix LiveDashboard at `/phoenix/live_dashboard`
- Added `/manifest.json` for progressive web apps.
- MastoAPI: Support for `birthday` and `show_birthday` field in `/api/v1/accounts/update_credentials`.
- Configuration: Add `birthday_required` and `birthday_min_age` settings to provide a way to require users to enter their birth date.
- PleromaAPI: Add `GET /api/v1/pleroma/birthdays` API endpoint
- Make backend-rendered pages translatable. This includes emails. Pages returned as a HTTP response are translated using the language specified in the `userLanguage` cookie, or the `Accept-Language` header. Emails are translated using the `language` field when registering. This language can be changed by `PATCH /api/v1/accounts/update_credentials` with the `language` field.
- Add fine grained options to provide privileges to moderators and admins (e.g. delete messages, manage reports...)
- Uploadfilter `Pleroma.Upload.Filter.Exiftool.ReadDescription` returns description values to the FE so they can pre-fill the image description field
- Added move account API
- Enable remote users to interact with posts
- Possibility to discover users like `user@example.org`, while Pleroma is working on `pleroma.example.org`. Additional configuration required.
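
As noted in the Finch entry above, the adapter is switched in the secrets file; `MyFinch` is just an example pool name:
```
# Opt in to the experimental Finch adapter (example pool name)
config :tesla, :adapter, {Tesla.Adapter.Finch, name: MyFinch}
```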
### Fixed
- Subscription(Bell) Notifications: Don't create from Pipeline Ingested replies
- Handle Reject for already-accepted Follows properly
- Display OpenGraph data on alternative notice routes.
- Fix replies count for remote replies
- Fixed hashtags disappearing from the end of lines when Markdown is enabled
- ChatAPI: Add link headers
- Limited number of search results to 40 to prevent DoS attacks
- ActivityPub: fixed federation of attachment dimensions
- Fixed benchmarks
- Elixir 1.13 support
- Fixed crash when pinned_objects is nil
- Fixed slow timelines when there are a lot of deactivated users
- Fixed account deletion API
- Fixed lowercase HTTP HEAD method in the Media Proxy Preview code
- Removed useless notification call on Delete activities
- Improved performance for filtering out deactivated and invisible users
- RSS and Atom feeds for users work again
- TwitterCard meta tags conformance
## 2.4.5 - 2022-11-27
### Fixed
- Image `class` attributes not being scrubbed, allowing to exploit frontend special classes [!3792](https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3792)
- Delete report notifs when demoting from superuser [!3642](https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3642)
- Validate `mediaType` only by its format rather than using a list [!3597](https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3597)
- Pagination: Make mutes and blocks lists behave the same as other lists [!3693](https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3693)
- Compatibility with Elixir 1.14 [!3740](https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3740)
- Frontend installer: FediFE build URL [!3736](https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3736)
- Streaming: Don't stream ChatMessage into the home timeline [!3738](https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3738)
- Streaming: Stream local-only posts in the local timeline [!3738](https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3738)
- Signatures: Fix `keyId` lookup for GoToSocial [!3725](https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3725)
- Validator: Fix `replies` handling for GoToSocial [!3725](https://git.pleroma.social/pleroma/pleroma/-/merge_requests/3725)
## 2.4.4 - 2022-08-19
### Security
- Streaming API sessions will now properly disconnect if the corresponding token is revoked
## 2.4.3 - 2022-05-06
### Security
- Private `/objects/` and `/activities/` leaking if cached by authenticated user
- SweetXML library DTD bomb
## 2.4.2 - 2022-01-10
### Fixed
- Federation issues caused by HTTP pool checkout timeouts
- Compatibility with Elixir 1.13
### Upgrade notes
1. Restart Pleroma
## 2.4.1 - 2021-08-29
### Changed
- Make `mix pleroma.database set_text_search_config` run concurrently and indefinitely
### Added
- AdminAPI: Missing configuration description for StealEmojiPolicy
### Fixed
- MastodonAPI: Stream out Create activities
- MRF ObjectAgePolicy: Fix pattern matching on "published"
- TwitterAPI: Make `change_password` and `change_email` require params on body instead of query
- Subscription(Bell) Notifications: Don't create from Pipeline Ingested replies
- AdminAPI: Fix rendering reports containing a `nil` object
- Mastodon API: Activity Search falls back to status fetching after a DB Timeout/Error
- Mastodon API: Fix crash in Streamer related to reblogging
- AdminAPI: List available frontends when `static/frontends` folder is missing
- Make activity search properly use language-aware GIN indexes
- AdminAPI: Fix suggestions for MRF Policies
## 2.4.0 - 2021-08-08
### Changed
- **Breaking:** Configuration: `:chat, enabled` moved to `:shout, enabled` and `:instance, chat_limit` moved to `:shout, limit`
- **Breaking** Entries for simple_policy, transparency_exclusions and quarantined_instances now list both the instance and a reason.
- Support for Erlang/OTP 24
- The `application` metadata returned with statuses is no longer hardcoded. Apps that want to display these details will now have valid data for new posts after this change.
- HTTPSecurityPlug now sends a response header to opt out of Google's FLoC (Federated Learning of Cohorts) targeted advertising.
- Email address is now returned if requesting user is the owner of the user account so it can be exposed in client and FE user settings UIs.
- Improved Twittercard and OpenGraph meta tag generation including thumbnails and image dimension metadata when available.
- AdminAPI: sort users so the newest are at the top.
- ActivityPub Client-to-Server(C2S): Limitations on the type of Activity/Object are lifted as they are now passed through ObjectValidators
- MRF (`AntiFollowbotPolicy`): Bot accounts are now also considered followbots. Users can still allow bots to follow them by first following the bot.
### Added
- MRF (`FollowBotPolicy`): New MRF Policy which makes a designated local Bot account attempt to follow all users in public Notes received by your instance. Users who require approving follower requests or have #nobot in their profile are excluded.
- Return OAuth token `id` (primary key) in POST `/oauth/token`.
- AdminAPI: return `created_at` date with users.
- AdminAPI: add DELETE `/api/v1/pleroma/admin/instances/:instance` to delete all content from a remote instance.
- `AnalyzeMetadata` upload filter for extracting image/video attachment dimensions and generating blurhashes for images. Blurhashes for videos are not generated at this time.
- Attachment dimensions and blurhashes are federated when available.
- Mastodon API: support `poll` notification.
- Pinned posts federation
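
A hedged sketch of enabling the new `AnalyzeMetadata` filter; the full module path and the `filters` key are assumptions made by analogy with the other upload filters:
```
# Hypothetical: attach the blurhash/dimension filter to uploads
config :pleroma, Pleroma.Upload, filters: [Pleroma.Upload.Filter.AnalyzeMetadata]
```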
### Fixed
- Don't crash so hard when email settings are invalid.
- Checking activated Upload Filters for required commands.
- Remote users can no longer reappear after being deleted.
- Deactivated users may now be deleted.
- Deleting an activity with a lot of likes/boosts no longer causes a database timeout.
- Mix task `pleroma.database prune_objects`
- Fixed rendering of JSON errors on ActivityPub endpoints.
- Linkify: Parsing crash with URLs ending in unbalanced closed paren, no path separator, and no query parameters
- Try to save exported ConfigDB settings (migrate_from_db) in the system temp directory if default location is not writable.
- Uploading custom instance thumbnail via AdminAPI/AdminFE generated invalid URL to the image
- Applying ConcurrentLimiter settings via AdminAPI
- User login failures if their `notification_settings` were in a NULL state.
- Mix task `pleroma.user delete_activities` query transaction timeout is now :infinity
- MRF (`SimplePolicy`): Embedded objects are now checked. If any embedded object would be rejected, its parent is rejected. This fixes Announces leaking posts from blocked domains.
- Fixed some Markdown issues, including trailing slash in links.
### Removed
- **Breaking**: Remove deprecated `/api/qvitter/statuses/notifications/read` (replaced by `/api/v1/pleroma/notifications/read`)
## [2.3.0] - 2021-03-01
### Security
- Fixed client user agent leaking through MediaProxy
### Removed
- `:auth, :enforce_oauth_admin_scope_usage` configuration option.
### Changed
- **Breaking**: Changed `mix pleroma.user toggle_confirmed` to `mix pleroma.user confirm`
- **Breaking**: Changed `mix pleroma.user toggle_activated` to `mix pleroma.user activate/deactivate`
- **Breaking:** NSFW hashtag is no longer added on sensitive posts
- Polls now always return a `voters_count`, even if they are single-choice.
- Admin Emails: The ap id is used as the user link in emails now.
- Improved registration workflow for email confirmation and account approval modes.
- Search: When using Postgres 11+, Pleroma will use the `websearch_to_tsvector` function to parse search queries.
- Emoji: Support the full Unicode 13.1 set of Emoji for reactions, plus regional indicators.
- Deprecated `Pleroma.Uploaders.S3, :public_endpoint`. Now `Pleroma.Upload, :base_url` is the standard configuration key for all uploaders.
- Improved Apache webserver support: updated sample configuration, MediaProxy cache invalidation verified with the included sample script
- Improve OAuth 2.0 provider support. A missing `fqn` field was added to the response, but does not expose the user's email address.
- Provide redirect of external posts from `/notice/:id` to their original URL
- Admins no longer receive notifications for reports if they are the actor making the report.
- Improved Mailer configuration setting descriptions for AdminFE.
- Updated default avatar to look nicer.
<details>
<summary>API Changes</summary>
- **Breaking:** AdminAPI changed User field `confirmation_pending` to `is_confirmed`
- **Breaking:** AdminAPI changed User field `approval_pending` to `is_approved`
- **Breaking**: AdminAPI changed User field `deactivated` to `is_active`
- **Breaking:** AdminAPI `GET /api/pleroma/admin/users/:nickname_or_id/statuses` changed response format and added the number of total users posts.
- **Breaking:** AdminAPI `GET /api/pleroma/admin/instances/:instance/statuses` changed response format and added the number of total users posts.
- Admin API: Reports now ordered by newest
- Pleroma API: `GET /api/v1/pleroma/chats` is deprecated in favor of `GET /api/v2/pleroma/chats`.
- Pleroma API: Reroute `/api/pleroma/*` to `/api/v1/pleroma/*`
</details>
- Improved hashtag timeline performance (requires a background migration).
### Added
- Reports now generate notifications for admins and mods.
- Support for local-only statuses.
- Support pagination of blocks and mutes.
- Account backup.
- Configuration: Add `:instance, autofollowing_nicknames` setting to provide a way to make accounts automatically follow new users that register on the local Pleroma instance.
- `[:activitypub, :blockers_visible]` config to control visibility of blockers (see the sketch after this list).
- Ability to view remote timelines, with ex. `/api/v1/timelines/public?instance=lain.com` and streams `public:remote` and `public:remote:media`.
- The site title is now injected as a `title` tag like preloads or metadata.
- Password reset tokens now are not accepted after a certain age.
- Mix tasks to help with displaying and removing ConfigDB entries. See `mix pleroma.config`.
- OAuth form improvements: users are remembered by their cookie, the CSS is overridable by the admin, and the style has been improved.
- OAuth improvements and fixes: more secure session-based authentication (by token that could be revoked anytime), ability to revoke belonging OAuth token from any client etc.
- Ability to set ActivityPub aliases for follower migration.
- Configurable background job limits for RichMedia (link previews) and MediaProxyWarmingPolicy
- Ability to define custom HTTP headers per each frontend
- MRF (`NoEmptyPolicy`): New MRF Policy which will deny empty statuses or statuses of only mentions from being created by local users
- New users will receive a simple email confirming their registration if no other emails (e.g., Welcome, Confirmation, or Approval Required) would be dispatched.
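
Illustrative-only examples of two of the settings named above; the nickname and boolean are placeholders, the option names come from the entries themselves:
```
# New local users automatically follow these accounts (example nickname)
config :pleroma, :instance, autofollowing_nicknames: ["admin"]
# Control whether blockers are visible to the users they block
config :pleroma, :activitypub, blockers_visible: true
```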
<details>
<summary>API Changes</summary>
- Admin API: (`GET /api/pleroma/admin/users`) filter users by `unconfirmed` status and `actor_type`.
- Admin API: OpenAPI spec for the user-related operations
- Pleroma API: `GET /api/v2/pleroma/chats` added. It is exactly like `GET /api/v1/pleroma/chats` except supports pagination.
- Pleroma API: Add `idempotency_key` to the chat message entity that can be used for optimistic message sending.
- Pleroma API: (`GET /api/v1/pleroma/federation_status`) Add a way to get a list of unreachable instances.
- Mastodon API: User and conversation mutes can now auto-expire if `expires_in` parameter was given while adding the mute.
- Admin API: An endpoint to manage frontends.
- Streaming API: Add follow relationships updates.
- WebPush: Introduce `pleroma:chat_mention` and `pleroma:emoji_reaction` notification types.
- Mastodon API: Add monthly active users to `/api/v1/instance` (`pleroma.stats.mau`).
- Mastodon API: Home, public, hashtag & list timelines accept `only_media`, `remote` & `local` parameters for filtration.
- Mastodon API: `/api/v1/accounts/:id` & `/api/v1/mutes` endpoints accept `with_relationships` parameter and return filled `pleroma.relationship` field.
- Mastodon API: Endpoint to remove a conversation (`DELETE /api/v1/conversations/:id`).
- Mastodon API: `expires_in` in the scheduled post `params` field on `/api/v1/statuses` and `/api/v1/scheduled_statuses/:id` endpoints.
</details>
### Fixed
- Users with `is_discoverable` field set to false (default value) will appear in in-service search results but be hidden from external services (search bots etc.).
- Streaming API: Posts and notifications are not dropped, when CLI task is executing.
- Creating incorrect IPv4 address-style HTTP links when encountering certain numbers.
- Reblog API Endpoint: Do not set visibility parameter to public by default and let CommonAPI infer it from status, so a user can reblog their private status without explicitly setting reblog visibility to private.
- Tag URLs in statuses are now absolute
- Removed duplicate jobs to purge expired activities
- File extensions of some attachments were incorrectly changed. This feature has been disabled for now.
- Mix task pleroma.instance creates missing parent directories if the configuration or SQL output paths are changed.
<details>
<summary>API Changes</summary>
- Mastodon API: Current user is now included in conversation if it's the only participant.
- Mastodon API: Fixed last_status.account being not filled with account data.
- Mastodon API: Fix not being able to add or remove multiple users at once in lists.
- Mastodon API: Fixed own_votes being not returned with poll data.
- Mastodon API: Fixed creation of scheduled posts with polls.
- Mastodon API: Support for expires_in/expires_at in the Filters.
</details>
## [2.2.2] - 2021-01-18
### Fixed
- StealEmojiPolicy creates dir for emojis, if it doesn't exist.
- Updated `elixir_make` to a non-retired version
### Upgrade notes
1. Restart Pleroma
## [2.2.1] - 2020-12-22
### Changed
- Updated Pleroma FE
### Fixed
- Config generation: rename `Pleroma.Upload.Filter.ExifTool` to `Pleroma.Upload.Filter.Exiftool`.
- S3 Uploads with Elixir 1.11.
- Mix task pleroma.user delete_activities for source installations.
- Search: RUM index search speed has been fixed.
- Rich Media Previews sometimes showed the wrong preview due to a bug following redirects.
- Fixes for the autolinker.
- Forwarded reports duplication from Pleroma instances.
- Emoji Reaction activity filtering from blocked and muted accounts.
- <details>
<summary>API</summary>
- Statuses were not displayed for Mastodon forwarded reports.
</details>
### Upgrade notes
1. Restart Pleroma
## [2.2.0] - 2020-11-12
### Security
- Fixed the possibility of using file uploads to spoof posts.
### Changed
- **Breaking** Requires `libmagic` (or `file`) to guess file types.
- **Breaking:** App metrics endpoint (`/api/pleroma/app_metrics`) is disabled by default, check `docs/API/prometheus.md` on enabling and configuring.
- **Breaking:** Pleroma Admin API: emoji packs and files routes changed.
- **Breaking:** Sensitive/NSFW statuses no longer disable link previews.
- Search: Users are now findable by their urls.
- Renamed `:await_up_timeout` in `:connections_pool` namespace to `:connect_timeout`, old name is deprecated.
- Renamed `:timeout` in `pools` namespace to `:recv_timeout`, old name is deprecated.
- The `discoverable` field in the `User` struct will now add a NOINDEX metatag to profile pages when false.
- Users with the `is_discoverable` field set to false will not show up in searches ([bug](https://git.pleroma.social/pleroma/pleroma/-/issues/2301)).
- Minimum lifetime for ephemeral activities changed to 10 minutes and made configurable (`:min_lifetime` option).
- Introduced optional dependencies on `ffmpeg`, `ImageMagick`, `exiftool` software packages. Please refer to `docs/installation/optional/media_graphics_packages.md`.
- <details>
<summary>API Changes</summary>
- API: Empty parameter values for integer parameters are now ignored in non-strict validation mode.
</details>
### Removed
- **Breaking:** `Pleroma.Workers.Cron.StatsWorker` setting from Oban `:crontab` (moved to a simpler implementation).
- **Breaking:** `Pleroma.Workers.Cron.ClearOauthTokenWorker` setting from Oban `:crontab` (moved to scheduled jobs).
- **Breaking:** `Pleroma.Workers.Cron.PurgeExpiredActivitiesWorker` setting from Oban `:crontab` (moved to scheduled jobs).
- Removed `:managed_config` option. In practice, it was accidentally removed with the 2.0.0 release when frontends were switched to a new configuration mechanism; however, it was not officially removed until now.
### Added
- Media preview proxy (requires `ffmpeg` and `ImageMagick` to be installed and media proxy to be enabled; see `:media_preview_proxy` config for more details).
- Mix tasks for controlling user account confirmation status in bulk (`mix pleroma.user confirm_all` and `mix pleroma.user unconfirm_all`)
- Mix task for sending confirmation emails to all unconfirmed users (`mix pleroma.email resend_confirmation_emails`)
- Mix task option for force-unfollowing relays
- App metrics: ability to restrict access to specified IP whitelist.
<details>
<summary>API Changes</summary>
- Admin API: Importing emoji from a zip file
- Pleroma API: Importing muted users from CSV files.
- Pleroma API: Pagination for remote/local packs and emoji.
</details>
### Fixed
- Add documented-but-missing chat pagination.
- Allow sending out emails again.
- Allow sending chat messages to yourself
- OStatus / static FE endpoints: fixed inaccessibility for anonymous users on non-federating instances, switched to handling per `:restrict_unauthenticated` setting.
- Fix remote users with a whitespace name.
### Upgrade notes
1. Install libmagic and development headers (`libmagic-dev` on Ubuntu/Debian, `file-dev` on Alpine Linux)
2. Run database migrations (inside Pleroma directory):
- OTP: `./bin/pleroma_ctl migrate`
- From Source: `mix ecto.migrate`
3. Restart Pleroma
## [2.1.2] - 2020-09-17
### Security
- Fix most MRF rules either crashing or not being applied to objects passed into the Common Pipeline (ChatMessage, Question, Answer, Audio, Event).
### Fixed
- Welcome Chat messages preventing user registration with MRF Simple Policy applied to the local instance.
- Mastodon API: the public timeline returning an error when the `reply_visibility` parameter is set to `self` for an unauthenticated user.
- Mastodon Streaming API: Handler crashes on authentication failures, resulting in error logs.
- Mastodon Streaming API: Error logs on client pings.
- Rich media: Log spam on failures. Now the error is only logged once per attempt.
### Changed
- Rich Media: A HEAD request is now done to the url, to ensure it has the appropriate content type and size before proceeding with a GET.
### Upgrade notes
1. Restart Pleroma
## [2.1.1] - 2020-09-08
### Security
- Fix possible DoS in Mastodon API user search due to an error in match clauses, leading to an infinite recursion and subsequent OOM with certain inputs.
- Fix metadata leak for accounts and statuses on private instances.
- Fix possible DoS in Admin API search using an atom leak vulnerability. Authentication with admin rights was required to exploit.
### Changed
- **Breaking:** The metadata providers RelMe and Feed are no longer configurable. RelMe should always be activated and Feed only provides a <link> header tag for the actual RSS/Atom feed when the instance is public.
- Improved error message when cmake is not available at build stage.
### Added
- Rich media failure tracking (along with `:failure_backoff` option).
<details>
<summary>Admin API Changes</summary>
- Add `PATCH /api/pleroma/admin/instance_document/:document_name` to modify the Terms of Service and Instance Panel HTML pages via Admin API
</details>
### Fixed
- Default HTTP adapter not respecting pool setting, leading to possible OOM.
- Fixed uploading webp images when the Exiftool Upload Filter is enabled by skipping them
- Mastodon API: Search parameter `following` now correctly returns the followings rather than the followers
- Mastodon API: Timelines hanging for (`number of posts with links * rich media timeout`) in the worst case.
Reduced to just rich media timeout.
- Mastodon API: Cards being wrong for preview statuses due to cache key collision.
- Password resets no longer processed for deactivated accounts.
- Favicon scraper raising exceptions on URLs longer than 255 characters.
## [2.1.0] - 2020-08-28
### Changed
- **Breaking:** The default descriptions on uploads are now empty. The old behavior (filename as default) can be configured, see the cheat sheet.
- **Breaking:** Added the ObjectAgePolicy to the default set of MRFs. This will delist and strip the follower collection of any message received that is older than 7 days. This will stop users from seeing very old messages in the timelines. The messages can still be viewed on the user's page and in conversations. They also still trigger notifications.
- **Breaking:** Elixir >=1.9 is now required (was >= 1.8)
- **Breaking:** Configuration: `:auto_linker, :opts` moved to `:pleroma, Pleroma.Formatter`. Old config namespace is deprecated.
- **Breaking:** Configuration: `:instance, welcome_user_nickname` moved to `:welcome, :direct_message, :sender_nickname`, `:instance, :welcome_message` moved to `:welcome, :direct_message, :message`. Old config namespace is deprecated.
- **Breaking:** LDAP: Fallback to local database authentication has been removed for security reasons and lack of a mechanism to ensure the passwords are synchronized when LDAP passwords are updated.
- **Breaking** Changed defaults for `:restrict_unauthenticated` so that when `:instance, :public` is set to `false` then all `:restrict_unauthenticated` items are effectively set to `true`. If you'd like to allow unauthenticated access to specific API endpoints on a private instance, please explicitly set `:restrict_unauthenticated` to a non-default value in `config/prod.secret.exs`.
- In Conversations, return only direct messages as `last_status`
- Using the `only_media` filter on timelines will now exclude reblog media
- MRF policy to set global expiration for all local Create activities
- OGP rich media parser merged with TwitterCard
- Configuration: `:instance, rewrite_policy` moved to `:mrf, policies`, `:instance, :mrf_transparency` moved to `:mrf, :transparency`, `:instance, :mrf_transparency_exclusions` moved to `:mrf, :transparency_exclusions`. Old config namespace is deprecated.
- Configuration: `:media_proxy, whitelist` format changed to host with scheme (e.g. `http://example.com` instead of `example.com`). Domain format is deprecated.
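
A sketch of the moved namespaces from the two entries above; the policy module and host are examples only, not recommendations:
```
# MRF settings under the new :mrf namespace (example policy)
config :pleroma, :mrf,
  policies: [Pleroma.Web.ActivityPub.MRF.SimplePolicy],
  transparency: true

# Media proxy whitelist entries now include the scheme
config :pleroma, :media_proxy,
  whitelist: ["https://example.com"]
```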
<details>
<summary>API Changes</summary>
- **Breaking:** Pleroma API: The routes to update avatar, banner and background have been removed.
- **Breaking:** Image description length is limited now.
- **Breaking:** Emoji API: changed methods and renamed routes.
- **Breaking:** Notification Settings API for suppressing notifications has been simplified down to `block_from_strangers`.
- **Breaking:** Notification Settings API option for hiding push notification contents has been renamed to `hide_notification_contents`.
- MastodonAPI: Allow removal of avatar, banner and background.
- Streaming: Repeats of a user's posts will no longer be pushed to the user's stream.
- Mastodon API: Added `pleroma.metadata.fields_limits` to /api/v1/instance
- Mastodon API: On deletion, returns the original post text.
- Mastodon API: Add `pleroma.unread_count` to the Marker entity.
- Mastodon API: Added `pleroma.metadata.post_formats` to /api/v1/instance
- Mastodon API (legacy): Allow query parameters for `/api/v1/domain_blocks`, e.g. `/api/v1/domain_blocks?domain=badposters.zone`
- Mastodon API: Make notifications about statuses from muted users and threads read automatically
- Pleroma API: `/api/pleroma/captcha` responses now include `seconds_valid` with an integer value.
</details>
<details>
<summary>Admin API Changes</summary>
- **Breaking** Changed relay `/api/pleroma/admin/relay` endpoints response format.
- Status visibility stats: now can return stats per instance.
- Mix task to refresh counter cache (`mix pleroma.refresh_counter_cache`)
</details>
### Removed
- **Breaking:** removed `with_move` parameter from notifications timeline.
### Added
- Frontends: Add mix task to install frontends.
- Frontends: Add configurable frontends for primary and admin fe.
- Configuration: Added a blacklist for email servers.
- Chats: Added `accepts_chat_messages` field to user, exposed in APIs and federation.
- Chats: Added support for federated chats. For details, see the docs.
- ActivityPub: Added support for existing AP ids for instances migrated from Mastodon.
- Instance: Add `background_image` to configuration and `/api/v1/instance`
- Instance: Extend `/api/v1/instance` with Pleroma-specific information.
- NodeInfo: `pleroma:api/v1/notifications:include_types_filter` to the `features` list.
- NodeInfo: `pleroma_emoji_reactions` to the `features` list.
- Configuration: `:restrict_unauthenticated` setting, restrict access for unauthenticated users to timelines (public and federated), user profiles and statuses.
- Configuration: Add `:database_config_whitelist` setting to whitelist settings which can be configured from AdminFE.
- Configuration: `filename_display_max_length` option to set filename truncate limit, if filename display enabled (0 = no limit).
- New HTTP adapter [gun](https://github.com/ninenines/gun). Gun adapter requires minimum OTP version of 22.2 otherwise Pleroma won’t start. For hackney OTP update is not required.
- Mix task to create trusted OAuth App.
- Mix task to reset MFA for user accounts
- Notifications: Added `follow_request` notification type.
- Added `:reject_deletes` group to SimplePolicy
- MRF (`EmojiStealPolicy`): New MRF Policy which allows automatically downloading emojis from remote instances
- Support pagination in emoji packs API (for packs and for files in pack)
- Support for viewing instances favicons next to posts and accounts
- Added Pleroma.Upload.Filter.Exiftool as an alternate EXIF stripping mechanism targeting GPS/location metadata.
- "By approval" registrations mode.
- Configuration: Added `:welcome` settings for the welcome message to newly registered users. You can send a welcome message as a direct message, chat or email.
- Ability to hide favourites and emoji reactions in the API with `[:instance, :show_reactions]` config.
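
Two of the options above, sketched with placeholder values; the key names follow the `:welcome` and `:show_reactions` entries in this changelog, while the keyword-list shape for `:welcome` is an assumption:
```
# Welcome direct message for new registrations (example sender/text)
config :pleroma, :welcome,
  direct_message: [sender_nickname: "admin", message: "Welcome to the instance!"]

# Hide favourites and emoji reactions in the API
config :pleroma, :instance, show_reactions: false
```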
<details>
<summary>API Changes</summary>
- Mastodon API: Add pleroma.parent_visible field to statuses.
- Mastodon API: Extended `/api/v1/instance`.
- Mastodon API: Support for `include_types` in `/api/v1/notifications`.
- Mastodon API: Added `/api/v1/notifications/:id/dismiss` endpoint.
- Mastodon API: Add support for filtering replies in public and home timelines.
- Mastodon API: Support for `bot` field in `/api/v1/accounts/update_credentials`.
- Mastodon API: Support irreversible property for filters.
- Mastodon API: Add pleroma.favicon field to accounts.
- Admin API: endpoints for create/update/delete OAuth Apps.
- Admin API: endpoint for status view.
- OTP: Add command to reload emoji packs
</details>
### Fixed
- Fix list pagination and other list issues.
- Support pagination in conversations API
- **Breaking**: SimplePolicy `:reject` and `:accept` allow deletions again
- Fix follower/blocks import when nicknames start with @
- Filtering of push notifications on activities from blocked domains
- Resolving Peertube accounts with Webfinger
- `blob:` urls not being allowed by connect-src CSP
- Mastodon API: fix `GET /api/v1/notifications` not returning the full result set
- Rich Media Previews for Twitter links
- Admin API: fix `GET /api/pleroma/admin/users/:nickname/credentials` returning 404 when getting the credentials of a remote user while `:instance, :limit_to_local_content` is set to `:unauthenticated`
- Fix CSP policy generation to include remote Captcha services
- Fix edge case where MediaProxy truncates media, usually caused when Caddy is serving content for the other Federated instance.
- Emoji Packs could not be listed when instance was set to `public: false`
- Fix whole_word always returning false on filter get requests
- Migrations not working on OTP releases if the database was connected over ssl
- Fix relay following
## [2.0.7] - 2020-06-13
### Security
- Fix potential DoSes exploiting atom leaks in rich media parser and the `UserAllowListPolicy` MRF policy
### Fixed
- CSP: not allowing images/media from every host when mediaproxy is disabled
- CSP: not adding mediaproxy base url to image/media hosts
- StaticFE missing the CSS file
### Upgrade notes
1. Restart Pleroma
## [2.0.6] - 2020-06-09
### Security
- CSP: harden `image-src` and `media-src` when MediaProxy is used
### Fixed
- AP C2S: Fix pagination in inbox/outbox
- Various compilation errors on OTP 23
- Mastodon API streaming: Repeats from muted threads not being filtered
### Changed
- Various database performance improvements
### Upgrade notes
1. Run database migrations (inside Pleroma directory):
- OTP: `./bin/pleroma_ctl migrate`
- From Source: `mix ecto.migrate`
2. Restart Pleroma
## [2.0.5] - 2020-05-13
### Security
- Fix possible private status leaks in Mastodon Streaming API
### Fixed
- Crashes when trying to block a user if block federation is disabled
- Not being able to start the instance without `erlang-eldap` installed
- Users with bios over the limit getting rejected
- Follower counters not being updated on incoming follow accepts
### Upgrade notes
1. Restart Pleroma
## [2.0.4] - 2020-05-10
### Security
- AP C2S: Fix a potential DoS by creating nonsensical objects that break timelines
### Fixed
- Peertube user lookups not working
- `InsertSkeletonsForDeletedUsers` migration failing on some instances
- Healthcheck reporting the amount of memory currently used, rather than allocated in total
- LDAP not being usable in OTP releases
- Default apache configuration having tls chain issues
### Upgrade notes
#### Apache only
1. Remove the following line from your config:
```
SSLCertificateFile /etc/letsencrypt/live/${servername}/cert.pem
```
#### Everyone
1. Restart Pleroma
## [2.0.3] - 2020-05-02
### Security
- Disallow re-registration of previously deleted users, which allowed viewing direct messages addressed to them
- Mastodon API: Fix `POST /api/v1/follow_requests/:id/authorize` allowing to force a follow from a local user even if they didn't request to follow
- CSP: Sandbox uploads
### Fixed
- Notifications from blocked domains
- Potential federation issues with Mastodon versions before 3.0.0
- HTTP Basic Authentication permissions issue
- Follow/Block imports not being able to find the user if the nickname started with an `@`
- Instance stats counting internal users
- Inability to run a From Source release without git
- ObjectAgePolicy didn't filter out old messages
- `blob:` urls not being allowed by CSP
### Added
- NodeInfo: ObjectAgePolicy settings to the `federation` list.
- Follow request notifications
<details>
<summary>API Changes</summary>
- Admin API: `GET /api/pleroma/admin/need_reboot`.
</details>
### Upgrade notes
1. Restart Pleroma
2. Run database migrations (inside Pleroma directory):
- OTP: `./bin/pleroma_ctl migrate`
- From Source: `mix ecto.migrate`
3. Reset status visibility counters (inside Pleroma directory):
- OTP: `./bin/pleroma_ctl refresh_counter_cache`
- From Source: `mix pleroma.refresh_counter_cache`
## [2.0.2] - 2020-04-08
### Added
- Support for Funkwhale's `Audio` activity
- Admin API: `PATCH /api/pleroma/admin/users/:nickname/update_credentials`
### Fixed
- Blocked/muted users still generating push notifications
- Input textbox for bio ignoring newlines
- OTP: Inability to use PostgreSQL databases with SSL
- `user delete_activities` breaking when trying to delete already deleted posts
- Incorrect URL for Funkwhale channels
### Upgrade notes
1. Restart Pleroma
## [2.0.1] - 2020-03-15
### Security
- Static-FE: Fix remote posts not being sanitized
### Fixed
- Rate limiter crashes when there is no explicitly specified ip in the config
- 500 errors when no `Accept` header is present if Static-FE is enabled
- Instance panel not being updated immediately due to wrong `Cache-Control` headers
- Statuses posted with BBCode/Markdown having unnecessary newlines in Pleroma-FE
- OTP: Fix some settings not being migrated to in-database config properly
- No `Cache-Control` headers on attachment/media proxy requests
- Character limit enforcement being off by 1
- Mastodon Streaming API: hashtag timelines not working
### Changed
- BBCode and Markdown formatters will no longer return any `\n` and only use `<br/>` for newlines
- Mastodon API: Allow registration without email if email verification is not enabled
### Upgrade notes
#### Nginx only
1. Remove `proxy_ignore_headers Cache-Control;` and `proxy_hide_header Cache-Control;` from your config.
#### Everyone
1. Run database migrations (inside Pleroma directory):
- OTP: `./bin/pleroma_ctl migrate`
- From Source: `mix ecto.migrate`
2. Restart Pleroma
## [2.0.0] - 2020-03-08
### Security
- Mastodon API: Fix being able to request an enormous amount of statuses in timelines leading to DoS. Now limited to 40 per request.
### Removed
- **Breaking**: Removed 1.0+ deprecated configurations `Pleroma.Upload, :strip_exif` and `:instance, :dedupe_media`
- **Breaking**: OStatus protocol support
- **Breaking**: MDII uploader
- **Breaking**: Using third party engines for user recommendation
<details>
<summary>API Changes</summary>
- **Breaking**: AdminAPI: migrate_from_db endpoint
</details>
### Changed
- **Breaking:** Pleroma won't start if it detects unapplied migrations
- **Breaking:** Elixir >=1.8 is now required (was >= 1.7)
- **Breaking:** `Pleroma.Plugs.RemoteIp` and `:rate_limiter` enabled by default. Please ensure your reverse proxy forwards the real IP!
- **Breaking:** attachment links (`config :pleroma, :instance, no_attachment_links` and `config :pleroma, Pleroma.Upload, link_name`) disabled by default
- **Breaking:** OAuth: defaulted `[:auth, :enforce_oauth_admin_scope_usage]` setting to `true` which demands `admin` OAuth scope to perform admin actions (in addition to `is_admin` flag on User); make sure to use bundled or newer versions of AdminFE & PleromaFE to access admin / moderator features.
- **Breaking:** Dynamic configuration has been rearchitected. The `:pleroma, :instance, dynamic_configuration` setting has been replaced with `config :pleroma, configurable_from_database` (see the sketch after this list). Please back up your configuration to a file and run the migration task to ensure consistency with the new schema.
- **Breaking:** `:instance, no_attachment_links` has been replaced with `:instance, attachment_links` which still takes a boolean value but doesn't use double negative language.
- Replaced [pleroma_job_queue](https://git.pleroma.social/pleroma/pleroma_job_queue) and `Pleroma.Web.Federator.RetryQueue` with [Oban](https://github.com/sorentwo/oban) (see [`docs/config.md`](docs/config.md) on migrating customized worker / retry settings)
- Introduced [quantum](https://github.com/quantum-elixir/quantum-core) job scheduler
- Enabled `:instance, extended_nickname_format` in the default config
- Add `rel="ugc"` to all links in statuses, to prevent SEO spam
- Extract RSS functionality from OStatus
- MRF (Simple Policy): Also use `:accept`/`:reject` on the actors rather than only their activities
- OStatus: Extract RSS functionality
- Deprecated `User.Info` embedded schema (fields moved to `User`)
- Store status data inside Flag activity
- Deprecated (reorganized as `UserRelationship` entity) User fields with user AP IDs (`blocks`, `mutes`, `muted_reblogs`, `muted_notifications`, `subscribers`).
- Rate limiter is now disabled for localhost/socket (unless remoteip plug is enabled)
- Logger: default log level changed from `warn` to `info`.
- Config mix task `migrate_to_db` truncates `config` table before migrating the config file.
- Allow account registration without an email
- Default to `prepare: :unnamed` in the database configuration.
- Instance stats are now loaded on startup instead of being empty until next hourly job.
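
The replacement flag from the dynamic-configuration entry above, shown as a one-line sketch; `true` is the value that turns database configuration on:
```
# Replaces the old :instance, dynamic_configuration setting
config :pleroma, configurable_from_database: true
```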
<details>
<summary>API Changes</summary>
- **Breaking** EmojiReactions: Change endpoints and responses to align with Mastodon
- **Breaking** Admin API: `PATCH /api/pleroma/admin/users/:nickname/force_password_reset` is now `PATCH /api/pleroma/admin/users/force_password_reset` (accepts `nicknames` array in the request body)
- **Breaking:** Admin API: Return link along with token on password reset
- **Breaking:** Admin API: `PUT /api/pleroma/admin/reports/:id` is now `PATCH /api/pleroma/admin/reports`, see admin_api.md for details
- **Breaking:** `/api/pleroma/admin/users/invite_token` now uses `POST`, changed accepted params and returns full invite in json instead of only token string.
- **Breaking** replying to reports is now "report notes", endpoint changed from `POST /api/pleroma/admin/reports/:id/respond` to `POST /api/pleroma/admin/reports/:id/notes`
- Mastodon API: stopped sanitizing display names, field names and subject fields since they are supposed to be treated as plaintext
- Admin API: Return `total` when querying for reports
- Mastodon API: Return `pleroma.direct_conversation_id` when creating a direct message (`POST /api/v1/statuses`)
- Admin API: Return link along with token on password reset
- Admin API: Support authentication via `x-admin-token` HTTP header
- Mastodon API: Add `pleroma.direct_conversation_id` to the status endpoint (`GET /api/v1/statuses/:id`)
- Mastodon API: `pleroma.thread_muted` to the Status entity
- Mastodon API: Mark the direct conversation as read for the author when they send a new direct message
- Mastodon API, streaming: Add `pleroma.direct_conversation_id` to the `conversation` stream event payload.
- Admin API: Render whole status in grouped reports
- Mastodon API: User timelines will now respect blocks, unless you are getting the user timeline of somebody you blocked (which would be empty otherwise).
- Mastodon API: Favoriting / Repeating a post multiple times will now return the identical response every time. Before, executing that action twice would return an error ("already favorited") on the second try.
- Mastodon API: Limit timeline requests to 3 per timeline per 500ms per user/ip by default.
- Admin API: `PATCH /api/pleroma/admin/users/:nickname/credentials` and `GET /api/pleroma/admin/users/:nickname/credentials`
</details>
### Added
- `:chat_limit` option to limit chat characters.
- `cleanup_attachments` option to remove attachments along with statuses. Does not affect duplicate files and attachments without status. Enabling this will increase the load on the database when deleting statuses on larger instances.
- Refreshing poll results for remote polls
- Authentication: Added rate limit for password-authorized actions / login existence checks
- Static Frontend: Add the ability to render user profiles and notices server-side without requiring JS app.
- Mix task to re-count statuses for all users (`mix pleroma.count_statuses`)
- Mix task to list all users (`mix pleroma.user list`)
- Mix task to send a test email (`mix pleroma.email test`)
- Support for `X-Forwarded-For` and similar HTTP headers which are used by reverse proxies to pass a real user IP address to the backend. Must not be enabled unless your instance is behind at least one reverse proxy (such as Nginx, Apache HTTPD or Varnish Cache).
- MRF: New module which handles incoming posts based on their age. By default, all incoming posts that are older than 2 days will be unlisted and not shown to their followers.
- User notification settings: Add `privacy_option` option.
- Support for custom Elixir modules (such as MRF policies)
- User settings: Add _This account is a_ option.
- A new users admin digest email
- OAuth: admin scopes support (relevant setting: `[:auth, :enforce_oauth_admin_scope_usage]`).
- Add an option `authorized_fetch_mode` to require HTTP signatures for AP fetches.
- ActivityPub: support for `replies` collection (output for outgoing federation & fetching on incoming federation).
- Mix task to refresh counter cache (`mix pleroma.refresh_counter_cache`)
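
Hedged examples of the `:chat_limit` and `authorized_fetch_mode` options listed above; the parent config groups (`:instance`, `:activitypub`) are assumptions, and the values are placeholders:
```
# Limit chat messages to 5000 characters (example value)
config :pleroma, :instance, chat_limit: 5_000
# Require HTTP signatures for ActivityPub fetches
config :pleroma, :activitypub, authorized_fetch_mode: true
```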
<details>
<summary>API Changes</summary>
- Job queue stats to the healthcheck page
- Admin API: Add ability to fetch reports, grouped by status `GET /api/pleroma/admin/grouped_reports`
- Admin API: Add ability to require password reset
- Mastodon API: Account entities now include `follow_requests_count` (planned Mastodon 3.x addition)
- Pleroma API: `GET /api/v1/pleroma/accounts/:id/scrobbles` to get a list of recently scrobbled items
- Pleroma API: `POST /api/v1/pleroma/scrobble` to scrobble a media item
- Mastodon API: Add `upload_limit`, `avatar_upload_limit`, `background_upload_limit`, and `banner_upload_limit` to `/api/v1/instance`
- Mastodon API: Add `pleroma.unread_conversation_count` to the Account entity
- OAuth: support for hierarchical permissions / [Mastodon 2.4.3 OAuth permissions](https://docs.joinmastodon.org/api/permissions/)
- Metadata Link: Atom syndication Feed
- Mix task to re-count statuses for all users (`mix pleroma.count_statuses`)
- Mastodon API: Add `exclude_visibilities` parameter to the timeline and notification endpoints
- Admin API: `/users/:nickname/toggle_activation` endpoint is now deprecated in favor of: `/users/activate`, `/users/deactivate`, both accept `nicknames` array
- Admin API: Multiple endpoints now require `nicknames` array, instead of a single `nickname`:
- `POST/DELETE /api/pleroma/admin/users/:nickname/permission_group/:permission_group` are deprecated in favor of: `POST/DELETE /api/pleroma/admin/users/permission_group/:permission_group`
- `DELETE /api/pleroma/admin/users` (`nickname` query param or `nickname` sent in JSON body) is deprecated in favor of: `DELETE /api/pleroma/admin/users` (`nicknames` query array param or `nicknames` sent in JSON body)
- Admin API: Add `GET /api/pleroma/admin/relay` endpoint - lists all followed relays
- Pleroma API: `POST /api/v1/pleroma/conversations/read` to mark all conversations as read
- ActivityPub: Support `Move` activities
- Mastodon API: Add `/api/v1/markers` for managing timeline read markers
- Mastodon API: Add the `recipients` parameter to `GET /api/v1/conversations`
- Configuration: `feed` option for user atom feed.
- Pleroma API: Add Emoji reactions
- Admin API: Add `/api/pleroma/admin/instances/:instance/statuses` - lists all statuses from a given instance
- Admin API: Add `/api/pleroma/admin/users/:nickname/statuses` - lists all statuses from a given user
- Admin API: `PATCH /api/pleroma/users/confirm_email` to confirm email for multiple users, `PATCH /api/pleroma/users/resend_confirmation_email` to resend confirmation email for multiple users
- ActivityPub: Configurable `type` field of the actors.
- Mastodon API: `/api/v1/accounts/:id` has `source/pleroma/actor_type` field.
- Mastodon API: `/api/v1/update_credentials` accepts `actor_type` field.
- Captcha: Support native provider
- Captcha: Enable by default
- Mastodon API: Add support for `account_id` param to filter notifications by the account
- Mastodon API: Add `emoji_reactions` property to Statuses
- Mastodon API: Change emoji reaction reply format
- Notifications: Added `pleroma:emoji_reaction` notification type
- Mastodon API: Change emoji reaction reply format once more
- Configuration: `feed.logo` option for tag feed.
- Tag feed: `/tags/:tag.rss` - list public statuses by hashtag.
- Mastodon API: Add `reacted` property to `emoji_reactions`
- Pleroma API: Add reactions for a single emoji.
- ActivityPub: `[:activitypub, :note_replies_output_limit]` setting sets the number of note self-replies to output on outgoing federation.
- Admin API: `GET /api/pleroma/admin/stats` to get status count by visibility scope
- Admin API: `GET /api/pleroma/admin/statuses` - list all statuses (accepts `godmode` and `local_only`)
</details>
### Fixed
- Report emails now include functional links to profiles of remote user accounts
- Not being able to log in to some third-party apps when logged in to MastoFE
- MRF: `Delete` activities being exempt from MRF policies
- OTP releases: Not being able to configure OAuth expired token cleanup interval
- OTP releases: Not being able to configure HTML sanitization policy
- OTP releases: Not being able to change upload limit (again)
- Favorites timeline now ordered by favorite date instead of post date
- Support for cancellation of a follow request
<details>
<summary>API Changes</summary>
- Mastodon API: Fix private and direct statuses not being filtered out from the public timeline for an authenticated user (`GET /api/v1/timelines/public`)
- Mastodon API: Inability to get some local users by nickname in `/api/v1/accounts/:id_or_nickname`
- AdminAPI: If some status received reports both in the "new" format and "old" format it was considered reports on two different statuses (in the context of grouped reports)
- Admin API: Error when trying to update reports in the "old" format
- Mastodon API: Marking a conversation as read (`POST /api/v1/conversations/:id/read`) now no longer brings it to the top in the user's direct conversation list
</details>
## [1.1.9] - 2020-02-10
### Fixed
- OTP: Inability to set the upload limit (again)
- Not being able to pin polls
- Streaming API: incorrect handling of reblog mutes
- Rejecting the user when field length limit is exceeded
- OpenGraph provider: html entities in descriptions
## [1.1.8] - 2020-01-10
### Fixed
- Captcha generation issues
- Returned Kocaptcha endpoint to configuration
- Captcha validity is now 5 minutes
## [1.1.7] - 2019-12-13
### Fixed
- OTP: Inability to set the upload limit
- OTP: Inability to override node name/distribution type to run 2 Pleroma instances on the same machine
### Added
- Integrated captcha provider
### Changed
- Captcha enabled by default
- Default Captcha provider changed from `Pleroma.Captcha.Kocaptcha` to `Pleroma.Captcha.Native`
- Better `Cache-Control` header for static content
### Bundled Pleroma-FE Changes
#### Added
- Icons in the navigation panel
#### Fixed
- Improved support for unauthenticated viewing of private instances
#### Removed
- Whitespace hack on empty post content
## [1.1.6] - 2019-11-19
### Fixed
- Not being able to log in to third party apps when the browser is logged into mastofe
- Email confirmation not being required even when enabled
- Mastodon API: conversations API crashing when one status is malformed
### Bundled Pleroma-FE Changes
#### Added
- About page
- Meme arrows
#### Fixed
- Image modal not closing unless clicked outside of image
- Attachment upload spinner not being centered
- Follow counters showing 0 when they are actually hidden
## [1.1.5] - 2019-11-09
### Fixed
- Polls having different numbers in timelines/notifications/poll api endpoints due to cache desynchronization
- Pleroma API: OAuth token endpoint not being found when ".json" suffix is appended
### Changed
- Frontend bundle updated to [044c9ad0](https://git.pleroma.social/pleroma/pleroma-fe/commit/044c9ad0562af059dd961d50961a3880fca9c642)
## [1.1.4] - 2019-11-01
### Fixed
- Added a migration that fills up empty user.info fields to prevent breakage after previous unsafe migrations.
- Failure to migrate from pre-1.0.0 versions
- Mastodon API: Notification stream not including follow notifications
## [1.1.3] - 2019-10-25
### Fixed
- Blocked users showing up in notifications collapsed as if they were muted
- `pleroma_ctl` not working on Debian's default shell
## [1.1.2] - 2019-10-18
### Fixed
- `pleroma_ctl` trying to connect to a running instance when generating the config, which of course doesn't exist.
## [1.1.1] - 2019-10-18
### Fixed
- One of the migrations between 1.0.0 and 1.1.0 wiping the user info of the relay user because of unexpected behavior of postgresql's `jsonb_set`, resulting in an inability to post in the default configuration. If you were affected, please run the following query in the postgres console; the relay user will be recreated automatically:
```
delete from users where ap_id = 'https://your.instance.hostname/relay';
```
- Bad user search matches
## [1.1.0] - 2019-10-14
**Breaking:** The stable branch has been changed from `master` to `stable`. If you want to keep using 1.0, the `release/1.0` branch will receive security updates for 6 months after 1.1 release.
**OTP Note:** `pleroma_ctl` in 1.0 defaults to `master` and doesn't support specifying arbitrary branches, making `./pleroma_ctl update` fail. To fix this, fetch a version of `pleroma_ctl` from 1.1 using the command below and proceed with the update normally:
```
curl -Lo ./bin/pleroma_ctl 'https://git.pleroma.social/pleroma/pleroma/raw/develop/rel/files/bin/pleroma_ctl'
```
### Security
- Mastodon API: respect post privacy in `/api/v1/statuses/:id/{favourited,reblogged}_by`
### Removed
- **Breaking:** GNU Social API with Qvitter extensions support
- Emoji: Remove longfox emojis.
- Remove `Reply-To` header from report emails for admins.
- ActivityPub: The `/objects/:uuid/likes` endpoint.
### Changed
- **Breaking:** Configuration: An explicit `enabled` setting was added for the mailer, and it defaults to disabled; if you are using a mailer, add `config :pleroma, Pleroma.Emails.Mailer, enabled: true` to your config (a configuration sketch follows this list)
- **Breaking:** Configuration: `/media/` is now removed when `base_url` is configured, append `/media/` to your `base_url` config to keep the old behaviour if desired
- **Breaking:** `/api/pleroma/notifications/read` is moved to `/api/v1/pleroma/notifications/read` and now supports `max_id` and responds with Mastodon API entities.
- Configuration: added `config/description.exs`, from which `docs/config.md` is generated
- Configuration: OpenGraph and TwitterCard providers enabled by default
- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
- Federation: Return 403 errors when trying to request pages from a user's follower/following collections if they have `hide_followers`/`hide_follows` set
- NodeInfo: Return `skipThreadContainment` in `metadata` for the `skip_thread_containment` option
- NodeInfo: Return `mailerEnabled` in `metadata`
- Mastodon API: Unsubscribe followers when they unfollow a user
- Mastodon API: `pleroma.thread_muted` key in the Status entity
- AdminAPI: Add "godmode" while fetching user statuses (i.e. admin can see private statuses)
- Improve digest email template
- Pagination: (optional) return `total` alongside `items` when paginating
- The `Pleroma.FlakeId` module has been replaced with the `flake_id` library.
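A minimal sketch of the two breaking configuration changes above (the mailer switch and the upload `base_url`), assuming a typical secret config file; the CDN URL is illustrative:
```
import Config

# The mailer is now disabled unless explicitly enabled.
config :pleroma, Pleroma.Emails.Mailer, enabled: true

# "/media/" is no longer appended automatically when base_url is configured;
# append it yourself to keep the previous upload URLs.
config :pleroma, Pleroma.Upload, base_url: "https://cdn.example.com/media/"
```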
### Fixed
- Following from Osada
- Favorites timeline doing database-intensive queries
- Metadata rendering errors resulting in the entire page being inaccessible
- `federation_incoming_replies_max_depth` option being ignored in certain cases
- Mastodon API: Handling of search timeouts (`/api/v1/search` and `/api/v2/search`)
- Mastodon API: Misskey's endless polls being unable to render
- Mastodon API: Embedded relationships not being properly rendered in the Account entity of Status entity
- Mastodon API: Notifications endpoint crashing if one notification failed to render
- Mastodon API: `exclude_replies` is correctly handled again.
- Mastodon API: Add `account_id`, `type`, `offset`, and `limit` to search API (`/api/v1/search` and `/api/v2/search`)
- Mastodon API, streaming: Fix filtering of notifications based on blocks/mutes/thread mutes
- Mastodon API: Fix private and direct statuses not being filtered out from the public timeline for an authenticated user (`GET /api/v1/timelines/public`)
- Mastodon API: Ensure the `account` field is not empty when rendering Notification entities.
- Mastodon API: Inability to get some local users by nickname in `/api/v1/accounts/:id_or_nickname`
- Mastodon API: Blocks are now treated consistently between the Streaming API and the Timeline APIs
- Rich Media: Parser failing when no TTL can be found by image TTL setters
- Rich Media: The crawled URL is now spliced into the rich media data.
- ActivityPub S2S: sharedInbox usage has been mostly aligned with the rules in the AP specification.
- ActivityPub C2S: follower/following collection pages being inaccessible even when authenticated if `hide_followers`/`hide_follows` was set
- ActivityPub: Deactivated user deletion
- ActivityPub: Fix `/users/:nickname/inbox` crashing without an authenticated user
- MRF: fix ability to follow a relay when AntiFollowbotPolicy was enabled
- ActivityPub: Correct addressing of Undo.
- ActivityPub: Correct addressing of profile update activities.
- ActivityPub: Polls are now refreshed when necessary.
- Report emails now include functional links to profiles of remote user accounts
- Existing user id not being preserved on insert conflict
- Pleroma.Upload base_url was not automatically whitelisted by MediaProxy. Now your custom CDN or file hosting will be accessed directly as expected.
- Report email not being sent to admins when the reporter is a remote user
- Reverse Proxy limiting `max_body_length` was incorrectly defined and only checked `Content-Length` headers which may not be sufficient in some circumstances
### Added
- Expiring/ephemeral activities. All activities can have an `expires_at` value set, which controls when they should be deleted automatically.
- Mastodon API: in `post_status`, the `expires_in` parameter lets you set the number of seconds until an activity expires. It must be at least one hour.
- Mastodon API: all status JSON responses contain a `pleroma.expires_at` item which states when an activity will expire. The value is only shown to the user who created the activity. To everyone else it's empty.
- Configuration: `ActivityExpiration.enabled` controls whether expired activities will get deleted at the appropriate time. Enabled by default (a configuration sketch follows this list).
- Conversations: Add Pleroma-specific conversation endpoints and status posting extensions. Run the `bump_all_conversations` task again to create the necessary data.
- MRF: Support for priming the mediaproxy cache (`Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicy`)
- MRF: Support for excluding specific domains from Transparency.
- MRF: Support for filtering posts based on who they mention (`Pleroma.Web.ActivityPub.MRF.MentionPolicy`)
- Mastodon API: Support for the [`tagged` filter](https://github.com/tootsuite/mastodon/pull/9755) in [`GET /api/v1/accounts/:id/statuses`](https://docs.joinmastodon.org/api/rest/accounts/#get-api-v1-accounts-id-statuses)
- Mastodon API, streaming: Add support for passing the token in the `Sec-WebSocket-Protocol` header
- Mastodon API, extension: Ability to reset avatar, profile banner, and background
- Mastodon API: Add support for `fields_attributes` API parameter (setting custom fields)
- Mastodon API: Add support for categories for custom emojis by reusing the group feature. <https://github.com/tootsuite/mastodon/pull/11196>
- Mastodon API: Add support for muting/unmuting notifications
- Mastodon API: Add support for the `blocked_by` attribute in the relationship API (`GET /api/v1/accounts/relationships`). <https://github.com/tootsuite/mastodon/pull/10373>
- Mastodon API: Add support for the `domain_blocking` attribute in the relationship API (`GET /api/v1/accounts/relationships`).
- Mastodon API: Add `pleroma.deactivated` to the Account entity
- Mastodon API: added `/auth/password` endpoint for password reset with rate limit.
- Mastodon API: `/api/v1/accounts/:id/statuses` now supports nicknames as well as user IDs
- Mastodon API: Improve support for the user profile custom fields
- Mastodon API: Added an endpoint to get multiple statuses by IDs (`GET /api/v1/statuses/?ids[]=1&ids[]=2`)
- Admin API: Return users' tags when querying reports
- Admin API: Return avatar and display name when querying users
- Admin API: Allow querying user by ID
- Admin API: Added support for `tuples`.
- Admin API: Added endpoints to run the mix tasks `pleroma.config migrate_to_db` & `pleroma.config migrate_from_db`
- Added synchronization of following/followers counters for external users
- Configuration: `enabled` option for `Pleroma.Emails.Mailer`, defaulting to `false`.
- Configuration: Pleroma.Plugs.RateLimiter `bucket_name`, `params` options.
- Configuration: `user_bio_length` and `user_name_length` options.
- Addressable lists
- Twitter API: added rate limit for `/api/account/password_reset` endpoint.
- ActivityPub: Add an internal service actor for fetching ActivityPub objects.
- ActivityPub: Optional signing of ActivityPub object fetches.
- Admin API: Endpoint for fetching a user's latest statuses
- Pleroma API: Add `/api/v1/pleroma/accounts/confirmation_resend?email=<email>` for resending account confirmation.
- Pleroma API: Email change endpoint.
- Admin API: Added moderation log
- Web response cache (currently, enabled for ActivityPub)
- Reverse Proxy: Do not retry failed requests to limit pressure on the peer
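A brief sketch of a few of the new configuration options from this release. The `:instance` values mirror the defaults shipped in `config/config.exs`; the `ActivityExpiration` key is inferred from the notation above and should be treated as an assumption:
```
import Config

# Delete expired activities at the appropriate time (assumed key, per the note above).
config :pleroma, ActivityExpiration, enabled: true

# New length limits for user bios and display names (default values).
config :pleroma, :instance,
  user_bio_length: 5000,
  user_name_length: 100
```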
### Changed
- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
- Admin API: changed json structure for saving config settings.
- RichMedia: parsers and their order are configured in `rich_media` config.
- RichMedia: add the rich media ttl based on image expiration time.
## [1.0.7] - 2019-09-26
### Fixed
- Broken federation on Erlang 22 (previous versions of hackney http client were using an option that got deprecated)
### Changed
- ActivityPub: The first page in inboxes/outboxes is no longer embedded.
## [1.0.6] - 2019-08-14
### Fixed
- MRF: fix use of unserializable keyword lists in describe() implementations
- ActivityPub S2S: POST requests are now signed with `(request-target)` pseudo-header.
## [1.0.5] - 2019-08-13
### Fixed
- Mastodon API: follower/following counters not being nullified, when `hide_follows`/`hide_followers` is set
- Mastodon API: `muted` in the Status entity, using author's account to determine if the thread was muted
- Mastodon API: return the actual profile URL in the Account entity's `url` property when appropriate
- Templates: properly style anchor tags
- Objects being re-embedded to activities after being updated (e.g. faved/reposted). Running `mix pleroma.database prune_objects` again is advised.
- Not being able to access the Mastodon FE login page on private instances
- MRF: ensure that subdomain_match calls are case-insensitive
- Fix internal server error when using the healthcheck API.
### Added
- **Breaking:** MRF describe API, which adds support for exposing configuration information about MRF policies to NodeInfo.
Custom modules will need to be updated by adding, at the very least, `def describe, do: {:ok, %{}}` to the MRF policy modules (a minimal sketch follows this list).
- Relays: Added a task to list relay subscriptions.
- MRF: Support for filtering posts based on ActivityStreams vocabulary (`Pleroma.Web.ActivityPub.MRF.VocabularyPolicy`)
- MRF (Simple Policy): Support for wildcard domains.
- Support for wildcard domains in user domain blocks setting.
- Configuration: `quarantined_instances` support wildcard domains.
- Mix Tasks: `mix pleroma.database fix_likes_collections`
- Configuration: `federation_incoming_replies_max_depth` option
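For reference, a minimal sketch of a custom MRF policy updated for the describe API; the module name is hypothetical, and only `describe/0` is what this change requires:
```
defmodule MyInstance.MRF.NoopPolicy do
  # Pass activities through unchanged; filter/1 is the usual MRF callback.
  def filter(activity), do: {:ok, activity}

  # New requirement: expose (possibly empty) configuration info to NodeInfo.
  def describe, do: {:ok, %{}}
end
```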
### Removed
- Federation: Remove `likes` from objects.
- **Breaking:** ActivityPub: The `accept_blocks` configuration setting.
## [1.0.4] - 2019-08-01
### Fixed
- Invalid SemVer version generation when the current branch has no commits ahead of the tag or is checked out directly on a tag
## [1.0.3] - 2019-07-31
### Security
- OStatus: eliminate the possibility of a protocol downgrade attack.
- OStatus: prevent following locked accounts, bypassing the approval process.
- TwitterAPI: use CommonAPI to handle remote follows instead of OStatus.
## [1.0.2] - 2019-07-28
### Fixed
- Not being able to pin unlisted posts
- Mastodon API: represent poll IDs as strings
- MediaProxy: fix matching filenames
- MediaProxy: fix filename encoding
- Migrations: fix a sporadic migration failure
- Metadata rendering errors resulting in the entire page being inaccessible
- Federation/MediaProxy not working with instances that have wrong certificate order
- ActivityPub S2S: remote user deletions now work the same as local user deletions.
### Changed
- Configuration: OpenGraph and TwitterCard providers enabled by default
- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
## [1.0.1] - 2019-07-14
### Security
- OStatus: fix an object spoofing vulnerability.
## [1.0.0] - 2019-06-29
### Security
- Mastodon API: Fix display names not being sanitized
- Rich media: Do not crawl private IP ranges
### Added
- Digest email for inactive users
- Add a generic settings store for frontends / clients to use.
- Explicit addressing option for posting.
- Optional SSH access mode. (Needs `erlang-ssh` package on some distributions).
- [MongooseIM](https://github.com/esl/MongooseIM) http authentication support.
- LDAP authentication
- External OAuth provider authentication
- Support for building a release using [`mix release`](https://hexdocs.pm/mix/master/Mix.Tasks.Release.html)
- A [job queue](https://git.pleroma.social/pleroma/pleroma_job_queue) for federation, emails, web push, etc.
- [Prometheus](https://prometheus.io/) metrics
- Support for Mastodon's remote interaction
- Mix Tasks: `mix pleroma.database bump_all_conversations`
- Mix Tasks: `mix pleroma.database remove_embedded_objects`
- Mix Tasks: `mix pleroma.database update_users_following_followers_counts`
- Mix Tasks: `mix pleroma.user toggle_confirmed`
- Mix Tasks: `mix pleroma.config migrate_to_db`
- Mix Tasks: `mix pleroma.config migrate_from_db`
- Federation: Support for `Question` and `Answer` objects
- Federation: Support for reports
- Configuration: `poll_limits` option
- Configuration: `pack_extensions` option
- Configuration: `safe_dm_mentions` option
- Configuration: `link_name` option
- Configuration: `fetch_initial_posts` option
- Configuration: `notify_email` option
- Configuration: Media proxy `whitelist` option
- Configuration: `report_uri` option
- Configuration: `email_notifications` option
- Configuration: `limit_to_local_content` option
- Pleroma API: User subscriptions
- Pleroma API: Healthcheck endpoint
- Pleroma API: `/api/v1/pleroma/mascot` per-user frontend mascot configuration endpoints
- Admin API: Endpoints for listing/revoking invite tokens
- Admin API: Endpoints for making users follow/unfollow each other
- Admin API: added filters (role, tags, email, name) for users endpoint
- Admin API: Endpoints for managing reports
- Admin API: Endpoints for deleting and changing the scope of individual reported statuses
- Admin API: Endpoints to view and change config settings.
- AdminFE: initial release with basic user management accessible at /pleroma/admin/
- Mastodon API: Add chat token to `verify_credentials` response
- Mastodon API: Add background image setting to `update_credentials`
- Mastodon API: [Scheduled statuses](https://docs.joinmastodon.org/api/rest/scheduled-statuses/)
- Mastodon API: `/api/v1/notifications/destroy_multiple` (glitch-soc extension)
- Mastodon API: `/api/v1/pleroma/accounts/:id/favourites` (API extension)
- Mastodon API: [Reports](https://docs.joinmastodon.org/api/rest/reports/)
- Mastodon API: `POST /api/v1/accounts` (account creation API)
- Mastodon API: [Polls](https://docs.joinmastodon.org/api/rest/polls/)
- ActivityPub C2S: OAuth endpoints
- Metadata: RelMe provider
- OAuth: added support for refresh tokens
- Emoji packs and emoji pack manager
- Object pruning (`mix pleroma.database prune_objects`)
- OAuth: added job to clean expired access tokens
- MRF: Support for rejecting reports from specific instances (`mrf_simple`)
- MRF: Support for stripping avatars and banner images from specific instances (`mrf_simple`)
- MRF: Support for running subchains (a configuration sketch follows this list).
- Configuration: `skip_thread_containment` option
- Configuration: `rate_limit` option. See `Pleroma.Plugs.RateLimiter` documentation for details.
- MRF: Support for filtering out likely spam messages by rejecting posts from new users that contain links.
- Configuration: `ignore_hosts` option
- Configuration: `ignore_tld` option
- Configuration: default syslog tag "Pleroma" is now lowercased to "pleroma"
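A sketch of what a subchain configuration might look like; the actor pattern is illustrative, and `DropPolicy` stands in for whatever policies you want to chain:
```
# Run an extra policy chain only for activities from matching actors.
config :pleroma, :mrf_subchain,
  match_actor: %{
    ~r{^https://nasty\.example\.org/} => [Pleroma.Web.ActivityPub.MRF.DropPolicy]
  }
```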
### Changed
- **Breaking:** bind to 127.0.0.1 instead of 0.0.0.0 by default
- **Breaking:** Configuration: move from Pleroma.Mailer to Pleroma.Emails.Mailer
- Thread containment / test for complete visibility will be skipped by default.
- Enforcement of OAuth scopes
- Add multiple-use and time-expiring invite tokens
- Restyled OAuth pages to fit with Pleroma's default theme
- Link/mention/hashtag detection is now handled by [auto_linker](https://git.pleroma.social/pleroma/auto_linker)
- NodeInfo: Return `safe_dm_mentions` feature flag
- Federation: Expand the audience of delete activities to all recipients of the deleted object
- Federation: Removed `inReplyToStatusId` from objects
- Configuration: Dedupe enabled by default
- Configuration: Default log level in `prod` environment is now set to `warn`
- Configuration: Added `extra_cookie_attrs` for setting non-standard cookie attributes. Defaults to ["SameSite=Lax"] so that remote follows work.
- Timelines: Messages involving people you have blocked will be excluded from the timeline in all cases instead of just repeats.
- Admin API: Move the user related API to `api/pleroma/admin/users`
- Admin API: `POST /api/pleroma/admin/users` will take list of users
- Pleroma API: Support for emoji tags in `/api/pleroma/emoji` resulting in a breaking API change
- Mastodon API: Support for `exclude_types`, `limit` and `min_id` in `/api/v1/notifications`
- Mastodon API: Add `languages` and `registrations` to `/api/v1/instance`
- Mastodon API: Provide plaintext versions of cw/content in the Status entity
- Mastodon API: Add `pleroma.conversation_id`, `pleroma.in_reply_to_account_acct` fields to the Status entity
- Mastodon API: Add `pleroma.tags`, `pleroma.relationship{}`, `pleroma.is_moderator`, `pleroma.is_admin`, `pleroma.confirmation_pending`, `pleroma.hide_followers`, `pleroma.hide_follows`, `pleroma.hide_favorites` fields to the User entity
- Mastodon API: Add `pleroma.show_role`, `pleroma.no_rich_text` fields to the Source subentity
- Mastodon API: Add support for updating `no_rich_text`, `hide_followers`, `hide_follows`, `hide_favorites`, `show_role` in `PATCH /api/v1/update_credentials`
- Mastodon API: Add `pleroma.is_seen` to the Notification entity
- Mastodon API: Add `pleroma.local` to the Status entity
- Mastodon API: Add `preview` parameter to `POST /api/v1/statuses`
- Mastodon API: Add `with_muted` parameter to timeline endpoints
- Mastodon API: Actual reblog hiding instead of a dummy
- Mastodon API: Remove attachment limit in the Status entity
- Mastodon API: Added support for `max_id` & `since_id` in bookmark timeline endpoints.
- Deps: Updated Cowboy to 2.6
- Deps: Updated Ecto to 3.0.7
- Don't ship finmoji by default, they can be installed as an emoji pack
- Hide deactivated users and their statuses
- Posts which are marked sensitive or tagged nsfw no longer have link previews.
- HTTP connection timeout is now set to 10 seconds.
- Respond with a 404 "Not implemented" JSON error message when the requested API is not implemented
- Rich Media: crawl only https URLs.
### Fixed
- Follow requests don't get 'stuck' anymore.
- Added an FTS index on objects. Running `vacuum analyze` and setting a larger `work_mem` is recommended.
- Followers counter not being updated when a follower is blocked
- Deactivated users being able to request an access token
- Limit on request body in rich media/relme parsers being ignored resulting in a possible memory leak
- Proper Twitter Card generation instead of a dummy
- Deletions failing for users with a large number of posts
- NodeInfo: Include admins in `staffAccounts`
- ActivityPub: Crashing when requesting empty local user's outbox
- Federation: Handling of objects without `summary` property
- Federation: Add a language tag to activities as required by ActivityStreams 2.0
- Federation: Do not federate avatar/banner if set to default, allowing other servers/clients to use their defaults
- Federation: Cope with missing or explicitly nulled address lists
- Federation: Explicitly ensure activities addressed to `as:Public` become addressed to the followers collection
- Federation: Better cope with actors which do not declare a followers collection and use `as:Public` with these semantics
- Federation: Follow requests from remote users who have been blocked will be automatically rejected if appropriate
- MediaProxy: Parse name from content disposition headers even for non-whitelisted types
- MediaProxy: S3 link encoding
- Rich Media: Reject any data which cannot be explicitly encoded into JSON
- Pleroma API: Importing follows from Mastodon 2.8+
- Twitter API: Exposing default scope, `no_rich_text` of the user to anyone
- Twitter API: Returning the `role` object in user entity despite `show_role = false`
- Mastodon API: `/api/v1/favourites` serving only public activities
- Mastodon API: Reblogs having `in_reply_to_id` set to `null` even when they are replies
- Mastodon API: Streaming API broadcasting wrong activity id
- Mastodon API: 500 errors when requesting a card for a private conversation
- Mastodon API: Handling of `reblogs` in `/api/v1/accounts/:id/follow`
- Mastodon API: Correct `reblogged`, `favourited`, and `bookmarked` values in the reblog status JSON
- Mastodon API: Exposing default scope of the user to anyone
- Mastodon API: Make `irreversible` field default to `false` [`POST /api/v1/filters`]
- Mastodon API: Replace missing non-nullable Card attributes with empty strings
- User-Agent is now sent correctly for all HTTP requests.
- MRF: Simple policy now properly delists imported or relayed statuses
### Removed
- Configuration: `config :pleroma, :fe` in favor of the more flexible `config :pleroma, :frontend_configurations`
## [0.9.99999] - 2019-05-31
### Security
- Mastodon API: Fix lists leaking private posts
## [0.9.9999] - 2019-04-05
### Security
- Mastodon API: Fix content warnings skipping HTML sanitization
## [0.9.999] - 2019-03-13
Frontend changes only.
### Added
- Added floating action button for posting status on mobile
### Changed
- Changed user-settings icon to a pencil
### Fixed
- Keyboard shortcuts activating when typing a message
- Gaps when scrolling down on a timeline after showing new
## [0.9.99] - 2019-03-08
### Changed
- Update the frontend to the 0.9.99 tag
### Fixed
- Sign the date header in federation to fix Mastodon federation.
## [0.9.9] - 2019-02-22
This is our first stable release.
diff --git a/changelog.d/atom-leak.skip b/changelog.d/atom-leak.skip
new file mode 100644
index 000000000..e69de29bb
diff --git a/changelog.d/bandit.change b/changelog.d/bandit.change
new file mode 100644
index 000000000..7a1104314
--- /dev/null
+++ b/changelog.d/bandit.change
@@ -0,0 +1 @@
+Support Bandit as an alternative to Cowboy for the HTTP server.
diff --git a/changelog.d/bugfix-ccworks.fix b/changelog.d/bugfix-ccworks.fix
new file mode 100644
index 000000000..658e27b86
--- /dev/null
+++ b/changelog.d/bugfix-ccworks.fix
@@ -0,0 +1 @@
+Fix federation with Convergence AP Bridge
\ No newline at end of file
diff --git a/changelog.d/config-stat-symlink.fix b/changelog.d/config-stat-symlink.fix
new file mode 100644
index 000000000..c8b98225d
--- /dev/null
+++ b/changelog.d/config-stat-symlink.fix
@@ -0,0 +1 @@
+- Config: Check the permissions of the linked file instead of the symlink
diff --git a/changelog.d/content-length.fix b/changelog.d/content-length.fix
new file mode 100644
index 000000000..dee906a9d
--- /dev/null
+++ b/changelog.d/content-length.fix
@@ -0,0 +1 @@
+MediaProxy was setting the content-length header which is not permitted by RFC9112§6.2 when we are chunking the reply as it conflicts with the existence of the transfer-encoding header.
diff --git a/changelog.d/dialyzer4.skip b/changelog.d/dialyzer4.skip
new file mode 100644
index 000000000..e69de29bb
diff --git a/changelog.d/gun-logs.skip b/changelog.d/gun-logs.skip
new file mode 100644
index 000000000..e69de29bb
diff --git a/changelog.d/gun_pool.fix b/changelog.d/gun_pool.fix
new file mode 100644
index 000000000..94ec9103d
--- /dev/null
+++ b/changelog.d/gun_pool.fix
@@ -0,0 +1 @@
+Fix logic error in Gun connection pooling which prevented retries even when the worker was launched with retry = true
diff --git a/changelog.d/memleak.fix b/changelog.d/memleak.fix
new file mode 100644
index 000000000..2465921c0
--- /dev/null
+++ b/changelog.d/memleak.fix
@@ -0,0 +1 @@
+Fix a memory leak caused by Websocket connections that would not enter a state where a full garbage collection run could be triggered.
diff --git a/changelog.d/mergeback-2.6.2.skip b/changelog.d/mergeback-2.6.2.skip
new file mode 100644
index 000000000..e69de29bb
diff --git a/changelog.d/notifications-index.fix b/changelog.d/notifications-index.fix
new file mode 100644
index 000000000..4617cbec0
--- /dev/null
+++ b/changelog.d/notifications-index.fix
@@ -0,0 +1 @@
+Fix notifications query which was not using the index properly
diff --git a/changelog.d/oauth-nickname.skip b/changelog.d/oauth-nickname.skip
new file mode 100644
index 000000000..02f16e06c
--- /dev/null
+++ b/changelog.d/oauth-nickname.skip
@@ -0,0 +1 @@
+Use User.full_nickname/1 in oauth html template
\ No newline at end of file
diff --git a/changelog.d/rich_media.fix b/changelog.d/rich_media.fix
new file mode 100644
index 000000000..08f119550
--- /dev/null
+++ b/changelog.d/rich_media.fix
@@ -0,0 +1 @@
+Rich Media Preview cache eviction when the activity is updated.
diff --git a/changelog.d/rich_media_tests.skip b/changelog.d/rich_media_tests.skip
new file mode 100644
index 000000000..e69de29bb
diff --git a/changelog.d/tesla.deps b/changelog.d/tesla.deps
new file mode 100644
index 000000000..799bbc670
--- /dev/null
+++ b/changelog.d/tesla.deps
@@ -0,0 +1 @@
+Update Tesla HTTP client middleware to 1.8.0
diff --git a/changelog.d/websocket-refactor.change b/changelog.d/websocket-refactor.change
new file mode 100644
index 000000000..3c447832b
--- /dev/null
+++ b/changelog.d/websocket-refactor.change
@@ -0,0 +1 @@
+Refactor the Mastodon /api/v1/streaming websocket handler to use Phoenix.Socket.Transport
diff --git a/config/config.exs b/config/config.exs
index 48e94dc4d..a47b7e34a 100644
--- a/config/config.exs
+++ b/config/config.exs
@@ -1,935 +1,928 @@
# .i;;;;i.
# iYcviii;vXY:
# .YXi .i1c.
# .YC. . in7.
# .vc. ...... ;1c.
# i7, .. .;1;
# i7, .. ... .Y1i
# ,7v .6MMM@; .YX,
# .7;. ..IMMMMMM1 :t7.
# .;Y. ;$MMMMMM9. :tc.
# vY. .. .nMMM@MMU. ;1v.
# i7i ... .#MM@M@C. .....:71i
# it: .... $MMM@9;.,i;;;i,;tti
# :t7. ..... 0MMMWv.,iii:::,,;St.
# .nC. ..... IMMMQ..,::::::,.,czX.
# .ct: ....... .ZMMMI..,:::::::,,:76Y.
# c2: ......,i..Y$M@t..:::::::,,..inZY
# vov ......:ii..c$MBc..,,,,,,,,,,..iI9i
# i9Y ......iii:..7@MA,..,,,,,,,,,....;AA:
# iIS. ......:ii::..;@MI....,............;Ez.
# .I9. ......:i::::...8M1..................C0z.
# .z9; ......:i::::,.. .i:...................zWX.
# vbv ......,i::::,,. ................. :AQY
# c6Y. .,...,::::,,..:t0@@QY. ................ :8bi
# :6S. ..,,...,:::,,,..EMMMMMMI. ............... .;bZ,
# :6o, .,,,,..:::,,,..i#MMMMMM#v................. YW2.
# .n8i ..,,,,,,,::,,,,.. tMMMMM@C:.................. .1Wn
# 7Uc. .:::,,,,,::,,,,.. i1t;,..................... .UEi
# 7C...::::::::::::,,,,.. .................... vSi.
# ;1;...,,::::::,......... .................. Yz:
# v97,......... .voC.
# izAotX7777777777777777777777777777777777777777Y7n92:
# .;CoIIIIIUAA666666699999ZZZZZZZZZZZZZZZZZZZZ6ov.
#
# !!! ATTENTION !!!
# DO NOT EDIT THIS FILE! THIS FILE CONTAINS THE DEFAULT VALUES FOR THE CON-
# FIGURATION! EDIT YOUR SECRET FILE (either prod.secret.exs, dev.secret.exs).
#
# This file is responsible for configuring your application
# and its dependencies with the aid of the Config module.
#
# This configuration file is loaded before any dependency and
# is restricted to this project.
import Config
# General application configuration
config :pleroma, ecto_repos: [Pleroma.Repo]
config :pleroma, Pleroma.Repo,
telemetry_event: [Pleroma.Repo.Instrumenter],
migration_lock: nil
config :pleroma, Pleroma.Captcha,
enabled: true,
seconds_valid: 300,
method: Pleroma.Captcha.Native
config :pleroma, Pleroma.Captcha.Kocaptcha, endpoint: "https://captcha.kotobank.ch"
# Upload configuration
config :pleroma, Pleroma.Upload,
uploader: Pleroma.Uploaders.Local,
filters: [Pleroma.Upload.Filter.Dedupe],
link_name: false,
proxy_remote: false,
filename_display_max_length: 30,
default_description: nil,
base_url: nil
config :pleroma, Pleroma.Uploaders.Local, uploads: "uploads"
config :pleroma, Pleroma.Uploaders.S3,
bucket: nil,
bucket_namespace: nil,
truncated_namespace: nil,
streaming_enabled: true
config :ex_aws, :s3,
# host: "s3.wasabisys.com", # required if not Amazon AWS
access_key_id: nil,
secret_access_key: nil,
# region: "us-east-1", # may be required for Amazon AWS
scheme: "https://"
config :pleroma, :emoji,
shortcode_globs: ["/emoji/custom/**/*.png"],
pack_extensions: [".png", ".gif"],
groups: [
Custom: ["/emoji/*.png", "/emoji/**/*.png"]
],
default_manifest: "https://git.pleroma.social/pleroma/emoji-index/raw/master/index.json",
shared_pack_cache_seconds_per_file: 60
config :pleroma, :uri_schemes,
valid_schemes: [
"https",
"http",
"dat",
"dweb",
"gopher",
"hyper",
"ipfs",
"ipns",
"irc",
"ircs",
"magnet",
"mailto",
"mumble",
"ssb",
"xmpp"
]
# Configures the endpoint
config :pleroma, Pleroma.Web.Endpoint,
url: [host: "localhost"],
http: [
- ip: {127, 0, 0, 1},
- dispatch: [
- {:_,
- [
- {"/api/v1/streaming", Pleroma.Web.MastodonAPI.WebsocketHandler, []},
- {:_, Plug.Cowboy.Handler, {Pleroma.Web.Endpoint, []}}
- ]}
- ]
+ ip: {127, 0, 0, 1}
],
protocol: "https",
secret_key_base: "aK4Abxf29xU9TTDKre9coZPUgevcVCFQJe/5xP/7Lt4BEif6idBIbjupVbOrbKxl",
live_view: [signing_salt: "U5ELgdEwTD3n1+D5s0rY0AMg8/y1STxZ3Zvsl3bWh+oBcGrYdil0rXqPMRd3Glcq"],
signing_salt: "CqaoopA2",
render_errors: [view: Pleroma.Web.ErrorView, accepts: ~w(json)],
pubsub_server: Pleroma.PubSub,
secure_cookie_flag: true,
extra_cookie_attrs: [
"SameSite=Lax"
]
# Configures Elixir's Logger
config :logger, :console,
level: :debug,
format: "\n$time $metadata[$level] $message\n",
metadata: [:request_id]
config :logger, :ex_syslogger,
level: :debug,
ident: "pleroma",
format: "$metadata[$level] $message",
metadata: [:request_id]
config :mime, :types, %{
"application/xml" => ["xml"],
"application/xrd+xml" => ["xrd+xml"],
"application/jrd+json" => ["jrd+json"],
"application/activity+json" => ["activity+json"],
"application/ld+json" => ["activity+json"]
}
config :tesla, adapter: Tesla.Adapter.Hackney
# Configures http settings, upstream proxy etc.
config :pleroma, :http,
proxy_url: nil,
send_user_agent: true,
user_agent: :default,
adapter: []
config :pleroma, :instance,
name: "Pleroma",
email: "example@example.com",
notify_email: "noreply@example.com",
description: "Pleroma: An efficient and flexible fediverse server",
short_description: "",
background_image: "/images/city.jpg",
instance_thumbnail: "/instance/thumbnail.jpeg",
favicon: "/favicon.png",
limit: 5_000,
description_limit: 5_000,
remote_limit: 100_000,
upload_limit: 16_000_000,
avatar_upload_limit: 2_000_000,
background_upload_limit: 4_000_000,
banner_upload_limit: 4_000_000,
poll_limits: %{
max_options: 20,
max_option_chars: 200,
min_expiration: 0,
max_expiration: 365 * 24 * 60 * 60
},
registrations_open: true,
invites_enabled: false,
account_activation_required: false,
account_approval_required: false,
federating: true,
federation_incoming_replies_max_depth: 100,
federation_reachability_timeout_days: 7,
allow_relay: true,
public: true,
quarantined_instances: [],
static_dir: "instance/static/",
allowed_post_formats: [
"text/plain",
"text/html",
"text/markdown",
"text/bbcode"
],
autofollowed_nicknames: [],
autofollowing_nicknames: [],
max_pinned_statuses: 1,
attachment_links: false,
max_report_comment_size: 1000,
report_strip_status: true,
safe_dm_mentions: false,
healthcheck: false,
remote_post_retention_days: 90,
skip_thread_containment: true,
limit_to_local_content: :unauthenticated,
user_bio_length: 5000,
user_name_length: 100,
max_account_fields: 10,
max_remote_account_fields: 20,
account_field_name_length: 512,
account_field_value_length: 2048,
registration_reason_length: 500,
external_user_synchronization: true,
extended_nickname_format: true,
cleanup_attachments: false,
multi_factor_authentication: [
totp: [
# digits 6 or 8
digits: 6,
period: 30
],
backup_codes: [
number: 5,
length: 16
]
],
show_reactions: true,
password_reset_token_validity: 60 * 60 * 24,
profile_directory: true,
admin_privileges: [
:users_read,
:users_manage_invites,
:users_manage_activation_state,
:users_manage_tags,
:users_manage_credentials,
:users_delete,
:messages_read,
:messages_delete,
:instances_delete,
:reports_manage_reports,
:moderation_log_read,
:announcements_manage_announcements,
:emoji_manage_emoji,
:statistics_read
],
moderator_privileges: [:messages_delete, :reports_manage_reports],
max_endorsed_users: 20,
birthday_required: false,
birthday_min_age: 0,
max_media_attachments: 1_000
config :pleroma, :welcome,
direct_message: [
enabled: false,
sender_nickname: nil,
message: nil
],
chat_message: [
enabled: false,
sender_nickname: nil,
message: nil
],
email: [
enabled: false,
sender: nil,
subject: "Welcome to <%= instance_name %>",
html: "Welcome to <%= instance_name %>",
text: "Welcome to <%= instance_name %>"
]
config :pleroma, :feed,
post_title: %{
max_length: 100,
omission: "..."
}
config :pleroma, :markup,
# XXX - unfortunately, inline images must be enabled by default right now, because
# of custom emoji. Issue #275 discusses defanging that somehow.
allow_inline_images: true,
allow_headings: false,
allow_tables: false,
allow_fonts: false,
scrub_policy: [
Pleroma.HTML.Scrubber.Default,
Pleroma.HTML.Transform.MediaProxy
]
config :pleroma, :frontend_configurations,
pleroma_fe: %{
alwaysShowSubjectInput: true,
background: "/images/city.jpg",
collapseMessageWithSubject: false,
disableChat: false,
greentext: false,
hideFilteredStatuses: false,
hideMutedPosts: false,
hidePostStats: false,
hideSitename: false,
hideUserStats: false,
loginMethod: "password",
logo: "/static/logo.svg",
logoMargin: ".1em",
logoMask: true,
minimalScopesMode: false,
noAttachmentLinks: false,
nsfwCensorImage: "",
postContentType: "text/plain",
redirectRootLogin: "/main/friends",
redirectRootNoLogin: "/main/all",
scopeCopy: true,
sidebarRight: false,
showFeaturesPanel: true,
showInstanceSpecificPanel: false,
subjectLineBehavior: "email",
theme: "pleroma-dark",
webPushNotifications: false
}
config :pleroma, :assets,
mascots: [
pleroma_fox_tan: %{
url: "/images/pleroma-fox-tan-smol.png",
mime_type: "image/png"
},
pleroma_fox_tan_shy: %{
url: "/images/pleroma-fox-tan-shy.png",
mime_type: "image/png"
}
],
default_mascot: :pleroma_fox_tan
config :pleroma, :manifest,
icons: [
%{
src: "/static/logo.svg",
sizes: "144x144",
purpose: "any",
type: "image/svg+xml"
}
],
theme_color: "#282c37",
background_color: "#191b22"
config :pleroma, :activitypub,
unfollow_blocked: true,
outgoing_blocks: true,
blockers_visible: true,
follow_handshake_timeout: 500,
note_replies_output_limit: 5,
sign_object_fetches: true,
authorized_fetch_mode: false
config :pleroma, :streamer,
workers: 3,
overflow_workers: 2
config :pleroma, :user, deny_follow_blocked: true
config :pleroma, :mrf_normalize_markup, scrub_policy: Pleroma.HTML.Scrubber.Default
config :pleroma, :mrf_rejectnonpublic,
allow_followersonly: false,
allow_direct: false
config :pleroma, :mrf_hellthread,
delist_threshold: 10,
reject_threshold: 20
config :pleroma, :mrf_simple,
media_removal: [],
media_nsfw: [],
federated_timeline_removal: [],
report_removal: [],
reject: [],
followers_only: [],
accept: [],
avatar_removal: [],
banner_removal: [],
reject_deletes: []
config :pleroma, :mrf_keyword,
reject: [],
federated_timeline_removal: [],
replace: []
config :pleroma, :mrf_emoji,
remove_url: [],
remove_shortcode: [],
federated_timeline_removal_url: [],
federated_timeline_removal_shortcode: []
config :pleroma, :mrf_hashtag,
sensitive: ["nsfw"],
reject: [],
federated_timeline_removal: []
config :pleroma, :mrf_subchain, match_actor: %{}
config :pleroma, :mrf_activity_expiration, days: 365
config :pleroma, :mrf_vocabulary,
accept: [],
reject: []
# threshold of 7 days
config :pleroma, :mrf_object_age,
threshold: 604_800,
actions: [:delist, :strip_followers]
config :pleroma, :mrf_follow_bot, follower_nickname: nil
config :pleroma, :mrf_inline_quote, template: "<bdi>RT:</bdi> {url}"
config :pleroma, :rich_media,
enabled: true,
ignore_hosts: [],
ignore_tld: ["local", "localdomain", "lan"],
parsers: [
Pleroma.Web.RichMedia.Parsers.TwitterCard,
Pleroma.Web.RichMedia.Parsers.OEmbed
],
failure_backoff: 60_000,
ttl_setters: [Pleroma.Web.RichMedia.Parser.TTL.AwsSignedUrl]
config :pleroma, :media_proxy,
enabled: false,
invalidation: [
enabled: false,
provider: Pleroma.Web.MediaProxy.Invalidation.Script
],
proxy_opts: [
redirect_on_failure: false,
max_body_length: 25 * 1_048_576,
# Note: max_read_duration defaults to Pleroma.ReverseProxy.max_read_duration_default/1
max_read_duration: 30_000,
http: [
follow_redirect: true,
pool: :media
]
],
whitelist: []
config :pleroma, Pleroma.Web.MediaProxy.Invalidation.Http,
method: :purge,
headers: [],
options: []
config :pleroma, Pleroma.Web.MediaProxy.Invalidation.Script,
script_path: nil,
url_format: nil
# Note: media preview proxy depends on media proxy to be enabled
config :pleroma, :media_preview_proxy,
enabled: false,
thumbnail_max_width: 600,
thumbnail_max_height: 600,
image_quality: 85,
min_content_length: 100 * 1024
config :pleroma, :shout,
enabled: true,
limit: 5_000
config :phoenix, :format_encoders, json: Jason, "activity+json": Jason, ics: ICalendar
config :phoenix, :json_library, Jason
config :phoenix, :filter_parameters, ["password", "confirm"]
config :pleroma, :gopher,
enabled: false,
ip: {0, 0, 0, 0},
port: 9999
config :pleroma, Pleroma.Web.Metadata,
providers: [
Pleroma.Web.Metadata.Providers.OpenGraph,
Pleroma.Web.Metadata.Providers.TwitterCard
],
unfurl_nsfw: false
config :pleroma, Pleroma.Web.Preload,
providers: [
Pleroma.Web.Preload.Providers.Instance
]
config :pleroma, :http_security,
enabled: true,
sts: false,
sts_max_age: 31_536_000,
ct_max_age: 2_592_000,
referrer_policy: "same-origin"
config :cors_plug,
max_age: 86_400,
methods: ["POST", "PUT", "DELETE", "GET", "PATCH", "OPTIONS"],
expose: [
"Link",
"X-RateLimit-Reset",
"X-RateLimit-Limit",
"X-RateLimit-Remaining",
"X-Request-Id",
"Idempotency-Key"
],
credentials: true,
headers: ["Authorization", "Content-Type", "Idempotency-Key"]
config :pleroma, Pleroma.User,
restricted_nicknames: [
".well-known",
"~",
"about",
"activities",
"api",
"auth",
"check_password",
"dev",
"friend-requests",
"inbox",
"internal",
"main",
"media",
"nodeinfo",
"notice",
"oauth",
"objects",
"ostatus_subscribe",
"pleroma",
"proxy",
"push",
"registration",
"relay",
"settings",
"status",
"tag",
"user-search",
"user_exists",
"users",
"web",
"verify_credentials",
"update_credentials",
"relationships",
"search",
"confirmation_resend",
"mfa"
],
email_blacklist: []
config :pleroma, Oban,
repo: Pleroma.Repo,
log: false,
queues: [
activity_expiration: 10,
token_expiration: 5,
filter_expiration: 1,
backup: 1,
federator_incoming: 5,
federator_outgoing: 5,
ingestion_queue: 50,
web_push: 50,
mailer: 10,
transmogrifier: 20,
scheduled_activities: 10,
poll_notifications: 10,
background: 5,
remote_fetcher: 2,
attachments_cleanup: 1,
new_users_digest: 1,
mute_expire: 5,
search_indexing: 10
],
plugins: [Oban.Plugins.Pruner],
crontab: [
{"0 0 * * 0", Pleroma.Workers.Cron.DigestEmailsWorker},
{"0 0 * * *", Pleroma.Workers.Cron.NewUsersDigestWorker}
]
config :pleroma, :workers,
retries: [
federator_incoming: 5,
federator_outgoing: 5,
search_indexing: 2
]
config :pleroma, Pleroma.Formatter,
class: false,
rel: "ugc",
new_window: false,
truncate: false,
strip_prefix: false,
extra: true,
validate_tld: :no_scheme
config :pleroma, :ldap,
enabled: System.get_env("LDAP_ENABLED") == "true",
host: System.get_env("LDAP_HOST") || "localhost",
port: String.to_integer(System.get_env("LDAP_PORT") || "389"),
ssl: System.get_env("LDAP_SSL") == "true",
sslopts: [],
tls: System.get_env("LDAP_TLS") == "true",
tlsopts: [],
base: System.get_env("LDAP_BASE") || "dc=example,dc=com",
uid: System.get_env("LDAP_UID") || "cn"
oauth_consumer_strategies =
System.get_env("OAUTH_CONSUMER_STRATEGIES")
|> to_string()
|> String.split()
|> Enum.map(&hd(String.split(&1, ":")))
ueberauth_providers =
for strategy <- oauth_consumer_strategies do
strategy_module_name = "Elixir.Ueberauth.Strategy.#{String.capitalize(strategy)}"
strategy_module = String.to_atom(strategy_module_name)
{String.to_atom(strategy), {strategy_module, [callback_params: ["state"]]}}
end
config :ueberauth,
Ueberauth,
base_path: "/oauth",
providers: ueberauth_providers
config :pleroma, :auth, oauth_consumer_strategies: oauth_consumer_strategies
config :pleroma, Pleroma.Emails.Mailer, adapter: Swoosh.Adapters.Sendmail, enabled: false
config :pleroma, Pleroma.Emails.UserEmail,
logo: nil,
styling: %{
link_color: "#d8a070",
background_color: "#2C3645",
content_background_color: "#1B2635",
header_color: "#d8a070",
text_color: "#b9b9ba",
text_muted_color: "#b9b9ba"
}
config :pleroma, Pleroma.Emails.NewUsersDigestEmail, enabled: false
config :pleroma, Pleroma.PromEx,
disabled: false,
manual_metrics_start_delay: :no_delay,
drop_metrics_groups: [],
grafana: [
host: System.get_env("GRAFANA_HOST", "http://localhost:3000"),
auth_token: System.get_env("GRAFANA_TOKEN"),
upload_dashboards_on_start: false,
folder_name: "BEAM",
annotate_app_lifecycle: true
],
metrics_server: [
port: 4021,
path: "/metrics",
protocol: :http,
pool_size: 5,
cowboy_opts: [],
auth_strategy: :none
],
datasource: "Prometheus"
config :pleroma, Pleroma.ScheduledActivity,
daily_user_limit: 25,
total_user_limit: 300,
enabled: true
config :pleroma, :email_notifications,
digest: %{
active: false,
interval: 7,
inactivity_threshold: 7
}
config :pleroma, :oauth2,
token_expires_in: 3600 * 24 * 365 * 100,
issue_new_refresh_token: true,
clean_expired_tokens: false
config :pleroma, :database, rum_enabled: false
config :pleroma, :features, improved_hashtag_timeline: :auto
config :pleroma, :populate_hashtags_table, fault_rate_allowance: 0.01
config :pleroma, :delete_context_objects, fault_rate_allowance: 0.01
config :pleroma, :env, Mix.env()
config :http_signatures,
adapter: Pleroma.Signature
config :pleroma, :rate_limit,
authentication: {60_000, 15},
timeline: {500, 3},
search: [{1000, 10}, {1000, 30}],
app_account_creation: {1_800_000, 25},
relations_actions: {10_000, 10},
relation_id_action: {60_000, 2},
statuses_actions: {10_000, 15},
status_id_action: {60_000, 3},
events_actions: {10_000, 15},
password_reset: {1_800_000, 5},
account_confirmation_resend: {8_640_000, 5},
ap_routes: {60_000, 15}
config :pleroma, Pleroma.Workers.PurgeExpiredActivity, enabled: true, min_lifetime: 600
config :pleroma, Pleroma.Web.Plugs.RemoteIp,
enabled: true,
headers: ["x-forwarded-for"],
proxies: [],
reserved: [
"127.0.0.0/8",
"::1/128",
"fc00::/7",
"10.0.0.0/8",
"172.16.0.0/12",
"192.168.0.0/16"
]
config :pleroma, :static_fe, enabled: false
# Example of frontend configuration
# This example will make us serve the primary frontend from the
# frontends directory within your `:pleroma, :instance, static_dir`.
# e.g., instance/static/frontends/pleroma/develop/
#
# With no frontend configuration, the bundled files from the `static` directory will
# be used.
#
# config :pleroma, :frontends,
# primary: %{"name" => "pleroma-fe", "ref" => "develop"},
# admin: %{"name" => "admin-fe", "ref" => "stable"},
# available: %{...}
config :pleroma, :frontends,
available: %{
"kenoma" => %{
"name" => "kenoma",
"git" => "https://git.pleroma.social/lambadalambda/kenoma",
"build_url" =>
"https://git.pleroma.social/lambadalambda/kenoma/-/jobs/artifacts/${ref}/download?job=build",
"ref" => "master"
},
"pleroma-fe" => %{
"name" => "pleroma-fe",
"git" => "https://git.pleroma.social/pleroma/pleroma-fe",
"build_url" =>
"https://git.pleroma.social/pleroma/pleroma-fe/-/jobs/artifacts/${ref}/download?job=build",
"ref" => "develop"
},
"fedi-fe" => %{
"name" => "fedi-fe",
"git" => "https://git.pleroma.social/pleroma/fedi-fe",
"build_url" =>
"https://git.pleroma.social/pleroma/fedi-fe/-/jobs/artifacts/${ref}/download?job=build_release",
"ref" => "master",
"custom-http-headers" => [
{"service-worker-allowed", "/"}
]
},
"admin-fe" => %{
"name" => "admin-fe",
"git" => "https://git.pleroma.social/pleroma/admin-fe",
"build_url" =>
"https://git.pleroma.social/pleroma/admin-fe/-/jobs/artifacts/${ref}/download?job=build",
"ref" => "develop"
},
"soapbox" => %{
"name" => "soapbox",
"git" => "https://gitlab.com/soapbox-pub/soapbox",
"build_url" =>
"https://gitlab.com/soapbox-pub/soapbox/-/jobs/artifacts/${ref}/download?job=build-production",
"ref" => "v3.0.0-beta.1",
"build_dir" => "static"
},
"glitch-lily" => %{
"name" => "glitch-lily",
"git" => "https://lily-is.land/infra/glitch-lily",
"build_url" =>
"https://lily-is.land/infra/glitch-lily/-/jobs/artifacts/${ref}/download?job=build",
"ref" => "servant",
"build_dir" => "public"
}
}
config :pleroma, :web_cache_ttl,
activity_pub: nil,
activity_pub_question: 30_000
config :pleroma, :modules, runtime_dir: "instance/modules"
config :pleroma, configurable_from_database: false
config :pleroma, Pleroma.Repo,
parameters: [gin_fuzzy_search_limit: "500"],
prepare: :unnamed
config :pleroma, :connections_pool,
reclaim_multiplier: 0.1,
connection_acquisition_wait: 250,
connection_acquisition_retries: 5,
max_connections: 250,
max_idle_time: 30_000,
retry: 0,
connect_timeout: 5_000
config :pleroma, :pools,
federation: [
size: 50,
max_waiting: 10,
recv_timeout: 10_000
],
media: [
size: 50,
max_waiting: 20,
recv_timeout: 15_000
],
upload: [
size: 25,
max_waiting: 5,
recv_timeout: 15_000
],
default: [
size: 10,
max_waiting: 2,
recv_timeout: 5_000
]
config :pleroma, :hackney_pools,
federation: [
max_connections: 50,
timeout: 150_000
],
media: [
max_connections: 50,
timeout: 150_000
],
upload: [
max_connections: 25,
timeout: 300_000
]
config :pleroma, :majic_pool, size: 2
private_instance? = :if_instance_is_private
config :pleroma, :restrict_unauthenticated,
timelines: %{local: private_instance?, federated: private_instance?},
profiles: %{local: private_instance?, remote: private_instance?},
activities: %{local: private_instance?, remote: private_instance?}
config :pleroma, Pleroma.Web.ApiSpec.CastAndValidate, strict: false
config :pleroma, :mrf,
policies: [
Pleroma.Web.ActivityPub.MRF.ObjectAgePolicy,
Pleroma.Web.ActivityPub.MRF.TagPolicy,
Pleroma.Web.ActivityPub.MRF.InlineQuotePolicy
],
transparency: true,
transparency_exclusions: []
config :tzdata, :http_client, Pleroma.HTTP.Tzdata
config :ex_aws, http_client: Pleroma.HTTP.ExAws
config :web_push_encryption, http_client: Pleroma.HTTP.WebPush
config :pleroma, :instances_favicons, enabled: false
config :floki, :html_parser, Floki.HTMLParser.FastHtml
config :pleroma, Pleroma.Web.Auth.Authenticator, Pleroma.Web.Auth.PleromaAuthenticator
config :pleroma, Pleroma.User.Backup,
purge_after_days: 30,
limit_days: 7,
dir: nil,
process_wait_time: 30_000,
process_chunk_size: 100
config :pleroma, ConcurrentLimiter, [
{Pleroma.Web.RichMedia.Helpers, [max_running: 5, max_waiting: 5]},
{Pleroma.Web.ActivityPub.MRF.MediaProxyWarmingPolicy, [max_running: 5, max_waiting: 5]},
{Pleroma.Search, [max_running: 30, max_waiting: 50]}
]
config :pleroma, Pleroma.Web.WebFinger, domain: nil, update_nickname_on_user_fetch: true
config :pleroma, Pleroma.Search, module: Pleroma.Search.DatabaseSearch
config :pleroma, Pleroma.Search.Meilisearch,
url: "http://127.0.0.1:7700/",
private_key: nil,
initial_indexing_chunk_size: 100_000
config :pleroma, Pleroma.Application,
background_migrators: true,
internal_fetch: true,
load_custom_modules: true,
max_restarts: 3,
streamer_registry: true
config :pleroma, Pleroma.Uploaders.Uploader, timeout: 30_000
config :geospatial, Geospatial.Service, service: Geospatial.Providers.Nominatim
config :geospatial, Geospatial.Providers.GoogleMaps,
api_key: nil,
fetch_place_details: true
config :geospatial, Geospatial.Providers.Nominatim,
endpoint: "https://nominatim.openstreetmap.org",
api_key: nil
config :geospatial, Geospatial.Providers.Pelias,
endpoint: "https://api.geocode.earth",
api_key: nil
config :geospatial, Geospatial.HTTP, user_agent: &Pleroma.Application.user_agent/0
# Import environment specific config. This must remain at the bottom
# of this file so it overrides the configuration defined above.
import_config "#{Mix.env()}.exs"
diff --git a/config/dev.exs b/config/dev.exs
index ab3e83c12..fe8de5045 100644
--- a/config/dev.exs
+++ b/config/dev.exs
@@ -1,71 +1,70 @@
import Config
# For development, we disable any cache and enable
# debugging and code reloading.
#
# The watchers configuration can be used to run external
# watchers to your application. For example, we use it
# with brunch.io to recompile .js and .css sources.
config :pleroma, Pleroma.Web.Endpoint,
http: [
- port: 4000,
- protocol_options: [max_request_line_length: 8192, max_header_value_length: 8192]
+ port: 4000
],
protocol: "http",
debug_errors: true,
code_reloader: true,
check_origin: false,
watchers: [],
secure_cookie_flag: false
config :pleroma, Pleroma.Emails.Mailer, adapter: Swoosh.Adapters.Local
# ## SSL Support
#
# In order to use HTTPS in development, a self-signed
# certificate can be generated by running the following
# command from your terminal:
#
# openssl req -new -newkey rsa:4096 -days 365 -nodes -x509 -subj "/C=US/ST=Denial/L=Springfield/O=Dis/CN=www.example.com" -keyout priv/server.key -out priv/server.pem
#
# The `http:` config above can be replaced with:
#
# https: [port: 4000, keyfile: "priv/server.key", certfile: "priv/server.pem"],
#
# If desired, both `http:` and `https:` keys can be
# configured to run both http and https servers on
# different ports.
# Do not include metadata nor timestamps in development logs
config :logger, :console, format: "[$level] $message\n"
# Set a higher stacktrace during development. Avoid configuring such
# in production as building large stacktraces may be expensive.
config :phoenix, :stacktrace_depth, 20
# Configure your database
config :pleroma, Pleroma.Repo,
adapter: Ecto.Adapters.Postgres,
username: "postgres",
password: "postgres",
database: "pleroma_dev",
hostname: "localhost",
pool_size: 10
config :pleroma, Pleroma.Web.ApiSpec.CastAndValidate, strict: true
# Reduce recompilation time
# https://dashbit.co/blog/speeding-up-re-compilation-of-elixir-projects
config :phoenix, :plug_init_mode, :runtime
if File.exists?("./config/dev.secret.exs") do
import_config "dev.secret.exs"
else
IO.puts(
:stderr,
"!!! RUNNING IN LOCALHOST DEV MODE! !!!\nFEDERATION WON'T WORK UNTIL YOU CONFIGURE A dev.secret.exs"
)
end
if File.exists?("./config/dev.exported_from_db.secret.exs"),
do: import_config("dev.exported_from_db.secret.exs")
diff --git a/lib/mix/tasks/pleroma/emoji.ex b/lib/mix/tasks/pleroma/emoji.ex
index 537f0715e..8b9c921c8 100644
--- a/lib/mix/tasks/pleroma/emoji.ex
+++ b/lib/mix/tasks/pleroma/emoji.ex
@@ -1,275 +1,275 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.Emoji do
use Mix.Task
import Mix.Pleroma
@shortdoc "Manages emoji packs"
@moduledoc File.read!("docs/administration/CLI_tasks/emoji.md")
def run(["ls-packs" | args]) do
start_pleroma()
{options, [], []} = parse_global_opts(args)
url_or_path = options[:manifest] || default_manifest()
manifest = fetch_and_decode!(url_or_path)
Enum.each(manifest, fn {name, info} ->
to_print = [
{"Name", name},
{"Homepage", info["homepage"]},
{"Description", info["description"]},
{"License", info["license"]},
{"Source", info["src"]}
]
for {param, value} <- to_print do
IO.puts(IO.ANSI.format([:bright, param, :normal, ": ", value]))
end
# A newline
IO.puts("")
end)
end
def run(["get-packs" | args]) do
start_pleroma()
{options, pack_names, []} = parse_global_opts(args)
url_or_path = options[:manifest] || default_manifest()
manifest = fetch_and_decode!(url_or_path)
for pack_name <- pack_names do
if Map.has_key?(manifest, pack_name) do
pack = manifest[pack_name]
src = pack["src"]
IO.puts(
IO.ANSI.format([
"Downloading ",
:bright,
pack_name,
:normal,
" from ",
:underline,
src
])
)
{:ok, binary_archive} = fetch(src)
archive_sha = :crypto.hash(:sha256, binary_archive) |> Base.encode16()
sha_status_text = ["SHA256 of ", :bright, pack_name, :normal, " source file is ", :bright]
if archive_sha == String.upcase(pack["src_sha256"]) do
IO.puts(IO.ANSI.format(sha_status_text ++ [:green, "OK"]))
else
IO.puts(IO.ANSI.format(sha_status_text ++ [:red, "BAD"]))
raise "Bad SHA256 for #{pack_name}"
end
# The location specified in files should be in the same directory
files_loc =
url_or_path
|> Path.dirname()
|> Path.join(pack["files"])
IO.puts(
IO.ANSI.format([
"Fetching the file list for ",
:bright,
pack_name,
:normal,
" from ",
:underline,
files_loc
])
)
files = fetch_and_decode!(files_loc)
IO.puts(IO.ANSI.format(["Unpacking ", :bright, pack_name]))
pack_path =
Path.join([
Pleroma.Config.get!([:instance, :static_dir]),
"emoji",
pack_name
])
files_to_unzip =
Enum.map(
files,
fn {_, f} -> to_charlist(f) end
)
{:ok, _} =
:zip.unzip(binary_archive,
- cwd: pack_path,
+ cwd: String.to_charlist(pack_path),
file_list: files_to_unzip
)
IO.puts(IO.ANSI.format(["Writing pack.json for ", :bright, pack_name]))
pack_json = %{
pack: %{
"license" => pack["license"],
"homepage" => pack["homepage"],
"description" => pack["description"],
"fallback-src" => pack["src"],
"fallback-src-sha256" => pack["src_sha256"],
"share-files" => true
},
files: files
}
File.write!(Path.join(pack_path, "pack.json"), Jason.encode!(pack_json, pretty: true))
else
IO.puts(IO.ANSI.format([:bright, :red, "No pack named \"#{pack_name}\" found"]))
end
end
end
def run(["gen-pack" | args]) do
start_pleroma()
{opts, [src], []} =
OptionParser.parse(
args,
strict: [
name: :string,
license: :string,
homepage: :string,
description: :string,
files: :string,
extensions: :string
]
)
proposed_name = Path.basename(src) |> Path.rootname()
name = get_option(opts, :name, "Pack name:", proposed_name)
license = get_option(opts, :license, "License:")
homepage = get_option(opts, :homepage, "Homepage:")
description = get_option(opts, :description, "Description:")
proposed_files_name = "#{name}_files.json"
files_name = get_option(opts, :files, "Save file list to:", proposed_files_name)
default_exts = [".png", ".gif"]
custom_exts =
get_option(
opts,
:extensions,
"Emoji file extensions (separated with spaces):",
Enum.join(default_exts, " ")
)
|> String.split(" ", trim: true)
exts =
if MapSet.equal?(MapSet.new(default_exts), MapSet.new(custom_exts)) do
default_exts
else
custom_exts
end
IO.puts("Using #{Enum.join(exts, " ")} extensions")
IO.puts("Downloading the pack and generating SHA256")
{:ok, %{body: binary_archive}} = Pleroma.HTTP.get(src)
archive_sha = :crypto.hash(:sha256, binary_archive) |> Base.encode16()
IO.puts("SHA256 is #{archive_sha}")
pack_json = %{
name => %{
license: license,
homepage: homepage,
description: description,
src: src,
src_sha256: archive_sha,
files: files_name
}
}
tmp_pack_dir = Path.join(System.tmp_dir!(), "emoji-pack-#{name}")
{:ok, _} = :zip.unzip(binary_archive, cwd: String.to_charlist(tmp_pack_dir))
emoji_map = Pleroma.Emoji.Loader.make_shortcode_to_file_map(tmp_pack_dir, exts)
File.write!(files_name, Jason.encode!(emoji_map, pretty: true))
IO.puts("""
#{files_name} has been created and contains the list of all found emojis in the pack.
Please review the files in the pack and remove those not needed.
""")
pack_file = "#{name}.json"
if File.exists?(pack_file) do
existing_data = File.read!(pack_file) |> Jason.decode!()
File.write!(
pack_file,
Jason.encode!(
Map.merge(
existing_data,
pack_json
),
pretty: true
)
)
IO.puts("#{pack_file} has been updated with the #{name} pack")
else
File.write!(pack_file, Jason.encode!(pack_json, pretty: true))
IO.puts("#{pack_file} has been created with the #{name} pack")
end
end
def run(["reload"]) do
start_pleroma()
Pleroma.Emoji.reload()
IO.puts("Emoji packs have been reloaded.")
end
defp fetch_and_decode!(from) do
with {:ok, json} <- fetch(from) do
Jason.decode!(json)
else
{:error, error} -> raise "#{from} cannot be fetched. Error: #{error}"
end
end
defp fetch("http" <> _ = from) do
with {:ok, %{body: body}} <- Pleroma.HTTP.get(from) do
{:ok, body}
end
end
defp fetch(path), do: File.read(path)
defp parse_global_opts(args) do
OptionParser.parse(
args,
strict: [
manifest: :string
],
aliases: [
m: :manifest
]
)
end
defp default_manifest, do: Pleroma.Config.get!([:emoji, :default_manifest])
end
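For reference, a minimal sketch of the manifest entry that the gen-pack task above writes to `<name>.json`; the pack name, URLs, hash and license below are made up:
```
%{
  "blobcat" => %{
    license: "CC BY-NC 4.0",
    homepage: "https://example.com",
    description: "Blobcat emoji",
    src: "https://example.com/blobcat.zip",
    src_sha256: "9F86D081884C7D659A2FEAA0C55AD015A3BF4F1B2B0B822CD15D6C15B0F00A08",
    files: "blobcat_files.json"
  }
}
```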
diff --git a/lib/phoenix/transports/web_socket/raw.ex b/lib/phoenix/transports/web_socket/raw.ex
deleted file mode 100644
index cf4fda79f..000000000
--- a/lib/phoenix/transports/web_socket/raw.ex
+++ /dev/null
@@ -1,93 +0,0 @@
-# Pleroma: A lightweight social networking server
-# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
-# SPDX-License-Identifier: AGPL-3.0-only
-
-defmodule Phoenix.Transports.WebSocket.Raw do
- import Plug.Conn,
- only: [
- fetch_query_params: 1,
- send_resp: 3
- ]
-
- alias Phoenix.Socket.Transport
-
- def default_config do
- [
- timeout: 60_000,
- transport_log: false,
- cowboy: Phoenix.Endpoint.CowboyWebSocket
- ]
- end
-
- def init(%Plug.Conn{method: "GET"} = conn, {endpoint, handler, transport}) do
- {_, opts} = handler.__transport__(transport)
-
- conn =
- conn
- |> fetch_query_params
- |> Transport.transport_log(opts[:transport_log])
- |> Transport.check_origin(handler, endpoint, opts)
-
- case conn do
- %{halted: false} = conn ->
- case handler.connect(%{
- endpoint: endpoint,
- transport: transport,
- options: [serializer: nil],
- params: conn.params
- }) do
- {:ok, socket} ->
- {:ok, conn, {__MODULE__, {socket, opts}}}
-
- :error ->
- send_resp(conn, :forbidden, "")
- {:error, conn}
- end
-
- _ ->
- {:error, conn}
- end
- end
-
- def init(conn, _) do
- send_resp(conn, :bad_request, "")
- {:error, conn}
- end
-
- def ws_init({socket, config}) do
- Process.flag(:trap_exit, true)
- {:ok, %{socket: socket}, config[:timeout]}
- end
-
- def ws_handle(op, data, state) do
- state.socket.handler
- |> apply(:handle, [op, data, state])
- |> case do
- {op, data} ->
- {:reply, {op, data}, state}
-
- {op, data, state} ->
- {:reply, {op, data}, state}
-
- %{} = state ->
- {:ok, state}
-
- _ ->
- {:ok, state}
- end
- end
-
- def ws_info({_, _} = tuple, state) do
- {:reply, tuple, state}
- end
-
- def ws_info(_tuple, state), do: {:ok, state}
-
- def ws_close(state) do
- ws_handle(:closed, :normal, state)
- end
-
- def ws_terminate(reason, state) do
- ws_handle(:closed, reason, state)
- end
-end
diff --git a/lib/pleroma/activity/html.ex b/lib/pleroma/activity/html.ex
index 706b2d36c..ba284b4d5 100644
--- a/lib/pleroma/activity/html.ex
+++ b/lib/pleroma/activity/html.ex
@@ -1,81 +1,81 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Activity.HTML do
alias Pleroma.HTML
alias Pleroma.Object
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
# We store a list of cache keys related to an activity in a
# separate cache, scrubber_management_cache. It has the same
# size as scrubber_cache (see application.ex). Every time we add
# an entry to scrubber_cache, we update scrubber_management_cache.
#
# The most recent write of a certain key in the management cache
# is the same as the most recent write of any record related to that
# key in the main cache.
# Assuming LRW (https://hexdocs.pm/cachex/Cachex.Policy.LRW.html),
# this means that when the management cache is evicted by cachex, all
# related records in the main cache will also have been evicted.
defp get_cache_keys_for(activity_id) do
with {:ok, list} when is_list(list) <- @cachex.get(:scrubber_management_cache, activity_id) do
list
else
_ -> []
end
end
- defp add_cache_key_for(activity_id, additional_key) do
+ def add_cache_key_for(activity_id, additional_key) do
current = get_cache_keys_for(activity_id)
unless additional_key in current do
@cachex.put(:scrubber_management_cache, activity_id, [additional_key | current])
end
end
def invalidate_cache_for(activity_id) do
keys = get_cache_keys_for(activity_id)
Enum.map(keys, &@cachex.del(:scrubber_cache, &1))
@cachex.del(:scrubber_management_cache, activity_id)
end
def get_cached_scrubbed_html_for_activity(
content,
scrubbers,
activity,
key \\ "",
callback \\ fn x -> x end
) do
key = "#{key}#{generate_scrubber_signature(scrubbers)}|#{activity.id}"
@cachex.fetch!(:scrubber_cache, key, fn _key ->
object = Object.normalize(activity, fetch: false)
add_cache_key_for(activity.id, key)
HTML.ensure_scrubbed_html(content, scrubbers, object.data["fake"] || false, callback)
end)
end
def get_cached_stripped_html_for_activity(content, activity, key) do
get_cached_scrubbed_html_for_activity(
content,
FastSanitize.Sanitizer.StripTags,
activity,
key,
&HtmlEntities.decode/1
)
end
defp generate_scrubber_signature(scrubber) when is_atom(scrubber) do
generate_scrubber_signature([scrubber])
end
defp generate_scrubber_signature(scrubbers) do
Enum.reduce(scrubbers, "", fn scrubber, signature ->
"#{signature}#{to_string(scrubber)}"
end)
end
end
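A minimal sketch of the cache key scheme above, assuming an empty key prefix, an activity id of "AbCd" and the StripTags scrubber used by the stripped-HTML helper:
```
"#{to_string(FastSanitize.Sanitizer.StripTags)}|AbCd"
# => "Elixir.FastSanitize.Sanitizer.StripTags|AbCd"
```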
diff --git a/lib/pleroma/caching.ex b/lib/pleroma/caching.ex
index eb0588708..796a465af 100644
--- a/lib/pleroma/caching.ex
+++ b/lib/pleroma/caching.ex
@@ -1,19 +1,22 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Caching do
@callback get!(Cachex.cache(), any()) :: any()
@callback get(Cachex.cache(), any()) :: {atom(), any()}
@callback put(Cachex.cache(), any(), any(), Keyword.t()) :: {Cachex.status(), boolean()}
@callback put(Cachex.cache(), any(), any()) :: {Cachex.status(), boolean()}
@callback fetch!(Cachex.cache(), any(), function() | nil) :: any()
+ @callback fetch(Cachex.cache(), any(), function() | nil) ::
+ {atom(), any()} | {atom(), any(), any()}
# @callback del(Cachex.cache(), any(), Keyword.t()) :: {Cachex.status(), boolean()}
@callback del(Cachex.cache(), any()) :: {Cachex.status(), boolean()}
@callback stream!(Cachex.cache(), any()) :: Enumerable.t()
@callback expire_at(Cachex.cache(), binary(), number()) :: {Cachex.status(), boolean()}
+ @callback expire(Cachex.cache(), binary(), number()) :: {Cachex.status(), boolean()}
@callback exists?(Cachex.cache(), any()) :: {Cachex.status(), boolean()}
@callback execute!(Cachex.cache(), function()) :: any()
@callback get_and_update(Cachex.cache(), any(), function()) ::
{:commit | :ignore, any()}
end
diff --git a/lib/pleroma/config/deprecation_warnings.ex b/lib/pleroma/config/deprecation_warnings.ex
index a77923264..58d164dc7 100644
--- a/lib/pleroma/config/deprecation_warnings.ex
+++ b/lib/pleroma/config/deprecation_warnings.ex
@@ -1,418 +1,418 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Config.DeprecationWarnings do
alias Pleroma.Config
require Logger
alias Pleroma.Config
@type config_namespace() :: atom() | [atom()]
@type config_map() :: {config_namespace(), config_namespace(), String.t()}
@mrf_config_map [
{[:instance, :rewrite_policy], [:mrf, :policies],
"\n* `config :pleroma, :instance, rewrite_policy` is now `config :pleroma, :mrf, policies`"},
{[:instance, :mrf_transparency], [:mrf, :transparency],
"\n* `config :pleroma, :instance, mrf_transparency` is now `config :pleroma, :mrf, transparency`"},
{[:instance, :mrf_transparency_exclusions], [:mrf, :transparency_exclusions],
"\n* `config :pleroma, :instance, mrf_transparency_exclusions` is now `config :pleroma, :mrf, transparency_exclusions`"}
]
def check_exiftool_filter do
filters = Config.get([Pleroma.Upload]) |> Keyword.get(:filters, [])
if Pleroma.Upload.Filter.Exiftool in filters do
Logger.warning("""
!!!DEPRECATION WARNING!!!
Your config is using Exiftool as a filter instead of Exiftool.StripLocation. This should work for now, but you are advised to change to the new configuration to prevent possible issues later:
```
config :pleroma, Pleroma.Upload,
filters: [Pleroma.Upload.Filter.Exiftool]
```
Is now
```
config :pleroma, Pleroma.Upload,
filters: [Pleroma.Upload.Filter.Exiftool.StripLocation]
```
""")
new_config =
filters
|> Enum.map(fn
Pleroma.Upload.Filter.Exiftool -> Pleroma.Upload.Filter.Exiftool.StripLocation
filter -> filter
end)
Config.put([Pleroma.Upload, :filters], new_config)
:error
else
:ok
end
end
def check_simple_policy_tuples do
has_strings =
Config.get([:mrf_simple])
|> Enum.any?(fn {_, v} -> Enum.any?(v, &is_binary/1) end)
if has_strings do
Logger.warning("""
!!!DEPRECATION WARNING!!!
Your config is using strings in the SimplePolicy configuration instead of tuples. They should work for now, but you are advised to change to the new configuration to prevent possible issues later:
```
config :pleroma, :mrf_simple,
media_removal: ["instance.tld"],
media_nsfw: ["instance.tld"],
federated_timeline_removal: ["instance.tld"],
report_removal: ["instance.tld"],
reject: ["instance.tld"],
followers_only: ["instance.tld"],
accept: ["instance.tld"],
avatar_removal: ["instance.tld"],
banner_removal: ["instance.tld"],
reject_deletes: ["instance.tld"]
```
Is now
```
config :pleroma, :mrf_simple,
media_removal: [{"instance.tld", "Reason for media removal"}],
media_nsfw: [{"instance.tld", "Reason for media nsfw"}],
federated_timeline_removal: [{"instance.tld", "Reason for federated timeline removal"}],
report_removal: [{"instance.tld", "Reason for report removal"}],
reject: [{"instance.tld", "Reason for reject"}],
followers_only: [{"instance.tld", "Reason for followers only"}],
accept: [{"instance.tld", "Reason for accept"}],
avatar_removal: [{"instance.tld", "Reason for avatar removal"}],
banner_removal: [{"instance.tld", "Reason for banner removal"}],
reject_deletes: [{"instance.tld", "Reason for reject deletes"}]
```
""")
new_config =
Config.get([:mrf_simple])
|> Enum.map(fn {k, v} ->
{k,
Enum.map(v, fn
{instance, reason} -> {instance, reason}
instance -> {instance, ""}
end)}
end)
Config.put([:mrf_simple], new_config)
:error
else
:ok
end
end
def check_quarantined_instances_tuples do
has_strings = Config.get([:instance, :quarantined_instances]) |> Enum.any?(&is_binary/1)
if has_strings do
Logger.warning("""
!!!DEPRECATION WARNING!!!
Your config is using strings in the quarantined_instances configuration instead of tuples. They should work for now, but you are advised to change to the new configuration to prevent possible issues later:
```
config :pleroma, :instance,
quarantined_instances: ["instance.tld"]
```
Is now
```
config :pleroma, :instance,
quarantined_instances: [{"instance.tld", "Reason for quarantine"}]
```
""")
new_config =
Config.get([:instance, :quarantined_instances])
|> Enum.map(fn
{instance, reason} -> {instance, reason}
instance -> {instance, ""}
end)
Config.put([:instance, :quarantined_instances], new_config)
:error
else
:ok
end
end
def check_transparency_exclusions_tuples do
has_strings = Config.get([:mrf, :transparency_exclusions]) |> Enum.any?(&is_binary/1)
if has_strings do
Logger.warning("""
!!!DEPRECATION WARNING!!!
Your config is using strings in the transparency_exclusions configuration instead of tuples. They should work for now, but you are advised to change to the new configuration to prevent possible issues later:
```
config :pleroma, :mrf,
transparency_exclusions: ["instance.tld"]
```
Is now
```
config :pleroma, :mrf,
transparency_exclusions: [{"instance.tld", "Reason to exclude transparency"}]
```
""")
new_config =
Config.get([:mrf, :transparency_exclusions])
|> Enum.map(fn
{instance, reason} -> {instance, reason}
instance -> {instance, ""}
end)
Config.put([:mrf, :transparency_exclusions], new_config)
:error
else
:ok
end
end
def check_hellthread_threshold do
if Config.get([:mrf_hellthread, :threshold]) do
Logger.warning("""
!!!DEPRECATION WARNING!!!
You are using the old configuration mechanism for the hellthread filter. Please check config.md.
""")
:error
else
:ok
end
end
def warn do
[
check_hellthread_threshold(),
check_old_mrf_config(),
check_media_proxy_whitelist_config(),
check_welcome_message_config(),
check_gun_pool_options(),
check_activity_expiration_config(),
check_remote_ip_plug_name(),
check_uploaders_s3_public_endpoint(),
check_old_chat_shoutbox(),
check_quarantined_instances_tuples(),
check_transparency_exclusions_tuples(),
check_simple_policy_tuples(),
check_exiftool_filter()
]
|> Enum.reduce(:ok, fn
:ok, :ok -> :ok
_, _ -> :error
end)
end
def check_welcome_message_config do
instance_config = Pleroma.Config.get([:instance])
use_old_config =
Keyword.has_key?(instance_config, :welcome_user_nickname) or
Keyword.has_key?(instance_config, :welcome_message)
if use_old_config do
Logger.error("""
!!!DEPRECATION WARNING!!!
Your config is using the old namespace for Welcome messages configuration. You need to convert to the new namespace. e.g.,
\n* `config :pleroma, :instance, welcome_user_nickname` and `config :pleroma, :instance, welcome_message` are now equal to:
\n* `config :pleroma, :welcome, direct_message: [enabled: true, sender_nickname: "NICKNAME", message: "Your welcome message"]`
""")
:error
else
:ok
end
end
def check_old_mrf_config do
warning_preface = """
!!!DEPRECATION WARNING!!!
Your config is using old namespaces for MRF configuration. They should work for now, but you are advised to change to new namespaces to prevent possible issues later:
"""
move_namespace_and_warn(@mrf_config_map, warning_preface)
end
- @spec move_namespace_and_warn([config_map()], String.t()) :: :ok | nil
+ @spec move_namespace_and_warn([config_map()], String.t()) :: :ok | :error
def move_namespace_and_warn(config_map, warning_preface) do
warning =
Enum.reduce(config_map, "", fn
{old, new, err_msg}, acc ->
old_config = Config.get(old)
if old_config do
Config.put(new, old_config)
acc <> err_msg
else
acc
end
end)
if warning == "" do
:ok
else
Logger.warning(warning_preface <> warning)
:error
end
end
- @spec check_media_proxy_whitelist_config() :: :ok | nil
+ @spec check_media_proxy_whitelist_config() :: :ok | :error
def check_media_proxy_whitelist_config do
whitelist = Config.get([:media_proxy, :whitelist])
if Enum.any?(whitelist, &(not String.starts_with?(&1, "http"))) do
Logger.warning("""
!!!DEPRECATION WARNING!!!
Your config is using old format (only domain) for MediaProxy whitelist option. Setting should work for now, but you are advised to change format to scheme with port to prevent possible issues later.
""")
:error
else
:ok
end
end
def check_gun_pool_options do
pool_config = Config.get(:connections_pool)
if timeout = pool_config[:await_up_timeout] do
Logger.warning("""
!!!DEPRECATION WARNING!!!
Your config is using old setting `config :pleroma, :connections_pool, await_up_timeout`. Please change to `config :pleroma, :connections_pool, connect_timeout` to ensure compatibility with future releases.
""")
Config.put(:connections_pool, Keyword.put_new(pool_config, :connect_timeout, timeout))
end
pools_configs = Config.get(:pools)
warning_preface = """
!!!DEPRECATION WARNING!!!
Your config is using old setting name `timeout` instead of `recv_timeout` in pool settings. The setting will not take effect until updated.
"""
updated_config =
Enum.reduce(pools_configs, [], fn {pool_name, config}, acc ->
if timeout = config[:timeout] do
Keyword.put(acc, pool_name, Keyword.put_new(config, :recv_timeout, timeout))
else
acc
end
end)
if updated_config != [] do
pool_warnings =
updated_config
|> Keyword.keys()
|> Enum.map(fn pool_name ->
"\n* `:timeout` options in #{pool_name} pool is now `:recv_timeout`"
end)
Logger.warning(Enum.join([warning_preface | pool_warnings]))
Config.put(:pools, updated_config)
:error
else
:ok
end
end
- @spec check_activity_expiration_config() :: :ok | nil
+ @spec check_activity_expiration_config() :: :ok | :error
def check_activity_expiration_config do
warning_preface = """
!!!DEPRECATION WARNING!!!
Your config is using old namespace for activity expiration configuration. Setting should work for now, but you are advised to change to new namespace to prevent possible issues later:
"""
move_namespace_and_warn(
[
{Pleroma.ActivityExpiration, Pleroma.Workers.PurgeExpiredActivity,
"\n* `config :pleroma, Pleroma.ActivityExpiration` is now `config :pleroma, Pleroma.Workers.PurgeExpiredActivity`"}
],
warning_preface
)
end
- @spec check_remote_ip_plug_name() :: :ok | nil
+ @spec check_remote_ip_plug_name() :: :ok | :error
def check_remote_ip_plug_name do
warning_preface = """
!!!DEPRECATION WARNING!!!
Your config is using old namespace for RemoteIp Plug. Setting should work for now, but you are advised to change to new namespace to prevent possible issues later:
"""
move_namespace_and_warn(
[
{Pleroma.Plugs.RemoteIp, Pleroma.Web.Plugs.RemoteIp,
"\n* `config :pleroma, Pleroma.Plugs.RemoteIp` is now `config :pleroma, Pleroma.Web.Plugs.RemoteIp`"}
],
warning_preface
)
end
- @spec check_uploaders_s3_public_endpoint() :: :ok | nil
+ @spec check_uploaders_s3_public_endpoint() :: :ok | :error
def check_uploaders_s3_public_endpoint do
s3_config = Pleroma.Config.get([Pleroma.Uploaders.S3])
use_old_config = Keyword.has_key?(s3_config, :public_endpoint)
if use_old_config do
Logger.error("""
!!!DEPRECATION WARNING!!!
Your config is using the old setting for controlling the URL of media uploaded to your S3 bucket.\n
Please make the following change at your earliest convenience.\n
\n* `config :pleroma, Pleroma.Uploaders.S3, public_endpoint` is now equal to:
\n* `config :pleroma, Pleroma.Upload, base_url`
""")
:error
else
:ok
end
end
- @spec check_old_chat_shoutbox() :: :ok | nil
+ @spec check_old_chat_shoutbox() :: :ok | :error
def check_old_chat_shoutbox do
instance_config = Pleroma.Config.get([:instance])
chat_config = Pleroma.Config.get([:chat]) || []
use_old_config =
Keyword.has_key?(instance_config, :chat_limit) or
Keyword.has_key?(chat_config, :enabled)
if use_old_config do
Logger.error("""
!!!DEPRECATION WARNING!!!
Your config is using the old namespace for the Shoutbox configuration. You need to convert to the new namespace. e.g.,
\n* `config :pleroma, :chat, enabled` and `config :pleroma, :instance, chat_limit` are now equal to:
\n* `config :pleroma, :shout, enabled` and `config :pleroma, :shout, limit`
""")
:error
else
:ok
end
end
end
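A minimal sketch of how one of these checks behaves, assuming an otherwise-empty legacy `:mrf_simple` value; the check rewrites the config in place and reports `:error`:
```
Pleroma.Config.put([:mrf_simple], reject: ["bad.example"])

:error = Pleroma.Config.DeprecationWarnings.check_simple_policy_tuples()
[{"bad.example", ""}] = Pleroma.Config.get([:mrf_simple])[:reject]
```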
diff --git a/lib/pleroma/config/release_runtime_provider.ex b/lib/pleroma/config/release_runtime_provider.ex
index 9ec0f975e..351639836 100644
--- a/lib/pleroma/config/release_runtime_provider.ex
+++ b/lib/pleroma/config/release_runtime_provider.ex
@@ -1,70 +1,70 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Config.ReleaseRuntimeProvider do
@moduledoc """
Imports runtime config and `{env}.exported_from_db.secret.exs` for releases.
"""
@behaviour Config.Provider
@impl true
def init(opts), do: opts
@impl true
def load(config, opts) do
with_defaults = Config.Reader.merge(config, Pleroma.Config.Holder.release_defaults())
config_path =
opts[:config_path] || System.get_env("PLEROMA_CONFIG_PATH") || "/etc/pleroma/config.exs"
with_runtime_config =
if File.exists?(config_path) do
# <https://git.pleroma.social/pleroma/pleroma/-/issues/3135>
- %File.Stat{mode: mode} = File.lstat!(config_path)
+ %File.Stat{mode: mode} = File.stat!(config_path)
if Bitwise.band(mode, 0o007) > 0 do
raise "Configuration at #{config_path} has world-permissions, execute the following: chmod o= #{config_path}"
end
if Bitwise.band(mode, 0o020) > 0 do
raise "Configuration at #{config_path} has group-wise write permissions, execute the following: chmod g-w #{config_path}"
end
# Note: Elixir doesn't provide a getuid(2),
# so we cannot forbid group-read access only when the config is owned by us
runtime_config = Config.Reader.read!(config_path)
with_defaults
|> Config.Reader.merge(pleroma: [config_path: config_path])
|> Config.Reader.merge(runtime_config)
else
warning = [
IO.ANSI.red(),
IO.ANSI.bright(),
"!!! Config path is not declared! Please ensure it exists and that PLEROMA_CONFIG_PATH is unset or points to an existing file",
IO.ANSI.reset()
]
IO.puts(warning)
with_defaults
end
exported_config_path =
opts[:exported_config_path] ||
config_path
|> Path.dirname()
|> Path.join("#{Pleroma.Config.get(:env)}.exported_from_db.secret.exs")
with_exported =
if File.exists?(exported_config_path) do
exported_config = Config.Reader.read!(exported_config_path)
Config.Reader.merge(with_runtime_config, exported_config)
else
with_runtime_config
end
with_exported
end
end
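A sketch of the permission guard above with an assumed file mode of 0o640 (owner read/write, group read, no world access), which passes both checks:
```
mode = 0o640
Bitwise.band(mode, 0o007) > 0  # => false, no world permissions, so no raise
Bitwise.band(mode, 0o020) > 0  # => false, no group write, so no raise
```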
diff --git a/lib/pleroma/emoji/pack.ex b/lib/pleroma/emoji/pack.ex
index 85f7e1877..afc341853 100644
--- a/lib/pleroma/emoji/pack.ex
+++ b/lib/pleroma/emoji/pack.ex
@@ -1,661 +1,661 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Emoji.Pack do
@derive {Jason.Encoder, only: [:files, :pack, :files_count]}
defstruct files: %{},
files_count: 0,
pack_file: nil,
path: nil,
pack: %{},
name: nil
@type t() :: %__MODULE__{
files: %{String.t() => Path.t()},
files_count: non_neg_integer(),
pack_file: Path.t(),
path: Path.t(),
pack: map(),
name: String.t()
}
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
alias Pleroma.Emoji
alias Pleroma.Emoji.Pack
alias Pleroma.Utils
@spec create(String.t()) :: {:ok, t()} | {:error, File.posix()} | {:error, :empty_values}
def create(name) do
with :ok <- validate_not_empty([name]),
dir <- Path.join(emoji_path(), name),
:ok <- File.mkdir(dir) do
save_pack(%__MODULE__{pack_file: Path.join(dir, "pack.json")})
end
end
defp paginate(entities, 1, page_size), do: Enum.take(entities, page_size)
defp paginate(entities, page, page_size) do
entities
|> Enum.chunk_every(page_size)
|> Enum.at(page - 1)
end
@spec show(keyword()) :: {:ok, t()} | {:error, atom()}
def show(opts) do
name = opts[:name]
with :ok <- validate_not_empty([name]),
{:ok, pack} <- load_pack(name) do
shortcodes =
pack.files
|> Map.keys()
|> Enum.sort()
|> paginate(opts[:page], opts[:page_size])
pack = Map.put(pack, :files, Map.take(pack.files, shortcodes))
{:ok, validate_pack(pack)}
end
end
@spec delete(String.t()) ::
{:ok, [binary()]} | {:error, File.posix(), binary()} | {:error, :empty_values}
def delete(name) do
with :ok <- validate_not_empty([name]),
pack_path <- Path.join(emoji_path(), name) do
File.rm_rf(pack_path)
end
end
@spec unpack_zip_emojies(list(tuple())) :: list(map())
defp unpack_zip_emojies(zip_files) do
Enum.reduce(zip_files, [], fn
{_, path, s, _, _, _}, acc when elem(s, 2) == :regular ->
with(
filename <- Path.basename(path),
shortcode <- Path.basename(filename, Path.extname(filename)),
false <- Emoji.exist?(shortcode)
) do
[%{path: path, filename: path, shortcode: shortcode} | acc]
else
_ -> acc
end
_, acc ->
acc
end)
end
@spec add_file(t(), String.t(), Path.t(), Plug.Upload.t()) ::
{:ok, t()}
| {:error, File.posix() | atom()}
def add_file(%Pack{} = pack, _, _, %Plug.Upload{content_type: "application/zip"} = file) do
with {:ok, zip_files} <- :zip.table(to_charlist(file.path)),
[_ | _] = emojies <- unpack_zip_emojies(zip_files),
{:ok, tmp_dir} <- Utils.tmp_dir("emoji") do
try do
{:ok, _emoji_files} =
:zip.unzip(
to_charlist(file.path),
- [{:file_list, Enum.map(emojies, & &1[:path])}, {:cwd, tmp_dir}]
+ [{:file_list, Enum.map(emojies, & &1[:path])}, {:cwd, String.to_charlist(tmp_dir)}]
)
{_, updated_pack} =
Enum.map_reduce(emojies, pack, fn item, emoji_pack ->
emoji_file = %Plug.Upload{
filename: item[:filename],
path: Path.join(tmp_dir, item[:path])
}
{:ok, updated_pack} =
do_add_file(
emoji_pack,
item[:shortcode],
to_string(item[:filename]),
emoji_file
)
{item, updated_pack}
end)
Emoji.reload()
{:ok, updated_pack}
after
File.rm_rf(tmp_dir)
end
else
{:error, _} = error ->
error
_ ->
{:ok, pack}
end
end
def add_file(%Pack{} = pack, shortcode, filename, %Plug.Upload{} = file) do
with :ok <- validate_not_empty([shortcode, filename]),
:ok <- validate_emoji_not_exists(shortcode),
{:ok, updated_pack} <- do_add_file(pack, shortcode, filename, file) do
Emoji.reload()
{:ok, updated_pack}
end
end
defp do_add_file(pack, shortcode, filename, file) do
with :ok <- save_file(file, pack, filename) do
pack
|> put_emoji(shortcode, filename)
|> save_pack()
end
end
@spec delete_file(t(), String.t()) ::
{:ok, t()} | {:error, File.posix() | atom()}
def delete_file(%Pack{} = pack, shortcode) do
with :ok <- validate_not_empty([shortcode]),
:ok <- remove_file(pack, shortcode),
{:ok, updated_pack} <- pack |> delete_emoji(shortcode) |> save_pack() do
Emoji.reload()
{:ok, updated_pack}
end
end
@spec update_file(t(), String.t(), String.t(), String.t(), boolean()) ::
{:ok, t()} | {:error, File.posix() | atom()}
def update_file(%Pack{} = pack, shortcode, new_shortcode, new_filename, force) do
with :ok <- validate_not_empty([shortcode, new_shortcode, new_filename]),
{:ok, filename} <- get_filename(pack, shortcode),
:ok <- validate_emoji_not_exists(new_shortcode, force),
:ok <- rename_file(pack, filename, new_filename),
{:ok, updated_pack} <-
pack
|> delete_emoji(shortcode)
|> put_emoji(new_shortcode, new_filename)
|> save_pack() do
Emoji.reload()
{:ok, updated_pack}
end
end
@spec import_from_filesystem() :: {:ok, [String.t()]} | {:error, File.posix() | atom()}
def import_from_filesystem do
emoji_path = emoji_path()
with {:ok, %{access: :read_write}} <- File.stat(emoji_path),
{:ok, results} <- File.ls(emoji_path) do
names =
results
|> Enum.map(&Path.join(emoji_path, &1))
|> Enum.reject(fn path ->
File.dir?(path) and File.exists?(Path.join(path, "pack.json"))
end)
|> Enum.map(&write_pack_contents/1)
|> Enum.reject(&is_nil/1)
{:ok, names}
else
{:ok, %{access: _}} -> {:error, :no_read_write}
e -> e
end
end
@spec list_remote(keyword()) :: {:ok, map()} | {:error, atom()}
def list_remote(opts) do
uri = opts[:url] |> String.trim() |> URI.parse()
with :ok <- validate_shareable_packs_available(uri) do
uri
|> URI.merge(
"/api/v1/pleroma/emoji/packs?page=#{opts[:page]}&page_size=#{opts[:page_size]}"
)
|> http_get()
end
end
@spec list_local(keyword()) :: {:ok, map(), non_neg_integer()}
def list_local(opts) do
with {:ok, results} <- list_packs_dir() do
all_packs =
results
|> Enum.map(fn name ->
case load_pack(name) do
{:ok, pack} -> pack
_ -> nil
end
end)
|> Enum.reject(&is_nil/1)
packs =
all_packs
|> paginate(opts[:page], opts[:page_size])
|> Map.new(fn pack -> {pack.name, validate_pack(pack)} end)
{:ok, packs, length(all_packs)}
end
end
@spec get_archive(String.t()) :: {:ok, binary()} | {:error, atom()}
def get_archive(name) do
with {:ok, pack} <- load_pack(name),
:ok <- validate_downloadable(pack) do
{:ok, fetch_archive(pack)}
end
end
@spec download(String.t(), String.t(), String.t()) :: {:ok, t()} | {:error, atom()}
def download(name, url, as) do
uri = url |> String.trim() |> URI.parse()
with :ok <- validate_shareable_packs_available(uri),
{:ok, %{"files_count" => files_count}} <-
uri |> URI.merge("/api/v1/pleroma/emoji/pack?name=#{name}&page_size=0") |> http_get(),
{:ok, remote_pack} <-
uri
|> URI.merge("/api/v1/pleroma/emoji/pack?name=#{name}&page_size=#{files_count}")
|> http_get(),
{:ok, %{sha: sha, url: url} = pack_info} <- fetch_pack_info(remote_pack, uri, name),
{:ok, archive} <- download_archive(url, sha),
pack <- copy_as(remote_pack, as || name),
{:ok, _} = unzip(archive, pack_info, remote_pack, pack) do
# Fallback can't contain a pack.json file, since that would cause the fallback-src-sha256
# in it to depend on itself
if pack_info[:fallback] do
save_pack(pack)
else
{:ok, pack}
end
end
end
@spec save_metadata(map(), t()) :: {:ok, t()} | {:error, File.posix()}
def save_metadata(metadata, %__MODULE__{} = pack) do
pack
|> Map.put(:pack, metadata)
|> save_pack()
end
@spec update_metadata(String.t(), map()) :: {:ok, t()} | {:error, File.posix()}
def update_metadata(name, data) do
with {:ok, pack} <- load_pack(name) do
if fallback_sha_changed?(pack, data) do
update_sha_and_save_metadata(pack, data)
else
save_metadata(data, pack)
end
end
end
@spec load_pack(String.t()) :: {:ok, t()} | {:error, :file.posix()}
def load_pack(name) do
name = Path.basename(name)
pack_file = Path.join([emoji_path(), name, "pack.json"])
with {:ok, _} <- File.stat(pack_file),
{:ok, pack_data} <- File.read(pack_file) do
pack =
from_json(
pack_data,
%{
pack_file: pack_file,
path: Path.dirname(pack_file),
name: name
}
)
files_count =
pack.files
|> Map.keys()
|> length()
{:ok, Map.put(pack, :files_count, files_count)}
end
end
@spec emoji_path() :: Path.t()
defp emoji_path do
[:instance, :static_dir]
|> Pleroma.Config.get!()
|> Path.join("emoji")
end
defp validate_emoji_not_exists(shortcode, force \\ false)
defp validate_emoji_not_exists(_shortcode, true), do: :ok
defp validate_emoji_not_exists(shortcode, _) do
if Emoji.exist?(shortcode) do
{:error, :already_exists}
else
:ok
end
end
defp write_pack_contents(path) do
pack = %__MODULE__{
files: files_from_path(path),
path: path,
pack_file: Path.join(path, "pack.json")
}
case save_pack(pack) do
{:ok, _pack} -> Path.basename(path)
_ -> nil
end
end
defp files_from_path(path) do
txt_path = Path.join(path, "emoji.txt")
if File.exists?(txt_path) do
# There's an emoji.txt file; it's likely from a pack installed by the pack manager.
# Make a pack.json file from the contents of that emoji.txt file
# FIXME: Copy-pasted from Pleroma.Emoji/load_from_file_stream/2
# Create a map of shortcodes to filenames from emoji.txt
txt_path
|> File.read!()
|> String.split("\n")
|> Enum.map(&String.trim/1)
|> Enum.map(fn line ->
case String.split(line, ~r/,\s*/) do
# This matches both strings with and without tags
# and we don't care about tags here
[name, file | _] ->
file_dir_name = Path.dirname(file)
if String.ends_with?(path, file_dir_name) do
{name, Path.basename(file)}
else
{name, file}
end
_ ->
nil
end
end)
|> Enum.reject(&is_nil/1)
|> Map.new()
else
# If there's no emoji.txt, assume that all files with the extensions
# configured for packs are emoji and import them all
pack_extensions = Pleroma.Config.get!([:emoji, :pack_extensions])
Emoji.Loader.make_shortcode_to_file_map(path, pack_extensions)
end
end
defp validate_pack(pack) do
info =
if downloadable?(pack) do
archive = fetch_archive(pack)
archive_sha = :crypto.hash(:sha256, archive) |> Base.encode16()
pack.pack
|> Map.put("can-download", true)
|> Map.put("download-sha256", archive_sha)
else
Map.put(pack.pack, "can-download", false)
end
Map.put(pack, :pack, info)
end
defp downloadable?(pack) do
# If the pack is set as shared, check if it can be downloaded
# That means that when asked, the pack can be packed and sent to the remote
# Otherwise, they'd have to download it from external-src
pack.pack["share-files"] &&
Enum.all?(pack.files, fn {_, file} ->
pack.path
|> Path.join(file)
|> File.exists?()
end)
end
defp create_archive_and_cache(pack, hash) do
files = ['pack.json' | Enum.map(pack.files, fn {_, file} -> to_charlist(file) end)]
{:ok, {_, result}} =
:zip.zip('#{pack.name}.zip', files, [:memory, cwd: to_charlist(pack.path)])
ttl_per_file = Pleroma.Config.get!([:emoji, :shared_pack_cache_seconds_per_file])
overall_ttl = :timer.seconds(ttl_per_file * Enum.count(files))
@cachex.put(
:emoji_packs_cache,
pack.name,
# if pack.json MD5 changes, the cache is not valid anymore
%{hash: hash, pack_data: result},
# Cache lifetime grows with pack size: shared_pack_cache_seconds_per_file per file
ttl: overall_ttl
)
result
end
defp save_pack(pack) do
with {:ok, json} <- Jason.encode(pack, pretty: true),
:ok <- File.write(pack.pack_file, json) do
{:ok, pack}
end
end
defp from_json(json, attrs) do
map = Jason.decode!(json)
pack_attrs =
attrs
|> Map.merge(%{
files: map["files"],
pack: map["pack"]
})
struct(__MODULE__, pack_attrs)
end
defp validate_shareable_packs_available(uri) do
with {:ok, %{"links" => links}} <- uri |> URI.merge("/.well-known/nodeinfo") |> http_get(),
# Get the actual nodeinfo address and fetch it
{:ok, %{"metadata" => %{"features" => features}}} <-
links |> List.last() |> Map.get("href") |> http_get() do
if Enum.member?(features, "shareable_emoji_packs") do
:ok
else
{:error, :not_shareable}
end
end
end
defp validate_not_empty(list) do
if Enum.all?(list, fn i -> is_binary(i) and i != "" end) do
:ok
else
{:error, :empty_values}
end
end
defp save_file(%Plug.Upload{path: upload_path}, pack, filename) do
file_path = Path.join(pack.path, filename)
create_subdirs(file_path)
with {:ok, _} <- File.copy(upload_path, file_path) do
:ok
end
end
defp put_emoji(pack, shortcode, filename) do
files = Map.put(pack.files, shortcode, filename)
%{pack | files: files, files_count: length(Map.keys(files))}
end
defp delete_emoji(pack, shortcode) do
files = Map.delete(pack.files, shortcode)
%{pack | files: files}
end
defp rename_file(pack, filename, new_filename) do
old_path = Path.join(pack.path, filename)
new_path = Path.join(pack.path, new_filename)
create_subdirs(new_path)
with :ok <- File.rename(old_path, new_path) do
remove_dir_if_empty(old_path, filename)
end
end
defp create_subdirs(file_path) do
with true <- String.contains?(file_path, "/"),
path <- Path.dirname(file_path),
false <- File.exists?(path) do
File.mkdir_p!(path)
end
end
defp remove_file(pack, shortcode) do
with {:ok, filename} <- get_filename(pack, shortcode),
emoji <- Path.join(pack.path, filename),
:ok <- File.rm(emoji) do
remove_dir_if_empty(emoji, filename)
end
end
defp remove_dir_if_empty(emoji, filename) do
dir = Path.dirname(emoji)
if String.contains?(filename, "/") and File.ls!(dir) == [] do
File.rmdir!(dir)
else
:ok
end
end
defp get_filename(pack, shortcode) do
with %{^shortcode => filename} when is_binary(filename) <- pack.files,
file_path <- Path.join(pack.path, filename),
{:ok, _} <- File.stat(file_path) do
{:ok, filename}
else
{:error, _} = error ->
error
_ ->
{:error, :doesnt_exist}
end
end
defp http_get(%URI{} = url), do: url |> to_string() |> http_get()
defp http_get(url) do
with {:ok, %{body: body}} <- Pleroma.HTTP.get(url, [], pool: :default) do
Jason.decode(body)
end
end
defp list_packs_dir do
emoji_path = emoji_path()
# Create the directory first if it does not exist. This is probably the first request made
# with the API, so creating it here should be sufficient.
with {:create_dir, :ok} <- {:create_dir, File.mkdir_p(emoji_path)},
{:ls, {:ok, results}} <- {:ls, File.ls(emoji_path)} do
{:ok, Enum.sort(results)}
else
{:create_dir, {:error, e}} -> {:error, :create_dir, e}
{:ls, {:error, e}} -> {:error, :ls, e}
end
end
defp validate_downloadable(pack) do
if downloadable?(pack), do: :ok, else: {:error, :cant_download}
end
defp copy_as(remote_pack, local_name) do
path = Path.join(emoji_path(), local_name)
%__MODULE__{
name: local_name,
path: path,
files: remote_pack["files"],
pack_file: Path.join(path, "pack.json")
}
end
defp unzip(archive, pack_info, remote_pack, local_pack) do
with :ok <- File.mkdir_p!(local_pack.path) do
files = Enum.map(remote_pack["files"], fn {_, path} -> to_charlist(path) end)
# Fallback cannot contain a pack.json file
files = if pack_info[:fallback], do: files, else: ['pack.json' | files]
:zip.unzip(archive, cwd: to_charlist(local_pack.path), file_list: files)
end
end
defp fetch_pack_info(remote_pack, uri, name) do
case remote_pack["pack"] do
%{"share-files" => true, "can-download" => true, "download-sha256" => sha} ->
{:ok,
%{
sha: sha,
url: URI.merge(uri, "/api/v1/pleroma/emoji/packs/archive?name=#{name}") |> to_string()
}}
%{"fallback-src" => src, "fallback-src-sha256" => sha} when is_binary(src) ->
{:ok,
%{
sha: sha,
url: src,
fallback: true
}}
_ ->
{:error, "The pack was not set as shared and there is no fallback src to download from"}
end
end
defp download_archive(url, sha) do
with {:ok, %{body: archive}} <- Pleroma.HTTP.get(url) do
if Base.decode16!(sha) == :crypto.hash(:sha256, archive) do
{:ok, archive}
else
{:error, :invalid_checksum}
end
end
end
defp fetch_archive(pack) do
hash = :crypto.hash(:md5, File.read!(pack.pack_file))
case @cachex.get!(:emoji_packs_cache, pack.name) do
%{hash: ^hash, pack_data: archive} -> archive
_ -> create_archive_and_cache(pack, hash)
end
end
defp fallback_sha_changed?(pack, data) do
is_binary(data[:"fallback-src"]) and data[:"fallback-src"] != pack.pack["fallback-src"]
end
defp update_sha_and_save_metadata(pack, data) do
with {:ok, %{body: zip}} <- Pleroma.HTTP.get(data[:"fallback-src"]),
:ok <- validate_has_all_files(pack, zip) do
fallback_sha = :sha256 |> :crypto.hash(zip) |> Base.encode16()
data
|> Map.put("fallback-src-sha256", fallback_sha)
|> save_metadata(pack)
end
end
defp validate_has_all_files(pack, zip) do
with {:ok, f_list} <- :zip.unzip(zip, [:memory]) do
# Check if all files from the pack.json are in the archive
pack.files
|> Enum.all?(fn {_, from_manifest} ->
List.keyfind(f_list, to_charlist(from_manifest), 0)
end)
|> if(do: :ok, else: {:error, :incomplete})
end
end
end
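Illustrative use of the public Pack API defined above; the pack name is hypothetical and must already exist under the emoji static directory:
```
{:ok, pack} = Pleroma.Emoji.Pack.load_pack("blobcat")
{:ok, page} = Pleroma.Emoji.Pack.show(name: "blobcat", page: 1, page_size: 25)
```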
diff --git a/lib/pleroma/filter.ex b/lib/pleroma/filter.ex
index db88bc021..e827d3cbc 100644
--- a/lib/pleroma/filter.ex
+++ b/lib/pleroma/filter.ex
@@ -1,226 +1,223 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Filter do
use Ecto.Schema
import Ecto.Changeset
import Ecto.Query
alias Pleroma.Repo
alias Pleroma.User
@type t() :: %__MODULE__{}
@type format() :: :postgres | :re
schema "filters" do
belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
field(:filter_id, :integer)
field(:hide, :boolean, default: false)
field(:whole_word, :boolean, default: true)
field(:phrase, :string)
field(:context, {:array, :string})
field(:expires_at, :naive_datetime)
timestamps()
end
@spec get(integer() | String.t(), User.t()) :: t() | nil
def get(id, %{id: user_id} = _user) do
query =
from(
f in __MODULE__,
where: f.filter_id == ^id,
where: f.user_id == ^user_id
)
Repo.one(query)
end
@spec get_active(Ecto.Query.t() | module()) :: Ecto.Query.t()
def get_active(query) do
from(f in query, where: is_nil(f.expires_at) or f.expires_at > ^NaiveDateTime.utc_now())
end
@spec get_irreversible(Ecto.Query.t()) :: Ecto.Query.t()
def get_irreversible(query) do
from(f in query, where: f.hide)
end
@spec get_filters(Ecto.Query.t() | module(), User.t()) :: [t()]
def get_filters(query \\ __MODULE__, %User{id: user_id}) do
query =
from(
f in query,
where: f.user_id == ^user_id,
order_by: [desc: :id]
)
Repo.all(query)
end
@spec create(map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}
def create(attrs \\ %{}) do
Repo.transaction(fn -> create_with_expiration(attrs) end)
end
defp create_with_expiration(attrs) do
with {:ok, filter} <- do_create(attrs),
{:ok, _} <- maybe_add_expiration_job(filter) do
filter
else
{:error, error} -> Repo.rollback(error)
end
end
defp do_create(attrs) do
%__MODULE__{}
|> cast(attrs, [:phrase, :context, :hide, :expires_at, :whole_word, :user_id, :filter_id])
|> maybe_add_filter_id()
|> validate_required([:phrase, :context, :user_id, :filter_id])
|> maybe_add_expires_at(attrs)
|> Repo.insert()
end
defp maybe_add_filter_id(%{changes: %{filter_id: _}} = changeset), do: changeset
defp maybe_add_filter_id(%{changes: %{user_id: user_id}} = changeset) do
# If filter_id wasn't given, use the max filter_id for this user plus 1.
# XXX This could result in a race condition if a user tries to add two
# different filters for their account from two different clients at the
# same time, but that should be unlikely.
max_id_query =
from(
f in __MODULE__,
where: f.user_id == ^user_id,
select: max(f.filter_id)
)
filter_id =
case Repo.one(max_id_query) do
# Start allocating from 1
nil ->
1
max_id ->
max_id + 1
end
change(changeset, filter_id: filter_id)
end
# don't override expires_at if both expires_at and expires_in are passed
defp maybe_add_expires_at(%{changes: %{expires_at: %NaiveDateTime{} = _}} = changeset, _) do
changeset
end
defp maybe_add_expires_at(changeset, %{expires_in: expires_in})
when is_integer(expires_in) and expires_in > 0 do
expires_at =
NaiveDateTime.utc_now()
|> NaiveDateTime.add(expires_in)
|> NaiveDateTime.truncate(:second)
change(changeset, expires_at: expires_at)
end
defp maybe_add_expires_at(changeset, %{expires_in: nil}) do
change(changeset, expires_at: nil)
end
defp maybe_add_expires_at(changeset, _), do: changeset
defp maybe_add_expiration_job(%{expires_at: %NaiveDateTime{} = expires_at} = filter) do
Pleroma.Workers.PurgeExpiredFilter.enqueue(%{
filter_id: filter.id,
expires_at: DateTime.from_naive!(expires_at, "Etc/UTC")
})
end
defp maybe_add_expiration_job(_), do: {:ok, nil}
@spec delete(t()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}
def delete(%__MODULE__{} = filter) do
Repo.transaction(fn -> delete_with_expiration(filter) end)
end
defp delete_with_expiration(filter) do
with {:ok, _} <- maybe_delete_old_expiration_job(filter, nil),
{:ok, filter} <- Repo.delete(filter) do
filter
else
{:error, error} -> Repo.rollback(error)
end
end
@spec update(t(), map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}
def update(%__MODULE__{} = filter, params) do
Repo.transaction(fn -> update_with_expiration(filter, params) end)
end
defp update_with_expiration(filter, params) do
with {:ok, updated} <- do_update(filter, params),
{:ok, _} <- maybe_delete_old_expiration_job(filter, updated),
{:ok, _} <-
maybe_add_expiration_job(updated) do
updated
else
{:error, error} -> Repo.rollback(error)
end
end
defp do_update(filter, params) do
filter
|> cast(params, [:phrase, :context, :hide, :expires_at, :whole_word])
|> validate_required([:phrase, :context])
|> maybe_add_expires_at(params)
|> Repo.update()
end
defp maybe_delete_old_expiration_job(%{expires_at: nil}, _), do: {:ok, nil}
defp maybe_delete_old_expiration_job(%{expires_at: expires_at}, %{expires_at: expires_at}) do
{:ok, nil}
end
defp maybe_delete_old_expiration_job(%{id: id}, _) do
with %Oban.Job{} = job <- Pleroma.Workers.PurgeExpiredFilter.get_expiration(id) do
Repo.delete(job)
else
nil -> {:ok, nil}
end
end
@spec compose_regex(User.t() | [t()], format()) :: String.t() | Regex.t() | nil
def compose_regex(user_or_filters, format \\ :postgres)
def compose_regex(%User{} = user, format) do
__MODULE__
|> get_active()
|> get_irreversible()
|> get_filters(user)
|> compose_regex(format)
end
def compose_regex([_ | _] = filters, format) do
phrases =
filters
|> Enum.map(& &1.phrase)
|> Enum.join("|")
case format do
:postgres ->
"\\y(#{phrases})\\y"
:re ->
~r/\b#{phrases}\b/i
-
- _ ->
- nil
end
end
def compose_regex(_, _), do: nil
end
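A sketch of `compose_regex/2` output for two assumed filters:
```
filters = [%Pleroma.Filter{phrase: "spoilers"}, %Pleroma.Filter{phrase: "lewd"}]

Pleroma.Filter.compose_regex(filters, :postgres)
# => "\\y(spoilers|lewd)\\y"

Pleroma.Filter.compose_regex(filters, :re)
# => ~r/\bspoilers|lewd\b/i
```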
diff --git a/lib/pleroma/gun/connection_pool/worker_supervisor.ex b/lib/pleroma/gun/connection_pool/worker_supervisor.ex
index b2be4ff87..24ad61117 100644
--- a/lib/pleroma/gun/connection_pool/worker_supervisor.ex
+++ b/lib/pleroma/gun/connection_pool/worker_supervisor.ex
@@ -1,49 +1,51 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Gun.ConnectionPool.WorkerSupervisor do
@moduledoc "Supervisor for pool workers. Does not do anything except enforce max connection limit"
use DynamicSupervisor
def start_link(opts) do
DynamicSupervisor.start_link(__MODULE__, opts, name: __MODULE__)
end
def init(_opts) do
DynamicSupervisor.init(
strategy: :one_for_one,
max_children: Pleroma.Config.get([:connections_pool, :max_connections])
)
end
def start_worker(opts, retry \\ false) do
case DynamicSupervisor.start_child(__MODULE__, {Pleroma.Gun.ConnectionPool.Worker, opts}) do
{:error, :max_children} ->
- if Enum.any?([retry, free_pool()], &match?(&1, :error)) do
+ funs = [fn -> !retry end, fn -> match?(:error, free_pool()) end]
+
+ if Enum.any?(funs, fn fun -> fun.() end) do
:telemetry.execute([:pleroma, :connection_pool, :provision_failure], %{opts: opts})
{:error, :pool_full}
else
start_worker(opts, true)
end
res ->
res
end
end
defp free_pool do
wait_for_reclaimer_finish(Pleroma.Gun.ConnectionPool.Reclaimer.start_monitor())
end
defp wait_for_reclaimer_finish({pid, mon}) do
receive do
{:DOWN, ^mon, :process, ^pid, :no_unused_conns} ->
:error
{:DOWN, ^mon, :process, ^pid, :normal} ->
:ok
end
end
end
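The guard replaced above relied on `&match?(&1, :error)`, where the captured argument becomes the match pattern; a bare variable pattern matches anything, so the old check was always true. A minimal illustration:
```
match?(_anything, :error)  # => true, a bare variable pattern matches any value
match?(:error, :error)     # => true
match?(:error, :ok)        # => false, this is the comparison the new code performs
```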
diff --git a/lib/pleroma/html.ex b/lib/pleroma/html.ex
index 5bf735c4f..84ff2f129 100644
--- a/lib/pleroma/html.ex
+++ b/lib/pleroma/html.ex
@@ -1,93 +1,84 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.HTML do
# Scrubbers are compiled on boot so they can be configured in OTP releases
# @on_load :compile_scrubbers
- @cachex Pleroma.Config.get([:cachex, :provider], Cachex)
-
def compile_scrubbers do
dir = Path.join(:code.priv_dir(:pleroma), "scrubbers")
dir
|> Pleroma.Utils.compile_dir()
|> case do
{:error, _errors, _warnings} ->
raise "Compiling scrubbers failed"
{:ok, _modules, _warnings} ->
:ok
end
end
defp get_scrubbers(scrubber) when is_atom(scrubber), do: [scrubber]
defp get_scrubbers(scrubbers) when is_list(scrubbers), do: scrubbers
defp get_scrubbers(_), do: [Pleroma.HTML.Scrubber.Default]
def get_scrubbers do
Pleroma.Config.get([:markup, :scrub_policy])
|> get_scrubbers
end
def filter_tags(html, nil) do
filter_tags(html, get_scrubbers())
end
def filter_tags(html, scrubbers) when is_list(scrubbers) do
Enum.reduce(scrubbers, html, fn scrubber, html ->
filter_tags(html, scrubber)
end)
end
def filter_tags(html, scrubber) do
{:ok, content} = FastSanitize.Sanitizer.scrub(html, scrubber)
content
end
def filter_tags(html), do: filter_tags(html, nil)
def strip_tags(html), do: filter_tags(html, FastSanitize.Sanitizer.StripTags)
def ensure_scrubbed_html(
content,
scrubbers,
fake,
callback
) do
content =
content
|> filter_tags(scrubbers)
|> callback.()
if fake do
{:ignore, content}
else
{:commit, content}
end
end
- def extract_first_external_url_from_object(%{data: %{"content" => content}} = object)
+ @spec extract_first_external_url_from_object(Pleroma.Object.t()) ::
+ {:ok, String.t()} | {:error, :no_content}
+ def extract_first_external_url_from_object(%{data: %{"content" => content}})
when is_binary(content) do
- unless object.data["fake"] do
- key = "URL|#{object.id}"
+ url =
+ content
+ |> Floki.parse_fragment!()
+ |> Floki.find("a:not(.mention,.hashtag,.attachment,[rel~=\"tag\"])")
+ |> Enum.take(1)
+ |> Floki.attribute("href")
+ |> Enum.at(0)
- @cachex.fetch!(:scrubber_cache, key, fn _key ->
- {:commit, {:ok, extract_first_external_url(content)}}
- end)
- else
- {:ok, extract_first_external_url(content)}
- end
+ {:ok, url}
end
def extract_first_external_url_from_object(_), do: {:error, :no_content}
-
- def extract_first_external_url(content) do
- content
- |> Floki.parse_fragment!()
- |> Floki.find("a:not(.mention,.hashtag,.attachment,[rel~=\"tag\"])")
- |> Enum.take(1)
- |> Floki.attribute("href")
- |> Enum.at(0)
- end
end
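A sketch of the rewritten URL extraction, using an assumed object-shaped map:
```
object = %{data: %{"content" => ~s(See <a href="https://example.com/post">this post</a>)}}

{:ok, "https://example.com/post"} =
  Pleroma.HTML.extract_first_external_url_from_object(object)
```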
diff --git a/lib/pleroma/http/request_builder.ex b/lib/pleroma/http/request_builder.ex
index f16fb3b35..0a028a64c 100644
--- a/lib/pleroma/http/request_builder.ex
+++ b/lib/pleroma/http/request_builder.ex
@@ -1,95 +1,95 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.HTTP.RequestBuilder do
@moduledoc """
Helper functions for building Tesla requests
"""
alias Pleroma.HTTP.Request
alias Tesla.Multipart
@doc """
Creates new request
"""
@spec new(Request.t()) :: Request.t()
def new(%Request{} = request \\ %Request{}), do: request
@doc """
Specify the request method when building a request
"""
@spec method(Request.t(), Request.method()) :: Request.t()
def method(request, m), do: %{request | method: m}
@doc """
Specify the request method when building a request
"""
@spec url(Request.t(), Request.url()) :: Request.t()
def url(request, u), do: %{request | url: u}
@doc """
Add headers to the request
"""
@spec headers(Request.t(), Request.headers()) :: Request.t()
def headers(request, headers) do
headers_list =
with true <- Pleroma.Config.get([:http, :send_user_agent]),
nil <- Enum.find(headers, fn {key, _val} -> String.downcase(key) == "user-agent" end) do
[{"user-agent", Pleroma.Application.user_agent()} | headers]
else
_ ->
headers
end
%{request | headers: headers_list}
end
@doc """
Add custom, per-request middleware or adapter options to the request
"""
@spec opts(Request.t(), keyword()) :: Request.t()
def opts(request, options), do: %{request | opts: options}
@doc """
Add optional parameters to the request
"""
- @spec add_param(Request.t(), atom(), atom(), any()) :: Request.t()
+ @spec add_param(Request.t(), atom(), atom() | String.t(), any()) :: Request.t()
def add_param(request, :query, :query, values), do: %{request | query: values}
def add_param(request, :body, :body, value), do: %{request | body: value}
- def add_param(request, :body, key, value) do
+ def add_param(request, :body, key, value) when is_binary(key) do
request
|> Map.put(:body, Multipart.new())
|> Map.update!(
:body,
&Multipart.add_field(
&1,
key,
Jason.encode!(value),
headers: [{"content-type", "application/json"}]
)
)
end
def add_param(request, :file, name, path) do
request
|> Map.put(:body, Multipart.new())
|> Map.update!(:body, &Multipart.add_file(&1, path, name: name))
end
def add_param(request, :form, name, value) do
Map.update(request, :body, %{name => value}, &Map.put(&1, name, value))
end
def add_param(request, location, key, value) do
Map.update(request, location, [{key, value}], &(&1 ++ [{key, value}]))
end
def convert_to_keyword(request) do
request
|> Map.from_struct()
|> Enum.into([])
end
end
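A sketch of chaining the builder helpers above into a request; the URL and parameters are made up:
```
alias Pleroma.HTTP.RequestBuilder, as: Builder

Builder.new()
|> Builder.method(:get)
|> Builder.url("https://example.com/api/v1/instance")
|> Builder.headers([{"accept", "application/json"}])
|> Builder.add_param(:query, :query, page: 1)
|> Builder.convert_to_keyword()
```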
diff --git a/lib/pleroma/maps.ex b/lib/pleroma/maps.ex
index 6d586e53e..5020a8ff8 100644
--- a/lib/pleroma/maps.ex
+++ b/lib/pleroma/maps.ex
@@ -1,21 +1,34 @@
# Pleroma: A lightweight social networking server
-# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
+# Copyright © 2017-2024 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Maps do
def put_if_present(map, key, value, value_function \\ &{:ok, &1}) when is_map(map) do
with false <- is_nil(key),
false <- is_nil(value),
{:ok, new_value} <- value_function.(value) do
Map.put(map, key, new_value)
else
_ -> map
end
end
def safe_put_in(data, keys, value) when is_map(data) and is_list(keys) do
Kernel.put_in(data, keys, value)
rescue
_ -> data
end
+
+ def filter_empty_values(data) do
+ # TODO: Change to Map.filter in Elixir 1.13+
+ data
+ |> Enum.filter(fn
+ {_k, nil} -> false
+ {_k, ""} -> false
+ {_k, []} -> false
+ {_k, %{} = v} -> Map.keys(v) != []
+ {_k, _v} -> true
+ end)
+ |> Map.new()
+ end
end
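Behaviour of the new `filter_empty_values/1` helper on an assumed map:
```
Pleroma.Maps.filter_empty_values(%{
  "bio" => "",
  "fields" => [],
  "icon" => %{},
  "name" => "Ada",
  "tag" => nil
})
# => %{"name" => "Ada"}
```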
diff --git a/lib/pleroma/mfa.ex b/lib/pleroma/mfa.ex
index 01b730c76..ce30cd4ad 100644
--- a/lib/pleroma/mfa.ex
+++ b/lib/pleroma/mfa.ex
@@ -1,155 +1,155 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.MFA do
@moduledoc """
The MFA context.
"""
alias Pleroma.User
alias Pleroma.MFA.BackupCodes
alias Pleroma.MFA.Changeset
alias Pleroma.MFA.Settings
alias Pleroma.MFA.TOTP
@doc """
Returns MFA methods the user has enabled.
## Examples
iex> Pleroma.MFA.supported_methods(user)
"totp,u2f"
"""
@spec supported_methods(User.t()) :: String.t()
def supported_methods(user) do
settings = fetch_settings(user)
Settings.mfa_methods()
|> Enum.reduce([], fn m, acc ->
if method_enabled?(m, settings) do
acc ++ [m]
else
acc
end
end)
|> Enum.join(",")
end
@doc "Checks that user enabled MFA"
def require?(user) do
fetch_settings(user).enabled
end
@doc """
Display MFA settings of user
"""
def mfa_settings(user) do
settings = fetch_settings(user)
Settings.mfa_methods()
|> Enum.map(fn m -> [m, method_enabled?(m, settings)] end)
|> Enum.into(%{enabled: settings.enabled}, fn [a, b] -> {a, b} end)
end
@doc false
def fetch_settings(%User{} = user) do
user.multi_factor_authentication_settings || %Settings{}
end
@doc "clears backup codes"
def invalidate_backup_code(%User{} = user, hash_code) do
%{backup_codes: codes} = fetch_settings(user)
user
|> Changeset.cast_backup_codes(codes -- [hash_code])
|> User.update_and_set_cache()
end
@doc "generates backup codes"
@spec generate_backup_codes(User.t()) :: {:ok, list(binary)} | {:error, String.t()}
def generate_backup_codes(%User{} = user) do
with codes <- BackupCodes.generate(),
hashed_codes <- Enum.map(codes, &Pleroma.Password.Pbkdf2.hash_pwd_salt/1),
changeset <- Changeset.cast_backup_codes(user, hashed_codes),
{:ok, _} <- User.update_and_set_cache(changeset) do
{:ok, codes}
else
{:error, msg} ->
- %{error: msg}
+ {:error, msg}
end
end
@doc """
Generates secret key and set delivery_type to 'app' for TOTP method.
"""
@spec setup_totp(User.t()) :: {:ok, User.t()} | {:error, Ecto.Changeset.t()}
def setup_totp(user) do
user
|> Changeset.setup_totp(%{secret: TOTP.generate_secret(), delivery_type: "app"})
|> User.update_and_set_cache()
end
@doc """
Confirms the TOTP method for user.
`attrs`:
`password` - current user password
`code` - TOTP token
"""
@spec confirm_totp(User.t(), map()) :: {:ok, User.t()} | {:error, Ecto.Changeset.t() | atom()}
def confirm_totp(%User{} = user, attrs) do
with settings <- user.multi_factor_authentication_settings.totp,
{:ok, :pass} <- TOTP.validate_token(settings.secret, attrs["code"]) do
user
|> Changeset.confirm_totp()
|> User.update_and_set_cache()
end
end
@doc """
Disables the TOTP method for user.
`attrs`:
`password` - current user password
"""
@spec disable_totp(User.t()) :: {:ok, User.t()} | {:error, Ecto.Changeset.t()}
def disable_totp(%User{} = user) do
user
|> Changeset.disable_totp()
|> Changeset.disable()
|> User.update_and_set_cache()
end
@doc """
Force disables all MFA methods for user.
"""
@spec disable(User.t()) :: {:ok, User.t()} | {:error, Ecto.Changeset.t()}
def disable(%User{} = user) do
user
|> Changeset.disable_totp()
|> Changeset.disable(true)
|> User.update_and_set_cache()
end
@doc """
Checks if the user has MFA method enabled.
"""
def method_enabled?(method, settings) do
with {:ok, %{confirmed: true} = _} <- Map.fetch(settings, method) do
true
else
_ -> false
end
end
@doc """
Checks if the user has enabled at least one MFA method.
"""
def enabled?(settings) do
Settings.mfa_methods()
|> Enum.map(fn m -> method_enabled?(m, settings) end)
|> Enum.any?()
end
end
diff --git a/lib/pleroma/mfa/totp.ex b/lib/pleroma/mfa/totp.ex
index 429c4b700..96fa8d71d 100644
--- a/lib/pleroma/mfa/totp.ex
+++ b/lib/pleroma/mfa/totp.ex
@@ -1,86 +1,87 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.MFA.TOTP do
@moduledoc """
This module represents functions to create secrets for
TOTP Application as well as validate them with a time based token.
"""
alias Pleroma.Config
@config_ns [:instance, :multi_factor_authentication, :totp]
@doc """
https://github.com/google/google-authenticator/wiki/Key-Uri-Format
"""
+ @spec provisioning_uri(String.t(), String.t(), list()) :: String.t()
def provisioning_uri(secret, label, opts \\ []) do
query =
%{
secret: secret,
issuer: Keyword.get(opts, :issuer, default_issuer()),
digits: Keyword.get(opts, :digits, default_digits()),
period: Keyword.get(opts, :period, default_period())
}
|> Enum.filter(fn {_, v} -> not is_nil(v) end)
|> Enum.into(%{})
|> URI.encode_query()
%URI{scheme: "otpauth", host: "totp", path: "/" <> label, query: query}
- |> URI.to_string()
+ |> to_string()
end
defp default_period, do: Config.get(@config_ns ++ [:period])
defp default_digits, do: Config.get(@config_ns ++ [:digits])
defp default_issuer,
do: Config.get(@config_ns ++ [:issuer], Config.get([:instance, :name]))
@doc "Creates a random Base 32 encoded string"
def generate_secret do
Base.encode32(:crypto.strong_rand_bytes(10))
end
@doc "Generates a valid token based on a secret"
def generate_token(secret) do
:pot.totp(secret)
end
@doc """
Validates a given token based on a secret.
optional parameters:
`token_length` default `6`
`interval_length` default `30`
`window` default 0
Returns {:ok, :pass} if the token is valid and
{:error, :invalid_token} if it is not.
"""
@spec validate_token(String.t(), String.t()) ::
{:ok, :pass} | {:error, :invalid_token | :invalid_secret_and_token}
def validate_token(secret, token)
when is_binary(secret) and is_binary(token) do
opts = [
token_length: default_digits(),
interval_length: default_period()
]
validate_token(secret, token, opts)
end
def validate_token(_, _), do: {:error, :invalid_secret_and_token}
@doc "See `validate_token/2`"
@spec validate_token(String.t(), String.t(), Keyword.t()) ::
{:ok, :pass} | {:error, :invalid_token | :invalid_secret_and_token}
def validate_token(secret, token, options)
when is_binary(secret) and is_binary(token) do
case :pot.valid_totp(token, secret, options) do
true -> {:ok, :pass}
false -> {:error, :invalid_token}
end
end
def validate_token(_, _, _), do: {:error, :invalid_secret_and_token}
end
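An illustrative TOTP round trip with the functions above, assuming the default TOTP configuration (6 digits, 30 second period) is loaded:
```
secret = Pleroma.MFA.TOTP.generate_secret()
token = Pleroma.MFA.TOTP.generate_token(secret)
{:ok, :pass} = Pleroma.MFA.TOTP.validate_token(secret, token)

Pleroma.MFA.TOTP.provisioning_uri(secret, "alice@example.com", issuer: "Pleroma")
# => "otpauth://totp/alice@example.com?digits=6&issuer=Pleroma&period=30&secret=..."
# (query parameter order may vary)
```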
diff --git a/lib/pleroma/notification.ex b/lib/pleroma/notification.ex
index 728835271..71eac9b41 100644
--- a/lib/pleroma/notification.ex
+++ b/lib/pleroma/notification.ex
@@ -1,855 +1,855 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Notification do
use Ecto.Schema
alias Ecto.Multi
alias Pleroma.Activity
alias Pleroma.FollowingRelationship
alias Pleroma.Marker
alias Pleroma.Notification
alias Pleroma.Object
alias Pleroma.Pagination
alias Pleroma.Repo
alias Pleroma.ThreadMute
alias Pleroma.User
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.CommonAPI.Utils
alias Pleroma.Web.Push
alias Pleroma.Web.Streamer
import Ecto.Query
import Ecto.Changeset
require Logger
@type t :: %__MODULE__{}
@include_muted_option :with_muted
schema "notifications" do
field(:seen, :boolean, default: false)
# This is an enum type in the database. If you add a new notification type,
# remember to add a migration to add it to the `notifications_type` enum
# as well.
field(:type, :string)
belongs_to(:user, User, type: FlakeId.Ecto.CompatType)
belongs_to(:activity, Activity, type: FlakeId.Ecto.CompatType)
timestamps()
end
def update_notification_type(user, activity) do
with %__MODULE__{} = notification <-
Repo.get_by(__MODULE__, user_id: user.id, activity_id: activity.id) do
type =
activity
|> type_from_activity()
notification
|> changeset(%{type: type})
|> Repo.update()
end
end
@spec unread_notifications_count(User.t()) :: integer()
def unread_notifications_count(%User{id: user_id}) do
from(q in __MODULE__,
where: q.user_id == ^user_id and q.seen == false
)
|> Repo.aggregate(:count, :id)
end
@notification_types ~w{
favourite
follow
follow_request
mention
move
pleroma:chat_mention
pleroma:emoji_reaction
pleroma:report
reblog
poll
pleroma:participation_accepted
pleroma:participation_request
pleroma:event_reminder
pleroma:event_update
}
def changeset(%Notification{} = notification, attrs) do
notification
|> cast(attrs, [:seen, :type])
|> validate_inclusion(:type, @notification_types)
end
@spec last_read_query(User.t()) :: Ecto.Queryable.t()
def last_read_query(user) do
from(q in Pleroma.Notification,
where: q.user_id == ^user.id,
where: q.seen == true,
select: type(q.id, :string),
limit: 1,
- order_by: [desc: :id]
+ order_by: fragment("? desc nulls last", q.id)
)
end
defp for_user_query_ap_id_opts(user, opts) do
ap_id_relationships =
[:block] ++
if opts[@include_muted_option], do: [], else: [:notification_mute]
preloaded_ap_ids = User.outgoing_relationships_ap_ids(user, ap_id_relationships)
exclude_blocked_opts = Map.merge(%{blocked_users_ap_ids: preloaded_ap_ids[:block]}, opts)
exclude_notification_muted_opts =
Map.merge(%{notification_muted_users_ap_ids: preloaded_ap_ids[:notification_mute]}, opts)
{exclude_blocked_opts, exclude_notification_muted_opts}
end
def for_user_query(user, opts \\ %{}) do
{exclude_blocked_opts, exclude_notification_muted_opts} =
for_user_query_ap_id_opts(user, opts)
Notification
|> where(user_id: ^user.id)
|> join(:inner, [n], activity in assoc(n, :activity))
|> join(:left, [n, a], object in Object,
on:
fragment(
"(?->>'id') = associated_object_id(?)",
object.data,
a.data
)
)
|> join(:inner, [_n, a], u in User, on: u.ap_id == a.actor, as: :user_actor)
|> preload([n, a, o], activity: {a, object: o})
|> where([user_actor: user_actor], user_actor.is_active)
|> exclude_notification_muted(user, exclude_notification_muted_opts)
|> exclude_blocked(user, exclude_blocked_opts)
|> exclude_blockers(user)
|> exclude_filtered(user)
|> exclude_visibility(opts)
end
# Excludes blocked users and non-followed domain-blocked users
defp exclude_blocked(query, user, opts) do
blocked_ap_ids = opts[:blocked_users_ap_ids] || User.blocked_users_ap_ids(user)
query
|> where([n, a], a.actor not in ^blocked_ap_ids)
|> FollowingRelationship.keep_following_or_not_domain_blocked(user)
end
defp exclude_blockers(query, user) do
if Pleroma.Config.get([:activitypub, :blockers_visible]) == true do
query
else
blocker_ap_ids = User.incoming_relationships_ungrouped_ap_ids(user, [:block])
query
|> where([n, a], a.actor not in ^blocker_ap_ids)
end
end
defp exclude_notification_muted(query, _, %{@include_muted_option => true}) do
query
end
defp exclude_notification_muted(query, user, opts) do
notification_muted_ap_ids =
opts[:notification_muted_users_ap_ids] || User.notification_muted_users_ap_ids(user)
query
|> where([n, a], a.actor not in ^notification_muted_ap_ids)
|> join(:left, [n, a], tm in ThreadMute,
on: tm.user_id == ^user.id and tm.context == fragment("?->>'context'", a.data),
as: :thread_mute
)
|> where([thread_mute: thread_mute], is_nil(thread_mute.user_id))
end
defp exclude_filtered(query, user) do
case Pleroma.Filter.compose_regex(user) do
nil ->
query
regex ->
from([_n, a, o] in query,
where:
fragment("not(?->>'content' ~* ?)", o.data, ^regex) or
fragment("?->>'content' is null", o.data) or
fragment("?->>'actor' = ?", o.data, ^user.ap_id)
)
end
end
@valid_visibilities ~w[direct unlisted public private]
defp exclude_visibility(query, %{exclude_visibilities: visibility})
when is_list(visibility) do
if Enum.all?(visibility, &(&1 in @valid_visibilities)) do
query
|> join(:left, [n, a], mutated_activity in Pleroma.Activity,
on:
fragment(
"associated_object_id(?)",
a.data
) ==
fragment(
"associated_object_id(?)",
mutated_activity.data
) and
fragment("(?->>'type' = 'Like' or ?->>'type' = 'Announce')", a.data, a.data) and
fragment("?->>'type'", mutated_activity.data) == "Create",
as: :mutated_activity
)
|> where(
[n, a, mutated_activity: mutated_activity],
not fragment(
"""
CASE WHEN (?->>'type') = 'Like' or (?->>'type') = 'Announce'
THEN (activity_visibility(?, ?, ?) = ANY (?))
ELSE (activity_visibility(?, ?, ?) = ANY (?)) END
""",
a.data,
a.data,
mutated_activity.actor,
mutated_activity.recipients,
mutated_activity.data,
^visibility,
a.actor,
a.recipients,
a.data,
^visibility
)
)
else
Logger.error("Could not exclude visibility to #{visibility}")
query
end
end
defp exclude_visibility(query, %{exclude_visibilities: visibility})
when visibility in @valid_visibilities do
exclude_visibility(query, [visibility])
end
defp exclude_visibility(query, %{exclude_visibilities: visibility})
when visibility not in @valid_visibilities do
Logger.error("Could not exclude visibility to #{visibility}")
query
end
defp exclude_visibility(query, _visibility), do: query
def for_user(user, opts \\ %{}) do
user
|> for_user_query(opts)
|> Pagination.fetch_paginated(opts)
end
@doc """
Returns notifications for user received since given date.
## Examples
iex> Pleroma.Notification.for_user_since(%Pleroma.User{}, ~N[2019-04-13 11:22:33])
[%Pleroma.Notification{}, %Pleroma.Notification{}]
iex> Pleroma.Notification.for_user_since(%Pleroma.User{}, ~N[2019-04-15 11:22:33])
[]
"""
@spec for_user_since(Pleroma.User.t(), NaiveDateTime.t()) :: [t()]
def for_user_since(user, date) do
from(n in for_user_query(user),
where: n.updated_at > ^date
)
|> Repo.all()
end
def set_read_up_to(%{id: user_id} = user, id) do
query =
from(
n in Notification,
where: n.user_id == ^user_id,
where: n.id <= ^id,
where: n.seen == false,
# Ideally we would preload object and activities here
# but Ecto does not support preloads in update_all
select: n.id
)
{:ok, %{ids: {_, notification_ids}}} =
Multi.new()
|> Multi.update_all(:ids, query, set: [seen: true, updated_at: NaiveDateTime.utc_now()])
|> Marker.multi_set_last_read_id(user, "notifications")
|> Repo.transaction()
for_user_query(user)
|> where([n], n.id in ^notification_ids)
|> Repo.all()
end
@spec read_one(User.t(), String.t()) ::
{:ok, Notification.t()} | {:error, Ecto.Changeset.t()} | nil
def read_one(%User{} = user, notification_id) do
with {:ok, %Notification{} = notification} <- get(user, notification_id) do
Multi.new()
|> Multi.update(:update, changeset(notification, %{seen: true}))
|> Marker.multi_set_last_read_id(user, "notifications")
|> Repo.transaction()
|> case do
{:ok, %{update: notification}} -> {:ok, notification}
{:error, :update, changeset, _} -> {:error, changeset}
end
end
end
def get(%{id: user_id} = _user, id) do
query =
from(
n in Notification,
where: n.id == ^id,
join: activity in assoc(n, :activity),
preload: [activity: activity]
)
notification = Repo.one(query)
case notification do
%{user_id: ^user_id} ->
{:ok, notification}
_ ->
{:error, "Cannot get notification"}
end
end
def clear(user) do
from(n in Notification, where: n.user_id == ^user.id)
|> Repo.delete_all()
end
def destroy_multiple(%{id: user_id} = _user, ids) do
from(n in Notification,
where: n.id in ^ids,
where: n.user_id == ^user_id
)
|> Repo.delete_all()
end
def dismiss(%Pleroma.Activity{} = activity) do
Notification
|> where([n], n.activity_id == ^activity.id)
|> Repo.delete_all()
|> case do
{_, notifications} -> {:ok, notifications}
_ -> {:error, "Cannot dismiss notification"}
end
end
def dismiss(%{id: user_id} = _user, id) do
notification = Repo.get(Notification, id)
case notification do
%{user_id: ^user_id} ->
Repo.delete(notification)
_ ->
{:error, "Cannot dismiss notification"}
end
end
@spec create_notifications(Activity.t(), keyword()) :: {:ok, [Notification.t()] | []}
def create_notifications(activity, options \\ [])
def create_notifications(%Activity{data: %{"to" => _, "type" => "Create"}} = activity, options) do
object = Object.normalize(activity, fetch: false)
if object && object.data["type"] == "Answer" do
{:ok, []}
else
do_create_notifications(activity, options)
end
end
def create_notifications(%Activity{data: %{"type" => type}} = activity, options)
when type in [
"Follow",
"Like",
"Announce",
"Move",
"EmojiReact",
"Flag",
"Update",
"Accept",
"Join"
] do
do_create_notifications(activity, options)
end
def create_notifications(_, _), do: {:ok, []}
defp do_create_notifications(%Activity{} = activity, options) do
do_send = Keyword.get(options, :do_send, true)
{enabled_participants, disabled_participants} =
get_notified_participants_from_activity(activity)
potential_participants = enabled_participants ++ disabled_participants
{enabled_receivers, disabled_receivers} = get_notified_from_activity(activity)
potential_receivers = (enabled_receivers ++ disabled_receivers) -- potential_participants
notifications =
(Enum.map(potential_receivers, fn user ->
do_send = do_send && user in enabled_receivers
create_notification(activity, user, do_send: do_send)
end) ++
Enum.map(potential_participants, fn user ->
do_send = do_send && user in enabled_participants
create_notification(activity, user, do_send: do_send, type: "pleroma:event_update")
end))
|> Enum.reject(&is_nil/1)
{:ok, notifications}
end
defp type_from_activity(%{data: %{"type" => type}} = activity) do
case type do
"Follow" ->
if Activity.follow_accepted?(activity) do
"follow"
else
"follow_request"
end
"Announce" ->
"reblog"
"Like" ->
"favourite"
"Move" ->
"move"
"EmojiReact" ->
"pleroma:emoji_reaction"
"Flag" ->
"pleroma:report"
# Compatibility with old reactions
"EmojiReaction" ->
"pleroma:emoji_reaction"
"Create" ->
activity
|> type_from_activity_object()
"Update" ->
"update"
"Accept" ->
"pleroma:participation_accepted"
"Join" ->
"pleroma:participation_request"
t ->
raise "No notification type for activity type #{t}"
end
end
defp type_from_activity_object(%{data: %{"type" => "Create", "object" => %{}}}), do: "mention"
defp type_from_activity_object(%{data: %{"type" => "Create"}} = activity) do
object = Object.get_by_ap_id(activity.data["object"])
case object && object.data["type"] do
"ChatMessage" -> "pleroma:chat_mention"
_ -> "mention"
end
end
# TODO: move this to SQL, too.
def create_notification(%Activity{} = activity, %User{} = user, opts \\ []) do
do_send = Keyword.get(opts, :do_send, true)
type = Keyword.get(opts, :type, type_from_activity(activity))
unless skip?(activity, user, opts) do
{:ok, %{notification: notification}} =
Multi.new()
|> Multi.insert(:notification, %Notification{
user_id: user.id,
activity: activity,
seen: mark_as_read?(activity, user),
type: type
})
|> Marker.multi_set_last_read_id(user, "notifications")
|> Repo.transaction()
if do_send do
Streamer.stream(["user", "user:notification"], notification)
Push.send(notification)
end
notification
end
end
def create_poll_notifications(%Activity{} = activity) do
with %Object{data: %{"type" => "Question", "actor" => actor} = data} <-
Object.normalize(activity) do
voters =
case data do
%{"voters" => voters} when is_list(voters) -> voters
_ -> []
end
notifications =
Enum.reduce([actor | voters], [], fn ap_id, acc ->
with %User{local: true} = user <- User.get_by_ap_id(ap_id) do
[create_notification(activity, user, type: "poll") | acc]
else
_ -> acc
end
end)
{:ok, notifications}
end
end
def create_event_notifications(%Activity{} = activity) do
with %Object{data: %{"type" => "Event", "actor" => actor} = data} <-
Object.normalize(activity) do
participations =
case data do
%{"participations" => participations} when is_list(participations) -> participations
_ -> []
end
notifications =
Enum.reduce([actor | participations], [], fn ap_id, acc ->
with %User{local: true} = user <- User.get_by_ap_id(ap_id) do
[create_notification(activity, user, type: "pleroma:event_reminder") | acc]
else
_ -> acc
end
end)
{:ok, notifications}
end
end
@doc """
Returns a tuple with 2 elements:
{notification-enabled receivers, currently disabled receivers (blocking / [thread] muting)}
NOTE: might be called for FAKE Activities, see ActivityPub.Utils.get_notified_from_object/1
"""
@spec get_notified_from_activity(Activity.t(), boolean()) :: {list(User.t()), list(User.t())}
def get_notified_from_activity(activity, local_only \\ true)
def get_notified_from_activity(%Activity{data: %{"type" => type}} = activity, local_only)
when type in [
"Create",
"Like",
"Announce",
"Follow",
"Move",
"EmojiReact",
"Flag",
"Update",
"Accept",
"Join"
] do
potential_receiver_ap_ids = get_potential_receiver_ap_ids(activity)
potential_receivers =
User.get_users_from_set(potential_receiver_ap_ids, local_only: local_only)
notification_enabled_ap_ids =
potential_receiver_ap_ids
|> exclude_domain_blocker_ap_ids(activity, potential_receivers)
|> exclude_relationship_restricted_ap_ids(activity)
|> exclude_thread_muter_ap_ids(activity)
notification_enabled_users =
Enum.filter(potential_receivers, fn u -> u.ap_id in notification_enabled_ap_ids end)
{notification_enabled_users, potential_receivers -- notification_enabled_users}
end
def get_notified_from_activity(_, _local_only), do: {[], []}
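# Illustrative return shape (a sketch, not part of the original diff): users
# who should receive a notification come first, users filtered out by
# blocking/muting come second, e.g. {[%User{} = enabled_receiver], [%User{} = thread_muter]}.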
def get_notified_participants_from_activity(activity, local_only \\ true)
def get_notified_participants_from_activity(
%Activity{data: %{"type" => "Update"}} = activity,
local_only
) do
notification_enabled_ap_ids =
[]
|> Utils.maybe_notify_participants(activity)
potential_receivers =
User.get_users_from_set(notification_enabled_ap_ids, local_only: local_only)
notification_enabled_users =
Enum.filter(potential_receivers, fn u -> u.ap_id in notification_enabled_ap_ids end)
{notification_enabled_users, potential_receivers -- notification_enabled_users}
end
def get_notified_participants_from_activity(_, _), do: {[], []}
# For some activities, only notify the author of the object
def get_potential_receiver_ap_ids(%{data: %{"type" => type, "object" => object_id}})
when type in ~w{Like Announce EmojiReact} do
case Object.get_cached_by_ap_id(object_id) do
%Object{data: %{"actor" => actor}} ->
[actor]
_ ->
[]
end
end
def get_potential_receiver_ap_ids(%{data: %{"type" => "Accept", "object" => join_id}}) do
case Activity.get_by_ap_id_with_object(join_id) do
%Activity{
data: %{"type" => "Join"},
object: %Object{data: %{"type" => "Event", "joinMode" => "free"}}
} ->
[]
%Activity{data: %{"type" => "Join", "actor" => actor_id}} ->
[actor_id]
_ ->
[]
end
end
def get_potential_receiver_ap_ids(%{data: %{"type" => "Join", "object" => object_id}}) do
case Object.get_by_ap_id(object_id) do
%Object{data: %{"type" => "Event", "joinMode" => "free"}} ->
[]
%Object{data: %{"type" => "Event", "actor" => actor_id}} ->
[actor_id]
_ ->
[]
end
end
def get_potential_receiver_ap_ids(%{data: %{"type" => "Follow", "object" => object_id}}) do
[object_id]
end
def get_potential_receiver_ap_ids(%{data: %{"type" => "Flag", "actor" => actor}}) do
(User.all_users_with_privilege(:reports_manage_reports)
|> Enum.map(fn user -> user.ap_id end)) --
[actor]
end
# Update activity: notify all who repeated this
def get_potential_receiver_ap_ids(%{data: %{"type" => "Update", "actor" => actor}} = activity) do
with %Object{data: %{"id" => object_id}} <- Object.normalize(activity, fetch: false) do
repeaters =
Activity.Queries.by_type("Announce")
|> Activity.Queries.by_object_id(object_id)
|> Activity.with_joined_user_actor()
|> where([a, u], u.local)
|> select([a, u], u.ap_id)
|> Repo.all()
repeaters -- [actor]
end
end
def get_potential_receiver_ap_ids(activity) do
[]
|> Utils.maybe_notify_to_recipients(activity)
|> Utils.maybe_notify_mentioned_recipients(activity)
|> Utils.maybe_notify_subscribers(activity)
|> Utils.maybe_notify_followers(activity)
|> Enum.uniq()
end
@doc "Filters out AP IDs domain-blocking and not following the activity's actor"
def exclude_domain_blocker_ap_ids(ap_ids, activity, preloaded_users \\ [])
def exclude_domain_blocker_ap_ids([], _activity, _preloaded_users), do: []
def exclude_domain_blocker_ap_ids(ap_ids, %Activity{} = activity, preloaded_users) do
activity_actor_domain = activity.actor && URI.parse(activity.actor).host
users =
ap_ids
|> Enum.map(fn ap_id ->
Enum.find(preloaded_users, &(&1.ap_id == ap_id)) ||
User.get_cached_by_ap_id(ap_id)
end)
|> Enum.filter(& &1)
domain_blocker_ap_ids = for u <- users, activity_actor_domain in u.domain_blocks, do: u.ap_id
domain_blocker_follower_ap_ids =
if Enum.any?(domain_blocker_ap_ids) do
activity
|> Activity.user_actor()
|> FollowingRelationship.followers_ap_ids(domain_blocker_ap_ids)
else
[]
end
ap_ids
|> Kernel.--(domain_blocker_ap_ids)
|> Kernel.++(domain_blocker_follower_ap_ids)
end
@doc "Filters out AP IDs of users basing on their relationships with activity actor user"
def exclude_relationship_restricted_ap_ids([], _activity), do: []
def exclude_relationship_restricted_ap_ids(ap_ids, %Activity{} = activity) do
relationship_restricted_ap_ids =
activity
|> Activity.user_actor()
|> User.incoming_relationships_ungrouped_ap_ids([
:block,
:notification_mute
])
Enum.uniq(ap_ids) -- relationship_restricted_ap_ids
end
@doc "Filters out AP IDs of users who mute activity thread"
def exclude_thread_muter_ap_ids([], _activity), do: []
def exclude_thread_muter_ap_ids(ap_ids, %Activity{} = activity) do
thread_muter_ap_ids = ThreadMute.muter_ap_ids(activity.data["context"])
Enum.uniq(ap_ids) -- thread_muter_ap_ids
end
def skip?(activity, user, opts \\ [])
@spec skip?(Activity.t(), User.t(), Keyword.t()) :: boolean()
def skip?(%Activity{} = activity, %User{} = user, opts) do
[
:self,
:invisible,
:block_from_strangers,
:recently_followed,
:filtered
]
|> Enum.find(&skip?(&1, activity, user, opts))
end
def skip?(_activity, _user, _opts), do: false
@spec skip?(atom(), Activity.t(), User.t(), Keyword.t()) :: boolean()
def skip?(:self, %Activity{} = activity, %User{} = user, opts) do
cond do
opts[:type] == "poll" -> false
activity.data["actor"] == user.ap_id -> true
true -> false
end
end
def skip?(:invisible, %Activity{} = activity, _user, _opts) do
actor = activity.data["actor"]
user = User.get_cached_by_ap_id(actor)
User.invisible?(user)
end
def skip?(
:block_from_strangers,
%Activity{} = activity,
%User{notification_settings: %{block_from_strangers: true}} = user,
opts
) do
actor = activity.data["actor"]
follower = User.get_cached_by_ap_id(actor)
cond do
opts[:type] == "poll" -> false
user.ap_id == actor -> false
!User.following?(user, follower) -> true
true -> false
end
end
# TODO: consider defining recency in hours and checking FollowingRelationship with a single SQL query
def skip?(
:recently_followed,
%Activity{data: %{"type" => "Follow"}} = activity,
%User{} = user,
_opts
) do
actor = activity.data["actor"]
Notification.for_user(user)
|> Enum.any?(fn
%{activity: %{data: %{"type" => "Follow", "actor" => ^actor}}} -> true
_ -> false
end)
end
def skip?(:filtered, %{data: %{"type" => type}}, _user, _opts) when type in ["Follow", "Move"],
do: false
def skip?(:filtered, activity, user, _opts) do
object = Object.normalize(activity, fetch: false)
cond do
is_nil(object) ->
false
object.data["actor"] == user.ap_id ->
false
not is_nil(regex = Pleroma.Filter.compose_regex(user, :re)) ->
Regex.match?(regex, object.data["content"])
true ->
false
end
end
def skip?(_type, _activity, _user, _opts), do: false
def mark_as_read?(activity, target_user) do
user = Activity.user_actor(activity)
User.mutes_user?(target_user, user) || CommonAPI.thread_muted?(target_user, activity)
end
def for_user_and_activity(user, activity) do
from(n in __MODULE__,
where: n.user_id == ^user.id,
where: n.activity_id == ^activity.id
)
|> Repo.one()
end
@spec mark_context_as_read(User.t(), String.t()) :: {integer(), nil | [term()]}
def mark_context_as_read(%User{id: id}, context) do
from(
n in Notification,
join: a in assoc(n, :activity),
where: n.user_id == ^id,
where: n.seen == false,
where: fragment("?->>'context'", a.data) == ^context
)
|> Repo.update_all(set: [seen: true])
end
end
diff --git a/lib/pleroma/pagination.ex b/lib/pleroma/pagination.ex
index f12ca2819..8db732cc9 100644
--- a/lib/pleroma/pagination.ex
+++ b/lib/pleroma/pagination.ex
@@ -1,165 +1,166 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Pagination do
@moduledoc """
Implements Mastodon-compatible pagination.
"""
import Ecto.Query
import Ecto.Changeset
alias Pleroma.Repo
@type type :: :keyset | :offset
@default_limit 20
@max_limit 40
@page_keys ["max_id", "min_id", "limit", "since_id", "order"]
def page_keys, do: @page_keys
@spec fetch_paginated(Ecto.Query.t(), map(), type(), atom() | nil) :: [Ecto.Schema.t()]
def fetch_paginated(query, params, type \\ :keyset, table_binding \\ nil)
def fetch_paginated(query, %{total: true} = params, :keyset, table_binding) do
total = Repo.aggregate(query, :count, :id)
%{
total: total,
items: fetch_paginated(query, Map.drop(params, [:total]), :keyset, table_binding)
}
end
def fetch_paginated(query, params, :keyset, table_binding) do
options = cast_params(params)
query
|> paginate(options, :keyset, table_binding)
|> Repo.all()
|> enforce_order(options)
end
def fetch_paginated(query, %{total: true} = params, :offset, table_binding) do
total =
query
|> Ecto.Query.exclude(:left_join)
|> Repo.aggregate(:count, :id)
%{
total: total,
items: fetch_paginated(query, Map.drop(params, [:total]), :offset, table_binding)
}
end
def fetch_paginated(query, params, :offset, table_binding) do
options = cast_params(params)
query
|> paginate(options, :offset, table_binding)
|> Repo.all()
end
- @spec paginate(Ecto.Query.t(), map(), type(), atom() | nil) :: [Ecto.Schema.t()]
- def paginate(query, options, method \\ :keyset, table_binding \\ nil)
-
- def paginate(list, options, _method, _table_binding) when is_list(list) do
+ @spec paginate_list(list(), keyword()) :: list()
+ def paginate_list(list, options) do
offset = options[:offset] || 0
limit = options[:limit] || 0
Enum.slice(list, offset, limit)
end
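# Illustrative example (a sketch, not part of the original diff):
#
#   iex> paginate_list([1, 2, 3, 4, 5], offset: 1, limit: 2)
#   [2, 3]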
+ @spec paginate(Ecto.Query.t(), map(), type(), atom() | nil) :: [Ecto.Schema.t()]
+ def paginate(query, options, method \\ :keyset, table_binding \\ nil)
+
def paginate(query, options, :keyset, table_binding) do
query
|> restrict(:min_id, options, table_binding)
|> restrict(:since_id, options, table_binding)
|> restrict(:max_id, options, table_binding)
|> restrict(:order, options, table_binding)
|> restrict(:limit, options, table_binding)
end
def paginate(query, options, :offset, table_binding) do
query
|> restrict(:order, options, table_binding)
|> restrict(:offset, options, table_binding)
|> restrict(:limit, options, table_binding)
end
defp cast_params(params) do
param_types = %{
min_id: :string,
since_id: :string,
max_id: :string,
offset: :integer,
limit: :integer,
skip_extra_order: :boolean,
skip_order: :boolean
}
changeset = cast({%{}, param_types}, params, Map.keys(param_types))
changeset.changes
end
defp restrict(query, :min_id, %{min_id: min_id}, table_binding) do
where(query, [{q, table_position(query, table_binding)}], q.id > ^min_id)
end
defp restrict(query, :since_id, %{since_id: since_id}, table_binding) do
where(query, [{q, table_position(query, table_binding)}], q.id > ^since_id)
end
defp restrict(query, :max_id, %{max_id: max_id}, table_binding) do
where(query, [{q, table_position(query, table_binding)}], q.id < ^max_id)
end
defp restrict(query, :order, %{skip_order: true}, _), do: query
defp restrict(%{order_bys: [_ | _]} = query, :order, %{skip_extra_order: true}, _), do: query
defp restrict(query, :order, %{min_id: _}, table_binding) do
order_by(
query,
[{u, table_position(query, table_binding)}],
fragment("? asc nulls last", u.id)
)
end
defp restrict(query, :order, _options, table_binding) do
order_by(
query,
[{u, table_position(query, table_binding)}],
fragment("? desc nulls last", u.id)
)
end
defp restrict(query, :offset, %{offset: offset}, _table_binding) do
offset(query, ^offset)
end
defp restrict(query, :limit, options, _table_binding) do
limit =
case Map.get(options, :limit, @default_limit) do
limit when limit < @max_limit -> limit
_ -> @max_limit
end
query
|> limit(^limit)
end
defp restrict(query, _, _, _), do: query
defp enforce_order(result, %{min_id: _}) do
result
|> Enum.reverse()
end
defp enforce_order(result, _), do: result
defp table_position(%Ecto.Query{} = query, binding_name) do
Map.get(query.aliases, binding_name, 0)
end
defp table_position(_, _), do: 0
end
diff --git a/lib/pleroma/password/pbkdf2.ex b/lib/pleroma/password/pbkdf2.ex
index 92e9e1952..9c6d2e381 100644
--- a/lib/pleroma/password/pbkdf2.ex
+++ b/lib/pleroma/password/pbkdf2.ex
@@ -1,55 +1,55 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Password.Pbkdf2 do
@moduledoc """
This module implements Pbkdf2 passwords in terms of Plug.Crypto.
"""
alias Plug.Crypto.KeyGenerator
def decode64(str) do
str
|> String.replace(".", "+")
|> Base.decode64!(padding: false)
end
def encode64(bin) do
bin
|> Base.encode64(padding: false)
|> String.replace("+", ".")
end
def verify_pass(password, hash) do
["pbkdf2-" <> digest, iterations, salt, hash] = String.split(hash, "$", trim: true)
salt = decode64(salt)
iterations = String.to_integer(iterations)
- digest = String.to_atom(digest)
+ digest = String.to_existing_atom(digest)
binary_hash =
KeyGenerator.generate(password, salt, digest: digest, iterations: iterations, length: 64)
encode64(binary_hash) == hash
end
def hash_pwd_salt(password, opts \\ []) do
salt =
Keyword.get_lazy(opts, :salt, fn ->
:crypto.strong_rand_bytes(16)
end)
digest = Keyword.get(opts, :digest, :sha512)
iterations =
Keyword.get(opts, :iterations, Pleroma.Config.get([:password, :iterations], 160_000))
binary_hash =
KeyGenerator.generate(password, salt, digest: digest, iterations: iterations, length: 64)
"$pbkdf2-#{digest}$#{iterations}$#{encode64(salt)}$#{encode64(binary_hash)}"
end
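# Illustrative round-trip (a sketch, not part of the original diff); a low
# iteration count is used only to keep the example fast:
#
#   iex> hash = hash_pwd_salt("hunter2", iterations: 10)
#   iex> String.starts_with?(hash, "$pbkdf2-sha512$10$")
#   true
#   iex> verify_pass("hunter2", hash)
#   true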
end
diff --git a/lib/pleroma/reverse_proxy.ex b/lib/pleroma/reverse_proxy.ex
index cc4530010..4d13e51fc 100644
--- a/lib/pleroma/reverse_proxy.ex
+++ b/lib/pleroma/reverse_proxy.ex
@@ -1,420 +1,420 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.ReverseProxy do
@range_headers ~w(range if-range)
@keep_req_headers ~w(accept accept-encoding cache-control if-modified-since) ++
~w(if-unmodified-since if-none-match) ++ @range_headers
@resp_cache_headers ~w(etag date last-modified)
@keep_resp_headers @resp_cache_headers ++
- ~w(content-length content-type content-disposition content-encoding) ++
+ ~w(content-type content-disposition content-encoding) ++
~w(content-range accept-ranges vary)
@default_cache_control_header "public, max-age=1209600"
@valid_resp_codes [200, 206, 304]
@max_read_duration :timer.seconds(30)
@max_body_length :infinity
@failed_request_ttl :timer.seconds(60)
@methods ~w(GET HEAD)
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
def max_read_duration_default, do: @max_read_duration
def default_cache_control_header, do: @default_cache_control_header
@moduledoc """
A reverse proxy.
Pleroma.ReverseProxy.call(conn, url, options)
It is not meant to be added into a plug pipeline, but to be called from another plug or controller.
Supports `#{inspect(@methods)}` HTTP methods, and only allows `#{inspect(@valid_resp_codes)}` status codes.
Responses are chunked to the client while downloading from the upstream.
Some request / response headers are preserved:
* request: `#{inspect(@keep_req_headers)}`
* response: `#{inspect(@keep_resp_headers)}`
Options:
* `redirect_on_failure` (default `false`). Redirects the client to the real remote URL if there are any HTTP
errors. Any error during body processing will not be redirected as the response is chunked. This may expose the
remote URL, clients' IPs, ….
* `max_body_length` (default `#{inspect(@max_body_length)}`): limits the content length to be approximately the
specified length. It is validated with the `content-length` header and also verified when proxying.
* `max_read_duration` (default `#{inspect(@max_read_duration)}` ms): the total time the connection is allowed to
read from the remote upstream.
* `failed_request_ttl` (default `#{inspect(@failed_request_ttl)}` ms): the time the failed request is cached and cannot be retried.
* `inline_content_types`:
* `true` will not alter `content-disposition` (up to the upstream),
* `false` will add `content-disposition: attachment` to any request,
* a list of whitelisted content types
* `req_headers`, `resp_headers` additional headers.
* `http`: options for [hackney](https://github.com/benoitc/hackney) or [gun](https://github.com/ninenines/gun).
"""
@default_options [pool: :media]
@inline_content_types [
"image/gif",
"image/jpeg",
"image/jpg",
"image/png",
"image/svg+xml",
"audio/mpeg",
"audio/mp3",
"video/webm",
"video/mp4",
"video/quicktime"
]
require Logger
import Plug.Conn
@type option() ::
{:max_read_duration, non_neg_integer() | :infinity}
| {:max_body_length, non_neg_integer() | :infinity}
| {:failed_request_ttl, non_neg_integer() | :infinity}
| {:http, keyword()}
| {:req_headers, [{String.t(), String.t()}]}
| {:resp_headers, [{String.t(), String.t()}]}
| {:inline_content_types, boolean() | list(String.t())}
| {:redirect_on_failure, boolean()}
@spec call(Plug.Conn.t(), String.t(), list(option())) :: Plug.Conn.t()
def call(_conn, _url, _opts \\ [])
def call(conn = %{method: method}, url, opts) when method in @methods do
client_opts = Keyword.merge(@default_options, Keyword.get(opts, :http, []))
req_headers = build_req_headers(conn.req_headers, opts)
opts =
if filename = Pleroma.Web.MediaProxy.filename(url) do
Keyword.put_new(opts, :attachment_name, filename)
else
opts
end
with {:ok, nil} <- @cachex.get(:failed_proxy_url_cache, url),
{:ok, code, headers, client} <- request(method, url, req_headers, client_opts),
:ok <-
header_length_constraint(
headers,
Keyword.get(opts, :max_body_length, @max_body_length)
) do
response(conn, client, url, code, headers, opts)
else
{:ok, true} ->
conn
|> error_or_redirect(url, 500, "Request failed", opts)
|> halt()
{:ok, code, headers} ->
head_response(conn, url, code, headers, opts)
|> halt()
{:error, {:invalid_http_response, code}} ->
Logger.error("#{__MODULE__}: request to #{inspect(url)} failed with HTTP status #{code}")
track_failed_url(url, code, opts)
conn
|> error_or_redirect(
url,
code,
"Request failed: " <> Plug.Conn.Status.reason_phrase(code),
opts
)
|> halt()
{:error, error} ->
Logger.error("#{__MODULE__}: request to #{inspect(url)} failed: #{inspect(error)}")
track_failed_url(url, error, opts)
conn
|> error_or_redirect(url, 500, "Request failed", opts)
|> halt()
end
end
def call(conn, _, _) do
conn
|> send_resp(400, Plug.Conn.Status.reason_phrase(400))
|> halt()
end
defp request(method, url, headers, opts) do
Logger.debug("#{__MODULE__} #{method} #{url} #{inspect(headers)}")
method = method |> String.downcase() |> String.to_existing_atom()
case client().request(method, url, headers, "", opts) do
{:ok, code, headers, client} when code in @valid_resp_codes ->
{:ok, code, downcase_headers(headers), client}
{:ok, code, headers} when code in @valid_resp_codes ->
{:ok, code, downcase_headers(headers)}
{:ok, code, _, _} ->
{:error, {:invalid_http_response, code}}
{:ok, code, _} ->
{:error, {:invalid_http_response, code}}
{:error, error} ->
{:error, error}
end
end
defp response(conn, client, url, status, headers, opts) do
Logger.debug("#{__MODULE__} #{status} #{url} #{inspect(headers)}")
result =
conn
|> put_resp_headers(build_resp_headers(headers, opts))
|> send_chunked(status)
|> chunk_reply(client, opts)
case result do
{:ok, conn} ->
halt(conn)
{:error, :closed, conn} ->
client().close(client)
halt(conn)
{:error, error, conn} ->
Logger.warning(
"#{__MODULE__} request to #{url} failed while reading/chunking: #{inspect(error)}"
)
client().close(client)
halt(conn)
end
end
defp chunk_reply(conn, client, opts) do
chunk_reply(conn, client, opts, 0, 0)
end
defp chunk_reply(conn, client, opts, sent_so_far, duration) do
with {:ok, duration} <-
check_read_duration(
duration,
Keyword.get(opts, :max_read_duration, @max_read_duration)
),
{:ok, data, client} <- client().stream_body(client),
{:ok, duration} <- increase_read_duration(duration),
sent_so_far = sent_so_far + byte_size(data),
:ok <-
body_size_constraint(
sent_so_far,
Keyword.get(opts, :max_body_length, @max_body_length)
),
{:ok, conn} <- chunk(conn, data) do
chunk_reply(conn, client, opts, sent_so_far, duration)
else
:done -> {:ok, conn}
{:error, error} -> {:error, error, conn}
end
end
defp head_response(conn, url, code, headers, opts) do
Logger.debug("#{__MODULE__} #{code} #{url} #{inspect(headers)}")
conn
|> put_resp_headers(build_resp_headers(headers, opts))
|> send_resp(code, "")
end
defp error_or_redirect(conn, url, code, body, opts) do
if Keyword.get(opts, :redirect_on_failure, false) do
conn
|> Phoenix.Controller.redirect(external: url)
|> halt()
else
conn
|> send_resp(code, body)
|> halt
end
end
defp downcase_headers(headers) do
Enum.map(headers, fn {k, v} ->
{String.downcase(k), v}
end)
end
defp get_content_type(headers) do
{_, content_type} =
List.keyfind(headers, "content-type", 0, {"content-type", "application/octet-stream"})
[content_type | _] = String.split(content_type, ";")
content_type
end
defp put_resp_headers(conn, headers) do
Enum.reduce(headers, conn, fn {k, v}, conn ->
put_resp_header(conn, k, v)
end)
end
defp build_req_headers(headers, opts) do
headers
|> downcase_headers()
|> Enum.filter(fn {k, _} -> k in @keep_req_headers end)
|> build_req_range_or_encoding_header(opts)
|> build_req_user_agent_header(opts)
|> Keyword.merge(Keyword.get(opts, :req_headers, []))
end
# Disable content-encoding if any @range_headers are requested (see #1823).
defp build_req_range_or_encoding_header(headers, _opts) do
range? = Enum.any?(headers, fn {header, _} -> Enum.member?(@range_headers, header) end)
if range? && List.keymember?(headers, "accept-encoding", 0) do
List.keydelete(headers, "accept-encoding", 0)
else
headers
end
end
defp build_req_user_agent_header(headers, _opts) do
List.keystore(
headers,
"user-agent",
0,
{"user-agent", Pleroma.Application.user_agent()}
)
end
defp build_resp_headers(headers, opts) do
headers
|> Enum.filter(fn {k, _} -> k in @keep_resp_headers end)
|> build_resp_cache_headers(opts)
|> build_resp_content_disposition_header(opts)
|> Keyword.merge(Keyword.get(opts, :resp_headers, []))
end
defp build_resp_cache_headers(headers, _opts) do
has_cache? = Enum.any?(headers, fn {k, _} -> k in @resp_cache_headers end)
cond do
has_cache? ->
# There's a caching header present but no cache-control -- we need to set our own
# as Plug defaults to "max-age=0, private, must-revalidate"
List.keystore(
headers,
"cache-control",
0,
{"cache-control", @default_cache_control_header}
)
true ->
List.keystore(
headers,
"cache-control",
0,
{"cache-control", @default_cache_control_header}
)
end
end
defp build_resp_content_disposition_header(headers, opts) do
opt = Keyword.get(opts, :inline_content_types, @inline_content_types)
content_type = get_content_type(headers)
attachment? =
cond do
is_list(opt) && !Enum.member?(opt, content_type) -> true
opt == false -> true
true -> false
end
if attachment? do
name =
try do
{{"content-disposition", content_disposition_string}, _} =
List.keytake(headers, "content-disposition", 0)
[name | _] =
Regex.run(
~r/filename="((?:[^"\\]|\\.)*)"/u,
content_disposition_string || "",
capture: :all_but_first
)
name
rescue
MatchError -> Keyword.get(opts, :attachment_name, "attachment")
end
disposition = "attachment; filename=\"#{name}\""
List.keystore(headers, "content-disposition", 0, {"content-disposition", disposition})
else
headers
end
end
defp header_length_constraint(headers, limit) when is_integer(limit) and limit > 0 do
with {_, size} <- List.keyfind(headers, "content-length", 0),
{size, _} <- Integer.parse(size),
true <- size <= limit do
:ok
else
false ->
{:error, :body_too_large}
_ ->
:ok
end
end
defp header_length_constraint(_, _), do: :ok
defp body_size_constraint(size, limit) when is_integer(limit) and limit > 0 and size >= limit do
{:error, :body_too_large}
end
defp body_size_constraint(_, _), do: :ok
defp check_read_duration(duration, max)
when is_integer(duration) and is_integer(max) and max > 0 do
if duration > max do
{:error, :read_duration_exceeded}
else
{:ok, {duration, :erlang.system_time(:millisecond)}}
end
end
defp check_read_duration(_, _), do: {:ok, :no_duration_limit, :no_duration_limit}
defp increase_read_duration({previous_duration, started})
when is_integer(previous_duration) and is_integer(started) do
duration = :erlang.system_time(:millisecond) - started
{:ok, previous_duration + duration}
end
defp client, do: Pleroma.ReverseProxy.Client.Wrapper
defp track_failed_url(url, error, opts) do
ttl =
unless error in [:body_too_large, 400, 204] do
Keyword.get(opts, :failed_request_ttl, @failed_request_ttl)
else
nil
end
@cachex.put(:failed_proxy_url_cache, url, true, ttl: ttl)
end
end
diff --git a/lib/pleroma/telemetry/logger.ex b/lib/pleroma/telemetry/logger.ex
index 92d395394..9998d8185 100644
--- a/lib/pleroma/telemetry/logger.ex
+++ b/lib/pleroma/telemetry/logger.ex
@@ -1,90 +1,90 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Telemetry.Logger do
@moduledoc "Transforms Pleroma telemetry events to logs"
require Logger
@events [
[:pleroma, :connection_pool, :reclaim, :start],
[:pleroma, :connection_pool, :reclaim, :stop],
[:pleroma, :connection_pool, :provision_failure],
[:pleroma, :connection_pool, :client, :dead],
[:pleroma, :connection_pool, :client, :add]
]
def attach do
:telemetry.attach_many("pleroma-logger", @events, &handle_event/4, [])
end
# Passing anonymous functions instead of strings to the logger is intentional;
# that way strings won't be concatenated if the message is going to be thrown
# out anyway due to a higher configured log level
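# For example (a sketch, not part of the original diff), the closure below is
# only invoked when the debug level is actually enabled:
#
#   Logger.debug(fn -> "expensive report: " <> inspect(big_term) end)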
def handle_event(
[:pleroma, :connection_pool, :reclaim, :start],
_,
%{max_connections: max_connections, reclaim_max: reclaim_max},
_
) do
Logger.debug(fn ->
"Connection pool is exhausted (reached #{max_connections} connections). Starting idle connection cleanup to reclaim as much as #{reclaim_max} connections"
end)
end
def handle_event(
[:pleroma, :connection_pool, :reclaim, :stop],
%{reclaimed_count: 0},
_,
_
) do
Logger.error(fn ->
"Connection pool failed to reclaim any connections due to all of them being in use. It will have to drop requests for opening connections to new hosts"
end)
end
def handle_event(
[:pleroma, :connection_pool, :reclaim, :stop],
%{reclaimed_count: reclaimed_count},
_,
_
) do
Logger.debug(fn -> "Connection pool cleaned up #{reclaimed_count} idle connections" end)
end
def handle_event(
[:pleroma, :connection_pool, :provision_failure],
%{opts: [key | _]},
_,
_
) do
- Logger.error(fn ->
+ Logger.debug(fn ->
"Connection pool had to refuse opening a connection to #{key} due to connection limit exhaustion"
end)
end
def handle_event(
[:pleroma, :connection_pool, :client, :dead],
%{client_pid: client_pid, reason: reason},
%{key: key},
_
) do
Logger.warning(fn ->
"Pool worker for #{key}: Client #{inspect(client_pid)} died before releasing the connection with #{inspect(reason)}"
end)
end
def handle_event(
[:pleroma, :connection_pool, :client, :add],
%{clients: [_, _ | _] = clients},
%{key: key, protocol: :http},
_
) do
- Logger.info(fn ->
+ Logger.debug(fn ->
"Pool worker for #{key}: #{length(clients)} clients are using an HTTP1 connection at the same time, head-of-line blocking might occur."
end)
end
def handle_event([:pleroma, :connection_pool, :client, :add], _, _, _), do: :ok
end
diff --git a/lib/pleroma/web/activity_pub/mrf/steal_emoji_policy.ex b/lib/pleroma/web/activity_pub/mrf/steal_emoji_policy.ex
index 237dfefa5..fa6b595ea 100644
--- a/lib/pleroma/web/activity_pub/mrf/steal_emoji_policy.ex
+++ b/lib/pleroma/web/activity_pub/mrf/steal_emoji_policy.ex
@@ -1,156 +1,158 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.MRF.StealEmojiPolicy do
require Logger
alias Pleroma.Config
@moduledoc "Detect new emojis by their shortcode and steals them"
@behaviour Pleroma.Web.ActivityPub.MRF.Policy
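# Illustrative configuration (a sketch, not part of the original diff),
# mirroring the keys documented in config_description/0 below:
#
#   config :pleroma, :mrf_steal_emoji,
#     hosts: ["example.org"],
#     rejected_shortcodes: [~r/foo/],
#     size_limit: 50_000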
defp accept_host?(host), do: host in Config.get([:mrf_steal_emoji, :hosts], [])
defp shortcode_matches?(shortcode, pattern) when is_binary(pattern) do
shortcode == pattern
end
defp shortcode_matches?(shortcode, pattern) do
String.match?(shortcode, pattern)
end
defp steal_emoji({shortcode, url}, emoji_dir_path) do
url = Pleroma.Web.MediaProxy.url(url)
with {:ok, %{status: status} = response} when status in 200..299 <- Pleroma.HTTP.get(url) do
size_limit = Config.get([:mrf_steal_emoji, :size_limit], 50_000)
if byte_size(response.body) <= size_limit do
extension =
url
|> URI.parse()
|> Map.get(:path)
|> Path.basename()
|> Path.extname()
extension = if extension == "", do: ".png", else: extension
+ shortcode = Path.basename(shortcode)
file_path = Path.join(emoji_dir_path, shortcode <> extension)
case File.write(file_path, response.body) do
:ok ->
shortcode
e ->
Logger.warning("MRF.StealEmojiPolicy: Failed to write to #{file_path}: #{inspect(e)}")
nil
end
else
Logger.debug(
"MRF.StealEmojiPolicy: :#{shortcode}: at #{url} (#{byte_size(response.body)} B) over size limit (#{size_limit} B)"
)
nil
end
else
e ->
Logger.warning("MRF.StealEmojiPolicy: Failed to fetch #{url}: #{inspect(e)}")
nil
end
end
@impl true
def filter(%{"object" => %{"emoji" => foreign_emojis, "actor" => actor}} = message) do
host = URI.parse(actor).host
if host != Pleroma.Web.Endpoint.host() and accept_host?(host) do
installed_emoji = Pleroma.Emoji.get_all() |> Enum.map(fn {k, _} -> k end)
emoji_dir_path =
Config.get(
[:mrf_steal_emoji, :path],
Path.join(Config.get([:instance, :static_dir]), "emoji/stolen")
)
File.mkdir_p(emoji_dir_path)
new_emojis =
foreign_emojis
|> Enum.reject(fn {shortcode, _url} -> shortcode in installed_emoji end)
+ |> Enum.reject(fn {shortcode, _url} -> String.contains?(shortcode, ["/", "\\"]) end)
|> Enum.filter(fn {shortcode, _url} ->
reject_emoji? =
[:mrf_steal_emoji, :rejected_shortcodes]
|> Config.get([])
|> Enum.find(false, fn pattern -> shortcode_matches?(shortcode, pattern) end)
!reject_emoji?
end)
|> Enum.map(&steal_emoji(&1, emoji_dir_path))
|> Enum.filter(& &1)
if !Enum.empty?(new_emojis) do
Logger.info("Stole new emojis: #{inspect(new_emojis)}")
Pleroma.Emoji.reload()
end
end
{:ok, message}
end
def filter(message), do: {:ok, message}
@impl true
@spec config_description :: %{
children: [
%{
description: <<_::272, _::_*256>>,
key: :hosts | :rejected_shortcodes | :size_limit,
suggestions: [any(), ...],
type: {:list, :string} | {:list, :string} | :integer
},
...
],
description: <<_::448>>,
key: :mrf_steal_emoji,
label: <<_::80>>,
related_policy: <<_::352>>
}
def config_description do
%{
key: :mrf_steal_emoji,
related_policy: "Pleroma.Web.ActivityPub.MRF.StealEmojiPolicy",
label: "MRF Emojis",
description: "Steals emojis from selected instances when it sees them.",
children: [
%{
key: :hosts,
type: {:list, :string},
description: "List of hosts to steal emojis from",
suggestions: [""]
},
%{
key: :rejected_shortcodes,
type: {:list, :string},
description: """
A list of patterns or matches to reject shortcodes with.
Each pattern can be a string or [Regex](https://hexdocs.pm/elixir/Regex.html) in the format of `~r/PATTERN/`.
""",
suggestions: ["foo", ~r/foo/]
},
%{
key: :size_limit,
type: :integer,
description: "File size limit (in bytes), checked before an emoji is saved to the disk",
suggestions: ["100000"]
}
]
}
end
@impl true
def describe do
{:ok, %{}}
end
end
diff --git a/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex b/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex
index ab56ba468..4699029d4 100644
--- a/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex
+++ b/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex
@@ -1,125 +1,128 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes do
alias Pleroma.EctoType.ActivityPub.ObjectValidators
+ alias Pleroma.Maps
alias Pleroma.Object
alias Pleroma.Object.Containment
alias Pleroma.User
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.Utils
require Pleroma.Constants
def cast_and_filter_recipients(message, field, follower_collection, field_fallback \\ []) do
{:ok, data} = ObjectValidators.Recipients.cast(message[field] || field_fallback)
data =
Enum.reject(data, fn x ->
String.ends_with?(x, "/followers") and x != follower_collection
end)
Map.put(message, field, data)
end
def fix_object_defaults(data) do
+ data = Maps.filter_empty_values(data)
+
context =
Utils.maybe_create_context(
data["context"] || data["conversation"] || data["inReplyTo"] || data["id"]
)
%User{follower_address: follower_collection} = User.get_cached_by_ap_id(data["attributedTo"])
data
|> Map.put("context", context)
|> cast_and_filter_recipients("to", follower_collection)
|> cast_and_filter_recipients("cc", follower_collection)
|> cast_and_filter_recipients("bto", follower_collection)
|> cast_and_filter_recipients("bcc", follower_collection)
|> Transmogrifier.fix_implicit_addressing(follower_collection)
end
def fix_activity_addressing(activity) do
%User{follower_address: follower_collection} = User.get_cached_by_ap_id(activity["actor"])
activity
|> cast_and_filter_recipients("to", follower_collection)
|> cast_and_filter_recipients("cc", follower_collection)
|> cast_and_filter_recipients("bto", follower_collection)
|> cast_and_filter_recipients("bcc", follower_collection)
|> Transmogrifier.fix_implicit_addressing(follower_collection)
end
def fix_actor(data) do
actor =
data
|> Map.put_new("actor", data["attributedTo"])
|> Containment.get_actor()
data
|> Map.put("actor", actor)
|> Map.put("attributedTo", actor)
end
def fix_activity_context(data, %Object{data: %{"context" => object_context}}) do
data
|> Map.put("context", object_context)
end
def fix_object_action_recipients(%{"actor" => actor} = data, %Object{data: %{"actor" => actor}}) do
to = ((data["to"] || []) -- [actor]) |> Enum.uniq()
Map.put(data, "to", to)
end
def fix_object_action_recipients(data, %Object{data: %{"actor" => actor}}) do
to = ((data["to"] || []) ++ [actor]) |> Enum.uniq()
Map.put(data, "to", to)
end
def fix_quote_url(%{"quoteUrl" => _quote_url} = data), do: data
# Fedibird
# https://github.com/fedibird/mastodon/commit/dbd7ae6cf58a92ec67c512296b4daaea0d01e6ac
def fix_quote_url(%{"quoteUri" => quote_url} = data) do
Map.put(data, "quoteUrl", quote_url)
end
# Old Fedibird (bug)
# https://github.com/fedibird/mastodon/issues/9
def fix_quote_url(%{"quoteURL" => quote_url} = data) do
Map.put(data, "quoteUrl", quote_url)
end
# Misskey fallback
def fix_quote_url(%{"_misskey_quote" => quote_url} = data) do
Map.put(data, "quoteUrl", quote_url)
end
def fix_quote_url(%{"tag" => [_ | _] = tags} = data) do
tag = Enum.find(tags, &object_link_tag?/1)
if not is_nil(tag) do
data
|> Map.put("quoteUrl", tag["href"])
else
data
end
end
def fix_quote_url(data), do: data
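# Illustrative normalization (a sketch, not part of the original diff): a
# Fedibird-style "quoteUri" field is mirrored into "quoteUrl":
#
#   iex> fix_quote_url(%{"quoteUri" => "https://example.org/objects/1"})
#   %{"quoteUri" => "https://example.org/objects/1", "quoteUrl" => "https://example.org/objects/1"}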
# https://codeberg.org/fediverse/fep/src/branch/main/fep/e232/fep-e232.md
def object_link_tag?(%{
"type" => "Link",
"mediaType" => media_type,
"href" => href
})
when media_type in Pleroma.Constants.activity_json_mime_types() and is_binary(href) do
true
end
def object_link_tag?(_), do: false
end
diff --git a/lib/pleroma/web/activity_pub/transmogrifier.ex b/lib/pleroma/web/activity_pub/transmogrifier.ex
index 44ffc5bf1..76649d14a 100644
--- a/lib/pleroma/web/activity_pub/transmogrifier.ex
+++ b/lib/pleroma/web/activity_pub/transmogrifier.ex
@@ -1,988 +1,984 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.Transmogrifier do
@moduledoc """
A module to handle conversion from the internal format to wire ActivityPub and back.
"""
alias Pleroma.Activity
alias Pleroma.EctoType.ActivityPub.ObjectValidators
alias Pleroma.Maps
alias Pleroma.Object
alias Pleroma.Object.Containment
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Builder
alias Pleroma.Web.ActivityPub.ObjectValidator
alias Pleroma.Web.ActivityPub.Pipeline
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.Federator
import Ecto.Query
require Pleroma.Constants
@doc """
Modifies an incoming AP object (Mastodon format) into our internal format.
"""
def fix_object(object, options \\ []) do
object
|> strip_internal_fields()
|> fix_actor()
|> fix_url()
|> fix_attachments()
|> fix_context()
|> fix_in_reply_to(options)
|> fix_emoji()
|> fix_tag()
|> fix_content_map()
|> fix_addressing()
|> fix_summary()
end
def fix_summary(%{"summary" => nil} = object) do
Map.put(object, "summary", "")
end
def fix_summary(%{"summary" => _} = object) do
# summary is present, nothing to do
object
end
def fix_summary(object), do: Map.put(object, "summary", "")
def fix_addressing_list(map, field) do
addrs = map[field]
cond do
is_list(addrs) ->
Map.put(map, field, Enum.filter(addrs, &is_binary/1))
is_binary(addrs) ->
Map.put(map, field, [addrs])
true ->
Map.put(map, field, [])
end
end
# If the directMessage flag is set to true, leave the addressing alone
def fix_explicit_addressing(%{"directMessage" => true} = object, _follower_collection),
do: object
def fix_explicit_addressing(%{"to" => to, "cc" => cc} = object, follower_collection) do
explicit_mentions =
Utils.determine_explicit_mentions(object) ++
[Pleroma.Constants.as_public(), follower_collection]
explicit_to = Enum.filter(to, fn x -> x in explicit_mentions end)
explicit_cc = Enum.filter(to, fn x -> x not in explicit_mentions end)
final_cc =
(cc ++ explicit_cc)
|> Enum.filter(& &1)
|> Enum.reject(fn x -> String.ends_with?(x, "/followers") and x != follower_collection end)
|> Enum.uniq()
object
|> Map.put("to", explicit_to)
|> Map.put("cc", final_cc)
end
# if as:Public is addressed, then make sure the followers collection is also addressed
# so that the activities will be delivered to local users.
def fix_implicit_addressing(%{"to" => to, "cc" => cc} = object, followers_collection) do
recipients = to ++ cc
if followers_collection not in recipients do
cond do
Pleroma.Constants.as_public() in cc ->
to = to ++ [followers_collection]
Map.put(object, "to", to)
Pleroma.Constants.as_public() in to ->
cc = cc ++ [followers_collection]
Map.put(object, "cc", cc)
true ->
object
end
else
object
end
end
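# Illustrative example (a sketch, not part of the original diff): with
# as:Public in "cc", the followers collection is added to "to":
#
#   iex> fix_implicit_addressing(
#   ...>   %{"to" => [], "cc" => ["https://www.w3.org/ns/activitystreams#Public"]},
#   ...>   "https://example.org/users/alice/followers"
#   ...> )
#   %{
#     "to" => ["https://example.org/users/alice/followers"],
#     "cc" => ["https://www.w3.org/ns/activitystreams#Public"]
#   }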
def fix_addressing(object) do
{:ok, %User{follower_address: follower_collection}} =
object
|> Containment.get_actor()
|> User.get_or_fetch_by_ap_id()
object
|> fix_addressing_list("to")
|> fix_addressing_list("cc")
|> fix_addressing_list("bto")
|> fix_addressing_list("bcc")
|> fix_explicit_addressing(follower_collection)
|> fix_implicit_addressing(follower_collection)
end
def fix_actor(%{"attributedTo" => actor} = object) do
actor = Containment.get_actor(%{"actor" => actor})
# TODO: Remove actor field for Objects
object
|> Map.put("actor", actor)
|> Map.put("attributedTo", actor)
end
def fix_in_reply_to(object, options \\ [])
def fix_in_reply_to(%{"inReplyTo" => in_reply_to} = object, options)
when not is_nil(in_reply_to) do
in_reply_to_id = prepare_in_reply_to(in_reply_to)
depth = (options[:depth] || 0) + 1
if Federator.allowed_thread_distance?(depth) do
with {:ok, replied_object} <- get_obj_helper(in_reply_to_id, options),
%Activity{} <- Activity.get_create_by_object_ap_id(replied_object.data["id"]) do
object
|> Map.put("inReplyTo", replied_object.data["id"])
|> Map.put("context", replied_object.data["context"] || object["conversation"])
|> Map.drop(["conversation", "inReplyToAtomUri"])
else
_ ->
object
end
else
object
end
end
def fix_in_reply_to(object, _options), do: object
def fix_quote_url_and_maybe_fetch(object, options \\ []) do
quote_url =
case Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes.fix_quote_url(object) do
%{"quoteUrl" => quote_url} -> quote_url
_ -> nil
end
with {:quoting?, true} <- {:quoting?, not is_nil(quote_url)},
{:ok, quoted_object} <- get_obj_helper(quote_url, options),
%Activity{} <- Activity.get_create_by_object_ap_id(quoted_object.data["id"]) do
Map.put(object, "quoteUrl", quoted_object.data["id"])
else
{:quoting?, _} ->
object
_ ->
object
end
end
defp prepare_in_reply_to(in_reply_to) do
cond do
is_bitstring(in_reply_to) ->
in_reply_to
is_map(in_reply_to) && is_bitstring(in_reply_to["id"]) ->
in_reply_to["id"]
is_list(in_reply_to) && is_bitstring(Enum.at(in_reply_to, 0)) ->
Enum.at(in_reply_to, 0)
true ->
""
end
end
def fix_context(object) do
context = object["context"] || object["conversation"] || Utils.generate_context_id()
object
|> Map.put("context", context)
|> Map.drop(["conversation"])
end
def fix_attachments(%{"attachment" => attachment} = object) when is_list(attachment) do
attachments =
Enum.map(attachment, fn data ->
url =
cond do
is_list(data["url"]) -> List.first(data["url"])
is_map(data["url"]) -> data["url"]
true -> nil
end
media_type =
cond do
is_map(url) && url =~ Pleroma.Constants.mime_regex() ->
url["mediaType"]
is_bitstring(data["mediaType"]) && data["mediaType"] =~ Pleroma.Constants.mime_regex() ->
data["mediaType"]
is_bitstring(data["mimeType"]) && data["mimeType"] =~ Pleroma.Constants.mime_regex() ->
data["mimeType"]
true ->
nil
end
href =
cond do
is_map(url) && is_binary(url["href"]) -> url["href"]
is_binary(data["url"]) -> data["url"]
is_binary(data["href"]) -> data["href"]
true -> nil
end
if href do
attachment_url =
%{
"href" => href,
"type" => Map.get(url || %{}, "type", "Link")
}
|> Maps.put_if_present("mediaType", media_type)
|> Maps.put_if_present("width", (url || %{})["width"] || data["width"])
|> Maps.put_if_present("height", (url || %{})["height"] || data["height"])
%{
"url" => [attachment_url],
"type" => data["type"] || "Document"
}
|> Maps.put_if_present("mediaType", media_type)
|> Maps.put_if_present("name", data["name"])
|> Maps.put_if_present("blurhash", data["blurhash"])
else
nil
end
end)
|> Enum.filter(& &1)
Map.put(object, "attachment", attachments)
end
def fix_attachments(%{"attachment" => attachment} = object) when is_map(attachment) do
object
|> Map.put("attachment", [attachment])
|> fix_attachments()
end
def fix_attachments(object), do: object
def fix_url(%{"url" => url} = object) when is_map(url) do
Map.put(object, "url", url["href"])
end
def fix_url(%{"url" => url} = object) when is_list(url) do
first_element = Enum.at(url, 0)
url_string =
cond do
is_bitstring(first_element) -> first_element
is_map(first_element) -> first_element["href"] || ""
true -> ""
end
Map.put(object, "url", url_string)
end
def fix_url(object), do: object
def fix_emoji(%{"tag" => tags} = object) when is_list(tags) do
emoji =
tags
|> Enum.filter(fn data -> is_map(data) and data["type"] == "Emoji" and data["icon"] end)
|> Enum.reduce(%{}, fn data, mapping ->
name = String.trim(data["name"], ":")
Map.put(mapping, name, data["icon"]["url"])
end)
Map.put(object, "emoji", emoji)
end
def fix_emoji(%{"tag" => %{"type" => "Emoji"} = tag} = object) do
name = String.trim(tag["name"], ":")
emoji = %{name => tag["icon"]["url"]}
Map.put(object, "emoji", emoji)
end
def fix_emoji(object), do: object
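# Append the plain, downcased names of Hashtag tags (without the leading #) to the tag list.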
def fix_tag(%{"tag" => tag} = object) when is_list(tag) do
tags =
tag
|> Enum.filter(fn data -> data["type"] == "Hashtag" and data["name"] end)
|> Enum.map(fn
%{"name" => "#" <> hashtag} -> String.downcase(hashtag)
%{"name" => hashtag} -> String.downcase(hashtag)
end)
Map.put(object, "tag", tag ++ tags)
end
def fix_tag(%{"tag" => %{} = tag} = object) do
object
|> Map.put("tag", [tag])
|> fix_tag
end
def fix_tag(object), do: object
- def fix_content_map(%{"contentMap" => nil} = object) do
- Map.drop(object, ["contentMap"])
- end
-
# content map usually only has one language so this will do for now.
def fix_content_map(%{"contentMap" => content_map} = object) do
content_groups = Map.to_list(content_map)
{_, content} = Enum.at(content_groups, 0)
Map.put(object, "content", content)
end
def fix_content_map(object), do: object
defp fix_type(%{"type" => "Note", "inReplyTo" => reply_id, "name" => _} = object, options)
when is_binary(reply_id) do
options = Keyword.put(options, :fetch, true)
with %Object{data: %{"type" => "Question"}} <- Object.normalize(reply_id, options) do
Map.put(object, "type", "Answer")
else
_ -> object
end
end
defp fix_type(object, _options), do: object
# Reduce the object list to find the reported user.
defp get_reported(objects) do
Enum.reduce_while(objects, nil, fn ap_id, _ ->
with %User{} = user <- User.get_cached_by_ap_id(ap_id) do
{:halt, user}
else
_ -> {:cont, nil}
end
end)
end
def handle_incoming(data, options \\ [])
# Flag objects are placed ahead of the ID check because Mastodon 2.8 and earlier send them
# with nil ID.
def handle_incoming(%{"type" => "Flag", "object" => objects, "actor" => actor} = data, _options) do
with context <- data["context"] || Utils.generate_context_id(),
content <- data["content"] || "",
%User{} = actor <- User.get_cached_by_ap_id(actor),
# Reduce the object list to find the reported user.
%User{} = account <- get_reported(objects),
# Remove the reported user from the object list.
statuses <- Enum.filter(objects, fn ap_id -> ap_id != account.ap_id end) do
%{
actor: actor,
context: context,
account: account,
statuses: statuses,
content: content,
additional: %{"cc" => [account.ap_id]}
}
|> ActivityPub.flag()
end
end
# disallow objects with bogus IDs
def handle_incoming(%{"id" => nil}, _options), do: :error
def handle_incoming(%{"id" => ""}, _options), do: :error
# length of https:// = 8, should validate better, but good enough for now.
def handle_incoming(%{"id" => id}, _options) when is_binary(id) and byte_size(id) < 8,
do: :error
def handle_incoming(
%{"type" => "Listen", "object" => %{"type" => "Audio"} = object} = data,
options
) do
actor = Containment.get_actor(data)
data =
Map.put(data, "actor", actor)
|> fix_addressing
with {:ok, %User{} = user} <- User.get_or_fetch_by_ap_id(data["actor"]) do
reply_depth = (options[:depth] || 0) + 1
options = Keyword.put(options, :depth, reply_depth)
object = fix_object(object, options)
params = %{
to: data["to"],
object: object,
actor: user,
context: nil,
local: false,
published: data["published"],
additional: Map.take(data, ["cc", "id"])
}
ActivityPub.listen(params)
else
_e -> :error
end
end
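# Map Misskey reaction shortcodes to Unicode emoji so the Like below can be rewritten into an EmojiReact.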
@misskey_reactions %{
"like" => "👍",
"love" => "❤️",
"laugh" => "😆",
"hmm" => "🤔",
"surprise" => "😮",
"congrats" => "🎉",
"angry" => "💢",
"confused" => "😥",
"rip" => "😇",
"pudding" => "🍮",
"star" => "⭐"
}
@doc "Rewrite misskey likes into EmojiReacts"
def handle_incoming(
%{
"type" => "Like",
"_misskey_reaction" => reaction
} = data,
options
) do
data
|> Map.put("type", "EmojiReact")
|> Map.put("content", @misskey_reactions[reaction] || reaction)
|> handle_incoming(options)
end
def handle_incoming(
%{"type" => "Create", "object" => %{"type" => objtype, "id" => obj_id}} = data,
options
)
when objtype in ~w{Question Answer ChatMessage Audio Video Event Article Note Page Image} do
fetch_options = Keyword.put(options, :depth, (options[:depth] || 0) + 1)
object =
data["object"]
|> strip_internal_fields()
|> fix_type(fetch_options)
|> fix_in_reply_to(fetch_options)
|> fix_quote_url_and_maybe_fetch(fetch_options)
data = Map.put(data, "object", object)
options = Keyword.put(options, :local, false)
with {:ok, %User{}} <- ObjectValidator.fetch_actor(data),
nil <- Activity.get_create_by_object_ap_id(obj_id),
{:ok, activity, _} <- Pipeline.common_pipeline(data, options) do
{:ok, activity}
else
%Activity{} = activity -> {:ok, activity}
e -> e
end
end
def handle_incoming(%{"type" => type} = data, _options)
when type in ~w{Like EmojiReact Announce Add Remove} do
with :ok <- ObjectValidator.fetch_actor_and_object(data),
{:ok, activity, _meta} <-
Pipeline.common_pipeline(data, local: false) do
{:ok, activity}
else
e -> {:error, e}
end
end
def handle_incoming(
%{"type" => type} = data,
_options
)
when type in ~w{Update Block Follow Accept Reject Join Leave} do
with {:ok, %User{}} <- ObjectValidator.fetch_actor(data),
{:ok, activity, _} <-
Pipeline.common_pipeline(data, local: false) do
{:ok, activity}
end
end
def handle_incoming(
%{"type" => "Delete"} = data,
_options
) do
with {:ok, activity, _} <-
Pipeline.common_pipeline(data, local: false) do
{:ok, activity}
else
{:error, {:validate, _}} = e ->
# Check if we have a create activity for this
with {:ok, object_id} <- ObjectValidators.ObjectID.cast(data["object"]),
%Activity{data: %{"actor" => actor}} <-
Activity.create_by_object_ap_id(object_id) |> Repo.one(),
# We have one, insert a tombstone and retry
{:ok, tombstone_data, _} <- Builder.tombstone(actor, object_id),
{:ok, _tombstone} <- Object.create(tombstone_data) do
handle_incoming(data)
else
_ -> e
end
end
end
def handle_incoming(
%{
"type" => "Undo",
"object" => %{"type" => "Follow", "object" => followed},
"actor" => follower,
"id" => id
} = _data,
_options
) do
with %User{local: true} = followed <- User.get_cached_by_ap_id(followed),
{:ok, %User{} = follower} <- User.get_or_fetch_by_ap_id(follower),
{:ok, activity} <- ActivityPub.unfollow(follower, followed, id, false) do
User.unfollow(follower, followed)
{:ok, activity}
else
_e -> :error
end
end
def handle_incoming(
%{
"type" => "Undo",
"object" => %{"type" => type}
} = data,
_options
)
when type in ["Like", "EmojiReact", "Announce", "Block", "Join"] do
with {:ok, activity, _} <- Pipeline.common_pipeline(data, local: false) do
{:ok, activity}
end
end
# For Undos that don't have the complete object attached, try to find it in our database.
def handle_incoming(
%{
"type" => "Undo",
"object" => object
} = activity,
options
)
when is_binary(object) do
with %Activity{data: data} <- Activity.get_by_ap_id(object) do
activity
|> Map.put("object", data)
|> handle_incoming(options)
else
_e -> :error
end
end
def handle_incoming(
%{
"type" => "Move",
"actor" => origin_actor,
"object" => origin_actor,
"target" => target_actor
},
_options
) do
with %User{} = origin_user <- User.get_cached_by_ap_id(origin_actor),
{:ok, %User{} = target_user} <- User.get_or_fetch_by_ap_id(target_actor),
true <- origin_actor in target_user.also_known_as do
ActivityPub.move(origin_user, target_user, false)
else
_e -> :error
end
end
def handle_incoming(_, _), do: :error
@spec get_obj_helper(String.t(), Keyword.t()) :: {:ok, Object.t()} | nil
def get_obj_helper(id, options \\ []) do
options = Keyword.put(options, :fetch, true)
case Object.normalize(id, options) do
%Object{} = object -> {:ok, object}
_ -> nil
end
end
@spec get_embedded_obj_helper(String.t() | Object.t(), User.t()) :: {:ok, Object.t()} | nil
def get_embedded_obj_helper(%{"attributedTo" => attributed_to, "id" => object_id} = data, %User{
ap_id: ap_id
})
when attributed_to == ap_id do
with {:ok, activity} <-
handle_incoming(%{
"type" => "Create",
"to" => data["to"],
"cc" => data["cc"],
"actor" => attributed_to,
"object" => data
}) do
{:ok, Object.normalize(activity, fetch: false)}
else
_ -> get_obj_helper(object_id)
end
end
def get_embedded_obj_helper(object_id, _) do
get_obj_helper(object_id)
end
def set_reply_to_uri(%{"inReplyTo" => in_reply_to} = object) when is_binary(in_reply_to) do
with false <- String.starts_with?(in_reply_to, "http"),
{:ok, %{data: replied_to_object}} <- get_obj_helper(in_reply_to) do
Map.put(object, "inReplyTo", replied_to_object["external_url"] || in_reply_to)
else
_e -> object
end
end
def set_reply_to_uri(obj), do: obj
@doc """
Fedibird compatibility
https://github.com/fedibird/mastodon/commit/dbd7ae6cf58a92ec67c512296b4daaea0d01e6ac
"""
def set_quote_url(%{"quoteUrl" => quote_url} = object) when is_binary(quote_url) do
Map.put(object, "quoteUri", quote_url)
end
def set_quote_url(obj), do: obj
@doc """
Serialized Mastodon-compatible `replies` collection containing _self-replies_.
Based on Mastodon's ActivityPub::NoteSerializer#replies.
"""
def set_replies(obj_data) do
replies_uris =
with limit when limit > 0 <-
Pleroma.Config.get([:activitypub, :note_replies_output_limit], 0),
%Object{} = object <- Object.get_cached_by_ap_id(obj_data["id"]) do
object
|> Object.self_replies()
|> select([o], fragment("?->>'id'", o.data))
|> limit(^limit)
|> Repo.all()
else
_ -> []
end
set_replies(obj_data, replies_uris)
end
defp set_replies(obj, []) do
obj
end
defp set_replies(obj, replies_uris) do
replies_collection = %{
"type" => "Collection",
"items" => replies_uris
}
Map.merge(obj, %{"replies" => replies_collection})
end
def replies(%{"replies" => %{"first" => %{"items" => items}}}) when not is_nil(items) do
items
end
def replies(%{"replies" => %{"items" => items}}) when not is_nil(items) do
items
end
def replies(_), do: []
# Prepares the object of an outgoing create activity.
def prepare_object(object) do
object
|> add_hashtags
|> add_mention_tags
|> add_emoji_tags
|> add_attributed_to
|> prepare_attachments
|> set_conversation
|> set_reply_to_uri
|> set_quote_url
|> set_replies
|> strip_internal_fields
|> strip_internal_tags
|> set_type
|> maybe_process_history
end
defp maybe_process_history(%{"formerRepresentations" => %{"orderedItems" => history}} = object) do
processed_history =
Enum.map(
history,
fn
item when is_map(item) -> prepare_object(item)
item -> item
end
)
put_in(object, ["formerRepresentations", "orderedItems"], processed_history)
end
defp maybe_process_history(object) do
object
end
# @doc
# """
# internal -> Mastodon
# """
def prepare_outgoing(%{"type" => activity_type, "object" => object_id} = data)
when activity_type in ["Create", "Listen"] do
object =
object_id
|> Object.normalize(fetch: false)
|> Map.get(:data)
|> prepare_object
data =
data
|> Map.put("object", object)
|> Map.merge(Utils.make_json_ld_header())
|> Map.delete("bcc")
{:ok, data}
end
def prepare_outgoing(%{"type" => "Update", "object" => %{"type" => objtype} = object} = data)
when objtype in Pleroma.Constants.updatable_object_types() do
object =
object
|> prepare_object
data =
data
|> Map.put("object", object)
|> Map.merge(Utils.make_json_ld_header())
|> Map.delete("bcc")
{:ok, data}
end
def prepare_outgoing(%{"type" => "Announce", "actor" => ap_id, "object" => object_id} = data) do
object =
object_id
|> Object.normalize(fetch: false)
data =
if Visibility.private?(object) && object.data["actor"] == ap_id do
data |> Map.put("object", object |> Map.get(:data) |> prepare_object)
else
data |> maybe_fix_object_url
end
data =
data
|> strip_internal_fields
|> Map.merge(Utils.make_json_ld_header())
|> Map.delete("bcc")
{:ok, data}
end
# Mastodon Accept/Reject requires a non-normalized object containing the actor URIs,
# because of course it does.
def prepare_outgoing(%{"type" => "Accept"} = data) do
with follow_activity <- Activity.normalize(data["object"]) do
object = %{
"actor" => follow_activity.actor,
"object" => follow_activity.data["object"],
"id" => follow_activity.data["id"],
"type" => "Follow"
}
data =
data
|> Map.put("object", object)
|> Map.merge(Utils.make_json_ld_header())
{:ok, data}
end
end
def prepare_outgoing(%{"type" => "Reject"} = data) do
with follow_activity <- Activity.normalize(data["object"]) do
object = %{
"actor" => follow_activity.actor,
"object" => follow_activity.data["object"],
"id" => follow_activity.data["id"],
"type" => "Follow"
}
data =
data
|> Map.put("object", object)
|> Map.merge(Utils.make_json_ld_header())
{:ok, data}
end
end
def prepare_outgoing(%{"type" => _type} = data) do
data =
data
|> strip_internal_fields
|> maybe_fix_object_url
|> Map.merge(Utils.make_json_ld_header())
{:ok, data}
end
def maybe_fix_object_url(%{"object" => object} = data) when is_binary(object) do
with false <- String.starts_with?(object, "http"),
{:fetch, {:ok, relative_object}} <- {:fetch, get_obj_helper(object)},
%{data: %{"external_url" => external_url}} when not is_nil(external_url) <-
relative_object do
Map.put(data, "object", external_url)
else
{:fetch, _} ->
data
_ ->
data
end
end
def maybe_fix_object_url(data), do: data
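# Expand internal string hashtags into AS2 Hashtag objects; tags that are already maps pass through untouched.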
def add_hashtags(object) do
tags =
(object["tag"] || [])
|> Enum.map(fn
# Expand internal representation tags into AS2 tags.
tag when is_binary(tag) ->
%{
"href" => Pleroma.Web.Endpoint.url() <> "/tags/#{tag}",
"name" => "##{tag}",
"type" => "Hashtag"
}
# Do not process tags which are already AS2 tag objects.
tag when is_map(tag) ->
tag
end)
Map.put(object, "tag", tags)
end
# TODO: These should be added on our side on insertion; it doesn't make much
# sense to regenerate them all the time
def add_mention_tags(object) do
to = object["to"] || []
cc = object["cc"] || []
mentioned = User.get_users_from_set(to ++ cc, local_only: false)
mentions = Enum.map(mentioned, &build_mention_tag/1)
tags = object["tag"] || []
Map.put(object, "tag", tags ++ mentions)
end
defp build_mention_tag(%{ap_id: ap_id, nickname: nickname} = _) do
%{"type" => "Mention", "href" => ap_id, "name" => "@#{nickname}"}
end
def take_emoji_tags(%User{emoji: emoji}) do
emoji
|> Map.to_list()
|> Enum.map(&build_emoji_tag/1)
end
# TODO: we should probably send mtime instead of unix epoch time for updated
def add_emoji_tags(%{"emoji" => emoji} = object) do
tags = object["tag"] || []
out = Enum.map(emoji, &build_emoji_tag/1)
Map.put(object, "tag", tags ++ out)
end
def add_emoji_tags(object), do: object
defp build_emoji_tag({name, url}) do
%{
"icon" => %{"url" => "#{URI.encode(url)}", "type" => "Image"},
"name" => ":" <> name <> ":",
"type" => "Emoji",
"updated" => "1970-01-01T00:00:00Z",
"id" => url
}
end
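# Mirror the context into the legacy "conversation" field.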
def set_conversation(object) do
Map.put(object, "conversation", object["context"])
end
def set_type(%{"type" => "Answer"} = object) do
Map.put(object, "type", "Note")
end
def set_type(object), do: object
def add_attributed_to(object) do
attributed_to = object["attributedTo"] || object["actor"]
Map.put(object, "attributedTo", attributed_to)
end
# TODO: Revisit this
def prepare_attachments(%{"type" => "ChatMessage"} = object), do: object
def prepare_attachments(object) do
attachments =
object
|> Map.get("attachment", [])
|> Enum.map(fn data ->
[%{"mediaType" => media_type, "href" => href} = url | _] = data["url"]
%{
"url" => href,
"mediaType" => media_type,
"name" => data["name"],
"type" => "Document"
}
|> Maps.put_if_present("width", url["width"])
|> Maps.put_if_present("height", url["height"])
|> Maps.put_if_present("blurhash", data["blurhash"])
end)
Map.put(object, "attachment", attachments)
end
def strip_internal_fields(object) do
Map.drop(object, Pleroma.Constants.object_internal_fields())
end
defp strip_internal_tags(%{"tag" => tags} = object) do
tags = Enum.filter(tags, fn x -> is_map(x) end)
Map.put(object, "tag", tags)
end
defp strip_internal_tags(object), do: object
def maybe_fix_user_url(%{"url" => url} = data) when is_map(url) do
Map.put(data, "url", url["href"])
end
def maybe_fix_user_url(data), do: data
def maybe_fix_user_object(data), do: maybe_fix_user_url(data)
end
diff --git a/lib/pleroma/web/controller_helper.ex b/lib/pleroma/web/controller_helper.ex
index d3cf372dc..1caf0f7e6 100644
--- a/lib/pleroma/web/controller_helper.ex
+++ b/lib/pleroma/web/controller_helper.ex
@@ -1,120 +1,120 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ControllerHelper do
use Pleroma.Web, :controller
alias Pleroma.Pagination
alias Pleroma.Web.Utils.Params
def json_response(conn, status, _) when status in [204, :no_content] do
conn
|> put_resp_header("content-type", "application/json")
|> send_resp(status, "")
end
def json_response(conn, status, json) do
conn
|> put_status(status)
|> json(json)
end
- @spec fetch_integer_param(map(), String.t(), integer() | nil) :: integer() | nil
+ @spec fetch_integer_param(map(), String.t() | atom(), integer() | nil) :: integer() | nil
def fetch_integer_param(params, name, default \\ nil) do
params
|> Map.get(name, default)
|> param_to_integer(default)
end
defp param_to_integer(val, _) when is_integer(val), do: val
defp param_to_integer(val, default) when is_binary(val) do
case Integer.parse(val) do
{res, _} -> res
_ -> default
end
end
defp param_to_integer(_, default), do: default
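# Attach next/prev Link headers built from the pagination fields, unless the connection opted out via the
# skip_link_headers assign.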
def add_link_headers(conn, entries, extra_params \\ %{})
def add_link_headers(%{assigns: %{skip_link_headers: true}} = conn, _entries, _extra_params),
do: conn
def add_link_headers(conn, entries, extra_params) do
case get_pagination_fields(conn, entries, extra_params) do
%{"next" => next_url, "prev" => prev_url} ->
put_resp_header(conn, "link", "<#{next_url}>; rel=\"next\", <#{prev_url}>; rel=\"prev\"")
_ ->
conn
end
end
# TODO: Only fetch the params from open_api_spex when everything is converted
@id_keys Pagination.page_keys() -- ["limit", "order"]
defp build_pagination_fields(conn, min_id, max_id, extra_params) do
params =
if Map.has_key?(conn.private, :open_api_spex) do
get_in(conn, [Access.key(:private), Access.key(:open_api_spex), Access.key(:params)])
else
conn.params
end
|> Map.drop(Map.keys(conn.path_params) |> Enum.map(&String.to_existing_atom/1))
|> Map.merge(extra_params)
|> Map.drop(@id_keys)
%{
"next" => current_url(conn, Map.put(params, :max_id, max_id)),
"prev" => current_url(conn, Map.put(params, :min_id, min_id)),
"id" => current_url(conn)
}
end
def get_pagination_fields(conn, entries, extra_params \\ %{}) do
case List.last(entries) do
%{pagination_id: max_id} when not is_nil(max_id) ->
%{pagination_id: min_id} = List.first(entries)
build_pagination_fields(conn, min_id, max_id, extra_params)
%{id: max_id} ->
%{id: min_id} = List.first(entries)
build_pagination_fields(conn, min_id, max_id, extra_params)
_ ->
%{}
end
end
def assign_account_by_id(%{private: %{open_api_spex: %{params: %{id: id}}}} = conn, _) do
case Pleroma.User.get_cached_by_id(id) do
%Pleroma.User{} = account -> assign(conn, :account, account)
nil -> Pleroma.Web.MastodonAPI.FallbackController.call(conn, {:error, :not_found}) |> halt()
end
end
def try_render(conn, target, params) when is_binary(target) do
render(conn, target, params)
end
def try_render(conn, _, _) do
render_error(conn, :not_implemented, "Can't display this activity")
end
@doc """
Returns true if request specifies to include embedded relationships in account objects.
May only be used in selected account-related endpoints; has no effect for status- or
notification-related endpoints.
"""
# Intended for PleromaFE: https://git.pleroma.social/pleroma/pleroma-fe/-/issues/838
def embed_relationships?(params) do
# To do once OpenAPI transition mess is over: just `truthy_param?(params[:with_relationships])`
params
|> Map.get(:with_relationships, params["with_relationships"])
|> Params.truthy_param?()
end
end
diff --git a/lib/pleroma/web/embed_controller.ex b/lib/pleroma/web/embed_controller.ex
index ab0df9c5a..2ca4501a6 100644
--- a/lib/pleroma/web/embed_controller.ex
+++ b/lib/pleroma/web/embed_controller.ex
@@ -1,42 +1,40 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.EmbedController do
use Pleroma.Web, :controller
alias Pleroma.Activity
alias Pleroma.Object
alias Pleroma.User
alias Pleroma.Web.ActivityPub.Visibility
- plug(:put_layout, :embed)
-
def show(conn, %{"id" => id}) do
with %Activity{local: true} = activity <-
Activity.get_by_id_with_object(id),
true <- Visibility.public?(activity.object) do
{:ok, author} = User.get_or_fetch(activity.object.data["actor"])
conn
|> delete_resp_header("x-frame-options")
|> delete_resp_header("content-security-policy")
|> render("show.html",
activity: activity,
author: User.sanitize_html(author),
counts: get_counts(activity)
)
end
end
defp get_counts(%Activity{} = activity) do
%Object{data: data} = Object.normalize(activity, fetch: false)
%{
likes: Map.get(data, "like_count", 0),
replies: Map.get(data, "repliesCount", 0),
announces: Map.get(data, "announcement_count", 0)
}
end
end
diff --git a/lib/pleroma/web/endpoint.ex b/lib/pleroma/web/endpoint.ex
index 307fa069e..2e2104904 100644
--- a/lib/pleroma/web/endpoint.ex
+++ b/lib/pleroma/web/endpoint.ex
@@ -1,171 +1,193 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.Endpoint do
use Phoenix.Endpoint, otp_app: :pleroma
require Pleroma.Constants
alias Pleroma.Config
+ socket("/api/v1/streaming", Pleroma.Web.MastodonAPI.WebsocketHandler,
+ longpoll: false,
+ websocket: [
+ path: "/",
+ compress: false,
+ error_handler: {Pleroma.Web.MastodonAPI.WebsocketHandler, :handle_error, []},
+ fullsweep_after: 20
+ ]
+ )
+
socket("/socket", Pleroma.Web.UserSocket,
websocket: [
path: "/websocket",
serializer: [
{Phoenix.Socket.V1.JSONSerializer, "~> 1.0.0"},
{Phoenix.Socket.V2.JSONSerializer, "~> 2.0.0"}
],
timeout: 60_000,
transport_log: false,
- compress: false
+ compress: false,
+ fullsweep_after: 20
],
longpoll: false
)
socket("/live", Phoenix.LiveView.Socket)
plug(Plug.Telemetry, event_prefix: [:phoenix, :endpoint])
plug(Pleroma.Web.Plugs.SetLocalePlug)
plug(CORSPlug)
plug(Pleroma.Web.Plugs.HTTPSecurityPlug)
plug(Pleroma.Web.Plugs.UploadedMedia)
- @static_cache_control "public, no-cache"
+ @static_cache_control "public, max-age=1209600"
+ @static_cache_disabled "public, no-cache"
# InstanceStatic needs to be before Plug.Static to be able to override shipped-static files
# If you're adding new paths to `only:` you'll need to configure them in InstanceStatic as well
# Cache-control headers are duplicated in case we turn off etags in the future
plug(
Pleroma.Web.Plugs.InstanceStatic,
at: "/",
from: :pleroma,
only: ["emoji", "images"],
gzip: true,
- cache_control_for_etags: "public, max-age=1209600",
+ cache_control_for_etags: @static_cache_control,
headers: %{
- "cache-control" => "public, max-age=1209600"
+ "cache-control" => @static_cache_control
}
)
plug(Pleroma.Web.Plugs.InstanceStatic,
at: "/",
gzip: true,
- cache_control_for_etags: @static_cache_control,
+ cache_control_for_etags: @static_cache_disabled,
headers: %{
- "cache-control" => @static_cache_control
+ "cache-control" => @static_cache_disabled
+ }
+ )
+
+ plug(Pleroma.Web.Plugs.FrontendStatic,
+ at: "/",
+ frontend_type: :primary,
+ only: ["index.html"],
+ gzip: true,
+ cache_control_for_etags: @static_cache_disabled,
+ headers: %{
+ "cache-control" => @static_cache_disabled
}
)
- # Careful! No `only` restriction here, as we don't know what frontends contain.
plug(Pleroma.Web.Plugs.FrontendStatic,
at: "/",
frontend_type: :primary,
gzip: true,
cache_control_for_etags: @static_cache_control,
headers: %{
"cache-control" => @static_cache_control
}
)
plug(Plug.Static.IndexHtml, at: "/pleroma/admin/")
plug(Pleroma.Web.Plugs.FrontendStatic,
at: "/pleroma/admin",
frontend_type: :admin,
gzip: true,
- cache_control_for_etags: @static_cache_control,
+ cache_control_for_etags: @static_cache_disabled,
headers: %{
- "cache-control" => @static_cache_control
+ "cache-control" => @static_cache_disabled
}
)
# Serve at "/" the static files from "priv/static" directory.
#
# You should set gzip to true if you are running phoenix.digest
# when deploying your static files in production.
plug(
Plug.Static,
at: "/",
from: :pleroma,
only: Pleroma.Constants.static_only_files(),
# credo:disable-for-previous-line Credo.Check.Readability.MaxLineLength
gzip: true,
- cache_control_for_etags: @static_cache_control,
+ cache_control_for_etags: @static_cache_disabled,
headers: %{
- "cache-control" => @static_cache_control
+ "cache-control" => @static_cache_disabled
}
)
plug(Plug.Static,
at: "/pleroma/admin/",
from: {:pleroma, "priv/static/adminfe/"}
)
# Code reloading can be explicitly enabled under the
# :code_reloader configuration of your endpoint.
if code_reloading? do
plug(Phoenix.CodeReloader)
end
plug(Pleroma.Web.Plugs.TrailingFormatPlug)
plug(Plug.RequestId)
plug(Plug.Logger, log: :debug)
plug(Plug.Parsers,
parsers: [:urlencoded, Pleroma.Web.Multipart, :json],
pass: ["*/*"],
json_decoder: Jason,
# Note: this is compile-time only, won't work for database-config
length: Config.get([:instance, :upload_limit]),
body_reader: {Pleroma.Web.Plugs.DigestPlug, :read_body, []}
)
plug(Plug.MethodOverride)
plug(Plug.Head)
secure_cookies = Config.get([__MODULE__, :secure_cookie_flag])
cookie_name =
if secure_cookies,
do: "__Host-pleroma_key",
else: "pleroma_key"
extra =
Config.get([__MODULE__, :extra_cookie_attrs])
|> Enum.join(";")
# The session will be stored in the cookie and signed,
# this means its contents can be read but not tampered with.
# Set :encryption_salt if you would also like to encrypt it.
plug(
Plug.Session,
store: :cookie,
key: cookie_name,
signing_salt: Config.get([__MODULE__, :signing_salt], "CqaoopA2"),
http_only: true,
secure: secure_cookies,
extra: extra
)
plug(Pleroma.Web.Plugs.RemoteIp)
plug(Pleroma.Web.Router)
@doc """
Dynamically loads configuration from the system environment
on startup.
It receives the endpoint configuration from the config files
and must return the updated configuration.
"""
def load_from_system_env(config) do
port = System.get_env("PORT") || raise "expected the PORT environment variable to be set"
{:ok, Keyword.put(config, :http, [:inet6, port: port])}
end
def websocket_url do
String.replace_leading(url(), "http", "ws")
end
end
diff --git a/lib/pleroma/web/mastodon_api/controllers/search_controller.ex b/lib/pleroma/web/mastodon_api/controllers/search_controller.ex
index 7fdf684d2..628aa311b 100644
--- a/lib/pleroma/web/mastodon_api/controllers/search_controller.ex
+++ b/lib/pleroma/web/mastodon_api/controllers/search_controller.ex
@@ -1,202 +1,202 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.MastodonAPI.SearchController do
use Pleroma.Web, :controller
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Web.ControllerHelper
alias Pleroma.Web.Endpoint
alias Pleroma.Web.MastodonAPI.AccountView
alias Pleroma.Web.MastodonAPI.StatusView
alias Pleroma.Web.Plugs.OAuthScopesPlug
alias Pleroma.Web.Plugs.RateLimiter
require Logger
@search_limit 40
plug(Pleroma.Web.ApiSpec.CastAndValidate, replace_params: false)
# Note: Mastodon doesn't allow unauthenticated access (requires read:accounts / read:search)
plug(OAuthScopesPlug, %{scopes: ["read:search"], fallback: :proceed_unauthenticated})
# Note: on private instances auth is required (EnsurePublicOrAuthenticatedPlug is not skipped)
plug(RateLimiter, [name: :search] when action in [:search, :search2, :account_search])
defdelegate open_api_operation(action), to: Pleroma.Web.ApiSpec.SearchOperation
def account_search(
%{assigns: %{user: user}, private: %{open_api_spex: %{params: %{q: query} = params}}} =
conn,
_
) do
accounts = User.search(query, search_options(params, user))
conn
|> put_view(AccountView)
|> render("index.json",
users: accounts,
for: user,
as: :user
)
end
def search2(conn, params), do: do_search(:v2, conn, params)
def search(conn, params), do: do_search(:v1, conn, params)
defp do_search(
version,
%{assigns: %{user: user}, private: %{open_api_spex: %{params: %{q: query} = params}}} =
conn,
_
) do
query = String.trim(query)
options = search_options(params, user)
timeout = Keyword.get(Repo.config(), :timeout, 15_000)
default_values = %{"statuses" => [], "accounts" => [], "hashtags" => []}
result =
default_values
|> Enum.map(fn {resource, default_value} ->
if params[:type] in [nil, resource] do
{resource, fn -> resource_search(version, resource, query, options) end}
else
{resource, fn -> default_value end}
end
end)
|> Task.async_stream(fn {resource, f} -> {resource, with_fallback(f)} end,
timeout: timeout,
on_timeout: :kill_task
)
|> Enum.reduce(default_values, fn
{:ok, {resource, result}}, acc ->
Map.put(acc, resource, result)
_error, acc ->
acc
end)
json(conn, result)
end
defp search_options(params, user) do
[
resolve: params[:resolve],
following: params[:following],
limit: min(params[:limit], @search_limit),
offset: params[:offset],
type: params[:type],
author: get_author(params),
embed_relationships: ControllerHelper.embed_relationships?(params),
for_user: user
]
|> Enum.filter(&elem(&1, 1))
end
defp resource_search(_, "accounts", query, options) do
accounts = with_fallback(fn -> User.search(query, options) end)
AccountView.render("index.json",
users: accounts,
for: options[:for_user],
embed_relationships: options[:embed_relationships]
)
end
defp resource_search(_, "statuses", query, options) do
statuses = with_fallback(fn -> Pleroma.Search.search(query, options) end)
StatusView.render("index.json",
activities: statuses,
for: options[:for_user],
as: :activity
)
end
defp resource_search(:v2, "hashtags", query, options) do
tags_path = Endpoint.url() <> "/tag/"
query
|> prepare_tags(options)
|> Enum.map(fn tag ->
%{name: tag, url: tags_path <> tag}
end)
end
defp resource_search(:v1, "hashtags", query, options) do
prepare_tags(query, options)
end
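# Extract hashtag candidates from the query: explicitly #-prefixed tags win, a CamelCase joined tag is
# appended when none are explicit, and the result is paginated.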
defp prepare_tags(query, options) do
tags =
query
|> preprocess_uri_query()
|> String.split(~r/[^#\w]+/u, trim: true)
|> Enum.uniq_by(&String.downcase/1)
explicit_tags = Enum.filter(tags, fn tag -> String.starts_with?(tag, "#") end)
tags =
if Enum.any?(explicit_tags) do
explicit_tags
else
tags
end
tags = Enum.map(tags, fn tag -> String.trim_leading(tag, "#") end)
tags =
if Enum.empty?(explicit_tags) && !options[:skip_joined_tag] do
add_joined_tag(tags)
else
tags
end
- Pleroma.Pagination.paginate(tags, options)
+ Pleroma.Pagination.paginate_list(tags, options)
end
defp add_joined_tag(tags) do
tags
|> Kernel.++([joined_tag(tags)])
|> Enum.uniq_by(&String.downcase/1)
end
# If `query` is a URI, returns last component of its path, otherwise returns `query`
defp preprocess_uri_query(query) do
if query =~ ~r/https?:\/\// do
query
|> String.trim_trailing("/")
|> URI.parse()
|> Map.get(:path)
|> String.split("/")
|> Enum.at(-1)
else
query
end
end
defp joined_tag(tags) do
tags
|> Enum.map(fn tag -> String.capitalize(tag) end)
|> Enum.join()
end
defp with_fallback(f, fallback \\ []) do
try do
f.()
rescue
error ->
Logger.error("#{__MODULE__} search error: #{inspect(error)}")
fallback
end
end
defp get_author(%{account_id: account_id}) when is_binary(account_id),
do: User.get_cached_by_id(account_id)
defp get_author(_params), do: nil
end
diff --git a/lib/pleroma/web/mastodon_api/websocket_handler.ex b/lib/pleroma/web/mastodon_api/websocket_handler.ex
index 07c2b62e3..bb27d806d 100644
--- a/lib/pleroma/web/mastodon_api/websocket_handler.ex
+++ b/lib/pleroma/web/mastodon_api/websocket_handler.ex
@@ -1,257 +1,247 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.MastodonAPI.WebsocketHandler do
require Logger
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Web.OAuth.Token
alias Pleroma.Web.Streamer
alias Pleroma.Web.StreamerView
- @behaviour :cowboy_websocket
+ @behaviour Phoenix.Socket.Transport
# Client ping period.
@tick :timer.seconds(30)
- # Cowboy timeout period.
- @timeout :timer.seconds(60)
- # Hibernate every X messages
- @hibernate_every 100
-
- def init(%{qs: qs} = req, state) do
- with params <- Enum.into(:cow_qs.parse_qs(qs), %{}),
- sec_websocket <- :cowboy_req.header("sec-websocket-protocol", req, nil),
- access_token <- Map.get(params, "access_token"),
- {:ok, user, oauth_token} <- authenticate_request(access_token, sec_websocket),
- {:ok, topic} <- Streamer.get_topic(params["stream"], user, oauth_token, params) do
- req =
- if sec_websocket do
- :cowboy_req.set_resp_header("sec-websocket-protocol", sec_websocket, req)
- else
- req
- end
+ @impl Phoenix.Socket.Transport
+ def child_spec(_opts), do: :ignore
+
+ # This only prepares the connection; we are not inside the transport process yet
+ @impl Phoenix.Socket.Transport
+ def connect(%{params: params} = transport_info) do
+ with access_token <- Map.get(params, "access_token"),
+ {:ok, user, oauth_token} <- authenticate_request(access_token),
+ {:ok, topic} <-
+ Streamer.get_topic(params["stream"], user, oauth_token, params) do
topics =
if topic do
[topic]
else
[]
end
- {:cowboy_websocket, req,
- %{user: user, topics: topics, oauth_token: oauth_token, count: 0, timer: nil},
- %{idle_timeout: @timeout}}
+ state = %{
+ user: user,
+ topics: topics,
+ oauth_token: oauth_token,
+ count: 0,
+ timer: nil
+ }
+
+ {:ok, state}
else
{:error, :bad_topic} ->
- Logger.debug("#{__MODULE__} bad topic #{inspect(req)}")
- req = :cowboy_req.reply(404, req)
- {:ok, req, state}
+ Logger.debug("#{__MODULE__} bad topic #{inspect(transport_info)}")
+
+ {:error, :bad_topic}
{:error, :unauthorized} ->
- Logger.debug("#{__MODULE__} authentication error: #{inspect(req)}")
- req = :cowboy_req.reply(401, req)
- {:ok, req, state}
+ Logger.debug("#{__MODULE__} authentication error: #{inspect(transport_info)}")
+ {:error, :unauthorized}
end
end
- def websocket_init(state) do
- Logger.debug(
- "#{__MODULE__} accepted websocket connection for user #{(state.user || %{id: "anonymous"}).id}, topics #{state.topics}"
- )
-
+ # Subscriptions, links and messages cannot be created
+ # until the process is launched with init/1
+ @impl Phoenix.Socket.Transport
+ def init(state) do
Enum.each(state.topics, fn topic -> Streamer.add_socket(topic, state.oauth_token) end)
- {:ok, %{state | timer: timer()}}
- end
- # Client's Pong frame.
- def websocket_handle(:pong, state) do
- if state.timer, do: Process.cancel_timer(state.timer)
- {:ok, %{state | timer: timer()}}
- end
+ Process.send_after(self(), :ping, @tick)
- # We only receive pings for now
- def websocket_handle(:ping, state), do: {:ok, state}
+ {:ok, state}
+ end
- def websocket_handle({:text, text}, state) do
+ @impl Phoenix.Socket.Transport
+ def handle_in({text, [opcode: :text]}, state) do
with {:ok, %{} = event} <- Jason.decode(text) do
handle_client_event(event, state)
else
_ ->
Logger.error("#{__MODULE__} received non-JSON event: #{inspect(text)}")
{:ok, state}
end
end
- def websocket_handle(frame, state) do
+ def handle_in(frame, state) do
Logger.error("#{__MODULE__} received frame: #{inspect(frame)}")
{:ok, state}
end
- def websocket_info({:render_with_user, view, template, item, topic}, state) do
+ @impl Phoenix.Socket.Transport
+ def handle_info({:render_with_user, view, template, item, topic}, state) do
user = %User{} = User.get_cached_by_ap_id(state.user.ap_id)
unless Streamer.filtered_by_user?(user, item) do
- websocket_info({:text, view.render(template, item, user, topic)}, %{state | user: user})
+ message = view.render(template, item, user, topic)
+ {:push, {:text, message}, %{state | user: user}}
else
{:ok, state}
end
end
- def websocket_info({:text, message}, state) do
- # If the websocket processed X messages, force an hibernate/GC.
- # We don't hibernate at every message to balance CPU usage/latency with RAM usage.
- if state.count > @hibernate_every do
- {:reply, {:text, message}, %{state | count: 0}, :hibernate}
- else
- {:reply, {:text, message}, %{state | count: state.count + 1}}
- end
+ def handle_info({:text, text}, state) do
+ {:push, {:text, text}, state}
end
- # Ping tick. We don't re-queue a timer there, it is instead queued when :pong is received.
- # As we hibernate there, reset the count to 0.
- # If the client misses :pong, Cowboy will automatically timeout the connection after
- # `@idle_timeout`.
- def websocket_info(:tick, state) do
- {:reply, :ping, %{state | timer: nil, count: 0}, :hibernate}
+ def handle_info(:ping, state) do
+ Process.send_after(self(), :ping, @tick)
+
+ {:push, {:ping, ""}, state}
end
- def websocket_info(:close, state) do
- {:stop, state}
+ def handle_info(:close, state) do
+ {:stop, {:closed, 'connection closed by server'}, state}
end
- # State can be `[]` only in case we terminate before switching to websocket,
- # we already log errors for these cases in `init/1`, so just do nothing here
- def terminate(_reason, _req, []), do: :ok
+ def handle_info(msg, state) do
+ Logger.debug("#{__MODULE__} received info: #{inspect(msg)}")
- def terminate(reason, _req, state) do
+ {:ok, state}
+ end
+
+ @impl Phoenix.Socket.Transport
+ def terminate(reason, state) do
Logger.debug(
- "#{__MODULE__} terminating websocket connection for user #{(state.user || %{id: "anonymous"}).id}, topics #{state.topics || "?"}: #{inspect(reason)}"
+ "#{__MODULE__} terminating websocket connection for user #{(state.user || %{id: "anonymous"}).id}, topics #{state.topics || "?"}: #{inspect(reason)})"
)
Enum.each(state.topics, fn topic -> Streamer.remove_socket(topic) end)
:ok
end
# Public streams without authentication.
- defp authenticate_request(nil, nil) do
+ defp authenticate_request(nil) do
{:ok, nil, nil}
end
# Authenticated streams.
- defp authenticate_request(access_token, sec_websocket) do
- token = access_token || sec_websocket
-
- with true <- is_bitstring(token),
- oauth_token = %Token{user_id: user_id} <- Repo.get_by(Token, token: token),
+ defp authenticate_request(access_token) do
+ with oauth_token = %Token{user_id: user_id} <- Repo.get_by(Token, token: access_token),
user = %User{} <- User.get_cached_by_id(user_id) do
{:ok, user, oauth_token}
else
_ -> {:error, :unauthorized}
end
end
- defp timer do
- Process.send_after(self(), :tick, @tick)
- end
-
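# Client-initiated stream management over the websocket: subscribe, unsubscribe and late authentication requests.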
defp handle_client_event(%{"type" => "subscribe", "stream" => _topic} = params, state) do
with {_, {:ok, topic}} <-
{:topic, Streamer.get_topic(params["stream"], state.user, state.oauth_token, params)},
{_, false} <- {:subscribed, topic in state.topics} do
Streamer.add_socket(topic, state.oauth_token)
- {[
- {:text,
- StreamerView.render("pleroma_respond.json", %{type: "subscribe", result: "success"})}
- ], %{state | topics: [topic | state.topics]}}
+ message =
+ StreamerView.render("pleroma_respond.json", %{type: "subscribe", result: "success"})
+
+ {:reply, :ok, {:text, message}, %{state | topics: [topic | state.topics]}}
else
{:subscribed, true} ->
- {[
- {:text,
- StreamerView.render("pleroma_respond.json", %{type: "subscribe", result: "ignored"})}
- ], state}
+ message =
+ StreamerView.render("pleroma_respond.json", %{type: "subscribe", result: "ignored"})
+
+ {:reply, :error, {:text, message}, state}
{:topic, {:error, error}} ->
- {[
- {:text,
- StreamerView.render("pleroma_respond.json", %{
- type: "subscribe",
- result: "error",
- error: error
- })}
- ], state}
+ message =
+ StreamerView.render("pleroma_respond.json", %{
+ type: "subscribe",
+ result: "error",
+ error: error
+ })
+
+ {:reply, :error, {:text, message}, state}
end
end
defp handle_client_event(%{"type" => "unsubscribe", "stream" => _topic} = params, state) do
with {_, {:ok, topic}} <-
{:topic, Streamer.get_topic(params["stream"], state.user, state.oauth_token, params)},
{_, true} <- {:subscribed, topic in state.topics} do
Streamer.remove_socket(topic)
- {[
- {:text,
- StreamerView.render("pleroma_respond.json", %{type: "unsubscribe", result: "success"})}
- ], %{state | topics: List.delete(state.topics, topic)}}
+ message =
+ StreamerView.render("pleroma_respond.json", %{type: "unsubscribe", result: "success"})
+
+ {:reply, :ok, {:text, message}, %{state | topics: List.delete(state.topics, topic)}}
else
{:subscribed, false} ->
- {[
- {:text,
- StreamerView.render("pleroma_respond.json", %{type: "unsubscribe", result: "ignored"})}
- ], state}
+ message =
+ StreamerView.render("pleroma_respond.json", %{type: "unsubscribe", result: "ignored"})
+
+ {:reply, :error, {:text, message}, state}
{:topic, {:error, error}} ->
- {[
- {:text,
- StreamerView.render("pleroma_respond.json", %{
- type: "unsubscribe",
- result: "error",
- error: error
- })}
- ], state}
+ message =
+ StreamerView.render("pleroma_respond.json", %{
+ type: "unsubscribe",
+ result: "error",
+ error: error
+ })
+
+ {:reply, :error, {:text, message}, state}
end
end
defp handle_client_event(
%{"type" => "pleroma:authenticate", "token" => access_token} = _params,
state
) do
with {:auth, nil, nil} <- {:auth, state.user, state.oauth_token},
- {:ok, user, oauth_token} <- authenticate_request(access_token, nil) do
- {[
- {:text,
- StreamerView.render("pleroma_respond.json", %{
- type: "pleroma:authenticate",
- result: "success"
- })}
- ], %{state | user: user, oauth_token: oauth_token}}
+ {:ok, user, oauth_token} <- authenticate_request(access_token) do
+ message =
+ StreamerView.render("pleroma_respond.json", %{
+ type: "pleroma:authenticate",
+ result: "success"
+ })
+
+ {:reply, :ok, {:text, message}, %{state | user: user, oauth_token: oauth_token}}
else
{:auth, _, _} ->
- {[
- {:text,
- StreamerView.render("pleroma_respond.json", %{
- type: "pleroma:authenticate",
- result: "error",
- error: :already_authenticated
- })}
- ], state}
+ message =
+ StreamerView.render("pleroma_respond.json", %{
+ type: "pleroma:authenticate",
+ result: "error",
+ error: :already_authenticated
+ })
+
+ {:reply, :error, {:text, message}, state}
_ ->
- {[
- {:text,
- StreamerView.render("pleroma_respond.json", %{
- type: "pleroma:authenticate",
- result: "error",
- error: :unauthorized
- })}
- ], state}
+ message =
+ StreamerView.render("pleroma_respond.json", %{
+ type: "pleroma:authenticate",
+ result: "error",
+ error: :unauthorized
+ })
+
+ {:reply, :error, {:text, message}, state}
end
end
defp handle_client_event(params, state) do
Logger.error("#{__MODULE__} received unknown event: #{inspect(params)}")
- {[], state}
+ {:ok, state}
+ end
+
+ def handle_error(conn, :unauthorized) do
+ Plug.Conn.send_resp(conn, 401, "Unauthorized")
+ end
+
+ def handle_error(conn, _reason) do
+ Plug.Conn.send_resp(conn, 404, "Not Found")
end
end
diff --git a/lib/pleroma/web/o_auth/o_auth_controller.ex b/lib/pleroma/web/o_auth/o_auth_controller.ex
index 2bbe5d5fa..47b03215f 100644
--- a/lib/pleroma/web/o_auth/o_auth_controller.ex
+++ b/lib/pleroma/web/o_auth/o_auth_controller.ex
@@ -1,633 +1,628 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.OAuth.OAuthController do
use Pleroma.Web, :controller
alias Pleroma.Helpers.AuthHelper
alias Pleroma.Helpers.UriHelper
alias Pleroma.Maps
alias Pleroma.MFA
alias Pleroma.Registration
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Web.Auth.WrapperAuthenticator, as: Authenticator
alias Pleroma.Web.OAuth.App
alias Pleroma.Web.OAuth.Authorization
alias Pleroma.Web.OAuth.MFAController
alias Pleroma.Web.OAuth.MFAView
alias Pleroma.Web.OAuth.OAuthView
alias Pleroma.Web.OAuth.Scopes
alias Pleroma.Web.OAuth.Token
alias Pleroma.Web.OAuth.Token.Strategy.RefreshToken
alias Pleroma.Web.OAuth.Token.Strategy.Revoke, as: RevokeToken
alias Pleroma.Web.Plugs.RateLimiter
alias Pleroma.Web.Utils.Params
require Logger
if Pleroma.Config.oauth_consumer_enabled?(), do: plug(Ueberauth)
plug(:fetch_session)
plug(:fetch_flash)
plug(:skip_auth)
plug(RateLimiter, [name: :authentication] when action == :create_authorization)
action_fallback(Pleroma.Web.OAuth.FallbackController)
@oob_token_redirect_uri "urn:ietf:wg:oauth:2.0:oob"
# Note: this definition is only called from error-handling methods with `conn.params` as 2nd arg
def authorize(%Plug.Conn{} = conn, %{"authorization" => _} = params) do
{auth_attrs, params} = Map.pop(params, "authorization")
authorize(conn, Map.merge(params, auth_attrs))
end
def authorize(%Plug.Conn{assigns: %{token: %Token{}}} = conn, %{"force_login" => _} = params) do
if Params.truthy_param?(params["force_login"]) do
do_authorize(conn, params)
else
handle_existing_authorization(conn, params)
end
end
# Note: the token is set in oauth_plug, but the token and client do not always go together.
# For example, MastodonFE's token is set if user requests with another client,
# after user already authorized to MastodonFE.
# So we have to check client and token.
def authorize(
%Plug.Conn{assigns: %{token: %Token{} = token}} = conn,
%{"client_id" => client_id} = params
) do
with %Token{} = t <- Repo.get_by(Token, token: token.token) |> Repo.preload(:app),
^client_id <- t.app.client_id do
handle_existing_authorization(conn, params)
else
_ -> do_authorize(conn, params)
end
end
def authorize(%Plug.Conn{} = conn, params), do: do_authorize(conn, params)
defp do_authorize(%Plug.Conn{} = conn, params) do
app = Repo.get_by(App, client_id: params["client_id"])
available_scopes = (app && app.scopes) || []
scopes = Scopes.fetch_scopes(params, available_scopes)
user =
with %{assigns: %{user: %User{} = user}} <- conn do
user
else
_ -> nil
end
scopes =
if scopes == [] do
available_scopes
else
scopes
end
# Note: `params` might differ from `conn.params`; use `@params` not `@conn.params` in template
render(conn, Authenticator.auth_template(), %{
user: user,
app: app && Map.delete(app, :client_secret),
response_type: params["response_type"],
client_id: params["client_id"],
available_scopes: available_scopes,
scopes: scopes,
redirect_uri: params["redirect_uri"],
state: params["state"],
params: params
})
end
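# For a request that already carries a valid token: the out-of-band redirect URI just renders the existing
# token; any other redirect URI must be registered for the app before we redirect back with the access token.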
defp handle_existing_authorization(
%Plug.Conn{assigns: %{token: %Token{} = token}} = conn,
%{"redirect_uri" => @oob_token_redirect_uri}
) do
render(conn, "oob_token_exists.html", %{token: token})
end
defp handle_existing_authorization(
%Plug.Conn{assigns: %{token: %Token{} = token}} = conn,
%{} = params
) do
app = Repo.preload(token, :app).app
redirect_uri =
if is_binary(params["redirect_uri"]) do
params["redirect_uri"]
else
default_redirect_uri(app)
end
if redirect_uri in String.split(app.redirect_uris) do
redirect_uri = redirect_uri(conn, redirect_uri)
url_params = %{access_token: token.token}
url_params = Maps.put_if_present(url_params, :state, params["state"])
url = UriHelper.modify_uri_params(redirect_uri, url_params)
redirect(conn, external: url)
else
conn
|> put_flash(:error, dgettext("errors", "Unlisted redirect_uri."))
|> redirect(external: redirect_uri(conn, redirect_uri))
end
end
def create_authorization(_, _, opts \\ [])
def create_authorization(%Plug.Conn{assigns: %{user: %User{} = user}} = conn, params, []) do
create_authorization(conn, params, user: user)
end
def create_authorization(%Plug.Conn{} = conn, %{"authorization" => _} = params, opts) do
with {:ok, auth, user} <- do_create_authorization(conn, params, opts[:user]),
{:mfa_required, _, _, false} <- {:mfa_required, user, auth, MFA.require?(user)} do
after_create_authorization(conn, auth, params)
else
error ->
handle_create_authorization_error(conn, error, params)
end
end
def after_create_authorization(%Plug.Conn{} = conn, %Authorization{} = auth, %{
"authorization" => %{"redirect_uri" => @oob_token_redirect_uri}
}) do
# Enforcing the view to reuse the template when calling from other controllers
conn
|> put_view(OAuthView)
|> render("oob_authorization_created.html", %{auth: auth})
end
def after_create_authorization(%Plug.Conn{} = conn, %Authorization{} = auth, %{
"authorization" => %{"redirect_uri" => redirect_uri} = auth_attrs
}) do
app = Repo.preload(auth, :app).app
# An extra safety measure before we redirect (also done in `do_create_authorization/2`)
if redirect_uri in String.split(app.redirect_uris) do
redirect_uri = redirect_uri(conn, redirect_uri)
url_params = %{code: auth.token}
url_params = Maps.put_if_present(url_params, :state, auth_attrs["state"])
url = UriHelper.modify_uri_params(redirect_uri, url_params)
redirect(conn, external: url)
else
conn
|> put_flash(:error, dgettext("errors", "Unlisted redirect_uri."))
|> redirect(external: redirect_uri(conn, redirect_uri))
end
end
defp handle_create_authorization_error(
%Plug.Conn{} = conn,
{:error, scopes_issue},
%{"authorization" => _} = params
)
when scopes_issue in [:unsupported_scopes, :missing_scopes] do
# Per https://github.com/tootsuite/mastodon/blob/
# 51e154f5e87968d6bb115e053689767ab33e80cd/app/controllers/api/base_controller.rb#L39
conn
|> put_flash(:error, dgettext("errors", "This action is outside the authorized scopes"))
|> put_status(:unauthorized)
|> authorize(params)
end
defp handle_create_authorization_error(
%Plug.Conn{} = conn,
{:account_status, :confirmation_pending},
%{"authorization" => _} = params
) do
conn
|> put_flash(:error, dgettext("errors", "Your login is missing a confirmed e-mail address"))
|> put_status(:forbidden)
|> authorize(params)
end
defp handle_create_authorization_error(
%Plug.Conn{} = conn,
{:mfa_required, user, auth, _},
params
) do
{:ok, token} = MFA.Token.create(user, auth)
data = %{
"mfa_token" => token.token,
"redirect_uri" => params["authorization"]["redirect_uri"],
"state" => params["authorization"]["state"]
}
MFAController.show(conn, data)
end
defp handle_create_authorization_error(
%Plug.Conn{} = conn,
{:account_status, :password_reset_pending},
%{"authorization" => _} = params
) do
conn
|> put_flash(:error, dgettext("errors", "Password reset is required"))
|> put_status(:forbidden)
|> authorize(params)
end
defp handle_create_authorization_error(
%Plug.Conn{} = conn,
{:account_status, :deactivated},
%{"authorization" => _} = params
) do
conn
|> put_flash(:error, dgettext("errors", "Your account is currently disabled"))
|> put_status(:forbidden)
|> authorize(params)
end
defp handle_create_authorization_error(%Plug.Conn{} = conn, error, %{"authorization" => _}) do
Authenticator.handle_error(conn, error)
end
@doc "Renew access_token with refresh_token"
def token_exchange(
%Plug.Conn{} = conn,
%{"grant_type" => "refresh_token", "refresh_token" => token} = _params
) do
with {:ok, app} <- Token.Utils.fetch_app(conn),
{:ok, %{user: user} = token} <- Token.get_by_refresh_token(app, token),
{:ok, token} <- RefreshToken.grant(token) do
after_token_exchange(conn, %{user: user, token: token})
else
_error -> render_invalid_credentials_error(conn)
end
end
def token_exchange(%Plug.Conn{} = conn, %{"grant_type" => "authorization_code"} = params) do
with {:ok, app} <- Token.Utils.fetch_app(conn),
fixed_token = Token.Utils.fix_padding(params["code"]),
{:ok, auth} <- Authorization.get_by_token(app, fixed_token),
%User{} = user <- User.get_cached_by_id(auth.user_id),
{:ok, token} <- Token.exchange_token(app, auth) do
after_token_exchange(conn, %{user: user, token: token})
else
error ->
handle_token_exchange_error(conn, error)
end
end
def token_exchange(
%Plug.Conn{} = conn,
%{"grant_type" => "password"} = params
) do
with {:ok, %User{} = user} <- Authenticator.get_user(conn),
{:ok, app} <- Token.Utils.fetch_app(conn),
requested_scopes <- Scopes.fetch_scopes(params, app.scopes),
{:ok, token} <- login(user, app, requested_scopes) do
after_token_exchange(conn, %{user: user, token: token})
else
error ->
handle_token_exchange_error(conn, error)
end
end
def token_exchange(
%Plug.Conn{} = conn,
%{"grant_type" => "password", "name" => name, "password" => _password} = params
) do
params =
params
|> Map.delete("name")
|> Map.put("username", name)
token_exchange(conn, params)
end
def token_exchange(%Plug.Conn{} = conn, %{"grant_type" => "client_credentials"} = _params) do
with {:ok, app} <- Token.Utils.fetch_app(conn),
{:ok, auth} <- Authorization.create_authorization(app, %User{}),
{:ok, token} <- Token.exchange_token(app, auth) do
after_token_exchange(conn, %{token: token})
else
_error ->
handle_token_exchange_error(conn, :invalid_credentials)
end
end
# Bad request
def token_exchange(%Plug.Conn{} = conn, params), do: bad_request(conn, params)
def after_token_exchange(%Plug.Conn{} = conn, %{token: token} = view_params) do
conn
|> AuthHelper.put_session_token(token.token)
|> json(OAuthView.render("token.json", view_params))
end
defp handle_token_exchange_error(%Plug.Conn{} = conn, {:mfa_required, user, auth, _}) do
conn
|> put_status(:forbidden)
|> json(build_and_response_mfa_token(user, auth))
end
defp handle_token_exchange_error(%Plug.Conn{} = conn, {:account_status, :deactivated}) do
render_error(
conn,
:forbidden,
"Your account is currently disabled",
%{},
"account_is_disabled"
)
end
defp handle_token_exchange_error(
%Plug.Conn{} = conn,
{:account_status, :password_reset_pending}
) do
render_error(
conn,
:forbidden,
"Password reset is required",
%{},
"password_reset_required"
)
end
defp handle_token_exchange_error(%Plug.Conn{} = conn, {:account_status, :confirmation_pending}) do
render_error(
conn,
:forbidden,
"Your login is missing a confirmed e-mail address",
%{},
"missing_confirmed_email"
)
end
defp handle_token_exchange_error(%Plug.Conn{} = conn, {:account_status, :approval_pending}) do
render_error(
conn,
:forbidden,
"Your account is awaiting approval.",
%{},
"awaiting_approval"
)
end
defp handle_token_exchange_error(%Plug.Conn{} = conn, _error) do
render_invalid_credentials_error(conn)
end
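# Revoke the given token; if it also backs the current session, drop the session binding as well.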
def token_revoke(%Plug.Conn{} = conn, %{"token" => token}) do
with {:ok, %Token{} = oauth_token} <- Token.get_by_token(token),
{:ok, oauth_token} <- RevokeToken.revoke(oauth_token) do
conn =
with session_token = AuthHelper.get_session_token(conn),
%Token{token: ^session_token} <- oauth_token do
AuthHelper.delete_session_token(conn)
else
_ -> conn
end
json(conn, %{})
else
_error ->
# RFC 7009: invalid tokens [in the request] do not cause an error response
json(conn, %{})
end
end
def token_revoke(%Plug.Conn{} = conn, params), do: bad_request(conn, params)
# Response for bad request
defp bad_request(%Plug.Conn{} = conn, _) do
render_error(conn, :internal_server_error, "Bad request")
end
@doc "Prepares OAuth request to provider for Ueberauth"
def prepare_request(%Plug.Conn{} = conn, %{
"provider" => provider,
"authorization" => auth_attrs
}) do
scope =
auth_attrs
|> Scopes.fetch_scopes([])
|> Scopes.to_string()
state =
auth_attrs
|> Map.delete("scopes")
|> Map.put("scope", scope)
|> Jason.encode!()
params =
auth_attrs
|> Map.drop(~w(scope scopes client_id redirect_uri))
|> Map.put("state", state)
# Handing the request to Ueberauth
redirect(conn, to: Routes.o_auth_path(conn, :request, provider, params))
end
def request(%Plug.Conn{} = conn, params) do
message =
if params["provider"] do
dgettext("errors", "Unsupported OAuth provider: %{provider}.",
provider: params["provider"]
)
else
dgettext("errors", "Bad OAuth request.")
end
conn
|> put_flash(:error, message)
|> redirect(to: "/")
end
def callback(%Plug.Conn{assigns: %{ueberauth_failure: failure}} = conn, params) do
params = callback_params(params)
messages = for e <- Map.get(failure, :errors, []), do: e.message
message = Enum.join(messages, "; ")
conn
|> put_flash(
:error,
dgettext("errors", "Failed to authenticate: %{message}.", message: message)
)
|> redirect(external: redirect_uri(conn, params["redirect_uri"]))
end
def callback(%Plug.Conn{} = conn, params) do
params = callback_params(params)
with {:ok, registration} <- Authenticator.get_registration(conn) do
auth_attrs = Map.take(params, ~w(client_id redirect_uri scope scopes state))
case Repo.get_assoc(registration, :user) do
{:ok, user} ->
create_authorization(conn, %{"authorization" => auth_attrs}, user: user)
_ ->
registration_params =
Map.merge(auth_attrs, %{
"nickname" => Registration.nickname(registration),
"email" => Registration.email(registration)
})
conn
|> put_session_registration_id(registration.id)
|> registration_details(%{"authorization" => registration_params})
end
else
error ->
Logger.debug(inspect(["OAUTH_ERROR", error, conn.assigns]))
conn
|> put_flash(:error, dgettext("errors", "Failed to set up user account."))
|> redirect(external: redirect_uri(conn, params["redirect_uri"]))
end
end
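# The OAuth "state" parameter carries the original authorization attributes through the external provider
# as JSON; merge them back into the params.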
defp callback_params(%{"state" => state} = params) do
Map.merge(params, Jason.decode!(state))
end
def registration_details(%Plug.Conn{} = conn, %{"authorization" => auth_attrs}) do
render(conn, "register.html", %{
client_id: auth_attrs["client_id"],
redirect_uri: auth_attrs["redirect_uri"],
state: auth_attrs["state"],
scopes: Scopes.fetch_scopes(auth_attrs, []),
nickname: auth_attrs["nickname"],
email: auth_attrs["email"]
})
end
def register(%Plug.Conn{} = conn, %{"authorization" => _, "op" => "connect"} = params) do
with registration_id when not is_nil(registration_id) <- get_session_registration_id(conn),
%Registration{} = registration <- Repo.get(Registration, registration_id),
{_, {:ok, auth, _user}} <-
{:create_authorization, do_create_authorization(conn, params)},
%User{} = user <- Repo.preload(auth, :user).user,
{:ok, _updated_registration} <- Registration.bind_to_user(registration, user) do
conn
|> put_session_registration_id(nil)
|> after_create_authorization(auth, params)
else
{:create_authorization, error} ->
{:register, handle_create_authorization_error(conn, error, params)}
_ ->
{:register, :generic_error}
end
end
def register(%Plug.Conn{} = conn, %{"authorization" => _, "op" => "register"} = params) do
with registration_id when not is_nil(registration_id) <- get_session_registration_id(conn),
%Registration{} = registration <- Repo.get(Registration, registration_id),
{:ok, user} <- Authenticator.create_from_registration(conn, registration) do
conn
|> put_session_registration_id(nil)
|> create_authorization(
params,
user: user
)
else
{:error, changeset} ->
message =
Enum.map(changeset.errors, fn {field, {error, _}} ->
"#{field} #{error}"
end)
|> Enum.join("; ")
message =
String.replace(
message,
"ap_id has already been taken",
"nickname has already been taken"
)
conn
|> put_status(:forbidden)
|> put_flash(:error, "Error: #{message}.")
|> registration_details(params)
_ ->
{:register, :generic_error}
end
end
defp do_create_authorization(conn, auth_attrs, user \\ nil)
defp do_create_authorization(
%Plug.Conn{} = conn,
%{
"authorization" =>
%{
"client_id" => client_id,
"redirect_uri" => redirect_uri
} = auth_attrs
},
user
) do
with {_, {:ok, %User{} = user}} <-
{:get_user, (user && {:ok, user}) || Authenticator.get_user(conn)},
%App{} = app <- Repo.get_by(App, client_id: client_id),
true <- redirect_uri in String.split(app.redirect_uris),
requested_scopes <- Scopes.fetch_scopes(auth_attrs, app.scopes),
{:ok, auth} <- do_create_authorization(user, app, requested_scopes) do
{:ok, auth, user}
end
end
defp do_create_authorization(%User{} = user, %App{} = app, requested_scopes)
when is_list(requested_scopes) do
with {:account_status, :active} <- {:account_status, User.account_status(user)},
{:ok, scopes} <- validate_scopes(app, requested_scopes),
{:ok, auth} <- Authorization.create_authorization(app, user, scopes) do
{:ok, auth}
end
end
# Note: intended to be a private function but opened up for AccountController, which logs the user in on signup
@doc "If checks pass, creates authorization and token for given user, app and requested scopes."
def login(%User{} = user, %App{} = app, requested_scopes) when is_list(requested_scopes) do
with {:ok, auth} <- do_create_authorization(user, app, requested_scopes),
{:mfa_required, _, _, false} <- {:mfa_required, user, auth, MFA.require?(user)},
{:ok, token} <- Token.exchange_token(app, auth) do
{:ok, token}
end
end
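login/3 above only returns a token when MFA is not required; when it is, the `with` falls through with the tagged tuple so the caller can hand back an MFA challenge instead. A hedged sketch of how a caller might branch on it (scope list is illustrative):

    case OAuthController.login(user, app, ["read", "write"]) do
      {:ok, %Token{} = token} -> token
      {:mfa_required, user, auth, true} -> MFA.Token.create(user, auth)
      other -> other   # scope or account-status errors pass straight through
    end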
defp redirect_uri(%Plug.Conn{}, redirect_uri), do: redirect_uri
defp get_session_registration_id(%Plug.Conn{} = conn), do: get_session(conn, :registration_id)
defp put_session_registration_id(%Plug.Conn{} = conn, registration_id),
do: put_session(conn, :registration_id, registration_id)
defp build_and_response_mfa_token(user, auth) do
with {:ok, token} <- MFA.Token.create(user, auth) do
MFAView.render("mfa_response.json", %{token: token, user: user})
end
end
- @spec validate_scopes(App.t(), map() | list()) ::
+ @spec validate_scopes(App.t(), list()) ::
{:ok, list()} | {:error, :missing_scopes | :unsupported_scopes}
- defp validate_scopes(%App{} = app, params) when is_map(params) do
- requested_scopes = Scopes.fetch_scopes(params, app.scopes)
- validate_scopes(app, requested_scopes)
- end
-
defp validate_scopes(%App{} = app, requested_scopes) when is_list(requested_scopes) do
Scopes.validate(requested_scopes, app.scopes)
end
def default_redirect_uri(%App{} = app) do
app.redirect_uris
|> String.split()
|> Enum.at(0)
end
defp render_invalid_credentials_error(conn) do
render_error(conn, :bad_request, "Invalid credentials")
end
end
diff --git a/lib/pleroma/web/o_status/o_status_controller.ex b/lib/pleroma/web/o_status/o_status_controller.ex
index 4f2cf02c3..ee7ef4a5d 100644
--- a/lib/pleroma/web/o_status/o_status_controller.ex
+++ b/lib/pleroma/web/o_status/o_status_controller.ex
@@ -1,140 +1,139 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.OStatus.OStatusController do
use Pleroma.Web, :controller
alias Pleroma.Activity
alias Pleroma.Object
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPubController
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.Endpoint
alias Pleroma.Web.Fallback.RedirectController
alias Pleroma.Web.Metadata.PlayerView
alias Pleroma.Web.Plugs.RateLimiter
alias Pleroma.Web.Router
plug(
RateLimiter,
[name: :ap_routes, params: ["uuid"]] when action in [:object, :activity]
)
plug(
Pleroma.Web.Plugs.SetFormatPlug
when action in [:object, :activity, :notice]
)
action_fallback(:errors)
def object(%{assigns: %{format: format}} = conn, _params)
when format in ["json", "activity+json"] do
ActivityPubController.call(conn, :object)
end
def object(conn, _params) do
with id <- Endpoint.url() <> conn.request_path,
{_, %Activity{} = activity} <-
{:activity, Activity.get_create_by_object_ap_id_with_object(id)},
{_, true} <- {:public?, Visibility.public?(activity)} do
redirect(conn, to: "/notice/#{activity.id}")
else
reason when reason in [{:public?, false}, {:activity, nil}] ->
{:error, :not_found}
e ->
e
end
end
def activity(%{assigns: %{format: format}} = conn, _params)
when format in ["json", "activity+json"] do
ActivityPubController.call(conn, :activity)
end
def activity(conn, _params) do
with id <- Endpoint.url() <> conn.request_path,
{_, %Activity{} = activity} <- {:activity, Activity.normalize(id)},
{_, true} <- {:public?, Visibility.public?(activity)} do
redirect(conn, to: "/notice/#{activity.id}")
else
reason when reason in [{:public?, false}, {:activity, nil}] ->
{:error, :not_found}
e ->
e
end
end
def notice(%{assigns: %{format: format}} = conn, %{"id" => id}) do
with {_, %Activity{} = activity} <- {:activity, Activity.get_by_id_with_object(id)},
{_, true} <- {:public?, Visibility.public?(activity)},
%User{} = user <- User.get_cached_by_ap_id(activity.data["actor"]) do
cond do
format in ["json", "activity+json"] ->
%{data: %{"id" => redirect_url}} = Object.normalize(activity, fetch: false)
redirect(conn, external: redirect_url)
activity.data["type"] == "Create" ->
%Object{} = object = Object.normalize(activity, fetch: false)
RedirectController.redirector_with_meta(
conn,
%{
activity_id: activity.id,
object: object,
url: Router.Helpers.o_status_url(Endpoint, :notice, activity.id),
user: user
}
)
true ->
RedirectController.redirector(conn, nil)
end
else
reason when reason in [{:public?, false}, {:activity, nil}] ->
conn
|> put_status(404)
|> RedirectController.redirector(nil, 404)
e ->
e
end
end
# Returns an embedded HTML <audio> or <video> player suitable for embed iframes.
def notice_player(conn, %{"id" => id}) do
with %Activity{data: %{"type" => "Create"}} = activity <- Activity.get_by_id_with_object(id),
true <- Visibility.public?(activity),
{_, true} <- {:visible?, Visibility.visible_for_user?(activity, _reading_user = nil)},
%Object{} = object <- Object.normalize(activity, fetch: false),
%{data: %{"attachment" => [%{"url" => [url | _]} | _]}} <- object,
true <- String.starts_with?(url["mediaType"], ["audio", "video"]) do
conn
- |> put_layout(:metadata_player)
|> put_resp_header("x-frame-options", "ALLOW")
|> put_resp_header(
"content-security-policy",
"default-src 'none';style-src 'self' 'unsafe-inline';img-src 'self' data: https:; media-src 'self' https:;"
)
|> put_view(PlayerView)
|> render("player.html", url)
else
_error ->
conn
|> put_status(404)
|> RedirectController.redirector(nil, 404)
end
end
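notice_player/2 only renders when the first URL of the first attachment is audio or video; everything else 404s into the redirector. A minimal sketch of the object data shape that satisfies the match (values are made up):

    %{
      "attachment" => [
        %{"url" => [%{"href" => "https://example.com/media/clip.mp4", "mediaType" => "video/mp4"}]}
      ]
    }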
defp errors(conn, {:error, :not_found}) do
render_error(conn, :not_found, "Not found")
end
defp errors(conn, {:fetch_user, nil}), do: errors(conn, {:error, :not_found})
defp errors(conn, _) do
render_error(conn, :internal_server_error, "Something went wrong")
end
end
diff --git a/lib/pleroma/web/pleroma_api/controllers/mascot_controller.ex b/lib/pleroma/web/pleroma_api/controllers/mascot_controller.ex
index 5e412196b..3208cde98 100644
--- a/lib/pleroma/web/pleroma_api/controllers/mascot_controller.ex
+++ b/lib/pleroma/web/pleroma_api/controllers/mascot_controller.ex
@@ -1,49 +1,49 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.PleromaAPI.MascotController do
use Pleroma.Web, :controller
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.Plugs.OAuthScopesPlug
plug(Majic.Plug, [pool: Pleroma.MajicPool] when action in [:update])
plug(Pleroma.Web.ApiSpec.CastAndValidate, replace_params: false)
plug(OAuthScopesPlug, %{scopes: ["read:accounts"]} when action == :show)
plug(OAuthScopesPlug, %{scopes: ["write:accounts"]} when action != :show)
defdelegate open_api_operation(action), to: Pleroma.Web.ApiSpec.PleromaMascotOperation
@doc "GET /api/v1/pleroma/mascot"
def show(%{assigns: %{user: user}} = conn, _params) do
json(conn, User.get_mascot(user))
end
@doc "PUT /api/v1/pleroma/mascot"
def update(
%{assigns: %{user: user}, private: %{open_api_spex: %{body_params: %{file: file}}}} =
conn,
_
) do
with {:content_type, "image" <> _} <- {:content_type, file.content_type},
- {:ok, object} <- ActivityPub.upload(file, actor: User.ap_id(user)) do
+ {_, {:ok, object}} <- {:upload, ActivityPub.upload(file, actor: User.ap_id(user))} do
attachment = render_attachment(object)
{:ok, _user} = User.mascot_update(user, attachment)
json(conn, attachment)
else
{:content_type, _} ->
render_error(conn, :unsupported_media_type, "mascots can only be images")
{:upload, {:error, _}} ->
render_error(conn, :error, "error uploading file")
end
end
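The change above wraps ActivityPub.upload/2 in an {:upload, ...} tag so the `else` branch can actually tell an upload failure apart from a bad content type. The three outcomes, roughly:

    {:content_type, "text/plain"}        # -> 415 "mascots can only be images"
    {:upload, {:error, _reason}}         # -> "error uploading file"
    {:upload, {:ok, %Object{}}}          # -> attachment rendered and mascot updated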
defp render_attachment(object) do
attachment_data = Map.put(object.data, "id", object.id)
Pleroma.Web.MastodonAPI.StatusView.render("attachment.json", %{attachment: attachment_data})
end
end
diff --git a/lib/pleroma/web/plugs/rate_limiter/supervisor.ex b/lib/pleroma/web/plugs/rate_limiter/supervisor.ex
index f00f3d95e..5f79a3e3e 100644
--- a/lib/pleroma/web/plugs/rate_limiter/supervisor.ex
+++ b/lib/pleroma/web/plugs/rate_limiter/supervisor.ex
@@ -1,20 +1,20 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.Plugs.RateLimiter.Supervisor do
use Supervisor
def start_link(opts) do
Supervisor.start_link(__MODULE__, opts, name: __MODULE__)
end
def init(_args) do
children = [
Pleroma.Web.Plugs.RateLimiter.LimiterSupervisor
]
- opts = [strategy: :one_for_one, name: Pleroma.Web.Streamer.Supervisor]
+ opts = [strategy: :one_for_one]
Supervisor.init(children, opts)
end
end
diff --git a/lib/pleroma/web/rich_media/helpers.ex b/lib/pleroma/web/rich_media/helpers.ex
index 61000bb9b..a711dc436 100644
--- a/lib/pleroma/web/rich_media/helpers.ex
+++ b/lib/pleroma/web/rich_media/helpers.ex
@@ -1,131 +1,109 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.RichMedia.Helpers do
alias Pleroma.Activity
alias Pleroma.HTML
alias Pleroma.Object
alias Pleroma.Web.RichMedia.Parser
+ @cachex Pleroma.Config.get([:cachex, :provider], Cachex)
+
@config_impl Application.compile_env(:pleroma, [__MODULE__, :config_impl], Pleroma.Config)
@options [
pool: :media,
max_body: 2_000_000,
recv_timeout: 2_000
]
- @spec validate_page_url(URI.t() | binary()) :: :ok | :error
- defp validate_page_url(page_url) when is_binary(page_url) do
- validate_tld = @config_impl.get([Pleroma.Formatter, :validate_tld])
-
- page_url
- |> Linkify.Parser.url?(validate_tld: validate_tld)
- |> parse_uri(page_url)
- end
-
- defp validate_page_url(%URI{host: host, scheme: "https", authority: authority})
- when is_binary(authority) do
- cond do
- host in @config_impl.get([:rich_media, :ignore_hosts], []) ->
- :error
-
- get_tld(host) in @config_impl.get([:rich_media, :ignore_tld], []) ->
- :error
-
- true ->
- :ok
- end
- end
-
- defp validate_page_url(_), do: :error
-
- defp parse_uri(true, url) do
- url
- |> URI.parse()
- |> validate_page_url
- end
-
- defp parse_uri(_, _), do: :error
-
- defp get_tld(host) do
- host
- |> String.split(".")
- |> Enum.reverse()
- |> hd
- end
-
def fetch_data_for_object(object) do
with true <- @config_impl.get([:rich_media, :enabled]),
{:ok, page_url} <-
HTML.extract_first_external_url_from_object(object),
- :ok <- validate_page_url(page_url),
{:ok, rich_media} <- Parser.parse(page_url) do
%{page_url: page_url, rich_media: rich_media}
else
_ -> %{}
end
end
def fetch_data_for_activity(%Activity{data: %{"type" => "Create"}} = activity) do
with true <- @config_impl.get([:rich_media, :enabled]),
%Object{} = object <- Object.normalize(activity, fetch: false) do
- fetch_data_for_object(object)
+ if object.data["fake"] do
+ fetch_data_for_object(object)
+ else
+ key = "URL|#{activity.id}"
+
+ @cachex.fetch!(:scrubber_cache, key, fn _ ->
+ result = fetch_data_for_object(object)
+
+ cond do
+ match?(%{page_url: _, rich_media: _}, result) ->
+ Activity.HTML.add_cache_key_for(activity.id, key)
+ {:commit, result}
+
+ true ->
+ {:ignore, %{}}
+ end
+ end)
+ end
else
_ -> %{}
end
end
def fetch_data_for_activity(_), do: %{}
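fetch_data_for_activity/1 now memoizes the rich media card per activity via Cachex's fetch fallback: returning {:commit, value} stores the card (and records the cache key for later invalidation), while {:ignore, value} hands the value back without caching so a failed fetch can be retried. A small isolated sketch of those semantics (cache key and lookup are illustrative):

    Cachex.fetch!(:scrubber_cache, "URL|some-activity-id", fn _key ->
      case expensive_lookup() do                                  # stands in for fetch_data_for_object/1
        %{page_url: _, rich_media: _} = card -> {:commit, card}   # cache the result
        _ -> {:ignore, %{}}                                       # return empty, don't cache
      end
    end)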
def rich_media_get(url) do
headers = [{"user-agent", Pleroma.Application.user_agent() <> "; Bot"}]
head_check =
case Pleroma.HTTP.head(url, headers, @options) do
# If the HEAD request didn't reach the server for whatever reason,
# we assume the GET that comes right after won't either
{:error, _} = e ->
e
{:ok, %Tesla.Env{status: 200, headers: headers}} ->
with :ok <- check_content_type(headers),
:ok <- check_content_length(headers),
do: :ok
_ ->
:ok
end
with :ok <- head_check, do: Pleroma.HTTP.get(url, headers, @options)
end
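rich_media_get/1 does a HEAD pre-flight so obviously unsuitable targets are rejected before the body is ever fetched; a HEAD that fails to reach the server short-circuits the GET, while a HEAD answered with anything other than 200 is waved through. The checks defined just below behave roughly like this (header lists are made up):

    :ok = check_content_type([{"content-type", "text/html; charset=utf-8"}])
    {:error, {:content_type, "image/png"}} = check_content_type([{"content-type", "image/png"}])
    {:error, :body_too_large} = check_content_length([{"content-length", "3000000"}])  # over the 2_000_000 @max_body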
defp check_content_type(headers) do
case List.keyfind(headers, "content-type", 0) do
{_, content_type} ->
case Plug.Conn.Utils.media_type(content_type) do
{:ok, "text", "html", _} -> :ok
_ -> {:error, {:content_type, content_type}}
end
_ ->
:ok
end
end
@max_body @options[:max_body]
defp check_content_length(headers) do
case List.keyfind(headers, "content-length", 0) do
{_, maybe_content_length} ->
case Integer.parse(maybe_content_length) do
{content_length, ""} when content_length <= @max_body -> :ok
{_, ""} -> {:error, :body_too_large}
_ -> :ok
end
_ ->
:ok
end
end
end
diff --git a/lib/pleroma/web/rich_media/parser.ex b/lib/pleroma/web/rich_media/parser.ex
index 837182d8a..a73fbc4b9 100644
--- a/lib/pleroma/web/rich_media/parser.ex
+++ b/lib/pleroma/web/rich_media/parser.ex
@@ -1,169 +1,208 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.RichMedia.Parser do
require Logger
@cachex Pleroma.Config.get([:cachex, :provider], Cachex)
+ @config_impl Application.compile_env(:pleroma, [__MODULE__, :config_impl], Pleroma.Config)
defp parsers do
Pleroma.Config.get([:rich_media, :parsers])
end
def parse(nil), do: {:error, "No URL provided"}
- if Pleroma.Config.get(:env) == :test do
- @spec parse(String.t()) :: {:ok, map()} | {:error, any()}
- def parse(url), do: parse_url(url)
- else
- @spec parse(String.t()) :: {:ok, map()} | {:error, any()}
- def parse(url) do
- with {:ok, data} <- get_cached_or_parse(url),
- {:ok, _} <- set_ttl_based_on_image(data, url) do
- {:ok, data}
- end
+ @spec parse(String.t()) :: {:ok, map()} | {:error, any()}
+ def parse(url) do
+ with :ok <- validate_page_url(url),
+ {:ok, data} <- get_cached_or_parse(url),
+ {:ok, _} <- set_ttl_based_on_image(data, url) do
+ {:ok, data}
end
+ end
- defp get_cached_or_parse(url) do
- case @cachex.fetch(:rich_media_cache, url, fn ->
- case parse_url(url) do
- {:ok, _} = res ->
- {:commit, res}
-
- {:error, reason} = e ->
- # Unfortunately we have to log errors here, instead of doing that
- # along with ttl setting at the bottom. Otherwise we can get log spam
- # if more than one process was waiting for the rich media card
- # while it was generated. Ideally we would set ttl here as well,
- # so we don't override it number_of_waiters_on_generation
- # times, but one, obviously, can't set ttl for not-yet-created entry
- # and Cachex doesn't support returning ttl from the fetch callback.
- log_error(url, reason)
- {:commit, e}
- end
- end) do
- {action, res} when action in [:commit, :ok] ->
- case res do
- {:ok, _data} = res ->
- res
-
- {:error, reason} = e ->
- if action == :commit, do: set_error_ttl(url, reason)
- e
- end
-
- {:error, e} ->
- {:error, {:cachex_error, e}}
- end
+ defp get_cached_or_parse(url) do
+ case @cachex.fetch(:rich_media_cache, url, fn ->
+ case parse_url(url) do
+ {:ok, _} = res ->
+ {:commit, res}
+
+ {:error, reason} = e ->
+ # Unfortunately we have to log errors here, instead of doing that
+ # along with ttl setting at the bottom. Otherwise we can get log spam
+ # if more than one process was waiting for the rich media card
+ # while it was generated. Ideally we would set ttl here as well,
+ # so we don't override it number_of_waiters_on_generation
+ # times, but one, obviously, can't set ttl for not-yet-created entry
+ # and Cachex doesn't support returning ttl from the fetch callback.
+ log_error(url, reason)
+ {:commit, e}
+ end
+ end) do
+ {action, res} when action in [:commit, :ok] ->
+ case res do
+ {:ok, _data} = res ->
+ res
+
+ {:error, reason} = e ->
+ if action == :commit, do: set_error_ttl(url, reason)
+ e
+ end
+
+ {:error, e} ->
+ {:error, {:cachex_error, e}}
end
+ end
- defp set_error_ttl(_url, :body_too_large), do: :ok
- defp set_error_ttl(_url, {:content_type, _}), do: :ok
+ defp set_error_ttl(_url, :body_too_large), do: :ok
+ defp set_error_ttl(_url, {:content_type, _}), do: :ok
- # The TTL is not set for the errors above, since they are unlikely to change
- # with time
+ # The TTL is not set for the errors above, since they are unlikely to change
+ # with time
- defp set_error_ttl(url, _reason) do
- ttl = Pleroma.Config.get([:rich_media, :failure_backoff], 60_000)
- @cachex.expire(:rich_media_cache, url, ttl)
- :ok
- end
+ defp set_error_ttl(url, _reason) do
+ ttl = Pleroma.Config.get([:rich_media, :failure_backoff], 60_000)
+ @cachex.expire(:rich_media_cache, url, ttl)
+ :ok
+ end
- defp log_error(url, {:invalid_metadata, data}) do
- Logger.debug(fn -> "Incomplete or invalid metadata for #{url}: #{inspect(data)}" end)
- end
+ defp log_error(url, {:invalid_metadata, data}) do
+ Logger.debug(fn -> "Incomplete or invalid metadata for #{url}: #{inspect(data)}" end)
+ end
- defp log_error(url, reason) do
- Logger.warning(fn -> "Rich media error for #{url}: #{inspect(reason)}" end)
- end
+ defp log_error(url, reason) do
+ Logger.warning(fn -> "Rich media error for #{url}: #{inspect(reason)}" end)
end
@doc """
Sets the rich media cache TTL based on the expiration time of the image.
Adopt the `Pleroma.Web.RichMedia.Parser.TTL` behaviour:
## Example
defmodule MyModule do
@behaviour Pleroma.Web.RichMedia.Parser.TTL
def ttl(data, url) do
image_url = Map.get(data, :image)
# parse the image URL to work out when it expires
# and return the TTL as a Unix timestamp
parse_ttl_from_url(image_url)
end
end
Then define the module in the config:
config :pleroma, :rich_media,
ttl_setters: [MyModule]
"""
@spec set_ttl_based_on_image(map(), String.t()) ::
{:ok, integer() | :noop} | {:error, :no_key}
def set_ttl_based_on_image(data, url) do
case get_ttl_from_image(data, url) do
ttl when is_number(ttl) ->
ttl = ttl * 1000
case @cachex.expire_at(:rich_media_cache, url, ttl) do
{:ok, true} -> {:ok, ttl}
{:ok, false} -> {:error, :no_key}
end
_ ->
{:ok, :noop}
end
end
defp get_ttl_from_image(data, url) do
[:rich_media, :ttl_setters]
|> Pleroma.Config.get()
|> Enum.reduce({:ok, nil}, fn
module, {:ok, _ttl} ->
module.ttl(data, url)
_, error ->
error
end)
end
def parse_url(url) do
with {:ok, %Tesla.Env{body: html}} <- Pleroma.Web.RichMedia.Helpers.rich_media_get(url),
{:ok, html} <- Floki.parse_document(html) do
html
|> maybe_parse()
|> Map.put("url", url)
|> clean_parsed_data()
|> check_parsed_data()
end
end
defp maybe_parse(html) do
Enum.reduce_while(parsers(), %{}, fn parser, acc ->
case parser.parse(html, acc) do
data when data != %{} -> {:halt, data}
_ -> {:cont, acc}
end
end)
end
defp check_parsed_data(%{"title" => title} = data)
when is_binary(title) and title != "" do
{:ok, data}
end
defp check_parsed_data(data) do
{:error, {:invalid_metadata, data}}
end
defp clean_parsed_data(data) do
data
|> Enum.reject(fn {key, val} ->
not match?({:ok, _}, Jason.encode(%{key => val}))
end)
|> Map.new()
end
+
+ @spec validate_page_url(URI.t() | binary()) :: :ok | :error
+ defp validate_page_url(page_url) when is_binary(page_url) do
+ validate_tld = @config_impl.get([Pleroma.Formatter, :validate_tld])
+
+ page_url
+ |> Linkify.Parser.url?(validate_tld: validate_tld)
+ |> parse_uri(page_url)
+ end
+
+ defp validate_page_url(%URI{host: host, scheme: "https"}) do
+ cond do
+ Linkify.Parser.ip?(host) ->
+ :error
+
+ host in @config_impl.get([:rich_media, :ignore_hosts], []) ->
+ :error
+
+ get_tld(host) in @config_impl.get([:rich_media, :ignore_tld], []) ->
+ :error
+
+ true ->
+ :ok
+ end
+ end
+
+ defp validate_page_url(_), do: :error
+
+ defp parse_uri(true, url) do
+ url
+ |> URI.parse()
+ |> validate_page_url
+ end
+
+ defp parse_uri(_, _), do: :error
+
+ defp get_tld(host) do
+ host
+ |> String.split(".")
+ |> Enum.reverse()
+ |> hd
+ end
end
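The validate_page_url/1 clauses moved in above accept only https URLs that Linkify recognises, whose host is not a bare IP literal, and whose host/TLD is not in the rich_media ignore lists. Roughly, assuming empty ignore lists:

    # "https://example.com/article"  -> :ok
    # "https://127.0.0.1/secret"     -> :error  (IP literal host)
    # "http://example.com/article"   -> :error  (not https)
    # "not a url"                    -> :error  (Linkify rejects it outright)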
diff --git a/lib/pleroma/web/static_fe/static_fe_controller.ex b/lib/pleroma/web/static_fe/static_fe_controller.ex
index 3f2dbcd02..012f8e464 100644
--- a/lib/pleroma/web/static_fe/static_fe_controller.ex
+++ b/lib/pleroma/web/static_fe/static_fe_controller.ex
@@ -1,188 +1,187 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.StaticFE.StaticFEController do
use Pleroma.Web, :controller
alias Pleroma.Activity
alias Pleroma.Object
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.Metadata
alias Pleroma.Web.Router.Helpers
- plug(:put_layout, :static_fe)
plug(:assign_id)
@page_keys ["max_id", "min_id", "limit", "since_id", "order"]
@doc "Renders requested local public activity or public activities of requested user"
def show(%{assigns: %{notice_id: notice_id}} = conn, _params) do
with %Activity{local: true} = activity <-
Activity.get_by_id_with_object(notice_id),
true <- Visibility.public?(activity.object),
{_, true} <- {:visible?, Visibility.visible_for_user?(activity, _reading_user = nil)},
%User{} = user <- User.get_by_ap_id(activity.object.data["actor"]) do
url = Helpers.url(conn) <> conn.request_path
meta =
Metadata.build_tags(%{
activity_id: notice_id,
object: activity.object,
user: user,
url: url
})
timeline =
activity.object.data["context"]
|> ActivityPub.fetch_activities_for_context(%{})
|> Enum.reverse()
|> Enum.map(&represent(&1, &1.object.id == activity.object.id))
render(conn, "conversation.html", %{activities: timeline, meta: meta})
else
%Activity{object: %Object{data: data}} ->
conn
|> put_status(:found)
|> redirect(external: data["url"] || data["external_url"] || data["id"])
_ ->
not_found(conn, "Post not found.")
end
end
def show(%{assigns: %{username_or_id: username_or_id}} = conn, params) do
with {_, %User{local: true} = user} <-
{:fetch_user, User.get_cached_by_nickname_or_id(username_or_id)},
{_, :visible} <- {:visibility, User.visible_for(user, _reading_user = nil)} do
meta = Metadata.build_tags(%{user: user})
params =
params
|> Map.take(@page_keys)
|> Map.new(fn {k, v} -> {String.to_existing_atom(k), v} end)
timeline =
user
|> ActivityPub.fetch_user_activities(_reading_user = nil, params)
|> Enum.map(&represent/1)
prev_page_id =
(params["min_id"] || params["max_id"]) &&
List.first(timeline) && List.first(timeline).id
next_page_id = List.last(timeline) && List.last(timeline).id
render(conn, "profile.html", %{
user: User.sanitize_html(user),
timeline: timeline,
prev_page_id: prev_page_id,
next_page_id: next_page_id,
meta: meta
})
else
_ ->
not_found(conn, "User not found.")
end
end
def show(%{assigns: %{object_id: _}} = conn, _params) do
url = Helpers.url(conn) <> conn.request_path
case Activity.get_create_by_object_ap_id_with_object(url) do
%Activity{} = activity ->
to = Helpers.o_status_path(Pleroma.Web.Endpoint, :notice, activity)
redirect(conn, to: to)
_ ->
not_found(conn, "Post not found.")
end
end
def show(%{assigns: %{activity_id: _}} = conn, _params) do
url = Helpers.url(conn) <> conn.request_path
case Activity.get_by_ap_id(url) do
%Activity{} = activity ->
to = Helpers.o_status_path(Pleroma.Web.Endpoint, :notice, activity)
redirect(conn, to: to)
_ ->
not_found(conn, "Post not found.")
end
end
defp get_title(%Object{data: %{"name" => name}}) when is_binary(name),
do: name
defp get_title(%Object{data: %{"summary" => summary}}) when is_binary(summary),
do: summary
defp get_title(_), do: nil
defp not_found(conn, message) do
conn
|> put_status(404)
|> render("error.html", %{message: message, meta: ""})
end
defp get_counts(%Activity{} = activity) do
%Object{data: data} = Object.normalize(activity, fetch: false)
%{
likes: data["like_count"] || 0,
replies: data["repliesCount"] || 0,
announces: data["announcement_count"] || 0
}
end
defp represent(%Activity{} = activity), do: represent(activity, false)
defp represent(%Activity{object: %Object{data: data}} = activity, selected) do
{:ok, user} = User.get_or_fetch(activity.object.data["actor"])
link =
case user.local do
true -> Helpers.o_status_url(Pleroma.Web.Endpoint, :notice, activity)
_ -> data["url"] || data["external_url"] || data["id"]
end
content =
if data["content"] do
data["content"]
|> Pleroma.HTML.filter_tags()
|> Pleroma.Emoji.Formatter.emojify(Map.get(data, "emoji", %{}))
else
nil
end
%{
user: User.sanitize_html(user),
title: get_title(activity.object),
content: content,
attachment: data["attachment"],
link: link,
published: data["published"],
sensitive: data["sensitive"],
selected: selected,
counts: get_counts(activity),
id: activity.id
}
end
defp assign_id(%{path_info: ["notice", notice_id]} = conn, _opts),
do: assign(conn, :notice_id, notice_id)
defp assign_id(%{path_info: ["users", user_id]} = conn, _opts),
do: assign(conn, :username_or_id, user_id)
defp assign_id(%{path_info: ["objects", object_id]} = conn, _opts),
do: assign(conn, :object_id, object_id)
defp assign_id(%{path_info: ["activities", activity_id]} = conn, _opts),
do: assign(conn, :activity_id, activity_id)
defp assign_id(conn, _opts), do: conn
end
diff --git a/lib/pleroma/web/templates/o_auth/o_auth/show.html.eex b/lib/pleroma/web/templates/o_auth/o_auth/show.html.eex
index 5b38f7142..6bc8eb602 100644
--- a/lib/pleroma/web/templates/o_auth/o_auth/show.html.eex
+++ b/lib/pleroma/web/templates/o_auth/o_auth/show.html.eex
@@ -1,67 +1,67 @@
<%= if Phoenix.Flash.get(@flash, :info) do %>
<p class="alert alert-info" role="alert"><%= Phoenix.Flash.get(@flash, :info) %></p>
<% end %>
<%= if Phoenix.Flash.get(@flash, :error) do %>
<p class="alert alert-danger" role="alert"><%= Phoenix.Flash.get(@flash, :error) %></p>
<% end %>
<%= form_for @conn, Routes.o_auth_path(@conn, :authorize), [as: "authorization"], fn f -> %>
<%= if @user do %>
<div class="account-header">
<div class="account-header__banner" style="background-image: url('<%= Pleroma.User.banner_url(@user) %>')"></div>
<div class="account-header__avatar" style="background-image: url('<%= Pleroma.User.avatar_url(@user) %>')"></div>
<div class="account-header__meta">
<div class="account-header__display-name"><%= @user.name %></div>
- <div class="account-header__nickname">@<%= @user.nickname %>@<%= Pleroma.User.get_host(@user) %></div>
+ <div class="account-header__nickname">@<%= Pleroma.User.full_nickname(@user.nickname) %></div>
</div>
</div>
<% end %>
<div class="container__content">
<%= if @app do %>
<p><%= raw Gettext.dpgettext("static_pages", "oauth authorize message", "Application <strong>%{client_name}</strong> is requesting access to your account.", client_name: safe_to_string(html_escape(@app.client_name))) %></p>
<%= render Phoenix.Controller.view_module(@conn), "_scopes.html", Map.merge(assigns, %{form: f}) %>
<% end %>
<%= if @user do %>
<div class="actions">
<a class="button button--cancel" href="/">
<%= Gettext.dpgettext("static_pages", "oauth authorize cancel button", "Cancel") %>
</a>
<%= submit Gettext.dpgettext("static_pages", "oauth authorize approve button", "Approve"), class: "button--approve" %>
</div>
<% else %>
<%= if @params["registration"] in ["true", true] do %>
<h3><%= Gettext.dpgettext("static_pages", "oauth register page title", "This is the first time you visit! Please enter your Pleroma handle.") %></h3>
<p><%= Gettext.dpgettext("static_pages", "oauth register nickname unchangeable warning", "Choose carefully! You won't be able to change this later. You will be able to change your display name, though.") %></p>
<div class="input">
<%= label f, :nickname, Gettext.dpgettext("static_pages", "oauth register nickname prompt", "Pleroma Handle") %>
<%= text_input f, :nickname, placeholder: "lain", autocomplete: "username" %>
</div>
<%= hidden_input f, :name, value: @params["name"] %>
<%= hidden_input f, :password, value: @params["password"] %>
<br>
<% else %>
<div class="input">
<%= label f, :name, Gettext.dpgettext("static_pages", "oauth login username prompt", "Username") %>
<%= text_input f, :name %>
</div>
<div class="input">
<%= label f, :password, Gettext.dpgettext("static_pages", "oauth login password prompt", "Password") %>
<%= password_input f, :password %>
</div>
<%= submit Gettext.dpgettext("static_pages", "oauth login button", "Log In") %>
<% end %>
<% end %>
</div>
<%= hidden_input f, :client_id, value: @client_id %>
<%= hidden_input f, :response_type, value: @response_type %>
<%= hidden_input f, :redirect_uri, value: @redirect_uri %>
<%= hidden_input f, :state, value: @state %>
<% end %>
<%= if Pleroma.Config.oauth_consumer_enabled?() do %>
<%= render Phoenix.Controller.view_module(@conn), Pleroma.Web.Auth.WrapperAuthenticator.oauth_consumer_template(), assigns %>
<% end %>
diff --git a/lib/pleroma/workers/background_worker.ex b/lib/pleroma/workers/background_worker.ex
index 794417612..7a2210dc1 100644
--- a/lib/pleroma/workers/background_worker.ex
+++ b/lib/pleroma/workers/background_worker.ex
@@ -1,49 +1,49 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Workers.BackgroundWorker do
alias Pleroma.Instances.Instance
alias Pleroma.User
use Pleroma.Workers.WorkerHelper, queue: "background"
@impl Oban.Worker
def perform(%Job{args: %{"op" => "user_activation", "user_id" => user_id, "status" => status}}) do
user = User.get_cached_by_id(user_id)
User.perform(:set_activation_async, user, status)
end
def perform(%Job{args: %{"op" => "delete_user", "user_id" => user_id}}) do
user = User.get_cached_by_id(user_id)
User.perform(:delete, user)
end
def perform(%Job{args: %{"op" => "force_password_reset", "user_id" => user_id}}) do
user = User.get_cached_by_id(user_id)
User.perform(:force_password_reset, user)
end
def perform(%Job{args: %{"op" => op, "user_id" => user_id, "identifiers" => identifiers}})
when op in ["blocks_import", "follow_import", "mutes_import"] do
user = User.get_cached_by_id(user_id)
- {:ok, User.Import.perform(String.to_atom(op), user, identifiers)}
+ {:ok, User.Import.perform(String.to_existing_atom(op), user, identifiers)}
end
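The switch above to String.to_existing_atom/1 means the `op` coming out of job args (ultimately external input) can no longer grow the atom table: it only succeeds for atoms the code already defines, such as the ones User.Import pattern-matches on. For illustration:

    String.to_existing_atom("blocks_import")   # :blocks_import (already defined by User.Import clauses)
    String.to_existing_atom("anything_else")   # raises ArgumentError instead of minting a new atom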
def perform(%Job{
args: %{"op" => "move_following", "origin_id" => origin_id, "target_id" => target_id}
}) do
origin = User.get_cached_by_id(origin_id)
target = User.get_cached_by_id(target_id)
Pleroma.FollowingRelationship.move_following(origin, target)
end
def perform(%Job{args: %{"op" => "delete_instance", "host" => host}}) do
Instance.perform(:delete_instance, host)
end
@impl Oban.Worker
def timeout(_job), do: :timer.seconds(900)
end
diff --git a/mix.exs b/mix.exs
index ee433af2a..aee6eaa74 100644
--- a/mix.exs
+++ b/mix.exs
@@ -1,355 +1,356 @@
defmodule Pleroma.Mixfile do
use Mix.Project
def project do
[
app: :pleroma,
- version: version("2.6.51"),
+ version: version("2.6.52"),
elixir: "~> 1.11",
elixirc_paths: elixirc_paths(Mix.env()),
compilers: Mix.compilers(),
elixirc_options: [warnings_as_errors: warnings_as_errors()],
xref: [exclude: [:eldap]],
dialyzer: [plt_add_apps: [:mix, :eldap]],
start_permanent: Mix.env() == :prod,
aliases: aliases(),
deps: deps(),
test_coverage: [tool: :covertool, summary: true],
# Docs
name: "Pleroma",
homepage_url: "https://pleroma.social/",
source_url: "https://git.pleroma.social/pleroma/pleroma",
docs: [
source_url_pattern:
"https://git.pleroma.social/pleroma/pleroma/blob/develop/%{path}#L%{line}",
logo: "priv/static/images/logo.png",
extras: ["README.md", "CHANGELOG.md"] ++ Path.wildcard("docs/**/*.md"),
groups_for_extras: [
"Installation manuals": Path.wildcard("docs/installation/*.md"),
Configuration: Path.wildcard("docs/config/*.md"),
Administration: Path.wildcard("docs/admin/*.md"),
"Pleroma's APIs and Mastodon API extensions": Path.wildcard("docs/api/*.md")
],
main: "readme",
output: "priv/static/doc"
],
releases: [
pleroma: [
include_executables_for: [:unix],
applications: [ex_syslogger: :load, syslog: :load, eldap: :transient],
steps: [:assemble, &put_otp_version/1, &copy_files/1, &copy_nginx_config/1],
config_providers: [{Pleroma.Config.ReleaseRuntimeProvider, []}]
]
]
]
end
def put_otp_version(%{path: target_path} = release) do
File.write!(
Path.join([target_path, "OTP_VERSION"]),
Pleroma.OTPVersion.version()
)
release
end
def copy_files(%{path: target_path} = release) do
File.cp_r!("./rel/files", target_path)
release
end
def copy_nginx_config(%{path: target_path} = release) do
File.cp!(
"./installation/pleroma.nginx",
Path.join([target_path, "installation", "pleroma.nginx"])
)
release
end
# Configuration for the OTP application.
#
# Type `mix help compile.app` for more information.
def application do
[
mod: {Pleroma.Application, []},
extra_applications: [
:logger,
:runtime_tools,
:comeonin,
:fast_sanitize,
:os_mon,
:ssl
],
included_applications: [:ex_syslogger]
]
end
# Specifies which paths to compile per environment.
defp elixirc_paths(:benchmark), do: ["lib", "benchmarks", "priv/scrubbers"]
defp elixirc_paths(:test), do: ["lib", "test/support"]
defp elixirc_paths(_), do: ["lib"]
defp warnings_as_errors, do: System.get_env("CI") == "true"
# Specifies OAuth dependencies.
defp oauth_deps do
oauth_strategy_packages =
System.get_env("OAUTH_CONSUMER_STRATEGIES")
|> to_string()
|> String.split()
|> Enum.map(fn strategy_entry ->
with [_strategy, dependency] <- String.split(strategy_entry, ":") do
dependency
else
[strategy] -> "ueberauth_#{strategy}"
end
end)
for s <- oauth_strategy_packages, do: {String.to_atom(s), ">= 0.0.0"}
end
# Specifies your project dependencies.
#
# Type `mix help deps` for examples and options.
defp deps do
[
{:phoenix, "~> 1.7.3"},
{:phoenix_ecto, "~> 4.4"},
{:ecto_sql, "~> 3.10"},
{:ecto_enum, "~> 1.4"},
{:postgrex, ">= 0.0.0"},
{:phoenix_html, "~> 3.3"},
{:phoenix_live_reload, "~> 1.3.3", only: :dev},
{:phoenix_live_view, "~> 0.19.0"},
{:phoenix_live_dashboard, "~> 0.8.0"},
{:telemetry_metrics, "~> 0.6"},
{:telemetry_poller, "~> 1.0"},
{:tzdata, "~> 1.0.3"},
{:plug_cowboy, "~> 2.5"},
# oban 2.14 requires Elixir 1.12+
{:oban, "~> 2.13.4"},
{:gettext, "~> 0.20"},
{:bcrypt_elixir, "~> 2.2"},
{:trailing_format_plug, "~> 0.0.7"},
{:fast_sanitize, "~> 0.2.0"},
{:html_entities, "~> 0.5", override: true},
{:calendar, "~> 1.0"},
{:cachex, "~> 3.2"},
{:poison, "~> 3.0", override: true},
- {:tesla, "~> 1.4.0", override: true},
+ {:tesla, "~> 1.8.0"},
{:castore, "~> 0.1"},
{:cowlib, "~> 2.9", override: true},
{:gun, "~> 2.0.0-rc.1", override: true},
{:finch, "~> 0.15"},
{:jason, "~> 1.2"},
{:mogrify, "~> 0.8.0"},
{:ex_aws, "~> 2.1.6"},
{:ex_aws_s3, "~> 2.0"},
{:sweet_xml, "~> 0.7.2"},
# earmark 1.4.23 requires Elixir 1.12+
{:earmark, "1.4.22"},
{:bbcode_pleroma, "~> 0.2.0"},
{:cors_plug, "~> 2.0"},
{:web_push_encryption, "~> 0.3.1"},
# swoosh 1.11.2+ requires Elixir 1.12+
{:swoosh, "~> 1.10.0"},
{:phoenix_swoosh, "~> 1.1"},
{:gen_smtp, "~> 0.13"},
{:ex_syslogger, "~> 1.4"},
{:floki, "~> 0.35"},
{:timex, "~> 3.6"},
{:ueberauth, "~> 0.4"},
{:linkify, "~> 0.5.3"},
{:http_signatures, "~> 0.1.2"},
{:telemetry, "~> 1.0.0", override: true},
{:poolboy, "~> 1.5"},
{:prom_ex, "~> 1.9"},
{:recon, "~> 2.5"},
{:joken, "~> 2.0"},
{:pot, "~> 1.0"},
{:ex_const, "~> 0.2"},
{:plug_static_index_html, "~> 1.0.0"},
{:flake_id, "~> 0.1.0"},
{:concurrent_limiter, "~> 0.1.1"},
{:remote_ip,
git: "https://git.pleroma.social/pleroma/remote_ip.git",
ref: "b647d0deecaa3acb140854fe4bda5b7e1dc6d1c8"},
{:captcha,
git: "https://git.pleroma.social/pleroma/elixir-libraries/elixir-captcha.git",
ref: "90f6ce7672f70f56708792a98d98bd05176c9176"},
{:restarter, path: "./restarter"},
{:majic, "~> 1.0"},
{:open_api_spex, "~> 3.16"},
{:ecto_psql_extras, "~> 0.6"},
{:vix, "~> 0.26.0"},
{:elixir_make, "~> 0.7.7", override: true},
{:blurhash, "~> 0.1.0", hex: :rinpatch_blurhash},
{:exile,
git: "https://git.pleroma.social/pleroma/elixir-libraries/exile.git",
ref: "0d6337cf68e7fbc8a093cae000955aa93b067f91"},
+ {:bandit, "~> 1.2"},
{:icalendar, "~> 1.1"},
{:geospatial, "~> 0.2.0"},
## dev & test
{:ex_doc, "~> 0.22", only: :dev, runtime: false},
{:ex_machina, "~> 2.4", only: :test},
{:credo, "~> 1.6", only: [:dev, :test], runtime: false},
{:mock, "~> 0.3.5", only: :test},
{:covertool, "~> 2.0", only: :test},
{:hackney, "~> 1.18.0", override: true},
{:mox, "~> 1.0", only: :test},
{:websockex, "~> 0.4.3", only: :test},
{:benchee, "~> 1.0", only: :benchmark},
{:dialyxir, "~> 1.4", only: [:dev, :test], runtime: false}
] ++ oauth_deps()
end
# Aliases are shortcuts or tasks specific to the current project.
# For example, to create, migrate and run the seeds file at once:
#
# $ mix ecto.setup
#
# See the documentation for `Mix` for more info on aliases.
defp aliases do
[
"ecto.migrate": ["pleroma.ecto.migrate"],
"ecto.rollback": ["pleroma.ecto.rollback"],
"ecto.setup": ["ecto.create", "ecto.migrate", "run priv/repo/seeds.exs"],
"ecto.reset": ["ecto.drop", "ecto.setup"],
test: ["ecto.create --quiet", "ecto.migrate", "test"],
docs: ["pleroma.docs", "docs"],
analyze: ["credo --strict --only=warnings,todo,fixme,consistency,readability"],
copyright: &add_copyright/1,
"copyright.bump": &bump_copyright/1
]
end
# Builds a version string made of:
# * the application version
# * a pre-release if ahead of the tag: the describe string (-count-commithash)
# * branch name
# * build metadata:
# * a build name if `PLEROMA_BUILD_NAME` or `:pleroma, :build_name` is defined
# * the mix environment if different than prod
defp version(version) do
identifier_filter = ~r/[^0-9a-z\-]+/i
git_available? = match?({_output, 0}, System.cmd("sh", ["-c", "command -v git"]))
dotgit_present? = File.exists?(".git")
git_pre_release =
if git_available? and dotgit_present? do
{tag, tag_err} =
System.cmd("git", ["describe", "--tags", "--abbrev=0"], stderr_to_stdout: true)
{describe, describe_err} = System.cmd("git", ["describe", "--tags", "--abbrev=8"])
{commit_hash, commit_hash_err} = System.cmd("git", ["rev-parse", "--short", "HEAD"])
# Pre-release version, separated from the patch version with a hyphen
cond do
tag_err == 0 and describe_err == 0 ->
describe
|> String.trim()
|> String.replace(String.trim(tag), "")
|> String.trim_leading("-")
|> String.trim()
commit_hash_err == 0 ->
"0-g" <> String.trim(commit_hash)
true ->
nil
end
end
# Branch name as pre-release version component, denoted with a dot
branch_name =
with true <- git_available?,
true <- dotgit_present?,
{branch_name, 0} <- System.cmd("git", ["rev-parse", "--abbrev-ref", "HEAD"]),
branch_name <- String.trim(branch_name),
branch_name <- System.get_env("PLEROMA_BUILD_BRANCH") || branch_name,
true <-
!Enum.any?(["master", "HEAD", "release/", "stable"], fn name ->
String.starts_with?(name, branch_name)
end) do
branch_name =
branch_name
|> String.trim()
|> String.replace(identifier_filter, "-")
branch_name
else
_ -> ""
end
build_name =
cond do
name = Application.get_env(:pleroma, :build_name) -> name
name = System.get_env("PLEROMA_BUILD_NAME") -> name
true -> nil
end
env_name = if Mix.env() != :prod, do: to_string(Mix.env())
env_override = System.get_env("PLEROMA_BUILD_ENV")
env_name =
case env_override do
nil -> env_name
env_override when env_override in ["", "prod"] -> nil
env_override -> env_override
end
# Pre-release version, denoted by appending a hyphen
# and a series of dot separated identifiers
pre_release =
[git_pre_release, branch_name]
|> Enum.filter(fn string -> string && string != "" end)
|> Enum.join(".")
|> (fn
"" -> nil
string -> "-" <> String.replace(string, identifier_filter, "-")
end).()
# Build metadata, denoted with a plus sign
build_metadata =
[build_name, env_name]
|> Enum.filter(fn string -> string && string != "" end)
|> Enum.join(".")
|> (fn
"" -> nil
string -> "+" <> String.replace(string, identifier_filter, "-")
end).()
[version, pre_release, build_metadata]
|> Enum.filter(fn string -> string && string != "" end)
|> Enum.join()
end
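version/1 composes a SemVer-style string: the base version, a pre-release part derived from `git describe` and the branch name, and optional build metadata. A hedged illustration of the git-derived piece (tag and hash are made up):

    # git describe --tags --abbrev=0   => "2.6.52"             (nearest tag)
    # git describe --tags --abbrev=8   => "2.6.52-3-gabcdef12" (3 commits past it)
    # describe minus the tag, trimmed  => "3-gabcdef12"
    # final string: base "2.6.52", plus that pre-release and the branch name,
    # plus any "+..." build metadata from PLEROMA_BUILD_NAME / the Mix env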
defp add_copyright(_) do
year = NaiveDateTime.utc_now().year
template = ~s[\
# Pleroma: A lightweight social networking server
# Copyright © 2017-#{year} Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
] |> String.replace("\n", "\\n")
find = "find lib test priv -type f \\( -name '*.ex' -or -name '*.exs' \\) -exec "
grep = "grep -L '# Copyright © [0-9\-]* Pleroma' {} \\;"
xargs = "xargs -n1 sed -i'' '1s;^;#{template};'"
:os.cmd(String.to_charlist("#{find}#{grep} | #{xargs}"))
end
defp bump_copyright(_) do
year = NaiveDateTime.utc_now().year
find = "find lib test priv -type f \\( -name '*.ex' -or -name '*.exs' \\)"
xargs =
"xargs sed -i'' 's;# Copyright © [0-9\-]* Pleroma.*$;# Copyright © 2017-#{year} Pleroma Authors <https://pleroma.social/>;'"
:os.cmd(String.to_charlist("#{find} | #{xargs}"))
end
end
diff --git a/mix.lock b/mix.lock
index f7a6d9cd4..4eec44875 100644
--- a/mix.lock
+++ b/mix.lock
@@ -1,152 +1,154 @@
%{
"accept": {:hex, :accept, "0.3.5", "b33b127abca7cc948bbe6caa4c263369abf1347cfa9d8e699c6d214660f10cd1", [:rebar3], [], "hexpm", "11b18c220bcc2eab63b5470c038ef10eb6783bcb1fcdb11aa4137defa5ac1bb8"},
+ "bandit": {:hex, :bandit, "1.2.1", "aa485b4ac175065b8e0fb5864ddd5dd7b50d52336b36f61c82f484c3718b3d15", [:mix], [{:hpax, "~> 0.1.1", [hex: :hpax, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:thousand_island, "~> 1.0", [hex: :thousand_island, repo: "hexpm", optional: false]}, {:websock, "~> 0.5", [hex: :websock, repo: "hexpm", optional: false]}], "hexpm", "27393e590a407f1b7d51c5fee4737f139fe224a30449ce25061eac70f763896b"},
"base62": {:hex, :base62, "1.2.2", "85c6627eb609317b70f555294045895ffaaeb1758666ab9ef9ca38865b11e629", [:mix], [{:custom_base, "~> 0.2.1", [hex: :custom_base, repo: "hexpm", optional: false]}], "hexpm", "d41336bda8eaa5be197f1e4592400513ee60518e5b9f4dcf38f4b4dae6f377bb"},
"bbcode_pleroma": {:hex, :bbcode_pleroma, "0.2.0", "d36f5bca6e2f62261c45be30fa9b92725c0655ad45c99025cb1c3e28e25803ef", [:mix], [{:nimble_parsec, "~> 0.5", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "19851074419a5fedb4ef49e1f01b30df504bb5dbb6d6adfc135238063bebd1c3"},
"bcrypt_elixir": {:hex, :bcrypt_elixir, "2.3.1", "5114d780459a04f2b4aeef52307de23de961b69e13a5cd98a911e39fda13f420", [:make, :mix], [{:comeonin, "~> 5.3", [hex: :comeonin, repo: "hexpm", optional: false]}, {:elixir_make, "~> 0.6", [hex: :elixir_make, repo: "hexpm", optional: false]}], "hexpm", "42182d5f46764def15bf9af83739e3bf4ad22661b1c34fc3e88558efced07279"},
"benchee": {:hex, :benchee, "1.3.0", "f64e3b64ad3563fa9838146ddefb2d2f94cf5b473bdfd63f5ca4d0657bf96694", [:mix], [{:deep_merge, "~> 1.0", [hex: :deep_merge, repo: "hexpm", optional: false]}, {:statistex, "~> 1.0", [hex: :statistex, repo: "hexpm", optional: false]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "34f4294068c11b2bd2ebf2c59aac9c7da26ffa0068afdf3419f1b176e16c5f81"},
"blurhash": {:hex, :rinpatch_blurhash, "0.1.0", "01a888b0f5f1f382ab52e4396f01831cbe8486ea5828604c90f4dac533d39a4b", [:mix], [{:mogrify, "~> 0.8.0", [hex: :mogrify, repo: "hexpm", optional: true]}], "hexpm", "19911a5dcbb0acb9710169a72f702bce6cb048822b12de566ccd82b2cc42b907"},
"bunt": {:hex, :bunt, "1.0.0", "081c2c665f086849e6d57900292b3a161727ab40431219529f13c4ddcf3e7a44", [:mix], [], "hexpm", "dc5f86aa08a5f6fa6b8096f0735c4e76d54ae5c9fa2c143e5a1fc7c1cd9bb6b5"},
"cachex": {:hex, :cachex, "3.6.0", "14a1bfbeee060dd9bec25a5b6f4e4691e3670ebda28c8ba2884b12fe30b36bf8", [:mix], [{:eternal, "~> 1.2", [hex: :eternal, repo: "hexpm", optional: false]}, {:jumper, "~> 1.0", [hex: :jumper, repo: "hexpm", optional: false]}, {:sleeplocks, "~> 1.1", [hex: :sleeplocks, repo: "hexpm", optional: false]}, {:unsafe, "~> 1.0", [hex: :unsafe, repo: "hexpm", optional: false]}], "hexpm", "ebf24e373883bc8e0c8d894a63bbe102ae13d918f790121f5cfe6e485cc8e2e2"},
"calendar": {:hex, :calendar, "1.0.0", "f52073a708528482ec33d0a171954ca610fe2bd28f1e871f247dc7f1565fa807", [:mix], [{:tzdata, "~> 0.5.20 or ~> 0.1.201603 or ~> 1.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "990e9581920c82912a5ee50e62ff5ef96da6b15949a2ee4734f935fdef0f0a6f"},
"captcha": {:git, "https://git.pleroma.social/pleroma/elixir-libraries/elixir-captcha.git", "90f6ce7672f70f56708792a98d98bd05176c9176", [ref: "90f6ce7672f70f56708792a98d98bd05176c9176"]},
"castore": {:hex, :castore, "0.1.22", "4127549e411bedd012ca3a308dede574f43819fe9394254ca55ab4895abfa1a2", [:mix], [], "hexpm", "c17576df47eb5aa1ee40cc4134316a99f5cad3e215d5c77b8dd3cfef12a22cac"},
"cc_precompiler": {:hex, :cc_precompiler, "0.1.9", "e8d3364f310da6ce6463c3dd20cf90ae7bbecbf6c5203b98bf9b48035592649b", [:mix], [{:elixir_make, "~> 0.7", [hex: :elixir_make, repo: "hexpm", optional: false]}], "hexpm", "9dcab3d0f3038621f1601f13539e7a9ee99843862e66ad62827b0c42b2f58a54"},
"certifi": {:hex, :certifi, "2.12.0", "2d1cca2ec95f59643862af91f001478c9863c2ac9cb6e2f89780bfd8de987329", [:rebar3], [], "hexpm", "ee68d85df22e554040cdb4be100f33873ac6051387baf6a8f6ce82272340ff1c"},
"combine": {:hex, :combine, "0.10.0", "eff8224eeb56498a2af13011d142c5e7997a80c8f5b97c499f84c841032e429f", [:mix], [], "hexpm", "1b1dbc1790073076580d0d1d64e42eae2366583e7aecd455d1215b0d16f2451b"},
"comeonin": {:hex, :comeonin, "5.4.0", "246a56ca3f41d404380fc6465650ddaa532c7f98be4bda1b4656b3a37cc13abe", [:mix], [], "hexpm", "796393a9e50d01999d56b7b8420ab0481a7538d0caf80919da493b4a6e51faf1"},
"concurrent_limiter": {:hex, :concurrent_limiter, "0.1.1", "43ae1dc23edda1ab03dd66febc739c4ff710d047bb4d735754909f9a474ae01c", [:mix], [{:telemetry, "~> 0.3", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "53968ff238c0fbb4d7ed76ddb1af0be6f3b2f77909f6796e249e737c505a16eb"},
"connection": {:hex, :connection, "1.1.0", "ff2a49c4b75b6fb3e674bfc5536451607270aac754ffd1bdfe175abe4a6d7a68", [:mix], [], "hexpm", "722c1eb0a418fbe91ba7bd59a47e28008a189d47e37e0e7bb85585a016b2869c"},
"cors_plug": {:hex, :cors_plug, "2.0.3", "316f806d10316e6d10f09473f19052d20ba0a0ce2a1d910ddf57d663dac402ae", [:mix], [{:plug, "~> 1.8", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "ee4ae1418e6ce117fc42c2ba3e6cbdca4e95ecd2fe59a05ec6884ca16d469aea"},
"covertool": {:hex, :covertool, "2.0.6", "4a291b4e3449025b0595d8f44c8d7635d4f48f033be2ce88d22a329f36f94a91", [:rebar3], [], "hexpm", "5db3fcd82180d8ea4ad857d4d1ab21a8d31b5aee0d60d2f6c0f9e25a411d1e21"},
"cowboy": {:hex, :cowboy, "2.10.0", "ff9ffeff91dae4ae270dd975642997afe2a1179d94b1887863e43f681a203e26", [:make, :rebar3], [{:cowlib, "2.12.1", [hex: :cowlib, repo: "hexpm", optional: false]}, {:ranch, "1.8.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "3afdccb7183cc6f143cb14d3cf51fa00e53db9ec80cdcd525482f5e99bc41d6b"},
"cowboy_telemetry": {:hex, :cowboy_telemetry, "0.4.0", "f239f68b588efa7707abce16a84d0d2acf3a0f50571f8bb7f56a15865aae820c", [:rebar3], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "7d98bac1ee4565d31b62d59f8823dfd8356a169e7fcbb83831b8a5397404c9de"},
"cowlib": {:hex, :cowlib, "2.12.1", "a9fa9a625f1d2025fe6b462cb865881329b5caff8f1854d1cbc9f9533f00e1e1", [:make, :rebar3], [], "hexpm", "163b73f6367a7341b33c794c4e88e7dbfe6498ac42dcd69ef44c5bc5507c8db0"},
"credo": {:hex, :credo, "1.7.3", "05bb11eaf2f2b8db370ecaa6a6bda2ec49b2acd5e0418bc106b73b07128c0436", [:mix], [{:bunt, "~> 0.2.1 or ~> 1.0", [hex: :bunt, repo: "hexpm", optional: false]}, {:file_system, "~> 0.2 or ~> 1.0", [hex: :file_system, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "35ea675a094c934c22fb1dca3696f3c31f2728ae6ef5a53b5d648c11180a4535"},
"crontab": {:hex, :crontab, "1.1.8", "2ce0e74777dfcadb28a1debbea707e58b879e6aa0ffbf9c9bb540887bce43617", [:mix], [{:ecto, "~> 1.0 or ~> 2.0 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}], "hexpm"},
"custom_base": {:hex, :custom_base, "0.2.1", "4a832a42ea0552299d81652aa0b1f775d462175293e99dfbe4d7dbaab785a706", [:mix], [], "hexpm", "8df019facc5ec9603e94f7270f1ac73ddf339f56ade76a721eaa57c1493ba463"},
"db_connection": {:hex, :db_connection, "2.6.0", "77d835c472b5b67fc4f29556dee74bf511bbafecdcaf98c27d27fa5918152086", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "c2f992d15725e721ec7fbc1189d4ecdb8afef76648c746a8e1cad35e3b8a35f3"},
"decimal": {:hex, :decimal, "2.1.1", "5611dca5d4b2c3dd497dec8f68751f1f1a54755e8ed2a966c2633cf885973ad6", [:mix], [], "hexpm", "53cfe5f497ed0e7771ae1a475575603d77425099ba5faef9394932b35020ffcc"},
"deep_merge": {:hex, :deep_merge, "1.0.0", "b4aa1a0d1acac393bdf38b2291af38cb1d4a52806cf7a4906f718e1feb5ee961", [:mix], [], "hexpm", "ce708e5f094b9cd4e8f2be4f00d2f4250c4095be93f8cd6d018c753894885430"},
"dialyxir": {:hex, :dialyxir, "1.4.3", "edd0124f358f0b9e95bfe53a9fcf806d615d8f838e2202a9f430d59566b6b53b", [:mix], [{:erlex, ">= 0.2.6", [hex: :erlex, repo: "hexpm", optional: false]}], "hexpm", "bf2cfb75cd5c5006bec30141b131663299c661a864ec7fbbc72dfa557487a986"},
"earmark": {:hex, :earmark, "1.4.22", "ea3e45c6359446dc308be0a64ce82a03260d973de7d0625a762e6d352ff57958", [:mix], [{:earmark_parser, "~> 1.4.23", [hex: :earmark_parser, repo: "hexpm", optional: false]}], "hexpm", "1caf5145665a42fd76d5317286b0c171861fb1c04f86ab103dde76868814fdfb"},
"earmark_parser": {:hex, :earmark_parser, "1.4.39", "424642f8335b05bb9eb611aa1564c148a8ee35c9c8a8bba6e129d51a3e3c6769", [:mix], [], "hexpm", "06553a88d1f1846da9ef066b87b57c6f605552cfbe40d20bd8d59cc6bde41944"},
"eblurhash": {:git, "https://github.com/zotonic/eblurhash.git", "bc37ceb426ef021ee9927fb249bb93f7059194ab", [ref: "bc37ceb426ef021ee9927fb249bb93f7059194ab"]},
"ecto": {:hex, :ecto, "3.11.1", "4b4972b717e7ca83d30121b12998f5fcdc62ba0ed4f20fd390f16f3270d85c3e", [:mix], [{:decimal, "~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "ebd3d3772cd0dfcd8d772659e41ed527c28b2a8bde4b00fe03e0463da0f1983b"},
"ecto_enum": {:hex, :ecto_enum, "1.4.0", "d14b00e04b974afc69c251632d1e49594d899067ee2b376277efd8233027aec8", [:mix], [{:ecto, ">= 3.0.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:ecto_sql, "> 3.0.0", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:mariaex, ">= 0.0.0", [hex: :mariaex, repo: "hexpm", optional: true]}, {:postgrex, ">= 0.0.0", [hex: :postgrex, repo: "hexpm", optional: true]}], "hexpm", "8fb55c087181c2b15eee406519dc22578fa60dd82c088be376d0010172764ee4"},
"ecto_psql_extras": {:hex, :ecto_psql_extras, "0.7.15", "0fc29dbae0e444a29bd6abeee4cf3c4c037e692a272478a234a1cc765077dbb1", [:mix], [{:ecto_sql, "~> 3.7", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16.0 or ~> 0.17.0", [hex: :postgrex, repo: "hexpm", optional: false]}, {:table_rex, "~> 3.1.1 or ~> 4.0.0", [hex: :table_rex, repo: "hexpm", optional: false]}], "hexpm", "b6127f3a5c6fc3d84895e4768cc7c199f22b48b67d6c99b13fbf4a374e73f039"},
"ecto_sql": {:hex, :ecto_sql, "3.11.1", "e9abf28ae27ef3916b43545f9578b4750956ccea444853606472089e7d169470", [:mix], [{:db_connection, "~> 2.5 or ~> 2.4.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.11.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:myxql, "~> 0.6.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.16.0 or ~> 0.17.0 or ~> 1.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:tds, "~> 2.1.1 or ~> 2.2", [hex: :tds, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "ce14063ab3514424276e7e360108ad6c2308f6d88164a076aac8a387e1fea634"},
"eimp": {:hex, :eimp, "1.0.14", "fc297f0c7e2700457a95a60c7010a5f1dcb768a083b6d53f49cd94ab95a28f22", [:rebar3], [{:p1_utils, "1.0.18", [hex: :p1_utils, repo: "hexpm", optional: false]}], "hexpm", "501133f3112079b92d9e22da8b88bf4f0e13d4d67ae9c15c42c30bd25ceb83b6"},
"elixir_make": {:hex, :elixir_make, "0.7.8", "505026f266552ee5aabca0b9f9c229cbb496c689537c9f922f3eb5431157efc7", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:certifi, "~> 2.0", [hex: :certifi, repo: "hexpm", optional: true]}], "hexpm", "7a71945b913d37ea89b06966e1342c85cfe549b15e6d6d081e8081c493062c07"},
"erlex": {:hex, :erlex, "0.2.6", "c7987d15e899c7a2f34f5420d2a2ea0d659682c06ac607572df55a43753aa12e", [:mix], [], "hexpm", "2ed2e25711feb44d52b17d2780eabf998452f6efda104877a3881c2f8c0c0c75"},
"esbuild": {:hex, :esbuild, "0.5.0", "d5bb08ff049d7880ee3609ed5c4b864bd2f46445ea40b16b4acead724fb4c4a3", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}], "hexpm", "f183a0b332d963c4cfaf585477695ea59eef9a6f2204fdd0efa00e099694ffe5"},
"eternal": {:hex, :eternal, "1.2.2", "d1641c86368de99375b98d183042dd6c2b234262b8d08dfd72b9eeaafc2a1abd", [:mix], [], "hexpm", "2c9fe32b9c3726703ba5e1d43a1d255a4f3f2d8f8f9bc19f094c7cb1a7a9e782"},
"ex_aws": {:hex, :ex_aws, "2.1.9", "dc4865ecc20a05190a34a0ac5213e3e5e2b0a75a0c2835e923ae7bfeac5e3c31", [:mix], [{:configparser_ex, "~> 4.0", [hex: :configparser_ex, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: true]}, {:jsx, "~> 3.0", [hex: :jsx, repo: "hexpm", optional: true]}, {:sweet_xml, "~> 0.6", [hex: :sweet_xml, repo: "hexpm", optional: true]}], "hexpm", "3e6c776703c9076001fbe1f7c049535f042cb2afa0d2cbd3b47cbc4e92ac0d10"},
"ex_aws_s3": {:hex, :ex_aws_s3, "2.5.3", "422468e5c3e1a4da5298e66c3468b465cfd354b842e512cb1f6fbbe4e2f5bdaf", [:mix], [{:ex_aws, "~> 2.0", [hex: :ex_aws, repo: "hexpm", optional: false]}, {:sweet_xml, ">= 0.0.0", [hex: :sweet_xml, repo: "hexpm", optional: true]}], "hexpm", "4f09dd372cc386550e484808c5ac5027766c8d0cd8271ccc578b82ee6ef4f3b8"},
"ex_const": {:hex, :ex_const, "0.2.4", "d06e540c9d834865b012a17407761455efa71d0ce91e5831e86881b9c9d82448", [:mix], [], "hexpm", "96fd346610cc992b8f896ed26a98be82ac4efb065a0578f334a32d60a3ba9767"},
"ex_doc": {:hex, :ex_doc, "0.31.1", "8a2355ac42b1cc7b2379da9e40243f2670143721dd50748bf6c3b1184dae2089", [:mix], [{:earmark_parser, "~> 1.4.39", [hex: :earmark_parser, repo: "hexpm", optional: false]}, {:makeup_c, ">= 0.1.1", [hex: :makeup_c, repo: "hexpm", optional: true]}, {:makeup_elixir, "~> 0.14", [hex: :makeup_elixir, repo: "hexpm", optional: false]}, {:makeup_erlang, "~> 0.1", [hex: :makeup_erlang, repo: "hexpm", optional: false]}], "hexpm", "3178c3a407c557d8343479e1ff117a96fd31bafe52a039079593fb0524ef61b0"},
"ex_machina": {:hex, :ex_machina, "2.7.0", "b792cc3127fd0680fecdb6299235b4727a4944a09ff0fa904cc639272cd92dc7", [:mix], [{:ecto, "~> 2.2 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_sql, "~> 3.0", [hex: :ecto_sql, repo: "hexpm", optional: true]}], "hexpm", "419aa7a39bde11894c87a615c4ecaa52d8f107bbdd81d810465186f783245bf8"},
"ex_syslogger": {:hex, :ex_syslogger, "1.5.2", "72b6aa2d47a236e999171f2e1ec18698740f40af0bd02c8c650bf5f1fd1bac79", [:mix], [{:poison, ">= 1.5.0", [hex: :poison, repo: "hexpm", optional: true]}, {:syslog, "~> 1.1.0", [hex: :syslog, repo: "hexpm", optional: false]}], "hexpm", "ab9fab4136dbc62651ec6f16fa4842f10cf02ab4433fa3d0976c01be99398399"},
"exile": {:git, "https://git.pleroma.social/pleroma/elixir-libraries/exile.git", "0d6337cf68e7fbc8a093cae000955aa93b067f91", [ref: "0d6337cf68e7fbc8a093cae000955aa93b067f91"]},
"expo": {:hex, :expo, "0.5.1", "249e826a897cac48f591deba863b26c16682b43711dd15ee86b92f25eafd96d9", [:mix], [], "hexpm", "68a4233b0658a3d12ee00d27d37d856b1ba48607e7ce20fd376958d0ba6ce92b"},
"fast_html": {:hex, :fast_html, "2.2.0", "6c5ef1be087a4ed613b0379c13f815c4d11742b36b67bb52cee7859847c84520", [:make, :mix], [{:elixir_make, "~> 0.4", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}], "hexpm", "064c4f23b4a6168f9187dac8984b056f2c531bb0787f559fd6a8b34b38aefbae"},
"fast_sanitize": {:hex, :fast_sanitize, "0.2.3", "67b93dfb34e302bef49fec3aaab74951e0f0602fd9fa99085987af05bd91c7a5", [:mix], [{:fast_html, "~> 2.0", [hex: :fast_html, repo: "hexpm", optional: false]}, {:plug, "~> 1.8", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "e8ad286d10d0386e15d67d0ee125245ebcfbc7d7290b08712ba9013c8c5e56e2"},
"file_system": {:hex, :file_system, "0.2.10", "fb082005a9cd1711c05b5248710f8826b02d7d1784e7c3451f9c1231d4fc162d", [:mix], [], "hexpm", "41195edbfb562a593726eda3b3e8b103a309b733ad25f3d642ba49696bf715dc"},
- "finch": {:hex, :finch, "0.17.0", "17d06e1d44d891d20dbd437335eebe844e2426a0cd7e3a3e220b461127c73f70", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: false]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.3", [hex: :mint, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4 or ~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.6 or ~> 1.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "8d014a661bb6a437263d4b5abf0bcbd3cf0deb26b1e8596f2a271d22e48934c7"},
+ "finch": {:hex, :finch, "0.18.0", "944ac7d34d0bd2ac8998f79f7a811b21d87d911e77a786bc5810adb75632ada4", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: false]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.3", [hex: :mint, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4 or ~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.6 or ~> 1.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "69f5045b042e531e53edc2574f15e25e735b522c37e2ddb766e15b979e03aa65"},
"flake_id": {:hex, :flake_id, "0.1.0", "7716b086d2e405d09b647121a166498a0d93d1a623bead243e1f74216079ccb3", [:mix], [{:base62, "~> 1.2", [hex: :base62, repo: "hexpm", optional: false]}, {:ecto, ">= 2.0.0", [hex: :ecto, repo: "hexpm", optional: true]}], "hexpm", "31fc8090fde1acd267c07c36ea7365b8604055f897d3a53dd967658c691bd827"},
"floki": {:hex, :floki, "0.35.2", "87f8c75ed8654b9635b311774308b2760b47e9a579dabf2e4d5f1e1d42c39e0b", [:mix], [], "hexpm", "6b05289a8e9eac475f644f09c2e4ba7e19201fd002b89c28c1293e7bd16773d9"},
"gen_smtp": {:hex, :gen_smtp, "0.15.0", "9f51960c17769b26833b50df0b96123605a8024738b62db747fece14eb2fbfcc", [:rebar3], [], "hexpm", "29bd14a88030980849c7ed2447b8db6d6c9278a28b11a44cafe41b791205440f"},
"geo": {:hex, :geo, "3.6.0", "00c9c6338579f67e91cd5950af4ae2eb25cdce0c3398718c232539f61625d0bd", [:mix], [{:jason, "~> 1.4", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm", "1dbdebf617183b54bc3c8ad7a36531a9a76ada8ca93f75f573b0ae94006168da"},
"geospatial": {:hex, :geospatial, "0.2.0", "c6c9f57df647cabbda71825bbba8465645002922a0c2e6410dc50279dbc95265", [:mix], [{:geo, "~> 3.4", [hex: :geo, repo: "hexpm", optional: false]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: false]}, {:tesla, "~> 1.4.0", [hex: :tesla, repo: "hexpm", optional: false]}, {:tz_world, "~> 1.0", [hex: :tz_world, repo: "hexpm", optional: false]}], "hexpm", "b2f0e8f05a3d40f5473bf546d6b971bb82357e28c4f62c93c160d9e3c3581cb0"},
"gettext": {:hex, :gettext, "0.24.0", "6f4d90ac5f3111673cbefc4ebee96fe5f37a114861ab8c7b7d5b30a1108ce6d8", [:mix], [{:expo, "~> 0.5.1", [hex: :expo, repo: "hexpm", optional: false]}], "hexpm", "bdf75cdfcbe9e4622dd18e034b227d77dd17f0f133853a1c73b97b3d6c770e8b"},
"gun": {:hex, :gun, "2.0.1", "160a9a5394800fcba41bc7e6d421295cf9a7894c2252c0678244948e3336ad73", [:make, :rebar3], [{:cowlib, "2.12.1", [hex: :cowlib, repo: "hexpm", optional: false]}], "hexpm", "a10bc8d6096b9502205022334f719cc9a08d9adcfbfc0dbee9ef31b56274a20b"},
"hackney": {:hex, :hackney, "1.18.2", "d7ff544ddae5e1cb49e9cf7fa4e356d7f41b283989a1c304bfc47a8cc1cf966f", [:rebar3], [{:certifi, "~>2.12.0", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "~>6.1.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "~>1.0.0", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~>1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:parse_trans, "3.4.1", [hex: :parse_trans, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "~>1.1.0", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}, {:unicode_util_compat, "~>0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "af94d5c9f97857db257090a4a10e5426ecb6f4918aa5cc666798566ae14b65fd"},
"hpax": {:hex, :hpax, "0.1.2", "09a75600d9d8bbd064cdd741f21fc06fc1f4cf3d0fcc335e5aa19be1a7235c84", [:mix], [], "hexpm", "2c87843d5a23f5f16748ebe77969880e29809580efdaccd615cd3bed628a8c13"},
"html_entities": {:hex, :html_entities, "0.5.2", "9e47e70598da7de2a9ff6af8758399251db6dbb7eebe2b013f2bbd2515895c3c", [:mix], [], "hexpm", "c53ba390403485615623b9531e97696f076ed415e8d8058b1dbaa28181f4fdcc"},
"http_signatures": {:hex, :http_signatures, "0.1.2", "ed1cc7043abcf5bb4f30d68fb7bad9d618ec1a45c4ff6c023664e78b67d9c406", [:mix], [], "hexpm", "f08aa9ac121829dae109d608d83c84b940ef2f183ae50f2dd1e9a8bc619d8be7"},
"httpoison": {:hex, :httpoison, "1.8.2", "9eb9c63ae289296a544842ef816a85d881d4a31f518a0fec089aaa744beae290", [:mix], [{:hackney, "~> 1.17", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "2bb350d26972e30c96e2ca74a1aaf8293d61d0742ff17f01e0279fef11599921"},
"icalendar": {:hex, :icalendar, "1.1.2", "5d0afff5d0143c5bd43f18ae32a777bf0fb9a724543ab05229a460d368f0a5e7", [:mix], [{:timex, "~> 3.4", [hex: :timex, repo: "hexpm", optional: false]}], "hexpm", "2060f8e353fdf3047e95a3f012583dc3c0bbd7ca1010e32ed9e9fc5760ad4292"},
"idna": {:hex, :idna, "6.1.1", "8a63070e9f7d0c62eb9d9fcb360a7de382448200fbbd1b106cc96d3d8099df8d", [:rebar3], [{:unicode_util_compat, "~>0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "92376eb7894412ed19ac475e4a86f7b413c1b9fbb5bd16dccd57934157944cea"},
"inet_cidr": {:hex, :inet_cidr, "1.0.8", "d26bb7bdbdf21ae401ead2092bf2bb4bf57fe44a62f5eaa5025280720ace8a40", [:mix], [], "hexpm", "d5b26da66603bb56c933c65214c72152f0de9a6ea53618b56d63302a68f6a90e"},
"jason": {:hex, :jason, "1.4.1", "af1504e35f629ddcdd6addb3513c3853991f694921b1b9368b0bd32beb9f1b63", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "fbb01ecdfd565b56261302f7e1fcc27c4fb8f32d56eab74db621fc154604a7a1"},
"joken": {:hex, :joken, "2.6.0", "b9dd9b6d52e3e6fcb6c65e151ad38bf4bc286382b5b6f97079c47ade6b1bcc6a", [:mix], [{:jose, "~> 1.11.5", [hex: :jose, repo: "hexpm", optional: false]}], "hexpm", "5a95b05a71cd0b54abd35378aeb1d487a23a52c324fa7efdffc512b655b5aaa7"},
"jose": {:hex, :jose, "1.11.6", "613fda82552128aa6fb804682e3a616f4bc15565a048dabd05b1ebd5827ed965", [:mix, :rebar3], [], "hexpm", "6275cb75504f9c1e60eeacb771adfeee4905a9e182103aa59b53fed651ff9738"},
"jumper": {:hex, :jumper, "1.0.2", "68cdcd84472a00ac596b4e6459a41b3062d4427cbd4f1e8c8793c5b54f1406a7", [:mix], [], "hexpm", "9b7782409021e01ab3c08270e26f36eb62976a38c1aa64b2eaf6348422f165e1"},
"linkify": {:hex, :linkify, "0.5.3", "5f8143d8f61f5ff08d3aeeff47ef6509492b4948d8f08007fbf66e4d2246a7f2", [:mix], [], "hexpm", "3ef35a1377d47c25506e07c1c005ea9d38d700699d92ee92825f024434258177"},
"majic": {:hex, :majic, "1.0.0", "37e50648db5f5c2ff0c9fb46454d034d11596c03683807b9fb3850676ffdaab3", [:make, :mix], [{:elixir_make, "~> 0.6.1", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:mime, "~> 1.0", [hex: :mime, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: true]}], "hexpm", "7905858f76650d49695f14ea55cd9aaaee0c6654fa391671d4cf305c275a0a9e"},
"makeup": {:hex, :makeup, "1.0.5", "d5a830bc42c9800ce07dd97fa94669dfb93d3bf5fcf6ea7a0c67b2e0e4a7f26c", [:mix], [{:nimble_parsec, "~> 0.5 or ~> 1.0", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "cfa158c02d3f5c0c665d0af11512fed3fba0144cf1aadee0f2ce17747fba2ca9"},
"makeup_elixir": {:hex, :makeup_elixir, "0.14.1", "4f0e96847c63c17841d42c08107405a005a2680eb9c7ccadfd757bd31dabccfb", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm", "f2438b1a80eaec9ede832b5c41cd4f373b38fd7aa33e3b22d9db79e640cbde11"},
"makeup_erlang": {:hex, :makeup_erlang, "0.1.3", "d684f4bac8690e70b06eb52dad65d26de2eefa44cd19d64a8095e1417df7c8fd", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm", "b78dc853d2e670ff6390b605d807263bf606da3c82be37f9d7f68635bd886fc9"},
"meck": {:hex, :meck, "0.9.2", "85ccbab053f1db86c7ca240e9fc718170ee5bda03810a6292b5306bf31bae5f5", [:rebar3], [], "hexpm", "81344f561357dc40a8344afa53767c32669153355b626ea9fcbc8da6b3045826"},
"metrics": {:hex, :metrics, "1.0.1", "25f094dea2cda98213cecc3aeff09e940299d950904393b2a29d191c346a8486", [:rebar3], [], "hexpm", "69b09adddc4f74a40716ae54d140f93beb0fb8978d8636eaded0c31b6f099f16"},
"mime": {:hex, :mime, "1.6.0", "dabde576a497cef4bbdd60aceee8160e02a6c89250d6c0b29e56c0dfb00db3d2", [:mix], [], "hexpm", "31a1a8613f8321143dde1dafc36006a17d28d02bdfecb9e95a880fa7aabd19a7"},
"mimerl": {:hex, :mimerl, "1.2.0", "67e2d3f571088d5cfd3e550c383094b47159f3eee8ffa08e64106cdf5e981be3", [:rebar3], [], "hexpm", "f278585650aa581986264638ebf698f8bb19df297f66ad91b18910dfc6e19323"},
"mint": {:hex, :mint, "1.5.2", "4805e059f96028948870d23d7783613b7e6b0e2fb4e98d720383852a760067fd", [:mix], [{:castore, "~> 0.1.0 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:hpax, "~> 0.1.1", [hex: :hpax, repo: "hexpm", optional: false]}], "hexpm", "d77d9e9ce4eb35941907f1d3df38d8f750c357865353e21d335bdcdf6d892a02"},
"mochiweb": {:hex, :mochiweb, "2.18.0", "eb55f1db3e6e960fac4e6db4e2db9ec3602cc9f30b86cd1481d56545c3145d2e", [:rebar3], [], "hexpm"},
"mock": {:hex, :mock, "0.3.8", "7046a306b71db2488ef54395eeb74df0a7f335a7caca4a3d3875d1fc81c884dd", [:mix], [{:meck, "~> 0.9.2", [hex: :meck, repo: "hexpm", optional: false]}], "hexpm", "7fa82364c97617d79bb7d15571193fc0c4fe5afd0c932cef09426b3ee6fe2022"},
"mogrify": {:hex, :mogrify, "0.8.0", "3506f3ca3f7b95a155f3b4ef803b5db176f5a0633723e3fe85e0d6399e3b11c8", [:mix], [], "hexpm", "2278d245f07056ea3b586e98801e933695147066fa4cf563f552c1b4f0ff8ad9"},
"mox": {:hex, :mox, "1.1.0", "0f5e399649ce9ab7602f72e718305c0f9cdc351190f72844599545e4996af73c", [:mix], [], "hexpm", "d44474c50be02d5b72131070281a5d3895c0e7a95c780e90bc0cfe712f633a13"},
"nimble_options": {:hex, :nimble_options, "1.1.0", "3b31a57ede9cb1502071fade751ab0c7b8dbe75a9a4c2b5bbb0943a690b63172", [:mix], [], "hexpm", "8bbbb3941af3ca9acc7835f5655ea062111c9c27bcac53e004460dfd19008a99"},
"nimble_parsec": {:hex, :nimble_parsec, "0.6.0", "32111b3bf39137144abd7ba1cce0914533b2d16ef35e8abc5ec8be6122944263", [:mix], [], "hexpm", "27eac315a94909d4dc68bc07a4a83e06c8379237c5ea528a9acff4ca1c873c52"},
"nimble_pool": {:hex, :nimble_pool, "0.2.6", "91f2f4c357da4c4a0a548286c84a3a28004f68f05609b4534526871a22053cde", [:mix], [], "hexpm", "1c715055095d3f2705c4e236c18b618420a35490da94149ff8b580a2144f653f"},
"nodex": {:git, "https://git.pleroma.social/pleroma/nodex", "cb6730f943cfc6aad674c92161be23a8411f15d1", [ref: "cb6730f943cfc6aad674c92161be23a8411f15d1"]},
"oban": {:hex, :oban, "2.13.6", "a0cb1bce3bd393770512231fb5a3695fa19fd3af10d7575bf73f837aee7abf43", [:mix], [{:ecto_sql, "~> 3.6", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "3c1c5eb16f377b3cbbf2ea14be24d20e3d91285af9d1ac86260b7c2af5464887"},
"octo_fetch": {:hex, :octo_fetch, "0.4.0", "074b5ecbc08be10b05b27e9db08bc20a3060142769436242702931c418695b19", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "~> 1.1", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}], "hexpm", "cf8be6f40cd519d7000bb4e84adcf661c32e59369ca2827c4e20042eda7a7fc6"},
"open_api_spex": {:hex, :open_api_spex, "3.18.2", "8c855e83bfe8bf81603d919d6e892541eafece3720f34d1700b58024dadde247", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:plug, "~> 1.7", [hex: :plug, repo: "hexpm", optional: false]}, {:poison, "~> 3.0 or ~> 4.0 or ~> 5.0", [hex: :poison, repo: "hexpm", optional: true]}, {:ymlr, "~> 2.0 or ~> 3.0 or ~> 4.0", [hex: :ymlr, repo: "hexpm", optional: true]}], "hexpm", "aa3e6dcfc0ad6a02596b2172662da21c9dd848dac145ea9e603f54e3d81b8d2b"},
"parse_trans": {:hex, :parse_trans, "3.4.1", "6e6aa8167cb44cc8f39441d05193be6e6f4e7c2946cb2759f015f8c56b76e5ff", [:rebar3], [], "hexpm", "620a406ce75dada827b82e453c19cf06776be266f5a67cff34e1ef2cbb60e49a"},
"pbkdf2_elixir": {:hex, :pbkdf2_elixir, "1.2.1", "9cbe354b58121075bd20eb83076900a3832324b7dd171a6895fab57b6bb2752c", [:mix], [{:comeonin, "~> 5.3", [hex: :comeonin, repo: "hexpm", optional: false]}], "hexpm", "d3b40a4a4630f0b442f19eca891fcfeeee4c40871936fed2f68e1c4faa30481f"},
"phoenix": {:hex, :phoenix, "1.7.10", "02189140a61b2ce85bb633a9b6fd02dff705a5f1596869547aeb2b2b95edd729", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 2.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.6", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:plug_crypto, "~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:websock_adapter, "~> 0.5.3", [hex: :websock_adapter, repo: "hexpm", optional: false]}], "hexpm", "cf784932e010fd736d656d7fead6a584a4498efefe5b8227e9f383bf15bb79d0"},
"phoenix_ecto": {:hex, :phoenix_ecto, "4.4.3", "86e9878f833829c3f66da03d75254c155d91d72a201eb56ae83482328dc7ca93", [:mix], [{:ecto, "~> 3.5", [hex: :ecto, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "d36c401206f3011fefd63d04e8ef626ec8791975d9d107f9a0817d426f61ac07"},
"phoenix_html": {:hex, :phoenix_html, "3.3.3", "380b8fb45912b5638d2f1d925a3771b4516b9a78587249cabe394e0a5d579dc9", [:mix], [{:plug, "~> 1.5", [hex: :plug, repo: "hexpm", optional: true]}], "hexpm", "923ebe6fec6e2e3b3e569dfbdc6560de932cd54b000ada0208b5f45024bdd76c"},
"phoenix_live_dashboard": {:hex, :phoenix_live_dashboard, "0.8.3", "7ff51c9b6609470f681fbea20578dede0e548302b0c8bdf338b5a753a4f045bf", [:mix], [{:ecto, "~> 3.6.2 or ~> 3.7", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_mysql_extras, "~> 0.5", [hex: :ecto_mysql_extras, repo: "hexpm", optional: true]}, {:ecto_psql_extras, "~> 0.7", [hex: :ecto_psql_extras, repo: "hexpm", optional: true]}, {:ecto_sqlite3_extras, "~> 1.1.7 or ~> 1.2.0", [hex: :ecto_sqlite3_extras, repo: "hexpm", optional: true]}, {:mime, "~> 1.6 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:phoenix_live_view, "~> 0.19 or ~> 1.0", [hex: :phoenix_live_view, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6 or ~> 1.0", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "f9470a0a8bae4f56430a23d42f977b5a6205fdba6559d76f932b876bfaec652d"},
"phoenix_live_reload": {:hex, :phoenix_live_reload, "1.3.3", "3a53772a6118d5679bf50fc1670505a290e32a1d195df9e069d8c53ab040c054", [:mix], [{:file_system, "~> 0.2.1 or ~> 0.3", [hex: :file_system, repo: "hexpm", optional: false]}, {:phoenix, "~> 1.4", [hex: :phoenix, repo: "hexpm", optional: false]}], "hexpm", "766796676e5f558dbae5d1bdb066849673e956005e3730dfd5affd7a6da4abac"},
"phoenix_live_view": {:hex, :phoenix_live_view, "0.19.5", "6e730595e8e9b8c5da230a814e557768828fd8dfeeb90377d2d8dbb52d4ec00a", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix, "~> 1.6.15 or ~> 1.7.0", [hex: :phoenix, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 3.3", [hex: :phoenix_html, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "b2eaa0dd3cfb9bd7fb949b88217df9f25aed915e986a28ad5c8a0d054e7ca9d3"},
"phoenix_pubsub": {:hex, :phoenix_pubsub, "2.1.3", "3168d78ba41835aecad272d5e8cd51aa87a7ac9eb836eabc42f6e57538e3731d", [:mix], [], "hexpm", "bba06bc1dcfd8cb086759f0edc94a8ba2bc8896d5331a1e2c2902bf8e36ee502"},
"phoenix_swoosh": {:hex, :phoenix_swoosh, "1.2.1", "b74ccaa8046fbc388a62134360ee7d9742d5a8ae74063f34eb050279de7a99e1", [:mix], [{:finch, "~> 0.8", [hex: :finch, repo: "hexpm", optional: true]}, {:hackney, "~> 1.10", [hex: :hackney, repo: "hexpm", optional: true]}, {:phoenix, "~> 1.6", [hex: :phoenix, repo: "hexpm", optional: true]}, {:phoenix_html, "~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:phoenix_view, "~> 1.0 or ~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: false]}, {:swoosh, "~> 1.5", [hex: :swoosh, repo: "hexpm", optional: false]}], "hexpm", "4000eeba3f9d7d1a6bf56d2bd56733d5cadf41a7f0d8ffe5bb67e7d667e204a2"},
"phoenix_template": {:hex, :phoenix_template, "1.0.4", "e2092c132f3b5e5b2d49c96695342eb36d0ed514c5b252a77048d5969330d639", [:mix], [{:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}], "hexpm", "2c0c81f0e5c6753faf5cca2f229c9709919aba34fab866d3bc05060c9c444206"},
"phoenix_view": {:hex, :phoenix_view, "2.0.3", "4d32c4817fce933693741deeb99ef1392619f942633dde834a5163124813aad3", [:mix], [{:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}], "hexpm", "cd34049af41be2c627df99cd4eaa71fc52a328c0c3d8e7d4aa28f880c30e7f64"},
"plug": {:hex, :plug, "1.15.3", "712976f504418f6dff0a3e554c40d705a9bcf89a7ccef92fc6a5ef8f16a30a97", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_crypto, "~> 1.1.1 or ~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "cc4365a3c010a56af402e0809208873d113e9c38c401cabd88027ef4f5c01fd2"},
"plug_cowboy": {:hex, :plug_cowboy, "2.6.2", "753611b23b29231fb916b0cdd96028084b12aff57bfd7b71781bd04b1dbeb5c9", [:mix], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:cowboy_telemetry, "~> 0.3", [hex: :cowboy_telemetry, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "951ed2433df22f4c97b85fdb145d4cee561f36b74854d64c06d896d7cd2921a7"},
"plug_crypto": {:hex, :plug_crypto, "2.0.0", "77515cc10af06645abbfb5e6ad7a3e9714f805ae118fa1a70205f80d2d70fe73", [:mix], [], "hexpm", "53695bae57cc4e54566d993eb01074e4d894b65a3766f1c43e2c61a1b0f45ea9"},
"plug_static_index_html": {:hex, :plug_static_index_html, "1.0.0", "840123d4d3975585133485ea86af73cb2600afd7f2a976f9f5fd8b3808e636a0", [:mix], [{:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "79fd4fcf34d110605c26560cbae8f23c603ec4158c08298bd4360fdea90bb5cf"},
"poison": {:hex, :poison, "3.1.0", "d9eb636610e096f86f25d9a46f35a9facac35609a7591b3be3326e99a0484665", [:mix], [], "hexpm", "fec8660eb7733ee4117b85f55799fd3833eb769a6df71ccf8903e8dc5447cfce"},
"poolboy": {:hex, :poolboy, "1.5.2", "392b007a1693a64540cead79830443abf5762f5d30cf50bc95cb2c1aaafa006b", [:rebar3], [], "hexpm", "dad79704ce5440f3d5a3681c8590b9dc25d1a561e8f5a9c995281012860901e3"},
"postgrex": {:hex, :postgrex, "0.17.4", "5777781f80f53b7c431a001c8dad83ee167bcebcf3a793e3906efff680ab62b3", [:mix], [{:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "6458f7d5b70652bc81c3ea759f91736c16a31be000f306d3c64bcdfe9a18b3cc"},
"pot": {:hex, :pot, "1.0.2", "13abb849139fdc04ab8154986abbcb63bdee5de6ed2ba7e1713527e33df923dd", [:rebar3], [], "hexpm", "78fe127f5a4f5f919d6ea5a2a671827bd53eb9d37e5b4128c0ad3df99856c2e0"},
"prom_ex": {:hex, :prom_ex, "1.9.0", "63e6dda6c05cdeec1f26c48443dcc38ffd2118b3665ae8d2bd0e5b79f2aea03e", [:mix], [{:absinthe, ">= 1.6.0", [hex: :absinthe, repo: "hexpm", optional: true]}, {:broadway, ">= 1.0.2", [hex: :broadway, repo: "hexpm", optional: true]}, {:ecto, ">= 3.5.0", [hex: :ecto, repo: "hexpm", optional: true]}, {:finch, "~> 0.15", [hex: :finch, repo: "hexpm", optional: false]}, {:jason, "~> 1.2", [hex: :jason, repo: "hexpm", optional: false]}, {:oban, ">= 2.4.0", [hex: :oban, repo: "hexpm", optional: true]}, {:octo_fetch, "~> 0.3", [hex: :octo_fetch, repo: "hexpm", optional: false]}, {:phoenix, ">= 1.5.0", [hex: :phoenix, repo: "hexpm", optional: true]}, {:phoenix_live_view, ">= 0.14.0", [hex: :phoenix_live_view, repo: "hexpm", optional: true]}, {:plug, ">= 1.12.1", [hex: :plug, repo: "hexpm", optional: true]}, {:plug_cowboy, "~> 2.5 or ~> 2.6", [hex: :plug_cowboy, repo: "hexpm", optional: false]}, {:telemetry, ">= 1.0.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}, {:telemetry_metrics_prometheus_core, "~> 1.0", [hex: :telemetry_metrics_prometheus_core, repo: "hexpm", optional: false]}, {:telemetry_poller, "~> 1.0", [hex: :telemetry_poller, repo: "hexpm", optional: false]}], "hexpm", "01f3d4f69ec93068219e686cc65e58a29c42bea5429a8ff4e2121f19db178ee6"},
"prometheus": {:hex, :prometheus, "4.10.0", "792adbf0130ff61b5fa8826f013772af24b6e57b984445c8d602c8a0355704a1", [:mix, :rebar3], [{:quantile_estimator, "~> 0.2.1", [hex: :quantile_estimator, repo: "hexpm", optional: false]}], "hexpm", "2a99bb6dce85e238c7236fde6b0064f9834dc420ddbd962aac4ea2a3c3d59384"},
"prometheus_ecto": {:hex, :prometheus_ecto, "1.4.3", "3dd4da1812b8e0dbee81ea58bb3b62ed7588f2eae0c9e97e434c46807ff82311", [:mix], [{:ecto, "~> 2.0 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:prometheus_ex, "~> 1.1 or ~> 2.0 or ~> 3.0", [hex: :prometheus_ex, repo: "hexpm", optional: false]}], "hexpm", "8d66289f77f913b37eda81fd287340c17e61a447549deb28efc254532b2bed82"},
"prometheus_ex": {:git, "https://github.com/lanodan/prometheus.ex.git", "31f7fbe4b71b79ba27efc2a5085746c4011ceb8f", [branch: "fix/elixir-1.14"]},
"prometheus_phoenix": {:hex, :prometheus_phoenix, "1.3.0", "c4b527e0b3a9ef1af26bdcfbfad3998f37795b9185d475ca610fe4388fdd3bb5", [:mix], [{:phoenix, "~> 1.4", [hex: :phoenix, repo: "hexpm", optional: false]}, {:prometheus_ex, "~> 1.3 or ~> 2.0 or ~> 3.0", [hex: :prometheus_ex, repo: "hexpm", optional: false]}], "hexpm", "c4d1404ac4e9d3d963da601db2a7d8ea31194f0017057fabf0cfb9bf5a6c8c75"},
"prometheus_phx": {:git, "https://git.pleroma.social/pleroma/elixir-libraries/prometheus-phx.git", "9cd8f248c9381ffedc799905050abce194a97514", [branch: "no-logging"]},
"prometheus_plugs": {:hex, :prometheus_plugs, "1.1.5", "25933d48f8af3a5941dd7b621c889749894d8a1082a6ff7c67cc99dec26377c5", [:mix], [{:accept, "~> 0.1", [hex: :accept, repo: "hexpm", optional: false]}, {:plug, "~> 1.0", [hex: :plug, repo: "hexpm", optional: false]}, {:prometheus_ex, "~> 1.1 or ~> 2.0 or ~> 3.0", [hex: :prometheus_ex, repo: "hexpm", optional: false]}, {:prometheus_process_collector, "~> 1.1", [hex: :prometheus_process_collector, repo: "hexpm", optional: true]}], "hexpm", "0273a6483ccb936d79ca19b0ab629aef0dba958697c94782bb728b920dfc6a79"},
"quantile_estimator": {:hex, :quantile_estimator, "0.2.1", "ef50a361f11b5f26b5f16d0696e46a9e4661756492c981f7b2229ef42ff1cd15", [:rebar3], [], "hexpm", "282a8a323ca2a845c9e6f787d166348f776c1d4a41ede63046d72d422e3da946"},
"ranch": {:hex, :ranch, "1.8.0", "8c7a100a139fd57f17327b6413e4167ac559fbc04ca7448e9be9057311597a1d", [:make, :rebar3], [], "hexpm", "49fbcfd3682fab1f5d109351b61257676da1a2fdbe295904176d5e521a2ddfe5"},
"recon": {:hex, :recon, "2.5.4", "05dd52a119ee4059fa9daa1ab7ce81bc7a8161a2f12e9d42e9d551ffd2ba901c", [:mix, :rebar3], [], "hexpm", "e9ab01ac7fc8572e41eb59385efeb3fb0ff5bf02103816535bacaedf327d0263"},
"remote_ip": {:git, "https://git.pleroma.social/pleroma/remote_ip.git", "b647d0deecaa3acb140854fe4bda5b7e1dc6d1c8", [ref: "b647d0deecaa3acb140854fe4bda5b7e1dc6d1c8"]},
"rustler": {:hex, :rustler, "0.30.0", "cefc49922132b072853fa9b0ca4dc2ffcb452f68fb73b779042b02d545e097fb", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:toml, "~> 0.6", [hex: :toml, repo: "hexpm", optional: false]}], "hexpm", "9ef1abb6a7dda35c47cfc649e6a5a61663af6cf842a55814a554a84607dee389"},
"sleeplocks": {:hex, :sleeplocks, "1.1.2", "d45aa1c5513da48c888715e3381211c859af34bee9b8290490e10c90bb6ff0ca", [:rebar3], [], "hexpm", "9fe5d048c5b781d6305c1a3a0f40bb3dfc06f49bf40571f3d2d0c57eaa7f59a5"},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.7", "354c321cf377240c7b8716899e182ce4890c5938111a1296add3ec74cf1715df", [:make, :mix, :rebar3], [], "hexpm", "fe4c190e8f37401d30167c8c405eda19469f34577987c76dde613e838bbc67f8"},
"statistex": {:hex, :statistex, "1.0.0", "f3dc93f3c0c6c92e5f291704cf62b99b553253d7969e9a5fa713e5481cd858a5", [:mix], [], "hexpm", "ff9d8bee7035028ab4742ff52fc80a2aa35cece833cf5319009b52f1b5a86c27"},
"sweet_xml": {:hex, :sweet_xml, "0.7.4", "a8b7e1ce7ecd775c7e8a65d501bc2cd933bff3a9c41ab763f5105688ef485d08", [:mix], [], "hexpm", "e7c4b0bdbf460c928234951def54fe87edf1a170f6896675443279e2dbeba167"},
"swoosh": {:hex, :swoosh, "1.10.3", "32f1531ee3fe4e82da8175c597bf3692938f8152eb981e0cbf57107b6c5924c1", [:mix], [{:cowboy, "~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:ex_aws, "~> 2.1", [hex: :ex_aws, repo: "hexpm", optional: true]}, {:finch, "~> 0.6", [hex: :finch, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13 or ~> 1.0", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "8b7167d93047bac6e1a1c367bf7d899cf2e4fea0592ee04a70673548ef6091b9"},
"syslog": {:hex, :syslog, "1.1.0", "6419a232bea84f07b56dc575225007ffe34d9fdc91abe6f1b2f254fd71d8efc2", [:rebar3], [], "hexpm", "4c6a41373c7e20587be33ef841d3de6f3beba08519809329ecc4d27b15b659e1"},
"table_rex": {:hex, :table_rex, "4.0.0", "3c613a68ebdc6d4d1e731bc973c233500974ec3993c99fcdabb210407b90959b", [:mix], [], "hexpm", "c35c4d5612ca49ebb0344ea10387da4d2afe278387d4019e4d8111e815df8f55"},
"telemetry": {:hex, :telemetry, "1.0.0", "0f453a102cdf13d506b7c0ab158324c337c41f1cc7548f0bc0e130bbf0ae9452", [:rebar3], [], "hexpm", "73bc09fa59b4a0284efb4624335583c528e07ec9ae76aca96ea0673850aec57a"},
"telemetry_metrics": {:hex, :telemetry_metrics, "0.6.2", "2caabe9344ec17eafe5403304771c3539f3b6e2f7fb6a6f602558c825d0d0bfb", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "9b43db0dc33863930b9ef9d27137e78974756f5f198cae18409970ed6fa5b561"},
"telemetry_metrics_prometheus_core": {:hex, :telemetry_metrics_prometheus_core, "1.2.0", "b583c3f18508f5c5561b674d16cf5d9afd2ea3c04505b7d92baaeac93c1b8260", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "9cba950e1c4733468efbe3f821841f34ac05d28e7af7798622f88ecdbbe63ea3"},
"telemetry_poller": {:hex, :telemetry_poller, "1.0.0", "db91bb424e07f2bb6e73926fcafbfcbcb295f0193e0a00e825e589a0a47e8453", [:rebar3], [{:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "b3a24eafd66c3f42da30fc3ca7dda1e9d546c12250a2d60d7b81d264fbec4f6e"},
- "tesla": {:hex, :tesla, "1.4.4", "bb89aa0c9745190930366f6a2ac612cdf2d0e4d7fff449861baa7875afd797b2", [:mix], [{:castore, "~> 0.1", [hex: :castore, repo: "hexpm", optional: true]}, {:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:finch, "~> 0.3", [hex: :finch, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:gun, "~> 1.3", [hex: :gun, repo: "hexpm", optional: true]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "4.4.0", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.0", [hex: :mint, repo: "hexpm", optional: true]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm", "d5503a49f9dec1b287567ea8712d085947e247cb11b06bc54adb05bfde466457"},
+ "tesla": {:hex, :tesla, "1.8.0", "d511a4f5c5e42538d97eef7c40ec4f3e44effdc5068206f42ed859e09e51d1fd", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:finch, "~> 0.13", [hex: :finch, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:gun, ">= 1.0.0", [hex: :gun, repo: "hexpm", optional: true]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "4.4.2", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.0", [hex: :mint, repo: "hexpm", optional: true]}, {:msgpax, "~> 2.3", [hex: :msgpax, repo: "hexpm", optional: true]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm", "10501f360cd926a309501287470372af1a6e1cbed0f43949203a4c13300bc79f"},
+ "thousand_island": {:hex, :thousand_island, "1.3.2", "bc27f9afba6e1a676dd36507d42e429935a142cf5ee69b8e3f90bff1383943cd", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "0e085b93012cd1057b378fce40cbfbf381ff6d957a382bfdd5eca1a98eec2535"},
"timex": {:hex, :timex, "3.7.7", "3ed093cae596a410759104d878ad7b38e78b7c2151c6190340835515d4a46b8a", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.10", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 1.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "0ec4b09f25fe311321f9fc04144a7e3affe48eb29481d7a5583849b6c4dfa0a7"},
"toml": {:hex, :toml, "0.7.0", "fbcd773caa937d0c7a02c301a1feea25612720ac3fa1ccb8bfd9d30d822911de", [:mix], [], "hexpm", "0690246a2478c1defd100b0c9b89b4ea280a22be9a7b313a8a058a2408a2fa70"},
"trailing_format_plug": {:hex, :trailing_format_plug, "0.0.7", "64b877f912cf7273bed03379936df39894149e35137ac9509117e59866e10e45", [:mix], [{:plug, "> 0.12.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "bd4fde4c15f3e993a999e019d64347489b91b7a9096af68b2bdadd192afa693f"},
"tz_world": {:hex, :tz_world, "1.2.0", "74ae7ef660a70ab35a25af4dfba46b1354fc904eb5da9b2464151cf885fc1a2c", [:mix], [{:castore, "~> 0.1", [hex: :castore, repo: "hexpm", optional: true]}, {:certifi, "~> 2.5", [hex: :certifi, repo: "hexpm", optional: true]}, {:geo, "~> 1.0 or ~> 2.0 or ~> 3.3", [hex: :geo, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "3cdd255f0a96942496a9edf61c86976a07fa6c0552ad287604c282b9bc1d9cf8"},
"tzdata": {:hex, :tzdata, "1.0.5", "69f1ee029a49afa04ad77801febaf69385f3d3e3d1e4b56b9469025677b89a28", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "55519aa2a99e5d2095c1e61cc74c9be69688f8ab75c27da724eb8279ff402a5a"},
"ueberauth": {:hex, :ueberauth, "0.10.7", "5a31cbe11e7ce5c7484d745dc9e1f11948e89662f8510d03c616de03df581ebd", [:mix], [{:plug, "~> 1.5", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "0bccf73e2ffd6337971340832947ba232877aa8122dba4c95be9f729c8987377"},
"unicode_util_compat": {:hex, :unicode_util_compat, "0.7.0", "bc84380c9ab48177092f43ac89e4dfa2c6d62b40b8bd132b1059ecc7232f9a78", [:rebar3], [], "hexpm", "25eee6d67df61960cf6a794239566599b09e17e668d3700247bc498638152521"},
"unsafe": {:hex, :unsafe, "1.0.2", "23c6be12f6c1605364801f4b47007c0c159497d0446ad378b5cf05f1855c0581", [:mix], [], "hexpm", "b485231683c3ab01a9cd44cb4a79f152c6f3bb87358439c6f68791b85c2df675"},
"vix": {:hex, :vix, "0.26.0", "027f10b6969b759318be84bd0bd8c88af877445e4e41cf96a0460392cea5399c", [:make, :mix], [{:castore, "~> 1.0 or ~> 0.1", [hex: :castore, repo: "hexpm", optional: false]}, {:cc_precompiler, "~> 0.2 or ~> 0.1.4", [hex: :cc_precompiler, repo: "hexpm", optional: false]}, {:elixir_make, "~> 0.8 or ~> 0.7.3", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:kino, "~> 0.7", [hex: :kino, repo: "hexpm", optional: true]}], "hexpm", "71b0a79ae7f199cacfc8e679b0e4ba25ee47dc02e182c5b9097efb29fbe14efd"},
"web_push_encryption": {:hex, :web_push_encryption, "0.3.1", "76d0e7375142dfee67391e7690e89f92578889cbcf2879377900b5620ee4708d", [:mix], [{:httpoison, "~> 1.0", [hex: :httpoison, repo: "hexpm", optional: false]}, {:jose, "~> 1.11.1", [hex: :jose, repo: "hexpm", optional: false]}], "hexpm", "4f82b2e57622fb9337559058e8797cb0df7e7c9790793bdc4e40bc895f70e2a2"},
"websock": {:hex, :websock, "0.5.3", "2f69a6ebe810328555b6fe5c831a851f485e303a7c8ce6c5f675abeb20ebdadc", [:mix], [], "hexpm", "6105453d7fac22c712ad66fab1d45abdf049868f253cf719b625151460b8b453"},
"websock_adapter": {:hex, :websock_adapter, "0.5.5", "9dfeee8269b27e958a65b3e235b7e447769f66b5b5925385f5a569269164a210", [:mix], [{:bandit, ">= 0.6.0", [hex: :bandit, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.6", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:websock, "~> 0.5", [hex: :websock, repo: "hexpm", optional: false]}], "hexpm", "4b977ba4a01918acbf77045ff88de7f6972c2a009213c515a445c48f224ffce9"},
"websockex": {:hex, :websockex, "0.4.3", "92b7905769c79c6480c02daacaca2ddd49de936d912976a4d3c923723b647bf0", [:mix], [], "hexpm", "95f2e7072b85a3a4cc385602d42115b73ce0b74a9121d0d6dbbf557645ac53e4"},
}
diff --git a/priv/gettext/ja/LC_MESSAGES/errors.po b/priv/gettext/ja/LC_MESSAGES/errors.po
index 5d9986485..c3c467a67 100644
--- a/priv/gettext/ja/LC_MESSAGES/errors.po
+++ b/priv/gettext/ja/LC_MESSAGES/errors.po
@@ -1,580 +1,580 @@
msgid ""
msgstr ""
"Project-Id-Version: PACKAGE VERSION\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2021-09-18 09:07+0000\n"
-"PO-Revision-Date: 2021-09-19 09:45+0000\n"
-"Last-Translator: Ryo Ueno <r.ueno.nfive@gmail.com>\n"
+"PO-Revision-Date: 2024-02-16 11:49+0000\n"
+"Last-Translator: SyoBoN <syobon@syobon.net>\n"
"Language-Team: Japanese <https://translate.pleroma.social/projects/pleroma/"
-"pleroma/ja/>\n"
+"pleroma-backend-domain-errors/ja/>\n"
"Language: ja\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Plural-Forms: nplurals=1; plural=0;\n"
-"X-Generator: Weblate 4.6.2\n"
+"X-Generator: Weblate 4.13.1\n"
## This file is a PO Template file.
##
## `msgid`s here are often extracted from source code.
## Add new translations manually only if they're dynamic
## translations that can't be statically extracted.
##
## Run `mix gettext.extract` to bring this file up to
## date. Leave `msgstr`s empty as changing them here has no
## effect: edit them in PO (`.po`) files instead.
## From Ecto.Changeset.cast/4
msgid "can't be blank"
-msgstr ""
+msgstr "空にすることはできません"
## From Ecto.Changeset.unique_constraint/3
msgid "has already been taken"
-msgstr ""
+msgstr "すでに使用されています"
## From Ecto.Changeset.put_change/3
msgid "is invalid"
msgstr ""
## From Ecto.Changeset.validate_format/3
msgid "has invalid format"
-msgstr ""
+msgstr "書式が正しくありません"
## From Ecto.Changeset.validate_subset/3
msgid "has an invalid entry"
msgstr ""
## From Ecto.Changeset.validate_exclusion/3
msgid "is reserved"
-msgstr ""
+msgstr "予約されています"
## From Ecto.Changeset.validate_confirmation/3
msgid "does not match confirmation"
msgstr ""
## From Ecto.Changeset.no_assoc_constraint/3
msgid "is still associated with this entry"
msgstr ""
msgid "are still associated with this entry"
msgstr ""
## From Ecto.Changeset.validate_length/3
msgid "should be %{count} character(s)"
msgid_plural "should be %{count} character(s)"
-msgstr[0] ""
+msgstr[0] "%{count}文字である必要があります"
msgid "should have %{count} item(s)"
msgid_plural "should have %{count} item(s)"
-msgstr[0] ""
+msgstr[0] "%{count}個の要素を含む必要があります"
msgid "should be at least %{count} character(s)"
msgid_plural "should be at least %{count} character(s)"
-msgstr[0] ""
+msgstr[0] "%{count}文字以上である必要があります"
msgid "should have at least %{count} item(s)"
msgid_plural "should have at least %{count} item(s)"
-msgstr[0] ""
+msgstr[0] "%{count}個以上の要素を含む必要があります"
msgid "should be at most %{count} character(s)"
msgid_plural "should be at most %{count} character(s)"
-msgstr[0] ""
+msgstr[0] "%{count}文字以下である必要があります"
msgid "should have at most %{count} item(s)"
msgid_plural "should have at most %{count} item(s)"
msgstr[0] ""
## From Ecto.Changeset.validate_number/3
msgid "must be less than %{number}"
msgstr "%{number}未満でなければなりません"
msgid "must be greater than %{number}"
msgstr "%{number}より大きくなければなりません"
msgid "must be less than or equal to %{number}"
msgstr "%{number}以下でなければなりません"
msgid "must be greater than or equal to %{number}"
msgstr "%{number}以上でなければなりません"
msgid "must be equal to %{number}"
msgstr "%{number}でなければなりません"
#: lib/pleroma/web/common_api/common_api.ex:505
#, elixir-format
msgid "Account not found"
-msgstr "アカウントがありません"
+msgstr "アカウントが見つかりません"
#: lib/pleroma/web/common_api/common_api.ex:339
#, elixir-format
msgid "Already voted"
msgstr "投票済みです"
#: lib/pleroma/web/oauth/oauth_controller.ex:359
#, elixir-format
msgid "Bad request"
msgstr ""
#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:426
#, elixir-format
msgid "Can't delete object"
msgstr "オブジェクトを削除できません"
#: lib/pleroma/web/controller_helper.ex:105
#: lib/pleroma/web/controller_helper.ex:111
#, elixir-format
msgid "Can't display this activity"
msgstr "このアクティビティを表示できません"
#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:285
#, elixir-format
msgid "Can't find user"
msgstr "ユーザーが見つかりません"
#: lib/pleroma/web/pleroma_api/controllers/account_controller.ex:61
#, elixir-format
msgid "Can't get favorites"
msgstr "お気に入りを取得できません"
#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:438
#, elixir-format
msgid "Can't like object"
msgstr ""
#: lib/pleroma/web/common_api/utils.ex:563
#, elixir-format
msgid "Cannot post an empty status without attachments"
msgstr "本文または添付ファイルを追加してください"
#: lib/pleroma/web/common_api/utils.ex:511
#, elixir-format
msgid "Comment must be up to %{max_size} characters"
msgstr ""
#: lib/pleroma/config/config_db.ex:191
#, elixir-format
msgid "Config with params %{params} not found"
msgstr ""
#: lib/pleroma/web/common_api/common_api.ex:181
#: lib/pleroma/web/common_api/common_api.ex:185
#, elixir-format
msgid "Could not delete"
msgstr "削除できません"
#: lib/pleroma/web/common_api/common_api.ex:231
#, elixir-format
msgid "Could not favorite"
msgstr "お気に入り登録できません"
#: lib/pleroma/web/common_api/common_api.ex:453
#, elixir-format
msgid "Could not pin"
msgstr "ピン留めできません"
#: lib/pleroma/web/common_api/common_api.ex:278
#, elixir-format
msgid "Could not unfavorite"
msgstr "お気に入り解除できません"
#: lib/pleroma/web/common_api/common_api.ex:463
#, elixir-format
msgid "Could not unpin"
msgstr "ピン留め解除できません"
#: lib/pleroma/web/common_api/common_api.ex:216
#, elixir-format
msgid "Could not unrepeat"
msgstr "リピート解除できません"
#: lib/pleroma/web/common_api/common_api.ex:512
#: lib/pleroma/web/common_api/common_api.ex:521
#, elixir-format
msgid "Could not update state"
msgstr "状態を更新できません"
#: lib/pleroma/web/mastodon_api/controllers/timeline_controller.ex:207
#, elixir-format
msgid "Error."
msgstr "エラー。"
#: lib/pleroma/web/twitter_api/twitter_api.ex:106
#, elixir-format
msgid "Invalid CAPTCHA"
msgstr "CAPTCHAが間違っています"
#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:116
#: lib/pleroma/web/oauth/oauth_controller.ex:568
#, elixir-format
msgid "Invalid credentials"
msgstr "認証情報が間違っています"
#: lib/pleroma/plugs/ensure_authenticated_plug.ex:38
#, elixir-format
msgid "Invalid credentials."
msgstr "認証情報が間違っています。"
#: lib/pleroma/web/common_api/common_api.ex:355
#, elixir-format
msgid "Invalid indices"
msgstr ""
#: lib/pleroma/web/admin_api/controllers/fallback_controller.ex:29
#, elixir-format
msgid "Invalid parameters"
msgstr ""
#: lib/pleroma/web/common_api/utils.ex:414
#, elixir-format
msgid "Invalid password."
-msgstr ""
+msgstr "パスワードが間違っています。"
#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:220
#, elixir-format
msgid "Invalid request"
msgstr ""
#: lib/pleroma/web/twitter_api/twitter_api.ex:109
#, elixir-format
msgid "Kocaptcha service unavailable"
-msgstr ""
+msgstr "Kocaptchaが利用できません"
#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:112
#, elixir-format
msgid "Missing parameters"
msgstr ""
#: lib/pleroma/web/common_api/utils.ex:547
#, elixir-format
msgid "No such conversation"
msgstr ""
#: lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:388
#: lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:414 lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:456
#, elixir-format
msgid "No such permission_group"
msgstr ""
#: lib/pleroma/plugs/uploaded_media.ex:84
#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:486 lib/pleroma/web/admin_api/controllers/fallback_controller.ex:11
#: lib/pleroma/web/feed/user_controller.ex:71 lib/pleroma/web/ostatus/ostatus_controller.ex:143
#, elixir-format
msgid "Not found"
-msgstr ""
+msgstr "見つかりません"
#: lib/pleroma/web/common_api/common_api.ex:331
#, elixir-format
msgid "Poll's author can't vote"
-msgstr ""
+msgstr "作成者は投票できません"
#: lib/pleroma/web/mastodon_api/controllers/fallback_controller.ex:20
#: lib/pleroma/web/mastodon_api/controllers/poll_controller.ex:37 lib/pleroma/web/mastodon_api/controllers/poll_controller.ex:49
#: lib/pleroma/web/mastodon_api/controllers/poll_controller.ex:50 lib/pleroma/web/mastodon_api/controllers/status_controller.ex:306
#: lib/pleroma/web/mastodon_api/controllers/subscription_controller.ex:71
#, elixir-format
msgid "Record not found"
msgstr ""
#: lib/pleroma/web/admin_api/controllers/fallback_controller.ex:35
#: lib/pleroma/web/feed/user_controller.ex:77 lib/pleroma/web/mastodon_api/controllers/fallback_controller.ex:36
#: lib/pleroma/web/ostatus/ostatus_controller.ex:149
#, elixir-format
msgid "Something went wrong"
msgstr ""
#: lib/pleroma/web/common_api/activity_draft.ex:107
#, elixir-format
msgid "The message visibility must be direct"
-msgstr ""
+msgstr "公開範囲は「ダイレクト」でなければいけません"
#: lib/pleroma/web/common_api/utils.ex:573
#, elixir-format
msgid "The status is over the character limit"
-msgstr ""
+msgstr "文字数制限を超過しています"
#: lib/pleroma/plugs/ensure_public_or_authenticated_plug.ex:31
#, elixir-format
msgid "This resource requires authentication."
-msgstr ""
+msgstr "認証が必要です。"
#: lib/pleroma/plugs/rate_limiter/rate_limiter.ex:206
#, elixir-format
msgid "Throttled"
msgstr ""
#: lib/pleroma/web/common_api/common_api.ex:356
#, elixir-format
msgid "Too many choices"
-msgstr ""
+msgstr "選択肢が多すぎます"
#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:443
#, elixir-format
msgid "Unhandled activity type"
msgstr ""
#: lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:485
#, elixir-format
msgid "You can't revoke your own admin status."
msgstr ""
#: lib/pleroma/web/oauth/oauth_controller.ex:221
#: lib/pleroma/web/oauth/oauth_controller.ex:308
#, elixir-format
msgid "Your account is currently disabled"
-msgstr ""
+msgstr "あなたのアカウントは無効化されています"
#: lib/pleroma/web/oauth/oauth_controller.ex:183
#: lib/pleroma/web/oauth/oauth_controller.ex:331
#, elixir-format
msgid "Your login is missing a confirmed e-mail address"
msgstr ""
#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:390
#, elixir-format
msgid "can't read inbox of %{nickname} as %{as_nickname}"
-msgstr ""
+msgstr "%{nickname}のinboxを%{as_nickname}として閲覧することはできません"
#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:473
#, elixir-format
msgid "can't update outbox of %{nickname} as %{as_nickname}"
-msgstr ""
+msgstr "%{nickname}のoutboxを%{as_nickname}として更新することはできません"
#: lib/pleroma/web/common_api/common_api.ex:471
#, elixir-format
msgid "conversation is already muted"
-msgstr ""
+msgstr "この会話はミュート済みです"
#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:314
#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:492
#, elixir-format
msgid "error"
-msgstr ""
+msgstr "エラー"
#: lib/pleroma/web/pleroma_api/controllers/mascot_controller.ex:32
#, elixir-format
msgid "mascots can only be images"
msgstr ""
#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:62
#, elixir-format
msgid "not found"
-msgstr ""
+msgstr "見つかりません"
#: lib/pleroma/web/oauth/oauth_controller.ex:394
#, elixir-format
msgid "Bad OAuth request."
msgstr ""
#: lib/pleroma/web/twitter_api/twitter_api.ex:115
#, elixir-format
msgid "CAPTCHA already used"
msgstr ""
#: lib/pleroma/web/twitter_api/twitter_api.ex:112
#, elixir-format
msgid "CAPTCHA expired"
msgstr ""
#: lib/pleroma/plugs/uploaded_media.ex:57
#, elixir-format
msgid "Failed"
msgstr ""
#: lib/pleroma/web/oauth/oauth_controller.ex:410
#, elixir-format
msgid "Failed to authenticate: %{message}."
-msgstr ""
+msgstr "認証に失敗しました: %{message}。"
#: lib/pleroma/web/oauth/oauth_controller.ex:441
#, elixir-format
msgid "Failed to set up user account."
-msgstr ""
+msgstr "ユーザーアカウントの設定に失敗しました。"
#: lib/pleroma/plugs/oauth_scopes_plug.ex:38
#, elixir-format
msgid "Insufficient permissions: %{permissions}."
-msgstr ""
+msgstr "%{permissions}の権限が必要です。"
#: lib/pleroma/plugs/uploaded_media.ex:104
#, elixir-format
msgid "Internal Error"
-msgstr ""
+msgstr "内部エラー"
#: lib/pleroma/web/oauth/fallback_controller.ex:22
#: lib/pleroma/web/oauth/fallback_controller.ex:29
#, elixir-format
msgid "Invalid Username/Password"
-msgstr ""
+msgstr "ユーザー名/パスワードが間違っています"
#: lib/pleroma/web/twitter_api/twitter_api.ex:118
#, elixir-format
msgid "Invalid answer data"
msgstr ""
#: lib/pleroma/web/nodeinfo/nodeinfo_controller.ex:33
#, elixir-format
msgid "Nodeinfo schema version not handled"
msgstr ""
#: lib/pleroma/web/oauth/oauth_controller.ex:172
#, elixir-format
msgid "This action is outside the authorized scopes"
msgstr ""
#: lib/pleroma/web/oauth/fallback_controller.ex:14
#, elixir-format
msgid "Unknown error, please check the details and try again."
-msgstr ""
+msgstr "不明なエラーです。詳細を確認し、再度試行してください。"
#: lib/pleroma/web/oauth/oauth_controller.ex:119
#: lib/pleroma/web/oauth/oauth_controller.ex:158
#, elixir-format
msgid "Unlisted redirect_uri."
msgstr ""
#: lib/pleroma/web/oauth/oauth_controller.ex:390
#, elixir-format
msgid "Unsupported OAuth provider: %{provider}."
msgstr ""
#: lib/pleroma/uploaders/uploader.ex:72
#, elixir-format
msgid "Uploader callback timeout"
msgstr ""
#: lib/pleroma/web/uploader_controller.ex:23
#, elixir-format
msgid "bad request"
msgstr ""
#: lib/pleroma/web/twitter_api/twitter_api.ex:103
#, elixir-format
msgid "CAPTCHA Error"
-msgstr ""
+msgstr "CAPTCHAエラー"
#: lib/pleroma/web/common_api/common_api.ex:290
#, elixir-format
msgid "Could not add reaction emoji"
-msgstr ""
+msgstr "リアクションを追加できません"
#: lib/pleroma/web/common_api/common_api.ex:301
#, elixir-format
msgid "Could not remove reaction emoji"
-msgstr ""
+msgstr "リアクションを解除できません"
#: lib/pleroma/web/twitter_api/twitter_api.ex:129
#, elixir-format
msgid "Invalid CAPTCHA (Missing parameter: %{name})"
msgstr ""
#: lib/pleroma/web/mastodon_api/controllers/list_controller.ex:92
#, elixir-format
msgid "List not found"
-msgstr ""
+msgstr "リストが見つかりません"
#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:123
#, elixir-format
msgid "Missing parameter: %{name}"
msgstr ""
#: lib/pleroma/web/oauth/oauth_controller.ex:210
#: lib/pleroma/web/oauth/oauth_controller.ex:321
#, elixir-format
msgid "Password reset is required"
-msgstr ""
+msgstr "パスワードのリセットが必要です"
#: lib/pleroma/tests/auth_test_controller.ex:9
#: lib/pleroma/web/activity_pub/activity_pub_controller.ex:6 lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:6
#: lib/pleroma/web/admin_api/controllers/config_controller.ex:6 lib/pleroma/web/admin_api/controllers/fallback_controller.ex:6
#: lib/pleroma/web/admin_api/controllers/invite_controller.ex:6 lib/pleroma/web/admin_api/controllers/media_proxy_cache_controller.ex:6
#: lib/pleroma/web/admin_api/controllers/oauth_app_controller.ex:6 lib/pleroma/web/admin_api/controllers/relay_controller.ex:6
#: lib/pleroma/web/admin_api/controllers/report_controller.ex:6 lib/pleroma/web/admin_api/controllers/status_controller.ex:6
#: lib/pleroma/web/controller_helper.ex:6 lib/pleroma/web/embed_controller.ex:6
#: lib/pleroma/web/fallback_redirect_controller.ex:6 lib/pleroma/web/feed/tag_controller.ex:6
#: lib/pleroma/web/feed/user_controller.ex:6 lib/pleroma/web/mailer/subscription_controller.ex:2
#: lib/pleroma/web/masto_fe_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/account_controller.ex:6
#: lib/pleroma/web/mastodon_api/controllers/app_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/auth_controller.ex:6
#: lib/pleroma/web/mastodon_api/controllers/conversation_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/custom_emoji_controller.ex:6
#: lib/pleroma/web/mastodon_api/controllers/domain_block_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/fallback_controller.ex:6
#: lib/pleroma/web/mastodon_api/controllers/filter_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/follow_request_controller.ex:6
#: lib/pleroma/web/mastodon_api/controllers/instance_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/list_controller.ex:6
#: lib/pleroma/web/mastodon_api/controllers/marker_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/mastodon_api_controller.ex:14
#: lib/pleroma/web/mastodon_api/controllers/media_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/notification_controller.ex:6
#: lib/pleroma/web/mastodon_api/controllers/poll_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/report_controller.ex:8
#: lib/pleroma/web/mastodon_api/controllers/scheduled_activity_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/search_controller.ex:6
#: lib/pleroma/web/mastodon_api/controllers/status_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/subscription_controller.ex:7
#: lib/pleroma/web/mastodon_api/controllers/suggestion_controller.ex:6 lib/pleroma/web/mastodon_api/controllers/timeline_controller.ex:6
#: lib/pleroma/web/media_proxy/media_proxy_controller.ex:6 lib/pleroma/web/mongooseim/mongoose_im_controller.ex:6
#: lib/pleroma/web/nodeinfo/nodeinfo_controller.ex:6 lib/pleroma/web/oauth/fallback_controller.ex:6
#: lib/pleroma/web/oauth/mfa_controller.ex:10 lib/pleroma/web/oauth/oauth_controller.ex:6
#: lib/pleroma/web/ostatus/ostatus_controller.ex:6 lib/pleroma/web/pleroma_api/controllers/account_controller.ex:6
#: lib/pleroma/web/pleroma_api/controllers/chat_controller.ex:5 lib/pleroma/web/pleroma_api/controllers/conversation_controller.ex:6
#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:2 lib/pleroma/web/pleroma_api/controllers/emoji_reaction_controller.ex:6
#: lib/pleroma/web/pleroma_api/controllers/mascot_controller.ex:6 lib/pleroma/web/pleroma_api/controllers/notification_controller.ex:6
#: lib/pleroma/web/pleroma_api/controllers/scrobble_controller.ex:6
#: lib/pleroma/web/pleroma_api/controllers/two_factor_authentication_controller.ex:7 lib/pleroma/web/static_fe/static_fe_controller.ex:6
#: lib/pleroma/web/twitter_api/controllers/password_controller.ex:10 lib/pleroma/web/twitter_api/controllers/remote_follow_controller.ex:6
#: lib/pleroma/web/twitter_api/controllers/util_controller.ex:6 lib/pleroma/web/twitter_api/twitter_api_controller.ex:6
#: lib/pleroma/web/uploader_controller.ex:6 lib/pleroma/web/web_finger/web_finger_controller.ex:6
#, elixir-format
msgid "Security violation: OAuth scopes check was neither handled nor explicitly skipped."
msgstr ""
#: lib/pleroma/plugs/ensure_authenticated_plug.ex:28
#, elixir-format
msgid "Two-factor authentication enabled, you must use a access token."
-msgstr ""
+msgstr "二段階認証が有効になっています。アクセストークンが必要です。"
#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:210
#, elixir-format
msgid "Unexpected error occurred while adding file to pack."
msgstr ""
#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:138
#, elixir-format
msgid "Unexpected error occurred while creating pack."
msgstr ""
#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:278
#, elixir-format
msgid "Unexpected error occurred while removing file from pack."
msgstr ""
#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:250
#, elixir-format
msgid "Unexpected error occurred while updating file in pack."
msgstr ""
#: lib/pleroma/web/pleroma_api/controllers/emoji_pack_controller.ex:179
#, elixir-format
msgid "Unexpected error occurred while updating pack metadata."
msgstr ""
#: lib/pleroma/web/mastodon_api/controllers/subscription_controller.ex:61
#, elixir-format
msgid "Web push subscription is disabled on this Pleroma instance"
-msgstr ""
+msgstr "このPleromaインスタンスではプッシュ通知が利用できません"
#: lib/pleroma/web/admin_api/controllers/admin_api_controller.ex:451
#, elixir-format
msgid "You can't revoke your own admin/moderator status."
msgstr ""
#: lib/pleroma/web/mastodon_api/controllers/timeline_controller.ex:126
#, elixir-format
msgid "authorization required for timeline view"
-msgstr ""
+msgstr "タイムラインを閲覧するには認証が必要です"
#: lib/pleroma/web/mastodon_api/controllers/fallback_controller.ex:24
#, elixir-format
msgid "Access denied"
-msgstr ""
+msgstr "アクセスが拒否されました"
#: lib/pleroma/web/mastodon_api/controllers/account_controller.ex:282
#, elixir-format
msgid "This API requires an authenticated user"
-msgstr ""
+msgstr "このAPIを利用するには認証が必要です"
#: lib/pleroma/plugs/user_is_admin_plug.ex:21
#, elixir-format
msgid "User is not an admin."
-msgstr ""
+msgstr "ユーザーは管理者ではありません。"
diff --git a/test/fixtures/ccworld-ap-bridge_note.json b/test/fixtures/ccworld-ap-bridge_note.json
new file mode 100644
index 000000000..4b13b4bc1
--- /dev/null
+++ b/test/fixtures/ccworld-ap-bridge_note.json
@@ -0,0 +1 @@
+{"@context":"https://www.w3.org/ns/activitystreams","type":"Note","id":"https://cc.mkdir.uk/ap/note/e5d1d0a1-1ab3-4498-9949-588e3fdea286","attributedTo":"https://cc.mkdir.uk/ap/acct/hiira","inReplyTo":"","quoteUrl":"","content":"おはコンー","published":"2024-01-19T22:08:05Z","to":["https://www.w3.org/ns/activitystreams#Public"],"tag":null,"attachment":[],"object":null}
diff --git a/test/fixtures/rich_media/google.html b/test/fixtures/rich_media/google.html
new file mode 100644
index 000000000..c068397a5
--- /dev/null
+++ b/test/fixtures/rich_media/google.html
@@ -0,0 +1,12 @@
+<meta property="og:url" content="https://google.com">
+<meta property="og:type" content="website">
+<meta property="og:title" content="Google">
+<meta property="og:description" content="Search the world's information, including webpages, images, videos and more. Google has many special features to help you find exactly what you're looking for.">
+<meta property="og:image" content="">
+
+<meta name="twitter:card" content="summary_large_image">
+<meta property="twitter:domain" content="google.com">
+<meta property="twitter:url" content="https://google.com">
+<meta name="twitter:title" content="Google">
+<meta name="twitter:description" content="Search the world's information, including webpages, images, videos and more. Google has many special features to help you find exactly what you're looking for.">
+<meta name="twitter:image" content="">
diff --git a/test/fixtures/rich_media/oembed.html b/test/fixtures/rich_media/oembed.html
index 55f17004b..5429630d0 100644
--- a/test/fixtures/rich_media/oembed.html
+++ b/test/fixtures/rich_media/oembed.html
@@ -1,3 +1,3 @@
<link rel="alternate" type="application/json+oembed"
- href="http://example.com/oembed.json"
+ href="https://example.com/oembed.json"
title="Bacon Lollys oEmbed Profile" />
diff --git a/test/fixtures/rich_media/yahoo.html b/test/fixtures/rich_media/yahoo.html
new file mode 100644
index 000000000..41d8c5cd9
--- /dev/null
+++ b/test/fixtures/rich_media/yahoo.html
@@ -0,0 +1,12 @@
+<meta property="og:url" content="https://yahoo.com">
+<meta property="og:type" content="website">
+<meta property="og:title" content="Yahoo | Mail, Weather, Search, Politics, News, Finance, Sports & Videos">
+<meta property="og:description" content="Latest news coverage, email, free stock quotes, live scores and video are just the beginning. Discover more every day at Yahoo!">
+<meta property="og:image" content="https://s.yimg.com/cv/apiv2/social/images/yahoo_default_logo.png">
+
+<meta name="twitter:card" content="summary_large_image">
+<meta property="twitter:domain" content="yahoo.com">
+<meta property="twitter:url" content="https://yahoo.com">
+<meta name="twitter:title" content="Yahoo | Mail, Weather, Search, Politics, News, Finance, Sports & Videos">
+<meta name="twitter:description" content="Latest news coverage, email, free stock quotes, live scores and video are just the beginning. Discover more every day at Yahoo!">
+<meta name="twitter:image" content="https://s.yimg.com/cv/apiv2/social/images/yahoo_default_logo.png">
diff --git a/test/pleroma/integration/mastodon_websocket_test.exs b/test/pleroma/integration/mastodon_websocket_test.exs
index a2c20f0a6..a0ffddf8d 100644
--- a/test/pleroma/integration/mastodon_websocket_test.exs
+++ b/test/pleroma/integration/mastodon_websocket_test.exs
@@ -1,496 +1,485 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Integration.MastodonWebsocketTest do
# Needs a streamer, needs to stay synchronous
use Pleroma.DataCase
import ExUnit.CaptureLog
import Pleroma.Factory
alias Pleroma.Integration.WebsocketClient
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.OAuth
@moduletag needs_streamer: true, capture_log: true
@path Pleroma.Web.Endpoint.url()
|> URI.parse()
|> Map.put(:scheme, "ws")
|> Map.put(:path, "/api/v1/streaming")
|> URI.to_string()
def start_socket(qs \\ nil, headers \\ []) do
path =
case qs do
nil -> @path
qs -> @path <> qs
end
WebsocketClient.start_link(self(), path, headers)
end
defp decode_json(json) do
with {:ok, %{"event" => event, "payload" => payload_text}} <- Jason.decode(json),
{:ok, payload} <- Jason.decode(payload_text) do
{:ok, %{"event" => event, "payload" => payload}}
end
end
# Turns atom keys to strings
defp atom_key_to_string(json) do
json
|> Jason.encode!()
|> Jason.decode!()
end
test "refuses invalid requests" do
capture_log(fn ->
assert {:error, %WebSockex.RequestError{code: 404}} = start_socket("?stream=ncjdk")
Process.sleep(30)
end)
end
test "requires authentication and a valid token for protected streams" do
capture_log(fn ->
assert {:error, %WebSockex.RequestError{code: 401}} =
start_socket("?stream=user&access_token=aaaaaaaaaaaa")
assert {:error, %WebSockex.RequestError{code: 401}} = start_socket("?stream=user")
Process.sleep(30)
end)
end
test "allows unified stream" do
assert {:ok, _} = start_socket()
end
test "allows public streams without authentication" do
assert {:ok, _} = start_socket("?stream=public")
assert {:ok, _} = start_socket("?stream=public:local")
assert {:ok, _} = start_socket("?stream=public:remote&instance=lain.com")
assert {:ok, _} = start_socket("?stream=hashtag&tag=lain")
end
test "receives well formatted events" do
user = insert(:user)
{:ok, _} = start_socket("?stream=public")
{:ok, activity} = CommonAPI.post(user, %{status: "nice echo chamber"})
assert_receive {:text, raw_json}, 1_000
assert {:ok, json} = Jason.decode(raw_json)
assert "update" == json["event"]
assert json["payload"]
assert {:ok, json} = Jason.decode(json["payload"])
view_json =
Pleroma.Web.MastodonAPI.StatusView.render("show.json", activity: activity, for: nil)
|> atom_key_to_string()
assert json == view_json
end
describe "subscribing via WebSocket" do
test "can subscribe" do
user = insert(:user)
{:ok, pid} = start_socket()
WebsocketClient.send_text(pid, %{type: "subscribe", stream: "public"} |> Jason.encode!())
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "subscribe", "result" => "success"}
}} = decode_json(raw_json)
{:ok, activity} = CommonAPI.post(user, %{status: "nice echo chamber"})
assert_receive {:text, raw_json}, 1_000
assert {:ok, json} = Jason.decode(raw_json)
assert "update" == json["event"]
assert json["payload"]
assert {:ok, json} = Jason.decode(json["payload"])
view_json =
Pleroma.Web.MastodonAPI.StatusView.render("show.json", activity: activity, for: nil)
|> Jason.encode!()
|> Jason.decode!()
assert json == view_json
end
test "can subscribe to multiple streams" do
user = insert(:user)
{:ok, pid} = start_socket()
WebsocketClient.send_text(pid, %{type: "subscribe", stream: "public"} |> Jason.encode!())
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "subscribe", "result" => "success"}
}} = decode_json(raw_json)
WebsocketClient.send_text(
pid,
%{type: "subscribe", stream: "hashtag", tag: "mew"} |> Jason.encode!()
)
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "subscribe", "result" => "success"}
}} = decode_json(raw_json)
{:ok, _activity} = CommonAPI.post(user, %{status: "nice echo chamber #mew"})
assert_receive {:text, raw_json}, 1_000
assert {:ok, %{"stream" => stream1}} = Jason.decode(raw_json)
assert_receive {:text, raw_json}, 1_000
assert {:ok, %{"stream" => stream2}} = Jason.decode(raw_json)
streams = [stream1, stream2]
assert ["hashtag", "mew"] in streams
assert ["public"] in streams
end
test "won't double subscribe" do
user = insert(:user)
{:ok, pid} = start_socket()
WebsocketClient.send_text(pid, %{type: "subscribe", stream: "public"} |> Jason.encode!())
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "subscribe", "result" => "success"}
}} = decode_json(raw_json)
WebsocketClient.send_text(pid, %{type: "subscribe", stream: "public"} |> Jason.encode!())
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "subscribe", "result" => "ignored"}
}} = decode_json(raw_json)
{:ok, _activity} = CommonAPI.post(user, %{status: "nice echo chamber"})
assert_receive {:text, _}, 1_000
refute_receive {:text, _}, 1_000
end
test "rejects invalid streams" do
{:ok, pid} = start_socket()
WebsocketClient.send_text(pid, %{type: "subscribe", stream: "nonsense"} |> Jason.encode!())
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "subscribe", "result" => "error", "error" => "bad_topic"}
}} = decode_json(raw_json)
end
test "can unsubscribe" do
user = insert(:user)
{:ok, pid} = start_socket()
WebsocketClient.send_text(pid, %{type: "subscribe", stream: "public"} |> Jason.encode!())
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "subscribe", "result" => "success"}
}} = decode_json(raw_json)
WebsocketClient.send_text(pid, %{type: "unsubscribe", stream: "public"} |> Jason.encode!())
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "unsubscribe", "result" => "success"}
}} = decode_json(raw_json)
{:ok, _activity} = CommonAPI.post(user, %{status: "nice echo chamber"})
refute_receive {:text, _}, 1_000
end
end
describe "with a valid user token" do
setup do
{:ok, app} =
Pleroma.Repo.insert(
OAuth.App.register_changeset(%OAuth.App{}, %{
client_name: "client",
scopes: ["read"],
redirect_uris: "url"
})
)
user = insert(:user)
{:ok, auth} = OAuth.Authorization.create_authorization(app, user)
{:ok, token} = OAuth.Token.exchange_token(app, auth)
%{app: app, user: user, token: token}
end
test "accepts valid tokens", state do
assert {:ok, _} = start_socket("?stream=user&access_token=#{state.token.token}")
end
test "accepts the 'user' stream", %{token: token} = _state do
assert {:ok, _} = start_socket("?stream=user&access_token=#{token.token}")
capture_log(fn ->
assert {:error, %WebSockex.RequestError{code: 401}} = start_socket("?stream=user")
Process.sleep(30)
end)
end
test "accepts the 'user:notification' stream", %{token: token} = _state do
assert {:ok, _} = start_socket("?stream=user:notification&access_token=#{token.token}")
capture_log(fn ->
assert {:error, %WebSockex.RequestError{code: 401}} =
start_socket("?stream=user:notification")
Process.sleep(30)
end)
end
- test "accepts valid token on Sec-WebSocket-Protocol header", %{token: token} do
- assert {:ok, _} = start_socket("?stream=user", [{"Sec-WebSocket-Protocol", token.token}])
-
- capture_log(fn ->
- assert {:error, %WebSockex.RequestError{code: 401}} =
- start_socket("?stream=user", [{"Sec-WebSocket-Protocol", "I am a friend"}])
-
- Process.sleep(30)
- end)
- end
-
test "accepts valid token on client-sent event", %{token: token} do
assert {:ok, pid} = start_socket()
WebsocketClient.send_text(
pid,
%{type: "pleroma:authenticate", token: token.token} |> Jason.encode!()
)
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "pleroma:authenticate", "result" => "success"}
}} = decode_json(raw_json)
WebsocketClient.send_text(pid, %{type: "subscribe", stream: "user"} |> Jason.encode!())
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "subscribe", "result" => "success"}
}} = decode_json(raw_json)
end
test "rejects invalid token on client-sent event" do
assert {:ok, pid} = start_socket()
WebsocketClient.send_text(
pid,
%{type: "pleroma:authenticate", token: "Something else"} |> Jason.encode!()
)
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{
"type" => "pleroma:authenticate",
"result" => "error",
"error" => "unauthorized"
}
}} = decode_json(raw_json)
end
test "rejects new authenticate request if already logged-in", %{token: token} do
assert {:ok, pid} = start_socket()
WebsocketClient.send_text(
pid,
%{type: "pleroma:authenticate", token: token.token} |> Jason.encode!()
)
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "pleroma:authenticate", "result" => "success"}
}} = decode_json(raw_json)
WebsocketClient.send_text(
pid,
%{type: "pleroma:authenticate", token: "Something else"} |> Jason.encode!()
)
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{
"type" => "pleroma:authenticate",
"result" => "error",
"error" => "already_authenticated"
}
}} = decode_json(raw_json)
end
test "accepts the 'list' stream", %{token: token, user: user} do
posting_user = insert(:user)
{:ok, list} = Pleroma.List.create("test", user)
Pleroma.List.follow(list, posting_user)
assert {:ok, _} = start_socket("?stream=list&access_token=#{token.token}&list=#{list.id}")
assert {:ok, pid} = start_socket("?access_token=#{token.token}")
WebsocketClient.send_text(
pid,
%{type: "subscribe", stream: "list", list: list.id} |> Jason.encode!()
)
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "subscribe", "result" => "success"}
}} = decode_json(raw_json)
WebsocketClient.send_text(
pid,
%{type: "subscribe", stream: "list", list: to_string(list.id)} |> Jason.encode!()
)
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "pleroma:respond",
"payload" => %{"type" => "subscribe", "result" => "ignored"}
}} = decode_json(raw_json)
end
test "disconnect when token is revoked", %{app: app, user: user, token: token} do
assert {:ok, _} = start_socket("?stream=user:notification&access_token=#{token.token}")
assert {:ok, _} = start_socket("?stream=user&access_token=#{token.token}")
{:ok, auth} = OAuth.Authorization.create_authorization(app, user)
{:ok, token2} = OAuth.Token.exchange_token(app, auth)
assert {:ok, _} = start_socket("?stream=user&access_token=#{token2.token}")
OAuth.Token.Strategy.Revoke.revoke(token)
assert_receive {:close, _}
assert_receive {:close, _}
refute_receive {:close, _}
end
test "receives private statuses", %{user: reading_user, token: token} do
user = insert(:user)
CommonAPI.follow(reading_user, user)
{:ok, _} = start_socket("?stream=user&access_token=#{token.token}")
{:ok, activity} =
CommonAPI.post(user, %{status: "nice echo chamber", visibility: "private"})
assert_receive {:text, raw_json}, 1_000
assert {:ok, json} = Jason.decode(raw_json)
assert "update" == json["event"]
assert json["payload"]
assert {:ok, json} = Jason.decode(json["payload"])
view_json =
Pleroma.Web.MastodonAPI.StatusView.render("show.json",
activity: activity,
for: reading_user
)
|> Jason.encode!()
|> Jason.decode!()
assert json == view_json
end
test "receives edits", %{user: reading_user, token: token} do
user = insert(:user)
CommonAPI.follow(reading_user, user)
{:ok, _} = start_socket("?stream=user&access_token=#{token.token}")
{:ok, activity} =
CommonAPI.post(user, %{status: "nice echo chamber", visibility: "private"})
assert_receive {:text, _raw_json}, 1_000
{:ok, _} = CommonAPI.update(user, activity, %{status: "mew mew", visibility: "private"})
assert_receive {:text, raw_json}, 1_000
activity = Pleroma.Activity.normalize(activity)
view_json =
Pleroma.Web.MastodonAPI.StatusView.render("show.json",
activity: activity,
for: reading_user
)
|> Jason.encode!()
|> Jason.decode!()
assert {:ok, %{"event" => "status.update", "payload" => ^view_json}} = decode_json(raw_json)
end
test "receives notifications", %{user: reading_user, token: token} do
user = insert(:user)
CommonAPI.follow(reading_user, user)
{:ok, _} = start_socket("?stream=user:notification&access_token=#{token.token}")
{:ok, %Pleroma.Activity{id: activity_id} = _activity} =
CommonAPI.post(user, %{
status: "nice echo chamber @#{reading_user.nickname}",
visibility: "private"
})
assert_receive {:text, raw_json}, 1_000
assert {:ok,
%{
"event" => "notification",
"payload" => %{
"status" => %{
"id" => ^activity_id
}
}
}} = decode_json(raw_json)
end
end
end
diff --git a/test/pleroma/maps_test.exs b/test/pleroma/maps_test.exs
new file mode 100644
index 000000000..05f1b18b2
--- /dev/null
+++ b/test/pleroma/maps_test.exs
@@ -0,0 +1,22 @@
+# Pleroma: A lightweight social networking server
+# Copyright © 2024 Pleroma Authors <https://pleroma.social/>
+# SPDX-License-Identifier: AGPL-3.0-only
+
+defmodule Pleroma.MapsTest do
+ use Pleroma.DataCase, async: true
+
+ alias Pleroma.Maps
+
+ describe "filter_empty_values/1" do
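+    # filter_empty_values/1 should drop keys whose values are nil, "", [] or %{}
+    # and keep keys with non-empty values.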
+    test "drops nil, empty string, empty list and empty map values" do
+      assert %{"bar" => "b", "ray" => ["foo"], "objs" => %{"a" => "b"}} ==
+               Maps.filter_empty_values(%{
+                 "foo" => nil,
+                 "fooz" => "",
+                 "bar" => "b",
+                 "rei" => [],
+                 "ray" => ["foo"],
+                 "obj" => %{},
+                 "objs" => %{"a" => "b"}
+               })
+    end
+ end
+end
diff --git a/test/pleroma/web/activity_pub/mrf/steal_emoji_policy_test.exs b/test/pleroma/web/activity_pub/mrf/steal_emoji_policy_test.exs
index c477a093d..2c7497da5 100644
--- a/test/pleroma/web/activity_pub/mrf/steal_emoji_policy_test.exs
+++ b/test/pleroma/web/activity_pub/mrf/steal_emoji_policy_test.exs
@@ -1,149 +1,175 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.MRF.StealEmojiPolicyTest do
use Pleroma.DataCase
alias Pleroma.Config
alias Pleroma.Emoji
alias Pleroma.Web.ActivityPub.MRF.StealEmojiPolicy
setup do
emoji_path = [:instance, :static_dir] |> Config.get() |> Path.join("emoji/stolen")
Emoji.reload()
message = %{
"type" => "Create",
"object" => %{
"emoji" => [{"firedfox", "https://example.org/emoji/firedfox.png"}],
"actor" => "https://example.org/users/admin"
}
}
on_exit(fn ->
File.rm_rf!(emoji_path)
end)
[message: message, path: emoji_path]
end
test "does nothing by default", %{message: message} do
refute "firedfox" in installed()
assert {:ok, _message} = StealEmojiPolicy.filter(message)
refute "firedfox" in installed()
end
test "Steals emoji on unknown shortcode from allowed remote host", %{
message: message,
path: path
} do
refute "firedfox" in installed()
refute File.exists?(path)
Tesla.Mock.mock(fn %{method: :get, url: "https://example.org/emoji/firedfox.png"} ->
%Tesla.Env{status: 200, body: File.read!("test/fixtures/image.jpg")}
end)
clear_config(:mrf_steal_emoji, hosts: ["example.org"], size_limit: 284_468)
assert {:ok, _message} = StealEmojiPolicy.filter(message)
assert "firedfox" in installed()
assert File.exists?(path)
assert path
|> Path.join("firedfox.png")
|> File.exists?()
end
test "works with unknown extension", %{path: path} do
message = %{
"type" => "Create",
"object" => %{
"emoji" => [{"firedfox", "https://example.org/emoji/firedfox"}],
"actor" => "https://example.org/users/admin"
}
}
fullpath = Path.join(path, "firedfox.png")
Tesla.Mock.mock(fn %{method: :get, url: "https://example.org/emoji/firedfox"} ->
%Tesla.Env{status: 200, body: File.read!("test/fixtures/image.jpg")}
end)
clear_config(:mrf_steal_emoji, hosts: ["example.org"], size_limit: 284_468)
refute "firedfox" in installed()
refute File.exists?(path)
assert {:ok, _message} = StealEmojiPolicy.filter(message)
assert "firedfox" in installed()
assert File.exists?(path)
assert File.exists?(fullpath)
end
+ test "rejects invalid shortcodes", %{path: path} do
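+    # "fired/fox" contains a path separator; a sanitized shortcode must neither
+    # be installed nor written to disk.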
+ message = %{
+ "type" => "Create",
+ "object" => %{
+ "emoji" => [{"fired/fox", "https://example.org/emoji/firedfox"}],
+ "actor" => "https://example.org/users/admin"
+ }
+ }
+
+ fullpath = Path.join(path, "fired/fox.png")
+
+ Tesla.Mock.mock(fn %{method: :get, url: "https://example.org/emoji/firedfox"} ->
+ %Tesla.Env{status: 200, body: File.read!("test/fixtures/image.jpg")}
+ end)
+
+ clear_config(:mrf_steal_emoji, hosts: ["example.org"], size_limit: 284_468)
+
+ refute "firedfox" in installed()
+ refute File.exists?(path)
+
+ assert {:ok, _message} = StealEmojiPolicy.filter(message)
+
+ refute "fired/fox" in installed()
+ refute File.exists?(fullpath)
+ end
+
test "reject regex shortcode", %{message: message} do
refute "firedfox" in installed()
clear_config(:mrf_steal_emoji,
hosts: ["example.org"],
size_limit: 284_468,
rejected_shortcodes: [~r/firedfox/]
)
assert {:ok, _message} = StealEmojiPolicy.filter(message)
refute "firedfox" in installed()
end
test "reject string shortcode", %{message: message} do
refute "firedfox" in installed()
clear_config(:mrf_steal_emoji,
hosts: ["example.org"],
size_limit: 284_468,
rejected_shortcodes: ["firedfox"]
)
assert {:ok, _message} = StealEmojiPolicy.filter(message)
refute "firedfox" in installed()
end
test "reject if size is above the limit", %{message: message} do
refute "firedfox" in installed()
Tesla.Mock.mock(fn %{method: :get, url: "https://example.org/emoji/firedfox.png"} ->
%Tesla.Env{status: 200, body: File.read!("test/fixtures/image.jpg")}
end)
clear_config(:mrf_steal_emoji, hosts: ["example.org"], size_limit: 50_000)
assert {:ok, _message} = StealEmojiPolicy.filter(message)
refute "firedfox" in installed()
end
test "reject if host returns error", %{message: message} do
refute "firedfox" in installed()
Tesla.Mock.mock(fn %{method: :get, url: "https://example.org/emoji/firedfox.png"} ->
{:ok, %Tesla.Env{status: 404, body: "Not found"}}
end)
clear_config(:mrf_steal_emoji, hosts: ["example.org"], size_limit: 284_468)
    assert ExUnit.CaptureLog.capture_log(fn ->
             assert {:ok, _message} = StealEmojiPolicy.filter(message)
           end) =~ "MRF.StealEmojiPolicy: Failed to fetch https://example.org/emoji/firedfox.png"
refute "firedfox" in installed()
end
defp installed, do: Emoji.get_all() |> Enum.map(fn {k, _} -> k end)
end
diff --git a/test/pleroma/web/activity_pub/object_validators/article_note_page_validator_test.exs b/test/pleroma/web/activity_pub/object_validators/article_note_page_validator_test.exs
index 4703c3801..2b33950d6 100644
--- a/test/pleroma/web/activity_pub/object_validators/article_note_page_validator_test.exs
+++ b/test/pleroma/web/activity_pub/object_validators/article_note_page_validator_test.exs
@@ -1,168 +1,179 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.ObjectValidators.ArticleNotePageValidatorTest do
use Pleroma.DataCase, async: true
alias Pleroma.Web.ActivityPub.ObjectValidator
alias Pleroma.Web.ActivityPub.ObjectValidators.ArticleNotePageValidator
alias Pleroma.Web.ActivityPub.Utils
import Pleroma.Factory
describe "Notes" do
setup do
user = insert(:user)
note = %{
"id" => Utils.generate_activity_id(),
"type" => "Note",
"actor" => user.ap_id,
"to" => [user.follower_address],
"cc" => [],
"content" => "Hellow this is content.",
"context" => "xxx",
"summary" => "a post"
}
%{user: user, note: note}
end
test "a basic note validates", %{note: note} do
%{valid?: true} = ArticleNotePageValidator.cast_and_validate(note)
end
test "a note from factory validates" do
note = insert(:note)
%{valid?: true} = ArticleNotePageValidator.cast_and_validate(note.data)
end
end
describe "Note with history" do
setup do
user = insert(:user)
{:ok, activity} = Pleroma.Web.CommonAPI.post(user, %{status: "mew mew :dinosaur:"})
{:ok, edit} = Pleroma.Web.CommonAPI.update(user, activity, %{status: "edited :blank:"})
{:ok, %{"object" => external_rep}} =
Pleroma.Web.ActivityPub.Transmogrifier.prepare_outgoing(edit.data)
%{external_rep: external_rep}
end
test "edited note", %{external_rep: external_rep} do
assert %{"formerRepresentations" => %{"orderedItems" => [%{"tag" => [_]}]}} = external_rep
{:ok, validate_res, []} = ObjectValidator.validate(external_rep, [])
assert %{"formerRepresentations" => %{"orderedItems" => [%{"emoji" => %{"dinosaur" => _}}]}} =
validate_res
end
test "edited note, badly-formed formerRepresentations", %{external_rep: external_rep} do
external_rep = Map.put(external_rep, "formerRepresentations", %{})
assert {:error, _} = ObjectValidator.validate(external_rep, [])
end
test "edited note, badly-formed history item", %{external_rep: external_rep} do
history_item =
Enum.at(external_rep["formerRepresentations"]["orderedItems"], 0)
|> Map.put("type", "Foo")
external_rep =
put_in(
external_rep,
["formerRepresentations", "orderedItems"],
[history_item]
)
assert {:error, _} = ObjectValidator.validate(external_rep, [])
end
end
test "a Note from Roadhouse validates" do
insert(:user, ap_id: "https://macgirvin.com/channel/mike")
%{"object" => note} =
"test/fixtures/roadhouse-create-activity.json"
|> File.read!()
|> Jason.decode!()
%{valid?: true} = ArticleNotePageValidator.cast_and_validate(note)
end
+ test "a Note from Convergence AP Bridge validates" do
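+    # The fixture uses empty-string "inReplyTo"/"quoteUrl" and a null "tag",
+    # which the validator should tolerate.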
+ insert(:user, ap_id: "https://cc.mkdir.uk/ap/acct/hiira")
+
+ note =
+ "test/fixtures/ccworld-ap-bridge_note.json"
+ |> File.read!()
+ |> Jason.decode!()
+
+ %{valid?: true} = ArticleNotePageValidator.cast_and_validate(note)
+ end
+
test "a note with an attachment should work", _ do
insert(:user, %{ap_id: "https://owncast.localhost.localdomain/federation/user/streamer"})
note =
"test/fixtures/owncast-note-with-attachment.json"
|> File.read!()
|> Jason.decode!()
%{valid?: true} = ArticleNotePageValidator.cast_and_validate(note)
end
test "a Note without replies/first/items validates" do
insert(:user, ap_id: "https://mastodon.social/users/emelie")
note =
"test/fixtures/tesla_mock/status.emelie.json"
|> File.read!()
|> Jason.decode!()
|> pop_in(["replies", "first", "items"])
|> elem(1)
%{valid?: true} = ArticleNotePageValidator.cast_and_validate(note)
end
test "Fedibird quote post" do
insert(:user, ap_id: "https://fedibird.com/users/noellabo")
data = File.read!("test/fixtures/quote_post/fedibird_quote_post.json") |> Jason.decode!()
cng = ArticleNotePageValidator.cast_and_validate(data)
assert cng.valid?
assert cng.changes.quoteUrl == "https://misskey.io/notes/8vsn2izjwh"
end
test "Fedibird quote post with quoteUri field" do
insert(:user, ap_id: "https://fedibird.com/users/noellabo")
data = File.read!("test/fixtures/quote_post/fedibird_quote_uri.json") |> Jason.decode!()
cng = ArticleNotePageValidator.cast_and_validate(data)
assert cng.valid?
assert cng.changes.quoteUrl == "https://fedibird.com/users/yamako/statuses/107699333438289729"
end
test "Misskey quote post" do
insert(:user, ap_id: "https://misskey.io/users/7rkrarq81i")
data = File.read!("test/fixtures/quote_post/misskey_quote_post.json") |> Jason.decode!()
cng = ArticleNotePageValidator.cast_and_validate(data)
assert cng.valid?
assert cng.changes.quoteUrl == "https://misskey.io/notes/8vs6wxufd0"
end
test "Parse tag as quote" do
# https://codeberg.org/fediverse/fep/src/branch/main/fep/e232/fep-e232.md
insert(:user, ap_id: "https://server.example/users/1")
data = File.read!("test/fixtures/quote_post/fep-e232-tag-example.json") |> Jason.decode!()
cng = ArticleNotePageValidator.cast_and_validate(data)
assert cng.valid?
assert cng.changes.quoteUrl == "https://server.example/objects/123"
assert Enum.at(cng.changes.tag, 0).changes == %{
type: "Link",
mediaType: "application/ld+json; profile=\"https://www.w3.org/ns/activitystreams\"",
href: "https://server.example/objects/123",
name: "RE: https://server.example/objects/123"
}
end
end
diff --git a/test/pleroma/web/mastodon_api/controllers/status_controller_test.exs b/test/pleroma/web/mastodon_api/controllers/status_controller_test.exs
index 02f2cd126..2e8c5299d 100644
--- a/test/pleroma/web/mastodon_api/controllers/status_controller_test.exs
+++ b/test/pleroma/web/mastodon_api/controllers/status_controller_test.exs
@@ -1,2576 +1,2570 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.MastodonAPI.StatusControllerTest do
use Pleroma.Web.ConnCase, async: false
use Oban.Testing, repo: Pleroma.Repo
alias Pleroma.Activity
alias Pleroma.Conversation.Participation
alias Pleroma.ModerationLog
alias Pleroma.Object
alias Pleroma.Repo
alias Pleroma.ScheduledActivity
alias Pleroma.Tests.Helpers
alias Pleroma.Tests.ObanHelpers
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.CommonAPI
alias Pleroma.Workers.ScheduledActivityWorker
import Mox
import Pleroma.Factory
setup do: clear_config([:instance, :federating])
setup do: clear_config([:instance, :allow_relay])
setup do: clear_config([:mrf, :policies])
setup do: clear_config([:mrf_keyword, :reject])
setup do
Pleroma.UnstubbedConfigMock
|> stub_with(Pleroma.Config)
Pleroma.StaticStubbedConfigMock
|> stub(:get, fn
[:rich_media, :enabled] -> false
path -> Pleroma.Test.StaticConfig.get(path)
end)
:ok
end
describe "posting statuses" do
setup do: oauth_access(["write:statuses"])
test "posting a status does not increment reblog_count when relaying", %{conn: conn} do
clear_config([:instance, :federating], true)
clear_config([:instance, :allow_relay], true)
response =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"content_type" => "text/plain",
"source" => "Pleroma FE",
"status" => "Hello world",
"visibility" => "public"
})
|> json_response_and_validate_schema(200)
assert response["reblogs_count"] == 0
ObanHelpers.perform_all()
response =
conn
|> get("/api/v1/statuses/#{response["id"]}", %{})
|> json_response_and_validate_schema(200)
assert response["reblogs_count"] == 0
end
test "posting a status", %{conn: conn} do
idempotency_key = "Pikachu rocks!"
conn_one =
conn
|> put_req_header("content-type", "application/json")
|> put_req_header("idempotency-key", idempotency_key)
|> post("/api/v1/statuses", %{
"status" => "cofe",
"spoiler_text" => "2hu",
"sensitive" => "0"
})
assert %{"content" => "cofe", "id" => id, "spoiler_text" => "2hu", "sensitive" => false} =
json_response_and_validate_schema(conn_one, 200)
assert Activity.get_by_id(id)
conn_two =
conn
|> put_req_header("content-type", "application/json")
|> put_req_header("idempotency-key", idempotency_key)
|> post("/api/v1/statuses", %{
"status" => "cofe",
"spoiler_text" => "2hu",
"sensitive" => 0
})
# Idempotency plug response means detection fail
assert %{"id" => second_id} = json_response(conn_two, 200)
assert id == second_id
conn_three =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "cofe",
"spoiler_text" => "2hu",
"sensitive" => "False"
})
assert %{"id" => third_id} = json_response_and_validate_schema(conn_three, 200)
refute id == third_id
# An activity that will expire:
# 2 hours
expires_in = 2 * 60 * 60
expires_at = DateTime.add(DateTime.utc_now(), expires_in)
conn_four =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "oolong",
"expires_in" => expires_in
})
assert %{"id" => fourth_id} = json_response_and_validate_schema(conn_four, 200)
assert Activity.get_by_id(fourth_id)
assert_enqueued(
worker: Pleroma.Workers.PurgeExpiredActivity,
args: %{activity_id: fourth_id},
scheduled_at: expires_at
)
end
test "posting a quote post", %{conn: conn} do
user = insert(:user)
{:ok, %{id: activity_id} = activity} = CommonAPI.post(user, %{status: "yolo"})
%{data: %{"id" => quote_url}} = Object.normalize(activity)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "indeed",
"quote_id" => activity_id
})
assert %{
"id" => id,
"pleroma" => %{"quote" => %{"id" => ^activity_id}, "quote_url" => ^quote_url}
} = json_response_and_validate_schema(conn, 200)
assert Activity.get_by_id(id)
end
test "it fails to create a status if `expires_in` is less or equal than an hour", %{
conn: conn
} do
# 1 minute
expires_in = 1 * 60
assert %{"error" => "Expiry date is too soon"} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "oolong",
"expires_in" => expires_in
})
|> json_response_and_validate_schema(422)
# 5 minutes
expires_in = 5 * 60
assert %{"error" => "Expiry date is too soon"} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "oolong",
"expires_in" => expires_in
})
|> json_response_and_validate_schema(422)
end
test "Get MRF reason when posting a status is rejected by one", %{conn: conn} do
clear_config([:mrf_keyword, :reject], ["GNO"])
clear_config([:mrf, :policies], [Pleroma.Web.ActivityPub.MRF.KeywordPolicy])
assert %{"error" => "[KeywordPolicy] Matches with rejected keyword"} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{"status" => "GNO/Linux"})
|> json_response_and_validate_schema(422)
end
test "posting an undefined status with an attachment", %{user: user, conn: conn} do
file = %Plug.Upload{
content_type: "image/jpeg",
path: Path.absname("test/fixtures/image.jpg"),
filename: "an_image.jpg"
}
{:ok, upload} = ActivityPub.upload(file, actor: user.ap_id)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"media_ids" => [to_string(upload.id)]
})
assert json_response_and_validate_schema(conn, 200)
end
test "replying to a status", %{user: user, conn: conn} do
{:ok, replied_to} = CommonAPI.post(user, %{status: "cofe"})
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{"status" => "xD", "in_reply_to_id" => replied_to.id})
assert %{"content" => "xD", "id" => id} = json_response_and_validate_schema(conn, 200)
activity = Activity.get_by_id(id)
assert activity.data["context"] == replied_to.data["context"]
assert Activity.get_in_reply_to_activity(activity).id == replied_to.id
end
test "replying to a direct message with visibility other than direct", %{
user: user,
conn: conn
} do
{:ok, replied_to} = CommonAPI.post(user, %{status: "suya..", visibility: "direct"})
Enum.each(["public", "private", "unlisted"], fn visibility ->
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "@#{user.nickname} hey",
"in_reply_to_id" => replied_to.id,
"visibility" => visibility
})
assert json_response_and_validate_schema(conn, 422) == %{
"error" => "The message visibility must be direct"
}
end)
end
test "posting a status with an invalid in_reply_to_id", %{conn: conn} do
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{"status" => "xD", "in_reply_to_id" => ""})
assert %{"content" => "xD", "id" => id} = json_response_and_validate_schema(conn, 200)
assert Activity.get_by_id(id)
end
test "posting a sensitive status", %{conn: conn} do
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{"status" => "cofe", "sensitive" => true})
assert %{"content" => "cofe", "id" => id, "sensitive" => true} =
json_response_and_validate_schema(conn, 200)
assert Activity.get_by_id(id)
end
test "posting a fake status", %{conn: conn} do
real_conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" =>
"\"Tenshi Eating a Corndog\" is a much discussed concept on /jp/. The significance of it is disputed, so I will focus on one core concept: the symbolism behind it"
})
real_status = json_response_and_validate_schema(real_conn, 200)
assert real_status
assert Object.get_by_ap_id(real_status["uri"])
real_status =
real_status
|> Map.put("id", nil)
|> Map.put("url", nil)
|> Map.put("uri", nil)
|> Map.put("created_at", nil)
|> Kernel.put_in(["pleroma", "context"], nil)
|> Kernel.put_in(["pleroma", "conversation_id"], nil)
fake_conn =
conn
|> assign(:user, refresh_record(conn.assigns.user))
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" =>
"\"Tenshi Eating a Corndog\" is a much discussed concept on /jp/. The significance of it is disputed, so I will focus on one core concept: the symbolism behind it",
"preview" => true
})
fake_status = json_response_and_validate_schema(fake_conn, 200)
assert fake_status
refute Object.get_by_ap_id(fake_status["uri"])
fake_status =
fake_status
|> Map.put("id", nil)
|> Map.put("url", nil)
|> Map.put("uri", nil)
|> Map.put("created_at", nil)
|> Kernel.put_in(["pleroma", "context"], nil)
|> Kernel.put_in(["pleroma", "conversation_id"], nil)
assert real_status == fake_status
end
test "fake statuses' preview card is not cached", %{conn: conn} do
Pleroma.StaticStubbedConfigMock
|> stub(:get, fn
[:rich_media, :enabled] -> true
path -> Pleroma.Test.StaticConfig.get(path)
end)
- Tesla.Mock.mock(fn
- %{
- method: :get,
- url: "https://example.com/twitter-card"
- } ->
- %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/twitter_card.html")}
-
+ Tesla.Mock.mock_global(fn
env ->
apply(HttpRequestMock, :request, [env])
end)
conn1 =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "https://example.com/ogp",
"preview" => true
})
conn2 =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "https://example.com/twitter-card",
"preview" => true
})
assert %{"card" => %{"title" => "The Rock"}} = json_response_and_validate_schema(conn1, 200)
assert %{"card" => %{"title" => "Small Island Developing States Photo Submission"}} =
json_response_and_validate_schema(conn2, 200)
end
test "posting a status with OGP link preview", %{conn: conn} do
Tesla.Mock.mock_global(fn env -> apply(HttpRequestMock, :request, [env]) end)
Pleroma.StaticStubbedConfigMock
|> stub(:get, fn
[:rich_media, :enabled] -> true
path -> Pleroma.Test.StaticConfig.get(path)
end)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "https://example.com/ogp"
})
assert %{"id" => id, "card" => %{"title" => "The Rock"}} =
json_response_and_validate_schema(conn, 200)
assert Activity.get_by_id(id)
end
test "posting a direct status", %{conn: conn} do
user2 = insert(:user)
content = "direct cofe @#{user2.nickname}"
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{"status" => content, "visibility" => "direct"})
assert %{"id" => id} = response = json_response_and_validate_schema(conn, 200)
assert response["visibility"] == "direct"
assert response["pleroma"]["direct_conversation_id"]
assert activity = Activity.get_by_id(id)
assert activity.recipients == [user2.ap_id, conn.assigns[:user].ap_id]
assert activity.data["to"] == [user2.ap_id]
assert activity.data["cc"] == []
end
test "discloses application metadata when enabled" do
user = insert(:user, disclose_client: true)
%{user: _user, token: token, conn: conn} = oauth_access(["write:statuses"], user: user)
%Pleroma.Web.OAuth.Token{
app: %Pleroma.Web.OAuth.App{
client_name: app_name,
website: app_website
}
} = token
result =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "cofe is my copilot"
})
assert %{
"content" => "cofe is my copilot"
} = json_response_and_validate_schema(result, 200)
activity = result.assigns.activity.id
result =
conn
|> get("/api/v1/statuses/#{activity}")
assert %{
"content" => "cofe is my copilot",
"application" => %{
"name" => ^app_name,
"website" => ^app_website
}
} = json_response_and_validate_schema(result, 200)
end
test "hides application metadata when disabled" do
user = insert(:user, disclose_client: false)
%{user: _user, token: _token, conn: conn} = oauth_access(["write:statuses"], user: user)
result =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "club mate is my wingman"
})
assert %{"content" => "club mate is my wingman"} =
json_response_and_validate_schema(result, 200)
activity = result.assigns.activity.id
result =
conn
|> get("/api/v1/statuses/#{activity}")
assert %{
"content" => "club mate is my wingman",
"application" => nil
} = json_response_and_validate_schema(result, 200)
end
end
describe "posting scheduled statuses" do
setup do: oauth_access(["write:statuses"])
test "creates a scheduled activity", %{conn: conn} do
scheduled_at =
NaiveDateTime.add(NaiveDateTime.utc_now(), :timer.minutes(120), :millisecond)
|> NaiveDateTime.to_iso8601()
|> Kernel.<>("Z")
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "scheduled",
"scheduled_at" => scheduled_at
})
assert %{"scheduled_at" => expected_scheduled_at} =
json_response_and_validate_schema(conn, 200)
assert expected_scheduled_at == CommonAPI.Utils.to_masto_date(scheduled_at)
assert [] == Repo.all(Activity)
end
test "with expiration" do
%{conn: conn} = oauth_access(["write:statuses", "read:statuses"])
scheduled_at =
NaiveDateTime.add(NaiveDateTime.utc_now(), :timer.minutes(6), :millisecond)
|> NaiveDateTime.to_iso8601()
|> Kernel.<>("Z")
assert %{"id" => status_id, "params" => %{"expires_in" => 300}} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "scheduled",
"scheduled_at" => scheduled_at,
"expires_in" => 300
})
|> json_response_and_validate_schema(200)
assert %{"id" => ^status_id, "params" => %{"expires_in" => 300}} =
conn
|> put_req_header("content-type", "application/json")
|> get("/api/v1/scheduled_statuses/#{status_id}")
|> json_response_and_validate_schema(200)
end
test "ignores nil values", %{conn: conn} do
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "not scheduled",
"scheduled_at" => nil
})
assert result = json_response_and_validate_schema(conn, 200)
assert Activity.get_by_id(result["id"])
end
test "creates a scheduled activity with a media attachment", %{user: user, conn: conn} do
scheduled_at =
NaiveDateTime.utc_now()
|> NaiveDateTime.add(:timer.minutes(120), :millisecond)
|> NaiveDateTime.to_iso8601()
|> Kernel.<>("Z")
file = %Plug.Upload{
content_type: "image/jpeg",
path: Path.absname("test/fixtures/image.jpg"),
filename: "an_image.jpg"
}
{:ok, upload} = ActivityPub.upload(file, actor: user.ap_id)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"media_ids" => [to_string(upload.id)],
"status" => "scheduled",
"scheduled_at" => scheduled_at
})
assert %{"media_attachments" => [media_attachment]} =
json_response_and_validate_schema(conn, 200)
assert %{"type" => "image"} = media_attachment
end
test "skips the scheduling and creates the activity if scheduled_at is earlier than 5 minutes from now",
%{conn: conn} do
scheduled_at =
NaiveDateTime.add(NaiveDateTime.utc_now(), :timer.minutes(5) - 1, :millisecond)
|> NaiveDateTime.to_iso8601()
|> Kernel.<>("Z")
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "not scheduled",
"scheduled_at" => scheduled_at
})
assert %{"content" => "not scheduled"} = json_response_and_validate_schema(conn, 200)
assert [] == Repo.all(ScheduledActivity)
end
test "returns error when daily user limit is exceeded", %{user: user, conn: conn} do
today =
NaiveDateTime.utc_now()
|> NaiveDateTime.add(:timer.minutes(6), :millisecond)
|> NaiveDateTime.to_iso8601()
# TODO
|> Kernel.<>("Z")
attrs = %{params: %{}, scheduled_at: today}
{:ok, _} = ScheduledActivity.create(user, attrs)
{:ok, _} = ScheduledActivity.create(user, attrs)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{"status" => "scheduled", "scheduled_at" => today})
assert %{"error" => "daily limit exceeded"} == json_response_and_validate_schema(conn, 422)
end
test "returns error when total user limit is exceeded", %{user: user, conn: conn} do
today =
NaiveDateTime.utc_now()
|> NaiveDateTime.add(:timer.minutes(6), :millisecond)
|> NaiveDateTime.to_iso8601()
|> Kernel.<>("Z")
tomorrow =
NaiveDateTime.utc_now()
|> NaiveDateTime.add(:timer.hours(36), :millisecond)
|> NaiveDateTime.to_iso8601()
|> Kernel.<>("Z")
attrs = %{params: %{}, scheduled_at: today}
{:ok, _} = ScheduledActivity.create(user, attrs)
{:ok, _} = ScheduledActivity.create(user, attrs)
{:ok, _} = ScheduledActivity.create(user, %{params: %{}, scheduled_at: tomorrow})
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{"status" => "scheduled", "scheduled_at" => tomorrow})
assert %{"error" => "total limit exceeded"} == json_response_and_validate_schema(conn, 422)
end
end
describe "posting polls" do
setup do: oauth_access(["write:statuses"])
test "posting a poll", %{conn: conn} do
time = NaiveDateTime.utc_now()
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "Who is the #bestgrill?",
"poll" => %{
"options" => ["Rei", "Asuka", "Misato"],
"expires_in" => 420
}
})
response = json_response_and_validate_schema(conn, 200)
assert Enum.all?(response["poll"]["options"], fn %{"title" => title} ->
title in ["Rei", "Asuka", "Misato"]
end)
assert NaiveDateTime.diff(NaiveDateTime.from_iso8601!(response["poll"]["expires_at"]), time) in 420..430
assert response["poll"]["expired"] == false
question = Object.get_by_id(response["poll"]["id"])
# closed contains utc timezone
assert question.data["closed"] =~ "Z"
end
test "option limit is enforced", %{conn: conn} do
limit = Config.get([:instance, :poll_limits, :max_options])
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "desu~",
"poll" => %{
"options" => Enum.map(0..limit, fn num -> "desu #{num}" end),
"expires_in" => 1
}
})
%{"error" => error} = json_response_and_validate_schema(conn, 422)
assert error == "Poll can't contain more than #{limit} options"
end
test "option character limit is enforced", %{conn: conn} do
limit = Config.get([:instance, :poll_limits, :max_option_chars])
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "...",
"poll" => %{
"options" => [String.duplicate(".", limit + 1), "lol"],
"expires_in" => 1
}
})
%{"error" => error} = json_response_and_validate_schema(conn, 422)
assert error == "Poll options cannot be longer than #{limit} characters each"
end
test "minimal date limit is enforced", %{conn: conn} do
limit = Config.get([:instance, :poll_limits, :min_expiration])
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "imagine arbitrary limits",
"poll" => %{
"options" => ["this post was made by pleroma gang"],
"expires_in" => limit - 1
}
})
%{"error" => error} = json_response_and_validate_schema(conn, 422)
assert error == "Expiration date is too soon"
end
test "maximum date limit is enforced", %{conn: conn} do
limit = Config.get([:instance, :poll_limits, :max_expiration])
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "imagine arbitrary limits",
"poll" => %{
"options" => ["this post was made by pleroma gang"],
"expires_in" => limit + 1
}
})
%{"error" => error} = json_response_and_validate_schema(conn, 422)
assert error == "Expiration date is too far in the future"
end
test "scheduled poll", %{conn: conn} do
clear_config([ScheduledActivity, :enabled], true)
scheduled_at =
NaiveDateTime.add(NaiveDateTime.utc_now(), :timer.minutes(6), :millisecond)
|> NaiveDateTime.to_iso8601()
|> Kernel.<>("Z")
%{"id" => scheduled_id} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "very cool poll",
"poll" => %{
"options" => ~w(a b c),
"expires_in" => 420
},
"scheduled_at" => scheduled_at
})
|> json_response_and_validate_schema(200)
assert {:ok, %{id: activity_id}} =
perform_job(ScheduledActivityWorker, %{
activity_id: scheduled_id
})
refute_enqueued(worker: ScheduledActivityWorker)
object =
Activity
|> Repo.get(activity_id)
|> Object.normalize()
assert object.data["content"] == "very cool poll"
assert object.data["type"] == "Question"
assert length(object.data["oneOf"]) == 3
end
test "cannot have only one option", %{conn: conn} do
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "desu~",
"poll" => %{"options" => ["mew"], "expires_in" => 1}
})
%{"error" => error} = json_response_and_validate_schema(conn, 422)
assert error == "Poll must contain at least 2 options"
end
test "cannot have only duplicated options", %{conn: conn} do
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "desu~",
"poll" => %{"options" => ["mew", "mew"], "expires_in" => 1}
})
%{"error" => error} = json_response_and_validate_schema(conn, 422)
assert error == "Poll must contain at least 2 options"
end
end
test "get a status" do
%{conn: conn} = oauth_access(["read:statuses"])
activity = insert(:note_activity)
conn = get(conn, "/api/v1/statuses/#{activity.id}")
assert %{"id" => id} = json_response_and_validate_schema(conn, 200)
assert id == to_string(activity.id)
end
defp local_and_remote_activities do
local = insert(:note_activity)
remote = insert(:note_activity, local: false)
{:ok, local: local, remote: remote}
end
defp local_and_remote_context_activities do
local_user_1 = insert(:user)
local_user_2 = insert(:user)
remote_user = insert(:user, local: false)
{:ok, %{id: id1, data: %{"context" => context}}} =
CommonAPI.post(local_user_1, %{status: "post"})
{:ok, %{id: id2} = post} =
CommonAPI.post(local_user_2, %{status: "local reply", in_reply_to_status_id: id1})
params = %{
"@context" => "https://www.w3.org/ns/activitystreams",
"actor" => remote_user.ap_id,
"type" => "Create",
"context" => context,
"id" => "#{remote_user.ap_id}/activities/1",
"inReplyTo" => post.data["id"],
"object" => %{
"type" => "Note",
"content" => "remote reply",
"context" => context,
"id" => "#{remote_user.ap_id}/objects/1",
"attributedTo" => remote_user.ap_id,
"to" => [
local_user_1.ap_id,
local_user_2.ap_id,
"https://www.w3.org/ns/activitystreams#Public"
]
},
"to" => [
local_user_1.ap_id,
local_user_2.ap_id,
"https://www.w3.org/ns/activitystreams#Public"
]
}
{:ok, job} = Pleroma.Web.Federator.incoming_ap_doc(params)
{:ok, remote_activity} = ObanHelpers.perform(job)
%{locals: [id1, id2], remote: remote_activity.id, context: context}
end
describe "status with restrict unauthenticated activities for local and remote" do
setup do: local_and_remote_activities()
setup do: clear_config([:restrict_unauthenticated, :activities, :local], true)
setup do: clear_config([:restrict_unauthenticated, :activities, :remote], true)
test "if user is unauthenticated", %{conn: conn, local: local, remote: remote} do
res_conn = get(conn, "/api/v1/statuses/#{local.id}")
assert json_response_and_validate_schema(res_conn, :not_found) == %{
"error" => "Record not found"
}
res_conn = get(conn, "/api/v1/statuses/#{remote.id}")
assert json_response_and_validate_schema(res_conn, :not_found) == %{
"error" => "Record not found"
}
end
test "if user is authenticated", %{local: local, remote: remote} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses/#{local.id}")
assert %{"id" => _} = json_response_and_validate_schema(res_conn, 200)
res_conn = get(conn, "/api/v1/statuses/#{remote.id}")
assert %{"id" => _} = json_response_and_validate_schema(res_conn, 200)
end
end
describe "status with restrict unauthenticated activities for local" do
setup do: local_and_remote_activities()
setup do: clear_config([:restrict_unauthenticated, :activities, :local], true)
test "if user is unauthenticated", %{conn: conn, local: local, remote: remote} do
res_conn = get(conn, "/api/v1/statuses/#{local.id}")
assert json_response_and_validate_schema(res_conn, :not_found) == %{
"error" => "Record not found"
}
res_conn = get(conn, "/api/v1/statuses/#{remote.id}")
assert %{"id" => _} = json_response_and_validate_schema(res_conn, 200)
end
test "if user is authenticated", %{local: local, remote: remote} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses/#{local.id}")
assert %{"id" => _} = json_response_and_validate_schema(res_conn, 200)
res_conn = get(conn, "/api/v1/statuses/#{remote.id}")
assert %{"id" => _} = json_response_and_validate_schema(res_conn, 200)
end
end
describe "status with restrict unauthenticated activities for remote" do
setup do: local_and_remote_activities()
setup do: clear_config([:restrict_unauthenticated, :activities, :remote], true)
test "if user is unauthenticated", %{conn: conn, local: local, remote: remote} do
res_conn = get(conn, "/api/v1/statuses/#{local.id}")
assert %{"id" => _} = json_response_and_validate_schema(res_conn, 200)
res_conn = get(conn, "/api/v1/statuses/#{remote.id}")
assert json_response_and_validate_schema(res_conn, :not_found) == %{
"error" => "Record not found"
}
end
test "if user is authenticated", %{local: local, remote: remote} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses/#{local.id}")
assert %{"id" => _} = json_response_and_validate_schema(res_conn, 200)
res_conn = get(conn, "/api/v1/statuses/#{remote.id}")
assert %{"id" => _} = json_response_and_validate_schema(res_conn, 200)
end
end
test "getting a status that doesn't exist returns 404" do
%{conn: conn} = oauth_access(["read:statuses"])
activity = insert(:note_activity)
conn = get(conn, "/api/v1/statuses/#{String.downcase(activity.id)}")
assert json_response_and_validate_schema(conn, 404) == %{"error" => "Record not found"}
end
test "get a direct status" do
%{user: user, conn: conn} = oauth_access(["read:statuses"])
other_user = insert(:user)
{:ok, activity} =
CommonAPI.post(user, %{status: "@#{other_user.nickname}", visibility: "direct"})
conn =
conn
|> assign(:user, user)
|> get("/api/v1/statuses/#{activity.id}")
[participation] = Participation.for_user(user)
res = json_response_and_validate_schema(conn, 200)
assert res["pleroma"]["direct_conversation_id"] == participation.id
end
test "get statuses by IDs" do
%{conn: conn} = oauth_access(["read:statuses"])
%{id: id1} = insert(:note_activity)
%{id: id2} = insert(:note_activity)
query_string = "ids[]=#{id1}&ids[]=#{id2}"
conn = get(conn, "/api/v1/statuses/?#{query_string}")
assert [%{"id" => ^id1}, %{"id" => ^id2}] =
Enum.sort_by(json_response_and_validate_schema(conn, :ok), & &1["id"])
end
describe "getting statuses by ids with restricted unauthenticated for local and remote" do
setup do: local_and_remote_activities()
setup do: clear_config([:restrict_unauthenticated, :activities, :local], true)
setup do: clear_config([:restrict_unauthenticated, :activities, :remote], true)
test "if user is unauthenticated", %{conn: conn, local: local, remote: remote} do
res_conn = get(conn, "/api/v1/statuses?ids[]=#{local.id}&ids[]=#{remote.id}")
assert json_response_and_validate_schema(res_conn, 200) == []
end
test "if user is authenticated", %{local: local, remote: remote} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses?ids[]=#{local.id}&ids[]=#{remote.id}")
assert length(json_response_and_validate_schema(res_conn, 200)) == 2
end
end
describe "getting statuses by ids with restricted unauthenticated for local" do
setup do: local_and_remote_activities()
setup do: clear_config([:restrict_unauthenticated, :activities, :local], true)
test "if user is unauthenticated", %{conn: conn, local: local, remote: remote} do
res_conn = get(conn, "/api/v1/statuses?ids[]=#{local.id}&ids[]=#{remote.id}")
remote_id = remote.id
assert [%{"id" => ^remote_id}] = json_response_and_validate_schema(res_conn, 200)
end
test "if user is authenticated", %{local: local, remote: remote} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses?ids[]=#{local.id}&ids[]=#{remote.id}")
assert length(json_response_and_validate_schema(res_conn, 200)) == 2
end
end
describe "getting statuses by ids with restricted unauthenticated for remote" do
setup do: local_and_remote_activities()
setup do: clear_config([:restrict_unauthenticated, :activities, :remote], true)
test "if user is unauthenticated", %{conn: conn, local: local, remote: remote} do
res_conn = get(conn, "/api/v1/statuses?ids[]=#{local.id}&ids[]=#{remote.id}")
local_id = local.id
assert [%{"id" => ^local_id}] = json_response_and_validate_schema(res_conn, 200)
end
test "if user is authenticated", %{local: local, remote: remote} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses?ids[]=#{local.id}&ids[]=#{remote.id}")
assert length(json_response_and_validate_schema(res_conn, 200)) == 2
end
end
describe "getting status contexts restricted unauthenticated for local and remote" do
setup do: local_and_remote_context_activities()
setup do: clear_config([:restrict_unauthenticated, :activities, :local], true)
setup do: clear_config([:restrict_unauthenticated, :activities, :remote], true)
test "if user is unauthenticated", %{conn: conn, locals: [post_id, _]} do
res_conn = get(conn, "/api/v1/statuses/#{post_id}/context")
assert json_response_and_validate_schema(res_conn, 200) == %{
"ancestors" => [],
"descendants" => []
}
end
test "if user is unauthenticated reply", %{conn: conn, locals: [_, reply_id]} do
res_conn = get(conn, "/api/v1/statuses/#{reply_id}/context")
assert json_response_and_validate_schema(res_conn, 200) == %{
"ancestors" => [],
"descendants" => []
}
end
test "if user is authenticated", %{locals: [post_id, reply_id], remote: remote_reply_id} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses/#{post_id}/context")
%{"ancestors" => [], "descendants" => descendants} =
json_response_and_validate_schema(res_conn, 200)
descendant_ids =
descendants
|> Enum.map(& &1["id"])
assert reply_id in descendant_ids
assert remote_reply_id in descendant_ids
end
test "if user is authenticated reply", %{locals: [post_id, reply_id], remote: remote_reply_id} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses/#{reply_id}/context")
%{"ancestors" => ancestors, "descendants" => descendants} =
json_response_and_validate_schema(res_conn, 200)
ancestor_ids =
ancestors
|> Enum.map(& &1["id"])
descendant_ids =
descendants
|> Enum.map(& &1["id"])
assert post_id in ancestor_ids
assert remote_reply_id in descendant_ids
end
end
describe "getting status contexts restricted unauthenticated for local" do
setup do: local_and_remote_context_activities()
setup do: clear_config([:restrict_unauthenticated, :activities, :local], true)
setup do: clear_config([:restrict_unauthenticated, :activities, :remote], false)
test "if user is unauthenticated", %{
conn: conn,
locals: [post_id, reply_id],
remote: remote_reply_id
} do
res_conn = get(conn, "/api/v1/statuses/#{post_id}/context")
%{"ancestors" => [], "descendants" => descendants} =
json_response_and_validate_schema(res_conn, 200)
descendant_ids =
descendants
|> Enum.map(& &1["id"])
assert reply_id not in descendant_ids
assert remote_reply_id in descendant_ids
end
test "if user is unauthenticated reply", %{
conn: conn,
locals: [post_id, reply_id],
remote: remote_reply_id
} do
res_conn = get(conn, "/api/v1/statuses/#{reply_id}/context")
%{"ancestors" => ancestors, "descendants" => descendants} =
json_response_and_validate_schema(res_conn, 200)
ancestor_ids =
ancestors
|> Enum.map(& &1["id"])
descendant_ids =
descendants
|> Enum.map(& &1["id"])
assert post_id not in ancestor_ids
assert remote_reply_id in descendant_ids
end
test "if user is authenticated", %{locals: [post_id, reply_id], remote: remote_reply_id} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses/#{post_id}/context")
%{"ancestors" => [], "descendants" => descendants} =
json_response_and_validate_schema(res_conn, 200)
descendant_ids =
descendants
|> Enum.map(& &1["id"])
assert reply_id in descendant_ids
assert remote_reply_id in descendant_ids
end
test "if user is authenticated reply", %{locals: [post_id, reply_id], remote: remote_reply_id} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses/#{reply_id}/context")
%{"ancestors" => ancestors, "descendants" => descendants} =
json_response_and_validate_schema(res_conn, 200)
ancestor_ids =
ancestors
|> Enum.map(& &1["id"])
descendant_ids =
descendants
|> Enum.map(& &1["id"])
assert post_id in ancestor_ids
assert remote_reply_id in descendant_ids
end
end
describe "getting status contexts restricted unauthenticated for remote" do
setup do: local_and_remote_context_activities()
setup do: clear_config([:restrict_unauthenticated, :activities, :local], false)
setup do: clear_config([:restrict_unauthenticated, :activities, :remote], true)
test "if user is unauthenticated", %{
conn: conn,
locals: [post_id, reply_id],
remote: remote_reply_id
} do
res_conn = get(conn, "/api/v1/statuses/#{post_id}/context")
%{"ancestors" => [], "descendants" => descendants} =
json_response_and_validate_schema(res_conn, 200)
descendant_ids =
descendants
|> Enum.map(& &1["id"])
assert reply_id in descendant_ids
assert remote_reply_id not in descendant_ids
end
test "if user is unauthenticated reply", %{
conn: conn,
locals: [post_id, reply_id],
remote: remote_reply_id
} do
res_conn = get(conn, "/api/v1/statuses/#{reply_id}/context")
%{"ancestors" => ancestors, "descendants" => descendants} =
json_response_and_validate_schema(res_conn, 200)
ancestor_ids =
ancestors
|> Enum.map(& &1["id"])
descendant_ids =
descendants
|> Enum.map(& &1["id"])
assert post_id in ancestor_ids
assert remote_reply_id not in descendant_ids
end
test "if user is authenticated", %{locals: [post_id, reply_id], remote: remote_reply_id} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses/#{post_id}/context")
%{"ancestors" => [], "descendants" => descendants} =
json_response_and_validate_schema(res_conn, 200)
reply_ids =
descendants
|> Enum.map(& &1["id"])
assert reply_id in reply_ids
assert remote_reply_id in reply_ids
end
test "if user is authenticated reply", %{locals: [post_id, reply_id], remote: remote_reply_id} do
%{conn: conn} = oauth_access(["read"])
res_conn = get(conn, "/api/v1/statuses/#{reply_id}/context")
%{"ancestors" => ancestors, "descendants" => descendants} =
json_response_and_validate_schema(res_conn, 200)
ancestor_ids =
ancestors
|> Enum.map(& &1["id"])
descendant_ids =
descendants
|> Enum.map(& &1["id"])
assert post_id in ancestor_ids
assert remote_reply_id in descendant_ids
end
end
describe "deleting a status" do
test "when you created it" do
%{user: author, conn: conn} = oauth_access(["write:statuses"])
activity = insert(:note_activity, user: author)
object = Object.normalize(activity, fetch: false)
content = object.data["content"]
source = object.data["source"]
result =
conn
|> assign(:user, author)
|> delete("/api/v1/statuses/#{activity.id}")
|> json_response_and_validate_schema(200)
assert match?(%{"content" => ^content, "text" => ^source}, result)
refute Activity.get_by_id(activity.id)
end
test "when it doesn't exist" do
%{user: author, conn: conn} = oauth_access(["write:statuses"])
activity = insert(:note_activity, user: author)
conn =
conn
|> assign(:user, author)
|> delete("/api/v1/statuses/#{String.downcase(activity.id)}")
assert %{"error" => "Record not found"} == json_response_and_validate_schema(conn, 404)
end
test "when you didn't create it" do
%{conn: conn} = oauth_access(["write:statuses"])
activity = insert(:note_activity)
conn = delete(conn, "/api/v1/statuses/#{activity.id}")
assert %{"error" => "Record not found"} == json_response_and_validate_schema(conn, 404)
assert Activity.get_by_id(activity.id) == activity
end
test "when you're privileged to", %{conn: conn} do
clear_config([:instance, :moderator_privileges], [:messages_delete])
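# The :messages_delete privilege allows moderators to delete other users' statuses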
activity = insert(:note_activity)
user = insert(:user, is_moderator: true)
res_conn =
conn
|> assign(:user, user)
|> assign(:token, insert(:oauth_token, user: user, scopes: ["write:statuses"]))
|> delete("/api/v1/statuses/#{activity.id}")
assert %{} = json_response_and_validate_schema(res_conn, 200)
assert ModerationLog |> Repo.one() |> ModerationLog.get_log_entry_message() ==
"@#{user.nickname} deleted status ##{activity.id}"
refute Activity.get_by_id(activity.id)
end
test "when you're privileged and the user is banned", %{conn: conn} do
clear_config([:instance, :moderator_privileges], [:messages_delete])
posting_user = insert(:user, is_active: false)
refute posting_user.is_active
activity = insert(:note_activity, user: posting_user)
user = insert(:user, is_moderator: true)
res_conn =
conn
|> assign(:user, user)
|> assign(:token, insert(:oauth_token, user: user, scopes: ["write:statuses"]))
|> delete("/api/v1/statuses/#{activity.id}")
assert %{} = json_response_and_validate_schema(res_conn, 200)
assert ModerationLog |> Repo.one() |> ModerationLog.get_log_entry_message() ==
"@#{user.nickname} deleted status ##{activity.id}"
refute Activity.get_by_id(activity.id)
end
end
describe "reblogging" do
setup do: oauth_access(["write:statuses"])
test "reblogs and returns the reblogged status", %{conn: conn} do
activity = insert(:note_activity)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/reblog")
assert %{
"reblog" => %{"id" => id, "reblogged" => true, "reblogs_count" => 1},
"reblogged" => true
} = json_response_and_validate_schema(conn, 200)
assert to_string(activity.id) == id
end
test "returns 404 if the reblogged status doesn't exist", %{conn: conn} do
activity = insert(:note_activity)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{String.downcase(activity.id)}/reblog")
assert %{"error" => "Record not found"} = json_response_and_validate_schema(conn, 404)
end
test "reblogs privately and returns the reblogged status", %{conn: conn} do
activity = insert(:note_activity)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post(
"/api/v1/statuses/#{activity.id}/reblog",
%{"visibility" => "private"}
)
assert %{
"reblog" => %{"id" => id, "reblogged" => true, "reblogs_count" => 1},
"reblogged" => true,
"visibility" => "private"
} = json_response_and_validate_schema(conn, 200)
assert to_string(activity.id) == id
end
test "reblogged status for another user" do
activity = insert(:note_activity)
user1 = insert(:user)
user2 = insert(:user)
user3 = insert(:user)
{:ok, _} = CommonAPI.favorite(user2, activity.id)
{:ok, _bookmark} = Pleroma.Bookmark.create(user2.id, activity.id)
{:ok, reblog_activity1} = CommonAPI.repeat(activity.id, user1)
{:ok, _} = CommonAPI.repeat(activity.id, user2)
conn_res =
build_conn()
|> assign(:user, user3)
|> assign(:token, insert(:oauth_token, user: user3, scopes: ["read:statuses"]))
|> get("/api/v1/statuses/#{reblog_activity1.id}")
assert %{
"reblog" => %{"id" => _id, "reblogged" => false, "reblogs_count" => 2},
"reblogged" => false,
"favourited" => false,
"bookmarked" => false
} = json_response_and_validate_schema(conn_res, 200)
conn_res =
build_conn()
|> assign(:user, user2)
|> assign(:token, insert(:oauth_token, user: user2, scopes: ["read:statuses"]))
|> get("/api/v1/statuses/#{reblog_activity1.id}")
assert %{
"reblog" => %{"id" => id, "reblogged" => true, "reblogs_count" => 2},
"reblogged" => true,
"favourited" => true,
"bookmarked" => true
} = json_response_and_validate_schema(conn_res, 200)
assert to_string(activity.id) == id
end
test "author can reblog own private status", %{conn: conn, user: user} do
{:ok, activity} = CommonAPI.post(user, %{status: "cofe", visibility: "private"})
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/reblog")
assert %{
"reblog" => %{"id" => id, "reblogged" => true, "reblogs_count" => 1},
"reblogged" => true,
"visibility" => "private"
} = json_response_and_validate_schema(conn, 200)
assert to_string(activity.id) == id
end
end
describe "unreblogging" do
setup do: oauth_access(["write:statuses"])
test "unreblogs and returns the unreblogged status", %{user: user, conn: conn} do
activity = insert(:note_activity)
{:ok, _} = CommonAPI.repeat(activity.id, user)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/unreblog")
assert %{"id" => id, "reblogged" => false, "reblogs_count" => 0} =
json_response_and_validate_schema(conn, 200)
assert to_string(activity.id) == id
end
test "returns 404 error when activity does not exist", %{conn: conn} do
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/foo/unreblog")
assert json_response_and_validate_schema(conn, 404) == %{"error" => "Record not found"}
end
end
describe "favoriting" do
setup do: oauth_access(["write:favourites"])
test "favs a status and returns it", %{conn: conn} do
activity = insert(:note_activity)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/favourite")
assert %{"id" => id, "favourites_count" => 1, "favourited" => true} =
json_response_and_validate_schema(conn, 200)
assert to_string(activity.id) == id
end
test "favoriting twice will just return 200", %{conn: conn} do
activity = insert(:note_activity)
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/favourite")
assert conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/favourite")
|> json_response_and_validate_schema(200)
end
test "returns 404 error for a wrong id", %{conn: conn} do
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/1/favourite")
assert json_response_and_validate_schema(conn, 404) == %{"error" => "Record not found"}
end
end
describe "unfavoriting" do
setup do: oauth_access(["write:favourites"])
test "unfavorites a status and returns it", %{user: user, conn: conn} do
activity = insert(:note_activity)
{:ok, _} = CommonAPI.favorite(user, activity.id)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/unfavourite")
assert %{"id" => id, "favourites_count" => 0, "favourited" => false} =
json_response_and_validate_schema(conn, 200)
assert to_string(activity.id) == id
end
test "returns 404 error for a wrong id", %{conn: conn} do
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/1/unfavourite")
assert json_response_and_validate_schema(conn, 404) == %{"error" => "Record not found"}
end
end
describe "pinned statuses" do
setup do: oauth_access(["write:accounts"])
setup %{user: user} do
{:ok, activity} = CommonAPI.post(user, %{status: "HI!!!"})
%{activity: activity}
end
setup do: clear_config([:instance, :max_pinned_statuses], 1)
test "pin status", %{conn: conn, user: user, activity: activity} do
id = activity.id
assert %{"id" => ^id, "pinned" => true} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/pin")
|> json_response_and_validate_schema(200)
assert [%{"id" => ^id, "pinned" => true}] =
conn
|> get("/api/v1/accounts/#{user.id}/statuses?pinned=true")
|> json_response_and_validate_schema(200)
end
test "non authenticated user", %{activity: activity} do
assert build_conn()
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/pin")
|> json_response(403) == %{"error" => "Invalid credentials."}
end
test "/pin: returns 400 error when activity is not public", %{conn: conn, user: user} do
{:ok, dm} = CommonAPI.post(user, %{status: "test", visibility: "direct"})
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{dm.id}/pin")
assert json_response_and_validate_schema(conn, 422) == %{
"error" => "Non-public status cannot be pinned"
}
end
test "pin by another user", %{activity: activity} do
%{conn: conn} = oauth_access(["write:accounts"])
assert conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/pin")
|> json_response(422) == %{"error" => "Someone else's status cannot be pinned"}
end
test "unpin status", %{conn: conn, user: user, activity: activity} do
{:ok, _} = CommonAPI.pin(activity.id, user)
user = refresh_record(user)
id_str = to_string(activity.id)
assert %{"id" => ^id_str, "pinned" => false} =
conn
|> assign(:user, user)
|> post("/api/v1/statuses/#{activity.id}/unpin")
|> json_response_and_validate_schema(200)
assert [] =
conn
|> get("/api/v1/accounts/#{user.id}/statuses?pinned=true")
|> json_response_and_validate_schema(200)
end
test "/unpin: returns 404 error when activity doesn't exist", %{conn: conn} do
assert conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/1/unpin")
|> json_response_and_validate_schema(404) == %{"error" => "Record not found"}
end
test "max pinned statuses", %{conn: conn, user: user, activity: activity_one} do
{:ok, activity_two} = CommonAPI.post(user, %{status: "HI!!!"})
id_str_one = to_string(activity_one.id)
assert %{"id" => ^id_str_one, "pinned" => true} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{id_str_one}/pin")
|> json_response_and_validate_schema(200)
user = refresh_record(user)
assert %{"error" => "You have already pinned the maximum number of statuses"} =
conn
|> assign(:user, user)
|> post("/api/v1/statuses/#{activity_two.id}/pin")
|> json_response_and_validate_schema(400)
end
test "on pin removes deletion job, on unpin reschedule deletion" do
%{conn: conn} = oauth_access(["write:accounts", "write:statuses"])
expires_in = 2 * 60 * 60
expires_at = DateTime.add(DateTime.utc_now(), expires_in)
assert %{"id" => id} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "oolong",
"expires_in" => expires_in
})
|> json_response_and_validate_schema(200)
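# Creating a status with expires_in schedules a PurgeExpiredActivity job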
assert_enqueued(
worker: Pleroma.Workers.PurgeExpiredActivity,
args: %{activity_id: id},
scheduled_at: expires_at
)
assert %{"id" => ^id, "pinned" => true} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{id}/pin")
|> json_response_and_validate_schema(200)
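# Pinning the status removes the scheduled deletion job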
refute_enqueued(
worker: Pleroma.Workers.PurgeExpiredActivity,
args: %{activity_id: id},
scheduled_at: expires_at
)
assert %{"id" => ^id, "pinned" => false} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{id}/unpin")
|> json_response_and_validate_schema(200)
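# Unpinning re-schedules deletion at the original expiry time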
assert_enqueued(
worker: Pleroma.Workers.PurgeExpiredActivity,
args: %{activity_id: id},
scheduled_at: expires_at
)
end
end
describe "cards" do
setup do
Pleroma.StaticStubbedConfigMock
|> stub(:get, fn
[:rich_media, :enabled] -> true
path -> Pleroma.Test.StaticConfig.get(path)
end)
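# Enable rich media via the stubbed config for these card tests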
oauth_access(["read:statuses"])
end
test "returns rich-media card", %{conn: conn, user: user} do
Tesla.Mock.mock_global(fn env -> apply(HttpRequestMock, :request, [env]) end)
{:ok, activity} = CommonAPI.post(user, %{status: "https://example.com/ogp"})
card_data = %{
"image" => "http://ia.media-imdb.com/images/rock.jpg",
"provider_name" => "example.com",
"provider_url" => "https://example.com",
"title" => "The Rock",
"type" => "link",
"url" => "https://example.com/ogp",
"description" =>
"Directed by Michael Bay. With Sean Connery, Nicolas Cage, Ed Harris, John Spencer.",
"pleroma" => %{
"opengraph" => %{
"image" => "http://ia.media-imdb.com/images/rock.jpg",
"title" => "The Rock",
"type" => "video.movie",
"url" => "https://example.com/ogp",
"description" =>
"Directed by Michael Bay. With Sean Connery, Nicolas Cage, Ed Harris, John Spencer."
}
}
}
response =
conn
|> get("/api/v1/statuses/#{activity.id}/card")
|> json_response_and_validate_schema(200)
assert response == card_data
# works with private posts
{:ok, activity} =
CommonAPI.post(user, %{status: "https://example.com/ogp", visibility: "direct"})
response_two =
conn
|> get("/api/v1/statuses/#{activity.id}/card")
|> json_response_and_validate_schema(200)
assert response_two == card_data
end
test "replaces missing description with an empty string", %{conn: conn, user: user} do
Tesla.Mock.mock_global(fn env -> apply(HttpRequestMock, :request, [env]) end)
{:ok, activity} = CommonAPI.post(user, %{status: "https://example.com/ogp-missing-data"})
response =
conn
|> get("/api/v1/statuses/#{activity.id}/card")
|> json_response_and_validate_schema(:ok)
assert response == %{
"type" => "link",
"title" => "Pleroma",
"description" => "",
"image" => nil,
"provider_name" => "example.com",
"provider_url" => "https://example.com",
"url" => "https://example.com/ogp-missing-data",
"pleroma" => %{
"opengraph" => %{
"title" => "Pleroma",
"type" => "website",
"url" => "https://example.com/ogp-missing-data"
}
}
}
end
end
test "bookmarks" do
bookmarks_uri = "/api/v1/bookmarks"
%{conn: conn} = oauth_access(["write:bookmarks", "read:bookmarks"])
author = insert(:user)
{:ok, activity1} = CommonAPI.post(author, %{status: "heweoo?"})
{:ok, activity2} = CommonAPI.post(author, %{status: "heweoo!"})
response1 =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity1.id}/bookmark")
assert json_response_and_validate_schema(response1, 200)["bookmarked"] == true
response2 =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity2.id}/bookmark")
assert json_response_and_validate_schema(response2, 200)["bookmarked"] == true
bookmarks = get(conn, bookmarks_uri)
assert [
json_response_and_validate_schema(response2, 200),
json_response_and_validate_schema(response1, 200)
] ==
json_response_and_validate_schema(bookmarks, 200)
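# Unbookmarking removes the status from the bookmarks listing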
response1 =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity1.id}/unbookmark")
assert json_response_and_validate_schema(response1, 200)["bookmarked"] == false
bookmarks = get(conn, bookmarks_uri)
assert [json_response_and_validate_schema(response2, 200)] ==
json_response_and_validate_schema(bookmarks, 200)
end
describe "conversation muting" do
setup do: oauth_access(["write:mutes"])
setup do
post_user = insert(:user)
{:ok, activity} = CommonAPI.post(post_user, %{status: "HIE"})
%{activity: activity}
end
test "mute conversation", %{conn: conn, activity: activity} do
id_str = to_string(activity.id)
assert %{"id" => ^id_str, "muted" => true} =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/mute")
|> json_response_and_validate_schema(200)
end
test "cannot mute already muted conversation", %{conn: conn, user: user, activity: activity} do
{:ok, _} = CommonAPI.add_mute(user, activity)
conn =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/mute")
assert json_response_and_validate_schema(conn, 400) == %{
"error" => "conversation is already muted"
}
end
test "unmute conversation", %{conn: conn, user: user, activity: activity} do
{:ok, _} = CommonAPI.add_mute(user, activity)
id_str = to_string(activity.id)
assert %{"id" => ^id_str, "muted" => false} =
conn
# |> assign(:user, user)
|> post("/api/v1/statuses/#{activity.id}/unmute")
|> json_response_and_validate_schema(200)
end
end
test "Repeated posts that are replies incorrectly have in_reply_to_id null", %{conn: conn} do
user1 = insert(:user)
user2 = insert(:user)
user3 = insert(:user)
{:ok, replied_to} = CommonAPI.post(user1, %{status: "cofe"})
# Reply to status from another user
conn1 =
conn
|> assign(:user, user2)
|> assign(:token, insert(:oauth_token, user: user2, scopes: ["write:statuses"]))
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{"status" => "xD", "in_reply_to_id" => replied_to.id})
assert %{"content" => "xD", "id" => id} = json_response_and_validate_schema(conn1, 200)
activity = Activity.get_by_id_with_object(id)
assert Object.normalize(activity, fetch: false).data["inReplyTo"] ==
Object.normalize(replied_to, fetch: false).data["id"]
assert Activity.get_in_reply_to_activity(activity).id == replied_to.id
# Reblog from the third user
conn2 =
conn
|> assign(:user, user3)
|> assign(:token, insert(:oauth_token, user: user3, scopes: ["write:statuses"]))
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses/#{activity.id}/reblog")
assert %{"reblog" => %{"id" => id, "reblogged" => true, "reblogs_count" => 1}} =
json_response_and_validate_schema(conn2, 200)
assert to_string(activity.id) == id
# Getting third user status
conn3 =
conn
|> assign(:user, user3)
|> assign(:token, insert(:oauth_token, user: user3, scopes: ["read:statuses"]))
|> get("/api/v1/timelines/home")
[reblogged_activity] = json_response_and_validate_schema(conn3, 200)
assert reblogged_activity["reblog"]["in_reply_to_id"] == replied_to.id
replied_to_user = User.get_by_ap_id(replied_to.data["actor"])
assert reblogged_activity["reblog"]["in_reply_to_account_id"] == replied_to_user.id
end
describe "GET /api/v1/statuses/:id/favourited_by" do
setup do: oauth_access(["read:accounts"])
setup %{user: user} do
{:ok, activity} = CommonAPI.post(user, %{status: "test"})
%{activity: activity}
end
test "returns users who have favorited the status", %{conn: conn, activity: activity} do
other_user = insert(:user)
{:ok, _} = CommonAPI.favorite(other_user, activity.id)
response =
conn
|> get("/api/v1/statuses/#{activity.id}/favourited_by")
|> json_response_and_validate_schema(:ok)
[%{"id" => id}] = response
assert id == other_user.id
end
test "returns empty array when status has not been favorited yet", %{
conn: conn,
activity: activity
} do
response =
conn
|> get("/api/v1/statuses/#{activity.id}/favourited_by")
|> json_response_and_validate_schema(:ok)
assert Enum.empty?(response)
end
test "does not return users who have favorited the status but are blocked", %{
conn: %{assigns: %{user: user}} = conn,
activity: activity
} do
other_user = insert(:user)
{:ok, _user_relationship} = User.block(user, other_user)
{:ok, _} = CommonAPI.favorite(other_user, activity.id)
response =
conn
|> get("/api/v1/statuses/#{activity.id}/favourited_by")
|> json_response_and_validate_schema(:ok)
assert Enum.empty?(response)
end
test "does not fail on an unauthenticated request", %{activity: activity} do
other_user = insert(:user)
{:ok, _} = CommonAPI.favorite(other_user, activity.id)
response =
build_conn()
|> get("/api/v1/statuses/#{activity.id}/favourited_by")
|> json_response_and_validate_schema(:ok)
[%{"id" => id}] = response
assert id == other_user.id
end
test "requires authentication for private posts", %{user: user} do
other_user = insert(:user)
{:ok, activity} =
CommonAPI.post(user, %{
status: "@#{other_user.nickname} wanna get some #cofe together?",
visibility: "direct"
})
{:ok, _} = CommonAPI.favorite(other_user, activity.id)
favourited_by_url = "/api/v1/statuses/#{activity.id}/favourited_by"
build_conn()
|> get(favourited_by_url)
|> json_response_and_validate_schema(404)
conn =
build_conn()
|> assign(:user, other_user)
|> assign(:token, insert(:oauth_token, user: other_user, scopes: ["read:accounts"]))
conn
|> assign(:token, nil)
|> get(favourited_by_url)
|> json_response_and_validate_schema(404)
response =
conn
|> get(favourited_by_url)
|> json_response_and_validate_schema(200)
[%{"id" => id}] = response
assert id == other_user.id
end
test "returns empty array when :show_reactions is disabled", %{conn: conn, activity: activity} do
clear_config([:instance, :show_reactions], false)
other_user = insert(:user)
{:ok, _} = CommonAPI.favorite(other_user, activity.id)
response =
conn
|> get("/api/v1/statuses/#{activity.id}/favourited_by")
|> json_response_and_validate_schema(:ok)
assert Enum.empty?(response)
end
end
describe "GET /api/v1/statuses/:id/reblogged_by" do
setup do: oauth_access(["read:accounts"])
setup %{user: user} do
{:ok, activity} = CommonAPI.post(user, %{status: "test"})
%{activity: activity}
end
test "returns users who have reblogged the status", %{conn: conn, activity: activity} do
other_user = insert(:user)
{:ok, _} = CommonAPI.repeat(activity.id, other_user)
response =
conn
|> get("/api/v1/statuses/#{activity.id}/reblogged_by")
|> json_response_and_validate_schema(:ok)
[%{"id" => id}] = response
assert id == other_user.id
end
test "returns empty array when status has not been reblogged yet", %{
conn: conn,
activity: activity
} do
response =
conn
|> get("/api/v1/statuses/#{activity.id}/reblogged_by")
|> json_response_and_validate_schema(:ok)
assert Enum.empty?(response)
end
test "does not return users who have reblogged the status but are blocked", %{
conn: %{assigns: %{user: user}} = conn,
activity: activity
} do
other_user = insert(:user)
{:ok, _user_relationship} = User.block(user, other_user)
{:ok, _} = CommonAPI.repeat(activity.id, other_user)
response =
conn
|> get("/api/v1/statuses/#{activity.id}/reblogged_by")
|> json_response_and_validate_schema(:ok)
assert Enum.empty?(response)
end
test "does not return users who have reblogged the status privately", %{
conn: conn
} do
other_user = insert(:user)
{:ok, activity} = CommonAPI.post(other_user, %{status: "my secret post"})
{:ok, _} = CommonAPI.repeat(activity.id, other_user, %{visibility: "private"})
response =
conn
|> get("/api/v1/statuses/#{activity.id}/reblogged_by")
|> json_response_and_validate_schema(:ok)
assert Enum.empty?(response)
end
test "does not fail on an unauthenticated request", %{activity: activity} do
other_user = insert(:user)
{:ok, _} = CommonAPI.repeat(activity.id, other_user)
response =
build_conn()
|> get("/api/v1/statuses/#{activity.id}/reblogged_by")
|> json_response_and_validate_schema(:ok)
[%{"id" => id}] = response
assert id == other_user.id
end
test "requires authentication for private posts", %{user: user} do
other_user = insert(:user)
{:ok, activity} =
CommonAPI.post(user, %{
status: "@#{other_user.nickname} wanna get some #cofe together?",
visibility: "direct"
})
build_conn()
|> get("/api/v1/statuses/#{activity.id}/reblogged_by")
|> json_response_and_validate_schema(404)
response =
build_conn()
|> assign(:user, other_user)
|> assign(:token, insert(:oauth_token, user: other_user, scopes: ["read:accounts"]))
|> get("/api/v1/statuses/#{activity.id}/reblogged_by")
|> json_response_and_validate_schema(200)
assert [] == response
end
end
test "context" do
user = insert(:user)
{:ok, %{id: id1}} = CommonAPI.post(user, %{status: "1"})
{:ok, %{id: id2}} = CommonAPI.post(user, %{status: "2", in_reply_to_status_id: id1})
{:ok, %{id: id3}} = CommonAPI.post(user, %{status: "3", in_reply_to_status_id: id2})
{:ok, %{id: id4}} = CommonAPI.post(user, %{status: "4", in_reply_to_status_id: id3})
{:ok, %{id: id5}} = CommonAPI.post(user, %{status: "5", in_reply_to_status_id: id4})
response =
build_conn()
|> get("/api/v1/statuses/#{id3}/context")
|> json_response_and_validate_schema(:ok)
assert %{
"ancestors" => [%{"id" => ^id1}, %{"id" => ^id2}],
"descendants" => [%{"id" => ^id4}, %{"id" => ^id5}]
} = response
end
test "favorites paginate correctly" do
%{user: user, conn: conn} = oauth_access(["read:favourites"])
other_user = insert(:user)
{:ok, first_post} = CommonAPI.post(other_user, %{status: "bla"})
{:ok, second_post} = CommonAPI.post(other_user, %{status: "bla"})
{:ok, third_post} = CommonAPI.post(other_user, %{status: "bla"})
{:ok, _first_favorite} = CommonAPI.favorite(user, third_post.id)
{:ok, _second_favorite} = CommonAPI.favorite(user, first_post.id)
{:ok, third_favorite} = CommonAPI.favorite(user, second_post.id)
result =
conn
|> get("/api/v1/favourites?limit=1")
assert [%{"id" => post_id}] = json_response_and_validate_schema(result, 200)
assert post_id == second_post.id
# Using the header for pagination works correctly
[next, _] = get_resp_header(result, "link") |> hd() |> String.split(", ")
[next_url, _next_rel] = String.split(next, ";")
next_url = String.trim_trailing(next_url, ">") |> String.trim_leading("<")
max_id = Helpers.get_query_parameter(next_url, "max_id")
assert max_id == third_favorite.id
result =
conn
|> get("/api/v1/favourites?max_id=#{max_id}")
assert [%{"id" => first_post_id}, %{"id" => third_post_id}] =
json_response_and_validate_schema(result, 200)
assert first_post_id == first_post.id
assert third_post_id == third_post.id
end
test "returns the favorites of a user" do
%{user: user, conn: conn} = oauth_access(["read:favourites"])
other_user = insert(:user)
{:ok, _} = CommonAPI.post(other_user, %{status: "bla"})
{:ok, activity} = CommonAPI.post(other_user, %{status: "trees are happy"})
{:ok, last_like} = CommonAPI.favorite(user, activity.id)
first_conn = get(conn, "/api/v1/favourites")
assert [status] = json_response_and_validate_schema(first_conn, 200)
assert status["id"] == to_string(activity.id)
assert [{"link", _link_header}] =
Enum.filter(first_conn.resp_headers, fn element -> match?({"link", _}, element) end)
# Honours query params
{:ok, second_activity} =
CommonAPI.post(other_user, %{
status: "Trees Are Never Sad Look At Them Every Once In Awhile They're Quite Beautiful."
})
{:ok, _} = CommonAPI.favorite(user, second_activity.id)
second_conn = get(conn, "/api/v1/favourites?since_id=#{last_like.id}")
assert [second_status] = json_response_and_validate_schema(second_conn, 200)
assert second_status["id"] == to_string(second_activity.id)
third_conn = get(conn, "/api/v1/favourites?limit=0")
assert [] = json_response_and_validate_schema(third_conn, 200)
end
test "expires_at is nil for another user" do
%{conn: conn, user: user} = oauth_access(["read:statuses"])
expires_at = DateTime.add(DateTime.utc_now(), 1_000_000)
{:ok, activity} = CommonAPI.post(user, %{status: "foobar", expires_in: 1_000_000})
assert %{"pleroma" => %{"expires_at" => a_expires_at}} =
conn
|> get("/api/v1/statuses/#{activity.id}")
|> json_response_and_validate_schema(:ok)
{:ok, a_expires_at, 0} = DateTime.from_iso8601(a_expires_at)
assert DateTime.diff(expires_at, a_expires_at) == 0
%{conn: conn} = oauth_access(["read:statuses"])
assert %{"pleroma" => %{"expires_at" => nil}} =
conn
|> get("/api/v1/statuses/#{activity.id}")
|> json_response_and_validate_schema(:ok)
end
describe "local-only statuses" do
test "posting a local only status" do
%{user: _user, conn: conn} = oauth_access(["write:statuses"])
conn_one =
conn
|> put_req_header("content-type", "application/json")
|> post("/api/v1/statuses", %{
"status" => "cofe",
"visibility" => "local"
})
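# as_local_public/0 returns the audience a status is addressed to when using "local" visibility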
local = Utils.as_local_public()
assert %{"content" => "cofe", "id" => id, "visibility" => "local"} =
json_response_and_validate_schema(conn_one, 200)
assert %Activity{id: ^id, data: %{"to" => [^local]}} = Activity.get_by_id(id)
end
test "other users can read local-only posts" do
user = insert(:user)
%{user: _reader, conn: conn} = oauth_access(["read:statuses"])
{:ok, activity} = CommonAPI.post(user, %{status: "#2hu #2HU", visibility: "local"})
received =
conn
|> get("/api/v1/statuses/#{activity.id}")
|> json_response_and_validate_schema(:ok)
assert received["id"] == activity.id
end
test "anonymous users cannot see local-only posts" do
user = insert(:user)
{:ok, activity} = CommonAPI.post(user, %{status: "#2hu #2HU", visibility: "local"})
_received =
build_conn()
|> get("/api/v1/statuses/#{activity.id}")
|> json_response_and_validate_schema(:not_found)
end
end
describe "muted reactions" do
test "index" do
%{conn: conn, user: user} = oauth_access(["read:statuses"])
other_user = insert(:user)
{:ok, activity} = CommonAPI.post(user, %{status: "test"})
{:ok, _} = CommonAPI.react_with_emoji(activity.id, other_user, "🎅")
User.mute(user, other_user)
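# Reactions from muted users are hidden unless with_muted=true is passed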
result =
conn
|> get("/api/v1/statuses/?ids[]=#{activity.id}")
|> json_response_and_validate_schema(200)
assert [
%{
"pleroma" => %{
"emoji_reactions" => []
}
}
] = result
result =
conn
|> get("/api/v1/statuses/?ids[]=#{activity.id}&with_muted=true")
|> json_response_and_validate_schema(200)
assert [
%{
"pleroma" => %{
"emoji_reactions" => [%{"count" => 1, "me" => false, "name" => "🎅"}]
}
}
] = result
end
test "show" do
# %{conn: conn, user: user, token: token} = oauth_access(["read:statuses"])
%{conn: conn, user: user, token: _token} = oauth_access(["read:statuses"])
other_user = insert(:user)
{:ok, activity} = CommonAPI.post(user, %{status: "test"})
{:ok, _} = CommonAPI.react_with_emoji(activity.id, other_user, "🎅")
User.mute(user, other_user)
result =
conn
|> get("/api/v1/statuses/#{activity.id}")
|> json_response_and_validate_schema(200)
assert %{
"pleroma" => %{
"emoji_reactions" => []
}
} = result
result =
conn
|> get("/api/v1/statuses/#{activity.id}?with_muted=true")
|> json_response_and_validate_schema(200)
assert %{
"pleroma" => %{
"emoji_reactions" => [%{"count" => 1, "me" => false, "name" => "🎅"}]
}
} = result
end
end
describe "get status history" do
setup do
%{conn: build_conn()}
end
test "unedited post", %{conn: conn} do
activity = insert(:note_activity)
conn = get(conn, "/api/v1/statuses/#{activity.id}/history")
assert [_] = json_response_and_validate_schema(conn, 200)
end
test "edited post", %{conn: conn} do
note =
insert(
:note,
data: %{
"formerRepresentations" => %{
"type" => "OrderedCollection",
"orderedItems" => [
%{
"type" => "Note",
"content" => "mew mew 2",
"summary" => "title 2"
},
%{
"type" => "Note",
"content" => "mew mew 1",
"summary" => "title 1"
}
],
"totalItems" => 2
}
}
)
activity = insert(:note_activity, note: note)
conn = get(conn, "/api/v1/statuses/#{activity.id}/history")
assert [%{"spoiler_text" => "title 1"}, %{"spoiler_text" => "title 2"}, _] =
json_response_and_validate_schema(conn, 200)
end
end
describe "get status source" do
setup do
%{conn: build_conn()}
end
test "it returns the source", %{conn: conn} do
user = insert(:user)
{:ok, activity} = CommonAPI.post(user, %{status: "mew mew #abc", spoiler_text: "#def"})
conn = get(conn, "/api/v1/statuses/#{activity.id}/source")
id = activity.id
assert %{"id" => ^id, "text" => "mew mew #abc", "spoiler_text" => "#def"} =
json_response_and_validate_schema(conn, 200)
end
end
describe "update status" do
setup do
oauth_access(["write:statuses"])
end
test "it updates the status" do
%{conn: conn, user: user} = oauth_access(["write:statuses", "read:statuses"])
{:ok, activity} = CommonAPI.post(user, %{status: "mew mew #abc", spoiler_text: "#def"})
conn
|> get("/api/v1/statuses/#{activity.id}")
|> json_response_and_validate_schema(200)
response =
conn
|> put_req_header("content-type", "application/json")
|> put("/api/v1/statuses/#{activity.id}", %{
"status" => "edited",
"spoiler_text" => "lol"
})
|> json_response_and_validate_schema(200)
assert response["content"] == "edited"
assert response["spoiler_text"] == "lol"
response =
conn
|> get("/api/v1/statuses/#{activity.id}")
|> json_response_and_validate_schema(200)
assert response["content"] == "edited"
assert response["spoiler_text"] == "lol"
end
test "it updates the attachments", %{conn: conn, user: user} do
attachment = insert(:attachment, user: user)
attachment_id = to_string(attachment.id)
{:ok, activity} = CommonAPI.post(user, %{status: "mew mew #abc", spoiler_text: "#def"})
response =
conn
|> put_req_header("content-type", "application/json")
|> put("/api/v1/statuses/#{activity.id}", %{
"status" => "mew mew #abc",
"spoiler_text" => "#def",
"media_ids" => [attachment_id]
})
|> json_response_and_validate_schema(200)
assert [%{"id" => ^attachment_id}] = response["media_attachments"]
end
test "it does not update visibility", %{conn: conn, user: user} do
{:ok, activity} =
CommonAPI.post(user, %{
status: "mew mew #abc",
spoiler_text: "#def",
visibility: "private"
})
response =
conn
|> put_req_header("content-type", "application/json")
|> put("/api/v1/statuses/#{activity.id}", %{
"status" => "edited",
"spoiler_text" => "lol"
})
|> json_response_and_validate_schema(200)
assert response["visibility"] == "private"
end
test "it refuses to update when original post is not by the user", %{conn: conn} do
another_user = insert(:user)
{:ok, activity} =
CommonAPI.post(another_user, %{status: "mew mew #abc", spoiler_text: "#def"})
conn
|> put_req_header("content-type", "application/json")
|> put("/api/v1/statuses/#{activity.id}", %{
"status" => "edited",
"spoiler_text" => "lol"
})
|> json_response_and_validate_schema(:forbidden)
end
test "it refuses to update an event", %{conn: conn, user: user} do
{:ok, activity} =
CommonAPI.event(user, %{
name: "I'm not a regular status",
status: "",
join_mode: "free",
start_time: DateTime.from_iso8601("2023-01-01T01:00:00.000Z") |> elem(1)
})
conn
|> put_req_header("content-type", "application/json")
|> put("/api/v1/statuses/#{activity.id}", %{
"status" => "edited"
})
|> json_response_and_validate_schema(:unprocessable_entity)
end
test "it returns 404 if the user cannot see the post", %{conn: conn} do
another_user = insert(:user)
{:ok, activity} =
CommonAPI.post(another_user, %{
status: "mew mew #abc",
spoiler_text: "#def",
visibility: "private"
})
conn
|> put_req_header("content-type", "application/json")
|> put("/api/v1/statuses/#{activity.id}", %{
"status" => "edited",
"spoiler_text" => "lol"
})
|> json_response_and_validate_schema(:not_found)
end
end
end
diff --git a/test/pleroma/web/pleroma_api/views/chat_message_reference_view_test.exs b/test/pleroma/web/pleroma_api/views/chat_message_reference_view_test.exs
index 44d40269c..c8b3cb391 100644
--- a/test/pleroma/web/pleroma_api/views/chat_message_reference_view_test.exs
+++ b/test/pleroma/web/pleroma_api/views/chat_message_reference_view_test.exs
@@ -1,94 +1,95 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.PleromaAPI.ChatMessageReferenceViewTest do
alias Pleroma.NullCache
use Pleroma.DataCase, async: true
alias Pleroma.Chat
alias Pleroma.Chat.MessageReference
alias Pleroma.Object
alias Pleroma.StaticStubbedConfigMock
alias Pleroma.UnstubbedConfigMock, as: ConfigMock
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.PleromaAPI.Chat.MessageReferenceView
import Mox
import Pleroma.Factory
test "it displays a chat message" do
user = insert(:user)
recipient = insert(:user)
ConfigMock
|> stub_with(Pleroma.Test.StaticConfig)
file = %Plug.Upload{
content_type: "image/jpeg",
path: Path.absname("test/fixtures/image.jpg"),
filename: "an_image.jpg"
}
{:ok, upload} = ActivityPub.upload(file, actor: recipient.ap_id)
{:ok, activity} =
CommonAPI.post_chat_message(user, recipient, "kippis :firefox:", idempotency_key: "123")
chat = Chat.get(user.id, recipient.ap_id)
object = Object.normalize(activity, fetch: false)
cm_ref = MessageReference.for_chat_and_object(chat, object)
id = cm_ref.id
Pleroma.CachexMock
|> stub(:get, fn
:chat_message_id_idempotency_key_cache, ^id -> {:ok, "123"}
cache, key -> NullCache.get(cache, key)
end)
+ |> stub(:fetch, fn :rich_media_cache, _, _ -> {:ok, {:ok, %{}}} end)
chat_message = MessageReferenceView.render("show.json", chat_message_reference: cm_ref)
assert chat_message[:id] == cm_ref.id
assert chat_message[:content] == "kippis :firefox:"
assert chat_message[:account_id] == user.id
assert chat_message[:chat_id]
assert chat_message[:created_at]
assert chat_message[:unread] == false
assert match?([%{shortcode: "firefox"}], chat_message[:emojis])
assert chat_message[:idempotency_key] == "123"
StaticStubbedConfigMock
|> stub(:get, fn
[:rich_media, :enabled] -> true
path -> Pleroma.Test.StaticConfig.get(path)
end)
Tesla.Mock.mock_global(fn
%{url: "https://example.com/ogp"} ->
%Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/ogp.html")}
end)
{:ok, activity} =
CommonAPI.post_chat_message(recipient, user, "gkgkgk https://example.com/ogp",
media_id: upload.id
)
object = Object.normalize(activity, fetch: false)
cm_ref = MessageReference.for_chat_and_object(chat, object)
chat_message_two = MessageReferenceView.render("show.json", chat_message_reference: cm_ref)
assert chat_message_two[:id] == cm_ref.id
assert chat_message_two[:content] == object.data["content"]
assert chat_message_two[:account_id] == recipient.id
assert chat_message_two[:chat_id] == chat_message[:chat_id]
assert chat_message_two[:attachment]
assert chat_message_two[:unread] == true
assert chat_message_two[:card]
end
end
diff --git a/test/pleroma/web/rich_media/helpers_test.exs b/test/pleroma/web/rich_media/helpers_test.exs
index 3ef5705ce..13d2341ad 100644
--- a/test/pleroma/web/rich_media/helpers_test.exs
+++ b/test/pleroma/web/rich_media/helpers_test.exs
@@ -1,111 +1,137 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.RichMedia.HelpersTest do
- use Pleroma.DataCase, async: true
+ use Pleroma.DataCase, async: false
alias Pleroma.StaticStubbedConfigMock, as: ConfigMock
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.RichMedia.Helpers
import Mox
import Pleroma.Factory
import Tesla.Mock
setup do
- mock(fn env -> apply(HttpRequestMock, :request, [env]) end)
+ mock_global(fn env -> apply(HttpRequestMock, :request, [env]) end)
ConfigMock
|> stub(:get, fn
[:rich_media, :enabled] -> false
path -> Pleroma.Test.StaticConfig.get(path)
end)
|> stub(:get, fn
path, default -> Pleroma.Test.StaticConfig.get(path, default)
end)
:ok
end
test "refuses to crawl incomplete URLs" do
user = insert(:user)
{:ok, activity} =
CommonAPI.post(user, %{
status: "[test](example.com/ogp)",
content_type: "text/markdown"
})
ConfigMock
|> stub(:get, fn
[:rich_media, :enabled] -> true
path -> Pleroma.Test.StaticConfig.get(path)
end)
assert %{} == Pleroma.Web.RichMedia.Helpers.fetch_data_for_activity(activity)
end
test "refuses to crawl malformed URLs" do
user = insert(:user)
{:ok, activity} =
CommonAPI.post(user, %{
status: "[test](example.com[]/ogp)",
content_type: "text/markdown"
})
ConfigMock
|> stub(:get, fn
[:rich_media, :enabled] -> true
path -> Pleroma.Test.StaticConfig.get(path)
end)
assert %{} == Pleroma.Web.RichMedia.Helpers.fetch_data_for_activity(activity)
end
test "crawls valid, complete URLs" do
user = insert(:user)
{:ok, activity} =
CommonAPI.post(user, %{
status: "[test](https://example.com/ogp)",
content_type: "text/markdown"
})
ConfigMock
|> stub(:get, fn
[:rich_media, :enabled] -> true
path -> Pleroma.Test.StaticConfig.get(path)
end)
assert %{page_url: "https://example.com/ogp", rich_media: _} =
Pleroma.Web.RichMedia.Helpers.fetch_data_for_activity(activity)
end
- # This does not seem to work. The urls are being fetched.
- @tag skip: true
+ test "recrawls URLs on updates" do
+ original_url = "https://google.com/"
+ updated_url = "https://yahoo.com/"
+
+ Pleroma.StaticStubbedConfigMock
+ |> stub(:get, fn
+ [:rich_media, :enabled] -> true
+ path -> Pleroma.Test.StaticConfig.get(path)
+ end)
+
+ user = insert(:user)
+ {:ok, activity} = CommonAPI.post(user, %{status: "I like this site #{original_url}"})
+
+ assert match?(
+ %{page_url: ^original_url, rich_media: _},
+ Pleroma.Web.RichMedia.Helpers.fetch_data_for_activity(activity)
+ )
+
+ {:ok, _} = CommonAPI.update(user, activity, %{status: "I like this site #{updated_url}"})
+
+ activity = Pleroma.Activity.get_by_id(activity.id)
+
+ assert match?(
+ %{page_url: ^updated_url, rich_media: _},
+ Pleroma.Web.RichMedia.Helpers.fetch_data_for_activity(activity)
+ )
+ end
+
test "refuses to crawl URLs of private network from posts" do
user = insert(:user)
{:ok, activity} =
CommonAPI.post(user, %{status: "http://127.0.0.1:4000/notice/9kCP7VNyPJXFOXDrgO"})
{:ok, activity2} = CommonAPI.post(user, %{status: "https://10.111.10.1/notice/9kCP7V"})
{:ok, activity3} = CommonAPI.post(user, %{status: "https://172.16.32.40/notice/9kCP7V"})
{:ok, activity4} = CommonAPI.post(user, %{status: "https://192.168.10.40/notice/9kCP7V"})
{:ok, activity5} = CommonAPI.post(user, %{status: "https://pleroma.local/notice/9kCP7V"})
ConfigMock
|> stub(:get, fn
[:rich_media, :enabled] -> true
path -> Pleroma.Test.StaticConfig.get(path)
end)
- assert %{} = Helpers.fetch_data_for_activity(activity)
- assert %{} = Helpers.fetch_data_for_activity(activity2)
- assert %{} = Helpers.fetch_data_for_activity(activity3)
- assert %{} = Helpers.fetch_data_for_activity(activity4)
- assert %{} = Helpers.fetch_data_for_activity(activity5)
+ assert %{} == Helpers.fetch_data_for_activity(activity)
+ assert %{} == Helpers.fetch_data_for_activity(activity2)
+ assert %{} == Helpers.fetch_data_for_activity(activity3)
+ assert %{} == Helpers.fetch_data_for_activity(activity4)
+ assert %{} == Helpers.fetch_data_for_activity(activity5)
end
end
diff --git a/test/pleroma/web/rich_media/parser_test.exs b/test/pleroma/web/rich_media/parser_test.exs
index 9064138a6..a05b89a2b 100644
--- a/test/pleroma/web/rich_media/parser_test.exs
+++ b/test/pleroma/web/rich_media/parser_test.exs
@@ -1,176 +1,107 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.RichMedia.ParserTest do
- use ExUnit.Case, async: true
+ use Pleroma.DataCase, async: false
alias Pleroma.Web.RichMedia.Parser
- setup do
- Tesla.Mock.mock(fn
- %{
- method: :get,
- url: "http://example.com/ogp"
- } ->
- %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/ogp.html")}
-
- %{
- method: :get,
- url: "http://example.com/non-ogp"
- } ->
- %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/non_ogp_embed.html")}
-
- %{
- method: :get,
- url: "http://example.com/ogp-missing-title"
- } ->
- %Tesla.Env{
- status: 200,
- body: File.read!("test/fixtures/rich_media/ogp-missing-title.html")
- }
-
- %{
- method: :get,
- url: "http://example.com/twitter-card"
- } ->
- %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/twitter_card.html")}
-
- %{
- method: :get,
- url: "http://example.com/oembed"
- } ->
- %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/oembed.html")}
-
- %{
- method: :get,
- url: "http://example.com/oembed.json"
- } ->
- %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/oembed.json")}
-
- %{method: :get, url: "http://example.com/empty"} ->
- %Tesla.Env{status: 200, body: "hello"}
+ import Tesla.Mock
- %{method: :get, url: "http://example.com/malformed"} ->
- %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/malformed-data.html")}
-
- %{method: :get, url: "http://example.com/error"} ->
- {:error, :overload}
-
- %{
- method: :head,
- url: "http://example.com/huge-page"
- } ->
- %Tesla.Env{
- status: 200,
- headers: [{"content-length", "2000001"}, {"content-type", "text/html"}]
- }
-
- %{
- method: :head,
- url: "http://example.com/pdf-file"
- } ->
- %Tesla.Env{
- status: 200,
- headers: [{"content-length", "1000000"}, {"content-type", "application/pdf"}]
- }
-
- %{method: :head} ->
- %Tesla.Env{status: 404, body: "", headers: []}
- end)
-
- :ok
+ setup do
+ mock_global(fn env -> apply(HttpRequestMock, :request, [env]) end)
end
test "returns error when no metadata present" do
- assert {:error, _} = Parser.parse("http://example.com/empty")
+ assert {:error, _} = Parser.parse("https://example.com/empty")
end
test "doesn't just add a title" do
- assert {:error, {:invalid_metadata, _}} = Parser.parse("http://example.com/non-ogp")
+ assert {:error, {:invalid_metadata, _}} = Parser.parse("https://example.com/non-ogp")
end
test "parses ogp" do
- assert Parser.parse("http://example.com/ogp") ==
+ assert Parser.parse("https://example.com/ogp") ==
{:ok,
%{
"image" => "http://ia.media-imdb.com/images/rock.jpg",
"title" => "The Rock",
"description" =>
"Directed by Michael Bay. With Sean Connery, Nicolas Cage, Ed Harris, John Spencer.",
"type" => "video.movie",
- "url" => "http://example.com/ogp"
+ "url" => "https://example.com/ogp"
}}
end
test "falls back to <title> when ogp:title is missing" do
- assert Parser.parse("http://example.com/ogp-missing-title") ==
+ assert Parser.parse("https://example.com/ogp-missing-title") ==
{:ok,
%{
"image" => "http://ia.media-imdb.com/images/rock.jpg",
"title" => "The Rock (1996)",
"description" =>
"Directed by Michael Bay. With Sean Connery, Nicolas Cage, Ed Harris, John Spencer.",
"type" => "video.movie",
- "url" => "http://example.com/ogp-missing-title"
+ "url" => "https://example.com/ogp-missing-title"
}}
end
test "parses twitter card" do
- assert Parser.parse("http://example.com/twitter-card") ==
+ assert Parser.parse("https://example.com/twitter-card") ==
{:ok,
%{
"card" => "summary",
"site" => "@flickr",
"image" => "https://farm6.staticflickr.com/5510/14338202952_93595258ff_z.jpg",
"title" => "Small Island Developing States Photo Submission",
"description" => "View the album on Flickr.",
- "url" => "http://example.com/twitter-card"
+ "url" => "https://example.com/twitter-card"
}}
end
test "parses OEmbed and filters HTML tags" do
- assert Parser.parse("http://example.com/oembed") ==
+ assert Parser.parse("https://example.com/oembed") ==
{:ok,
%{
"author_name" => "\u202E\u202D\u202Cbees\u202C",
"author_url" => "https://www.flickr.com/photos/bees/",
"cache_age" => 3600,
"flickr_type" => "photo",
"height" => "768",
"html" =>
"<a href=\"https://www.flickr.com/photos/bees/2362225867/\" title=\"Bacon Lollys by \u202E\u202D\u202Cbees\u202C, on Flickr\"><img src=\"https://farm4.staticflickr.com/3040/2362225867_4a87ab8baf_b.jpg\" width=\"1024\" height=\"768\" alt=\"Bacon Lollys\"/></a>",
"license" => "All Rights Reserved",
"license_id" => 0,
"provider_name" => "Flickr",
"provider_url" => "https://www.flickr.com/",
"thumbnail_height" => 150,
"thumbnail_url" =>
"https://farm4.staticflickr.com/3040/2362225867_4a87ab8baf_q.jpg",
"thumbnail_width" => 150,
"title" => "Bacon Lollys",
"type" => "photo",
- "url" => "http://example.com/oembed",
+ "url" => "https://example.com/oembed",
"version" => "1.0",
"web_page" => "https://www.flickr.com/photos/bees/2362225867/",
"web_page_short_url" => "https://flic.kr/p/4AK2sc",
"width" => "1024"
}}
end
test "rejects invalid OGP data" do
- assert {:error, _} = Parser.parse("http://example.com/malformed")
+ assert {:error, _} = Parser.parse("https://example.com/malformed")
end
test "returns error if getting page was not successful" do
- assert {:error, :overload} = Parser.parse("http://example.com/error")
+ assert {:error, :overload} = Parser.parse("https://example.com/error")
end
test "does a HEAD request to check if the body is too large" do
- assert {:error, :body_too_large} = Parser.parse("http://example.com/huge-page")
+ assert {:error, :body_too_large} = Parser.parse("https://example.com/huge-page")
end
test "does a HEAD request to check if the body is html" do
- assert {:error, {:content_type, _}} = Parser.parse("http://example.com/pdf-file")
+ assert {:error, {:content_type, _}} = Parser.parse("https://example.com/pdf-file")
end
end
diff --git a/test/support/cachex_proxy.ex b/test/support/cachex_proxy.ex
index 83ae5610f..8f27986a9 100644
--- a/test/support/cachex_proxy.ex
+++ b/test/support/cachex_proxy.ex
@@ -1,40 +1,46 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.CachexProxy do
@behaviour Pleroma.Caching
@impl true
defdelegate get!(cache, key), to: Cachex
@impl true
defdelegate stream!(cache, key), to: Cachex
@impl true
defdelegate put(cache, key, value, options), to: Cachex
@impl true
defdelegate put(cache, key, value), to: Cachex
@impl true
defdelegate get_and_update(cache, key, func), to: Cachex
@impl true
defdelegate get(cache, key), to: Cachex
@impl true
defdelegate fetch!(cache, key, func), to: Cachex
+ @impl true
+ defdelegate fetch(cache, key, func), to: Cachex
+
@impl true
defdelegate expire_at(cache, str, num), to: Cachex
+ @impl true
+ defdelegate expire(cache, str, num), to: Cachex
+
@impl true
defdelegate exists?(cache, key), to: Cachex
@impl true
defdelegate del(cache, key), to: Cachex
@impl true
defdelegate execute!(cache, func), to: Cachex
end
diff --git a/test/support/http_request_mock.ex b/test/support/http_request_mock.ex
index 32b71cf2f..5ca26d6e1 100644
--- a/test/support/http_request_mock.ex
+++ b/test/support/http_request_mock.ex
@@ -1,1580 +1,1646 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule HttpRequestMock do
require Logger
def activitypub_object_headers, do: [{"content-type", "application/activity+json"}]
def request(
%Tesla.Env{
url: url,
method: method,
headers: headers,
query: query,
body: body
} = _env
) do
with {:ok, res} <- apply(__MODULE__, method, [url, query, body, headers]) do
res
else
error ->
with {:error, message} <- error do
Logger.warning(to_string(message))
end
{_, _r} = error
end
end
# GET Requests
#
def get(url, query \\ [], body \\ [], headers \\ [])
def get("https://osada.macgirvin.com/channel/mike", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https___osada.macgirvin.com_channel_mike.json"),
headers: activitypub_object_headers()
}}
end
def get("https://shitposter.club/users/moonman", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/moonman@shitposter.club.json"),
headers: activitypub_object_headers()
}}
end
def get("https://mastodon.social/users/emelie/statuses/101849165031453009", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/status.emelie.json"),
headers: activitypub_object_headers()
}}
end
def get("https://mastodon.social/users/emelie/statuses/101849165031453404", _, _, _) do
{:ok,
%Tesla.Env{
status: 404,
body: ""
}}
end
def get("https://mastodon.social/users/emelie", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/emelie.json"),
headers: activitypub_object_headers()
}}
end
def get("https://mastodon.social/users/not_found", _, _, _) do
{:ok, %Tesla.Env{status: 404}}
end
def get("https://mastodon.sdf.org/users/rinpatch", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/rinpatch.json"),
headers: activitypub_object_headers()
}}
end
def get("https://mastodon.sdf.org/users/rinpatch/collections/featured", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body:
File.read!("test/fixtures/users_mock/masto_featured.json")
|> String.replace("{{domain}}", "mastodon.sdf.org")
|> String.replace("{{nickname}}", "rinpatch"),
headers: [{"content-type", "application/activity+json"}]
}}
end
def get("https://patch.cx/objects/tesla_mock/poll_attachment", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/poll_attachment.json"),
headers: activitypub_object_headers()
}}
end
def get(
"https://mastodon.social/.well-known/webfinger?resource=https://mastodon.social/users/emelie",
_,
_,
_
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/webfinger_emelie.json"),
headers: activitypub_object_headers()
}}
end
def get(
"https://osada.macgirvin.com/.well-known/webfinger?resource=acct:mike@osada.macgirvin.com",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mike@osada.macgirvin.com.json"),
headers: [{"content-type", "application/jrd+json"}]
}}
end
def get(
"https://social.heldscal.la/.well-known/webfinger?resource=https://social.heldscal.la/user/29191",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https___social.heldscal.la_user_29191.xml")
}}
end
def get(
"https://pawoo.net/.well-known/webfinger?resource=acct:https://pawoo.net/users/pekorino",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https___pawoo.net_users_pekorino.xml")
}}
end
def get(
"https://social.stopwatchingus-heidelberg.de/.well-known/webfinger?resource=acct:https://social.stopwatchingus-heidelberg.de/user/18330",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/atarifrosch_webfinger.xml")
}}
end
def get(
"https://social.heldscal.la/.well-known/webfinger?resource=nonexistent@social.heldscal.la",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/nonexistent@social.heldscal.la.xml")
}}
end
def get(
"https://squeet.me/xrd/?uri=acct:lain@squeet.me",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/lain_squeet.me_webfinger.xml"),
headers: [{"content-type", "application/xrd+xml"}]
}}
end
def get(
"https://mst3k.interlinked.me/users/luciferMysticus",
_,
_,
[{"accept", "application/activity+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/lucifermysticus.json"),
headers: activitypub_object_headers()
}}
end
def get("https://prismo.news/@mxb", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https___prismo.news__mxb.json"),
headers: activitypub_object_headers()
}}
end
def get(
"https://hubzilla.example.org/channel/kaniini",
_,
_,
[{"accept", "application/activity+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/kaniini@hubzilla.example.org.json"),
headers: activitypub_object_headers()
}}
end
def get("https://niu.moe/users/rye", _, _, [{"accept", "application/activity+json"}]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/rye.json"),
headers: activitypub_object_headers()
}}
end
def get("https://n1u.moe/users/rye", _, _, [{"accept", "application/activity+json"}]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/rye.json"),
headers: activitypub_object_headers()
}}
end
def get("http://mastodon.example.org/users/admin/statuses/100787282858396771", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body:
File.read!(
"test/fixtures/tesla_mock/http___mastodon.example.org_users_admin_status_1234.json"
)
}}
end
def get("https://puckipedia.com/", _, _, [{"accept", "application/activity+json"}]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/puckipedia.com.json"),
headers: activitypub_object_headers()
}}
end
def get("https://peertube.moe/accounts/7even", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/7even.json"),
headers: activitypub_object_headers()
}}
end
def get("https://peertube.stream/accounts/createurs", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/peertube/actor-person.json"),
headers: activitypub_object_headers()
}}
end
def get("https://peertube.moe/videos/watch/df5f464b-be8d-46fb-ad81-2d4c2d1630e3", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/peertube.moe-vid.json"),
headers: activitypub_object_headers()
}}
end
def get("https://framatube.org/accounts/framasoft", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https___framatube.org_accounts_framasoft.json"),
headers: activitypub_object_headers()
}}
end
def get("https://framatube.org/videos/watch/6050732a-8a7a-43d4-a6cd-809525a1d206", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/framatube.org-video.json"),
headers: activitypub_object_headers()
}}
end
def get("https://peertube.social/accounts/craigmaloney", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/craigmaloney.json"),
headers: activitypub_object_headers()
}}
end
def get("https://peertube.social/videos/watch/278d2b7c-0f38-4aaa-afe6-9ecc0c4a34fe", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/peertube-social.json"),
headers: activitypub_object_headers()
}}
end
def get("https://mobilizon.org/events/252d5816-00a3-4a89-a66f-15bf65c33e39", _, _, [
{"accept", "application/activity+json"}
]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mobilizon.org-event.json"),
headers: activitypub_object_headers()
}}
end
def get("https://mobilizon.org/@tcit", _, _, [{"accept", "application/activity+json"}]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mobilizon.org-user.json"),
headers: activitypub_object_headers()
}}
end
def get("https://baptiste.gelez.xyz/@/BaptisteGelez", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/baptiste.gelex.xyz-user.json"),
headers: activitypub_object_headers()
}}
end
def get("https://baptiste.gelez.xyz/~/PlumeDevelopment/this-month-in-plume-june-2018/", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/baptiste.gelex.xyz-article.json"),
headers: activitypub_object_headers()
}}
end
def get("https://wedistribute.org/wp-json/pterotype/v1/object/85810", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/wedistribute-article.json"),
headers: activitypub_object_headers()
}}
end
def get("https://wedistribute.org/wp-json/pterotype/v1/actor/-blog", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/wedistribute-user.json"),
headers: activitypub_object_headers()
}}
end
def get("http://mastodon.example.org/users/admin", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/admin@mastdon.example.org.json"),
headers: activitypub_object_headers()
}}
end
def get("http://mastodon.example.org/users/relay", _, _, [
{"accept", "application/activity+json"}
]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/relay@mastdon.example.org.json"),
headers: activitypub_object_headers()
}}
end
def get("http://mastodon.example.org/users/gargron", _, _, [
{"accept", "application/activity+json"}
]) do
{:error, :nxdomain}
end
def get("https://osada.macgirvin.com/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 404,
body: ""
}}
end
def get("http://mastodon.sdf.org/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/sdf.org_host_meta")
}}
end
def get("https://mastodon.sdf.org/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/sdf.org_host_meta")
}}
end
def get(
"https://mastodon.sdf.org/.well-known/webfinger?resource=https://mastodon.sdf.org/users/snowdusk",
_,
_,
_
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/snowdusk@sdf.org_host_meta.json")
}}
end
def get("http://mstdn.jp/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mstdn.jp_host_meta")
}}
end
def get("https://mstdn.jp/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mstdn.jp_host_meta")
}}
end
def get("https://mstdn.jp/.well-known/webfinger?resource=kpherox@mstdn.jp", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/kpherox@mstdn.jp.xml")
}}
end
def get("http://mamot.fr/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mamot.fr_host_meta")
}}
end
def get("https://mamot.fr/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mamot.fr_host_meta")
}}
end
def get("http://pawoo.net/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/pawoo.net_host_meta")
}}
end
def get("https://pawoo.net/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/pawoo.net_host_meta")
}}
end
def get(
"https://pawoo.net/.well-known/webfinger?resource=https://pawoo.net/users/pekorino",
_,
_,
_
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/pekorino@pawoo.net_host_meta.json"),
headers: activitypub_object_headers()
}}
end
def get("http://pleroma.soykaf.com/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/soykaf.com_host_meta")
}}
end
def get("https://pleroma.soykaf.com/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/soykaf.com_host_meta")
}}
end
def get("http://social.stopwatchingus-heidelberg.de/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/stopwatchingus-heidelberg.de_host_meta")
}}
end
def get("https://social.stopwatchingus-heidelberg.de/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/stopwatchingus-heidelberg.de_host_meta")
}}
end
def get(
"http://mastodon.example.org/@admin/99541947525187367",
_,
_,
_
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/mastodon-note-object.json"),
headers: activitypub_object_headers()
}}
end
def get("http://mastodon.example.org/@admin/99541947525187368", _, _, _) do
{:ok,
%Tesla.Env{
status: 404,
body: ""
}}
end
def get("https://shitposter.club/notice/7369654", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/7369654.html")
}}
end
def get("https://mstdn.io/users/mayuutann", _, _, [{"accept", "application/activity+json"}]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mayumayu.json"),
headers: activitypub_object_headers()
}}
end
def get(
"https://mstdn.io/users/mayuutann/statuses/99568293732299394",
_,
_,
[{"accept", "application/activity+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mayumayupost.json"),
headers: activitypub_object_headers()
}}
end
def get(url, _, _, [{"accept", "application/xrd+xml,application/jrd+json"}])
when url in [
"https://pleroma.soykaf.com/.well-known/webfinger?resource=acct:https://pleroma.soykaf.com/users/lain",
"https://pleroma.soykaf.com/.well-known/webfinger?resource=https://pleroma.soykaf.com/users/lain"
] do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https___pleroma.soykaf.com_users_lain.xml")
}}
end
def get(
"https://shitposter.club/.well-known/webfinger?resource=https://shitposter.club/user/1",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https___shitposter.club_user_1.xml")
}}
end
def get("https://testing.pleroma.lol/objects/b319022a-4946-44c5-9de9-34801f95507b", _, _, _) do
{:ok, %Tesla.Env{status: 200}}
end
def get(
"https://shitposter.club/.well-known/webfinger?resource=https://shitposter.club/user/5381",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/spc_5381_xrd.xml")
}}
end
def get("http://shitposter.club/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/shitposter.club_host_meta")
}}
end
def get("https://shitposter.club/notice/4027863", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/7369654.html")
}}
end
def get("http://social.sakamoto.gq/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/social.sakamoto.gq_host_meta")
}}
end
def get(
"https://social.sakamoto.gq/.well-known/webfinger?resource=https://social.sakamoto.gq/users/eal",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/eal_sakamoto.xml")
}}
end
def get("http://mastodon.social/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mastodon.social_host_meta")
}}
end
def get(
"https://mastodon.social/.well-known/webfinger?resource=https://mastodon.social/users/lambadalambda",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body:
File.read!("test/fixtures/tesla_mock/https___mastodon.social_users_lambadalambda.xml")
}}
end
def get(
"https://mastodon.social/.well-known/webfinger?resource=acct:not_found@mastodon.social",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok, %Tesla.Env{status: 404}}
end
def get("http://gs.example.org/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/gs.example.org_host_meta")
}}
end
def get(
"http://gs.example.org/.well-known/webfinger?resource=http://gs.example.org:4040/index.php/user/1",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body:
File.read!("test/fixtures/tesla_mock/http___gs.example.org_4040_index.php_user_1.xml")
}}
end
def get(
"http://gs.example.org:4040/index.php/user/1",
_,
_,
[{"accept", "application/activity+json"}]
) do
{:ok, %Tesla.Env{status: 406, body: ""}}
end
def get("https://squeet.me/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{status: 200, body: File.read!("test/fixtures/tesla_mock/squeet.me_host_meta")}}
end
def get(
"https://squeet.me/xrd?uri=lain@squeet.me",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/lain_squeet.me_webfinger.xml")
}}
end
def get(
"https://social.heldscal.la/.well-known/webfinger?resource=acct:shp@social.heldscal.la",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/shp@social.heldscal.la.xml"),
headers: [{"content-type", "application/xrd+xml"}]
}}
end
def get(
"https://social.heldscal.la/.well-known/webfinger?resource=acct:invalid_content@social.heldscal.la",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok, %Tesla.Env{status: 200, body: "", headers: [{"content-type", "application/jrd+json"}]}}
end
def get("https://framatube.org/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/framatube.org_host_meta")
}}
end
def get(
"https://framatube.org/main/xrd?uri=acct:framasoft@framatube.org",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
headers: [{"content-type", "application/jrd+json"}],
body: File.read!("test/fixtures/tesla_mock/framasoft@framatube.org.json")
}}
end
def get("http://gnusocial.de/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/gnusocial.de_host_meta")
}}
end
def get(
"http://gnusocial.de/main/xrd?uri=winterdienst@gnusocial.de",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/winterdienst_webfinger.json"),
headers: activitypub_object_headers()
}}
end
def get("https://status.alpicola.com/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/status.alpicola.com_host_meta")
}}
end
def get("https://macgirvin.com/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/macgirvin.com_host_meta")
}}
end
def get("https://gerzilla.de/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/gerzilla.de_host_meta")
}}
end
def get(
"https://gerzilla.de/xrd/?uri=acct:kaniini@gerzilla.de",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
headers: [{"content-type", "application/jrd+json"}],
body: File.read!("test/fixtures/tesla_mock/kaniini@gerzilla.de.json")
}}
end
def get(
"https://social.heldscal.la/.well-known/webfinger?resource=https://social.heldscal.la/user/23211",
_,
_,
_
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https___social.heldscal.la_user_23211.xml")
}}
end
def get("http://social.heldscal.la/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/social.heldscal.la_host_meta")
}}
end
def get("https://social.heldscal.la/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/social.heldscal.la_host_meta")
}}
end
def get("https://mastodon.social/users/lambadalambda", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/lambadalambda.json"),
headers: activitypub_object_headers()
}}
end
def get("https://mastodon.social/users/lambadalambda/collections/featured", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body:
File.read!("test/fixtures/users_mock/masto_featured.json")
|> String.replace("{{domain}}", "mastodon.social")
|> String.replace("{{nickname}}", "lambadalambda"),
headers: activitypub_object_headers()
}}
end
def get("https://apfed.club/channel/indio", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/osada-user-indio.json"),
headers: activitypub_object_headers()
}}
end
def get("https://social.heldscal.la/user/23211", _, _, [{"accept", "application/activity+json"}]) do
{:ok, Tesla.Mock.json(%{"id" => "https://social.heldscal.la/user/23211"}, status: 200)}
end
def get("http://example.com/ogp", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/ogp.html")}}
end
def get("https://example.com/ogp", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/ogp.html")}}
end
def get("https://pleroma.local/notice/9kCP7V", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/ogp.html")}}
end
def get("http://localhost:4001/users/masto_closed/followers", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/users_mock/masto_closed_followers.json"),
headers: activitypub_object_headers()
}}
end
def get("http://localhost:4001/users/masto_closed/followers?page=1", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/users_mock/masto_closed_followers_page.json"),
headers: activitypub_object_headers()
}}
end
def get("http://localhost:4001/users/masto_closed/following", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/users_mock/masto_closed_following.json"),
headers: activitypub_object_headers()
}}
end
def get("http://localhost:4001/users/masto_closed/following?page=1", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/users_mock/masto_closed_following_page.json"),
headers: activitypub_object_headers()
}}
end
def get("http://localhost:8080/followers/fuser3", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/users_mock/friendica_followers.json"),
headers: activitypub_object_headers()
}}
end
def get("http://localhost:8080/following/fuser3", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/users_mock/friendica_following.json"),
headers: activitypub_object_headers()
}}
end
def get("http://localhost:4001/users/fuser2/followers", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/users_mock/pleroma_followers.json"),
headers: activitypub_object_headers()
}}
end
def get("http://localhost:4001/users/fuser2/following", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/users_mock/pleroma_following.json"),
headers: activitypub_object_headers()
}}
end
def get("http://domain-with-errors:4001/users/fuser1/followers", _, _, _) do
{:ok,
%Tesla.Env{
status: 504,
body: ""
}}
end
def get("http://domain-with-errors:4001/users/fuser1/following", _, _, _) do
{:ok,
%Tesla.Env{
status: 504,
body: ""
}}
end
def get("http://example.com/ogp-missing-data", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/rich_media/ogp-missing-data.html")
}}
end
def get("https://example.com/ogp-missing-data", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/rich_media/ogp-missing-data.html")
}}
end
- def get("http://example.com/malformed", _, _, _) do
+ def get("https://example.com/malformed", _, _, _) do
{:ok,
%Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/malformed-data.html")}}
end
def get("http://example.com/empty", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: "hello"}}
end
def get("http://404.site" <> _, _, _, _) do
{:ok,
%Tesla.Env{
status: 404,
body: ""
}}
end
def get("https://404.site" <> _, _, _, _) do
{:ok,
%Tesla.Env{
status: 404,
body: ""
}}
end
def get(
"https://zetsubou.xn--q9jyb4c/.well-known/webfinger?resource=acct:lain@zetsubou.xn--q9jyb4c",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/lain.xml"),
headers: [{"content-type", "application/xrd+xml"}]
}}
end
def get(
"https://zetsubou.xn--q9jyb4c/.well-known/webfinger?resource=acct:https://zetsubou.xn--q9jyb4c/users/lain",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/lain.xml"),
headers: [{"content-type", "application/xrd+xml"}]
}}
end
def get("http://zetsubou.xn--q9jyb4c/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/host-meta-zetsubou.xn--q9jyb4c.xml")
}}
end
def get(
"https://zetsubou.xn--q9jyb4c/.well-known/host-meta",
_,
_,
_
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/host-meta-zetsubou.xn--q9jyb4c.xml")
}}
end
def get("http://lm.kazv.moe/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/lm.kazv.moe_host_meta")
}}
end
def get("https://lm.kazv.moe/.well-known/host-meta", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/lm.kazv.moe_host_meta")
}}
end
def get(
"https://lm.kazv.moe/.well-known/webfinger?resource=acct:mewmew@lm.kazv.moe",
_,
_,
[{"accept", "application/xrd+xml,application/jrd+json"}]
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https___lm.kazv.moe_users_mewmew.xml"),
headers: [{"content-type", "application/xrd+xml"}]
}}
end
def get("https://lm.kazv.moe/users/mewmew", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mewmew@lm.kazv.moe.json"),
headers: activitypub_object_headers()
}}
end
def get("https://lm.kazv.moe/users/mewmew/collections/featured", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body:
File.read!("test/fixtures/users_mock/masto_featured.json")
|> String.replace("{{domain}}", "lm.kazv.moe")
|> String.replace("{{nickname}}", "mewmew"),
headers: [{"content-type", "application/activity+json"}]
}}
end
def get("https://info.pleroma.site/activity.json", _, _, [
{"accept", "application/activity+json"}
]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https__info.pleroma.site_activity.json"),
headers: activitypub_object_headers()
}}
end
def get("https://info.pleroma.site/activity.json", _, _, _) do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
def get("https://info.pleroma.site/activity2.json", _, _, [
{"accept", "application/activity+json"}
]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https__info.pleroma.site_activity2.json"),
headers: activitypub_object_headers()
}}
end
def get("https://info.pleroma.site/activity2.json", _, _, _) do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
def get("https://info.pleroma.site/activity3.json", _, _, [
{"accept", "application/activity+json"}
]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https__info.pleroma.site_activity3.json"),
headers: activitypub_object_headers()
}}
end
def get("https://info.pleroma.site/activity3.json", _, _, _) do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
def get("https://mstdn.jp/.well-known/webfinger?resource=acct:kpherox@mstdn.jp", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/kpherox@mstdn.jp.xml"),
headers: [{"content-type", "application/xrd+xml"}]
}}
end
def get("https://10.111.10.1/notice/9kCP7V", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: ""}}
end
def get("https://172.16.32.40/notice/9kCP7V", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: ""}}
end
def get("https://192.168.10.40/notice/9kCP7V", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: ""}}
end
def get("https://www.patreon.com/posts/mastodon-2-9-and-28121681", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: ""}}
end
def get("http://mastodon.example.org/@admin/99541947525187367", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/mastodon-post-activity.json"),
headers: activitypub_object_headers()
}}
end
def get("https://info.pleroma.site/activity4.json", _, _, _) do
{:ok, %Tesla.Env{status: 500, body: "Error occurred"}}
end
def get("http://example.com/rel_me/anchor", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rel_me_anchor.html")}}
end
def get("http://example.com/rel_me/anchor_nofollow", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rel_me_anchor_nofollow.html")}}
end
def get("http://example.com/rel_me/link", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rel_me_link.html")}}
end
def get("http://example.com/rel_me/null", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rel_me_null.html")}}
end
def get("https://skippers-bin.com/notes/7x9tmrp97i", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/misskey_poll_no_end_date.json"),
headers: activitypub_object_headers()
}}
end
def get("https://example.org/emoji/firedfox.png", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/image.jpg")}}
end
def get("https://skippers-bin.com/users/7v1w1r8ce6", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/sjw.json"),
headers: activitypub_object_headers()
}}
end
def get("https://patch.cx/users/rin", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/rin.json"),
headers: activitypub_object_headers()
}}
end
def get(
"https://channels.tests.funkwhale.audio/federation/music/uploads/42342395-0208-4fee-a38d-259a6dae0871",
_,
_,
_
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/funkwhale_audio.json"),
headers: activitypub_object_headers()
}}
end
def get("https://channels.tests.funkwhale.audio/federation/actors/compositions", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/funkwhale_channel.json"),
headers: activitypub_object_headers()
}}
end
def get("http://example.com/rel_me/error", _, _, _) do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
def get("https://relay.mastodon.host/actor", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/relay/relay.json"),
headers: activitypub_object_headers()
}}
end
def get("http://localhost:4001/", _, "", [{"accept", "text/html"}]) do
{:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/tesla_mock/7369654.html")}}
end
def get("https://osada.macgirvin.com/", _, "", [{"accept", "text/html"}]) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/https___osada.macgirvin.com.html")
}}
end
def get("https://patch.cx/objects/a399c28e-c821-4820-bc3e-4afeb044c16f", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/emoji-in-summary.json"),
headers: activitypub_object_headers()
}}
end
def get("https://gleasonator.com/objects/102eb097-a18b-4cd5-abfc-f952efcb70bb", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/gleasonator-AG3RzWfwEKKrY63qj2.json"),
headers: activitypub_object_headers()
}}
end
def get("https://misskey.io/users/83ssedkv53", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/aimu@misskey.io.json"),
headers: activitypub_object_headers()
}}
end
def get("https://gleasonator.com/users/macgirvin", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/macgirvin@gleasonator.com.json"),
headers: activitypub_object_headers()
}}
end
def get("https://gleasonator.com/users/macgirvin/collections/featured", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body:
File.read!("test/fixtures/users_mock/masto_featured.json")
|> String.replace("{{domain}}", "gleasonator.com")
|> String.replace("{{nickname}}", "macgirvin"),
headers: activitypub_object_headers()
}}
end
def get("https://mk.absturztau.be/users/8ozbzjs3o8", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mametsuko@mk.absturztau.be.json"),
headers: activitypub_object_headers()
}}
end
def get("https://p.helene.moe/users/helene", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/helene@p.helene.moe.json"),
headers: activitypub_object_headers()
}}
end
def get("https://mk.absturztau.be/notes/93e7nm8wqg", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mk.absturztau.be-93e7nm8wqg.json"),
headers: activitypub_object_headers()
}}
end
def get("https://mk.absturztau.be/notes/93e7nm8wqg/activity", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/mk.absturztau.be-93e7nm8wqg-activity.json"),
headers: activitypub_object_headers()
}}
end
def get("https://p.helene.moe/objects/fd5910ac-d9dc-412e-8d1d-914b203296c4", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/p.helene.moe-AM7S6vZQmL6pI9TgPY.json"),
headers: activitypub_object_headers()
}}
end
def get("https://misskey.io/notes/8vs6wxufd0", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/misskey.io_8vs6wxufd0.json"),
headers: activitypub_object_headers()
}}
end
+ def get("https://google.com/", _, _, _) do
+ {:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/google.html")}}
+ end
+
+ def get("https://yahoo.com/", _, _, _) do
+ {:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/yahoo.html")}}
+ end
+
+ def get("https://example.com/error", _, _, _), do: {:error, :overload}
+
+ def get("https://example.com/ogp-missing-title", _, _, _) do
+ {:ok,
+ %Tesla.Env{
+ status: 200,
+ body: File.read!("test/fixtures/rich_media/ogp-missing-title.html")
+ }}
+ end
+
+ def get("https://example.com/oembed", _, _, _) do
+ {:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/oembed.html")}}
+ end
+
+ def get("https://example.com/oembed.json", _, _, _) do
+ {:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/oembed.json")}}
+ end
+
+ def get("https://example.com/twitter-card", _, _, _) do
+ {:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/twitter_card.html")}}
+ end
+
+ def get("https://example.com/non-ogp", _, _, _) do
+ {:ok,
+ %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/non_ogp_embed.html")}}
+ end
+
+ def get("https://example.com/empty", _, _, _) do
+ {:ok, %Tesla.Env{status: 200, body: "hello"}}
+ end
+
def get(
"https://nominatim.openstreetmap.org/search?format=geocodejson&q=Benis&limit=10&accept-language=en&addressdetails=1&namedetails=1",
_,
_,
_
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/nominatim_search_results.json"),
headers: [{"content-type", "application/json"}]
}}
end
def get(
"https://nominatim.openstreetmap.org/lookup?format=geocodejson&osm_ids=N3726208425,R3726208425,W3726208425&accept-language=en&addressdetails=1&namedetails=1",
_,
_,
_
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/tesla_mock/nominatim_single_result.json"),
headers: [{"content-type", "application/json"}]
}}
end
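# Catch-all clause: any GET not mocked above fails loudly with the full request
# details, so a test that hits an unexpected endpoint is easy to diagnose.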
def get(url, query, body, headers) do
{:error,
"Mock response not implemented for GET #{inspect(url)}, #{query}, #{inspect(body)}, #{inspect(headers)}"}
end
# POST Requests
#
def post(url, query \\ [], body \\ [], headers \\ [])
def post("https://relay.mastodon.host/inbox", _, _, _) do
{:ok, %Tesla.Env{status: 200, body: ""}}
end
def post("http://example.org/needs_refresh", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: ""
}}
end
def post("http://mastodon.example.org/inbox", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: ""
}}
end
def post("https://hubzilla.example.org/inbox", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: ""
}}
end
def post("http://gs.example.org/index.php/main/salmon/user/1", _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: ""
}}
end
def post("http://200.site" <> _, _, _, _) do
{:ok,
%Tesla.Env{
status: 200,
body: ""
}}
end
def post("http://connrefused.site" <> _, _, _, _) do
{:error, :connrefused}
end
def post("http://404.site" <> _, _, _, _) do
{:ok,
%Tesla.Env{
status: 404,
body: ""
}}
end
def post(url, query, body, headers) do
{:error,
"Mock response not implemented for POST #{inspect(url)}, #{query}, #{inspect(body)}, #{inspect(headers)}"}
end
# Most of the rich media mocks are missing HEAD requests, so we just return 404.
@rich_media_mocks [
+ "https://example.com/empty",
+ "https://example.com/error",
+ "https://example.com/malformed",
+ "https://example.com/non-ogp",
+ "https://example.com/oembed",
+ "https://example.com/oembed.json",
"https://example.com/ogp",
"https://example.com/ogp-missing-data",
- "https://example.com/twitter-card"
+ "https://example.com/ogp-missing-title",
+ "https://example.com/twitter-card",
+ "https://google.com/",
+ "https://pleroma.local/notice/9kCP7V",
+ "https://yahoo.com/"
]
+
def head(url, _query, _body, _headers) when url in @rich_media_mocks do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
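# The next two clauses answer HEAD with a non-HTML content-type and an
# over-sized content-length respectively; presumably used to exercise the
# type/size pre-checks callers perform before fetching a page body.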
+ def head("https://example.com/pdf-file", _, _, _) do
+ {:ok,
+ %Tesla.Env{
+ status: 200,
+ headers: [{"content-length", "1000000"}, {"content-type", "application/pdf"}]
+ }}
+ end
+
+ def head("https://example.com/huge-page", _, _, _) do
+ {:ok,
+ %Tesla.Env{
+ status: 200,
+ headers: [{"content-length", "2000001"}, {"content-type", "text/html"}]
+ }}
+ end
+
def head(url, query, body, headers) do
{:error,
"Mock response not implemented for HEAD #{inspect(url)}, #{query}, #{inspect(body)}, #{inspect(headers)}"}
end
end
diff --git a/test/support/null_cache.ex b/test/support/null_cache.ex
index 9f1d45f1d..47c84174e 100644
--- a/test/support/null_cache.ex
+++ b/test/support/null_cache.ex
@@ -1,49 +1,55 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2022 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.NullCache do
@moduledoc """
A module simulating a permanently empty cache.
"""
@behaviour Pleroma.Caching
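# A sketch of how a test environment might select this implementation, assuming
# the runtime cache module is looked up through a [:cachex, :provider] config
# key (the actual wiring may differ):
#
#   config :pleroma, :cachex, provider: Pleroma.NullCache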
@impl true
def get!(_, _), do: nil
@impl true
def put(_, _, _, _ \\ nil), do: {:ok, true}
@impl true
def stream!(_, _), do: []
@impl true
def get(_, _), do: {:ok, nil}
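# fetch!/3 and fetch/3 emulate a permanent cache miss: the fallback function is
# always invoked and its result returned, never stored.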
@impl true
def fetch!(_, key, func) do
case func.(key) do
{_, res} -> res
res -> res
end
end
+ @impl true
+ def fetch(_, key, func), do: func.(key)
+
@impl true
def get_and_update(_, _, func) do
func.(nil)
end
@impl true
def expire_at(_, _, _), do: {:ok, true}
+ @impl true
+ def expire(_, _, _), do: {:ok, true}
+
@impl true
def exists?(_, _), do: {:ok, false}
@impl true
def execute!(_, func) do
func.(:nothing)
end
@impl true
def del(_, _), do: {:ok, true}
end
