
Comparing changes

This is a direct comparison between two commits made in this repository or its related repositories.

base repository: TryGhost/ActivityPub
base: 715cad2b400c796ab887a2f37e3a390a35f385d7
head repository: TryGhost/ActivityPub
compare: d49cee31e4a6692082f1a95321f2fc38c8254214
Showing with 3,519 additions and 11,836 deletions.
  1. +5 −34 .github/workflows/build.yml
  2. +6 −0 README.md
  3. +23 −119 cedar/README.md
  4. +0 −5 cedar/args.sh
  5. +1 −0 cedar/data-generation/config.js
  6. +1 −1 cedar/data-generation/data-generate.sh
  7. +0 −4 cedar/data-generation/data-reset.sh
  8. +0 −92 cedar/data-generation/package-lock.json
  9. +0 −1 cedar/gcloud.sh
  10. +1 −1 cedar/query-runner/package.json
  11. +0 −3 cedar/query-runner/src/app.ts
  12. +153 −153 cedar/query-runner/yarn.lock
  13. +0 −27 cedar/setup.sh
  14. +0 −34 cedar/upload-data.sh
  15. +2 −3 docker-compose.yml
  16. +11 −6 features/create-note.feature
  17. +14 −14 features/create-reply.feature
  18. +0 −34 features/delete-post.feature
  19. +0 −136 features/feed.feature
  20. +13 −3 features/follow-account.feature
  21. +1 −0 features/followers.feature
  22. +6 −1 features/handle-announce.feature
  23. +8 −3 features/handle-create-article.feature
  24. +0 −37 features/handle-delete.feature
  25. +6 −1 features/handle-group-announce.feature
  26. +1 −1 features/handle-like.feature
  27. +24 −4 features/like-activity.feature
  28. +0 −72 features/profile-liked-post.feature
  29. +0 −65 features/profile-posts.feature
  30. +0 −47 features/repost-activity.feature
  31. +0 −8 features/site-data.feature
  32. +150 −545 features/step_definitions/stepdefs.js
  33. +0 −118 features/thread.feature
  34. +0 −13 jobs/migrate-site-inbox-to-notifications/Dockerfile
  35. +0 −108 jobs/migrate-site-inbox-to-notifications/README.md
  36. +0 −70 jobs/migrate-site-inbox-to-notifications/deploy.sh
  37. +0 −309 jobs/migrate-site-inbox-to-notifications/index.mjs
  38. +0 −6 jobs/migrate-site-inbox-to-notifications/package.json
  39. +0 −37 jobs/migrate-site-inbox-to-notifications/run.sh
  40. +0 −221 jobs/migrate-site-inbox-to-notifications/yarn.lock
  41. +1 −4 migrate/bin/up
  42. +0 −1 migrate/migrations/000007_add-posts-table.down.sql
  43. +0 −32 migrate/migrations/000007_add-posts-table.up.sql
  44. +0 −1 migrate/migrations/000008_add-likes-table.down.sql
  45. +0 −12 migrate/migrations/000008_add-likes-table.up.sql
  46. +0 −1 migrate/migrations/000009_add-reposts-table.down.sql
  47. +0 −12 migrate/migrations/000009_add-reposts-table.up.sql
  48. +0 −1 migrate/migrations/000010_add-feeds-table.down.sql
  49. +0 −21 migrate/migrations/000010_add-feeds-table.up.sql
  50. +0 −1 migrate/migrations/000011_add-unique-constraint-to-feeds.down.sql
  51. +0 −1 migrate/migrations/000011_add-unique-constraint-to-feeds.up.sql
  52. +0 −1 migrate/migrations/000012_add-uuid-to-accounts.down.sql
  53. +0 −1 migrate/migrations/000012_add-uuid-to-accounts.up.sql
  54. +0 −1 migrate/migrations/000013_remove-duplicate-articles.down.sql
  55. +0 −15 migrate/migrations/000013_remove-duplicate-articles.up.sql
  56. +0 −1 migrate/migrations/000014_remove-replies-in-feed.down.sql
  57. +0 −4 migrate/migrations/000014_remove-replies-in-feed.up.sql
  58. +0 −1 migrate/migrations/000015_add-attachments-to-posts.down.sql
  59. +0 −1 migrate/migrations/000015_add-attachments-to-posts.up.sql
  60. +0 −1 migrate/migrations/000016_add-deleted-column-to-posts.down.sql
  61. +0 −1 migrate/migrations/000016_add-deleted-column-to-posts.up.sql
  62. +0 −1 migrate/migrations/000017_drop_deleted_column_from_posts.down.sql
  63. +0 −1 migrate/migrations/000017_drop_deleted_column_from_posts.up.sql
  64. +0 −1 migrate/migrations/000018_add_deleted_at_column_to_posts.down.sql
  65. +0 −1 migrate/migrations/000018_add_deleted_at_column_to_posts.up.sql
  66. +0 −5 migrate/migrations/000019_add_generated_columns_and_indexes_to_key_value.down.sql
  67. +0 −21 migrate/migrations/000019_add_generated_columns_and_indexes_to_key_value.up.sql
  68. +0 −27 migrate/migrations/000020_add_generated_columns_and_indexes_to_key_value.down.sql
  69. +0 −17 migrate/migrations/000020_add_generated_columns_and_indexes_to_key_value.up.sql
  70. +0 −1 migrate/migrations/000021_remove-likes-of-deleted-posts.down.sql
  71. +0 −3 migrate/migrations/000021_remove-likes-of-deleted-posts.up.sql
  72. +0 −1 migrate/migrations/000022_add-published-at-to-feeds.down.sql
  73. +0 −1 migrate/migrations/000022_add-published-at-to-feeds.up.sql
  74. +0 −1 migrate/migrations/000023_add-index-to-published-at-for-feeds.down.sql
  75. +0 −1 migrate/migrations/000023_add-index-to-published-at-for-feeds.up.sql
  76. +0 −1 migrate/migrations/000024_set-published-date-to-created-at-in-feeds.down.sql
  77. +0 −5 migrate/migrations/000024_set-published-date-to-created-at-in-feeds.up.sql
  78. +0 −1 migrate/migrations/000025_set-published-date-to-post-published-date-for-posts-in-feeds.down.sql
  79. +0 −4 migrate/migrations/000025_set-published-date-to-post-published-date-for-posts-in-feeds.up.sql
  80. +0 −1 migrate/migrations/000026_add-notifications-table.down.sql
  81. +0 −17 migrate/migrations/000026_add-notifications-table.up.sql
  82. +10 −19 package.json
  83. +0 −135 src/account/account.entity.ts
  84. +0 −70 src/account/account.entity.unit.test.ts
  85. +0 −185 src/account/account.repository.knex.integration.test.ts
  86. +0 −119 src/account/account.repository.knex.ts
  87. +148 −193 src/account/account.service.integration.test.ts
  88. +37 −136 src/account/account.service.ts
  89. +0 −7 src/account/types.ts
  90. +0 −78 src/activity-handlers/create.handler.ts
  91. +0 −96 src/activity-handlers/delete.handler.ts
  92. +2 −2 src/activitypub/fedify-context.factory.test.ts
  93. +7 −7 src/activitypub/fedify-context.factory.ts
  94. +64 −0 src/activitypub/fediverse-bridge.test.ts
  95. +1 −37 src/activitypub/fediverse-bridge.ts
  96. +0 −157 src/activitypub/fediverse-bridge.unit.test.ts
  97. +0 −119 src/activitypub/followers.service.integration.test.ts
  98. +0 −47 src/activitypub/followers.service.ts
  99. +0 −16 src/activitypub/object-dispatchers/delete.dispatcher.ts
  100. +0 −65 src/activitypub/object-dispatchers/delete.dispatcher.unit.test.ts
  101. +52 −211 src/app.ts
  102. +0 −6 src/constants.ts
  103. +0 −7 src/core/base.entity.ts
  104. +0 −15 src/core/events.ts
  105. +0 −7 src/core/url.ts
  106. +56 −72 src/db.ts
  107. +392 −187 src/dispatchers.ts
  108. +210 −0 src/dispatchers.unit.test.ts
  109. +7 −0 src/example.unit.test.ts
  110. +0 −121 src/feed/feed-update.service.ts
  111. +0 −361 src/feed/feed-update.service.unit.test.ts
  112. +0 −791 src/feed/feed.service.integration.test.ts
  113. +115 −324 src/feed/feed.service.ts
  114. +0 −30 src/feed/feeds-updated.event.ts
  115. +216 −545 src/handlers.ts
  116. +9 −40 src/helpers/activitypub/activity.ts
  117. +0 −76 src/http/api/__snapshots__/feed.json
  118. +0 −82 src/http/api/__snapshots__/thread.json
  119. +30 −119 src/http/api/{account.ts → accounts.ts}
  120. +203 −58 src/http/api/activities.ts
  121. +0 −124 src/http/api/feed.ts
  122. +0 −177 src/http/api/feed.unit.test.ts
  123. +83 −0 src/http/api/feeds.ts
  124. +276 −0 src/http/api/follows.ts
  125. +186 −52 src/http/api/helpers/post.ts
  126. +437 −49 src/http/api/helpers/post.unit.test.ts
  127. +7 −7 src/http/api/index.ts
  128. +2 −15 src/http/api/{note.ts → notes.ts}
  129. +0 −78 src/http/api/post.ts
  130. +131 −0 src/http/api/posts.ts
  131. +0 −564 src/http/api/profile.ts
  132. +109 −0 src/http/api/profiles.ts
  133. +2 −2 src/http/api/search.ts
  134. +0 −48 src/http/api/thread.ts
  135. +0 −158 src/http/api/thread.unit.test.ts
  136. +7 −24 src/http/api/types.ts
  137. +0 −115 src/http/api/webhook.ts
  138. +96 −0 src/http/api/webhooks.ts
  139. +6 −27 src/knex.kvstore.integration.test.ts
  140. +11 −13 src/knex.kvstore.ts
  141. +0 −13 src/post/post-created.event.ts
  142. +0 −20 src/post/post-deleted.event.ts
  143. +0 −20 src/post/post-dereposted.event.ts
  144. +0 −20 src/post/post-reposted.event.ts
  145. +0 −366 src/post/post.entity.ts
  146. +0 −278 src/post/post.entity.unit.test.ts
  147. +0 −860 src/post/post.repository.knex.integration.test.ts
  148. +0 −780 src/post/post.repository.knex.ts
  149. +0 −494 src/post/post.service.ts
  150. +0 −4 src/publishing/__snapshots__/service/publish-note-create-activity.json
  151. +0 −4 src/publishing/__snapshots__/service/publish-note-publish-result.json
  152. +0 −4 src/publishing/__snapshots__/service/publish-post-create-activity.json
  153. +0 −4 src/publishing/__snapshots__/service/publish-post-publish-result.json
  154. +0 −14 src/publishing/content.ts
  155. +6 −4 src/publishing/service.ts
  156. +0 −4 src/publishing/types.ts
  157. +6 −22 src/site/site.service.integration.test.ts
  158. +28 −41 src/site/site.service.ts
  159. +0 −13 src/test/crypto-key-pair.ts
  160. +0 −66 src/test/db.ts
  161. +0 −6 vitest.config.ts
  162. +140 −245 yarn.lock
39 changes: 5 additions & 34 deletions .github/workflows/build.yml
@@ -21,21 +21,6 @@ jobs:
- name: Run Biome
run: biome ci .

check-yarn-lock:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4

- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: 'lts/*'
cache: 'yarn'

- name: Check yarn.lock
run: yarn install --frozen-lockfile

build-test-deploy:
name: Build, Test and Deploy
environment: build
@@ -50,7 +35,6 @@ jobs:
with:
images: |
europe-west4-docker.pkg.dev/ghost-activitypub/main/activitypub
europe-docker.pkg.dev/ghost-activitypub/activitypub/activitypub
tags: |
type=edge,branch=main
type=semver,pattern={{version}}
@@ -64,7 +48,6 @@ jobs:
with:
images: |
europe-west4-docker.pkg.dev/ghost-activitypub/main/migrations
europe-docker.pkg.dev/ghost-activitypub/activitypub/migrations
tags: |
type=edge,branch=main
type=semver,pattern={{version}}
@@ -89,28 +72,24 @@ jobs:
- name: "Run Tests"
run: yarn test

- name: "Login to GAR europe-west4"
- name: "Login to GAR"
if: github.ref == 'refs/heads/main'
uses: docker/login-action@v3
with:
registry: europe-west4-docker.pkg.dev
username: _json_key
password: ${{ secrets.SERVICE_ACCOUNT_KEY }}

- name: "Login to GAR europe"
uses: docker/login-action@v3
with:
registry: europe-docker.pkg.dev
username: _json_key
password: ${{ secrets.SERVICE_ACCOUNT_KEY }}

- name: "Push ActivityPub Docker Image"
if: github.ref == 'refs/heads/main'
uses: docker/build-push-action@v6
with:
context: .
push: true
tags: ${{ steps.activitypub-docker-metadata.outputs.tags }}

- name: "Push Migrations Docker Image"
if: github.ref == 'refs/heads/main'
uses: docker/build-push-action@v6
with:
context: migrate
@@ -132,15 +111,7 @@ jobs:
job: migrations
flags: '--wait --execute-now --set-cloudsql-instances=ghost-activitypub:europe-west4:activitypub-db'

- name: "Deploy ActivityPub Queue to Cloud Run"
if: github.ref == 'refs/heads/main'
uses: 'google-github-actions/deploy-cloudrun@v2'
with:
image: europe-west4-docker.pkg.dev/ghost-activitypub/main/activitypub:${{ steps.activitypub-docker-metadata.outputs.version }}
region: europe-west4
service: activitypub-sub

- name: "Deploy ActivityPub API to Cloud Run"
- name: "Deploy ActivityPub to Cloud Run"
if: github.ref == 'refs/heads/main'
uses: 'google-github-actions/deploy-cloudrun@v2'
with:
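
Taken together, these workflow edits drop the `check-yarn-lock` job, the duplicate `europe-docker.pkg.dev` registry, and the separate queue deployment, so images flow through a single Artifact Registry host. A rough hand-run sketch of the remaining push path is shown below; the `:edge` tag mirrors the `type=edge,branch=main` metadata rule, and the key file name is an assumption, not something the workflow defines:

```bash
# Illustrative sketch only -- not part of build.yml.
# Authenticate against the single remaining registry with a service-account key
# (the key file name is an assumption):
docker login europe-west4-docker.pkg.dev -u _json_key --password-stdin < service-account-key.json

# Build and push the ActivityPub image; the :edge tag mirrors the
# `type=edge,branch=main` rule in the metadata step:
docker build -t europe-west4-docker.pkg.dev/ghost-activitypub/main/activitypub:edge .
docker push europe-west4-docker.pkg.dev/ghost-activitypub/main/activitypub:edge
```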
6 changes: 6 additions & 0 deletions README.md
@@ -86,6 +86,12 @@ If you would like to run other commands you can run `docker compose exec -it mig

 

# Community leaderboard

![Leaderboard](https://github.com/TryGhost/ActivityPub/assets/115641230/371e8f36-8293-43d2-912a-772e56517e1d)

 

# Copyright & license

Copyright (c) 2013-2025 Ghost Foundation - Released under the [MIT license](LICENSE). Ghost and the Ghost Logo are trademarks of Ghost Foundation Ltd. Please see our [trademark policy](https://ghost.org/trademark/) for info on acceptable usage.
142 changes: 23 additions & 119 deletions cedar/README.md
@@ -1,30 +1,23 @@
# 🌳 Cedar

Cedar is a collection of tools for generating fake data and seeding a
database with it for use in testing and development.
Cedar is a collection of tools for generating fake data and seeding a database
with it for use in testing and development.

The code is split into the following directories:

- `schema`: SQL schema for the database.
- `data-generation`: Generating fake data and seeding the database
with it.
- `query-runner`: Running queries against the database.
- `schema`: SQL schema for the database.
- `data-generation`: Generating fake data and seeding the database with it.
- `query-runner`: Running queries against the database.

## Data Generation

These commands are all run in the `data-generation` directory.
These commands are all run in the data-generation directory

Generate fake data by creating a series of CSV files that can be
imported into a database.
Generate fake data by creating a series of CSV files that can be imported into a
database.

Adjust the `config.js` file to set the parameters used to control the
size of the data set that will be generated.

### Install dependencies

```bash
npm install
```
Adjust the `config.js` file to set the parameters used to control the size of
the data set that will be generated.

### Generate base data

@@ -58,133 +51,43 @@ npm install

### Prepare data to be exported elsewhere

This compresses the generated data files to make them easier to
transfer.

**Prerequisite**: Install `pigz` on your host machine (`brew install
pigz` on macOS)
This gzips the generated data files so they are easier to transfer

```bash
./data-export.sh
```

## Importing Data to GCP

This section guides you through setting up a Google Cloud Platform
database and importing the generated data into it.

### Prerequisites

- `gcloud` CLI configured with appropriate permissions
- Generated data files (from the steps above)

### Step 1: Set up gcloud environment

If you don't have `gcloud` set up on your host machine, use the
provided Docker container:

```bash
./gcloud.sh
```

This runs:

```bash
docker run -it --rm \
-v $HOME/.config/gcloud:/root/.config/gcloud \
-v $(pwd):/workspace \
-w /workspace \
google/cloud-sdk
```

You'll need to run `gcloud auth login` to setup your credentials the
first time.

### Step 2: Set up your GCP resources

Run the setup script to create a database, a storage bucket, upload the schema,
and apply it to the database:

```bash
./setup.sh
```

Run this from within the gcloud Docker container if you're using the `gcloud.sh`
script noted above.

The parameters used for setting up the resources can be configured in
the `args.sh` file.

### Step 3: Generate data

Follow the instructions in the [Data Generation](#data-generation)
section above to create your data files.

### Step 4: Upload data to GCP

Upload the compressed data files to your GCP storage bucket and import them
into the database:

```bash
./upload-data.sh
```

Run this from within the gcloud Docker container if you're using the `gcloud.sh`
script noted above.

### Step 5: Run queries

After importing, you can use the query runner as described in the next
section.

## Query Runner

These commands are all run in the `query-runner` directory.
These commands are all run in the query-runner directory

### Build Docker image

```bash
docker build . \
-t gcr.io/ghost-activitypub/activitypub-data-generator:latest
```
docker build . -t gcr.io/ghost-activitypub/activitypub-data-generator:latest .
```

### Push Docker image to GCP

```bash
```
docker push gcr.io/ghost-activitypub/activitypub-data-generator:latest
```

### Run Query Runner as Cloud Run Job

You can do this via the UI in the GCP Console.
You can do this via the UI in the GCP Console

### Local Testing

You can run this locally with the following commands, but a more
accurate test is to run this as a Cloud Run Job in GCP.

**Note:** For local testing, you may need to remove the platform
specification (`--platform=linux/amd64`) from the first line of the
Dockerfile.

**Prerequisite:** Ensure you have a local MySQL instance running that you
can connect to using the `MYSQL_HOST`, `MYSQL_USER`, `MYSQL_PASSWORD`,
and `MYSQL_DATABASE` environment variables defined in commands below. The database
will need to have the schema stored in `schema` applied to it. The database will
need to have the data files generated previously imported into it.
You can run this locally with the following commands, but a more accurate test
is to run this as a CloudRun Job in GCP. You will need to remove the platform
from the first line of the Dockerfile `--platform=linux/amd64`

Install dependencies:
```bash
yarn install
```

Build the local image:
```bash
docker build . -t query-runner:latest
```

Run with database connection details:
```bash
```
docker run --rm \
-e MYSQL_HOST=<ip> \
-e MYSQL_USER=<user> \
@@ -193,12 +96,13 @@ docker run --rm \
query-runner
```

Combined build and run command:
```bash
```
docker build . -t query-runner:latest && docker run --rm \
-e MYSQL_HOST=<ip> \
-e MYSQL_USER=<user> \
-e MYSQL_PASSWORD=<pass> \
-e MYSQL_DATABASE=activitypub_061224 \
query-runner
```
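
For anyone following the updated query-runner instructions, a local smoke test would look roughly like the block below. All connection values are placeholders, `host.docker.internal` is an assumption for reaching a MySQL server running on the Docker host, and the database is assumed to already have the `schema` SQL and the generated data loaded:

```bash
# Placeholder values -- adjust to your local MySQL setup.
# host.docker.internal is an assumption for reaching MySQL on the Docker host.
docker build . -t query-runner:latest

docker run --rm \
  -e MYSQL_HOST=host.docker.internal \
  -e MYSQL_USER=root \
  -e MYSQL_PASSWORD=secret \
  -e MYSQL_DATABASE=activitypub_061224 \
  query-runner:latest
```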


5 changes: 0 additions & 5 deletions cedar/args.sh

This file was deleted.

1 change: 1 addition & 0 deletions cedar/data-generation/config.js
@@ -1,6 +1,7 @@
const SCALING_FACTOR = 1;

const DATA_DIR = process.env.DATA_DIR || './data';
console.log(`Using data dir: ${DATA_DIR}`);

const NUM_SITES = Math.round(60_000 * SCALING_FACTOR);
const NUM_USERS = Math.round(300_000 * SCALING_FACTOR);
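
A quick way to see the new log line and the `./data` fallback is to invoke one of the generation scripts directly with the variable set; the entry-point name below is hypothetical:

```bash
# Hypothetical invocation -- the script name is illustrative only.
DATA_DIR=/tmp/cedar-data node generate-base-data.js
#   Using data dir: /tmp/cedar-data

# With DATA_DIR unset, config.js falls back to ./data:
node generate-base-data.js
#   Using data dir: ./data
```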
2 changes: 1 addition & 1 deletion cedar/data-generation/data-generate.sh
@@ -2,7 +2,7 @@

set -e

export DATA_DIR=./data
export DATA_DIR=./data/generated_$(date +%Y-%m-%d_%H-%M-%S)

mkdir -p $DATA_DIR

4 changes: 0 additions & 4 deletions cedar/data-generation/data-reset.sh

This file was deleted.
