195 Commits

Author SHA1 Message Date
Zed 4bf3df94f8 Fix segfault 2026-03-04 17:27:15 +01:00
Zed 2898efab6b Fix search repeating when the end has been reached 2026-03-04 11:56:53 +01:00
Zed b0773dd934 Fix incorrect multi-user search query
Fixes #1373
2026-03-04 11:08:42 +01:00
Zed d187b1cc3f Fix video thumbnails not loading
Fixes #1371
2026-02-22 07:02:45 +01:00
Zed 95a9ee8dc5 Update and speed up GitHub workflows (#1368)
* Update actions and switch to GitHub runners

* Bump workflow Python version to 3.14

* Reuse nitter build for integration test

* Add missing libpcre3 installation to workflow

* Consolidate workflow runtime deps install

* Make nitter binary executable

* Run nimble md and scss simultaneously in workflow

* Run tests with 4 workers in workflow

* Rerun failing integration tests

* Bump integration test workers to 5

* Improve python dep install and run less workers

* Use native GitHub Actions Redis service

* Lower integration test workers to 2

* Switch to poetry to cache venv

* Ensure poetry is installed before setup-python

* Fix poetry sync command

* Switch back to 3 workers

* Cache poetry install

* WIP

* WIP

* Fix poetry/pipx caching

* Speed up integration test significantly

* WIP

* Cleanup
2026-02-19 06:56:20 +01:00
Zed 61b6748d97 Add community notes to RSS 2026-02-19 02:03:10 +01:00
Zed 2bd664ae7d Add community notes support
Fixes #727
Fixes #1023
2026-02-19 01:44:50 +01:00
Zed a15d1ce16b Add full support for tweet edit history
Fixes #700
2026-02-16 00:52:17 +01:00
Zed f257ce53ae Bump style version 2026-02-14 02:38:10 +01:00
Zed d45545cd53 Fix "Replying to" parsing 2026-02-14 02:24:24 +01:00
Zed 90b664ffb7 Make "Tweet unavailable" clickable and consistent 2026-02-14 02:19:53 +01:00
Zed cbce620692 Add dynamic-range-limit to prevent HDR jumpscares
Fixes #1345
2026-02-12 20:16:50 +01:00
Zed 05b6dd2a43 Add config options to enable subset of RSS feeds
Fixes #1363
2026-02-11 23:49:50 +01:00
Zed dcec1eb458 Fix invalid search link formatting 2026-02-10 22:53:54 +01:00
Zed 1c06a67afd Support image alt text
Fixes #559
2026-02-10 22:43:10 +01:00
Zed 40b1ba4e4e Bump css version 2026-02-09 22:08:09 +01:00
Zed b85e8c5d7d Support preference overrides using URL params
Fixes #186
2026-02-09 21:54:57 +01:00
Zed db36f75519 Support restoring preferences via new prefs param
Fixes #1352
Fixes #553
Fixes #249
2026-02-09 20:23:31 +01:00
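The two preference commits above (URL-param overrides and the new prefs restore param) boil down to a precedence rule: URL parameters win over stored preferences, which win over defaults. A purely hypothetical sketch of that merge logic — the key names and defaults here are invented, and Nitter's real implementation is in Nim, not Python:

```python
# Hypothetical illustration of URL-param preference overrides.
# Key names and defaults are invented; they are not Nitter's actual settings.
DEFAULTS = {"theme": "auto", "stickyProfile": False}

def effective_prefs(stored: dict, query_params: dict) -> dict:
    """Merge precedence: URL params > stored preferences > defaults."""
    prefs = {**DEFAULTS, **stored}
    for key, value in query_params.items():
        if key in DEFAULTS:  # ignore unknown params
            prefs[key] = value
    return prefs
```

The same shape also covers restoring preferences from a single param: decode it into a dict and pass it as `query_params`.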
Zed 5d28bd18c6 Add preference for configuring sticky navbar
Fixes #1354
2026-02-09 17:38:14 +01:00
Zed 0a6e79e626 Add bulk script create_sessions_browser.py 2026-02-09 02:55:07 +01:00
Zed 33dd9b6668 Fix /pic/ exploit 2026-02-06 20:44:37 +01:00
cmj a45227b883 Add user-agent to guest_token request (#1359) 2026-01-29 17:27:41 +01:00
yav a92e79ebc3 Fix the checkmark position (#1347)
Co-authored-by: yav <796176@protonmail.com>
2025-12-24 02:22:20 -05:00
jackyzy823 baeaf685d3 Make maxConcurrentReqs configurable (#1341) 2025-12-08 04:05:08 -05:00
Zed 51b54852dc Add preliminary support for nitter-proxy 2025-12-06 05:15:01 +01:00
Zed 663f5a52e1 Improve headers 2025-12-06 05:00:34 +01:00
Zed 17fc2628f9 Minor fix 2025-11-30 18:07:27 +01:00
Zed e741385828 Allow , in username to support multiple users
Fixes #1329
2025-11-30 18:06:22 +01:00
Zed 693a189462 Add heuristics to detect when to show "Load more"
Fixes #1328
2025-11-30 05:43:17 +01:00
Zed 7734d976f7 Add username validation
Fixes #1317
2025-11-30 04:12:38 +01:00
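The two commits above allow comma-separated usernames in a path and validate each one. A hedged sketch of what that combination amounts to — the character rules follow Twitter's documented 1-15 character `[A-Za-z0-9_]` handle format, but the helper itself is invented, not Nitter's code:

```python
import re

# Twitter handles are 1-15 characters of letters, digits, and underscores.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{1,15}$")

def parse_usernames(path_segment: str) -> list[str]:
    """Split a comma-separated path like 'jack,nim_lang' and validate each part."""
    names = [name for name in path_segment.split(",") if name]
    invalid = [name for name in names if not USERNAME_RE.match(name)]
    if invalid:
        raise ValueError(f"invalid username(s): {invalid}")
    return names
```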
Zed a62ec9cbb4 Normalize headers 2025-11-30 03:58:43 +01:00
Zed 4b9aec6fde Use graphTweet for cookie sessions for now 2025-11-30 02:57:34 +01:00
Zed 064ec88080 Transition to ID-only RSS GUIDs on Dec 14, 2025
Fixes #447
2025-11-30 02:56:19 +01:00
Zed 71e65c84d7 Round video duration properly 2025-11-29 04:34:04 +01:00
Zed 436a873e4b Improve verified checkmark icon, css improvements 2025-11-29 04:33:49 +01:00
Zed 96ec75fc7f Add video duration to overlay
Fixes #498
2025-11-29 03:38:40 +01:00
Zed 7a08a9e132 Format css 2025-11-29 03:36:21 +01:00
Zed 31d210ca47 Add experimental x-client-transaction-id support (#1324)
* Add experimental x-client-transaction-id support

* Remove broken test
2025-11-29 01:13:08 +01:00
Zed dae68b4f13 Ignore null errors, they're internal API errors 2025-11-29 01:05:57 +01:00
Zed 8516ebe2b7 Fix 'key not found in object: expanded_url' error
Fixes #1318
2025-11-29 00:37:45 +01:00
Zed b83227aaf5 Implement temp fix for cookie sessions
Fixes #1319
2025-11-26 01:03:27 +01:00
Zed 404b06b5f3 Include "Video" and link for video tweets in RSS (#1315)
Fixes #836
2025-11-25 01:03:45 +01:00
Zed 2b922c049a Embed quote tweet in RSS (#1316)
Fixes #132
Closes #820
2025-11-25 01:02:45 +01:00
Zed 78101df2cc Style number input field 2025-11-24 23:04:25 +01:00
Zed 12bbddf204 Update search panel grid layout and animation 2025-11-24 23:04:25 +01:00
Zed 4979d07f2e Add spaces filter, remove broken filters 2025-11-24 23:04:25 +01:00
Zed f038b53fa2 Fix body font size to match x.com
Fixes #711
2025-11-24 23:04:25 +01:00
Zed 4748311f8d Fix intent/follow URL redirect
Fixes #629
2025-11-24 23:04:25 +01:00
Zed d47eb8f0eb Fix double slashes in url replacements
Fixes #520
2025-11-24 23:04:25 +01:00
Zed 1657eeb769 Fix canonical link causing redirects to Twitter
Fixes #526
2025-11-24 23:04:25 +01:00
Zed 25df682094 Expose username as HTML attribute
Fixes #551
2025-11-24 23:04:25 +01:00
Zed 53edbbc4e9 Fix broken tweet pagination ("Load more" button)
Fixes #1277
2025-11-23 20:00:10 +01:00
Zed 5b4a3fe691 Redirect /i/status/id/history to /i/status/id
Fixes #1231
2025-11-23 19:27:13 +01:00
Zed f8a17fdaa5 Remove Nim 1.6.x support
Fixes #1311
2025-11-23 17:28:11 +01:00
Zed b0d9c1d51a Update endpoints, fix parser, remove quotes stat 2025-11-22 21:29:36 +01:00
Zed 78d788b27f Fix verified parsing for oauth endpoints 2025-11-19 07:33:32 +01:00
Zed 824a7e346a Fix rss icon tag 2025-11-17 12:08:44 +01:00
Zed e8de18317e Fix broken pinned tweet parsing 2025-11-17 11:21:21 +01:00
Zed 6b655cddd8 Cleanup 2025-11-17 11:01:20 +01:00
Zed 886f2d2a45 Bump API versions, use more SessionAwareUrls 2025-11-17 11:00:38 +01:00
Zed bb6eb81a20 Add support for tweet views 2025-11-17 10:59:50 +01:00
Zed 0bb0b7e78c Support grok_share card
Fixes #1306
2025-11-17 06:37:24 +01:00
Zed a666c4867c Include username in session logs if available
Fixes #1310
2025-11-17 05:42:44 +01:00
Zed 778eb35ee3 Add curl-based cookie session script 2025-11-17 03:55:23 +01:00
Zed 3f3196d103 Fix broken UserMedia photo rail parsing
Fixes #1307
2025-11-17 00:14:52 +01:00
Zed 68fc7b71c8 Fix media support for cookie sessions
Fixes #1304
2025-11-16 23:22:21 +01:00
0xCathiefish 55d4469401 fix: correct argument parsing for --append flag in get_web_session.py (#1305)
* fix: correct argument parsing for --append flag

* fix: strip quotes from extracted user_id in twid cookie
2025-11-16 21:18:38 +01:00
Zed bf36fc471b Update tests 2025-11-16 06:04:46 +01:00
Zed 5aa0b65fea Add bearer token for cookie-based auth 2025-11-16 05:24:23 +01:00
Zed 4fc7b873c4 Use dynamic rate limits from API responses 2025-11-16 05:22:45 +01:00
Zed 6fe850b2c6 Add optional cookie session fields 2025-11-16 05:03:01 +01:00
Zed 3768762fca Add script for generating cookies 2025-11-16 05:02:57 +01:00
Zed f89d2329d2 Add cookie-based authentication support
Fixes #1303
2025-11-15 22:59:35 +01:00
Zed 9e95615021 Remove unused skipPinned parameter 2025-11-15 07:01:03 +01:00
Zed 32b04a772b Include pinned tweets in RSS
Fixes #1262
2025-11-15 06:57:15 +01:00
0xbarchitect 662ae90e22 Bypass Cloudflare 403 error using cloudscraper (#1291)
* Bypass Cloudflare 403 error using cloudscraper

* add docs
2025-10-12 09:07:37 +02:00
Gabriel Simmer e40c61a6ae Find TimelineAddEntries in tweets response (#1251)
See https://github.com/zedeus/nitter/issues/1250. Sometimes the API gives us more results
and the tweets are no longer at index 0.
2025-05-01 12:39:05 +01:00
Zed 94c83f3811 Hide ads/promoted tweets
Fixes #1234
2025-04-15 02:06:00 +01:00
Zed 83b0f8b55a Retry limited accounts after an hour 2025-04-05 15:57:14 +01:00
Zed 41fa47bfbf Revert "Support both web and Android sessions"
This reverts commit 661be438ec.
2025-02-25 23:36:02 +00:00
Zed 661be438ec Support both web and Android sessions 2025-02-25 05:46:18 +00:00
Zed 4f9ba9c7d6 Always print sessions logs 2025-02-25 04:29:28 +00:00
Émilien (perso) cb334a7d68 chore: Revert back to nim 2.0 for alpine ARM64 (#1222) 2025-02-23 22:07:45 +00:00
SoonKhen OwYong cc28d21a62 Add mounting of sessions.jsonl in docker-compose.yml (#1221) 2025-02-22 16:36:04 +00:00
Zed 92cd6abcf6 Stop logging unimportant errors 2025-02-16 02:06:19 +01:00
Zed 9ccfd8ee99 Reduce integration test concurrency even more 2025-02-12 21:22:09 +01:00
Zed 6da152db07 Reduce integration test concurrency 2025-02-12 20:43:45 +01:00
Zed 5be37737eb Fix rate limit handling 2025-02-12 18:36:29 +01:00
Zed 7702576369 Fix GitHub workflow secrets permissions 2025-02-12 14:42:05 +01:00
Zed fb7c1d8710 Improve test workflow 2025-02-05 22:48:18 +01:00
Zed 54ba1e30b5 Fix empty image URLs in photo rail 2025-02-05 20:27:23 +01:00
Zed bc38315d12 Fix tests 2025-02-05 20:19:18 +01:00
Zed 0664074749 Remove old tokenCount from nitter.example.conf 2025-02-05 19:37:22 +01:00
Zed b9af77a9bd Change main page search to "Tweets" search 2025-02-05 19:28:10 +01:00
Zed 10b1d9c80f Add Python script to create account sessions 2025-02-05 19:10:27 +01:00
Zed a3d341e7a6 Update README, added an important note 2025-02-05 18:10:15 +01:00
Zed 4d5091947c Update Dockerfiles 2025-02-05 18:10:10 +01:00
Zed 6fcd849eff Rename accounts/guest accounts to sessions
The new file loaded by default is now ./sessions.jsonl
JSONL is also required, .json support dropped.
2025-02-05 04:15:53 +01:00
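The rename above also switches the session store to JSONL: one standalone JSON object per line, rather than a single JSON document. A minimal, hypothetical validator for that format — the record contents are whatever the session scripts emit; only the line-by-line parsing is the point:

```python
import json

def count_jsonl_records(path: str) -> int:
    """Count records in a JSONL file; raises json.JSONDecodeError on a malformed line."""
    count = 0
    with open(path) as f:
        for line in f:
            if line.strip():       # ignore blank lines
                json.loads(line)   # each line must parse as standalone JSON
                count += 1
    return count
```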
Zed afad55749b Increase max concurrent reqs per account 2025-02-05 03:49:34 +01:00
Zed 5265de101d Skip null fetch errors 2025-02-05 03:49:17 +01:00
Zed 5edaea2359 Silence 404 proxy errors 2025-02-05 03:20:05 +01:00
Zed c0f2eea276 Fix missing video thumbnail being too small 2025-02-05 03:19:10 +01:00
Zed 7728899948 Add lazy loading for images 2025-02-05 03:18:49 +01:00
Zed b43bfc5d42 Return 403 on hmac error 2025-02-05 01:19:27 +01:00
Zed 81764ea0f8 Update endpoint versions, switch tweet endpoint 2025-02-05 01:19:06 +01:00
Zed 5b6dae5228 Add regex for x.com links 2025-02-05 01:16:52 +01:00
Zed e38276a638 Update authority header 2025-02-05 00:40:19 +01:00
Zed 2e13d7b57c Capture "account locked" API error 2025-02-05 00:35:39 +01:00
Zed 1aa9b0dba6 Move limited flag to be account-level 2025-02-05 00:34:50 +01:00
Zed 28d3ed7d9f Raise NoAccountsError when all accounts limited 2025-02-05 00:32:55 +01:00
Zed 19569bb19f Replace old v1 photo rail API with gql 2025-02-05 00:25:50 +01:00
somini c6edec0490 Update auth.nim (#1164)
Avoid expiring the tokens for now.

See:
- https://github.com/zedeus/nitter/issues/983#issuecomment-1923046398
- https://github.com/zedeus/nitter/issues/1155#issuecomment-1917167072

Thanks @cmj
2024-02-26 03:08:25 +00:00
jackyzy823 cdff5e9b1c Fix for #1147, Proxy for audio URL and upgrade hls.js (#1178)
* Revert "Fix broken video playback by forcing fmp4"

This reverts commit 52db03b73a.

* Fix audio url in video m3u8

* Upgrade hls.js to 1.5.1 and use full version
2024-02-21 23:10:54 +00:00
Zed 52db03b73a Fix broken video playback by forcing fmp4 2024-01-12 03:48:42 +01:00
blankie 583c858cdf Fix search queries in user search RSS feeds (#1126)
Fixes #992
2023-12-03 09:54:24 +01:00
Zed a9740fec8b Fix compilation with old Nim again 2023-11-25 10:11:57 +00:00
Zed f8254c2f0f Add support for business and gov verification
Also improve icon rendering on Firefox
2023-11-25 10:07:28 +00:00
Zed d6be08d093 Fix jobDetails error on old Nim versions 2023-11-25 05:53:13 +00:00
Zed 4dac9f0798 Add simple job_details card support 2023-11-25 05:31:15 +00:00
Zed 06ab1ea2e7 Enable disabled tests 2023-11-15 11:11:56 +00:00
Zed c2819dab44 Fix #1106
Closes #831
2023-11-15 11:11:53 +00:00
Zed eaedd2aee7 Fix ARM64 Dockerfile versions 2023-11-08 16:38:43 +00:00
Zed 5e188647fc Bump Nim in the ARM64 Dockerfile, add nitter user 2023-11-08 14:53:35 +00:00
Zed e0d9dd0f9c Fix #670 2023-11-08 14:27:22 +00:00
Zed d17583286a Don't requests made before reset 2023-11-01 05:44:59 +00:00
Zed 209f453b79 Purge expired accounts after parsing 2023-11-01 05:09:44 +00:00
Zed e1838e0933 Move CI workflow to buildjet 2023-11-01 05:09:35 +00:00
Zed 623424f516 Fix outdated test 2023-11-01 04:52:44 +00:00
Zed 7b3fcdc622 Fix guest accounts CI setup attempt 4 2023-11-01 04:19:10 +00:00
Zed 1d20bd01cb Remove redundant "active" field from /.health 2023-11-01 04:16:33 +00:00
Zed 58e73a14c5 Fix guest accounts CI setup attempt 3 2023-11-01 04:13:22 +00:00
Zed b0b335106d Fix missing CI file argument 2023-11-01 04:06:42 +00:00
Zed 006b91c903 Prevent annoying warnings on devel 2023-11-01 04:04:45 +00:00
Zed 33bad37128 Fix guest accounts CI setup attempt 2 2023-11-01 01:25:00 +00:00
Zed b930a3d5bf Fix guest accounts CI setup 2023-10-31 23:54:11 +00:00
Zed bd0be724f0 Merge branch 'master' into guest_accounts 2023-10-31 23:47:02 +00:00
Zed 60a82563da Run tests on multiple Nim versions 2023-10-31 23:46:24 +00:00
Zed b8103cf501 Fix compilation on Nim 1.6.x 2023-10-31 23:02:45 +00:00
Émilien (perso) b62d73dbd3 nim version min require + update dockerfile arm (#1053) 2023-10-31 22:33:08 +00:00
Zed 4120558649 Replace /.tokens with /.health and /.accounts 2023-10-31 12:04:32 +00:00
Zed 089275826c Bump minimum Nim version 2023-10-31 11:33:24 +00:00
Zed edad09f4c9 Update nimcrypto and jsony 2023-10-31 08:31:51 +00:00
Zed 32e3469e3a Fix multi-user timelines 2023-10-31 05:53:55 +00:00
LS 735b30c2da fix(nitter): add graphql user search (#1047)
* fix(nitter): add graphql user search

* fix(nitter): rm gitignore 2nd guest_accounts

* fix(nitter): keep query from user search in result. remove personal mods

* fix(nitter): removce useless line gitignore
2023-10-30 12:13:06 +00:00
Zed 537af7fd5e Improve Liberapay css for Firefox compatibility 2023-09-19 01:29:41 +00:00
Zed 7d14789910 Improve guest accounts loading, add JSONL support 2023-09-18 18:26:01 +00:00
Zed 7abcb489f4 Increase photo rail cache ttl 2023-09-18 17:15:09 +00:00
Zed 14f9a092d8 Fix crash on missing quote tweet data crash 2023-09-14 23:35:41 +00:00
Zed fcd74e8048 Retry rate limited requests with different account 2023-09-02 08:15:58 +02:00
Zed 4250245263 Shorten media proxy error log 2023-09-02 07:28:56 +02:00
Zed b8fe212e94 Add media proxying error logging 2023-09-01 21:39:02 +02:00
Zed 84dcf49079 Fix negative pending requests bug 2023-08-31 05:07:12 +02:00
Zed 82beb5da8c Add empty oauth token logging 2023-08-31 01:31:27 +02:00
Zed 282ce8b0e9 Add 429 logging 2023-08-31 01:29:54 +02:00
Zed 37b58a5a7e Fix accounts logging 2023-08-30 03:43:49 +02:00
Zed 898b19b92f Improve rate limit handling, minor refactor 2023-08-30 03:10:21 +02:00
Zed 986b91ac73 Handle ProtocolError and BadClientError equally 2023-08-29 23:58:03 +02:00
Zed 4ccf350dc7 Improve .tokens output 2023-08-29 23:45:18 +02:00
Zed 7630f57f17 Fix cards not being displayed 2023-08-26 05:16:38 +02:00
Zed 03794a8d4a Cleanup 2023-08-25 16:32:39 +02:00
Zed ae9fa02bf5 Switch to TweetDetail for tweets 2023-08-25 16:28:30 +02:00
Zed 88b005c9da Revert "Switch to using typeahead for user search"
This reverts commit a3e11e3272.
2023-08-23 19:31:40 +02:00
Zed a3e11e3272 Switch to using typeahead for user search 2023-08-23 10:14:44 +02:00
Zed 45808361af Fix tweetDetail stats 2023-08-22 04:45:49 +02:00
Zed 8df5256c1d Switch back to old user search endpoint 2023-08-22 04:33:14 +02:00
Zed 6e8744943f Tweak /.tokens, add amount of limited accounts 2023-08-22 03:43:18 +02:00
Zed 5c08e6a774 Fix compilation on older versions of Nim 2023-08-22 02:27:44 +02:00
Zed 30bdf3a14e Reduce max concurrent pending requests per account 2023-08-22 01:32:28 +02:00
Zed 12504bcffe Fix compilation error 2023-08-21 18:12:06 +02:00
Zed c3d9441370 Unify some guest account logs 2023-08-21 14:49:50 +02:00
Zed 51714b5ad2 Add guest accounts variable to GitHub action 2023-08-21 11:25:27 +02:00
Zed e8b5cbef7b Add missing limitedAt assignment 2023-08-20 12:31:08 +02:00
Zed 3d8858f0d8 Track rate limits, reset after 24 hours 2023-08-20 11:56:42 +02:00
Zed bbd68e6840 Filter out account limits that already reset 2023-08-19 01:13:36 +02:00
Zed 3572dd7771 Replace tokens with guest accounts, swap endpoints 2023-08-19 00:25:14 +02:00
Zed d7ca353a55 Disable photo rail test 2023-08-08 02:49:58 +02:00
Zed 54e6ce14ac Simplify photo rail test 2023-08-08 02:35:43 +02:00
Zed 967f5e50f9 Update and disable some tests 2023-08-08 02:21:40 +02:00
Zed 624394430c Use legacy timeline/user endpoint for Tweets tab 2023-08-08 02:09:56 +02:00
Zed 5725780c99 Bump Nim version in Docker image 2023-08-06 21:02:22 +02:00
Zed 20b5cce5dc Retry infinite scroll errors 2023-07-24 10:37:25 +02:00
Zed 39192bf191 Fix multi-timeline infinite scroll 2023-07-24 10:18:50 +02:00
Zed 59a72831c7 Apply cached profile verified status to tweets 2023-07-24 04:26:32 +02:00
Zed 72d8f35cd1 Search isn't rate limited 2023-07-22 04:06:04 +02:00
Zed 50f821dbd8 Use search instead of old timeline endpoint 2023-07-22 03:22:13 +02:00
Zed cc5841df30 Use old timeline endpoint 2023-07-21 18:56:39 +02:00
Zed f881226b22 Fix video embed 2023-07-14 21:35:43 +02:00
Jakub Wilk 4c4d5485a0 Fix typo (#943) 2023-07-14 18:11:56 +02:00
Zed afbdbd293e Fix protected user photo rail crash 2023-07-12 03:47:37 +02:00
Zed 67203a431d Add back search 2023-07-12 03:37:44 +02:00
Zed b290f6fd29 Optimize timeline data structure 2023-07-12 01:34:39 +02:00
Zed 0bc3c153d9 Fix everything (#927)
* Switch bearer token and endpoints, update parser

* Enable user search, disable tweet search

* Disable multi-user timelines for now

* Fix parsing of pinned tombstone
2023-07-10 11:25:34 +02:00
Zed dcf73354ff Fix GraphQL user crash with invalid JSON 2023-07-01 22:07:37 +02:00
PrivacyDevel 38985af6ed fixed bug that caused everybody to be displayed as verified (#890) 2023-05-30 23:42:14 +02:00
PrivacyDevel f7e878c126 fixed bug that caused threads on user profiles to be hidden (#885) 2023-05-30 13:37:35 +02:00
107 changed files with 6596 additions and 2610 deletions
+11 -12
@@ -10,20 +10,20 @@ on:
 jobs:
   tests:
     uses: ./.github/workflows/run-tests.yml
+    secrets: inherit

   build-docker-amd64:
     needs: [tests]
-    runs-on: buildjet-2vcpu-ubuntu-2204
+    runs-on: ubuntu-24.04
     steps:
-      - uses: actions/checkout@v3
-        with:
-          fetch-depth: 0
+      - uses: actions/checkout@v6
       - name: Set up Docker Buildx
         id: buildx
-        uses: docker/setup-buildx-action@v2
+        uses: docker/setup-buildx-action@v3
         with:
           version: latest
       - name: Login to DockerHub
-        uses: docker/login-action@v2
+        uses: docker/login-action@v3
         with:
           username: ${{ secrets.DOCKER_USERNAME }}
           password: ${{ secrets.DOCKER_PASSWORD }}
@@ -35,20 +35,19 @@ jobs:
           platforms: linux/amd64
           push: true
           tags: zedeus/nitter:latest,zedeus/nitter:${{ github.sha }}

   build-docker-arm64:
     needs: [tests]
-    runs-on: buildjet-2vcpu-ubuntu-2204-arm
+    runs-on: ubuntu-24.04-arm
     steps:
-      - uses: actions/checkout@v3
-        with:
-          fetch-depth: 0
+      - uses: actions/checkout@v6
       - name: Set up Docker Buildx
         id: buildx
-        uses: docker/setup-buildx-action@v2
+        uses: docker/setup-buildx-action@v3
         with:
           version: latest
       - name: Login to DockerHub
-        uses: docker/login-action@v2
+        uses: docker/login-action@v3
         with:
           username: ${{ secrets.DOCKER_USERNAME }}
           password: ${{ secrets.DOCKER_PASSWORD }}
+123 -24
@@ -8,38 +8,137 @@ on:
       - master
   workflow_call:

+# Ensure that multiple runs on the same branch do not overlap.
+concurrency:
+  group: ${{ github.workflow }}-${{ github.ref }}
+  cancel-in-progress: true
+
+defaults:
+  run:
+    shell: bash
+
 jobs:
-  test:
-    runs-on: ubuntu-latest
+  build-test:
+    name: Build and test
+    runs-on: ubuntu-24.04
+    strategy:
+      matrix:
+        nim: ["2.0.x", "2.2.x", "devel"]
     steps:
-      - uses: actions/checkout@v3
-        with:
-          fetch-depth: 0
-      - name: Cache nimble
+      - name: Checkout Code
+        uses: actions/checkout@v6
+      - name: Cache Nimble Dependencies
         id: cache-nimble
-        uses: actions/cache@v3
+        uses: actions/cache@v5
         with:
           path: ~/.nimble
-          key: nimble-${{ hashFiles('*.nimble') }}
-          restore-keys: "nimble-"
-      - uses: actions/setup-python@v4
-        with:
-          python-version: "3.10"
-          cache: "pip"
-      - uses: jiro4989/setup-nim-action@v1
-        with:
-          nim-version: "1.x"
-      - run: nimble build -d:release -Y
-      - run: pip install seleniumbase
-      - run: seleniumbase install chromedriver
-      - uses: supercharge/redis-github-action@1.5.0
+          key: ${{ matrix.nim }}-nimble-v2-${{ hashFiles('*.nimble') }}
+          restore-keys: |
+            ${{ matrix.nim }}-nimble-v2-
+      - name: Setup Nim
+        uses: jiro4989/setup-nim-action@v2
+        with:
+          nim-version: ${{ matrix.nim }}
+          use-nightlies: true
+          repo-token: ${{ secrets.GITHUB_TOKEN }}
+      - name: Build Project
+        run: nimble build -Y
+      - name: Upload 2.2.x build artifact
+        if: matrix.nim == '2.2.x'
+        uses: actions/upload-artifact@v6
+        with:
+          name: nitter-linux-nim-2.2.x-${{ github.sha }}
+          path: |
+            ./nitter
+          if-no-files-found: error
+
+  integration-test:
+    needs: [build-test]
+    name: Integration test
+    runs-on: ubuntu-24.04
+    services:
+      redis:
+        image: redis:7
+        ports:
+          - 6379:6379
+    steps:
+      - name: Install runtime deps
+        run: |
+          sudo apt-get install -y --no-install-recommends libsass-dev libpcre3
+      - name: Checkout code
+        uses: actions/checkout@v6
+      - name: Cache pipx (poetry)
+        uses: actions/cache@v5
+        with:
+          path: |
+            ~/.local/pipx
+            ~/.local/bin
+          key: pipx-poetry-${{ runner.os }}
+      - name: Install poetry
+        env:
+          PIPX_HOME: ~/.local/pipx
+          PIPX_BIN_DIR: ~/.local/bin
+        run: command -v poetry >/dev/null 2>&1 || pipx install poetry
+      - name: Setup Python (3.14) with Poetry cache
+        uses: actions/setup-python@v6
+        with:
+          python-version: "3.14"
+          cache: poetry
+          cache-dependency-path: tests/poetry.lock
+      - name: Install Python deps
+        working-directory: tests
+        run: poetry sync
+      - name: Cache Nimble Dependencies
+        uses: actions/cache@v5
+        with:
+          path: ~/.nimble
+          key: 2.2.x-nimble-v2-${{ hashFiles('*.nimble') }}
+          restore-keys: |
+            2.2.x-nimble-v2-
+      - name: Setup Nim
+        uses: jiro4989/setup-nim-action@v2
+        with:
+          nim-version: 2.2.x
+          use-nightlies: true
+          repo-token: ${{ secrets.GITHUB_TOKEN }}
+      - name: Download 2.2.x build artifact
+        uses: actions/download-artifact@v4
+        with:
+          name: nitter-linux-nim-2.2.x-${{ github.sha }}
+          path: .
+      - name: Make nitter binary executable
+        run: chmod +x ./nitter
-      - name: Prepare Nitter
+      - name: Prepare Nitter Environment
         run: |
-          sudo apt install libsass-dev -y
           cp nitter.example.conf nitter.conf
-          nimble md
-          nimble scss
+          sed -i 's/enableDebug = false/enableDebug = true/g' nitter.conf
+          # Run both Nimble tasks concurrently
+          nim r tools/rendermd.nim &
+          nim r tools/gencss.nim &
+          wait
+          echo '${{ secrets.SESSIONS }}' | head -n1
+          echo '${{ secrets.SESSIONS }}' > ./sessions.jsonl
-      - name: Run tests
+      - name: Run Tests
         run: |
           ./nitter &
-          pytest -n4 tests
+          cd tests
+          poetry run pytest -n3 --reruns=3 --rs .
+2
@@ -10,4 +10,6 @@ nitter
 /public/css/style.css
 /public/md/*.html
 nitter.conf
+guest_accounts.json*
+sessions.json*
 dump.rdb
+2 -2
@@ -1,4 +1,4 @@
-FROM nimlang/nim:1.6.10-alpine-regular as nim
+FROM nimlang/nim:2.2.0-alpine-regular as nim
 LABEL maintainer="setenforce@protonmail.com"

 RUN apk --no-cache add libsass-dev pcre
@@ -9,7 +9,7 @@ COPY nitter.nimble .
 RUN nimble install -y --depsOnly

 COPY . .
-RUN nimble build -d:danger -d:lto -d:strip \
+RUN nimble build -d:danger -d:lto -d:strip --mm:refc \
     && nimble scss \
     && nimble md
+7 -5
@@ -1,7 +1,7 @@
-FROM alpine:3.17 as nim
+FROM alpine:3.20.6 as nim
 LABEL maintainer="setenforce@protonmail.com"

-RUN apk --no-cache add gcc git libc-dev libsass-dev "nim=1.6.8-r0" nimble pcre
+RUN apk --no-cache add libsass-dev pcre gcc git libc-dev nim nimble

 WORKDIR /src/nitter
@@ -9,15 +9,17 @@ COPY nitter.nimble .
 RUN nimble install -y --depsOnly

 COPY . .
-RUN nimble build -d:danger -d:lto -d:strip \
+RUN nimble build -d:danger -d:lto -d:strip --mm:refc \
     && nimble scss \
     && nimble md

-FROM alpine:3.17
+FROM alpine:3.20.6
 WORKDIR /src/
-RUN apk --no-cache add ca-certificates pcre openssl1.1-compat
+RUN apk --no-cache add pcre ca-certificates openssl
 COPY --from=nim /src/nitter/nitter ./
 COPY --from=nim /src/nitter/nitter.example.conf ./nitter.conf
 COPY --from=nim /src/nitter/public ./public

 EXPOSE 8080

+RUN adduser -h /src/ -D -s /bin/sh nitter
+USER nitter

 CMD ./nitter
+34 -23
@@ -4,27 +4,35 @@
 [![Test Matrix](https://github.com/zedeus/nitter/workflows/Docker/badge.svg)](https://github.com/zedeus/nitter/actions/workflows/build-docker.yml)
 [![License](https://img.shields.io/github/license/zedeus/nitter?style=flat)](#license)

+> [!NOTE]
+> Running a Nitter instance now requires real accounts, since Twitter removed the previous methods. \
+> This does not affect users. \
+> For instructions on how to obtain session tokens, see [Creating session tokens](https://github.com/zedeus/nitter/wiki/Creating-session-tokens).
+
 A free and open source alternative Twitter front-end focused on privacy and
 performance. \
-Inspired by the [Invidious](https://github.com/iv-org/invidious)
-project.
+Inspired by the [Invidious](https://github.com/iv-org/invidious) project.

 - No JavaScript or ads
 - All requests go through the backend, client never talks to Twitter
 - Prevents Twitter from tracking your IP or JavaScript fingerprint
-- Uses Twitter's unofficial API (no rate limits or developer account required)
+- Uses Twitter's unofficial API (no developer account required)
 - Lightweight (for [@nim_lang](https://nitter.net/nim_lang), 60KB vs 784KB from twitter.com)
 - RSS feeds
 - Themes
 - Mobile support (responsive design)
 - AGPLv3 licensed, no proprietary instances permitted

-Liberapay: https://liberapay.com/zedeus \
-Patreon: https://patreon.com/nitter \
-BTC: bc1qp7q4qz0fgfvftm5hwz3vy284nue6jedt44kxya \
-ETH: 0x66d84bc3fd031b62857ad18c62f1ba072b011925 \
-LTC: ltc1qhsz5nxw6jw9rdtw9qssjeq2h8hqk2f85rdgpkr \
-XMR: 42hKayRoEAw4D6G6t8mQHPJHQcXqofjFuVfavqKeNMNUZfeJLJAcNU19i1bGdDvcdN6romiSscWGWJCczFLe9RFhM3d1zpL
+<details>
+<summary>Donations</summary>
+Liberapay: https://liberapay.com/zedeus<br>
+Patreon: https://patreon.com/nitter<br>
+BTC: bc1qpqpzjkcpgluhzf7x9yqe7jfe8gpfm5v08mdr55<br>
+ETH: 0x24a0DB59A923B588c7A5EBd0dBDFDD1bCe9c4460<br>
+XMR: 42hKayRoEAw4D6G6t8mQHPJHQcXqofjFuVfavqKeNMNUZfeJLJAcNU19i1bGdDvcdN6romiSscWGWJCczFLe9RFhM3d1zpL<br>
+SOL: ANsyGNXFo6osuFwr1YnUqif2RdoYRhc27WdyQNmmETSW<br>
+ZEC: u1vndfqtzyy6qkzhkapxelel7ams38wmfeccu3fdpy2wkuc4erxyjm8ncjhnyg747x6t0kf0faqhh2hxyplgaum08d2wnj4n7cyu9s6zhxkqw2aef4hgd4s6vh5hpqvfken98rg80kgtgn64ff70djy7s8f839z00hwhuzlcggvefhdlyszkvwy3c7yw623vw3rvar6q6evd3xcvveypt
+</details>

 ## Roadmap
@@ -42,12 +50,13 @@ maintained by the community.
 ## Why?

-It's impossible to use Twitter without JavaScript enabled. For privacy-minded
-folks, preventing JavaScript analytics and IP-based tracking is important, but
-apart from using a VPN and uBlock/uMatrix, it's impossible. Despite being behind
-a VPN and using heavy-duty adblockers, you can get accurately tracked with your
-[browser's fingerprint](https://restoreprivacy.com/browser-fingerprinting/),
-[no JavaScript required](https://noscriptfingerprint.com/). This all became
+It's impossible to use Twitter without JavaScript enabled, and as of 2024 you
+need to sign up. For privacy-minded folks, preventing JavaScript analytics and
+IP-based tracking is important, but apart from using a VPN and uBlock/uMatrix,
+it's impossible. Despite being behind a VPN and using heavy-duty adblockers,
+you can get accurately tracked with your [browser's
+fingerprint](https://restoreprivacy.com/browser-fingerprinting/), [no
+JavaScript required](https://noscriptfingerprint.com/). This all became
 particularly important after Twitter [removed the
 ability](https://www.eff.org/deeplinks/2020/04/twitter-removes-privacy-option-and-shows-why-we-need-strong-privacy-laws)
 for users to control whether their data gets sent to advertisers.
@@ -71,19 +80,21 @@ Twitter account.
 - libpcre
 - libsass
-- redis
+- redis/valkey

 To compile Nitter you need a Nim installation, see
-[nim-lang.org](https://nim-lang.org/install.html) for details. It is possible to
-install it system-wide or in the user directory you create below.
+[nim-lang.org](https://nim-lang.org/install.html) for details. It is possible
+to install it system-wide or in the user directory you create below.

 To compile the scss files, you need to install `libsass`. On Ubuntu and Debian,
 you can use `libsass-dev`.

-Redis is required for caching and in the future for account info. It should be
-available on most distros as `redis` or `redis-server` (Ubuntu/Debian).
-Running it with the default config is fine, Nitter's default config is set to
-use the default Redis port and localhost.
+Redis is required for caching and in the future for account info. As of 2024
+Redis is no longer open source, so using the fork Valkey is recommended. It
+should be available on most distros as `redis` or `redis-server`
+(Ubuntu/Debian), or `valkey`/`valkey-server`. Running it with the default
+config is fine, Nitter's default config is set to use the default port and
+localhost.

 Here's how to create a `nitter` user, clone the repo, and build the project
 along with the scss and md files.
@@ -93,7 +104,7 @@ along with the scss and md files.
 # su nitter
 $ git clone https://github.com/zedeus/nitter
 $ cd nitter
-$ nimble build -d:release
+$ nimble build -d:danger --mm:refc
 $ nimble scss
 $ nimble md
 $ cp nitter.example.conf nitter.conf
+1 -6
@@ -7,12 +7,7 @@

 # disable annoying warnings
 warning("GcUnsafe2", off)
+warning("HoleEnumConv", off)
 hint("XDeclaredButNotUsed", off)
 hint("XCannotRaiseY", off)
 hint("User", off)
-
-const
-  nimVersion = (major: NimMajor, minor: NimMinor, patch: NimPatch)
-
-when nimVersion >= (1, 6, 0):
-  warning("HoleEnumConv", off)
@@ -9,6 +9,7 @@ services:
       - "127.0.0.1:8080:8080" # Replace with "8080:8080" if you don't use a reverse proxy
     volumes:
       - ./nitter.conf:/src/nitter.conf:Z,ro
+      - ./sessions.jsonl:/src/sessions.jsonl:Z,ro # Run get_sessions.py to get the credentials
     depends_on:
       - nitter-redis
     restart: unless-stopped
@@ -1,37 +1,39 @@
 [Server]
 hostname = "nitter.net" # for generating links, change this to your own domain/ip
 title = "nitter"
 address = "0.0.0.0"
 port = 8080
 https = false # disable to enable cookies when not using https
 httpMaxConnections = 100
 staticDir = "./public"

 [Cache]
 listMinutes = 240 # how long to cache list info (not the tweets, so keep it high)
 rssMinutes = 10 # how long to cache rss queries
 redisHost = "localhost" # Change to "nitter-redis" if using docker-compose
 redisPort = 6379
 redisPassword = ""
 redisConnections = 20 # minimum open connections in pool
 redisMaxConnections = 30
 # new connections are opened when none are available, but if the pool size
 # goes above this, they're closed when released. don't worry about this unless
 # you receive tons of requests per second

 [Config]
 hmacKey = "secretkey" # random key for cryptographic signing of video urls
 base64Media = false # use base64 encoding for proxied media urls
-enableRSS = true # set this to false to disable RSS feeds
-enableDebug = false # enable request logs and debug endpoints (/.tokens)
-proxy = "" # http/https url, SOCKS proxies are not supported
+enableRSS = true # master switch, set to false to disable all RSS feeds
+enableRSSUserTweets = true # /@user/rss
+enableRSSUserReplies = true # /@user/with_replies/rss
+enableRSSUserMedia = true # /@user/media/rss
+enableRSSSearch = true # /search/rss and /@user/search/rss
+enableRSSList = true # list RSS feeds
+enableDebug = false # enable request logs and debug endpoints (/.sessions)
+proxy = "" # http/https url, SOCKS proxies are not supported
 proxyAuth = ""
-tokenCount = 10
-# minimum amount of usable tokens. tokens are used to authorize API requests,
-# but they expire after ~1 hour, and have a limit of 500 requests per endpoint.
-# the limits reset every 15 minutes, and the pool is filled up so there's
-# always at least `tokenCount` usable tokens. only increase this if you receive
-# major bursts all the time and don't have a rate limiting setup via e.g. nginx
+apiProxy = "" # nitter-proxy host, e.g. localhost:7000
+disableTid = false # enable this if cookie-based auth is failing
+maxConcurrentReqs = 2 # max requests at a time per session to avoid race conditions

 # Change default preferences here, see src/prefs_impl.nim for a complete list
 [Preferences]
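The `hmacKey` above is used to sign proxied media URLs so that the proxy endpoint only serves links Nitter itself generated. As an illustration of the general scheme (this is a hedged sketch, not Nitter's exact hash or encoding, and the helper names are hypothetical):

```python
import base64
import hashlib
import hmac

HMAC_KEY = b"secretkey"  # must match hmacKey in nitter.conf


def sign(path: str) -> str:
    # MAC the URL path with the shared server-side key;
    # URL-safe base64 without padding keeps the signature link-friendly
    digest = hmac.new(HMAC_KEY, path.encode(), hashlib.sha256).digest()
    return base64.urlsafe_b64encode(digest).decode().rstrip("=")


def verify(path: str, sig: str) -> bool:
    # recompute and compare in constant time to avoid timing side channels
    return hmac.compare_digest(sign(path), sig)
```

Because only the server knows the key, a client cannot forge a valid signature for an arbitrary URL, which is why the example config warns you to change `"secretkey"`.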
@@ -10,11 +10,11 @@ bin = @["nitter"]
 # Dependencies

-requires "nim >= 1.4.8"
+requires "nim >= 2.0.0"
 requires "jester#baca3f"
 requires "karax#5cf360c"
 requires "sass#7dfdd03"
-requires "nimcrypto#4014ef9"
+requires "nimcrypto#a079df9"
 requires "markdown#158efe3"
 requires "packedjson#9e6fbb6"
 requires "supersnappy#6c94198"

@@ -22,13 +22,13 @@ requires "redpool#8b7c1db"
 requires "https://github.com/zedeus/redis#d0a0e6f"
 requires "zippy#ca5989a"
 requires "flatty#e668085"
-requires "jsony#ea811be"
-requires "oauth#b8c163b"
+requires "jsony#1de1f08"

 # Tasks

 task scss, "Generate css":
-  exec "nimble c --hint[Processing]:off -d:danger -r tools/gencss"
+  exec "nim r --hint[Processing]:off tools/gencss"

 task md, "Render md":
-  exec "nimble c --hint[Processing]:off -d:danger -r tools/rendermd"
+  exec "nim r --hint[Processing]:off tools/rendermd"
@@ -1,53 +1,143 @@
 @font-face {
-  font-family: 'fontello';
-  src: url('/fonts/fontello.eot?21002321');
-  src: url('/fonts/fontello.eot?21002321#iefix') format('embedded-opentype'),
-       url('/fonts/fontello.woff2?21002321') format('woff2'),
-       url('/fonts/fontello.woff?21002321') format('woff'),
-       url('/fonts/fontello.ttf?21002321') format('truetype'),
-       url('/fonts/fontello.svg?21002321#fontello') format('svg');
+  font-family: "fontello";
+  src: url("/fonts/fontello.eot?42791196");
+  src:
+    url("/fonts/fontello.eot?42791196#iefix") format("embedded-opentype"),
+    url("/fonts/fontello.woff2?42791196") format("woff2"),
+    url("/fonts/fontello.woff?42791196") format("woff"),
+    url("/fonts/fontello.ttf?42791196") format("truetype"),
+    url("/fonts/fontello.svg?42791196#fontello") format("svg");
   font-weight: normal;
   font-style: normal;
 }

-[class^="icon-"]:before, [class*=" icon-"]:before {
+[class^="icon-"]:before,
+[class*=" icon-"]:before {
   font-family: "fontello";
   font-style: normal;
   font-weight: normal;
   speak: never;
   display: inline-block;
   text-decoration: inherit;
   width: 1em;
+  margin-right: 0.2em;
   text-align: center;
   /* For safety - reset parent styles, that can break glyph codes*/
   font-variant: normal;
   text-transform: none;
   /* fix buttons height, for twitter bootstrap */
   line-height: 1em;
   /* Font smoothing. That was taken from TWBS */
   -webkit-font-smoothing: antialiased;
   -moz-osx-font-smoothing: grayscale;
 }

-.icon-heart:before { content: '\2665'; } /* '♥' */
-.icon-quote:before { content: '\275e'; } /* '❞' */
-.icon-comment:before { content: '\e802'; }
-.icon-ok:before { content: '\e803'; }
-.icon-play:before { content: '\e804'; }
-.icon-link:before { content: '\e805'; }
-.icon-calendar:before { content: '\e806'; }
-.icon-location:before { content: '\e807'; }
-.icon-picture:before { content: '\e809'; }
-.icon-lock:before { content: '\e80a'; }
-.icon-down:before { content: '\e80b'; }
-.icon-retweet:before { content: '\e80d'; }
-.icon-search:before { content: '\e80e'; }
-.icon-pin:before { content: '\e80f'; }
-.icon-cog:before { content: '\e812'; }
-.icon-rss-feed:before { content: '\e813'; }
-.icon-info:before { content: '\f128'; }
-.icon-bird:before { content: '\f309'; }
+.icon-views:before {
+  content: "\e800";
+}
+.icon-heart:before {
+  content: "\e801";
+}
+.icon-quote:before {
+  content: "\e802";
+}
+.icon-comment:before {
+  content: "\e803";
+}
+.icon-group:before {
+  content: "\e804";
+}
+.icon-play:before {
+  content: "\e805";
+}
+.icon-link:before {
+  content: "\e806";
+}
+.icon-calendar:before {
+  content: "\e807";
+}
+.icon-location:before {
+  content: "\e808";
+}
+.icon-picture:before {
+  content: "\e809";
+}
+.icon-lock:before {
+  content: "\e80a";
+}
+.icon-down:before {
+  content: "\e80b";
+}
+.icon-retweet:before {
+  content: "\e80c";
+}
+.icon-search:before {
+  content: "\e80d";
+}
+.icon-pin:before {
+  content: "\e80e";
+}
+.icon-cog:before {
+  content: "\e80f";
+}
+.icon-rss:before {
+  content: "\e810";
+}
+.icon-ok:before {
+  content: "\e811";
+}
+.icon-circle:before {
+  content: "\f111";
+}
+.icon-info:before {
+  content: "\f128";
+}
+.icon-bird:before {
+  content: "\f309";
+}
@@ -1,6 +1,15 @@
 Font license info

+## Modern Pictograms
+
+   Copyright (c) 2012 by John Caserta. All rights reserved.
+
+   Author:    John Caserta
+   License:   SIL (http://scripts.sil.org/OFL)
+   Homepage:  http://thedesignoffice.org/project/modern-pictograms/
+
 ## Entypo

    Copyright (C) 2012 by Daniel Bruce

@@ -37,12 +46,3 @@ Font license info
    Homepage: http://aristeides.com/

-## Modern Pictograms
-
-   Copyright (c) 2012 by John Caserta. All rights reserved.
-
-   Author:    John Caserta
-   License:   SIL (http://scripts.sil.org/OFL)
-   Homepage:  http://thedesignoffice.org/project/modern-pictograms/
@@ -1,26 +1,28 @@
<?xml version="1.0" standalone="no"?> <?xml version="1.0" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"> <!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<svg xmlns="http://www.w3.org/2000/svg"> <svg xmlns="http://www.w3.org/2000/svg">
<metadata>Copyright (C) 2020 by original authors @ fontello.com</metadata> <metadata>Copyright (C) 2026 by original authors @ fontello.com</metadata>
<defs> <defs>
<font id="fontello" horiz-adv-x="1000" > <font id="fontello" horiz-adv-x="1000" >
<font-face font-family="fontello" font-weight="400" font-stretch="normal" units-per-em="1000" ascent="850" descent="-150" /> <font-face font-family="fontello" font-weight="400" font-stretch="normal" units-per-em="1000" ascent="850" descent="-150" />
<missing-glyph horiz-adv-x="1000" /> <missing-glyph horiz-adv-x="1000" />
<glyph glyph-name="heart" unicode="&#x2665;" d="M790 644q70-64 70-156t-70-158l-360-330-360 330q-70 66-70 158t70 156q62 58 151 58t153-58l56-52 58 52q62 58 150 58t152-58z" horiz-adv-x="860" /> <glyph glyph-name="views" unicode="&#xe800;" d="M180 516l0-538-180 0 0 538 180 0z m250-138l0-400-180 0 0 400 180 0z m250 344l0-744-180 0 0 744 180 0z" horiz-adv-x="680" />
<glyph glyph-name="quote" unicode="&#x275e;" d="M18 685l335 0 0-334q0-140-98-238t-237-97l0 111q92 0 158 65t65 159l-223 0 0 334z m558 0l335 0 0-334q0-140-98-238t-237-97l0 111q92 0 158 65t65 159l-223 0 0 334z" horiz-adv-x="928" /> <glyph glyph-name="heart" unicode="&#xe801;" d="M790 644q70-64 70-156t-70-158l-360-330-360 330q-70 66-70 158t70 156q62 58 151 58t153-58l56-52 58 52q62 58 150 58t152-58z" horiz-adv-x="860" />
<glyph glyph-name="comment" unicode="&#xe802;" d="M1000 350q0-97-67-179t-182-130-251-48q-39 0-81 4-110-97-257-135-27-8-63-12-10-1-17 5t-10 16v1q-2 2 0 6t1 6 2 5l4 5t4 5 4 5q4 5 17 19t20 22 17 22 18 28 15 33 15 42q-88 50-138 123t-51 157q0 73 40 139t106 114 160 76 194 28q136 0 251-48t182-130 67-179z" horiz-adv-x="1000" /> <glyph glyph-name="quote" unicode="&#xe802;" d="M18 685l335 0 0-334q0-140-98-238t-237-97l0 111q92 0 158 65t65 159l-223 0 0 334z m558 0l335 0 0-334q0-140-98-238t-237-97l0 111q92 0 158 65t65 159l-223 0 0 334z" horiz-adv-x="928" />
<glyph glyph-name="ok" unicode="&#xe803;" d="M0 260l162 162 166-164 508 510 164-164-510-510-162-162-162 164z" horiz-adv-x="1000" /> <glyph glyph-name="comment" unicode="&#xe803;" d="M1000 350q0-97-67-179t-182-130-251-48q-39 0-81 4-110-97-257-135-27-8-63-12-10-1-17 5t-10 16v1q-2 2 0 6t1 6 2 5l4 5t4 5 4 5q4 5 17 19t20 22 17 22 18 28 15 33 15 42q-88 50-138 123t-51 157q0 73 40 139t106 114 160 76 194 28q136 0 251-48t182-130 67-179z" horiz-adv-x="1000" />
<glyph glyph-name="play" unicode="&#xe804;" d="M772 333l-741-412q-13-7-22-2t-9 20v822q0 14 9 20t22-2l741-412q13-7 13-17t-13-17z" horiz-adv-x="785.7" /> <glyph glyph-name="group" unicode="&#xe804;" d="M0 106l0 134q0 26 18 32l171 80q-66 39-68 131 0 56 35 103 37 41 90 43 31 0 63-19-49-125 23-237-12-11-25-19l-114-55q-48-23-52-84l0-143-114 0q-25 0-27 34z m193-59l0 168q0 27 22 37l152 70 57 28q-37 23-60 66t-22 94q0 76 46 130t110 54 109-54 45-130q0-105-78-158l61-30 146-70q24-10 24-37l0-168q-2-37-37-41l-541 0q-14 2-24 14t-10 27z m473 330q68 106 22 231 31 19 66 21 49 0 90-43 35-41 35-103 0-82-65-131l168-80q18-10 18-32l0-134q0-32-27-34l-118 0 0 143q0 57-50 84l-110 53q-15 8-29 25z" horiz-adv-x="1000" />
<glyph glyph-name="link" unicode="&#xe805;" d="M294 116q14 14 34 14t36-14q32-34 0-70l-42-40q-56-56-132-56-78 0-134 56t-56 132q0 78 56 134l148 148q70 68 144 77t128-43q16-16 16-36t-16-36q-36-32-70 0-50 48-132-34l-148-146q-26-26-26-64t26-62q26-26 63-26t63 26z m450 574q56-56 56-132 0-78-56-134l-158-158q-74-72-150-72-62 0-112 50-14 14-14 34t14 36q14 14 35 14t35-14q50-48 122 24l158 156q28 28 28 64 0 38-28 62-24 26-56 31t-60-21l-50-50q-16-14-36-14t-34 14q-34 34 0 70l50 50q54 54 127 51t129-61z" horiz-adv-x="800" /> <glyph glyph-name="play" unicode="&#xe805;" d="M772 333l-741-412q-13-7-22-2t-9 20v822q0 14 9 20t22-2l741-412q13-7 13-17t-13-17z" horiz-adv-x="785.7" />
<glyph glyph-name="calendar" unicode="&#xe806;" d="M800 700q42 0 71-29t29-71l0-600q0-40-29-70t-71-30l-700 0q-40 0-70 30t-30 70l0 600q0 42 30 71t70 29l46 0 0-100 160 0 0 100 290 0 0-100 160 0 0 100 44 0z m0-700l0 400-700 0 0-400 700 0z m-540 800l0-170-70 0 0 170 70 0z m450 0l0-170-70 0 0 170 70 0z" horiz-adv-x="900" /> <glyph glyph-name="link" unicode="&#xe806;" d="M294 116q14 14 34 14t36-14q32-34 0-70l-42-40q-56-56-132-56-78 0-134 56t-56 132q0 78 56 134l148 148q70 68 144 77t128-43q16-16 16-36t-16-36q-36-32-70 0-50 48-132-34l-148-146q-26-26-26-64t26-62q26-26 63-26t63 26z m450 574q56-56 56-132 0-78-56-134l-158-158q-74-72-150-72-62 0-112 50-14 14-14 34t14 36q14 14 35 14t35-14q50-48 122 24l158 156q28 28 28 64 0 38-28 62-24 26-56 31t-60-21l-50-50q-16-14-36-14t-34 14q-34 34 0 70l50 50q54 54 127 51t129-61z" horiz-adv-x="800" />
<glyph glyph-name="location" unicode="&#xe807;" d="M250 750q104 0 177-73t73-177q0-106-62-243t-126-223l-62-84q-10 12-27 35t-60 89-76 130-60 147-27 149q0 104 73 177t177 73z m0-388q56 0 96 40t40 96-40 95-96 39-95-39-39-95 39-96 95-40z" horiz-adv-x="500" /> <glyph glyph-name="calendar" unicode="&#xe807;" d="M800 700q42 0 71-29t29-71l0-600q0-40-29-70t-71-30l-700 0q-40 0-70 30t-30 70l0 600q0 42 30 71t70 29l46 0 0-100 160 0 0 100 290 0 0-100 160 0 0 100 44 0z m0-700l0 400-700 0 0-400 700 0z m-540 800l0-170-70 0 0 170 70 0z m450 0l0-170-70 0 0 170 70 0z" horiz-adv-x="900" />
<glyph glyph-name="location" unicode="&#xe808;" d="M250 750q104 0 177-73t73-177q0-106-62-243t-126-223l-62-84q-10 12-27 35t-60 89-76 130-60 147-27 149q0 104 73 177t177 73z m0-388q56 0 96 40t40 96-40 95-96 39-95-39-39-95 39-96 95-40z" horiz-adv-x="500" />
<glyph glyph-name="picture" unicode="&#xe809;" d="M357 529q0-45-31-76t-76-32-76 32-31 76 31 76 76 31 76-31 31-76z m572-215v-250h-786v107l178 179 90-89 285 285z m53 393h-893q-7 0-12-5t-6-13v-678q0-7 6-13t12-5h893q7 0 13 5t5 13v678q0 8-5 13t-13 5z m89-18v-678q0-37-26-63t-63-27h-893q-36 0-63 27t-26 63v678q0 37 26 63t63 27h893q37 0 63-27t26-63z" horiz-adv-x="1071.4" /> <glyph glyph-name="picture" unicode="&#xe809;" d="M357 529q0-45-31-76t-76-32-76 32-31 76 31 76 76 31 76-31 31-76z m572-215v-250h-786v107l178 179 90-89 285 285z m53 393h-893q-7 0-12-5t-6-13v-678q0-7 6-13t12-5h893q7 0 13 5t5 13v678q0 8-5 13t-13 5z m89-18v-678q0-37-26-63t-63-27h-893q-36 0-63 27t-26 63v678q0 37 26 63t63 27h893q37 0 63-27t26-63z" horiz-adv-x="1071.4" />
@@ -28,19 +30,23 @@
<glyph glyph-name="down" unicode="&#xe80b;" d="M939 399l-414-413q-10-11-25-11t-25 11l-414 413q-11 11-11 26t11 25l93 92q10 11 25 11t25-11l296-296 296 296q11 11 25 11t26-11l92-92q11-11 11-25t-11-26z" horiz-adv-x="1000" /> <glyph glyph-name="down" unicode="&#xe80b;" d="M939 399l-414-413q-10-11-25-11t-25 11l-414 413q-11 11-11 26t11 25l93 92q10 11 25 11t25-11l296-296 296 296q11 11 25 11t26-11l92-92q11-11 11-25t-11-26z" horiz-adv-x="1000" />
<glyph glyph-name="retweet" unicode="&#xe80d;" d="M714 11q0-7-5-13t-13-5h-535q-5 0-8 1t-5 4-3 4-2 7 0 6v335h-107q-15 0-25 11t-11 25q0 13 8 23l179 214q11 12 27 12t28-12l178-214q9-10 9-23 0-15-11-25t-25-11h-107v-214h321q9 0 14-6l89-108q4-5 4-11z m357 232q0-13-8-23l-178-214q-12-13-28-13t-27 13l-179 214q-8 10-8 23 0 14 11 25t25 11h107v214h-322q-9 0-14 7l-89 107q-4 5-4 11 0 7 5 12t13 6h536q4 0 7-1t5-4 3-5 2-6 1-7v-334h107q14 0 25-11t10-25z" horiz-adv-x="1071.4" /> <glyph glyph-name="retweet" unicode="&#xe80c;" d="M714 11q0-7-5-13t-13-5h-535q-5 0-8 1t-5 4-3 4-2 7 0 6v335h-107q-15 0-25 11t-11 25q0 13 8 23l179 214q11 12 27 12t28-12l178-214q9-10 9-23 0-15-11-25t-25-11h-107v-214h321q9 0 14-6l89-108q4-5 4-11z m357 232q0-13-8-23l-178-214q-12-13-28-13t-27 13l-179 214q-8 10-8 23 0 14 11 25t25 11h107v214h-322q-9 0-14 7l-89 107q-4 5-4 11 0 7 5 12t13 6h536q4 0 7-1t5-4 3-5 2-6 1-7v-334h107q14 0 25-11t10-25z" horiz-adv-x="1071.4" />
<glyph glyph-name="search" unicode="&#xe80e;" d="M772 78q30-34 6-62l-46-46q-36-32-68 0l-190 190q-74-42-156-42-128 0-223 95t-95 223 90 219 218 91 224-95 96-223q0-88-46-162z m-678 358q0-88 68-156t156-68 151 63 63 153q0 88-68 155t-156 67-151-63-63-151z" horiz-adv-x="789" /> <glyph glyph-name="search" unicode="&#xe80d;" d="M772 78q30-34 6-62l-46-46q-36-32-68 0l-190 190q-74-42-156-42-128 0-223 95t-95 223 90 219 218 91 224-95 96-223q0-88-46-162z m-678 358q0-88 68-156t156-68 151 63 63 153q0 88-68 155t-156 67-151-63-63-151z" horiz-adv-x="789" />
<glyph glyph-name="pin" unicode="&#xe80f;" d="M268 368v250q0 8-5 13t-13 5-13-5-5-13v-250q0-8 5-13t13-5 13 5 5 13z m375-197q0-14-11-25t-25-10h-239l-29-270q-1-7-6-11t-11-5h-1q-15 0-17 15l-43 271h-225q-15 0-25 10t-11 25q0 69 44 124t99 55v286q-29 0-50 21t-22 50 22 50 50 22h357q29 0 50-22t21-50-21-50-50-21v-286q55 0 99-55t44-124z" horiz-adv-x="642.9" /> <glyph glyph-name="pin" unicode="&#xe80e;" d="M268 368v250q0 8-5 13t-13 5-13-5-5-13v-250q0-8 5-13t13-5 13 5 5 13z m375-197q0-14-11-25t-25-10h-239l-29-270q-1-7-6-11t-11-5h-1q-15 0-17 15l-43 271h-225q-15 0-25 10t-11 25q0 69 44 124t99 55v286q-29 0-50 21t-22 50 22 50 50 22h357q29 0 50-22t21-50-21-50-50-21v-286q55 0 99-55t44-124z" horiz-adv-x="642.9" />
<glyph glyph-name="cog" unicode="&#xe812;" d="M911 295l-133-56q-8-22-12-31l55-133-79-79-135 53q-9-4-31-12l-55-134-112 0-56 133q-11 4-33 13l-132-55-78 79 53 134q-1 3-4 9t-6 12-4 11l-131 55 0 112 131 56 14 33-54 132 78 79 133-54q22 9 33 13l55 132 112 0 56-132q14-5 31-13l133 55 80-79-54-135q6-12 12-30l133-56 0-112z m-447-111q69 0 118 48t49 118-49 119-118 50-119-50-49-119 49-118 119-48z" horiz-adv-x="928" /> <glyph glyph-name="cog" unicode="&#xe80f;" d="M911 295l-133-56q-8-22-12-31l55-133-79-79-135 53q-9-4-31-12l-55-134-112 0-56 133q-11 4-33 13l-132-55-78 79 53 134q-1 3-4 9t-6 12-4 11l-131 55 0 112 131 56 14 33-54 132 78 79 133-54q22 9 33 13l55 132 112 0 56-132q14-5 31-13l133 55 80-79-54-135q6-12 12-30l133-56 0-112z m-447-111q69 0 118 48t49 118-49 119-118 50-119-50-49-119 49-118 119-48z" horiz-adv-x="928" />
<glyph glyph-name="rss-feed" unicode="&#xe813;" d="M184 93c0-51-43-91-93-91s-91 40-91 91c0 50 41 91 91 91s93-41 93-91z m261-85l-125 0c0 174-140 323-315 323l0 118c231 0 440-163 440-441z m259 0l-136 0c0 300-262 561-563 561l0 129c370 0 699-281 699-690z" horiz-adv-x="704" /> <glyph glyph-name="rss" unicode="&#xe810;" d="M184 93c0-51-43-91-93-91s-91 40-91 91c0 50 41 91 91 91s93-41 93-91z m261-85l-125 0c0 174-140 323-315 323l0 118c231 0 440-163 440-441z m259 0l-136 0c0 300-262 561-563 561l0 129c370 0 699-281 699-690z" horiz-adv-x="704" />
<glyph glyph-name="ok" unicode="&#xe811;" d="M933 534q0-22-16-38l-404-404-76-76q-16-15-38-15t-38 15l-76 76-202 202q-15 16-15 38t15 38l76 76q16 16 38 16t38-16l164-165 366 367q16 16 38 16t38-16l76-76q16-15 16-38z" horiz-adv-x="1000" />
<glyph glyph-name="circle" unicode="&#xf111;" d="M857 350q0-117-57-215t-156-156-215-58-216 58-155 156-58 215 58 215 155 156 216 58 215-58 156-156 57-215z" horiz-adv-x="857.1" />
<glyph glyph-name="info" unicode="&#xf128;" d="M393 149v-134q0-9-7-15t-15-7h-134q-9 0-16 7t-7 15v134q0 9 7 16t16 6h134q9 0 15-6t7-16z m176 335q0-30-8-56t-20-43-31-33-32-25-34-19q-23-13-38-37t-15-37q0-10-7-18t-16-9h-134q-8 0-14 11t-6 20v26q0 46 37 87t79 60q33 16 47 32t14 42q0 24-26 41t-60 18q-36 0-60-16-20-14-60-64-7-9-17-9-7 0-14 4l-91 70q-8 6-9 14t3 16q89 148 259 148 45 0 90-17t81-46 59-72 23-88z" horiz-adv-x="571.4" /> <glyph glyph-name="info" unicode="&#xf128;" d="M393 149v-134q0-9-7-15t-15-7h-134q-9 0-16 7t-7 15v134q0 9 7 16t16 6h134q9 0 15-6t7-16z m176 335q0-30-8-56t-20-43-31-33-32-25-34-19q-23-13-38-37t-15-37q0-10-7-18t-16-9h-134q-8 0-14 11t-6 20v26q0 46 37 87t79 60q33 16 47 32t14 42q0 24-26 41t-60 18q-36 0-60-16-20-14-60-64-7-9-17-9-7 0-14 4l-91 70q-8 6-9 14t3 16q89 148 259 148 45 0 90-17t81-46 59-72 23-88z" horiz-adv-x="571.4" />
<glyph glyph-name="bird" unicode="&#xf309;" d="M920 636q-36-54-94-98l0-24q0-130-60-250t-186-203-290-83q-160 0-290 84 14-2 46-2 132 0 234 80-62 2-110 38t-66 94q10-4 34-4 26 0 50 6-66 14-108 66t-42 120l0 2q36-20 84-24-84 58-84 158 0 48 26 94 154-188 390-196-6 18-6 42 0 78 55 133t135 55q82 0 136-58 60 12 120 44-20-66-82-104 56 8 108 30z" horiz-adv-x="920" /> <glyph glyph-name="bird" unicode="&#xf309;" d="M920 636q-36-54-94-98l0-24q0-130-60-250t-186-203-290-83q-160 0-290 84 14-2 46-2 132 0 234 80-62 2-110 38t-66 94q10-4 34-4 26 0 50 6-66 14-108 66t-42 120l0 2q36-20 84-24-84 58-84 158 0 48 26 94 154-188 390-196-6 18-6 42 0 78 55 133t135 55q82 0 136-58 60 12 120 44-20-66-82-104 56 8 108 30z" horiz-adv-x="920" />
</font> </font>
</defs> </defs>
</svg> </svg>
@@ -1,66 +1,82 @@
 // @license http://www.gnu.org/licenses/agpl-3.0.html AGPL-3.0
 // SPDX-License-Identifier: AGPL-3.0-only

 function insertBeforeLast(node, elem) {
   node.insertBefore(elem, node.childNodes[node.childNodes.length - 2]);
 }

 function getLoadMore(doc) {
-  return doc.querySelector('.show-more:not(.timeline-item)');
+  return doc.querySelector(".show-more:not(.timeline-item)");
 }

 function isDuplicate(item, itemClass) {
   const tweet = item.querySelector(".tweet-link");
   if (tweet == null) return false;
   const href = tweet.getAttribute("href");
-  return document.querySelector(itemClass + " .tweet-link[href='" + href + "']") != null;
+  return (
+    document.querySelector(itemClass + " .tweet-link[href='" + href + "']") !=
+    null
+  );
 }

-window.onload = function() {
+window.onload = function () {
   const url = window.location.pathname;
   const isTweet = url.indexOf("/status/") !== -1;
   const containerClass = isTweet ? ".replies" : ".timeline";
-  const itemClass = containerClass + ' > div:not(.top-ref)';
+  const itemClass = containerClass + " > div:not(.top-ref)";

   var html = document.querySelector("html");
   var container = document.querySelector(containerClass);
   var loading = false;

-  window.addEventListener('scroll', function() {
+  function handleScroll(failed) {
     if (loading) return;
+
     if (html.scrollTop + html.clientHeight >= html.scrollHeight - 3000) {
       loading = true;
       var loadMore = getLoadMore(document);
       if (loadMore == null) return;

       loadMore.children[0].text = "Loading...";

       var url = new URL(loadMore.children[0].href);
-      url.searchParams.append('scroll', 'true');
+      url.searchParams.append("scroll", "true");

-      fetch(url.toString()).then(function (response) {
-        return response.text();
-      }).then(function (html) {
-        var parser = new DOMParser();
-        var doc = parser.parseFromString(html, 'text/html');
-        loadMore.remove();
+      fetch(url.toString())
+        .then(function (response) {
+          if (response.status > 299) throw "error";
+          return response.text();
+        })
+        .then(function (html) {
+          var parser = new DOMParser();
+          var doc = parser.parseFromString(html, "text/html");
+          loadMore.remove();

-        for (var item of doc.querySelectorAll(itemClass)) {
-          if (item.className == "timeline-item show-more") continue;
-          if (isDuplicate(item, itemClass)) continue;
-          if (isTweet) container.appendChild(item);
-          else insertBeforeLast(container, item);
-        }
+          for (var item of doc.querySelectorAll(itemClass)) {
+            if (item.className == "timeline-item show-more") continue;
+            if (isDuplicate(item, itemClass)) continue;
+            if (isTweet) container.appendChild(item);
+            else insertBeforeLast(container, item);
+          }

-        loading = false;
-        const newLoadMore = getLoadMore(doc);
-        if (newLoadMore == null) return;
-        if (isTweet) container.appendChild(newLoadMore);
-        else insertBeforeLast(container, newLoadMore);
-      }).catch(function (err) {
-        console.warn('Something went wrong.', err);
-        loading = true;
-      });
-    }
-  });
+          loading = false;
+          const newLoadMore = getLoadMore(doc);
+          if (newLoadMore == null) return;
+          if (isTweet) container.appendChild(newLoadMore);
+          else insertBeforeLast(container, newLoadMore);
+        })
+        .catch(function (err) {
+          console.warn("Something went wrong.", err);
+          if (failed > 3) {
+            loadMore.children[0].text = "Error";
+            return;
+          }
+          loading = false;
+          handleScroll((failed || 0) + 1);
+        });
+    }
+  }

+  window.addEventListener("scroll", () => handleScroll());
 };
 // @license-end
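The retry logic added above is bounded: on a failed fetch the scroll handler re-invokes itself with an incremented failure counter and gives up after more than 3 failures, showing "Error" instead of looping forever. The same pattern, reduced to its core, can be sketched like this (illustrative helper, not code from the repo):

```python
def fetch_with_retry(fetch, max_failures=3):
    """Call fetch(); on an exception retry, re-raising after max_failures extra attempts."""
    failures = 0
    while True:
        try:
            return fetch()
        except Exception:
            failures += 1
            if failures > max_failures:
                raise


# a flaky callable that fails twice before succeeding, to exercise the retry path
attempts = {"n": 0}

def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient")
    return "ok"
```

Bounding the retry count matters here because the failure may be permanent (e.g. the cursor expired), in which case unbounded retries would hammer the instance.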
@@ -4,15 +4,15 @@ Nitter is a free and open source alternative Twitter front-end focused on
 privacy and performance. The source is available on GitHub at
 <https://github.com/zedeus/nitter>

-* No JavaScript or ads
-* All requests go through the backend, client never talks to Twitter
-* Prevents Twitter from tracking your IP or JavaScript fingerprint
-* Uses Twitter's unofficial API (no rate limits or developer account required)
-* Lightweight (for [@nim_lang](/nim_lang), 60KB vs 784KB from twitter.com)
-* RSS feeds
-* Themes
-* Mobile support (responsive design)
-* AGPLv3 licensed, no proprietary instances permitted
+- No JavaScript or ads
+- All requests go through the backend, client never talks to Twitter
+- Prevents Twitter from tracking your IP or JavaScript fingerprint
+- Uses Twitter's unofficial API (no developer account required)
+- Lightweight (for [@nim_lang](/nim_lang), 60KB vs 784KB from twitter.com)
+- RSS feeds
+- Themes
+- Mobile support (responsive design)
+- AGPLv3 licensed, no proprietary instances permitted

 Nitter's GitHub wiki contains
 [instances](https://github.com/zedeus/nitter/wiki/Instances) and
@@ -21,12 +21,13 @@ maintained by the community.
 ## Why use Nitter?

-It's impossible to use Twitter without JavaScript enabled. For privacy-minded
-folks, preventing JavaScript analytics and IP-based tracking is important, but
-apart from using a VPN and uBlock/uMatrix, it's impossible. Despite being behind
-a VPN and using heavy-duty adblockers, you can get accurately tracked with your
-[browser's fingerprint](https://restoreprivacy.com/browser-fingerprinting/),
-[no JavaScript required](https://noscriptfingerprint.com/). This all became
+It's impossible to use Twitter without JavaScript enabled, and as of 2024 you
+need to sign up. For privacy-minded folks, preventing JavaScript analytics and
+IP-based tracking is important, but apart from using a VPN and uBlock/uMatrix,
+it's impossible. Despite being behind a VPN and using heavy-duty adblockers,
+you can get accurately tracked with your [browser's
+fingerprint](https://restoreprivacy.com/browser-fingerprinting/), [no
+JavaScript required](https://noscriptfingerprint.com/). This all became
 particularly important after Twitter [removed the
 ability](https://www.eff.org/deeplinks/2020/04/twitter-removes-privacy-option-and-shows-why-we-need-strong-privacy-laws)
 for users to control whether their data gets sent to advertisers.
@@ -42,12 +43,13 @@ Twitter account.
 ## Donating

-Liberapay: <https://liberapay.com/zedeus> \
-Patreon: <https://patreon.com/nitter> \
-BTC: bc1qp7q4qz0fgfvftm5hwz3vy284nue6jedt44kxya \
-ETH: 0x66d84bc3fd031b62857ad18c62f1ba072b011925 \
-LTC: ltc1qhsz5nxw6jw9rdtw9qssjeq2h8hqk2f85rdgpkr \
-XMR: 42hKayRoEAw4D6G6t8mQHPJHQcXqofjFuVfavqKeNMNUZfeJLJAcNU19i1bGdDvcdN6romiSscWGWJCczFLe9RFhM3d1zpL
+Liberapay: https://liberapay.com/zedeus \
+Patreon: https://patreon.com/nitter \
+BTC: bc1qpqpzjkcpgluhzf7x9yqe7jfe8gpfm5v08mdr55 \
+ETH: 0x24a0DB59A923B588c7A5EBd0dBDFDD1bCe9c4460 \
+XMR: 42hKayRoEAw4D6G6t8mQHPJHQcXqofjFuVfavqKeNMNUZfeJLJAcNU19i1bGdDvcdN6romiSscWGWJCczFLe9RFhM3d1zpL \
+SOL: ANsyGNXFo6osuFwr1YnUqif2RdoYRhc27WdyQNmmETSW \
+ZEC: u1vndfqtzyy6qkzhkapxelel7ams38wmfeccu3fdpy2wkuc4erxyjm8ncjhnyg747x6t0kf0faqhh2hxyplgaum08d2wnj4n7cyu9s6zhxkqw2aef4hgd4s6vh5hpqvfken98rg80kgtgn64ff70djy7s8f839z00hwhuzlcggvefhdlyszkvwy3c7yw623vw3rvar6q6evd3xcvveypt

 ## Contact
@@ -1,58 +1,102 @@
 # SPDX-License-Identifier: AGPL-3.0-only
-import asyncdispatch, httpclient, uri, strutils, sequtils, sugar
+import asyncdispatch, httpclient, strutils, sequtils, sugar
 import packedjson
 import types, query, formatters, consts, apiutils, parser
 import experimental/parser as newParser

+# Helper to generate params object for GraphQL requests
+proc genParams(variables: string; fieldToggles = ""): seq[(string, string)] =
+  result.add ("variables", variables)
+  result.add ("features", gqlFeatures)
+  if fieldToggles.len > 0:
+    result.add ("fieldToggles", fieldToggles)
+
+proc apiUrl(endpoint, variables: string; fieldToggles = ""): ApiUrl =
+  return ApiUrl(endpoint: endpoint, params: genParams(variables, fieldToggles))
+
+proc apiReq(endpoint, variables: string; fieldToggles = ""): ApiReq =
+  let url = apiUrl(endpoint, variables, fieldToggles)
+  return ApiReq(cookie: url, oauth: url)
+
+proc mediaUrl(id: string; cursor: string): ApiReq =
+  result = ApiReq(
+    cookie: apiUrl(graphUserMedia, userMediaVars % [id, cursor]),
+    oauth: apiUrl(graphUserMediaV2, restIdVars % [id, cursor])
+  )
+
+proc userTweetsUrl(id: string; cursor: string): ApiReq =
+  result = ApiReq(
+    # cookie: apiUrl(graphUserTweets, userTweetsVars % [id, cursor], userTweetsFieldToggles),
+    oauth: apiUrl(graphUserTweetsV2, restIdVars % [id, cursor])
+  )
+  # might change this in the future pending testing
+  result.cookie = result.oauth
+
+proc userTweetsAndRepliesUrl(id: string; cursor: string): ApiReq =
+  let cookieVars = userTweetsAndRepliesVars % [id, cursor]
+  result = ApiReq(
+    cookie: apiUrl(graphUserTweetsAndReplies, cookieVars, userTweetsFieldToggles),
+    oauth: apiUrl(graphUserTweetsAndRepliesV2, restIdVars % [id, cursor])
+  )
+
+proc tweetDetailUrl(id: string; cursor: string): ApiReq =
+  let cookieVars = tweetDetailVars % [id, cursor]
+  result = ApiReq(
+    # cookie: apiUrl(graphTweetDetail, cookieVars, tweetDetailFieldToggles),
+    cookie: apiUrl(graphTweet, tweetVars % [id, cursor]),
+    oauth: apiUrl(graphTweet, tweetVars % [id, cursor])
+  )
+
+proc userUrl(username: string): ApiReq =
+  let cookieVars = """{"screen_name":"$1","withGrokTranslatedBio":false}""" % username
+  result = ApiReq(
+    cookie: apiUrl(graphUser, cookieVars, tweetDetailFieldToggles),
+    oauth: apiUrl(graphUserV2, """{"screen_name": "$1"}""" % username)
+  )
+
 proc getGraphUser*(username: string): Future[User] {.async.} =
   if username.len == 0: return
-  let
-    variables = %*{"screen_name": username}
-    params = {"variables": $variables, "features": gqlFeatures}
-    js = await fetchRaw(graphUser ? params, Api.userScreenName)
+  let js = await fetchRaw(userUrl(username))
   result = parseGraphUser(js)

 proc getGraphUserById*(id: string): Future[User] {.async.} =
   if id.len == 0 or id.any(c => not c.isDigit): return
   let
-    variables = %*{"userId": id}
-    params = {"variables": $variables, "features": gqlFeatures}
-    js = await fetchRaw(graphUserById ? params, Api.userRestId)
+    url = apiReq(graphUserById, """{"rest_id": "$1"}""" % id)
+    js = await fetchRaw(url)
result = parseGraphUser(js) result = parseGraphUser(js)
proc getGraphUserTweets*(id: string; kind: TimelineKind; after=""): Future[Timeline] {.async.} = proc getGraphUserTweets*(id: string; kind: TimelineKind; after=""): Future[Profile] {.async.} =
if id.len == 0: return if id.len == 0: return
let let
cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: "" cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: ""
variables = userTweetsVariables % [id, cursor] url = case kind
params = {"variables": variables, "features": gqlFeatures} of TimelineKind.tweets: userTweetsUrl(id, cursor)
(url, apiId) = case kind of TimelineKind.replies: userTweetsAndRepliesUrl(id, cursor)
of TimelineKind.tweets: (graphUserTweets, Api.userTweets) of TimelineKind.media: mediaUrl(id, cursor)
of TimelineKind.replies: (graphUserTweetsAndReplies, Api.userTweetsAndReplies) js = await fetch(url)
of TimelineKind.media: (graphUserMedia, Api.userMedia) result = parseGraphTimeline(js, after)
js = await fetch(url ? params, apiId)
result = parseGraphTimeline(js, "user", after)
proc getGraphListTweets*(id: string; after=""): Future[Timeline] {.async.} = proc getGraphListTweets*(id: string; after=""): Future[Timeline] {.async.} =
if id.len == 0: return if id.len == 0: return
let let
cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: "" cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: ""
variables = listTweetsVariables % [id, cursor] url = apiReq(graphListTweets, restIdVars % [id, cursor])
params = {"variables": variables, "features": gqlFeatures} js = await fetch(url)
js = await fetch(graphListTweets ? params, Api.listTweets) result = parseGraphTimeline(js, after).tweets
result = parseGraphTimeline(js, "list", after)
proc getGraphListBySlug*(name, list: string): Future[List] {.async.} = proc getGraphListBySlug*(name, list: string): Future[List] {.async.} =
let let
variables = %*{"screenName": name, "listSlug": list} variables = %*{"screenName": name, "listSlug": list}
params = {"variables": $variables, "features": gqlFeatures} url = apiReq(graphListBySlug, $variables)
result = parseGraphList(await fetch(graphListBySlug ? params, Api.listBySlug)) js = await fetch(url)
result = parseGraphList(js)
proc getGraphList*(id: string): Future[List] {.async.} = proc getGraphList*(id: string): Future[List] {.async.} =
let let
variables = %*{"listId": id} url = apiReq(graphListById, """{"listId": "$1"}""" % id)
params = {"variables": $variables, "features": gqlFeatures} js = await fetch(url)
result = parseGraphList(await fetch(graphListById ? params, Api.list)) result = parseGraphList(js)
proc getGraphListMembers*(list: List; after=""): Future[Result[User]] {.async.} = proc getGraphListMembers*(list: List; after=""): Future[Result[User]] {.async.} =
if list.id.len == 0: return if list.id.len == 0: return
@@ -66,24 +110,23 @@ proc getGraphListMembers*(list: List; after=""): Future[Result[User]] {.async.}
} }
if after.len > 0: if after.len > 0:
variables["cursor"] = % after variables["cursor"] = % after
let url = graphListMembers ? {"variables": $variables, "features": gqlFeatures} let
result = parseGraphListMembers(await fetchRaw(url, Api.listMembers), after) url = apiReq(graphListMembers, $variables)
js = await fetchRaw(url)
result = parseGraphListMembers(js, after)
proc getGraphTweetResult*(id: string): Future[Tweet] {.async.} = proc getGraphTweetResult*(id: string): Future[Tweet] {.async.} =
if id.len == 0: return if id.len == 0: return
let let
variables = tweetResultVariables % id url = apiReq(graphTweetResult, """{"rest_id": "$1"}""" % id)
params = {"variables": variables, "features": gqlFeatures} js = await fetch(url)
js = await fetch(graphTweetResult ? params, Api.tweetResult)
result = parseGraphTweetResult(js) result = parseGraphTweetResult(js)
proc getGraphTweet(id: string; after=""): Future[Conversation] {.async.} = proc getGraphTweet(id: string; after=""): Future[Conversation] {.async.} =
if id.len == 0: return if id.len == 0: return
let let
cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: "" cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: ""
variables = tweetVariables % [id, cursor] js = await fetch(tweetDetailUrl(id, cursor))
params = {"variables": variables, "features": gqlFeatures}
js = await fetch(graphTweet ? params, Api.tweetDetail)
result = parseGraphConversation(js, id) result = parseGraphConversation(js, id)
proc getReplies*(id, after: string): Future[Result[Chain]] {.async.} = proc getReplies*(id, after: string): Future[Result[Chain]] {.async.} =
@@ -95,14 +138,22 @@ proc getTweet*(id: string; after=""): Future[Conversation] {.async.} =
if after.len > 0: if after.len > 0:
result.replies = await getReplies(id, after) result.replies = await getReplies(id, after)
proc getGraphSearch*(query: Query; after=""): Future[Result[Tweet]] {.async.} = proc getGraphEditHistory*(id: string): Future[EditHistory] {.async.} =
if id.len == 0: return
let
url = apiReq(graphTweetEditHistory, tweetEditHistoryVars % id)
js = await fetch(url)
result = parseGraphEditHistory(js, id)
proc getGraphTweetSearch*(query: Query; after=""): Future[Timeline] {.async.} =
let q = genQueryParam(query) let q = genQueryParam(query)
if q.len == 0 or q == emptyQuery: if q.len == 0 or q == emptyQuery:
return Result[Tweet](query: query, beginning: true) return Timeline(query: query, beginning: true)
var var
variables = %*{ variables = %*{
"rawQuery": q, "rawQuery": q,
"query_source": "typedQuery",
"count": 20, "count": 20,
"product": "Latest", "product": "Latest",
"withDownvotePerspective": false, "withDownvotePerspective": false,
@@ -111,35 +162,46 @@ proc getGraphSearch*(query: Query; after=""): Future[Result[Tweet]] {.async.} =
} }
if after.len > 0: if after.len > 0:
variables["cursor"] = % after variables["cursor"] = % after
let url = graphSearchTimeline ? {"variables": $variables, "features": gqlFeatures} let
result = parseGraphSearch(await fetch(url, Api.search), after) url = apiReq(graphSearchTimeline, $variables)
js = await fetch(url)
result = parseGraphSearch[Tweets](js, after)
result.query = query result.query = query
proc getUserSearch*(query: Query; page="1"): Future[Result[User]] {.async.} = # when no more items are available the API just returns the last page in
# full. this detects that and clears the page instead.
if after.len > 0 and result.bottom.len > 0 and
after[0..<64] == result.bottom[0..<64]:
result.content.setLen(0)
proc getGraphUserSearch*(query: Query; after=""): Future[Result[User]] {.async.} =
if query.text.len == 0: if query.text.len == 0:
return Result[User](query: query, beginning: true) return Result[User](query: query, beginning: true)
var url = userSearch ? { var
"q": query.text, variables = %*{
"skip_status": "1", "rawQuery": query.text,
"count": "20", "query_source": "typedQuery",
"page": page "count": 20,
} "product": "People",
"withDownvotePerspective": false,
"withReactionsMetadata": false,
"withReactionsPerspective": false
}
if after.len > 0:
variables["cursor"] = % after
result.beginning = false
result = parseUsers(await fetchRaw(url, Api.userSearch)) let
url = apiReq(graphSearchTimeline, $variables)
js = await fetch(url)
result = parseGraphSearch[User](js, after)
result.query = query result.query = query
if page.len == 0:
result.bottom = "2"
elif page.allCharsInSet(Digits):
result.bottom = $(parseInt(page) + 1)
proc getPhotoRail*(name: string): Future[PhotoRail] {.async.} = proc getPhotoRail*(id: string): Future[PhotoRail] {.async.} =
if name.len == 0: return if id.len == 0: return
let let js = await fetch(mediaUrl(id, ""))
ps = genParams({"screen_name": name, "trim_user": "true"}, result = parseGraphPhotoRail(js)
count="18", ext=false)
url = photoRail ? ps
result = parsePhotoRail(await fetch(url, Api.timeline))
proc resolve*(url: string; prefs: Prefs): Future[string] {.async.} = proc resolve*(url: string; prefs: Prefs): Future[string] {.async.} =
let client = newAsyncHttpClient(maxRedirects=0) let client = newAsyncHttpClient(maxRedirects=0)
+153 -77
@@ -1,68 +1,116 @@
# SPDX-License-Identifier: AGPL-3.0-only
import httpclient, asyncdispatch, options, strutils, uri, times, math, tables
import jsony, packedjson, zippy, oauth1
import types, auth, consts, parserutils, http_pool, tid
import experimental/types/common

const
  rlRemaining = "x-rate-limit-remaining"
  rlReset = "x-rate-limit-reset"
  rlLimit = "x-rate-limit-limit"
  errorsToSkip = {null, doesntExist, tweetNotFound, timeout, unauthorized, badRequest}

var
  pool: HttpPool
  disableTid: bool
  apiProxy: string

proc setDisableTid*(disable: bool) =
  disableTid = disable

proc setApiProxy*(url: string) =
  if url.len > 0:
    apiProxy = url.strip(chars={'/'}) & "/"
    if "http" notin apiProxy:
      apiProxy = "http://" & apiProxy

proc toUrl(req: ApiReq; sessionKind: SessionKind): Uri =
  case sessionKind
  of oauth:
    let o = req.oauth
    parseUri("https://api.x.com/graphql") / o.endpoint ? o.params
  of cookie:
    let c = req.cookie
    parseUri("https://x.com/i/api/graphql") / c.endpoint ? c.params

proc getOauthHeader(url, oauthToken, oauthTokenSecret: string): string =
  let
    encodedUrl = url.replace(",", "%2C").replace("+", "%20")
    params = OAuth1Parameters(
      consumerKey: consumerKey,
      signatureMethod: "HMAC-SHA1",
      timestamp: $int(round(epochTime())),
      nonce: "0",
      isIncludeVersionToHeader: true,
      token: oauthToken
    )
    signature = getSignature(HttpGet, encodedUrl, "", params, consumerSecret, oauthTokenSecret)

  params.signature = percentEncode(signature)

  return getOauth1RequestHeader(params)["authorization"]

proc getCookieHeader(authToken, ct0: string): string =
  "auth_token=" & authToken & "; ct0=" & ct0

proc genHeaders*(session: Session, url: Uri): Future[HttpHeaders] {.async.} =
  result = newHttpHeaders({
    "accept": "*/*",
    "accept-encoding": "gzip",
    "accept-language": "en-US,en;q=0.9",
    "connection": "keep-alive",
    "content-type": "application/json",
    "origin": "https://x.com",
    "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/142.0.0.0 Safari/537.36",
    "x-twitter-active-user": "yes",
    "x-twitter-client-language": "en",
    "priority": "u=1, i"
  })

  case session.kind
  of SessionKind.oauth:
    result["authorization"] = getOauthHeader($url, session.oauthToken, session.oauthSecret)
  of SessionKind.cookie:
    result["x-twitter-auth-type"] = "OAuth2Session"
    result["x-csrf-token"] = session.ct0
    result["cookie"] = getCookieHeader(session.authToken, session.ct0)
    result["sec-ch-ua"] = """"Google Chrome";v="142", "Chromium";v="142", "Not A(Brand";v="24""""
    result["sec-ch-ua-mobile"] = "?0"
    result["sec-ch-ua-platform"] = "Windows"
    result["sec-fetch-dest"] = "empty"
    result["sec-fetch-mode"] = "cors"
    result["sec-fetch-site"] = "same-site"

    if disableTid:
      result["authorization"] = bearerToken2
    else:
      result["authorization"] = bearerToken
      result["x-client-transaction-id"] = await genTid(url.path)

proc getAndValidateSession*(req: ApiReq): Future[Session] {.async.} =
  result = await getSession(req)
  case result.kind
  of SessionKind.oauth:
    if result.oauthToken.len == 0:
      echo "[sessions] Empty oauth token, session: ", result.pretty
      raise rateLimitError()
  of SessionKind.cookie:
    if result.authToken.len == 0 or result.ct0.len == 0:
      echo "[sessions] Empty cookie credentials, session: ", result.pretty
      raise rateLimitError()

template fetchImpl(result, fetchBody) {.dirty.} =
  once:
    pool = HttpPool()

  try:
    var resp: AsyncResponse
    pool.use(await genHeaders(session, url)):
      template getContent =
        # TODO: this is a temporary simple implementation
        if apiProxy.len > 0:
          resp = await c.get(($url).replace("https://", apiProxy))
        else:
          resp = await c.get($url)
        result = await resp.body

      getContent()
@@ -71,57 +119,85 @@ template fetchImpl(result, fetchBody) {.dirty.} =
        badClient = true
        raise newException(BadClientError, "Bad client")

    if resp.headers.hasKey(rlRemaining):
      let
        remaining = parseInt(resp.headers[rlRemaining])
        reset = parseInt(resp.headers[rlReset])
        limit = parseInt(resp.headers[rlLimit])
      session.setRateLimit(req, remaining, reset, limit)

    if result.len > 0:
      if resp.headers.getOrDefault("content-encoding") == "gzip":
        result = uncompress(result, dfGzip)

      if result.startsWith("{\"errors"):
        let errors = result.fromJson(Errors)
        if errors notin errorsToSkip:
          echo "Fetch error, API: ", url.path, ", errors: ", errors
          if errors in {expiredToken, badToken, locked}:
            invalidate(session)
            raise rateLimitError()
          elif errors in {rateLimited}:
            # rate limit hit, resets after 24 hours
            setLimited(session, req)
            raise rateLimitError()
      elif result.startsWith("429 Too Many Requests"):
        echo "[sessions] 429 error, API: ", url.path, ", session: ", session.pretty
        raise rateLimitError()

    fetchBody

    if resp.status == $Http400:
      echo "ERROR 400, ", url.path, ": ", result
      raise newException(InternalError, $url)
  except InternalError as e:
    raise e
  except BadClientError as e:
    raise e
  except OSError as e:
    raise e
  except Exception as e:
    let s = session.pretty
    echo "error: ", e.name, ", msg: ", e.msg, ", session: ", s, ", url: ", url
    raise rateLimitError()
  finally:
    release(session)

template retry(bod) =
  try:
    bod
  except RateLimitError:
    echo "[sessions] Rate limited, retrying ", req.cookie.endpoint, " request..."
    bod

proc fetch*(req: ApiReq): Future[JsonNode] {.async.} =
  retry:
    var
      body: string
      session = await getAndValidateSession(req)
    let url = req.toUrl(session.kind)

    fetchImpl body:
      if body.startsWith('{') or body.startsWith('['):
        result = parseJson(body)
      else:
        echo resp.status, ": ", body, " --- url: ", url
        result = newJNull()

      let error = result.getError
      if error != null and error notin errorsToSkip:
        echo "Fetch error, API: ", url.path, ", error: ", error
        if error in {expiredToken, badToken, locked}:
          invalidate(session)
          raise rateLimitError()

proc fetchRaw*(req: ApiReq): Future[string] {.async.} =
  retry:
    var session = await getAndValidateSession(req)
    let url = req.toUrl(session.kind)

    fetchImpl result:
      if not (result.startsWith('{') or result.startsWith('[')):
        echo resp.status, ": ", result, " --- url: ", url
        result.setLen(0)
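The retry template above re-runs the request body exactly once after a RateLimitError, which lets the second attempt pick a different session from the pool; a second failure propagates to the caller. A rough Python equivalent of that retry-once pattern (names hypothetical):

```python
class RateLimitError(Exception):
    """Raised when a request is rejected for rate-limit reasons."""


def retry_once(request):
    """Run `request` and retry exactly once on RateLimitError.

    A RateLimitError raised by the second attempt is not caught,
    so persistent failures still surface to the caller.
    """
    try:
        return request()
    except RateLimitError:
        return request()  # second attempt, e.g. with a fresh session
```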
+213
@@ -0,0 +1,213 @@
# SPDX-License-Identifier: AGPL-3.0-only
import std/[asyncdispatch, times, json, random, strutils, tables, packedsets, os]
import types, consts
import experimental/parser/session

const hourInSeconds = 60 * 60

var
  sessionPool: seq[Session]
  enableLogging = false
  # max requests at a time per session to avoid race conditions
  maxConcurrentReqs = 2

proc setMaxConcurrentReqs*(reqs: int) =
  if reqs > 0:
    maxConcurrentReqs = reqs

template log(str: varargs[string, `$`]) =
  echo "[sessions] ", str.join("")

proc endpoint(req: ApiReq; session: Session): string =
  case session.kind
  of oauth: req.oauth.endpoint
  of cookie: req.cookie.endpoint

proc pretty*(session: Session): string =
  if session.isNil:
    return "<null>"
  if session.id > 0 and session.username.len > 0:
    result = $session.id & " (" & session.username & ")"
  elif session.username.len > 0:
    result = session.username
  elif session.id > 0:
    result = $session.id
  else:
    result = "<unknown>"
  result = $session.kind & " " & result

proc snowflakeToEpoch(flake: int64): int64 =
  int64(((flake shr 22) + 1288834974657) div 1000)

proc getSessionPoolHealth*(): JsonNode =
  let now = epochTime().int

  var
    totalReqs = 0
    limited: PackedSet[int64]
    reqsPerApi: Table[string, int]
    oldest = now.int64
    newest = 0'i64
    average = 0'i64

  for session in sessionPool:
    let created = snowflakeToEpoch(session.id)
    if created > newest:
      newest = created
    if created < oldest:
      oldest = created
    average += created

    if session.limited:
      limited.incl session.id

    for api in session.apis.keys:
      let
        apiStatus = session.apis[api]
        reqs = apiStatus.limit - apiStatus.remaining

      # no requests made with this session and endpoint since the limit reset
      if apiStatus.reset < now:
        continue

      reqsPerApi.mgetOrPut($api, 0).inc reqs
      totalReqs.inc reqs

  if sessionPool.len > 0:
    average = average div sessionPool.len
  else:
    oldest = 0
    average = 0

  return %*{
    "sessions": %*{
      "total": sessionPool.len,
      "limited": limited.card,
      "oldest": $fromUnix(oldest),
      "newest": $fromUnix(newest),
      "average": $fromUnix(average)
    },
    "requests": %*{
      "total": totalReqs,
      "apis": reqsPerApi
    }
  }

proc getSessionPoolDebug*(): JsonNode =
  let now = epochTime().int
  var list = newJObject()

  for session in sessionPool:
    let sessionJson = %*{
      "apis": newJObject(),
      "pending": session.pending,
    }

    if session.limited:
      sessionJson["limited"] = %true

    for api in session.apis.keys:
      let
        apiStatus = session.apis[api]
        obj = %*{}

      if apiStatus.reset > now.int:
        obj["remaining"] = %apiStatus.remaining
        obj["reset"] = %apiStatus.reset

      if "remaining" notin obj:
        continue

      sessionJson{"apis", $api} = obj
    list[$session.id] = sessionJson

  return %list

proc rateLimitError*(): ref RateLimitError =
  newException(RateLimitError, "rate limited")

proc noSessionsError*(): ref NoSessionsError =
  newException(NoSessionsError, "no sessions available")

proc isLimited(session: Session; req: ApiReq): bool =
  if session.isNil:
    return true

  let api = req.endpoint(session)

  if session.limited and api != graphUserTweetsV2:
    if (epochTime().int - session.limitedAt) > hourInSeconds:
      session.limited = false
      log "resetting limit: ", session.pretty
      return false
    else:
      return true

  if api in session.apis:
    let limit = session.apis[api]
    return limit.remaining <= 10 and limit.reset > epochTime().int
  else:
    return false

proc isReady(session: Session; req: ApiReq): bool =
  not (session.isNil or session.pending > maxConcurrentReqs or session.isLimited(req))

proc invalidate*(session: var Session) =
  if session.isNil: return
  log "invalidating: ", session.pretty

  # TODO: This isn't sufficient, but it works for now
  let idx = sessionPool.find(session)
  if idx > -1: sessionPool.delete(idx)
  session = nil

proc release*(session: Session) =
  if session.isNil: return
  dec session.pending

proc getSession*(req: ApiReq): Future[Session] {.async.} =
  for i in 0 ..< sessionPool.len:
    if result.isReady(req): break
    result = sessionPool.sample()

  if not result.isNil and result.isReady(req):
    inc result.pending
  else:
    log "no sessions available for API: ", req.cookie.endpoint
    raise noSessionsError()

proc setLimited*(session: Session; req: ApiReq) =
  let api = req.endpoint(session)
  session.limited = true
  session.limitedAt = epochTime().int
  log "rate limited by api: ", api, ", reqs left: ", session.apis[api].remaining, ", ", session.pretty

proc setRateLimit*(session: Session; req: ApiReq; remaining, reset, limit: int) =
  # avoid undefined behavior in race conditions
  let api = req.endpoint(session)
  if api in session.apis:
    let rateLimit = session.apis[api]
    if rateLimit.reset >= reset and rateLimit.remaining < remaining:
      return
    if rateLimit.reset == reset and rateLimit.remaining >= remaining:
      session.apis[api].remaining = remaining
      return

  session.apis[api] = RateLimit(limit: limit, remaining: remaining, reset: reset)

proc initSessionPool*(cfg: Config; path: string) =
  enableLogging = cfg.enableDebug

  if path.endsWith(".json"):
    log "ERROR: .json is not supported, the file must be a valid JSONL file ending in .jsonl"
    quit 1

  if not fileExists(path):
    log "ERROR: ", path, " not found. This file is required to authenticate API requests."
    quit 1

  log "parsing JSONL account sessions file: ", path
  for line in path.lines:
    sessionPool.add parseSession(line)

  log "successfully added ", sessionPool.len, " valid account sessions"
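The pool-health stats derive a session's creation time from its snowflake ID: shift out the 22 low bits (worker and sequence fields), add the snowflake epoch offset of 1288834974657 ms, and divide down to seconds. A small Python sketch mirroring the snowflakeToEpoch proc above:

```python
TWITTER_EPOCH_MS = 1288834974657  # 2010-11-04, start of the snowflake epoch

def snowflake_to_epoch(flake: int) -> int:
    """Unix timestamp in seconds encoded in a snowflake ID."""
    return ((flake >> 22) + TWITTER_EPOCH_MS) // 1000

# Round-trip check: an ID minted at 2020-01-01T00:00:00Z (1577836800)
# decodes back to that timestamp.
flake = (1577836800000 - TWITTER_EPOCH_MS) << 22
```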
+11 -2
@@ -13,6 +13,8 @@ proc get*[T](config: parseCfg.Config; section, key: string; default: T): T =
proc getConfig*(path: string): (Config, parseCfg.Config) =
  var cfg = loadConfig(path)

  let masterRss = cfg.get("Config", "enableRSS", true)

  let conf = Config(
    # Server
    address: cfg.get("Server", "address", "0.0.0.0"),
@@ -37,10 +39,17 @@ proc getConfig*(path: string): (Config, parseCfg.Config) =
    hmacKey: cfg.get("Config", "hmacKey", "secretkey"),
    base64Media: cfg.get("Config", "base64Media", false),
    minTokens: cfg.get("Config", "tokenCount", 10),
    enableRSSUserTweets: masterRss and cfg.get("Config", "enableRSSUserTweets", true),
    enableRSSUserReplies: masterRss and cfg.get("Config", "enableRSSUserReplies", true),
    enableRSSUserMedia: masterRss and cfg.get("Config", "enableRSSUserMedia", true),
    enableRSSSearch: masterRss and cfg.get("Config", "enableRSSSearch", true),
    enableRSSList: masterRss and cfg.get("Config", "enableRSSList", true),
    enableDebug: cfg.get("Config", "enableDebug", false),
    proxy: cfg.get("Config", "proxy", ""),
    proxyAuth: cfg.get("Config", "proxyAuth", ""),
    apiProxy: cfg.get("Config", "apiProxy", ""),
    disableTid: cfg.get("Config", "disableTid", false),
    maxConcurrentReqs: cfg.get("Config", "maxConcurrentReqs", 2)
  )

  return (conf, cfg)
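Based on the keys read above, a [Config] section exercising the new options might look like this (values illustrative; enableRSS stays the master switch that every per-feed toggle is AND-ed with):

```ini
[Config]
enableRSS = true            ; master switch; false disables all feeds below
enableRSSUserTweets = true
enableRSSUserReplies = false
enableRSSUserMedia = true
enableRSSSearch = false
enableRSSList = true
apiProxy = ""               ; optional prefix substituted for https:// in API URLs
disableTid = false          ; skip x-client-transaction-id generation
maxConcurrentReqs = 2       ; max in-flight requests per session
```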
+133 -83
@@ -1,121 +1,171 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils

const
  consumerKey* = "3nVuSoBZnx6U4vzUxf5w"
  consumerSecret* = "Bcs59EFbbsdF6Sl9Ng71smgStWEGwXXKSjYvPVt7qys"

  bearerToken* = "Bearer AAAAAAAAAAAAAAAAAAAAANRILgAAAAAAnNwIzUejRCOuH5E6I8xnZz4puTs%3D1Zv7ttfk8LF81IUq16cHjhLTvJu4FA33AGWWjCpTnA"
  bearerToken2* = "Bearer AAAAAAAAAAAAAAAAAAAAAFXzAwAAAAAAMHCxpeSDG1gLNLghVe8d74hl6k4%3DRUMF4xAQLsbeBhTSRrCiQpJtxoGWeyHrDb5te2jpGskWDFW82F"

  graphUser* = "-oaLodhGbbnzJBACb1kk2Q/UserByScreenName"
  graphUserV2* = "WEoGnYB0EG1yGwamDCF6zg/UserResultByScreenNameQuery"
  graphUserById* = "VN33vKXrPT7p35DgNR27aw/UserResultByIdQuery"
  graphUserTweetsV2* = "6QdSuZ5feXxOadEdXa4XZg/UserWithProfileTweetsQueryV2"
  graphUserTweetsAndRepliesV2* = "BDX77Xzqypdt11-mDfgdpQ/UserWithProfileTweetsAndRepliesQueryV2"
  graphUserTweets* = "oRJs8SLCRNRbQzuZG93_oA/UserTweets"
  graphUserTweetsAndReplies* = "kkaJ0Mf34PZVarrxzLihjg/UserTweetsAndReplies"
  graphUserMedia* = "36oKqyQ7E_9CmtONGjJRsA/UserMedia"
  graphUserMediaV2* = "bp0e_WdXqgNBIwlLukzyYA/MediaTimelineV2"
  graphTweet* = "Y4Erk_-0hObvLpz0Iw3bzA/ConversationTimeline"
  graphTweetDetail* = "YVyS4SfwYW7Uw5qwy0mQCA/TweetDetail"
  graphTweetResult* = "nzme9KiYhfIOrrLrPP_XeQ/TweetResultByIdQuery"
  graphTweetEditHistory* = "upS9teTSG45aljmP9oTuXA/TweetEditHistory"
  graphSearchTimeline* = "bshMIjqDk8LTXTq4w91WKw/SearchTimeline"
  graphListById* = "cIUpT1UjuGgl_oWiY7Snhg/ListByRestId"
  graphListBySlug* = "K6wihoTiTrzNzSF8y1aeKQ/ListBySlug"
  graphListMembers* = "fuVHh5-gFn8zDBBxb8wOMA/ListMembers"
  graphListTweets* = "VQf8_XQynI3WzH6xopOMMQ/ListTimeline"

  gqlFeatures* = """{
  "android_ad_formats_media_component_render_overlay_enabled": false,
  "android_graphql_skip_api_media_color_palette": false,
  "android_professional_link_spotlight_display_enabled": false,
  "articles_api_enabled": false,
  "articles_preview_enabled": true,
  "blue_business_profile_image_shape_enabled": false,
  "c9s_tweet_anatomy_moderator_badge_enabled": true,
  "commerce_android_shop_module_enabled": false,
  "communities_web_enable_tweet_community_results_fetch": true,
  "creator_subscriptions_quote_tweet_preview_enabled": false,
  "creator_subscriptions_subscription_count_enabled": false,
  "creator_subscriptions_tweet_preview_api_enabled": true,
  "freedom_of_speech_not_reach_fetch_enabled": true,
  "graphql_is_translatable_rweb_tweet_is_translatable_enabled": true,
  "grok_android_analyze_trend_fetch_enabled": false,
  "grok_translations_community_note_auto_translation_is_enabled": false,
  "grok_translations_community_note_translation_is_enabled": false,
  "grok_translations_post_auto_translation_is_enabled": false,
  "grok_translations_timeline_user_bio_auto_translation_is_enabled": false,
  "hidden_profile_likes_enabled": false,
  "highlights_tweets_tab_ui_enabled": false,
  "immersive_video_status_linkable_timestamps": false,
  "interactive_text_enabled": false,
  "longform_notetweets_consumption_enabled": true,
  "longform_notetweets_inline_media_enabled": true,
  "longform_notetweets_richtext_consumption_enabled": true,
  "longform_notetweets_rich_text_read_enabled": true,
  "mobile_app_spotlight_module_enabled": false,
  "payments_enabled": false,
  "post_ctas_fetch_enabled": true,
  "premium_content_api_read_enabled": false,
  "profile_label_improvements_pcf_label_in_post_enabled": true,
  "profile_label_improvements_pcf_label_in_profile_enabled": false,
  "responsive_web_edit_tweet_api_enabled": true,
  "responsive_web_enhance_cards_enabled": false,
  "responsive_web_graphql_exclude_directive_enabled": true,
  "responsive_web_graphql_skip_user_profile_image_extensions_enabled": false,
  "responsive_web_graphql_timeline_navigation_enabled": true,
  "responsive_web_grok_analysis_button_from_backend": true,
  "responsive_web_grok_analyze_button_fetch_trends_enabled": false,
  "responsive_web_grok_analyze_post_followups_enabled": true,
  "responsive_web_grok_annotations_enabled": true,
  "responsive_web_grok_community_note_auto_translation_is_enabled": false,
  "responsive_web_grok_image_annotation_enabled": true,
  "responsive_web_grok_imagine_annotation_enabled": true,
  "responsive_web_grok_share_attachment_enabled": true,
  "responsive_web_grok_show_grok_translated_post": false,
  "responsive_web_jetfuel_frame": true,
  "responsive_web_media_download_video_enabled": false,
  "responsive_web_profile_redirect_enabled": false,
  "responsive_web_text_conversations_enabled": false,
  "responsive_web_twitter_article_notes_tab_enabled": false,
  "responsive_web_twitter_article_tweet_consumption_enabled": true,
  "responsive_web_twitter_blue_verified_badge_is_enabled": true,
  "rweb_lists_timeline_redesign_enabled": true,
  "rweb_tipjar_consumption_enabled": true,
  "rweb_video_screen_enabled": false,
  "rweb_video_timestamps_enabled": false,
  "spaces_2022_h2_clipping": true,
  "spaces_2022_h2_spaces_communities": true,
"standardized_nudges_misinfo": false, "standardized_nudges_misinfo": true,
"subscriptions_feature_can_gift_premium": false,
"subscriptions_verification_info_enabled": true,
"subscriptions_verification_info_is_identity_verified_enabled": false,
"subscriptions_verification_info_reason_enabled": true,
"subscriptions_verification_info_verified_since_enabled": true,
"super_follow_badge_privacy_enabled": false,
"super_follow_exclusive_tweet_notifications_enabled": false,
"super_follow_tweet_api_enabled": false,
"super_follow_user_api_enabled": false,
"tweet_awards_web_tipping_enabled": false, "tweet_awards_web_tipping_enabled": false,
"tweet_with_visibility_results_prefer_gql_limited_actions_policy_enabled": false, "tweet_with_visibility_results_prefer_gql_limited_actions_policy_enabled": true,
"tweetypie_unmention_optimization_enabled": false, "tweetypie_unmention_optimization_enabled": false,
"unified_cards_ad_metadata_container_dynamic_card_content_query_enabled": false,
"unified_cards_destination_url_params_enabled": false,
"verified_phone_label_enabled": false, "verified_phone_label_enabled": false,
"vibe_api_enabled": false, "vibe_api_enabled": false,
"view_counts_everywhere_api_enabled": false "view_counts_everywhere_api_enabled": true,
"hidden_profile_subscriptions_enabled": false
}""".replace(" ", "").replace("\n", "") }""".replace(" ", "").replace("\n", "")
tweetVariables* = """{ tweetVars* = """{
"postId": "$1",
$2
"includeHasBirdwatchNotes": false,
"includePromotedContent": false,
"withBirdwatchNotes": true,
"withVoice": false,
"withV2Timeline": true
}""".replace(" ", "").replace("\n", "")
tweetDetailVars* = """{
"focalTweetId": "$1", "focalTweetId": "$1",
$2 $2
"withBirdwatchNotes": false, "referrer": "profile",
"includePromotedContent": false, "with_rux_injections": false,
"withDownvotePerspective": false, "rankingMode": "Relevance",
"withReactionsMetadata": false, "includePromotedContent": true,
"withReactionsPerspective": false, "withCommunity": true,
"withVoice": false "withQuickPromoteEligibilityTweetFields": true,
}""" "withBirdwatchNotes": true,
"withVoice": true
}""".replace(" ", "").replace("\n", "")
tweetResultVariables* = """{ tweetEditHistoryVars* = """{
"tweetId": "$1", "tweetId": "$1",
"includePromotedContent": false, "withQuickPromoteEligibilityTweetFields": true
"withDownvotePerspective": false, }""".replace(" ", "").replace("\n", "")
"withReactionsMetadata": false,
"withReactionsPerspective": false, restIdVars* = """{
"withVoice": false, "rest_id": "$1", $2
"withCommunity": false "count": 20
}""" }"""
userTweetsVariables* = """{ userMediaVars* = """{
"userId": "$1", $2 "userId": "$1", $2
"count": 20, "count": 20,
"includePromotedContent": false, "includePromotedContent": false,
"withDownvotePerspective": false, "withClientEventToken": false,
"withReactionsMetadata": false, "withBirdwatchNotes": false,
"withReactionsPerspective": false, "withVoice": true
"withVoice": false, }""".replace(" ", "").replace("\n", "")
"withV2Timeline": true
}"""
listTweetsVariables* = """{ userTweetsVars* = """{
"listId": "$1", $2 "userId": "$1", $2
"count": 20, "count": 20,
"includePromotedContent": false, "includePromotedContent": false,
"withDownvotePerspective": false, "withQuickPromoteEligibilityTweetFields": true,
"withReactionsMetadata": false, "withVoice": true
"withReactionsPerspective": false, }""".replace(" ", "").replace("\n", "")
"withVoice": false
}""" userTweetsAndRepliesVars* = """{
"userId": "$1", $2
"count": 20,
"includePromotedContent": false,
"withCommunity": true,
"withVoice": true
}""".replace(" ", "").replace("\n", "")
userFieldToggles = """{"withPayments":false,"withAuxiliaryUserLabels":true}"""
userTweetsFieldToggles* = """{"withArticlePlainText":false}"""
tweetDetailFieldToggles* = """{"withArticleRichContentState":true,"withArticlePlainText":false,"withGrokAnalyze":false,"withDisallowedReplyControls":false}"""
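These consts follow one pattern: a readable JSON template is minified with chained `replace` calls, then positional placeholders (`$1` for the ID, `$2` for an optional cursor fragment) are filled in per request. A Python sketch of the same minify-and-substitute step (the `build_tweet_vars` helper and its cursor format are illustrative, not Nitter's actual API layer):

```python
import json

# Readable template, minified exactly like the Nim consts:
# strip spaces, then strip newlines.
TWEET_VARS_TEMPLATE = """{
"postId": "$1",
$2
"includeHasBirdwatchNotes": false,
"includePromotedContent": false,
"withBirdwatchNotes": true,
"withVoice": false,
"withV2Timeline": true
}""".replace(" ", "").replace("\n", "")

def build_tweet_vars(tweet_id: str, cursor: str = "") -> str:
    # $2 is either empty or a '"cursor":"...",' fragment (assumed shape)
    cursor_field = f'"cursor":"{cursor}",' if cursor else ""
    return TWEET_VARS_TEMPLATE.replace("$1", tweet_id).replace("$2", cursor_field)

variables = build_tweet_vars("1234567890")
parsed = json.loads(variables)  # the minified, substituted string is valid JSON
```

Minifying once at module load keeps the per-request work down to two string substitutions while the source stays diff-friendly.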
+45 -9
View File
@@ -1,17 +1,53 @@
import options, strutils
import jsony
import user, utils, ../types/[graphuser, graphlistmembers]
from ../../types import User, VerifiedType, Result, Query, QueryKind

proc parseUserResult*(userResult: UserResult): User =
  result = userResult.legacy
  if result.verifiedType == none and userResult.isBlueVerified:
    result.verifiedType = blue

  if result.username.len == 0 and userResult.core.screenName.len > 0:
    result.id = userResult.restId
    result.username = userResult.core.screenName
    result.fullname = userResult.core.name
    result.userPic = userResult.avatar.imageUrl.replace("_normal", "")

    if userResult.privacy.isSome:
      result.protected = userResult.privacy.get.protected
    if userResult.location.isSome:
      result.location = userResult.location.get.location
    if userResult.core.createdAt.len > 0:
      result.joinDate = parseTwitterDate(userResult.core.createdAt)
    if userResult.verification.isSome:
      let v = userResult.verification.get
      if v.verifiedType != VerifiedType.none:
        result.verifiedType = v.verifiedType
    if userResult.profileBio.isSome and result.bio.len == 0:
      result.bio = userResult.profileBio.get.description

proc parseGraphUser*(json: string): User =
  if json.len == 0 or json[0] != '{':
    return

  let
    raw = json.fromJson(GraphUser)
    userResult =
      if raw.data.userResult.isSome: raw.data.userResult.get.result
      elif raw.data.user.isSome: raw.data.user.get.result
      else: UserResult()

  if userResult.unavailableReason.get("") == "Suspended" or
     userResult.reason.get("") == "Suspended":
    return User(suspended: true)

  result = parseUserResult(userResult)

proc parseGraphListMembers*(json, cursor: string): Result[User] =
  result = Result[User](

@@ -27,7 +63,7 @@ proc parseGraphListMembers*(json, cursor: string): Result[User] =
    of TimelineTimelineItem:
      let userResult = entry.content.itemContent.userResults.result
      if userResult.restId.len > 0:
        result.content.add parseUserResult(userResult)
    of TimelineTimelineCursor:
      if entry.content.cursorType == "Bottom":
        result.bottom = entry.content.value
+30
View File
@@ -0,0 +1,30 @@
import std/strutils
import jsony
import ../types/session
from ../../types import Session, SessionKind
proc parseSession*(raw: string): Session =
  let session = raw.fromJson(RawSession)
  let kind = if session.kind == "": "oauth" else: session.kind

  case kind
  of "oauth":
    let id = session.oauthToken[0 ..< session.oauthToken.find('-')]
    result = Session(
      kind: SessionKind.oauth,
      id: parseBiggestInt(id),
      username: session.username,
      oauthToken: session.oauthToken,
      oauthSecret: session.oauthTokenSecret
    )
  of "cookie":
    let id = if session.id.len > 0: parseBiggestInt(session.id) else: 0
    result = Session(
      kind: SessionKind.cookie,
      id: id,
      username: session.username,
      authToken: session.authToken,
      ct0: session.ct0
    )
  else:
    raise newException(ValueError, "Unknown session kind: " & kind)
+1 -1
View File
@@ -54,7 +54,7 @@ proc replacedWith*(runes: seq[Rune]; repls: openArray[ReplaceSlice];
    let
      name = $runes[rep.slice.a.succ .. rep.slice.b]
      symbol = $runes[rep.slice.a]
    result.add a(symbol & name, href = "/search?f=tweets&q=%23" & name)
  of rkMention:
    result.add a($runes[rep.slice], href = rep.url, title = rep.display)
  of rkUrl:
+8
View File
@@ -0,0 +1,8 @@
import jsony
import ../types/tid
export TidPair
proc parseTidPairs*(raw: string): seq[TidPair] =
  result = raw.fromJson(seq[TidPair])
  if result.len == 0:
    raise newException(ValueError, "Parsing pairs failed: " & raw)
+26 -1
View File
@@ -1,6 +1,7 @@
import std/[options, tables, strutils, strformat, sugar]
import jsony
import user, ../types/unifiedcard
import ../../formatters
from ../../types import Card, CardKind, Video
from ../../utils import twimg, https

@@ -27,6 +28,14 @@ proc parseMediaDetails(data: ComponentData; card: UnifiedCard; result: var Card)
    result.text = data.topicDetail.title
    result.dest = "Topic"

proc parseJobDetails(data: ComponentData; card: UnifiedCard; result: var Card) =
  data.destination.parseDestination(card, result)
  result.kind = CardKind.jobDetails
  result.title = data.title
  result.text = data.shortDescriptionText
  result.dest = &"@{data.profileUser.username} · {data.location}"

proc parseAppDetails(data: ComponentData; card: UnifiedCard; result: var Card) =
  let app = card.appStoreData[data.appId][0]

@@ -69,6 +78,18 @@ proc parseMedia(component: Component; card: UnifiedCard; result: var Card) =
  of model3d:
    result.title = "Unsupported 3D model ad"

proc parseGrokShare(data: ComponentData; card: UnifiedCard; result: var Card) =
  result.kind = summaryLarge
  data.destination.parseDestination(card, result)
  result.dest = "Answer by Grok"

  for msg in data.conversationPreview:
    if msg.sender == "USER":
      result.title = msg.message.shorten(70)
    elif msg.sender == "AGENT":
      result.text = msg.message.shorten(500)

proc parseUnifiedCard*(json: string): Card =
  let card = json.fromJson(UnifiedCard)

@@ -84,6 +105,10 @@ proc parseUnifiedCard*(json: string): Card =
      component.parseMedia(card, result)
    of buttonGroup:
      discard
    of grokShare:
      component.data.parseGrokShare(card, result)
    of ComponentType.jobDetails:
      component.data.parseJobDetails(card, result)
    of ComponentType.hidden:
      result.kind = CardKind.hidden
    of ComponentType.unknown:
+9 -20
View File
@@ -9,7 +9,7 @@ let
unReplace = "$1<a href=\"/$2\">@$2</a>" unReplace = "$1<a href=\"/$2\">@$2</a>"
htRegex = nre.re"""(*U)(^|[^\w-_.?])([#$])([\w_]*+)(?!</a>|">|#)""" htRegex = nre.re"""(*U)(^|[^\w-_.?])([#$])([\w_]*+)(?!</a>|">|#)"""
htReplace = "$1<a href=\"/search?q=%23$3\">$2$3</a>" htReplace = "$1<a href=\"/search?f=tweets&q=%23$3\">$2$3</a>"
proc expandUserEntities(user: var User; raw: RawUser) = proc expandUserEntities(user: var User; raw: RawUser) =
let let
@@ -56,32 +56,21 @@ proc toUser*(raw: RawUser): User =
tweets: raw.statusesCount, tweets: raw.statusesCount,
likes: raw.favouritesCount, likes: raw.favouritesCount,
media: raw.mediaCount, media: raw.mediaCount,
verified: raw.verified, verifiedType: raw.verifiedType,
protected: raw.protected, protected: raw.protected,
joinDate: parseTwitterDate(raw.createdAt),
banner: getBanner(raw), banner: getBanner(raw),
userPic: getImageUrl(raw.profileImageUrlHttps).replace("_normal", "") userPic: getImageUrl(raw.profileImageUrlHttps).replace("_normal", "")
) )
if raw.createdAt.len > 0:
result.joinDate = parseTwitterDate(raw.createdAt)
if raw.pinnedTweetIdsStr.len > 0: if raw.pinnedTweetIdsStr.len > 0:
result.pinnedTweet = parseBiggestInt(raw.pinnedTweetIdsStr[0]) result.pinnedTweet = parseBiggestInt(raw.pinnedTweetIdsStr[0])
result.expandUserEntities(raw) result.expandUserEntities(raw)
proc parseUser*(json: string; username=""): User = proc parseHook*(s: string; i: var int; v: var User) =
handleErrors: var u: RawUser
case error.code parseHook(s, i, u)
of suspended: return User(username: username, suspended: true) v = toUser u
of userNotFound: return
else: echo "[error - parseUser]: ", error
result = toUser json.fromJson(RawUser)
proc parseUsers*(json: string; after=""): Result[User] =
result = Result[User](beginning: after.len == 0)
# starting with '{' means it's an error
if json[0] == '[':
let raw = json.fromJson(seq[RawUser])
for user in raw:
result.content.add user.toUser
+38 -5
View File
@@ -1,15 +1,48 @@
import options, strutils
from ../../types import User, VerifiedType

type
  GraphUser* = object
    data*: tuple[userResult: Option[UserData], user: Option[UserData]]

  UserData* = object
    result*: UserResult

  UserCore* = object
    name*: string
    screenName*: string
    createdAt*: string

  UserBio* = object
    description*: string

  UserAvatar* = object
    imageUrl*: string

  Verification* = object
    verifiedType*: VerifiedType

  Location* = object
    location*: string

  Privacy* = object
    protected*: bool

  UserResult* = object
    legacy*: User
    restId*: string
    isBlueVerified*: bool
    core*: UserCore
    avatar*: UserAvatar
    unavailableReason*: Option[string]
    reason*: Option[string]
    privacy*: Option[Privacy]
    profileBio*: Option[UserBio]
    verification*: Option[Verification]
    location*: Option[Location]

proc enumHook*(s: string; v: var VerifiedType) =
  v = try:
    parseEnum[VerifiedType](s)
  except:
    VerifiedType.none
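The `enumHook` above makes enum parsing tolerant: an unrecognized string degrades to `VerifiedType.none` instead of raising, so new API values can't break deserialization. The same pattern in Python (member names follow Nitter's `VerifiedType`; the string values Twitter sends are assumed):

```python
from enum import Enum

class VerifiedType(Enum):
    none = "None"
    blue = "Blue"
    business = "Business"
    government = "Government"

def parse_verified_type(s: str) -> VerifiedType:
    # Unknown strings fall back to `none` rather than raising,
    # mirroring the Nim enumHook.
    try:
        return VerifiedType(s)
    except ValueError:
        return VerifiedType.none
```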
+9
View File
@@ -0,0 +1,9 @@
type
  RawSession* = object
    kind*: string
    id*: string
    username*: string
    oauthToken*: string
    oauthTokenSecret*: string
    authToken*: string
    ct0*: string
+4
View File
@@ -0,0 +1,4 @@
type
  TidPair* = object
    animationKey*: string
    verification*: string
-23
View File
@@ -1,23 +0,0 @@
import std/tables
import user

type
  Search* = object
    globalObjects*: GlobalObjects
    timeline*: Timeline

  GlobalObjects = object
    users*: Table[string, RawUser]

  Timeline = object
    instructions*: seq[Instructions]

  Instructions = object
    addEntries*: tuple[entries: seq[Entry]]

  Entry = object
    entryId*: string
    content*: tuple[operation: Operation]

  Operation = object
    cursor*: tuple[value, cursorType: string]
+32 -5
View File
@@ -1,7 +1,10 @@
import std/[options, tables, times]
import jsony
from ../../types import VideoType, VideoVariant, User

type
  Text* = distinct string

  UnifiedCard* = object
    componentObjects*: Table[string, Component]
    destinationObjects*: Table[string, Destination]

@@ -13,11 +16,13 @@ type
    media
    swipeableMedia
    buttonGroup
    jobDetails
    appStoreDetails
    twitterListDetails
    communityDetails
    mediaWithDetailsHorizontal
    hidden
    grokShare
    unknown

  Component* = object

@@ -29,12 +34,16 @@ type
    appId*: string
    mediaId*: string
    destination*: string
    location*: string
    title*: Text
    subtitle*: Text
    name*: Text
    memberCount*: int
    mediaList*: seq[MediaItem]
    topicDetail*: tuple[title: Text]
    profileUser*: User
    shortDescriptionText*: string
    conversationPreview*: seq[GrokConversation]

  MediaItem* = object
    id*: string

@@ -69,12 +78,13 @@ type
    title*: Text
    category*: Text

  GrokConversation* = object
    message*: string
    sender*: string

  TypeField = Component | Destination | MediaEntity | AppStoreData

converter fromText*(text: Text): string = string(text)

proc renameHook*(v: var TypeField; fieldName: var string) =
  if fieldName == "type":

@@ -86,11 +96,13 @@ proc enumHook*(s: string; v: var ComponentType) =
  of "media": media
  of "swipeable_media": swipeableMedia
  of "button_group": buttonGroup
  of "job_details": jobDetails
  of "app_store_details": appStoreDetails
  of "twitter_list_details": twitterListDetails
  of "community_details": communityDetails
  of "media_with_details_horizontal": mediaWithDetailsHorizontal
  of "commerce_drop_details": hidden
  of "grok_share": grokShare
  else: echo "ERROR: Unknown enum value (ComponentType): ", s; unknown

proc enumHook*(s: string; v: var AppType) =

@@ -106,3 +118,18 @@ proc enumHook*(s: string; v: var MediaType) =
  of "photo": photo
  of "model3d": model3d
  else: echo "ERROR: Unknown enum value (MediaType): ", s; photo

proc parseHook*(s: string; i: var int; v: var DateTime) =
  var str: string
  parseHook(s, i, str)
  v = parse(str, "yyyy-MM-dd hh:mm:ss")

proc parseHook*(s: string; i: var int; v: var Text) =
  if s[i] == '"':
    var str: string
    parseHook(s, i, str)
    v = Text(str)
  else:
    var t: tuple[content: string]
    parseHook(s, i, t)
    v = Text(t.content)
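The custom `parseHook` for `Text` handles a quirk of the unified card payload: the same field arrives either as a bare string or as an object wrapping the string in `"content"`. A Python sketch of that polymorphic decoding (field names taken from the Nim hook above):

```python
import json

def parse_text(value) -> str:
    # Accept either a bare string or a {"content": "..."} wrapper,
    # like the Nim parseHook for the Text distinct type.
    if isinstance(value, str):
        return value
    return value.get("content", "")

# Example payload mixing both shapes (illustrative field names)
card = json.loads('{"title": {"content": "Job title"}, "name": "Acme"}')
```

Peeking at the first character (`s[i] == '"'` in the Nim version) is what lets a streaming parser pick the right branch without backtracking.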
+2 -1
View File
@@ -1,5 +1,6 @@
import options
import common
from ../../types import VerifiedType

type
  RawUser* = object

@@ -15,7 +16,7 @@ type
    favouritesCount*: int
    statusesCount*: int
    mediaCount*: int
    verifiedType*: VerifiedType
    protected*: bool
    profileLinkColor*: string
    profileBannerUrl*: string
+50 -19
View File
@@ -1,16 +1,18 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, strformat, times, uri, tables, xmltree, htmlparser, htmlgen, math
import std/[enumerate, re]
import types, utils, query

const
  cards = "cards.twitter.com/cards"
  tco = "https://t.co"
  twitter = parseUri("https://x.com")

let
  twRegex = re"(?<=(?<!\S)https:\/\/|(?<=\s))(www\.|mobile\.)?twitter\.com"
  twLinkRegex = re"""<a href="https:\/\/twitter.com([^"]+)">twitter\.com(\S+)</a>"""
  xRegex = re"(?<=(?<!\S)https:\/\/|(?<=\s))(www\.|mobile\.)?x\.com"
  xLinkRegex = re"""<a href="https:\/\/x.com([^"]+)">x\.com(\S+)</a>"""
  ytRegex = re(r"([A-z.]+\.)?youtu(be\.com|\.be)", {reStudy, reIgnoreCase})

@@ -31,10 +33,13 @@ proc getUrlPrefix*(cfg: Config): string =
  if cfg.useHttps: https & cfg.hostname
  else: "http://" & cfg.hostname

proc shorten*(text: string; length=28): string =
  result = text
  if result.len > length:
    result = result[0 ..< length] & "…"

proc shortLink*(text: string; length=28): string =
  result = text.replace(wwwRegex, "").shorten(length)

proc stripHtml*(text: string; shorten=false): string =
  var html = parseHtml(text)

@@ -54,19 +59,28 @@ proc replaceUrls*(body: string; prefs: Prefs; absolute=""): string =
  result = body

  if prefs.replaceYouTube.len > 0 and "youtu" in result:
    let youtubeHost = strip(prefs.replaceYouTube, chars={'/'})
    result = result.replace(ytRegex, youtubeHost)

  if prefs.replaceTwitter.len > 0:
    let twitterHost = strip(prefs.replaceTwitter, chars={'/'})
    if tco in result:
      result = result.replace(tco, https & twitterHost & "/t.co")
    if "x.com" in result:
      result = result.replace(xRegex, twitterHost)
      result = result.replacef(xLinkRegex, a(
        twitterHost & "$2", href = https & twitterHost & "$1"))
    if "twitter.com" in result:
      result = result.replace(cards, twitterHost & "/cards")
      result = result.replace(twRegex, twitterHost)
      result = result.replacef(twLinkRegex, a(
        twitterHost & "$2", href = https & twitterHost & "$1"))

  if prefs.replaceReddit.len > 0 and ("reddit.com" in result or "redd.it" in result):
    let redditHost = strip(prefs.replaceReddit, chars={'/'})
    result = result.replace(rdShortRegex, redditHost & "/comments/")
    result = result.replace(rdRegex, redditHost)
    if redditHost in result and "/gallery/" in result:
      result = result.replace("/gallery/", "/comments/")

  if absolute.len > 0 and "href" in result:
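The reworked `replaceUrls` now strips trailing slashes from the configured replacement host and rewrites both `twitter.com` and the new `x.com` domain. A simplified Python sketch of that behaviour (the regexes here are looser than Nitter's lookbehind-guarded versions, purely for illustration):

```python
import re

def replace_twitter_host(body: str, replace_twitter: str) -> str:
    # Trailing slashes in the configured host are stripped, as in the
    # Nim code's strip(prefs.replaceTwitter, chars={'/'}).
    host = replace_twitter.strip("/")
    # t.co short links are routed through the instance's /t.co path
    body = body.replace("https://t.co", f"https://{host}/t.co")
    # Both the legacy and the new domain are rewritten
    body = re.sub(r"(www\.|mobile\.)?twitter\.com", host, body)
    body = re.sub(r"(www\.|mobile\.)?x\.com", host, body)
    return body
```

Rewriting `twitter.com` before `x.com` matters in the real code too: once the legacy domain is replaced, the shorter `x.com` pattern can't accidentally match inside already-rewritten links.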
@@ -82,6 +96,8 @@ proc proxifyVideo*(manifest: string; proxy: bool): string =
  for line in manifest.splitLines:
    let url =
      if line.startsWith("#EXT-X-MAP:URI"): line[16 .. ^2]
      elif line.startsWith("#EXT-X-MEDIA") and "URI=" in line:
        line[line.find("URI=") + 5 .. -1 + line.find("\"", start= 5 + line.find("URI="))]
      else: line
    if url.startsWith('/'):
      let path = "https://video.twimg.com" & url
@@ -138,13 +154,28 @@ proc getShortTime*(tweet: Tweet): string =
  else:
    result = "now"

proc getDuration*(video: Video): string =
  let
    ms = video.durationMs
    sec = int(round(ms / 1000))
    min = floorDiv(sec, 60)
    hour = floorDiv(min, 60)

  if hour > 0:
    return &"{hour}:{min mod 60}:{sec mod 60:02}"
  else:
    return &"{min mod 60}:{sec mod 60:02}"

proc getLink*(id: int64; username="i"; focus=true): string =
  var username = username
  if username.len == 0:
    username = "i"
  result = &"/{username}/status/{id}"
  if focus: result &= "#m"

proc getLink*(tweet: Tweet; focus=true): string =
  if tweet.id == 0: return
  var username = tweet.user.username
  getLink(tweet.id, username, focus)
proc getTwitterLink*(path: string; params: Table[string, string]): string =
  var

@@ -172,7 +203,7 @@ proc getTwitterLink*(path: string; params: Table[string, string]): string =
proc getLocation*(u: User | Tweet): (string, string) =
  if "://" in u.location: return (u.location, "")
  let loc = u.location.split(":")
  let url = if loc.len > 1: "/search?f=tweets&q=place:" & loc[1] else: ""
  (loc[0], url)

proc getSuspended*(username: string): string =
+2 -5
View File
@@ -39,11 +39,8 @@ template use*(pool: HttpPool; heads: HttpHeaders; body: untyped): untyped =
  try:
    body
  except BadClientError, ProtocolError:
    # Twitter returned 503 or closed the connection, we need a new client
    pool.release(c, true)
    badClient = false
    c = pool.acquire(heads)
+25 -9
View File
@@ -6,7 +6,7 @@ from os import getEnv
import jester

import types, config, prefs, formatters, redis_cache, http_pool, auth, apiutils
import views/[general, about]
import routes/[
  preferences, timeline, status, media, search, rss, list, debug,

@@ -15,8 +15,13 @@ import routes/[
const instancesUrl = "https://github.com/zedeus/nitter/wiki/Instances"
const issuesUrl = "https://github.com/zedeus/nitter/issues"

let
  configPath = getEnv("NITTER_CONF_FILE", "./nitter.conf")
  (cfg, fullCfg) = getConfig(configPath)
  sessionsPath = getEnv("NITTER_SESSIONS_FILE", "./sessions.jsonl")

initSessionPool(cfg, sessionsPath)

if not cfg.enableDebug:
  # Silence Jester's query warning

@@ -32,14 +37,15 @@ setHmacKey(cfg.hmacKey)
setProxyEncoding(cfg.base64Media)
setMaxHttpConns(cfg.httpMaxConns)
setHttpProxy(cfg.proxy, cfg.proxyAuth)
setApiProxy(cfg.apiProxy)
setDisableTid(cfg.disableTid)
setMaxConcurrentReqs(cfg.maxConcurrentReqs)
initAboutPage(cfg.staticDir)

waitFor initRedisPool(cfg)
stdout.write &"Connected to Redis at {cfg.redisHost}:{cfg.redisPort}\n"
stdout.flushFile

createUnsupportedRouter(cfg)
createResolverRouter(cfg)
createPrefRouter(cfg)

@@ -59,11 +65,16 @@ settings:
  reusePort = true

routes:
  before:
    # skip all file URLs
    cond "." notin request.path
    applyUrlPrefs()

  get "/":
    resp renderMain(renderSearch(), request, cfg, requestPrefs())

  get "/about":
    resp renderMain(renderAbout(), request, cfg, requestPrefs())

  get "/explore":
    redirect("/about")

@@ -74,7 +85,7 @@ routes:
  get "/i/redirect":
    let url = decodeUrl(@"url")
    if url.len == 0: resp Http404
    redirect(replaceUrls(url, requestPrefs()))

  error Http404:
    resp Http404, showError("Page not found", cfg)

@@ -87,13 +98,18 @@ routes:
  error BadClientError:
    echo error.exc.name, ": ", error.exc.msg
    resp Http500, showError("Network error occurred, please try again.", cfg)

  error RateLimitError:
    const link = a("another instance", href = instancesUrl)
    resp Http429, showError(
      &"Instance has been rate limited.<br>Use {link} or try again later.", cfg)

  error NoSessionsError:
    const link = a("another instance", href = instancesUrl)
    resp Http429, showError(
      &"Instance has no auth tokens, or is fully rate limited.<br>Use {link} or try again later.", cfg)

extend rss, ""
extend status, ""
extend search, ""
+403 -242
View File
@@ -1,11 +1,17 @@
# SPDX-License-Identifier: AGPL-3.0-only # SPDX-License-Identifier: AGPL-3.0-only
import strutils, options, tables, times, math import strutils, options, times, math, tables
import packedjson, packedjson/deserialiser import packedjson, packedjson/deserialiser
import types, parserutils, utils import types, parserutils, utils
import experimental/parser/unifiedcard import experimental/parser/unifiedcard
proc parseGraphTweet(js: JsonNode): Tweet proc parseGraphTweet(js: JsonNode): Tweet
proc parseCommunityNote(js: JsonNode): string =
let subtitle = js{"subtitle"}
result = subtitle{"text"}.getStr
with entities, subtitle{"entities"}:
result = expandBirdwatchEntities(result, entities)
proc parseUser(js: JsonNode; id=""): User =
  if js.isNull: return
  result = User(
@@ -21,19 +27,45 @@ proc parseUser(js: JsonNode; id=""): User =
    tweets: js{"statuses_count"}.getInt,
    likes: js{"favourites_count"}.getInt,
    media: js{"media_count"}.getInt,
-    verified: js{"verified"}.getBool or js{"ext_is_blue_verified"}.getBool,
-    protected: js{"protected"}.getBool,
+    protected: js{"protected"}.getBool(js{"privacy", "protected"}.getBool),
    joinDate: js{"created_at"}.getTime
  )

+  if js{"is_blue_verified"}.getBool(false):
+    result.verifiedType = blue
+  with verifiedType, js{"verified_type"}:
+    result.verifiedType = parseEnum[VerifiedType](verifiedType.getStr)
+
  result.expandUserEntities(js)
proc parseGraphUser(js: JsonNode): User =
-  let user = ? js{"user_results", "result"}
-  result = parseUser(user{"legacy"})
-
-  if "is_blue_verified" in user:
-    result.verified = true
+  var user = js{"user_result", "result"}
+  if user.isNull:
+    user = ? js{"user_results", "result"}
+  if user.isNull:
+    if js{"core"}.notNull and js{"legacy"}.notNull:
+      user = js
+    else:
+      return
+
+  result = parseUser(user{"legacy"}, user{"rest_id"}.getStr)
+
+  if result.verifiedType == none and user{"is_blue_verified"}.getBool(false):
+    result.verifiedType = blue
+
+  # fallback to support UserMedia/recent GraphQL updates
+  if result.username.len == 0:
+    result.username = user{"core", "screen_name"}.getStr
+    result.fullname = user{"core", "name"}.getStr
+    result.userPic = user{"avatar", "image_url"}.getImageStr.replace("_normal", "")
+    if user{"is_blue_verified"}.getBool(false):
+      result.verifiedType = blue
+    with verifiedType, user{"verification", "verified_type"}:
+      result.verifiedType = parseEnum[VerifiedType](verifiedType.getStr)
proc parseGraphList*(js: JsonNode): List =
  if js.isNull: return
@@ -72,42 +104,102 @@ proc parsePoll(js: JsonNode): Poll =
  result.leader = result.values.find(max(result.values))
  result.votes = result.values.sum
-proc parseGif(js: JsonNode): Gif =
-  result = Gif(
-    url: js{"video_info", "variants"}[0]{"url"}.getImageStr,
-    thumb: js{"media_url_https"}.getImageStr
-  )
+proc parseVideoVariants(variants: JsonNode): seq[VideoVariant] =
+  result = @[]
+  for v in variants:
+    let
+      url = v{"url"}.getStr
+      contentType = parseEnum[VideoType](v{"content_type"}.getStr("video/mp4"))
+      bitrate = v{"bit_rate"}.getInt(v{"bitrate"}.getInt(0))
+    result.add VideoVariant(
+      contentType: contentType,
+      bitrate: bitrate,
+      url: url,
+      resolution: if contentType == mp4: getMp4Resolution(url) else: 0
+    )
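For illustration, `parseVideoVariants` keeps every variant with its content type and bitrate; a consumer typically picks the highest-bitrate mp4. A minimal Python sketch of that selection (field names mirror the Nim `VideoVariant`, sample data is hypothetical):

```python
# Sketch: pick the best mp4 variant by bitrate, as a consumer of
# parseVideoVariants-style data might. Sample variants are made up.
variants = [
    {"content_type": "application/x-mpegURL", "bitrate": 0, "url": "playlist.m3u8"},
    {"content_type": "video/mp4", "bitrate": 832000, "url": "480p.mp4"},
    {"content_type": "video/mp4", "bitrate": 2176000, "url": "720p.mp4"},
]

# Filter to mp4 variants, then take the one with the highest bitrate.
mp4s = [v for v in variants if v["content_type"] == "video/mp4"]
best = max(mp4s, key=lambda v: v["bitrate"])
```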
proc parseVideo(js: JsonNode): Video =
  result = Video(
    thumb: js{"media_url_https"}.getImageStr,
-    views: js{"ext", "mediaStats", "r", "ok", "viewCount"}.getStr($js{"mediaStats", "viewCount"}.getInt),
-    available: js{"ext_media_availability", "status"}.getStr.toLowerAscii == "available",
+    available: true,
    title: js{"ext_alt_text"}.getStr,
-    durationMs: js{"video_info", "duration_millis"}.getInt,
-    playbackType: m3u8
+    durationMs: js{"video_info", "duration_millis"}.getInt
+    # playbackType: mp4
  )

+  with status, js{"ext_media_availability", "status"}:
+    if status.getStr.len > 0 and status.getStr.toLowerAscii != "available":
+      result.available = false
+
  with title, js{"additional_media_info", "title"}:
    result.title = title.getStr

  with description, js{"additional_media_info", "description"}:
    result.description = description.getStr

-  for v in js{"video_info", "variants"}:
-    let
-      contentType = parseEnum[VideoType](v{"content_type"}.getStr("summary"))
-      url = v{"url"}.getStr
-
-    if contentType == mp4:
-      result.playbackType = mp4
-
-    result.variants.add VideoVariant(
-      contentType: contentType,
-      bitrate: v{"bitrate"}.getInt,
-      url: url,
-      resolution: if contentType == mp4: getMp4Resolution(url) else: 0
-    )
+  result.variants = parseVideoVariants(js{"video_info", "variants"})
+
+proc parseLegacyMediaEntities(js: JsonNode; result: var Tweet) =
+  with jsMedia, js{"extended_entities", "media"}:
+    for m in jsMedia:
+      case m.getTypeName:
+      of "photo":
+        result.photos.add Photo(
+          url: m{"media_url_https"}.getImageStr,
+          altText: m{"ext_alt_text"}.getStr
+        )
+      of "video":
+        result.video = some(parseVideo(m))
+        with user, m{"additional_media_info", "source_user"}:
+          if user{"id"}.getInt > 0:
+            result.attribution = some(parseUser(user))
+          else:
+            result.attribution = some(parseGraphUser(user))
+      of "animated_gif":
+        result.gif = some Gif(
+          url: m{"video_info", "variants"}[0]{"url"}.getImageStr,
+          thumb: m{"media_url_https"}.getImageStr
+        )
+      else: discard
+
+      with url, m{"url"}:
+        if result.text.endsWith(url.getStr):
+          result.text.removeSuffix(url.getStr)
+          result.text = result.text.strip()
+
+proc parseMediaEntities(js: JsonNode; result: var Tweet) =
+  with mediaEntities, js{"media_entities"}:
+    for mediaEntity in mediaEntities:
+      with mediaInfo, mediaEntity{"media_results", "result", "media_info"}:
+        case mediaInfo.getTypeName
+        of "ApiImage":
+          result.photos.add Photo(
+            url: mediaInfo{"original_img_url"}.getImageStr,
+            altText: mediaInfo{"alt_text"}.getStr
+          )
+        of "ApiVideo":
+          let status = mediaEntity{"media_results", "result", "media_availability_v2", "status"}
+          result.video = some Video(
+            available: status.getStr == "Available",
+            thumb: mediaInfo{"preview_image", "original_img_url"}.getImageStr,
+            durationMs: mediaInfo{"duration_millis"}.getInt,
+            variants: parseVideoVariants(mediaInfo{"variants"})
+          )
+        of "ApiGif":
+          result.gif = some Gif(
+            url: mediaInfo{"variants"}[0]{"url"}.getImageStr,
+            thumb: mediaInfo{"preview_image", "original_img_url"}.getImageStr
+          )
+        else: discard
+
+  # Remove media URLs from text
+  with mediaList, js{"legacy", "entities", "media"}:
+    for url in mediaList:
+      let expandedUrl = url.getExpandedUrl
+      if result.text.endsWith(expandedUrl):
+        result.text.removeSuffix(expandedUrl)
+        result.text = result.text.strip()
proc parsePromoVideo(js: JsonNode): Video =
  result = Video(
@@ -187,7 +279,7 @@ proc parseCard(js: JsonNode; urls: JsonNode): Card =
  for u in ? urls:
    if u{"url"}.getStr == result.url:
-      result.url = u{"expanded_url"}.getStr
+      result.url = u.getExpandedUrl(result.url)
      break

  if kind in {videoDirectMessage, imageDirectMessage}:
@@ -197,14 +289,20 @@ proc parseCard(js: JsonNode; urls: JsonNode): Card =
     result.url.len == 0 or result.url.startsWith("card://"):
    result.url = getPicUrl(result.image)

-proc parseTweet(js: JsonNode; jsCard: JsonNode = newJNull()): Tweet =
+proc parseTweet(js: JsonNode; jsCard: JsonNode = newJNull();
+                replyId: int64 = 0): Tweet =
  if js.isNull: return
+  let time =
+    if js{"created_at"}.notNull: js{"created_at"}.getTime
+    else: js{"created_at_ms"}.getTimeFromMs
  result = Tweet(
    id: js{"id_str"}.getId,
    threadId: js{"conversation_id_str"}.getId,
    replyId: js{"in_reply_to_status_id_str"}.getId,
    text: js{"full_text"}.getStr,
-    time: js{"created_at"}.getTime,
+    time: time,
    hasThread: js{"self_thread"}.notNull,
    available: true,
    user: User(id: js{"user_id_str"}.getStr),
@@ -212,17 +310,20 @@ proc parseTweet(js: JsonNode; jsCard: JsonNode = newJNull()): Tweet =
      replies: js{"reply_count"}.getInt,
      retweets: js{"retweet_count"}.getInt,
      likes: js{"favorite_count"}.getInt,
-      quotes: js{"quote_count"}.getInt
+      views: js{"views_count"}.getInt
    )
  )

-  result.expandTweetEntities(js)
+  if result.replyId == 0:
+    result.replyId = replyId

  # fix for pinned threads
  if result.hasThread and result.threadId == 0:
    result.threadId = js{"self_thread", "id_str"}.getId

-  if js{"is_quote_status"}.getBool:
+  if "retweeted_status" in js:
+    result.retweet = some Tweet()
+  elif js{"is_quote_status"}.getBool:
    result.quote = some Tweet(id: js{"quoted_status_id_str"}.getId)

  # legacy
@@ -237,33 +338,28 @@ proc parseTweet(js: JsonNode; jsCard: JsonNode = newJNull()): Tweet =
      result.retweet = some parseGraphTweet(rt)
      return

+  with reposts, js{"repostedStatusResults"}:
+    with rt, reposts{"result"}:
+      if "legacy" in rt:
+        result.retweet = some parseGraphTweet(rt)
+        return
+
  if jsCard.kind != JNull:
    let name = jsCard{"name"}.getStr
    if "poll" in name:
      if "image" in name:
-        result.photos.add jsCard{"binding_values", "image_large"}.getImageVal
+        result.photos.add Photo(
+          url: jsCard{"binding_values", "image_large"}.getImageVal
+        )
      result.poll = some parsePoll(jsCard)
    elif name == "amplify":
-      result.video = some(parsePromoVideo(jsCard{"binding_values"}))
+      result.video = some parsePromoVideo(jsCard{"binding_values"})
    else:
      result.card = some parseCard(jsCard, js{"entities", "urls"})

-  with jsMedia, js{"extended_entities", "media"}:
-    for m in jsMedia:
-      case m{"type"}.getStr
-      of "photo":
-        result.photos.add m{"media_url_https"}.getImageStr
-      of "video":
-        result.video = some(parseVideo(m))
-        with user, m{"additional_media_info", "source_user"}:
-          if user{"id"}.getInt > 0:
-            result.attribution = some(parseUser(user))
-          else:
-            result.attribution = some(parseGraphUser(user))
-      of "animated_gif":
-        result.gif = some(parseGif(m))
-      else: discard
+  result.expandTweetEntities(js)
+  parseLegacyMediaEntities(js, result)

  with jsWithheld, js{"withheld_in_countries"}:
    let withheldInCountries: seq[string] =
@@ -279,242 +375,307 @@ proc parseTweet(js: JsonNode; jsCard: JsonNode = newJNull()): Tweet =
    result.text.removeSuffix(" Learn more.")
    result.available = false
proc finalizeTweet(global: GlobalObjects; id: string): Tweet =
let intId = if id.len > 0: parseBiggestInt(id) else: 0
result = global.tweets.getOrDefault(id, Tweet(id: intId))
if result.quote.isSome:
let quote = get(result.quote).id
if $quote in global.tweets:
result.quote = some global.tweets[$quote]
else:
result.quote = some Tweet()
if result.retweet.isSome:
let rt = get(result.retweet).id
if $rt in global.tweets:
result.retweet = some finalizeTweet(global, $rt)
else:
result.retweet = some Tweet()
proc parsePin(js: JsonNode; global: GlobalObjects): Tweet =
let pin = js{"pinEntry", "entry", "entryId"}.getStr
if pin.len == 0: return
let id = pin.getId
if id notin global.tweets: return
global.tweets[id].pinned = true
return finalizeTweet(global, id)
proc parseGlobalObjects(js: JsonNode): GlobalObjects =
result = GlobalObjects()
let
tweets = ? js{"globalObjects", "tweets"}
users = ? js{"globalObjects", "users"}
for k, v in users:
result.users[k] = parseUser(v, k)
for k, v in tweets:
var tweet = parseTweet(v, v{"card"})
if tweet.user.id in result.users:
tweet.user = result.users[tweet.user.id]
result.tweets[k] = tweet
proc parseInstructions[T](res: var Result[T]; global: GlobalObjects; js: JsonNode) =
if js.kind != JArray or js.len == 0:
return
for i in js:
when T is Tweet:
if res.beginning and i{"pinEntry"}.notNull:
with pin, parsePin(i, global):
res.content.add pin
with r, i{"replaceEntry", "entry"}:
if "top" in r{"entryId"}.getStr:
res.top = r.getCursor
elif "bottom" in r{"entryId"}.getStr:
res.bottom = r.getCursor
proc parseTimeline*(js: JsonNode; after=""): Timeline =
result = Timeline(beginning: after.len == 0)
let global = parseGlobalObjects(? js)
let instructions = ? js{"timeline", "instructions"}
if instructions.len == 0: return
result.parseInstructions(global, instructions)
var entries: JsonNode
for i in instructions:
if "addEntries" in i:
entries = i{"addEntries", "entries"}
for e in ? entries:
let entry = e{"entryId"}.getStr
if "tweet" in entry or entry.startsWith("sq-I-t") or "tombstone" in entry:
let tweet = finalizeTweet(global, e.getEntryId)
if not tweet.available: continue
result.content.add tweet
elif "cursor-top" in entry:
result.top = e.getCursor
elif "cursor-bottom" in entry:
result.bottom = e.getCursor
elif entry.startsWith("sq-cursor"):
with cursor, e{"content", "operation", "cursor"}:
if cursor{"cursorType"}.getStr == "Bottom":
result.bottom = cursor{"value"}.getStr
else:
result.top = cursor{"value"}.getStr
proc parsePhotoRail*(js: JsonNode): PhotoRail =
for tweet in js:
let
t = parseTweet(tweet, js{"card"})
url = if t.photos.len > 0: t.photos[0]
elif t.video.isSome: get(t.video).thumb
elif t.gif.isSome: get(t.gif).thumb
elif t.card.isSome: get(t.card).image
else: ""
if url.len == 0: continue
result.add GalleryPhoto(url: url, tweetId: $t.id)
proc parseGraphTweet(js: JsonNode): Tweet =
  if js.kind == JNull:
    return Tweet()

-  case js{"__typename"}.getStr
+  case js.getTypeName:
  of "TweetUnavailable":
    return Tweet()
  of "TweetTombstone":
-    return Tweet(text: js{"tombstone", "text"}.getTombstone)
+    with text, select(js{"tombstone", "richText"}, js{"tombstone", "text"}):
+      return Tweet(text: text.getTombstone)
+    return Tweet()
  of "TweetPreviewDisplay":
    return Tweet(text: "You're unable to view this Tweet because it's only available to the Subscribers of the account owner.")
  of "TweetWithVisibilityResults":
    return parseGraphTweet(js{"tweet"})
  else:
    discard

-  var jsCard = copy(js{"card", "legacy"})
+  if not js.hasKey("legacy"):
+    return Tweet()
+
+  var jsCard = select(js{"card"}, js{"tweet_card"}, js{"legacy", "tweet_card"})
  if jsCard.kind != JNull:
-    var values = newJObject()
-    for val in jsCard["binding_values"]:
-      values[val["key"].getStr] = val["value"]
-    jsCard["binding_values"] = values
+    let legacyCard = jsCard{"legacy"}
+    if legacyCard.kind != JNull:
+      let bindingArray = legacyCard{"binding_values"}
+      if bindingArray.kind == JArray:
+        var bindingObj: seq[(string, JsonNode)]
+        for item in bindingArray:
+          bindingObj.add((item{"key"}.getStr, item{"value"}))
+        # Create a new card object with flattened structure
+        jsCard = %*{
+          "name": legacyCard{"name"},
+          "url": legacyCard{"url"},
+          "binding_values": %bindingObj
+        }

-  result = parseTweet(js{"legacy"}, jsCard)
+  var replyId = 0
+  with restId, js{"reply_to_results", "rest_id"}:
+    replyId = restId.getId
+
+  result = parseTweet(js{"legacy"}, jsCard, replyId)
+  result.id = js{"rest_id"}.getId
  result.user = parseGraphUser(js{"core"})

+  if result.reply.len == 0:
+    with replyTo, js{"reply_to_user_results", "result", "core", "screen_name"}:
+      result.reply = @[replyTo.getStr]
+
+  with count, js{"views", "count"}:
+    result.stats.views = count.getStr("0").parseInt
+
  with noteTweet, js{"note_tweet", "note_tweet_results", "result"}:
    result.expandNoteTweetEntities(noteTweet)

-  if result.quote.isSome:
-    result.quote = some(parseGraphTweet(js{"quoted_status_result", "result"}))
+  parseMediaEntities(js, result)
+
+  with quoted, js{"quoted_status_result", "result"}:
+    result.quote = some(parseGraphTweet(quoted))
+
+  with quoted, js{"quotedPostResults"}:
+    if "result" in quoted:
+      result.quote = some(parseGraphTweet(quoted{"result"}))
+    else:
+      result.quote = some Tweet(id: js{"legacy", "quoted_status_id_str"}.getId)
+
+  with ids, js{"edit_control", "edit_control_initial", "edit_tweet_ids"}:
+    for id in ids:
+      result.history.add parseBiggestInt(id.getStr)
+
+  with birdwatch, js{"birdwatch_pivot"}:
+    result.note = parseCommunityNote(birdwatch)
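For context, the card handling in `parseGraphTweet` flattens the GraphQL `binding_values` array (a list of `{key, value}` pairs) into a plain mapping before reuse. A minimal Python sketch of the same transformation; the sample card data is hypothetical:

```python
# Sketch of the binding_values flattening done in parseGraphTweet.
# The GraphQL API returns card bindings as a list of {key, value} pairs;
# downstream code expects a plain mapping keyed by name.
def flatten_bindings(card: dict) -> dict:
    legacy = card.get("legacy") or {}
    bindings = legacy.get("binding_values")
    if not isinstance(bindings, list):
        return card  # already flattened or absent; pass through unchanged
    return {
        "name": legacy.get("name"),
        "url": legacy.get("url"),
        "binding_values": {b["key"]: b["value"] for b in bindings},
    }

card = {"legacy": {
    "name": "summary",
    "url": "card://123",
    "binding_values": [{"key": "title", "value": {"string_value": "Hello"}}],
}}
flat = flatten_bindings(card)
```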
proc parseGraphThread(js: JsonNode): tuple[thread: Chain; self: bool] =
-  let thread = js{"content", "items"}
-  for t in js{"content", "items"}:
-    let entryId = t{"entryId"}.getStr
-    if "cursor-showmore" in entryId:
-      let cursor = t{"item", "itemContent", "value"}
-      result.thread.cursor = cursor.getStr
-      result.thread.hasMore = true
-    elif "tweet" in entryId:
-      let tweet = parseGraphTweet(t{"item", "itemContent", "tweet_results", "result"})
-      result.thread.content.add tweet
-      if t{"item", "itemContent", "tweetDisplayType"}.getStr == "SelfThread":
-        result.self = true
+  for t in ? js{"content", "items"}:
+    let entryId = t.getEntryId
+    if "tweet-" in entryId and "promoted" notin entryId:
+      let tweet = t.getTweetResult("item")
+      if tweet.notNull:
+        result.thread.content.add parseGraphTweet(tweet)
+        let tweetDisplayType = select(
+          t{"item", "content", "tweet_display_type"},
+          t{"item", "itemContent", "tweetDisplayType"}
+        )
+        if tweetDisplayType.getStr == "SelfThread":
+          result.self = true
+      else:
+        result.thread.content.add Tweet(id: entryId.getId)
+    elif "cursor-showmore" in entryId:
+      let cursor = t{"item", "content", "value"}
+      result.thread.cursor = cursor.getStr
+      result.thread.hasMore = true
proc parseGraphTweetResult*(js: JsonNode): Tweet =
-  with tweet, js{"data", "tweetResult", "result"}:
+  with tweet, js{"data", "tweet_result", "result"}:
    result = parseGraphTweet(tweet)
proc parseGraphConversation*(js: JsonNode; tweetId: string): Conversation =
  result = Conversation(replies: Result[Chain](beginning: true))

-  let instructions = ? js{"data", "threaded_conversation_with_injections", "instructions"}
-  if instructions.len == 0:
-    return
-
-  for e in instructions[0]{"entries"}:
-    let entryId = e{"entryId"}.getStr
-    # echo entryId
-    if entryId.startsWith("tweet"):
-      with tweetResult, e{"content", "itemContent", "tweet_results", "result"}:
-        let tweet = parseGraphTweet(tweetResult)
-        if not tweet.available:
-          tweet.id = parseBiggestInt(entryId.getId())
-        if $tweet.id == tweetId:
-          result.tweet = tweet
-        else:
-          result.before.content.add tweet
-    elif entryId.startsWith("tombstone"):
-      let id = entryId.getId()
-      let tweet = Tweet(
-        id: parseBiggestInt(id),
-        available: false,
-        text: e{"content", "itemContent", "tombstoneInfo", "richText"}.getTombstone
-      )
-      if id == tweetId:
-        result.tweet = tweet
-      else:
-        result.before.content.add tweet
-    elif entryId.startsWith("conversationthread"):
-      let (thread, self) = parseGraphThread(e)
-      if self:
-        result.after = thread
-      else:
-        result.replies.content.add thread
-    elif entryId.startsWith("cursor-bottom"):
-      result.replies.bottom = e{"content", "itemContent", "value"}.getStr
-
-proc parseGraphTimeline*(js: JsonNode; root: string; after=""): Timeline =
-  result = Timeline(beginning: after.len == 0)
-
-  let instructions =
-    if root == "list": ? js{"data", "list", "tweets_timeline", "timeline", "instructions"}
-    else: ? js{"data", "user", "result", "timeline_v2", "timeline", "instructions"}
-
-  if instructions.len == 0:
-    return
-
-  for i in instructions:
-    if i{"type"}.getStr == "TimelineAddEntries":
-      for e in i{"entries"}:
-        let entryId = e{"entryId"}.getStr
-        if entryId.startsWith("tweet"):
-          with tweetResult, e{"content", "itemContent", "tweet_results", "result"}:
-            let tweet = parseGraphTweet(tweetResult)
-            if not tweet.available:
-              tweet.id = parseBiggestInt(entryId.getId())
-            result.content.add tweet
-        elif entryId.startsWith("cursor-bottom"):
-          result.bottom = e{"content", "value"}.getStr
+  let instructions = ? select(
+    js{"data", "timelineResponse", "instructions"},
+    js{"data", "timeline_response", "instructions"},
+    js{"data", "threaded_conversation_with_injections_v2", "instructions"}
+  )
+
+  if instructions.len == 0:
+    return
+
+  for i in instructions:
+    if i.getTypeName == "TimelineAddEntries":
+      for e in i{"entries"}:
+        let entryId = e.getEntryId
+        if entryId.startsWith("tweet-"):
+          let tweetResult = getTweetResult(e)
+          if tweetResult.notNull:
+            let tweet = parseGraphTweet(tweetResult)
+            if not tweet.available:
+              tweet.id = entryId.getId
+            if entryId.endsWith(tweetId):
+              result.tweet = tweet
+            else:
+              result.before.content.add tweet
+          elif not entryId.endsWith(tweetId):
+            result.before.content.add Tweet(id: entryId.getId)
+        elif entryId.startsWith("conversationthread"):
+          let (thread, self) = parseGraphThread(e)
+          if self:
+            result.after = thread
+          elif thread.content.len > 0:
+            result.replies.content.add thread
+        elif entryId.startsWith("tombstone"):
+          let
+            content = select(e{"content", "content"}, e{"content", "itemContent"})
+            tweet = Tweet(
+              id: entryId.getId,
+              available: false,
+              text: content{"tombstoneInfo", "richText"}.getTombstone
+            )
+          if $tweet.id == tweetId:
+            result.tweet = tweet
+          else:
+            result.before.content.add tweet
+        elif entryId.startsWith("cursor-bottom"):
+          var cursorValue = select(
+            e{"content", "value"},
+            e{"content", "content", "value"},
+            e{"content", "itemContent", "value"}
+          )
+          result.replies.bottom = cursorValue.getStr
-proc parseGraphSearch*(js: JsonNode; after=""): Timeline =
-  result = Timeline(beginning: after.len == 0)
-
-  let instructions = js{"data", "search_by_raw_query", "search_timeline", "timeline", "instructions"}
+proc parseGraphEditHistory*(js: JsonNode; tweetId: string): EditHistory =
+  let instructions = ? js{
+    "data", "tweet_result_by_rest_id", "result",
+    "edit_history_timeline", "timeline", "instructions"
+  }
+  if instructions.len == 0:
+    return
+
+  for i in instructions:
+    if i.getTypeName == "TimelineAddEntries":
+      for e in i{"entries"}:
+        let entryId = e.getEntryId
+        if entryId == "latestTweet":
+          with item, e{"content", "items"}[0]:
+            let tweetResult = item.getTweetResult("item")
+            if tweetResult.notNull:
+              result.latest = parseGraphTweet(tweetResult)
+        elif entryId == "staleTweets":
+          for item in e{"content", "items"}:
+            let tweetResult = item.getTweetResult("item")
+            if tweetResult.notNull:
+              result.history.add parseGraphTweet(tweetResult)
+
+proc extractTweetsFromEntry*(e: JsonNode): seq[Tweet] =
+  with tweetResult, getTweetResult(e):
+    var tweet = parseGraphTweet(tweetResult)
+    if not tweet.available:
+      tweet.id = e.getEntryId.getId
+    result.add tweet
+    return
+
+  for item in e{"content", "items"}:
+    with tweetResult, item.getTweetResult("item"):
+      var tweet = parseGraphTweet(tweetResult)
+      if not tweet.available:
+        tweet.id = item.getEntryId.getId
+      result.add tweet
proc parseGraphTimeline*(js: JsonNode; after=""): Profile =
result = Profile(tweets: Timeline(beginning: after.len == 0))
let instructions = ? select(
js{"data", "list", "timeline_response", "timeline", "instructions"},
js{"data", "user", "result", "timeline", "timeline", "instructions"},
js{"data", "user_result", "result", "timeline_response", "timeline", "instructions"}
)
if instructions.len == 0:
return
for i in instructions:
if i{"moduleItems"}.notNull:
for item in i{"moduleItems"}:
with tweetResult, item.getTweetResult("item"):
let tweet = parseGraphTweet(tweetResult)
if not tweet.available:
tweet.id = item.getEntryId.getId
result.tweets.content.add tweet
continue
if i{"entries"}.notNull:
for e in i{"entries"}:
let entryId = e.getEntryId
if entryId.startsWith("tweet") or entryId.startsWith("profile-grid"):
for tweet in extractTweetsFromEntry(e):
result.tweets.content.add tweet
elif "-conversation-" in entryId or entryId.startsWith("homeConversation"):
let (thread, self) = parseGraphThread(e)
result.tweets.content.add thread.content
elif entryId.startsWith("cursor-bottom"):
result.tweets.bottom = e{"content", "value"}.getStr
if after.len == 0:
if i.getTypeName == "TimelinePinEntry":
let tweets = extractTweetsFromEntry(i{"entry"})
if tweets.len > 0:
var tweet = tweets[0]
tweet.pinned = true
result.pinned = some tweet
proc parseGraphPhotoRail*(js: JsonNode): PhotoRail =
result = @[]
let instructions = select(
js{"data", "user", "result", "timeline", "timeline", "instructions"},
js{"data", "user_result", "result", "timeline_response", "timeline", "instructions"}
)
if instructions.len == 0:
return
for i in instructions:
if i{"moduleItems"}.notNull:
for item in i{"moduleItems"}:
with tweetResult, item.getTweetResult("item"):
let t = parseGraphTweet(tweetResult)
if not t.available:
t.id = item.getEntryId.getId
let photo = extractGalleryPhoto(t)
if photo.url.len > 0:
result.add photo
if result.len == 16:
return
continue
if i.getTypeName != "TimelineAddEntries":
continue
for e in i{"entries"}:
let entryId = e.getEntryId
if entryId.startsWith("tweet") or entryId.startsWith("profile-grid"):
for t in extractTweetsFromEntry(e):
let photo = extractGalleryPhoto(t)
if photo.url.len > 0:
result.add photo
if result.len == 16:
return
proc parseGraphSearch*[T: User | Tweets](js: JsonNode; after=""): Result[T] =
result = Result[T](beginning: after.len == 0)
let instructions = select(
js{"data", "search", "timeline_response", "timeline", "instructions"},
js{"data", "search_by_raw_query", "search_timeline", "timeline", "instructions"}
)
  if instructions.len == 0:
    return

  for instruction in instructions:
-    let typ = instruction{"type"}.getStr
+    let typ = getTypeName(instruction)
    if typ == "TimelineAddEntries":
-      for e in instructions[0]{"entries"}:
-        let entryId = e{"entryId"}.getStr
-        if entryId.startsWith("tweet"):
-          with tweetResult, e{"content", "itemContent", "tweet_results", "result"}:
-            let tweet = parseGraphTweet(tweetResult)
-            if not tweet.available:
-              tweet.id = parseBiggestInt(entryId.getId())
-            result.content.add tweet
-        elif entryId.startsWith("cursor-bottom"):
+      for e in instruction{"entries"}:
+        let entryId = e.getEntryId
+        when T is Tweets:
+          if entryId.startsWith("tweet"):
+            with tweetRes, getTweetResult(e):
+              let tweet = parseGraphTweet(tweetRes)
+              if not tweet.available:
+                tweet.id = entryId.getId
+              result.content.add tweet
+        elif T is User:
+          if entryId.startsWith("user"):
+            with userRes, e{"content", "itemContent"}:
+              result.content.add parseGraphUser(userRes)
+        if entryId.startsWith("cursor-bottom"):
          result.bottom = e{"content", "value"}.getStr
    elif typ == "TimelineReplaceEntry":
      if instruction{"entry_id_to_replace"}.getStr.startsWith("cursor-bottom"):
@@ -1,15 +1,23 @@
# SPDX-License-Identifier: AGPL-3.0-only
-import std/[strutils, times, macros, htmlgen, options, algorithm, re]
+import std/[times, macros, htmlgen, options, algorithm, re]
+import std/strutils except escape
import std/unicode except strip
+from xmltree import escape
import packedjson
import types, utils, formatters

+const
+  unicodeOpen = "\uFFFA"
+  unicodeClose = "\uFFFB"
+  xmlOpen = escape("<")
+  xmlClose = escape(">")
+
let
  unRegex = re"(^|[^A-z0-9-_./?])@([A-z0-9_]{1,15})"
  unReplace = "$1<a href=\"/$2\">@$2</a>"

  htRegex = re"(^|[^\w-_./?])([#$]|)([\w_]+)"
-  htReplace = "$1<a href=\"/search?q=%23$3\">$2$3</a>"
+  htReplace = "$1<a href=\"/search?f=tweets&q=%23$3\">$2$3</a>"

type
  ReplaceSliceKind = enum
@@ -28,6 +36,12 @@ template `?`*(js: JsonNode): untyped =
  if j.isNull: return
  j

+template select*(a, b: JsonNode): untyped =
+  if a.notNull: a else: b
+
+template select*(a, b, c: JsonNode): untyped =
+  if a.notNull: a elif b.notNull: b else: c
+
template with*(ident, value, body): untyped =
  if true:
    let ident {.inject.} = value
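The `select` templates return the first non-null node, which lets one parser tolerate several generations of API schema under different key names. A Python sketch of the same fallback pattern (the key paths and sample response are illustrative, not actual API output):

```python
# Sketch of the first-non-null fallback behind the `select` templates.
# dig() walks a nested dict safely; select() returns the first non-None
# candidate, mirroring how the parser tries several schema variants.
def dig(js, *path):
    for key in path:
        if not isinstance(js, dict) or key not in js:
            return None
        js = js[key]
    return js

def select(*candidates):
    for c in candidates:
        if c is not None:
            return c
    return None

# Hypothetical response using the newer snake_case key.
resp = {"data": {"timeline_response": {"instructions": ["add_entries"]}}}
instructions = select(
    dig(resp, "data", "timelineResponse", "instructions"),
    dig(resp, "data", "timeline_response", "instructions"),
)
```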
@@ -45,6 +59,19 @@ template getError*(js: JsonNode): Error =
  if js.kind != JArray or js.len == 0: null
  else: Error(js[0]{"code"}.getInt)

+proc getTweetResult*(js: JsonNode; root="content"): JsonNode =
+  select(
+    js{root, "content", "tweet_results", "result"},
+    js{root, "itemContent", "tweet_results", "result"},
+    js{root, "content", "tweetResult", "result"}
+  )
+
+template getTypeName*(js: JsonNode): string =
+  js{"__typename"}.getStr(js{"type"}.getStr)
+
+template getEntryId*(e: JsonNode): string =
+  e{"entryId"}.getStr(e{"entry_id"}.getStr)
+
template parseTime(time: string; f: static string; flen: int): DateTime =
  if time.len != flen: return
  parse(time, f, utc())
@@ -55,29 +82,24 @@ proc getDateTime*(js: JsonNode): DateTime =
proc getTime*(js: JsonNode): DateTime =
  parseTime(js.getStr, "ddd MMM dd hh:mm:ss \'+0000\' yyyy", 30)

-proc getId*(id: string): string {.inline.} =
+proc getTimeFromMs*(js: JsonNode): DateTime =
+  let ms = js.getInt(0)
+  if ms == 0: return
+  let seconds = ms div 1000
+  return fromUnix(seconds).utc()
+
+proc getId*(id: string): int64 {.inline.} =
  let start = id.rfind("-")
-  if start < 0: return id
-  id[start + 1 ..< id.len]
+  if start < 0:
+    return parseBiggestInt(id)
+  return parseBiggestInt(id[start + 1 ..< id.len])

proc getId*(js: JsonNode): int64 {.inline.} =
  case js.kind
-  of JString: return parseBiggestInt(js.getStr("0"))
+  of JString: return js.getStr("0").getId
  of JInt: return js.getBiggestInt()
  else: return 0

-proc getEntryId*(js: JsonNode): string {.inline.} =
-  let entry = js{"entryId"}.getStr
-  if entry.len == 0: return
-
-  if "tweet" in entry or "sq-I-t" in entry:
-    return entry.getId
-  elif "tombstone" in entry:
-    return js{"content", "item", "content", "tombstone", "tweet", "id"}.getStr
-  else:
-    echo "unknown entry: ", entry
-    return
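The new `getId` peels the numeric suffix off entry ids like `tweet-1234567890`, and `getTimeFromMs` converts a millisecond Unix timestamp to a UTC datetime. A Python sketch of both helpers, for illustration:

```python
from datetime import datetime, timezone

# Sketch of getId / getTimeFromMs: take the numeric suffix after the
# last "-" (or parse the whole string if there is no dash), and convert
# a millisecond Unix timestamp to an aware UTC datetime.
def get_id(entry_id: str) -> int:
    _, _, suffix = entry_id.rpartition("-")
    return int(suffix or entry_id)

def time_from_ms(ms: int):
    if ms == 0:
        return None  # mirrors the Nim proc's early return on 0
    return datetime.fromtimestamp(ms // 1000, tz=timezone.utc)
```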
template getStrVal*(js: JsonNode; default=""): string =
  js{"string_value"}.getStr(default)
@@ -89,6 +111,9 @@ proc getImageStr*(js: JsonNode): string =
template getImageVal*(js: JsonNode): string =
  js{"image_value", "url"}.getImageStr

+template getExpandedUrl*(js: JsonNode; fallback=""): string =
+  js{"expanded_url"}.getStr(js{"url"}.getStr(fallback))
+
proc getCardUrl*(js: JsonNode; kind: CardKind): string =
  result = js{"website_url"}.getStrVal
  if kind == promoVideoConvo:
@@ -154,7 +179,7 @@ proc extractSlice(js: JsonNode): Slice[int] =
proc extractUrls(result: var seq[ReplaceSlice]; js: JsonNode;
                 textLen: int; hideTwitter = false) =
  let
-    url = js["expanded_url"].getStr
+    url = js.getExpandedUrl
    slice = js.extractSlice

  if hideTwitter and slice.b.succ >= textLen and url.isTwitterUrl:
@@ -181,7 +206,7 @@ proc replacedWith(runes: seq[Rune]; repls: openArray[ReplaceSlice];
    let
      name = $runes[rep.slice.a.succ .. rep.slice.b]
      symbol = $runes[rep.slice.a]
-    result.add a(symbol & name, href = "/search?q=%23" & name)
+    result.add a(symbol & name, href = "/search?f=tweets&q=%23" & name)
  of rkMention:
    result.add a($runes[rep.slice], href = rep.url, title = rep.display)
  of rkUrl:
@@ -215,7 +240,7 @@ proc expandUserEntities*(user: var User; js: JsonNode) =
ent = ? js{"entities"} ent = ? js{"entities"}
with urls, ent{"url", "urls"}: with urls, ent{"url", "urls"}:
user.website = urls[0]{"expanded_url"}.getStr user.website = urls[0].getExpandedUrl
var replacements = newSeq[ReplaceSlice]() var replacements = newSeq[ReplaceSlice]()
@@ -231,7 +256,7 @@ proc expandUserEntities*(user: var User; js: JsonNode) =
.replacef(htRegex, htReplace) .replacef(htRegex, htReplace)
proc expandTextEntities(tweet: Tweet; entities: JsonNode; text: string; textSlice: Slice[int]; proc expandTextEntities(tweet: Tweet; entities: JsonNode; text: string; textSlice: Slice[int];
replyTo=""; hasQuote=false) = replyTo=""; hasRedundantLink=false) =
let hasCard = tweet.card.isSome let hasCard = tweet.card.isSome
var replacements = newSeq[ReplaceSlice]() var replacements = newSeq[ReplaceSlice]()
@@ -242,10 +267,10 @@ proc expandTextEntities(tweet: Tweet; entities: JsonNode; text: string; textSlic
if urlStr.len == 0 or urlStr notin text: if urlStr.len == 0 or urlStr notin text:
continue continue
replacements.extractUrls(u, textSlice.b, hideTwitter = hasQuote) replacements.extractUrls(u, textSlice.b, hideTwitter = hasRedundantLink)
if hasCard and u{"url"}.getStr == get(tweet.card).url: if hasCard and u{"url"}.getStr == get(tweet.card).url:
get(tweet.card).url = u{"expanded_url"}.getStr get(tweet.card).url = u.getExpandedUrl
with media, entities{"media"}: with media, entities{"media"}:
for m in media: for m in media:
@@ -282,9 +307,10 @@ proc expandTextEntities(tweet: Tweet; entities: JsonNode; text: string; textSlic
proc expandTweetEntities*(tweet: Tweet; js: JsonNode) = proc expandTweetEntities*(tweet: Tweet; js: JsonNode) =
let let
entities = ? js{"entities"} entities = ? js{"entities"}
hasQuote = js{"is_quote_status"}.getBool
textRange = js{"display_text_range"} textRange = js{"display_text_range"}
textSlice = textRange{0}.getInt .. textRange{1}.getInt textSlice = textRange{0}.getInt .. textRange{1}.getInt
hasQuote = js{"is_quote_status"}.getBool
hasJobCard = tweet.card.isSome and get(tweet.card).kind == jobDetails
var replyTo = "" var replyTo = ""
if tweet.replyId != 0: if tweet.replyId != 0:
@@ -292,12 +318,44 @@ proc expandTweetEntities*(tweet: Tweet; js: JsonNode) =
replyTo = reply.getStr replyTo = reply.getStr
tweet.reply.add replyTo tweet.reply.add replyTo
tweet.expandTextEntities(entities, tweet.text, textSlice, replyTo, hasQuote) tweet.expandTextEntities(entities, tweet.text, textSlice, replyTo, hasQuote or hasJobCard)
proc expandNoteTweetEntities*(tweet: Tweet; js: JsonNode) = proc expandNoteTweetEntities*(tweet: Tweet; js: JsonNode) =
let let
entities = ? js{"entity_set"} entities = ? js{"entity_set"}
text = js{"text"}.getStr text = js{"text"}.getStr.multiReplace(("<", unicodeOpen), (">", unicodeClose))
textSlice = 0..text.runeLen textSlice = 0..text.runeLen
tweet.expandTextEntities(entities, text, textSlice) tweet.expandTextEntities(entities, text, textSlice)
tweet.text = tweet.text.multiReplace((unicodeOpen, xmlOpen), (unicodeClose, xmlClose))
proc expandBirdwatchEntities*(text: string; entities: JsonNode): string =
let runes = text.toRunes
var replacements: seq[ReplaceSlice]
for entity in entities:
let
fromIdx = entity{"from_index"}.getInt
toIdx = entity{"to_index"}.getInt
url = entity{"ref", "url"}.getStr
if url.len > 0:
replacements.add ReplaceSlice(
kind: rkUrl,
slice: fromIdx ..< toIdx,
url: url,
display: $runes[fromIdx ..< min(toIdx, runes.len)]
)
replacements.sort(cmp)
result = runes.replacedWith(replacements, 0 ..< runes.len)
proc extractGalleryPhoto*(t: Tweet): GalleryPhoto =
let url =
if t.photos.len > 0: t.photos[0].url
elif t.video.isSome: get(t.video).thumb
elif t.gif.isSome: get(t.gif).thumb
elif t.card.isSome: get(t.card).image
else: ""
result = GalleryPhoto(url: url, tweetId: $t.id)
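For reference, the index-based substitution that the new `expandBirdwatchEntities` performs — swapping each `from_index`/`to_index` range for a link to its `ref.url` — can be sketched outside Nim. This Python sketch is illustrative only (the helper name and sample note text are made up; the Nim code works over runes via `ReplaceSlice`). Applying replacements right-to-left keeps earlier indices valid:

```python
def expand_entities(text: str, entities: list) -> str:
    # Replace each [from_index, to_index) slice with an anchor tag.
    # Iterate in reverse index order so prior offsets stay correct.
    chars = list(text)
    for ent in sorted(entities, key=lambda e: e["from_index"], reverse=True):
        url = ent.get("ref", {}).get("url", "")
        if not url:
            continue  # entities without a ref URL are left untouched
        a, b = ent["from_index"], ent["to_index"]
        chars[a:b] = f'<a href="{url}">{text[a:b]}</a>'
    return "".join(chars)

note = "Source: example.com/fact t.co/abc"
out = expand_entities(note, [
    {"from_index": 25, "to_index": 33,
     "ref": {"url": "https://example.com/fact"}},
])
```

Here the shortened `t.co/abc` span (indices 25..33) becomes a link to the expanded URL while the surrounding note text is preserved.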
+9 -9
@@ -1,22 +1,22 @@
 # SPDX-License-Identifier: AGPL-3.0-only
-import tables
+import tables, strutils
 import types, prefs_impl
 from config import get
 from parsecfg import nil
-export genUpdatePrefs, genResetPrefs
+export genUpdatePrefs, genResetPrefs, genApplyPrefs

 var defaultPrefs*: Prefs

 proc updateDefaultPrefs*(cfg: parsecfg.Config) =
   genDefaultPrefs()

-proc getPrefs*(cookies: Table[string, string]): Prefs =
+proc getPrefs*(cookies, params: Table[string, string]): Prefs =
   result = defaultPrefs
-  genCookiePrefs(cookies)
+  genParsePrefs(cookies)
+  genParsePrefs(params)

-template getPref*(cookies: Table[string, string], pref): untyped =
-  bind genCookiePref
-  var res = defaultPrefs.`pref`
-  genCookiePref(cookies, pref, res)
-  res
+proc encodePrefs*(prefs: Prefs): string =
+  var encPairs: seq[string]
+  genEncodePrefs(prefs)
+  encPairs.join(",")
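The new `encodePrefs`/`genEncodePrefs` pair serializes only non-default settings as comma-joined `name=value` pairs, with checkboxes encoded as `on` (set) or an empty value (cleared), suitable for a shareable `/?prefs=…` URL. A Python sketch of the same encoding rule (the default values below are hypothetical, not Nitter's actual defaults):

```python
def encode_prefs(prefs: dict, defaults: dict) -> str:
    # Emit only settings that differ from the defaults,
    # as comma-joined name=value pairs.
    pairs = []
    for name, value in prefs.items():
        if value == defaults.get(name):
            continue
        if isinstance(value, bool):
            # Checkboxes: "on" when enabled, empty value when disabled.
            pairs.append(f"{name}=on" if value else f"{name}=")
        else:
            pairs.append(f"{name}={value}")
    return ",".join(pairs)

defaults = {"theme": "nitter", "mp4Playback": True, "infiniteScroll": False}
code = encode_prefs(
    {"theme": "dracula", "mp4Playback": False, "infiniteScroll": False},
    defaults)
```

The resulting string can then be applied on any instance via the `prefs` URL parameter, which `genParsePrefs` accepts alongside cookies.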
+44 -28
@@ -60,6 +60,9 @@ genPrefs:
     stickyProfile(checkbox, true):
       "Make profile sidebar stick to top"

+    stickyNav(checkbox, true):
+      "Keep navbar fixed to top"
+
     bidiSupport(checkbox, false):
       "Support bidirectional text (makes clicking on tweets harder)"
@@ -75,12 +78,15 @@ genPrefs:
     hideReplies(checkbox, false):
       "Hide tweet replies"

+    hideCommunityNotes(checkbox, false):
+      "Hide community notes"
+
     squareAvatars(checkbox, false):
       "Square profile pictures"

   Media:
     mp4Playback(checkbox, true):
-      "Enable mp4 video playback"
+      "Enable mp4 video playback (only for gifs)"

     hlsPlayback(checkbox, false):
       "Enable HLS video streaming (requires JavaScript)"
@@ -127,7 +133,7 @@ macro genDefaultPrefs*(): untyped =
     result.add quote do:
       defaultPrefs.`ident` = cfg.get("Preferences", `name`, `default`)

-macro genCookiePrefs*(cookies): untyped =
+macro genParsePrefs*(prefs): untyped =
   result = nnkStmtList.newTree()
   for pref in allPrefs():
     let
@@ -137,37 +143,17 @@ macro genCookiePrefs*(cookies): untyped =
       options = pref.options

     result.add quote do:
-      if `name` in `cookies`:
+      if `name` in `prefs`:
         when `kind` == input or `name` == "theme":
-          result.`ident` = `cookies`[`name`]
+          result.`ident` = `prefs`[`name`]
         elif `kind` == checkbox:
-          result.`ident` = `cookies`[`name`] == "on"
+          result.`ident` = `prefs`[`name`] == "on" or
+                           `prefs`[`name`] == "true" or
+                           `prefs`[`name`] == "1"
         else:
-          let value = `cookies`[`name`]
+          let value = `prefs`[`name`]
           if value in `options`: result.`ident` = value

-macro genCookiePref*(cookies, prefName, res): untyped =
-  result = nnkStmtList.newTree()
-  for pref in allPrefs():
-    let ident = ident(pref.name)
-    if ident != prefName:
-      continue
-
-    let
-      name = pref.name
-      kind = newLit(pref.kind)
-      options = pref.options
-
-    result.add quote do:
-      if `name` in `cookies`:
-        when `kind` == input or `name` == "theme":
-          `res` = `cookies`[`name`]
-        elif `kind` == checkbox:
-          `res` = `cookies`[`name`] == "on"
-        else:
-          let value = `cookies`[`name`]
-          if value in `options`: `res` = value
-
 macro genUpdatePrefs*(): untyped =
   result = nnkStmtList.newTree()
   let req = ident("request")
@@ -202,6 +188,36 @@ macro genResetPrefs*(): untyped =
     result.add quote do:
       savePref(`name`, "", `req`, expire=true)

+macro genEncodePrefs*(prefs): untyped =
+  result = nnkStmtList.newTree()
+  for pref in allPrefs():
+    let
+      name = newLit(pref.name)
+      ident = ident(pref.name)
+      kind = newLit(pref.kind)
+      defaultIdent = nnkDotExpr.newTree(ident("defaultPrefs"), ident(pref.name))
+
+    result.add quote do:
+      when `kind` == checkbox:
+        if `prefs`.`ident` != `defaultIdent`:
+          if `prefs`.`ident`:
+            encPairs.add `name` & "=on"
+          else:
+            encPairs.add `name` & "="
+      else:
+        if `prefs`.`ident` != `defaultIdent`:
+          encPairs.add `name` & "=" & `prefs`.`ident`
+
+macro genApplyPrefs*(params, req): untyped =
+  result = nnkStmtList.newTree()
+  for pref in allPrefs():
+    let name = newLit(pref.name)
+    result.add quote do:
+      if `name` in `params`:
+        savePref(`name`, `params`[`name`], `req`)
+      else:
+        savePref(`name`, "", `req`, expire=true)
+
 macro genPrefsType*(): untyped =
   let name = nnkPostfix.newTree(ident("*"), ident("Prefs"))
   result = quote do:
+23 -13
@@ -6,10 +6,9 @@ import types
 const
   validFilters* = @[
     "media", "images", "twimg", "videos",
-    "native_video", "consumer_video", "pro_video",
+    "native_video", "consumer_video", "spaces",
     "links", "news", "quote", "mentions",
-    "replies", "retweets", "nativeretweets",
-    "verified", "safe"
+    "replies", "retweets", "nativeretweets"
   ]

   emptyQuery* = "include:nativeretweets"
@@ -18,6 +17,11 @@ template `@`(param: string): untyped =
   if param in pms: pms[param]
   else: ""

+proc validateNumber(value: string): string =
+  if value.anyIt(not it.isDigit):
+    return ""
+  return value
+
 proc initQuery*(pms: Table[string, string]; name=""): Query =
   result = Query(
     kind: parseEnum[QueryKind](@"f", tweets),
@@ -26,7 +30,7 @@ proc initQuery*(pms: Table[string, string]; name=""): Query =
     excludes: validFilters.filterIt("e-" & it in pms),
     since: @"since",
     until: @"until",
-    near: @"near"
+    minLikes: validateNumber(@"min_faves")
   )

   if name.len > 0:
@@ -54,16 +58,18 @@ proc genQueryParam*(query: Query): string =
   if query.kind == users:
     return query.text

+  param = "("
   for i, user in query.fromUser:
-    param &= &"from:{user} "
+    param &= &"from:{user}"
     if i < query.fromUser.high:
-      param &= "OR "
+      param &= " OR "
+  param &= ")"

   if query.fromUser.len > 0 and query.kind in {posts, media}:
-    param &= "filter:self_threads OR-filter:replies "
+    param &= " (filter:self_threads OR -filter:replies)"

   if "nativeretweets" notin query.excludes:
-    param &= "include:nativeretweets "
+    param &= " include:nativeretweets"

   for f in query.filters:
     filters.add "filter:" & f
@@ -73,13 +79,17 @@ proc genQueryParam*(query: Query): string =
   for i in query.includes:
     filters.add "include:" & i

-  result = strip(param & filters.join(&" {query.sep} "))
+  if filters.len > 0:
+    result = strip(param & " (" & filters.join(&" {query.sep} ") & ")")
+  else:
+    result = strip(param)

   if query.since.len > 0:
     result &= " since:" & query.since
   if query.until.len > 0:
     result &= " until:" & query.until
-  if query.near.len > 0:
-    result &= &" near:\"{query.near}\" within:15mi"
+  if query.minLikes.len > 0:
+    result &= " min_faves:" & query.minLikes

   if query.text.len > 0:
     if result.len > 0:
       result &= " " & query.text
@@ -103,8 +113,8 @@ proc genQueryUrl*(query: Query): string =
     params.add "since=" & query.since
   if query.until.len > 0:
     params.add "until=" & query.until
-  if query.near.len > 0:
-    params.add "near=" & query.near
+  if query.minLikes.len > 0:
+    params.add "min_faves=" & query.minLikes

   if params.len > 0:
     result &= params.join("&")
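The rewritten `genQueryParam` now parenthesizes the author group, so a multi-user query like `(from:a OR from:b)` composes correctly with trailing operators — this is the fix for the multi-user search bug referenced in the commit log. A simplified Python sketch of just that grouping (the function name is hypothetical, and the real proc additionally handles filters, date ranges, and `min_faves`):

```python
def gen_query_param(users: list, include_retweets: bool = True) -> str:
    # Group all authors in one parenthesised OR clause so that
    # operators appended afterwards apply to the whole group.
    param = "(" + " OR ".join(f"from:{u}" for u in users) + ")"
    if include_retweets:
        param += " include:nativeretweets"
    return param

q = gen_query_param(["jack", "biz"])
```

Without the parentheses, `from:a OR from:b include:nativeretweets` would bind `include:nativeretweets` to the second author only.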
+16 -15
@@ -52,6 +52,7 @@ proc initRedisPool*(cfg: Config) {.async.} =
     await migrate("profileDates", "p:*")
     await migrate("profileStats", "p:*")
     await migrate("userType", "p:*")
+    await migrate("verifiedType", "p:*")

   pool.withAcquire(r):
     # optimize memory usage for user ID buckets
@@ -85,7 +86,7 @@ proc cache*(data: List) {.async.} =
   await setEx(data.listKey, listCacheTime, compress(toFlatty(data)))

 proc cache*(data: PhotoRail; name: string) {.async.} =
-  await setEx("pr:" & toLower(name), baseCacheTime, compress(toFlatty(data)))
+  await setEx("pr2:" & toLower(name), baseCacheTime * 2, compress(toFlatty(data)))

 proc cache*(data: User) {.async.} =
   if data.username.len == 0: return
@@ -147,24 +148,24 @@ proc getCachedUsername*(userId: string): Future[string] {.async.} =
     if result.len > 0 and user.id.len > 0:
       await all(cacheUserId(result, user.id), cache(user))

-proc getCachedTweet*(id: int64): Future[Tweet] {.async.} =
-  if id == 0: return
-  let tweet = await get(id.tweetKey)
-  if tweet != redisNil:
-    tweet.deserialize(Tweet)
-  else:
-    result = await getGraphTweetResult($id)
-    if not result.isNil:
-      await cache(result)
+# proc getCachedTweet*(id: int64): Future[Tweet] {.async.} =
+#   if id == 0: return
+#   let tweet = await get(id.tweetKey)
+#   if tweet != redisNil:
+#     tweet.deserialize(Tweet)
+#   else:
+#     result = await getGraphTweetResult($id)
+#     if not result.isNil:
+#       await cache(result)

-proc getCachedPhotoRail*(name: string): Future[PhotoRail] {.async.} =
-  if name.len == 0: return
-  let rail = await get("pr:" & toLower(name))
+proc getCachedPhotoRail*(id: string): Future[PhotoRail] {.async.} =
+  if id.len == 0: return
+  let rail = await get("pr2:" & toLower(id))
   if rail != redisNil:
     rail.deserialize(PhotoRail)
   else:
-    result = await getPhotoRail(name)
-    await cache(result, name)
+    result = await getPhotoRail(id)
+    await cache(result, id)

 proc getCachedList*(username=""; slug=""; id=""): Future[List] {.async.} =
   let list = if id.len == 0: redisNil
+6 -3
@@ -1,10 +1,13 @@
 # SPDX-License-Identifier: AGPL-3.0-only
 import jester
 import router_utils
-import ".."/[tokens, types]
+import ".."/[auth, types]

 proc createDebugRouter*(cfg: Config) =
   router debug:
-    get "/.tokens":
+    get "/.health":
+      respJson getSessionPoolHealth()
+
+    get "/.sessions":
       cond cfg.enableDebug
-      respJson getPoolJson()
+      respJson getSessionPoolDebug()
+7 -7
@@ -10,22 +10,22 @@ export api, embed, vdom, tweet, general, router_utils
 proc createEmbedRouter*(cfg: Config) =
   router embed:
     get "/i/videos/tweet/@id":
-      let convo = await getTweet(@"id")
-      if convo == nil or convo.tweet == nil or convo.tweet.video.isNone:
+      let tweet = await getGraphTweetResult(@"id")
+      if tweet == nil or tweet.video.isNone:
         resp Http404

-      resp renderVideoEmbed(convo.tweet, cfg, request)
+      resp renderVideoEmbed(tweet, cfg, request)

     get "/@user/status/@id/embed":
       let
-        convo = await getTweet(@"id")
-        prefs = cookiePrefs()
+        tweet = await getGraphTweetResult(@"id")
+        prefs = requestPrefs()
         path = getPath()

-      if convo == nil or convo.tweet == nil:
+      if tweet == nil:
         resp Http404

-      resp renderTweetEmbed(convo.tweet, path, prefs, cfg, request)
+      resp renderTweetEmbed(tweet, path, prefs, cfg, request)

     get "/embed/Tweet.html":
       let id = @"id"
+3 -3
@@ -13,7 +13,7 @@ template respList*(list, timeline, title, vnode: typed) =
   let
     html = renderList(vnode, timeline.query, list)
-    rss = &"""/i/lists/{@"id"}/rss"""
+    rss = if cfg.enableRSSList: &"""/i/lists/{@"id"}/rss""" else: ""

   resp renderMain(html, request, cfg, prefs, titleText=title, rss=rss, banner=list.banner)
@@ -36,7 +36,7 @@ proc createListRouter*(cfg: Config) =
     get "/i/lists/@id/?":
       cond '.' notin @"id"
       let
-        prefs = cookiePrefs()
+        prefs = requestPrefs()
         list = await getCachedList(id=(@"id"))
         timeline = await getGraphListTweets(list.id, getCursor())
         vnode = renderTimelineTweets(timeline, prefs, request.path)
@@ -45,7 +45,7 @@ proc createListRouter*(cfg: Config) =
     get "/i/lists/@id/members":
       cond '.' notin @"id"
       let
-        prefs = cookiePrefs()
+        prefs = requestPrefs()
         list = await getCachedList(id=(@"id"))
         members = await getGraphListMembers(list, getCursor())
       respList(list, members, list.title, renderTimelineUsers(members, prefs, request.path))
+62 -73
@@ -12,8 +12,7 @@ export httpclient, os, strutils, asyncstreams, base64, re
 const
   m3u8Mime* = "application/vnd.apple.mpegurl"
-  mp4Mime* = "video/mp4"
-  maxAge* = "public, max-age=604800, must-revalidate"
+  maxAge* = "max-age=604800"

 proc safeFetch*(url: string): Future[string] {.async.} =
   let client = newAsyncHttpClient()
@@ -21,84 +20,59 @@ proc safeFetch*(url: string): Future[string] {.async.} =
   except: discard
   finally: client.close()

-template respond*(req: asynchttpserver.Request; code: HttpCode;
-                  headers: seq[(string, string)]) =
-  var msg = "HTTP/1.1 " & $code & "\c\L"
-  for (k, v) in headers:
+template respond*(req: asynchttpserver.Request; headers) =
+  var msg = "HTTP/1.1 200 OK\c\L"
+  for k, v in headers:
     msg.add(k & ": " & v & "\c\L")
   msg.add "\c\L"
-  yield req.client.send(msg, flags={})
+  yield req.client.send(msg)

-proc getContentLength(res: AsyncResponse): string =
-  result = "0"
-  if res.headers.hasKey("content-length"):
-    result = $res.contentLength
-  elif res.headers.hasKey("content-range"):
-    result = res.headers["content-range"]
-    result = result[result.find('/') + 1 .. ^1]
-    if result == "*":
-      result.setLen(0)
-
 proc proxyMedia*(req: jester.Request; url: string): Future[HttpCode] {.async.} =
   result = Http200

   let
     request = req.getNativeReq()
-    hashed = $hash(url)
-
-  if request.headers.getOrDefault("If-None-Match") == hashed:
-    return Http304
-
-  let c = newAsyncHttpClient(headers=newHttpHeaders({
-    "accept": "*/*",
-    "range": $req.headers.getOrDefault("range")
-  }))
+    client = newAsyncHttpClient()

   try:
-    var res = await c.get(url)
-    if not res.status.startsWith("20"):
-      if res.status != "404 Not Found":
-        echo "[media] Proxying failed, status: $1, url: $2" % [res.status, url]
+    let res = await client.get(url)
+    if res.status != "200 OK":
       return Http404

-    var headers = @{
-      "accept-ranges": "bytes",
-      "content-type": $res.headers.getOrDefault("content-type"),
+    let hashed = $hash(url)
+    if request.headers.getOrDefault("If-None-Match") == hashed:
+      return Http304
+
+    let contentLength =
+      if res.headers.hasKey("content-length"):
+        res.headers["content-length", 0]
+      else:
+        ""
+
+    let headers = newHttpHeaders({
+      "content-type": res.headers["content-type", 0],
+      "content-length": contentLength,
       "cache-control": maxAge,
-      "age": $res.headers.getOrDefault("age"),
-      "date": $res.headers.getOrDefault("date"),
-      "last-modified": $res.headers.getOrDefault("last-modified")
-    }
+      "etag": hashed
+    })

-    var tries = 0
-    while tries <= 10 and res.headers.hasKey("transfer-encoding"):
-      await sleepAsync(100 + tries * 200)
-      res = await c.get(url)
-      tries.inc
-
-    let contentLength = res.getContentLength
-    if contentLength.len > 0:
-      headers.add ("content-length", contentLength)
-
-    if res.headers.hasKey("content-range"):
-      headers.add ("content-range", $res.headers.getOrDefault("content-range"))
-      respond(request, Http206, headers)
-    else:
-      respond(request, Http200, headers)
+    respond(request, headers)

     var (hasValue, data) = (true, "")
     while hasValue:
       (hasValue, data) = await res.bodyStream.read()
       if hasValue:
-        await request.client.send(data, flags={})
+        await request.client.send(data)
       data.setLen 0
-  except OSError: discard
-  except ProtocolError, HttpRequestError:
+  except HttpRequestError, ProtocolError, OSError:
+    echo "[media] Proxying exception, error: $1, url: $2" % [getCurrentExceptionMsg(), url]
     result = Http404
   finally:
-    c.close()
+    client.close()

-template check*(c): untyped =
-  let code = c
+template check*(code): untyped =
   if code != Http200:
     resp code
   else:
@@ -112,37 +86,52 @@ proc decoded*(req: jester.Request; index: int): string =
   if based: decode(encoded)
   else: decodeUrl(encoded)

-proc getPicUrl*(req: jester.Request): string =
-  result = decoded(req, 1)
-  if "twimg.com" notin result:
-    result.insert(twimg)
-  if not result.startsWith(https):
-    result.insert(https)
-
 proc createMediaRouter*(cfg: Config) =
   router media:
     get "/pic/?":
       resp Http404

     get re"^\/pic\/orig\/(enc)?\/?(.+)":
-      let url = getPicUrl(request)
-      cond isTwitterUrl(parseUri(url)) == true
-      check await proxyMedia(request, url & "?name=orig")
+      var url = decoded(request, 1)
+      cond "/amplify_video/" notin url
+
+      if "twimg.com" notin url:
+        url.insert(twimg)
+      if not url.startsWith(https):
+        url.insert(https)
+      url.add("?name=orig")
+
+      let uri = parseUri(url)
+      cond isTwitterUrl(uri) == true
+
+      let code = await proxyMedia(request, url)
+      check code

     get re"^\/pic\/(enc)?\/?(.+)":
-      let url = getPicUrl(request)
-      cond isTwitterUrl(parseUri(url)) == true
-      check await proxyMedia(request, url)
+      var url = decoded(request, 1)
+      cond "/amplify_video/" notin url
+
+      if "twimg.com" notin url:
+        url.insert(twimg)
+      if not url.startsWith(https):
+        url.insert(https)
+
+      let uri = parseUri(url)
+      cond isTwitterUrl(uri) == true
+
+      let code = await proxyMedia(request, url)
+      check code

     get re"^\/video\/(enc)?\/?(.+)\/(.+)$":
       let url = decoded(request, 2)
       cond "http" in url

       if getHmac(url) != request.matches[1]:
-        resp showError("Failed to verify signature", cfg)
+        resp Http403, showError("Failed to verify signature", cfg)

       if ".mp4" in url or ".ts" in url or ".m4s" in url:
-        check await proxyMedia(request, url)
+        let code = await proxyMedia(request, url)
+        check code

       var content: string
       if ".vmap" in url:
@@ -154,6 +143,6 @@ proc createMediaRouter*(cfg: Config) =

       if ".m3u8" in url:
         let vid = await safeFetch(url)
-        content = proxifyVideo(vid, cookiePref(proxyVideos))
+        content = proxifyVideo(vid, requestPrefs().proxyVideos)

       resp content, m3u8Mime
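The reworked `proxyMedia` derives an ETag from a hash of the media URL and short-circuits with 304 when the client's `If-None-Match` matches, skipping the upstream fetch's body entirely. A Python sketch of that conditional-GET check — `sha1` here merely stands in for Nim's `hash`, and the handler shape is hypothetical:

```python
from hashlib import sha1

def check_conditional(url: str, request_headers: dict):
    # The ETag is derived from the media URL itself: proxied media
    # at a given URL is immutable, so the URL fully identifies the bytes.
    etag = sha1(url.encode()).hexdigest()
    if request_headers.get("If-None-Match") == etag:
        return 304, etag  # client cache is still valid; no body needed
    return 200, etag      # serve the body and include the ETag header

status1, etag = check_conditional("https://pbs.twimg.com/media/example.jpg", {})
status2, _ = check_conditional("https://pbs.twimg.com/media/example.jpg",
                               {"If-None-Match": etag})
```

A first request gets 200 plus the ETag; a revalidating request presenting that ETag gets 304, which pairs with the simplified `cache-control: max-age=604800` header above.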
+4 -2
@@ -19,8 +19,10 @@ proc createPrefRouter*(cfg: Config) =
   router preferences:
     get "/settings":
       let
-        prefs = cookiePrefs()
-        html = renderPreferences(prefs, refPath(), findThemes(cfg.staticDir))
+        prefs = requestPrefs()
+        prefsCode = encodePrefs(prefs)
+        prefsUrl = getUrlPrefix(cfg) & "/?prefs=" & prefsCode
+        html = renderPreferences(prefs, refPath(), findThemes(cfg.staticDir), prefsUrl)
       resp renderMain(html, request, cfg, prefs, "Preferences")

     get "/settings/@i?":
+2 -2
@@ -18,8 +18,8 @@ proc createResolverRouter*(cfg: Config) =
   router resolver:
     get "/cards/@card/@id":
       let url = "https://cards.twitter.com/cards/$1/$2" % [@"card", @"id"]
-      respResolved(await resolve(url, cookiePrefs()), "card")
+      respResolved(await resolve(url, requestPrefs()), "card")

     get "/t.co/@url":
       let url = "https://t.co/" & @"url"
-      respResolved(await resolve(url, cookiePrefs()), "t.co")
+      respResolved(await resolve(url, requestPrefs()), "t.co")
+27 -12
@@ -9,21 +9,13 @@ export utils, prefs, types, uri
 template savePref*(pref, value: string; req: Request; expire=false) =
   if not expire or pref in cookies(req):
     setCookie(pref, value, daysForward(when expire: -10 else: 360),
-              httpOnly=true, secure=cfg.useHttps, sameSite=None)
+              httpOnly=true, secure=cfg.useHttps, sameSite=None, path="/")

-template cookiePrefs*(): untyped {.dirty.} =
-  getPrefs(cookies(request))
-
-template cookiePref*(pref): untyped {.dirty.} =
-  getPref(cookies(request), pref)
-
-template themePrefs*(): Prefs =
-  var res = defaultPrefs
-  res.theme = cookiePref(theme)
-  res
+template requestPrefs*(): untyped {.dirty.} =
+  getPrefs(cookies(request), params(request))

 template showError*(error: string; cfg: Config): string =
-  renderMain(renderError(error), request, cfg, themePrefs(), "Error")
+  renderMain(renderError(error), request, cfg, requestPrefs(), "Error")

 template getPath*(): untyped {.dirty.} =
   $(parseUri(request.path) ? filterParams(request.params))
@@ -43,5 +35,28 @@ template getCursor*(req: Request): string =
 proc getNames*(name: string): seq[string] =
   name.strip(chars={'/'}).split(",").filterIt(it.len > 0)

+template applyUrlPrefs*() {.dirty.} =
+  if @"prefs".len > 0:
+    var prefParams = initTable[string, string]()
+    for pair in @"prefs".split(','):
+      let kv = pair.split('=', maxsplit=1)
+      if kv.len == 2:
+        prefParams[kv[0]] = kv[1]
+      elif kv.len == 1 and kv[0].len > 0:
+        prefParams[kv[0]] = ""
+
+    genApplyPrefs(prefParams, request)
+
+    # Rebuild URL without prefs param
+    var params: seq[(string, string)]
+    for k, v in request.params:
+      if k != "prefs":
+        params.add (k, v)
+
+    if params.len > 0:
+      let cleanUrl = request.getNativeReq.url ? params
+      redirect($cleanUrl)
+    else:
+      redirect(request.path)
+
 template respJson*(node: JsonNode) =
   resp $node, "application/json"
+35 -22
@@ -15,7 +15,7 @@ proc redisKey*(page, name, cursor: string): string =
if cursor.len > 0: if cursor.len > 0:
result &= ":" & cursor result &= ":" & cursor
proc timelineRss*(req: Request; cfg: Config; query: Query): Future[Rss] {.async.} = proc timelineRss*(req: Request; cfg: Config; query: Query; prefs: Prefs): Future[Rss] {.async.} =
var profile: Profile var profile: Profile
let let
name = req.params.getOrDefault("name") name = req.params.getOrDefault("name")
@@ -23,25 +23,23 @@ proc timelineRss*(req: Request; cfg: Config; query: Query): Future[Rss] {.async.
names = getNames(name) names = getNames(name)
if names.len == 1: if names.len == 1:
profile = await fetchProfile(after, query, skipRail=true, skipPinned=true) profile = await fetchProfile(after, query, skipRail=true)
else: else:
var q = query var q = query
q.fromUser = names q.fromUser = names
profile = Profile( profile.tweets = await getGraphTweetSearch(q, after)
tweets: await getGraphSearch(q, after), # this is kinda dumb
# this is kinda dumb profile.user = User(
user: User( username: name,
username: name, fullname: names.join(" | "),
fullname: names.join(" | "), userpic: "https://abs.twimg.com/sticky/default_profile_images/default_profile.png"
userpic: "https://abs.twimg.com/sticky/default_profile_images/default_profile.png"
)
) )
if profile.user.suspended: if profile.user.suspended:
return Rss(feed: profile.user.username, cursor: "suspended") return Rss(feed: profile.user.username, cursor: "suspended")
if profile.user.fullname.len > 0: if profile.user.fullname.len > 0:
let rss = renderTimelineRss(profile, cfg, multi=(names.len > 1)) let rss = renderTimelineRss(profile, cfg, prefs, multi=(names.len > 1))
return Rss(feed: rss, cursor: profile.tweets.bottom) return Rss(feed: rss, cursor: profile.tweets.bottom)
template respRss*(rss, page) = template respRss*(rss, page) =
@@ -62,11 +60,14 @@ template respRss*(rss, page) =
proc createRssRouter*(cfg: Config) = proc createRssRouter*(cfg: Config) =
router rss: router rss:
get "/search/rss": get "/search/rss":
cond cfg.enableRss if not cfg.enableRSSSearch:
resp Http403, showError("RSS feed is disabled", cfg)
if @"q".len > 200: if @"q".len > 200:
resp Http400, showError("Search input too long.", cfg) resp Http400, showError("Search input too long.", cfg)
let query = initQuery(params(request)) let
prefs = requestPrefs()
query = initQuery(params(request))
if query.kind != tweets: if query.kind != tweets:
resp Http400, showError("Only Tweet searches are allowed for RSS feeds.", cfg) resp Http400, showError("Only Tweet searches are allowed for RSS feeds.", cfg)
@@ -78,17 +79,19 @@ proc createRssRouter*(cfg: Config) =
if rss.cursor.len > 0: if rss.cursor.len > 0:
respRss(rss, "Search") respRss(rss, "Search")
let tweets = await getGraphSearch(query, cursor) let tweets = await getGraphTweetSearch(query, cursor)
rss.cursor = tweets.bottom rss.cursor = tweets.bottom
rss.feed = renderSearchRss(tweets.content, query.text, genQueryUrl(query), cfg) rss.feed = renderSearchRss(tweets.content, query.text, genQueryUrl(query), cfg, prefs)
await cacheRss(key, rss) await cacheRss(key, rss)
respRss(rss, "Search") respRss(rss, "Search")
get "/@name/rss": get "/@name/rss":
cond cfg.enableRss
cond '.' notin @"name" cond '.' notin @"name"
if not cfg.enableRSSUserTweets:
resp Http403, showError("RSS feed is disabled", cfg)
let let
prefs = requestPrefs()
name = @"name" name = @"name"
key = redisKey("twitter", name, getCursor()) key = redisKey("twitter", name, getCursor())
@@ -96,16 +99,23 @@ proc createRssRouter*(cfg: Config) =
       if rss.cursor.len > 0:
         respRss(rss, "User")

-      rss = await timelineRss(request, cfg, Query(fromUser: @[name]))
+      rss = await timelineRss(request, cfg, Query(fromUser: @[name]), prefs)

       await cacheRss(key, rss)
       respRss(rss, "User")

     get "/@name/@tab/rss":
-      cond cfg.enableRss
       cond '.' notin @"name"
       cond @"tab" in ["with_replies", "media", "search"]
+      let rssEnabled = case @"tab"
+        of "with_replies": cfg.enableRSSUserReplies
+        of "media": cfg.enableRSSUserMedia
+        of "search": cfg.enableRSSSearch
+        else: false
+      if not rssEnabled:
+        resp Http403, showError("RSS feed is disabled", cfg)

       let
+        prefs = requestPrefs()
         name = @"name"
         tab = @"tab"
         query =
@@ -124,14 +134,15 @@ proc createRssRouter*(cfg: Config) =
       if rss.cursor.len > 0:
         respRss(rss, "User")

-      rss = await timelineRss(request, cfg, query)
+      rss = await timelineRss(request, cfg, query, prefs)

       await cacheRss(key, rss)
       respRss(rss, "User")

     get "/@name/lists/@slug/rss":
-      cond cfg.enableRss
       cond @"name" != "i"
+      if not cfg.enableRSSList:
+        resp Http403, showError("RSS feed is disabled", cfg)

       let
         slug = decodeUrl(@"slug")
         list = await getCachedList(@"name", slug)
@@ -147,8 +158,10 @@ proc createRssRouter*(cfg: Config) =
       redirect(url)

     get "/i/lists/@id/rss":
-      cond cfg.enableRss
+      if not cfg.enableRSSList:
+        resp Http403, showError("RSS feed is disabled", cfg)

       let
+        prefs = requestPrefs()
         id = @"id"
         cursor = getCursor()
         key = redisKey("lists", id, cursor)
@@ -161,7 +174,7 @@ proc createRssRouter*(cfg: Config) =
         list = await getCachedList(id=id)
         timeline = await getGraphListTweets(list.id, cursor)
       rss.cursor = timeline.bottom
-      rss.feed = renderListRss(timeline.content, list, cfg)
+      rss.feed = renderListRss(timeline.content, list, cfg, prefs)

       await cacheRss(key, rss)
       respRss(rss, "List")
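The hunks above replace the single `cond cfg.enableRss` gate with per-feed flags, and a disabled feed now answers HTTP 403 with "RSS feed is disabled" rather than falling through the route. As a rough illustration, an instance operator might toggle the new flags in nitter.conf roughly like this — a sketch only: the key names mirror the `Config` fields visible in the diff (`enableRSSSearch`, `enableRSSUserTweets`, `enableRSSUserReplies`, `enableRSSUserMedia`, `enableRSSList`), but the exact section name and casing accepted by the config parser are assumptions:

```ini
[Config]
# hypothetical per-feed RSS toggles; key names assumed from the Config fields
enableRSSSearch = true
enableRSSUserTweets = true
enableRSSUserReplies = false
enableRSSUserMedia = true
enableRSSList = true
```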
+6 -6
@@ -19,7 +19,7 @@ proc createSearchRouter*(cfg: Config) =
         resp Http400, showError("Search input too long.", cfg)

       let
-        prefs = cookiePrefs()
+        prefs = requestPrefs()
         query = initQuery(params(request))
         title = "Search" & (if q.len > 0: " (" & q & ")" else: "")
@@ -29,23 +29,23 @@ proc createSearchRouter*(cfg: Config) =
           redirect("/" & q)

         var users: Result[User]
         try:
-          users = await getUserSearch(query, getCursor())
+          users = await getGraphUserSearch(query, getCursor())
         except InternalError:
           users = Result[User](beginning: true, query: query)

         resp renderMain(renderUserSearch(users, prefs), request, cfg, prefs, title)
       of tweets:
         let
-          tweets = await getGraphSearch(query, getCursor())
-          rss = "/search/rss?" & genQueryUrl(query)
+          tweets = await getGraphTweetSearch(query, getCursor())
+          rss = if cfg.enableRSSSearch: "/search/rss?" & genQueryUrl(query) else: ""
         resp renderMain(renderTweetSearch(tweets, prefs, getPath()),
                         request, cfg, prefs, title, rss=rss)
       else:
         resp Http404, showError("Invalid search", cfg)

     get "/hashtag/@hash":
-      redirect("/search?q=" & encodeUrl("#" & @"hash"))
+      redirect("/search?f=tweets&q=" & encodeUrl("#" & @"hash"))

     get "/opensearch":
-      let url = getUrlPrefix(cfg) & "/search?q="
+      let url = getUrlPrefix(cfg) & "/search?f=tweets&q="
       resp Http200, {"Content-Type": "application/opensearchdescription+xml"},
            generateOpenSearchXML(cfg.title, cfg.hostname, url)
+24 -6
@@ -21,18 +21,16 @@ proc createStatusRouter*(cfg: Config) =
       if id.len > 19 or id.any(c => not c.isDigit):
         resp Http404, showError("Invalid tweet ID", cfg)

-      let prefs = cookiePrefs()
+      let prefs = requestPrefs()

       # used for the infinite scroll feature
       if @"scroll".len > 0:
         let replies = await getReplies(id, getCursor())
         if replies.content.len == 0:
-          resp Http404, ""
+          resp Http204
         resp $renderReplies(replies, prefs, getPath())

       let conv = await getTweet(id, getCursor())
-      if conv == nil:
-        echo "nil conv"
       if conv == nil or conv.tweet == nil or conv.tweet.id == 0:
         var error = "Tweet not found"
@@ -46,7 +44,7 @@ proc createStatusRouter*(cfg: Config) =
         desc = conv.tweet.text

       var
-        images = conv.tweet.photos
+        images = conv.tweet.photos.mapIt(it.url)
         video = ""

       if conv.tweet.video.isSome():
@@ -66,6 +64,26 @@ proc createStatusRouter*(cfg: Config) =
       resp renderMain(html, request, cfg, prefs, title, desc, ogTitle,
                       images=images, video=video)

+    get "/@name/status/@id/history/?":
+      cond '.' notin @"name"
+      let id = @"id"
+      if id.len > 19 or id.any(c => not c.isDigit):
+        resp Http404, showError("Invalid tweet ID", cfg)
+
+      let edits = await getGraphEditHistory(id)
+      if edits.latest == nil or edits.latest.id == 0:
+        resp Http404, showError("Tweet history not found", cfg)
+
+      let
+        prefs = requestPrefs()
+        title = "History for " & pageTitle(edits.latest)
+        ogTitle = "Edit History for " & pageTitle(edits.latest.user)
+        desc = edits.latest.text
+
+      let html = renderEditHistory(edits, prefs, getPath())
+      resp renderMain(html, request, cfg, prefs, title, desc, ogTitle)
+
     get "/@name/@s/@id/@m/?@i?":
       cond @"s" in ["status", "statuses"]
       cond @"m" in ["video", "photo"]
@@ -76,6 +94,6 @@ proc createStatusRouter*(cfg: Config) =
     get "/i/web/status/@id":
       redirect("/i/status/" & @"id")

     get "/@name/thread/@id/?":
       redirect("/$1/status/$2" % [@"name", @"id"])
+34 -34
@@ -27,8 +27,7 @@ template skipIf[T](cond: bool; default; body: Future[T]): Future[T] =
   else:
     body

-proc fetchProfile*(after: string; query: Query; skipRail=false;
-                   skipPinned=false): Future[Profile] {.async.} =
+proc fetchProfile*(after: string; query: Query; skipRail=false): Future[Profile] {.async.} =
   let
     name = query.fromUser[0]
     userId = await getUserId(name)
@@ -45,37 +44,21 @@ proc fetchProfile*(after: string; query: Query; skipRail=false;
     after.setLen 0

   let
-    timeline =
-      case query.kind
-      of posts: getGraphUserTweets(userId, TimelineKind.tweets, after)
-      of replies: getGraphUserTweets(userId, TimelineKind.replies, after)
-      of media: getGraphUserTweets(userId, TimelineKind.media, after)
-      else: getGraphSearch(query, after)
     rail =
       skipIf(skipRail or query.kind == media, @[]):
-        getCachedPhotoRail(name)
-    user = await getCachedUser(name)
+        getCachedPhotoRail(userId)
+    user = getCachedUser(name)

-  var pinned: Option[Tweet]
-  if not skipPinned and user.pinnedTweet > 0 and
-     after.len == 0 and query.kind in {posts, replies}:
-    let tweet = await getCachedTweet(user.pinnedTweet)
-    if not tweet.isNil:
-      tweet.pinned = true
-      tweet.user = user
-      pinned = some tweet
-
-  result = Profile(
-    user: user,
-    pinned: pinned,
-    tweets: await timeline,
-    photoRail: await rail
-  )
-
-  if result.user.protected or result.user.suspended:
-    return
+  result =
+    case query.kind
+    of posts: await getGraphUserTweets(userId, TimelineKind.tweets, after)
+    of replies: await getGraphUserTweets(userId, TimelineKind.replies, after)
+    of media: await getGraphUserTweets(userId, TimelineKind.media, after)
+    else: Profile(tweets: await getGraphTweetSearch(query, after))
+
+  result.user = await user
+  result.photoRail = await rail

   result.tweets.query = query
@@ -83,11 +66,11 @@ proc showTimeline*(request: Request; query: Query; cfg: Config; prefs: Prefs;
                     rss, after: string): Future[string] {.async.} =
   if query.fromUser.len != 1:
     let
-      timeline = await getGraphSearch(query, after)
+      timeline = await getGraphTweetSearch(query, after)
       html = renderTweetSearch(timeline, prefs, getPath())
     return renderMain(html, request, cfg, prefs, "Multi", rss=rss)

-  var profile = await fetchProfile(after, query, skipPinned=prefs.hidePins)
+  var profile = await fetchProfile(after, query)
   template u: untyped = profile.user

   if u.suspended:
@@ -122,12 +105,19 @@ proc createTimelineRouter*(cfg: Config) =
     get "/intent/user":
       respUserId()

+    get "/intent/follow/?":
+      let username = request.params.getOrDefault("screen_name")
+      if username.len == 0:
+        resp Http400, showError("Missing screen_name parameter", cfg)
+      redirect("/" & username)
+
     get "/@name/?@tab?/?":
       cond '.' notin @"name"
       cond @"name" notin ["pic", "gif", "video", "search", "settings", "login", "intent", "i"]
+      cond @"name".allCharsInSet({'a'..'z', 'A'..'Z', '0'..'9', '_', ','})
       cond @"tab" in ["with_replies", "media", "search", ""]
       let
-        prefs = cookiePrefs()
+        prefs = requestPrefs()
         after = getCursor()
         names = getNames(@"name")
@@ -138,8 +128,9 @@ proc createTimelineRouter*(cfg: Config) =
       # used for the infinite scroll feature
       if @"scroll".len > 0:
         if query.fromUser.len != 1:
-          var timeline = await getGraphSearch(query, after)
-          if timeline.content.len == 0: resp Http404
+          var timeline = await getGraphTweetSearch(query, after)
+          if timeline.content.len == 0:
+            resp Http204
           timeline.beginning = true
           resp $renderTweetSearch(timeline, prefs, getPath())
         else:
@@ -148,8 +139,17 @@ proc createTimelineRouter*(cfg: Config) =
           profile.tweets.beginning = true
           resp $renderTimelineTweets(profile.tweets, prefs, getPath())

+      let rssEnabled =
+        if @"tab".len == 0: cfg.enableRSSUserTweets
+        elif @"tab" == "with_replies": cfg.enableRSSUserReplies
+        elif @"tab" == "media": cfg.enableRSSUserMedia
+        elif @"tab" == "search": cfg.enableRSSSearch
+        else: false
+
       let rss =
-        if @"tab".len == 0:
+        if not rssEnabled:
+          ""
+        elif @"tab".len == 0:
           "/$1/rss" % @"name"
         elif @"tab" == "search":
           "/$1/search/rss?$2" % [@"name", genQueryUrl(query)]
+2 -2
@@ -10,14 +10,14 @@ export feature

 proc createUnsupportedRouter*(cfg: Config) =
   router unsupported:
     template feature {.dirty.} =
-      resp renderMain(renderFeature(), request, cfg, themePrefs())
+      resp renderMain(renderFeature(), request, cfg, requestPrefs())

     get "/about/feature": feature()
     get "/login/?@i?": feature()
     get "/@name/lists/?": feature()

     get "/intent/?@i?":
-      cond @"i" notin ["user"]
+      cond @"i" notin ["user", "follow"]
       feature()

     get "/i/@i?/?@j?":
+29 -28
@@ -1,39 +1,40 @@
-@import '_variables';
-@import '_mixins';
+@import "_variables";
+@import "_mixins";

 .panel-container {
   margin: auto;
   font-size: 130%;
 }

 .error-panel {
   @include center-panel(var(--error_red));
   text-align: center;
 }

 .search-bar > form {
   @include center-panel(var(--darkest_grey));

   button {
     background: var(--bg_elements);
     color: var(--fg_color);
     border: 0;
     border-radius: 3px;
     cursor: pointer;
     font-weight: bold;
     width: 30px;
     height: 30px;
+    padding: 0px 5px 1px 8px;
   }

   input {
     font-size: 16px;
     width: 100%;
     background: var(--bg_elements);
     color: var(--fg_color);
     border: 0;
     border-radius: 4px;
     padding: 4px;
     margin-right: 8px;
     height: unset;
   }
 }
+1 -12
@@ -66,18 +66,7 @@
   }

   #search-panel-toggle:checked ~ .search-panel {
-    @if $rows == 6 {
-      max-height: 200px !important;
-    }
-    @if $rows == 5 {
-      max-height: 300px !important;
-    }
-    @if $rows == 4 {
-      max-height: 300px !important;
-    }
-    @if $rows == 3 {
-      max-height: 365px !important;
-    }
+    max-height: 380px !important;
   }
  }
 }
+23 -24
@@ -1,44 +1,43 @@
 // colors
-$bg_color: #0F0F0F;
-$fg_color: #F8F8F2;
-$fg_faded: #F8F8F2CF;
-$fg_dark: #FF6C60;
-$fg_nav: #FF6C60;
+$bg_color: #0f0f0f;
+$fg_color: #f8f8f2;
+$fg_faded: #f8f8f2cf;
+$fg_dark: #ff6c60;
+$fg_nav: #ff6c60;
 $bg_panel: #161616;
 $bg_elements: #121212;
-$bg_overlays: #1F1F1F;
-$bg_hover: #1A1A1A;
+$bg_overlays: #1f1f1f;
+$bg_hover: #1a1a1a;
 $grey: #888889;
 $dark_grey: #404040;
 $darker_grey: #282828;
 $darkest_grey: #222222;
-$border_grey: #3E3E35;
-$accent: #FF6C60;
-$accent_light: #FFACA0;
-$accent_dark: #8A3731;
-$accent_border: #FF6C6091;
-$play_button: #D8574D;
-$play_button_hover: #FF6C60;
-$more_replies_dots: #AD433B;
-$error_red: #420A05;
-$verified_blue: #1DA1F2;
+$border_grey: #3e3e35;
+$accent: #ff6c60;
+$accent_light: #ffaca0;
+$accent_dark: #8a3731;
+$accent_border: #ff6c6091;
+$play_button: #d8574d;
+$play_button_hover: #ff6c60;
+$more_replies_dots: #ad433b;
+$error_red: #420a05;
+$verified_blue: #1da1f2;
+$verified_business: #fac82b;
+$verified_government: #c1b6a4;
 $icon_text: $fg_color;
 $tab: $fg_color;
 $tab_selected: $accent;
-$shadow: rgba(0,0,0,.6);
-$shadow_dark: rgba(0,0,0,.2);
+$shadow: rgba(0, 0, 0, 0.6);
+$shadow_dark: rgba(0, 0, 0, 0.2);

 //fonts
-$font_0: Helvetica Neue;
-$font_1: Helvetica;
-$font_2: Arial;
-$font_3: sans-serif;
-$font_4: fontello;
+$font_0: sans-serif;
+$font_1: fontello;
+154 -103
@@ -1,165 +1,216 @@
-@import '_variables';
-@import 'tweet/_base';
-@import 'profile/_base';
-@import 'general';
-@import 'navbar';
-@import 'inputs';
-@import 'timeline';
-@import 'search';
+@import "_variables";
+@import "tweet/_base";
+@import "profile/_base";
+@import "general";
+@import "navbar";
+@import "inputs";
+@import "timeline";
+@import "search";

 body {
   // colors
   --bg_color: #{$bg_color};
   --fg_color: #{$fg_color};
   --fg_faded: #{$fg_faded};
   --fg_dark: #{$fg_dark};
   --fg_nav: #{$fg_nav};
   --bg_panel: #{$bg_panel};
   --bg_elements: #{$bg_elements};
   --bg_overlays: #{$bg_overlays};
   --bg_hover: #{$bg_hover};
   --grey: #{$grey};
   --dark_grey: #{$dark_grey};
   --darker_grey: #{$darker_grey};
   --darkest_grey: #{$darkest_grey};
   --border_grey: #{$border_grey};
   --accent: #{$accent};
   --accent_light: #{$accent_light};
   --accent_dark: #{$accent_dark};
   --accent_border: #{$accent_border};
   --play_button: #{$play_button};
   --play_button_hover: #{$play_button_hover};
   --more_replies_dots: #{$more_replies_dots};
   --error_red: #{$error_red};
   --verified_blue: #{$verified_blue};
+  --verified_business: #{$verified_business};
+  --verified_government: #{$verified_government};
   --icon_text: #{$icon_text};
   --tab: #{$fg_color};
   --tab_selected: #{$accent};
   --profile_stat: #{$fg_color};

   background-color: var(--bg_color);
   color: var(--fg_color);
-  font-family: $font_0, $font_1, $font_2, $font_3;
-  font-size: 14px;
+  font-family: $font_0, $font_1;
+  font-size: 15px;
   line-height: 1.3;
   margin: 0;
 }

 * {
   outline: unset;
   margin: 0;
   text-decoration: none;
 }

+img {
+  dynamic-range-limit: standard;
+}
+
 h1 {
   display: inline;
 }

-h2, h3 {
+h2,
+h3 {
   font-weight: normal;
 }

 p {
   margin: 14px 0;
 }

 a {
   color: var(--accent);

   &:hover {
     text-decoration: underline;
   }
 }

 fieldset {
   border: 0;
   padding: 0;
   margin-top: -0.6em;
 }

 legend {
   width: 100%;
-  padding: .6em 0 .3em 0;
+  padding: 0.6em 0 0.3em 0;
   border: 0;
   font-size: 16px;
   font-weight: 600;
   border-bottom: 1px solid var(--border_grey);
   margin-bottom: 8px;
 }

-.preferences .note {
-  border-top: 1px solid var(--border_grey);
-  border-bottom: 1px solid var(--border_grey);
-  padding: 6px 0 8px 0;
-  margin-bottom: 8px;
-  margin-top: 16px;
-}
+.preferences {
+  .note {
+    border-top: 1px solid var(--border_grey);
+    border-bottom: 1px solid var(--border_grey);
+    padding: 6px 0 8px 0;
+    margin-bottom: 8px;
+    margin-top: 16px;
+  }
+
+  .bookmark-note {
+    margin: 0;
+    margin-bottom: 10px;
+  }
+}

 ul {
   padding-left: 1.3em;
 }

 .container {
   display: flex;
   flex-wrap: wrap;
   box-sizing: border-box;
-  padding-top: 50px;
   margin: auto;
   min-height: 100vh;
 }

+body.fixed-nav .container {
+  padding-top: 50px;
+}
+
 .icon-container {
   display: inline;
 }

 .overlay-panel {
   max-width: 600px;
   width: 100%;
   margin: 0 auto;
   margin-top: 10px;
   background-color: var(--bg_overlays);
   padding: 10px 15px;
   align-self: start;

   ul {
     margin-bottom: 14px;
   }

   p {
     word-break: break-word;
   }
 }

 .verified-icon {
-  color: var(--icon_text);
-  background-color: var(--verified_blue);
-  border-radius: 50%;
-  flex-shrink: 0;
-  margin: 2px 0 3px 3px;
-  padding-top: 2px;
-  height: 12px;
-  width: 14px;
-  font-size: 8px;
   display: inline-block;
-  text-align: center;
-  vertical-align: middle;
+  width: 14px;
+  height: 14px;
+  margin-left: 2px;
+
+  .verified-icon-circle {
+    position: absolute;
+    font-size: 15px;
+  }
+
+  .verified-icon-check {
+    position: absolute;
+    font-size: 9px;
+    margin: 5px 3px;
+  }
+
+  &.blue {
+    .verified-icon-circle {
+      color: var(--verified_blue);
+    }
+
+    .verified-icon-check {
+      color: var(--icon_text);
+    }
+  }
+
+  &.business {
+    .verified-icon-circle {
+      color: var(--verified_business);
+    }
+
+    .verified-icon-check {
+      color: var(--bg_panel);
+    }
+  }
+
+  &.government {
+    .verified-icon-circle {
+      color: var(--verified_government);
+    }
+
+    .verified-icon-check {
+      color: var(--bg_panel);
+    }
+  }
 }

-@media(max-width: 600px) {
+@media (max-width: 600px) {
   .preferences-container {
     max-width: 95vw;
   }

-  .nav-item, .nav-item .icon-container {
+  .nav-item,
+  .nav-item .icon-container {
     font-size: 16px;
   }
 }
+154 -124
@@ -1,185 +1,215 @@
-@import '_variables';
-@import '_mixins';
+@import "_variables";
+@import "_mixins";

 button {
   @include input-colors;
   background-color: var(--bg_elements);
   color: var(--fg_color);
   border: 1px solid var(--accent_border);
   padding: 3px 6px;
   font-size: 14px;
   cursor: pointer;
   float: right;
 }

 input[type="text"],
 input[type="date"],
+input[type="number"],
 select {
   @include input-colors;
   background-color: var(--bg_elements);
   padding: 1px 4px;
   color: var(--fg_color);
   border: 1px solid var(--accent_border);
   border-radius: 0;
   font-size: 14px;
 }

-input[type="text"] {
+input[type="number"] {
+  -moz-appearance: textfield;
+}
+
+input[type="text"],
+input[type="number"] {
   height: 16px;
 }

 select {
   height: 20px;
   padding: 0 2px;
   line-height: 1;
 }

 input[type="date"]::-webkit-inner-spin-button {
   display: none;
 }

-input[type="number"] {
-  -moz-appearance: textfield;
-}
+input[type="number"]::-webkit-inner-spin-button,
+input[type="number"]::-webkit-outer-spin-button {
+  display: none;
+  -webkit-appearance: none;
+  margin: 0;
+}

 input[type="date"]::-webkit-clear-button {
   margin-left: 17px;
   filter: grayscale(100%);
   filter: hue-rotate(120deg);
 }

 input::-webkit-calendar-picker-indicator {
   opacity: 0;
 }

 input::-webkit-datetime-edit-day-field:focus,
 input::-webkit-datetime-edit-month-field:focus,
 input::-webkit-datetime-edit-year-field:focus {
   background-color: var(--accent);
   color: var(--fg_color);
   outline: none;
 }

 .date-range {
   .date-input {
     display: inline-block;
     position: relative;
   }

   .icon-container {
     pointer-events: none;
     position: absolute;
     top: 2px;
     right: 5px;
   }

   .search-title {
     margin: 0 2px;
   }
 }

 .icon-button button {
   color: var(--accent);
   text-decoration: none;
   background: none;
   border: none;
   float: none;
   padding: unset;
   padding-left: 4px;

   &:hover {
     color: var(--accent_light);
   }
 }

 .checkbox {
   position: absolute;
   top: 1px;
   right: 0;
   height: 17px;
   width: 17px;
   background-color: var(--bg_elements);
   border: 1px solid var(--accent_border);

   &:after {
     content: "";
     position: absolute;
     display: none;
   }
 }

 .checkbox-container {
   display: block;
   position: relative;
   margin-bottom: 5px;
   cursor: pointer;
   user-select: none;
   padding-right: 22px;

   input {
     position: absolute;
     opacity: 0;
     cursor: pointer;
     height: 0;
     width: 0;

     &:checked ~ .checkbox:after {
       display: block;
     }
   }

   &:hover input ~ .checkbox {
     border-color: var(--accent);
   }

   &:active input ~ .checkbox {
     border-color: var(--accent_light);
   }

   .checkbox:after {
     left: 2px;
     bottom: 0;
     font-size: 13px;
-    font-family: $font_4;
-    content: '\e803';
+    font-family: $font_1;
+    content: "\e811";
   }
 }

 .pref-group {
   display: inline;
 }

 .preferences {
   button {
     margin: 6px 0 3px 0;
   }

   label {
     padding-right: 150px;
   }

   select {
     position: absolute;
     top: 0;
     right: 0;
     display: block;
     -moz-appearance: none;
     -webkit-appearance: none;
     appearance: none;
   }

-  input[type="text"] {
+  input[type="text"],
+  input[type="number"] {
     position: absolute;
     right: 0;
     max-width: 140px;
   }

   .pref-group {
     display: block;
   }

   .pref-input {
     position: relative;
     margin-bottom: 6px;
   }

   .pref-reset {
     float: left;
   }

+  .prefs-code {
+    background-color: var(--bg_elements);
+    border: 1px solid var(--accent_border);
+    color: var(--fg_color);
+    font-size: 13px;
+    padding: 6px 8px;
+    margin: 4px 0;
+    word-break: break-all;
+    white-space: pre-wrap;
+    user-select: all;
+  }
 }
+62 -60
@@ -1,88 +1,90 @@
-@import '_variables';
+@import "_variables";

 nav {
   display: flex;
   align-items: center;
-  position: fixed;
   background-color: var(--bg_overlays);
   box-shadow: 0 0 4px $shadow;
   padding: 0;
   width: 100%;
   height: 50px;
   z-index: 1000;
   font-size: 16px;

-  a, .icon-button button {
+  a,
+  .icon-button button {
     color: var(--fg_nav);
   }
+
+  body.fixed-nav & {
+    position: fixed;
+  }
 }

 .inner-nav {
   margin: auto;
   box-sizing: border-box;
   padding: 0 10px;
   display: flex;
   align-items: center;
   flex-basis: 920px;
   height: 50px;
 }

 .site-name {
   font-size: 15px;
   font-weight: 600;
   line-height: 1;

   &:hover {
     color: var(--accent_light);
     text-decoration: unset;
   }
 }

 .site-logo {
   display: block;
   width: 35px;
   height: 35px;
 }

 .nav-item {
   display: flex;
   flex: 1;
   line-height: 50px;
   height: 50px;
   overflow: hidden;
   flex-wrap: wrap;
   align-items: center;

   &.right {
     text-align: right;
     justify-content: flex-end;
   }

-  &.right a {
-    padding-left: 4px;
-
-    &:hover {
-      color: var(--accent_light);
-      text-decoration: unset;
-    }
-  }
+  &.right a:hover {
+    color: var(--accent_light);
+    text-decoration: unset;
+  }
 }

 .lp {
   height: 14px;
-  margin-top: 2px;
-  display: block;
+  display: inline-block;
+  position: relative;
+  top: 2px;
   fill: var(--fg_nav);

   &:hover {
     fill: var(--accent_light);
   }
 }

-.icon-info:before {
+.icon-info {
   margin: 0 -3px;
 }

 .icon-cog {
   font-size: 15px;
+  padding-left: 0 !important;
 }
+5 -1
@@ -39,7 +39,11 @@
   text-align: left;
   vertical-align: top;
   max-width: 32%;
-  top: 50px;
+  top: 0;
+
+  body.fixed-nav & {
+    top: 50px;
+  }
 }

 .profile-result {
.profile-result { .profile-result {
+1 -1
@@ -115,7 +115,7 @@
 }

 .profile-card-tabs-name {
-  @include breakable;
+  flex-shrink: 100;
 }

 .profile-card-avatar {
+86 -86
@@ -1,120 +1,120 @@
-@import '_variables';
-@import '_mixins';
+@import "_variables";
+@import "_mixins";

 .search-title {
   font-weight: bold;
   display: inline-block;
   margin-top: 4px;
 }

 .search-field {
   display: flex;
   flex-wrap: wrap;

   button {
     margin: 0 2px 0 0;
+    padding: 0px 1px 1px 4px;
     height: 23px;
+    display: flex;
+    align-items: center;
   }

   .pref-input {
     margin: 0 4px 0 0;
     flex-grow: 1;
     height: 23px;
   }

-  input[type="text"] {
+  input[type="text"],
+  input[type="number"] {
     height: calc(100% - 4px);
     width: calc(100% - 8px);
   }

   > label {
     display: inline;
     background-color: var(--bg_elements);
     color: var(--fg_color);
     border: 1px solid var(--accent_border);
-    padding: 1px 6px 2px 6px;
+    padding: 1px 1px 2px 4px;
     font-size: 14px;
     cursor: pointer;
     margin-bottom: 2px;
     @include input-colors;
   }

-  @include create-toggle(search-panel, 200px);
+  @include create-toggle(search-panel, 380px);
 }

 .search-panel {
   width: 100%;
   max-height: 0;
   overflow: hidden;
   transition: max-height 0.4s;
   flex-grow: 1;
   font-weight: initial;
   text-align: left;

-  > div {
-    line-height: 1.7em;
-  }
-
   .checkbox-container {
     display: inline;
     padding-right: unset;
-    margin-bottom: unset;
+    margin-bottom: 5px;
     margin-left: 23px;
   }

   .checkbox {
     right: unset;
     left: -22px;
+    line-height: 1.6em;
   }

   .checkbox-container .checkbox:after {
     top: -4px;
   }
 }

 .search-row {
   display: flex;
   flex-wrap: wrap;
   line-height: unset;

   > div {
     flex-grow: 1;
     flex-shrink: 1;
   }

   input {
     height: 21px;
   }

   .pref-input {
     display: block;
     padding-bottom: 5px;

     input {
       height: 21px;
       margin-top: 1px;
     }
   }
 }

 .search-toggles {
   flex-grow: 1;
   display: grid;
-  grid-template-columns: repeat(6, auto);
+  grid-template-columns: repeat(5, auto);
   grid-column-gap: 10px;
 }

 .profile-tabs {
   @include search-resize(820px, 5);
-  @include search-resize(725px, 4);
-  @include search-resize(600px, 6);
-  @include search-resize(560px, 5);
-  @include search-resize(480px, 4);
+  @include search-resize(715px, 4);
+  @include search-resize(700px, 5);
+  @include search-resize(485px, 4);
   @include search-resize(410px, 3);
 }

-@include search-resize(560px, 5);
-@include search-resize(480px, 4);
+@include search-resize(700px, 5);
+@include search-resize(485px, 4);
 @include search-resize(410px, 3);
+103 -106
@@ -1,162 +1,159 @@
@import "_variables";

.timeline-container {
  @include panel(100%, 600px);
}

.timeline > div:not(:first-child) {
  border-top: 1px solid var(--border_grey);
}

.timeline-header {
  width: 100%;
  background-color: var(--bg_panel);
  text-align: center;
  padding: 8px;
  display: block;
  font-weight: bold;
  margin-bottom: 5px;
  box-sizing: border-box;

  button {
    float: unset;
  }
}

.timeline-banner img {
  width: 100%;
}

.timeline-description {
  font-weight: normal;
}

.tab {
  align-items: center;
  display: flex;
  flex-wrap: wrap;
  list-style: none;
  margin: 0 0 5px 0;
  background-color: var(--bg_panel);
  padding: 0;
}

.tab-item {
  flex: 1 1 0;
  text-align: center;
  margin-top: 0;

  a {
    border-bottom: 0.1rem solid transparent;
    color: var(--tab);
    display: block;
    padding: 8px 0;
    text-decoration: none;
    font-weight: bold;

    &:hover {
      text-decoration: none;
    }

    &.active {
      border-bottom-color: var(--tab_selected);
      color: var(--tab_selected);
    }
  }

  &.active a {
    border-bottom-color: var(--tab_selected);
    color: var(--tab_selected);
  }

  &.wide {
    flex-grow: 1.2;
    flex-basis: 50px;
  }
}

.timeline-footer {
  background-color: var(--bg_panel);
  padding: 6px 0;
}

.timeline-protected {
  text-align: center;

  p {
    margin: 8px 0;
  }

  h2 {
    color: var(--accent);
    font-size: 20px;
    font-weight: 600;
  }
}

.timeline-none {
  color: var(--accent);
  font-size: 20px;
  font-weight: 600;
  text-align: center;
}

.timeline-end {
  background-color: var(--bg_panel);
  color: var(--accent);
  font-size: 16px;
  font-weight: 600;
  text-align: center;
}

.show-more {
  background-color: var(--bg_panel);
  text-align: center;
  padding: 0.75em 0;
  display: block !important;

  a {
    background-color: var(--darkest_grey);
    display: inline-block;
    height: 2em;
    padding: 0 2em;
    line-height: 2em;

    &:hover {
      background-color: var(--darker_grey);
    }
  }
}

.top-ref {
  background-color: var(--bg_color);
  border-top: none !important;

  .icon-down {
    font-size: 20px;
    display: flex;
    justify-content: center;
    text-decoration: none;

    &:hover {
      color: var(--accent_light);
    }

    &::before {
      transform: rotate(180deg) translateY(-1px);
    }
  }
}

.timeline-item {
  overflow-wrap: break-word;
  border-left-width: 0;
  min-width: 0;
  padding: 0.75em;
  display: flex;
  position: relative;
  background-color: var(--bg_panel);
}
+204 -153
@@ -1,240 +1,291 @@
@import "_variables";
@import "_mixins";

@import "thread";
@import "media";
@import "video";
@import "embed";
@import "card";
@import "poll";
@import "quote";

.tweet-body {
  flex: 1;
  min-width: 0;
  margin-left: 58px;
  pointer-events: none;
  z-index: 1;
}

.tweet-content {
  line-height: 1.3em;
  pointer-events: all;
  display: inline;
}

.tweet-bidi {
  display: block !important;
}

.tweet-header {
  padding: 0;
  vertical-align: bottom;
  flex-basis: 100%;
  margin-bottom: 0.2em;

  a {
    display: inline-block;
    word-break: break-all;
    max-width: 100%;
    pointer-events: all;
  }
}

.tweet-name-row {
  padding: 0;
  display: flex;
  justify-content: space-between;
}

.fullname-and-username {
  display: flex;
  min-width: 0;
}

.fullname {
  @include ellipsis;
  flex-shrink: 2;
  max-width: 80%;
  font-size: 14px;
  font-weight: 700;
  color: var(--fg_color);
}

.username {
  @include ellipsis;
  min-width: 1.6em;
  margin-left: 0.4em;
  word-wrap: normal;
}

.tweet-date {
  display: flex;
  flex-shrink: 0;
  margin-left: 4px;
}

.tweet-date a,
.username,
.show-more a {
  color: var(--fg_dark);
}

.tweet-published {
  margin-top: 10px;
  margin-bottom: 3px;
  color: var(--grey);
  pointer-events: all;
}

.tweet-avatar {
  display: contents !important;

  img {
    float: left;
    margin-top: 3px;
    margin-left: -58px;
    width: 48px;
    height: 48px;
  }
}

.avatar {
  &.round {
    border-radius: 50%;
    -webkit-user-select: none;
  }

  &.mini {
    position: unset;
    margin-right: 5px;
    margin-top: -1px;
    width: 20px;
    height: 20px;
  }
}

.tweet-embed {
  display: flex;
  flex-direction: column;
  justify-content: center;
  height: 100%;
  background-color: var(--bg_panel);

  .tweet-content {
    font-size: 18px;
  }

  .tweet-body {
    display: flex;
    flex-direction: column;
    max-height: calc(100vh - 0.75em * 2);
  }

  .card-image img {
    height: auto;
  }

  .avatar {
    position: absolute;
  }
}

.attribution {
  display: flex;
  pointer-events: all;
  margin: 5px 0;

  strong {
    color: var(--fg_color);
  }
}

.media-tag-block {
  padding-top: 5px;
  pointer-events: all;
  color: var(--fg_faded);

  .icon-container {
    padding-right: 2px;
  }

  .media-tag,
  .icon-container {
    color: var(--fg_faded);
  }
}

.timeline-container .media-tag-block {
  font-size: 13px;
}

.tweet-geo {
  color: var(--fg_faded);
}

.replying-to {
  color: var(--fg_faded);
  margin: -2px 0 4px;

  a {
    pointer-events: all;
  }
}

.retweet-header,
.pinned,
.tweet-stats {
  align-content: center;
  color: var(--grey);
  display: flex;
  flex-shrink: 0;
  flex-wrap: wrap;
  font-size: 14px;
  font-weight: 600;
  line-height: 22px;

  span {
    @include ellipsis;
  }
}

.retweet-header {
  margin-top: -5px !important;
}

.tweet-stats {
  margin-bottom: -3px;
  -webkit-user-select: none;
}

.tweet-stat {
  padding-top: 5px;
  min-width: 1em;
  margin-right: 0.8em;
}

.show-thread {
  display: block;
  pointer-events: all;
  padding-top: 2px;
}

.unavailable-box {
  width: 100%;
  height: 100%;
  padding: 12px;
  border: solid 1px var(--dark_grey);
  box-sizing: border-box;
  border-radius: 10px;
  background-color: var(--bg_color);
  z-index: 2;
}

.tweet-link {
  height: 100%;
  width: 100%;
  left: 0;
  top: 0;
  position: absolute;
  -webkit-user-select: none;

  &:hover {
    background-color: var(--bg_hover);
  }
}

.latest-post-version {
  border-bottom: 1px solid var(--dark_grey);
  border-top: 1px solid var(--dark_grey);
  padding: 1ch 0px;
  margin: 1ch 0px;
  color: var(--grey);

  a {
    pointer-events: all;
  }
}

.community-note {
  background-color: var(--bg_elements);
  margin-top: 10px;
  border: solid 1px var(--dark_grey);
  border-radius: 10px;
  overflow: hidden;
  pointer-events: all;

  &:hover {
    background-color: var(--bg_panel);
    border-color: var(--grey);
  }
}

.community-note-header {
  background-color: var(--bg_hover);
  font-weight: 700;
  padding: 8px 10px;
  padding-top: 6px;
  display: flex;
  align-items: center;
  gap: 2px;

  .icon-container {
    flex-shrink: 0;
    color: var(--accent);
  }
}

.community-note-text {
  white-space: pre-line;
  padding: 10px 10px;
  padding-top: 6px;
}
+79 -78
@@ -1,118 +1,119 @@
@import "_variables";
@import "_mixins";

.card {
  margin: 5px 0;
  pointer-events: all;
  max-height: unset;
}

.card-container {
  border: solid 1px var(--dark_grey);
  border-radius: 10px;
  background-color: var(--bg_elements);
  overflow: hidden;
  color: inherit;
  display: flex;
  flex-direction: row;
  text-decoration: none !important;

  &:hover {
    border-color: var(--grey);
  }

  .attachments {
    margin: 0;
    border-radius: 0;
  }
}

.card-content {
  padding: 0.5em;
}

.card-title {
  @include ellipsis;
  white-space: unset;
  font-weight: bold;
  font-size: 1.1em;
}

.card-description {
  margin: 0.3em 0;
  white-space: pre-wrap;
}

.card-destination {
  @include ellipsis;
  color: var(--grey);
  display: block;
}

.card-content-container {
  color: unset;
  overflow: auto;

  &:hover {
    text-decoration: none;
  }
}

.card-image-container {
  width: 98px;
  flex-shrink: 0;
  position: relative;
  overflow: hidden;

  &:before {
    content: "";
    display: block;
    padding-top: 100%;
  }
}

.card-image {
  position: absolute;
  top: 0;
  left: 0;
  bottom: 0;
  right: 0;
  background-color: var(--bg_overlays);

  img {
    width: 100%;
    height: 100%;
    max-height: 400px;
    display: block;
    object-fit: cover;
  }
}

.card-overlay {
  @include play-button;
  opacity: 0.8;
  display: flex;
  justify-content: center;
  align-items: center;
}

.large {
  .card-container {
    display: block;
  }

  .card-image-container {
    width: unset;

    &:before {
      display: none;
    }
  }

  .card-image {
    position: unset;
    border-style: solid;
    border-color: var(--dark_grey);
    border-width: 0;
    border-bottom-width: 1px;
  }
}
+13 -13
@@ -1,17 +1,17 @@
@import "_variables";
@import "_mixins";

.embed-video {
  .gallery-video {
    width: 100%;
    height: 100%;
    position: absolute;
    background-color: black;
    top: 0%;
    left: 0%;
  }

  .video-container {
    max-height: unset;
  }
}
+100 -73
@@ -1,76 +1,103 @@
@import "_variables";

.gallery-row {
  display: flex;
  flex-direction: row;
  flex-wrap: nowrap;
  overflow: hidden;
  flex-grow: 1;
  max-height: 379.5px;
  max-width: 533px;
  pointer-events: all;

  .still-image {
    width: 100%;
    align-self: center;
  }
}

.attachments {
  margin-top: 0.35em;
  display: flex;
  flex-direction: row;
  width: 100%;
  max-height: 600px;
  border-radius: 7px;
  overflow: hidden;
  flex-flow: column;
  background-color: var(--bg_color);
  align-items: center;
  pointer-events: all;

  .image-attachment {
    width: 100%;
  }
}

.attachment {
  position: relative;
  line-height: 0;
  overflow: hidden;
  margin: 0 0.25em 0 0;
  flex-grow: 1;
  box-sizing: border-box;
  min-width: 2em;

  &:last-child {
    margin: 0;
    max-height: 530px;
  }
}

.gallery-gif video {
  max-height: 530px;
  background-color: #101010;
}

.still-image {
  max-height: 379.5px;
  max-width: 533px;

  img {
    object-fit: cover;
    max-width: 100%;
    max-height: 379.5px;
    flex-basis: 300px;
    flex-grow: 1;
  }
}

.alt-text {
  margin: 0px;
  padding: 11px 7px;
  box-sizing: border-box;
  position: absolute;
  bottom: 10px;
  left: 10px;
  width: 2.98em;
  max-height: 25px;
  white-space: pre;
  overflow: hidden;
  border-radius: 10px;
  color: var(--fg_color);
  font-size: 12px;
  font-weight: bold;
  background: rgba(0, 0, 0, 0.5);
  backdrop-filter: blur(12px);
}

.alt-text:hover {
  padding: 7px;
  width: Min(230px, calc(100% - 10px * 2));
  max-height: calc(100% - 10px);
  line-height: 1.2em;
  white-space: pre-wrap;
  transition-duration: 0.4s;
  transition-property: max-height;
}

.image {
  display: flex;
}

// .single-image {
@@ -86,34 +113,34 @@
// }

.overlay-circle {
  border-radius: 50%;
  background-color: var(--dark_grey);
  width: 40px;
  height: 40px;
  align-items: center;
  display: flex;
  border-width: 5px;
  border-color: var(--play_button);
  border-style: solid;
}

.overlay-triangle {
  width: 0;
  height: 0;
  border-style: solid;
  border-width: 12px 0 12px 17px;
  border-color: transparent transparent transparent var(--play_button);
  margin-left: 14px;
}

.media-gif {
  display: table;
  background-color: unset;
  width: unset;
}

.media-body {
  flex: 1;
  padding: 0;
  white-space: pre-wrap;
}
+24 -24
@@ -1,42 +1,42 @@
@import "_variables";

.poll-meter {
  overflow: hidden;
  position: relative;
  margin: 6px 0;
  height: 26px;
  background: var(--bg_color);
  border-radius: 5px;
  display: flex;
  align-items: center;
}

.poll-choice-bar {
  height: 100%;
  position: absolute;
  background: var(--dark_grey);
}

.poll-choice-value {
  position: relative;
  font-weight: bold;
  margin-left: 5px;
  margin-right: 6px;
  min-width: 30px;
  text-align: right;
  pointer-events: all;
}

.poll-choice-option {
  position: relative;
  pointer-events: all;
}

.poll-info {
  color: var(--grey);
  pointer-events: all;
}

.leader .poll-choice-bar {
  background: var(--accent_dark);
}
+96 -71
@@ -1,94 +1,119 @@
@import "_variables";

.quote {
  margin-top: 10px;
  border: solid 1px var(--dark_grey);
  border-radius: 10px;
  background-color: var(--bg_elements);
  overflow: hidden;
  pointer-events: all;
  position: relative;
  width: 100%;

  &:hover {
    border-color: var(--grey);
  }

  &.unavailable:hover {
    border-color: var(--dark_grey);
  }

  .tweet-name-row {
    padding: 8px 10px 6px 10px;
  }

  .quote-text {
    overflow: hidden;
    white-space: pre-wrap;
    word-wrap: break-word;
    padding: 10px;
    padding-top: 0;
  }

  .show-thread {
    padding: 0px 10px 6px 10px;
    margin-top: -6px;
  }

  .quote-latest {
    padding: 0px 10px 6px 10px;
    color: var(--grey);
  }

  .replying-to {
    padding: 0px 10px;
    padding-bottom: 4px;
    margin: unset;
  }

  .community-note {
    background-color: var(--bg_panel);
    border: unset;
    border-top: solid 1px var(--dark_grey);
    border-radius: unset;
    margin-top: 0;

    &:hover {
      border-top-color: var(--grey);
    }

    .community-note-header {
      background-color: var(--bg_panel);
      padding-bottom: 0;
    }
  }
}

.unavailable-quote {
  padding: 12px;
  display: block;
}

.quote-link {
  width: 100%;
  height: 100%;
  left: 0;
  top: 0;
  position: absolute;
}

.quote-media-container {
  max-height: 300px;
  display: flex;

  .card {
    margin: unset;
  }

  .attachments {
    border-radius: 0;
  }

  .media-gif {
    width: 100%;
    display: flex;
    justify-content: center;
  }

  .gallery-gif .attachment {
    display: flex;
    justify-content: center;
    background-color: var(--bg_color);

    video {
      height: unset;
      width: unset;
      max-height: 100%;
      max-width: 100%;
    }
  }

  .gallery-video,
  .gallery-gif {
    max-height: 300px;
  }

  .still-image img {
    max-height: 250px;
  }
}
+130 -88
@@ -1,112 +1,154 @@
@import "_variables";
@import "_mixins";

.conversation,
.edit-history {
  @include panel(100%, 600px);

  .show-more {
    margin-bottom: 10px;
  }
}

.main-thread,
.latest-edit {
  margin-bottom: 20px;
}

.reply {
  margin-bottom: 10px;
}

.main-tweet,
.replies,
.edit-history > div {
  body.fixed-nav & {
    padding-top: 50px;
    margin-top: -50px;
  }
}

.edit-history-header {
  padding: 10px;
  margin-bottom: 5px;
  font-size: 16px;
  font-weight: bold;
  background-color: var(--bg_panel);
}

.tweet-edit {
  margin-bottom: 5px;
}

.main-tweet .tweet-content {
  font-size: 18px;
}

@media (max-width: 600px) {
  .main-tweet .tweet-content {
    font-size: 16px;
  }
}

.thread-line {
  .timeline-item::before,
  &.timeline-item::before {
    background: var(--accent_dark);
    content: "";
    position: relative;
    min-width: 3px;
    width: 3px;
    left: 26px;
    border-radius: 2px;
    margin-left: -3px;
    margin-bottom: 37px;
    top: 56px;
    z-index: 1;
    pointer-events: none;
  }

  .with-header:not(:first-child)::after {
    background: var(--accent_dark);
    content: "";
    position: relative;
    float: left;
    min-width: 3px;
    width: 3px;
    right: calc(100% - 26px);
    border-radius: 2px;
    margin-left: -3px;
    margin-bottom: 37px;
    bottom: 10px;
    height: 30px;
    z-index: 1;
    pointer-events: none;
  }

  .unavailable::before {
    top: 48px;
    margin-bottom: 28px;
  }

  .more-replies::before {
    content: "...";
    background: unset;
    color: var(--more_replies_dots);
    font-weight: bold;
    font-size: 20px;
    line-height: 0.25em;
    left: 1.2em;
    width: 5px;
    top: 2px;
    margin-bottom: 0;
    margin-left: -2.5px;
  }

  .earlier-replies {
    padding-bottom: 0;
    margin-bottom: -5px;
  }
}

.timeline-item.thread-last::before {
  background: unset;
  min-width: unset;
  width: 0;
  margin: 0;
}

.more-replies {
  padding-top: 0.3em !important;
}

.more-replies-text {
  @include ellipsis;
  display: block;
  margin-left: 58px;
  padding: 7px 0;
}

.timeline-item.thread.more-replies-thread {
  padding: 0 0.75em;

  &::before {
    top: 40px;
    margin-bottom: 31px;
  }

  .more-replies {
    display: flex;
    padding-top: unset !important;
    margin-top: 8px;

    &::before {
      display: inline-block;
      position: relative;
      top: -1px;
      line-height: 0.4em;
    }

    .more-replies-text {
      display: inline;
    }
  }
}
+57 -46
@@ -1,66 +1,77 @@
@import "_variables";
@import "_mixins";

video {
  height: 100%;
  width: 100%;
}

.gallery-video {
  display: flex;
  overflow: hidden;
}

.gallery-video.card-container {
  flex-direction: column;
  width: 100%;
}

.video-container {
  min-height: 80px;
  min-width: 200px;
  max-height: 530px;
  margin: 0;

  img {
    max-height: 100%;
    max-width: 100%;
  }
}

.video-overlay {
  @include play-button;
  background-color: $shadow;

  p {
    position: relative;
    z-index: 0;
    text-align: center;
    top: calc(50% - 20px);
    font-size: 20px;
    line-height: 1.3;
    margin: 0 20px;
  }

  .overlay-circle {
    position: relative;
    z-index: 0;
    top: calc(50% - 20px);
    margin: 0 auto;
    width: 40px;
    height: 40px;
  }

  .overlay-duration {
    position: absolute;
    bottom: 8px;
    left: 8px;
    background-color: #0000007a;
    line-height: 1em;
    padding: 4px 6px 4px 6px;
    border-radius: 5px;
    font-weight: bold;
  }

  form {
    width: 100%;
    height: 100%;
    align-items: center;
    justify-content: center;
    display: flex;
  }

  button {
    padding: 5px 8px;
    font-size: 16px;
  }
}
+62
@@ -0,0 +1,62 @@
import std/[asyncdispatch, base64, httpclient, random, strutils, sequtils, times]
import nimcrypto

import experimental/parser/tid

randomize()

const defaultKeyword = "obfiowerehiring"
const pairsUrl =
  "https://raw.githubusercontent.com/fa0311/x-client-transaction-id-pair-dict/refs/heads/main/pair.json"

var
  cachedPairs: seq[TidPair] = @[]
  lastCached = 0
  # refresh every hour
  ttlSec = 60 * 60

proc getPair(): Future[TidPair] {.async.} =
  if cachedPairs.len == 0 or int(epochTime()) - lastCached > ttlSec:
    lastCached = int(epochTime())

    let client = newAsyncHttpClient()
    defer: client.close()

    let resp = await client.get(pairsUrl)
    if resp.status == $Http200:
      cachedPairs = parseTidPairs(await resp.body)

  return sample(cachedPairs)

proc encodeSha256(text: string): array[32, byte] =
  let
    data = cast[ptr byte](addr text[0])
    dataLen = uint(len(text))
    digest = sha256.digest(data, dataLen)
  return digest.data

proc encodeBase64[T](data: T): string =
  return encode(data).replace("=", "")

proc decodeBase64(data: string): seq[byte] =
  return cast[seq[byte]](decode(data))

proc genTid*(path: string): Future[string] {.async.} =
  let
    pair = await getPair()
    timeNow = int(epochTime() - 1682924400)
    timeNowBytes = @[
      byte(timeNow and 0xff),
      byte((timeNow shr 8) and 0xff),
      byte((timeNow shr 16) and 0xff),
      byte((timeNow shr 24) and 0xff)
    ]
    data = "GET!" & path & "!" & $timeNow & defaultKeyword & pair.animationKey
    hashBytes = encodeSha256(data)
    keyBytes = decodeBase64(pair.verification)
    bytesArr = keyBytes & timeNowBytes & hashBytes[0 ..< 16] & @[3'u8]
    randomNum = byte(rand(256))
    tid = @[randomNum] & bytesArr.mapIt(it xor randomNum)

  return encodeBase64(tid)
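The byte layout `genTid` builds can be sketched in Python to make the structure explicit: a little-endian time delta, a SHA-256 over the request description, and a single-byte XOR mask prepended to the payload. This is a minimal sketch, not the canonical implementation; `verification_key` and `animation_key` stand in for a pair fetched from the JSON dictionary above, and the function name is illustrative.

```python
import base64
import hashlib
import random
import time

def gen_tid(path: str, verification_key: bytes, animation_key: str,
            keyword: str = "obfiowerehiring") -> str:
    # Seconds since the 2023-05-01 epoch offset used above (1682924400)
    now = int(time.time() - 1682924400)
    time_bytes = now.to_bytes(4, "little")

    # Hash the method, path, timestamp, keyword, and animation key together
    data = f"GET!{path}!{now}{keyword}{animation_key}".encode()
    digest = hashlib.sha256(data).digest()

    # key bytes + time bytes + first 16 hash bytes + constant 0x03,
    # all XORed with one random byte that is prepended in the clear
    payload = verification_key + time_bytes + digest[:16] + b"\x03"
    mask = random.randint(0, 255)
    obfuscated = bytes([mask]) + bytes(b ^ mask for b in payload)

    # Base64 without '=' padding, matching encodeBase64 above
    return base64.b64encode(obfuscated).decode().rstrip("=")
```

Because the mask byte is stored unmasked at the front, a receiver can invert the XOR and recover the timestamp and hash, which is what makes the scheme verifiable rather than cryptographic.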
-164
@@ -1,164 +0,0 @@
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, httpclient, times, sequtils, json, random
import strutils, tables
import types, consts

const
  maxConcurrentReqs = 5  # max requests at a time per token, to avoid race conditions
  maxLastUse = 1.hours   # if a token is unused for 60 minutes, it expires
  maxAge = 2.hours + 55.minutes  # tokens expire after 3 hours
  failDelay = initDuration(minutes=30)

var
  tokenPool: seq[Token]
  lastFailed: Time
  enableLogging = false

let headers = newHttpHeaders({"authorization": auth})

template log(str) =
  if enableLogging: echo "[tokens] ", str

proc getPoolJson*(): JsonNode =
  var
    list = newJObject()
    totalReqs = 0
    totalPending = 0
    reqsPerApi: Table[string, int]

  for token in tokenPool:
    totalPending.inc(token.pending)
    list[token.tok] = %*{
      "apis": newJObject(),
      "pending": token.pending,
      "init": $token.init,
      "lastUse": $token.lastUse
    }

    for api in token.apis.keys:
      list[token.tok]["apis"][$api] = %token.apis[api]

      let
        maxReqs =
          case api
          of Api.timeline: 187
          of Api.listMembers, Api.listBySlug, Api.list, Api.listTweets,
             Api.userTweets, Api.userTweetsAndReplies, Api.userMedia,
             Api.userRestId, Api.userScreenName,
             Api.tweetDetail, Api.tweetResult, Api.search: 500
          of Api.userSearch: 900
        reqs = maxReqs - token.apis[api].remaining

      reqsPerApi[$api] = reqsPerApi.getOrDefault($api, 0) + reqs
      totalReqs.inc(reqs)

  return %*{
    "amount": tokenPool.len,
    "requests": totalReqs,
    "pending": totalPending,
    "apis": reqsPerApi,
    "tokens": list
  }

proc rateLimitError*(): ref RateLimitError =
  newException(RateLimitError, "rate limited")

proc fetchToken(): Future[Token] {.async.} =
  if getTime() - lastFailed < failDelay:
    raise rateLimitError()

  let client = newAsyncHttpClient(headers=headers)

  try:
    let
      resp = await client.postContent(activate)
      tokNode = parseJson(resp)["guest_token"]
      tok = tokNode.getStr($(tokNode.getInt))
      time = getTime()

    return Token(tok: tok, init: time, lastUse: time)
  except Exception as e:
    echo "[tokens] fetching token failed: ", e.msg
    if "Try again" notin e.msg:
      echo "[tokens] fetching tokens paused, resuming in 30 minutes"
      lastFailed = getTime()
  finally:
    client.close()

proc expired(token: Token): bool =
  let time = getTime()
  token.init < time - maxAge or token.lastUse < time - maxLastUse

proc isLimited(token: Token; api: Api): bool =
  if token.isNil or token.expired:
    return true

  if api in token.apis:
    let limit = token.apis[api]
    return (limit.remaining <= 10 and limit.reset > epochTime().int)
  else:
    return false

proc isReady(token: Token; api: Api): bool =
  not (token.isNil or token.pending > maxConcurrentReqs or token.isLimited(api))

proc release*(token: Token; used=false; invalid=false) =
  if token.isNil: return
  if invalid or token.expired:
    if invalid: log "discarding invalid token"
    elif token.expired: log "discarding expired token"

    let idx = tokenPool.find(token)
    if idx > -1: tokenPool.delete(idx)
  elif used:
    dec token.pending
    token.lastUse = getTime()

proc getToken*(api: Api): Future[Token] {.async.} =
  for i in 0 ..< tokenPool.len:
    if result.isReady(api): break
    release(result)
    result = tokenPool.sample()

  if not result.isReady(api):
    release(result)
    result = await fetchToken()
    log "added new token to pool"
    tokenPool.add result

  if not result.isNil:
    inc result.pending
  else:
    raise rateLimitError()

proc setRateLimit*(token: Token; api: Api; remaining, reset: int) =
  # avoid undefined behavior in race conditions
  if api in token.apis:
    let limit = token.apis[api]
    if limit.reset >= reset and limit.remaining < remaining:
      return

  token.apis[api] = RateLimit(remaining: remaining, reset: reset)

proc poolTokens*(amount: int) {.async.} =
  var futs: seq[Future[Token]]
  for i in 0 ..< amount:
    futs.add fetchToken()

  for token in futs:
    var newToken: Token

    try: newToken = await token
    except: discard

    if not newToken.isNil:
      log "added new token to pool"
      tokenPool.add newToken

proc initTokenPool*(cfg: Config) {.async.} =
  enableLogging = cfg.enableDebug

  while true:
    if tokenPool.countIt(not it.isLimited(Api.timeline)) < cfg.minTokens:
      await poolTokens(min(4, cfg.minTokens - tokenPool.len))

    await sleepAsync(2000)
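The lifecycle checks in `expired` and `isLimited` above reduce to clock comparisons: a token dies after ~3 hours of age or 60 minutes of disuse, and an API is considered limited while fewer than ~10 requests remain and the reset time is still in the future. A small Python sketch of that logic (the `Token` class and thresholds mirror the Nim code; names are illustrative):

```python
import time
from dataclasses import dataclass, field

MAX_AGE = 2 * 3600 + 55 * 60  # tokens expire after ~3 hours
MAX_LAST_USE = 3600           # unused for 60 minutes -> expired

@dataclass
class Token:
    init: float
    last_use: float
    # per-API rate limits: name -> (remaining, reset_epoch)
    apis: dict = field(default_factory=dict)

    def expired(self, now: float) -> bool:
        return self.init < now - MAX_AGE or self.last_use < now - MAX_LAST_USE

    def is_limited(self, api: str, now: float) -> bool:
        if self.expired(now):
            return True
        if api in self.apis:
            remaining, reset = self.apis[api]
            # limited only while under the threshold and before the reset time
            return remaining <= 10 and reset > now
        return False
```

Once the reset timestamp passes, the token becomes usable again without any explicit state change, which is why the pool only needs to poll these predicates.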
+69 -33
@@ -6,62 +6,76 @@ genPrefsType()
 type
   RateLimitError* = object of CatchableError
+  NoSessionsError* = object of CatchableError
   InternalError* = object of CatchableError
   BadClientError* = object of CatchableError
 
   TimelineKind* {.pure.} = enum
-    tweets
-    replies
-    media
+    tweets, replies, media
 
-  Api* {.pure.} = enum
-    tweetDetail
-    tweetResult
-    timeline
-    search
-    userSearch
-    list
-    listBySlug
-    listMembers
-    listTweets
-    userRestId
-    userScreenName
-    userTweets
-    userTweetsAndReplies
-    userMedia
+  ApiUrl* = object
+    endpoint*: string
+    params*: seq[(string, string)]
+
+  ApiReq* = object
+    oauth*: ApiUrl
+    cookie*: ApiUrl
 
   RateLimit* = object
+    limit*: int
     remaining*: int
     reset*: int
 
-  Token* = ref object
-    tok*: string
-    init*: Time
-    lastUse*: Time
+  SessionKind* = enum
+    oauth
+    cookie
+
+  Session* = ref object
+    id*: int64
+    username*: string
     pending*: int
-    apis*: Table[Api, RateLimit]
+    limited*: bool
+    limitedAt*: int
+    apis*: Table[string, RateLimit]
+    case kind*: SessionKind
+    of oauth:
+      oauthToken*: string
+      oauthSecret*: string
+    of cookie:
+      authToken*: string
+      ct0*: string
 
   Error* = enum
     null = 0
     noUserMatches = 17
     protectedUser = 22
     missingParams = 25
+    timeout = 29
     couldntAuth = 32
     doesntExist = 34
+    unauthorized = 37
     invalidParam = 47
     userNotFound = 50
     suspended = 63
     rateLimited = 88
-    invalidToken = 89
+    expiredToken = 89
     listIdOrSlug = 112
     tweetNotFound = 144
     tweetNotAuthorized = 179
     forbidden = 200
+    badRequest = 214
     badToken = 239
+    locked = 326
     noCsrf = 353
     tweetUnavailable = 421
     tweetCensored = 422
 
+  VerifiedType* = enum
+    none = "None"
+    blue = "Blue"
+    business = "Business"
+    government = "Government"
+
   User* = object
     id*: string
     username*: string
@@ -77,7 +91,7 @@ type
     tweets*: int
     likes*: int
     media*: int
-    verified*: bool
+    verifiedType*: VerifiedType
     protected*: bool
     suspended*: bool
     joinDate*: DateTime
@@ -97,7 +111,6 @@ type
     durationMs*: int
     url*: string
     thumb*: string
-    views*: string
     available*: bool
     reason*: string
     title*: string
@@ -117,13 +130,17 @@ type
     fromUser*: seq[string]
     since*: string
     until*: string
-    near*: string
+    minLikes*: string
     sep*: string
 
   Gif* = object
     url*: string
     thumb*: string
 
+  Photo* = object
+    url*: string
+    altText*: string
+
   GalleryPhoto* = object
     url*: string
     tweetId*: string
@@ -161,9 +178,10 @@ type
     imageDirectMessage = "image_direct_message"
     audiospace = "audiospace"
     newsletterPublication = "newsletter_publication"
+    jobDetails = "job_details"
     hidden
     unknown
 
   Card* = object
     kind*: CardKind
     url*: string
@@ -177,7 +195,7 @@ type
     replies*: int
     retweets*: int
     likes*: int
-    quotes*: int
+    views*: int
 
   Tweet* = ref object
     id*: int64
@@ -203,7 +221,11 @@ type
     poll*: Option[Poll]
     gif*: Option[Gif]
     video*: Option[Video]
-    photos*: seq[string]
+    photos*: seq[Photo]
+    history*: seq[int64]
+    note*: string
+
+  Tweets* = seq[Tweet]
 
   Result*[T] = object
     content*: seq[T]
@@ -212,7 +234,7 @@ type
     query*: Query
 
   Chain* = object
-    content*: seq[Tweet]
+    content*: Tweets
     hasMore*: bool
     cursor*: string
@@ -222,7 +244,11 @@ type
     after*: Chain
     replies*: Result[Chain]
 
-  Timeline* = Result[Tweet]
+  EditHistory* = object
+    latest*: Tweet
+    history*: Tweets
+
+  Timeline* = Result[Tweets]
 
   Profile* = object
     user*: User
@@ -255,10 +281,17 @@ type
     hmacKey*: string
     base64Media*: bool
     minTokens*: int
-    enableRss*: bool
+    enableRSSUserTweets*: bool
+    enableRSSUserReplies*: bool
+    enableRSSUserMedia*: bool
+    enableRSSSearch*: bool
+    enableRSSList*: bool
     enableDebug*: bool
     proxy*: string
     proxyAuth*: string
+    apiProxy*: string
+    disableTid*: bool
+    maxConcurrentReqs*: int
     rssCacheTime*: int
     listCacheTime*: int
@@ -274,3 +307,6 @@ type
 proc contains*(thread: Chain; tweet: Tweet): bool =
   thread.content.anyIt(it.id == tweet.id)
+
+proc add*(timeline: var seq[Tweets]; tweet: Tweet) =
+  timeline.add @[tweet]
+4 -18
@@ -1,7 +1,6 @@
 # SPDX-License-Identifier: AGPL-3.0-only
 import strutils, strformat, uri, tables, base64
 import nimcrypto
-import types
 
 var
   hmacKey: string
@@ -10,14 +9,15 @@ var
 const
   https* = "https://"
   twimg* = "pbs.twimg.com/"
-  nitterParams = ["name", "tab", "id", "list", "referer", "scroll"]
+  nitterParams* = ["name", "tab", "id", "list", "referer", "scroll", "prefs"]
 
   twitterDomains = @[
     "twitter.com",
     "pic.twitter.com",
     "twimg.com",
     "abs.twimg.com",
     "pbs.twimg.com",
-    "video.twimg.com"
+    "video.twimg.com",
+    "x.com"
   ]
 
 proc setHmacKey*(key: string) =
@@ -29,20 +29,6 @@ proc setProxyEncoding*(state: bool) =
 proc getHmac*(data: string): string =
   ($hmac(sha256, hmacKey, data))[0 .. 12]
 
-proc getBestMp4VidVariant(video: Video): VideoVariant =
-  for v in video.variants:
-    if v.bitrate >= result.bitrate:
-      result = v
-
-proc getVidVariant*(video: Video; playbackType: VideoType): VideoVariant =
-  case playbackType
-  of mp4:
-    return video.getBestMp4VidVariant
-  of m3u8, vmap:
-    for variant in video.variants:
-      if variant.contentType == playbackType:
-        return variant
-
 proc getVidUrl*(link: string): string =
   if link.len == 0: return
   let sig = getHmac(link)
@@ -72,4 +58,4 @@ proc isTwitterUrl*(uri: Uri): bool =
   uri.hostname in twitterDomains
 
 proc isTwitterUrl*(url: string): bool =
-  parseUri(url).hostname in twitterDomains
+  isTwitterUrl(parseUri(url))
+1 -1
@@ -11,7 +11,7 @@ const doctype = "<!DOCTYPE html>\n"
 proc renderVideoEmbed*(tweet: Tweet; cfg: Config; req: Request): string =
   let thumb = get(tweet.video).thumb
   let vidUrl = getVideoEmbed(cfg, tweet.id)
-  let prefs = Prefs(hlsPlayback: true)
+  let prefs = Prefs(hlsPlayback: true, mp4Playback: true)
 
   let node = buildHtml(html(lang="en")):
     renderHead(prefs, cfg, req, video=vidUrl, images=(@[thumb]))
+17 -18
@@ -29,19 +29,17 @@ proc renderNavbar(cfg: Config; req: Request; rss, canonical: string): VNode =
     tdiv(class="nav-item right"):
       icon "search", title="Search", href="/search"
-      if cfg.enableRss and rss.len > 0:
-        icon "rss-feed", title="RSS Feed", href=rss
-      icon "bird", title="Open in Twitter", href=canonical
+      if rss.len > 0:
+        icon "rss", title="RSS Feed", href=rss
+      icon "bird", title="Open in X", href=canonical
       a(href="https://liberapay.com/zedeus"): verbatim lp
       icon "info", title="About", href="/about"
       icon "cog", title="Preferences", href=("/settings?referer=" & encodeUrl(path))
 
 proc renderHead*(prefs: Prefs; cfg: Config; req: Request; titleText=""; desc="";
                  video=""; images: seq[string] = @[]; banner=""; ogTitle="";
-                 rss=""; canonical=""): VNode =
-  var theme = prefs.theme.toTheme
-  if "theme" in req.params:
-    theme = req.params["theme"].toTheme
+                 rss=""; alternate=""): VNode =
+  let theme = prefs.theme.toTheme
 
   let ogType =
     if video.len > 0: "video"
@@ -52,8 +50,8 @@ proc renderHead*(prefs: Prefs; cfg: Config; req: Request; titleText=""; desc="";
   let opensearchUrl = getUrlPrefix(cfg) & "/opensearch"
 
   buildHtml(head):
-    link(rel="stylesheet", type="text/css", href="/css/style.css?v=18")
-    link(rel="stylesheet", type="text/css", href="/css/fontello.css?v=2")
+    link(rel="stylesheet", type="text/css", href="/css/style.css?v=28")
+    link(rel="stylesheet", type="text/css", href="/css/fontello.css?v=4")
 
     if theme.len > 0:
       link(rel="stylesheet", type="text/css", href=(&"/css/themes/{theme}.css"))
@@ -66,14 +64,14 @@ proc renderHead*(prefs: Prefs; cfg: Config; req: Request; titleText=""; desc="";
     link(rel="search", type="application/opensearchdescription+xml", title=cfg.title,
         href=opensearchUrl)
 
-    if canonical.len > 0:
-      link(rel="canonical", href=canonical)
+    if alternate.len > 0:
+      link(rel="alternate", href=alternate, title="View on X")
 
-    if cfg.enableRss and rss.len > 0:
+    if rss.len > 0:
       link(rel="alternate", type="application/rss+xml", href=rss, title="RSS feed")
 
     if prefs.hlsPlayback:
-      script(src="/js/hls.light.min.js", `defer`="")
+      script(src="/js/hls.min.js", `defer`="")
       script(src="/js/hlsPlayback.js", `defer`="")
 
     if prefs.infiniteScroll:
@@ -119,20 +117,21 @@ proc renderHead*(prefs: Prefs; cfg: Config; req: Request; titleText=""; desc="";
     # this is last so images are also preloaded
     # if this is done earlier, Chrome only preloads one image for some reason
     link(rel="preload", type="font/woff2", `as`="font",
-         href="/fonts/fontello.woff2?21002321", crossorigin="anonymous")
+         href="/fonts/fontello.woff2?61663884", crossorigin="anonymous")
 
 proc renderMain*(body: VNode; req: Request; cfg: Config; prefs=defaultPrefs;
                  titleText=""; desc=""; ogTitle=""; rss=""; video="";
                  images: seq[string] = @[]; banner=""): string =
-  let canonical = getTwitterLink(req.path, req.params)
+  let twitterLink = getTwitterLink(req.path, req.params)
 
   let node = buildHtml(html(lang="en")):
     renderHead(prefs, cfg, req, titleText, desc, video, images, banner, ogTitle,
-               rss, canonical)
+               rss, twitterLink)
 
-    body:
-      renderNavbar(cfg, req, rss, canonical)
+    let bodyClass = if prefs.stickyNav: "fixed-nav" else: ""
+    body(class=bodyClass):
+      renderNavbar(cfg, req, rss, twitterLink)
       tdiv(class="container"):
         body
+10 -1
@@ -32,7 +32,8 @@ macro renderPrefs*(): untyped =
     result[2].add stmt
 
-proc renderPreferences*(prefs: Prefs; path: string; themes: seq[string]): VNode =
+proc renderPreferences*(prefs: Prefs; path: string; themes: seq[string];
+                        prefsUrl: string): VNode =
   buildHtml(tdiv(class="overlay-panel")):
     fieldset(class="preferences"):
       form(`method`="post", action="/saveprefs", autocomplete="off"):
@@ -40,6 +41,14 @@ proc renderPreferences*(prefs: Prefs; path: string; themes: seq[string];
         renderPrefs()
 
+        legend: text "Bookmark"
+        p(class="bookmark-note"):
+          text "Save this URL to restore your preferences (?prefs works on all pages)"
+        pre(class="prefs-code"):
+          text prefsUrl
+        p(class="bookmark-note"):
+          verbatim "You can override preferences with query parameters (e.g. <code>?hlsPlayback=on</code>). These overrides aren't saved to cookies, and links won't retain the parameters. Intended for configuring RSS feeds and other cookieless environments. Hover over a preference to see its name."
+
       h4(class="note"):
         text "Preferences are stored client-side using cookies without any personal information."
+1
@@ -26,6 +26,7 @@ proc renderUserCard*(user: User; prefs: Prefs): VNode =
       tdiv(class="profile-card-tabs-name"):
         linkUser(user, class="profile-card-fullname")
+        verifiedIcon(user)
         linkUser(user, class="profile-card-username")
 
     tdiv(class="profile-card-extra"):
+25 -10
@@ -23,6 +23,15 @@ proc icon*(icon: string; text=""; title=""; class=""; href=""): VNode =
     if text.len > 0:
       text " " & text
 
+template verifiedIcon*(user: User): untyped {.dirty.} =
+  if user.verifiedType != VerifiedType.none:
+    let lower = ($user.verifiedType).toLowerAscii()
+    buildHtml(tdiv(class=(&"verified-icon {lower}"))):
+      icon "circle", class="verified-icon-circle", title=(&"Verified {lower} account")
+      icon "ok", class="verified-icon-check", title=(&"Verified {lower} account")
+  else:
+    text ""
+
 proc linkUser*(user: User, class=""): VNode =
   let
     isName = "username" notin class
@@ -32,11 +41,10 @@ proc linkUser*(user: User, class=""): VNode =
   buildHtml(a(href=href, class=class, title=nameText)):
     text nameText
 
-    if isName and user.verified:
-      icon "ok", class="verified-icon", title="Verified account"
-    if isName and user.protected:
-      text " "
-      icon "lock", title="Protected account"
+    if isName:
+      if user.protected:
+        text " "
+        icon "lock", title="Protected account"
 
 proc linkText*(text: string; class=""): VNode =
   let url = if "http" notin text: https & text else: text
@@ -57,20 +65,20 @@ proc buttonReferer*(action, text, path: string; class=""; `method`="post"): VNode =
     text text
 
 proc genCheckbox*(pref, label: string; state: bool): VNode =
-  buildHtml(label(class="pref-group checkbox-container")):
+  buildHtml(label(class="pref-group checkbox-container", title=pref)):
     text label
     input(name=pref, `type`="checkbox", checked=state)
     span(class="checkbox")
 
 proc genInput*(pref, label, state, placeholder: string; class=""; autofocus=true): VNode =
   let p = placeholder
-  buildHtml(tdiv(class=("pref-group pref-input " & class))):
+  buildHtml(tdiv(class=("pref-group pref-input " & class), title=pref)):
     if label.len > 0:
       label(`for`=pref): text label
     input(name=pref, `type`="text", placeholder=p, value=state, autofocus=(autofocus and state.len == 0))
 
 proc genSelect*(pref, label, state: string; options: seq[string]): VNode =
-  buildHtml(tdiv(class="pref-group pref-input")):
+  buildHtml(tdiv(class="pref-group pref-input", title=pref)):
     label(`for`=pref): text label
     select(name=pref):
       for opt in options:
@@ -82,9 +90,16 @@ proc genDate*(pref, state: string): VNode =
     input(name=pref, `type`="date", value=state)
     icon "calendar"
 
-proc genImg*(url: string; class=""): VNode =
+proc genNumberInput*(pref, label, state, placeholder: string; class=""; autofocus=true; min="0"): VNode =
+  let p = placeholder
+  buildHtml(tdiv(class=("pref-group pref-input " & class))):
+    if label.len > 0:
+      label(`for`=pref): text label
+    input(name=pref, `type`="number", placeholder=p, value=state, autofocus=(autofocus and state.len == 0), min=min, step="1")
+
+proc genImg*(url: string; class=""; alt=""): VNode =
   buildHtml():
-    img(src=getPicUrl(url), class=class, alt="", loading="lazy", decoding="async")
+    img(src=getPicUrl(url), class=class, alt=alt, loading="lazy")
 
 proc getTabClass*(query: Query; tab: QueryKind): string =
   if query.kind == tab: "tab-item active"
+80 -32
@@ -2,6 +2,9 @@
 ## SPDX-License-Identifier: AGPL-3.0-only
 #import strutils, xmltree, strformat, options, unicode
 #import ../types, ../utils, ../formatters, ../prefs
+## Snowflake ID cutoff for RSS GUID format transition
+## Corresponds to approximately December 14, 2025 UTC
+#const guidCutoff = 2000000000000000000'i64
 #
 #proc getTitle(tweet: Tweet; retweet: string): string =
 #  if tweet.pinned: result = "Pinned: "
@@ -25,24 +28,41 @@
 #end proc
 #
 #proc getDescription(desc: string; cfg: Config): string =
-Twitter feed for: ${desc}. Generated by ${cfg.hostname}
+Twitter feed for: ${desc}. Generated by ${getUrlPrefix(cfg)}
 #end proc
 #
-#proc renderRssTweet(tweet: Tweet; cfg: Config): string =
+#proc getTweetsWithPinned(profile: Profile): seq[Tweets] =
+#  result = profile.tweets.content
+#  if profile.pinned.isSome and result.len > 0:
+#    let pinnedTweet = profile.pinned.get
+#    var inserted = false
+#    for threadIdx in 0 ..< result.len:
+#      if not inserted:
+#        for tweetIdx in 0 ..< result[threadIdx].len:
+#          if result[threadIdx][tweetIdx].id < pinnedTweet.id:
+#            result[threadIdx].insert(pinnedTweet, tweetIdx)
+#            inserted = true
+#          end if
+#        end for
+#      end if
+#    end for
+#  end if
+#end proc
+#
+#proc renderRssTweet(tweet: Tweet; cfg: Config; prefs: Prefs): string =
 #  let tweet = tweet.retweet.get(tweet)
 #  let urlPrefix = getUrlPrefix(cfg)
-#  let text = replaceUrls(tweet.text, defaultPrefs, absolute=urlPrefix)
+#  let text = replaceUrls(tweet.text, prefs, absolute=urlPrefix)
 <p>${text.replace("\n", "<br>\n")}</p>
-#  if tweet.quote.isSome and get(tweet.quote).available:
-#    let quoteLink = getLink(get(tweet.quote))
-<p><a href="${urlPrefix}${quoteLink}">${cfg.hostname}${quoteLink}</a></p>
-#  end if
 #  if tweet.photos.len > 0:
 #    for photo in tweet.photos:
-<img src="${urlPrefix}${getPicUrl(photo)}" style="max-width:250px;" />
+<img src="${urlPrefix}${getPicUrl(photo.url)}" style="max-width:250px;" />
 #    end for
 #  elif tweet.video.isSome:
-<img src="${urlPrefix}${getPicUrl(get(tweet.video).thumb)}" style="max-width:250px;" />
+<a href="${urlPrefix}${tweet.getLink}">
+<br>Video<br>
+<img src="${urlPrefix}${getPicUrl(get(tweet.video).thumb)}" style="max-width:250px;" />
+</a>
 #  elif tweet.gif.isSome:
 #    let thumb = &"{urlPrefix}{getPicUrl(get(tweet.gif).thumb)}"
 #    let url = &"{urlPrefix}{getPicUrl(get(tweet.gif).url)}"
@@ -54,30 +74,57 @@ Twitter feed for: ${desc}. Generated by ${cfg.hostname}
 <img src="${urlPrefix}${getPicUrl(card.image)}" style="max-width:250px;" />
 #    end if
 #  end if
+#  if tweet.note.len > 0 and not prefs.hideCommunityNotes:
+<p><b>Community note:</b> ${replaceUrls(tweet.note, prefs, absolute=urlPrefix)}</p>
+#  end if
+#  if tweet.quote.isSome and get(tweet.quote).available:
+#    let quoteTweet = get(tweet.quote)
+#    let quoteLink = urlPrefix & getLink(quoteTweet)
+<hr/>
+<blockquote>
+<b>${quoteTweet.user.fullname} (@${quoteTweet.user.username})</b>
+<p>
+${renderRssTweet(quoteTweet, cfg, prefs)}
+</p>
+<footer>
+— <cite><a href="${quoteLink}">${quoteLink}</a>
+</footer>
+</blockquote>
+#  end if
 #end proc
 #
-#proc renderRssTweets(tweets: seq[Tweet]; cfg: Config): string =
+#proc renderRssTweets(tweets: seq[Tweets]; cfg: Config; prefs: Prefs; userId=""): string =
 #  let urlPrefix = getUrlPrefix(cfg)
 #  var links: seq[string]
-#  for t in tweets:
-#    let retweet = if t.retweet.isSome: t.user.username else: ""
-#    let tweet = if retweet.len > 0: t.retweet.get else: t
-#    let link = getLink(tweet)
-#    if link in links: continue
-#    end if
-#    links.add link
-<item>
-<title>${getTitle(tweet, retweet)}</title>
-<dc:creator>@${tweet.user.username}</dc:creator>
-<description><![CDATA[${renderRssTweet(tweet, cfg).strip(chars={'\n'})}]]></description>
-<pubDate>${getRfc822Time(tweet)}</pubDate>
-<guid>${urlPrefix & link}</guid>
-<link>${urlPrefix & link}</link>
-</item>
+#  for thread in tweets:
+#    for tweet in thread:
+#      if userId.len > 0 and tweet.user.id != userId: continue
+#      end if
+#
+#      let retweet = if tweet.retweet.isSome: tweet.user.username else: ""
+#      let tweet = if retweet.len > 0: tweet.retweet.get else: tweet
+#      let link = getLink(tweet)
+#      if link in links: continue
+#      end if
+#      links.add link
+#      let useGlobalGuid = tweet.id >= guidCutoff
+<item>
+<title>${getTitle(tweet, retweet)}</title>
+<dc:creator>@${tweet.user.username}</dc:creator>
+<description><![CDATA[${renderRssTweet(tweet, cfg, prefs).strip(chars={'\n'})}]]></description>
+<pubDate>${getRfc822Time(tweet)}</pubDate>
+#      if useGlobalGuid:
+<guid isPermaLink="false">${tweet.id}</guid>
+#      else:
+<guid>${urlPrefix & link}</guid>
+#      end if
+<link>${urlPrefix & link}</link>
+</item>
+#    end for
 #  end for
 #end proc
 #
-#proc renderTimelineRss*(profile: Profile; cfg: Config; multi=false): string =
+#proc renderTimelineRss*(profile: Profile; cfg: Config; prefs: Prefs; multi=false): string =
 #  let urlPrefix = getUrlPrefix(cfg)
 #  result = ""
 #  let handle = (if multi: "" else: "@") & profile.user.username
@@ -101,14 +148,15 @@ Twitter feed for: ${desc}. Generated by ${cfg.hostname}
 <width>128</width>
 <height>128</height>
 </image>
-#  if profile.tweets.content.len > 0:
-${renderRssTweets(profile.tweets.content, cfg)}
+#  let tweetsList = getTweetsWithPinned(profile)
+#  if tweetsList.len > 0:
+${renderRssTweets(tweetsList, cfg, prefs, userId=profile.user.id)}
 #  end if
 </channel>
 </rss>
 #end proc
 #
-#proc renderListRss*(tweets: seq[Tweet]; list: List; cfg: Config): string =
+#proc renderListRss*(tweets: seq[Tweets]; list: List; cfg: Config; prefs: Prefs): string =
 #  let link = &"{getUrlPrefix(cfg)}/i/lists/{list.id}"
 #  result = ""
 <?xml version="1.0" encoding="UTF-8"?>
@@ -120,12 +168,12 @@ ${renderRssTweets(profile.tweets.content, cfg)}
 <description>${getDescription(&"{list.name} by @{list.username}", cfg)}</description>
 <language>en-us</language>
 <ttl>40</ttl>
-${renderRssTweets(tweets, cfg)}
+${renderRssTweets(tweets, cfg, prefs)}
 </channel>
 </rss>
 #end proc
 #
-#proc renderSearchRss*(tweets: seq[Tweet]; name, param: string; cfg: Config): string =
+#proc renderSearchRss*(tweets: seq[Tweets]; name, param: string; cfg: Config; prefs: Prefs): string =
 #  let link = &"{getUrlPrefix(cfg)}/search"
 #  let escName = xmltree.escape(name)
 #  result = ""
@@ -138,7 +186,7 @@ ${renderRssTweets(tweets, cfg)}
 <description>${getDescription(&"Search \"{escName}\"", cfg)}</description>
 <language>en-us</language>
 <ttl>40</ttl>
-${renderRssTweets(tweets, cfg)}
+${renderRssTweets(tweets, cfg, prefs)}
 </channel>
 </rss>
 #end proc
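The `guidCutoff` constant above is a snowflake ID: the top bits encode milliseconds since the Twitter epoch (1288834974657, i.e. 2010-11-04), obtained by shifting the ID right by 22 bits. A quick Python check that the cutoff really lands around mid-December 2025, as the comment says:

```python
from datetime import datetime, timezone

TWITTER_EPOCH_MS = 1288834974657  # snowflake epoch, 2010-11-04 UTC
GUID_CUTOFF = 2_000_000_000_000_000_000

def snowflake_to_datetime(snowflake_id: int) -> datetime:
    # The timestamp occupies the bits above the 22 low bits
    # (10 machine-id bits + 12 sequence bits).
    ms = (snowflake_id >> 22) + TWITTER_EPOCH_MS
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
```

Tweets with IDs at or above the cutoff get a stable `<guid isPermaLink="false">` based on the tweet ID, so feed readers don't see duplicates when an instance's hostname changes.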
+7 -9
@@ -10,23 +10,21 @@ const toggles = {
   "media": "Media",
   "videos": "Videos",
   "news": "News",
-  "verified": "Verified",
   "native_video": "Native videos",
   "replies": "Replies",
   "links": "Links",
   "images": "Images",
-  "safe": "Safe",
   "quote": "Quotes",
-  "pro_video": "Pro videos"
+  "spaces": "Spaces"
 }.toOrderedTable
 
 proc renderSearch*(): VNode =
   buildHtml(tdiv(class="panel-container")):
     tdiv(class="search-bar"):
       form(`method`="get", action="/search", autocomplete="off"):
-        hiddenField("f", "users")
+        hiddenField("f", "tweets")
         input(`type`="text", name="q", autofocus="",
-              placeholder="Enter username...", dir="auto")
+              placeholder="Search...", dir="auto")
         button(`type`="submit"): icon "search"
 
 proc renderProfileTabs*(query: Query; username: string): VNode =
@@ -53,7 +51,7 @@ proc renderSearchTabs*(query: Query): VNode =
 proc isPanelOpen(q: Query): bool =
   q.fromUser.len == 0 and (q.filters.len > 0 or q.excludes.len > 0 or
-    @[q.near, q.until, q.since].anyIt(it.len > 0))
+    @[q.minLikes, q.until, q.since].anyIt(it.len > 0))
 
 proc renderSearchPanel*(query: Query): VNode =
   let user = query.fromUser.join(",")
@@ -85,10 +83,10 @@ proc renderSearchPanel*(query: Query): VNode =
         span(class="search-title"): text "-"
         genDate("until", query.until)
       tdiv:
-        span(class="search-title"): text "Near"
-        genInput("near", "", query.near, "Location...", autofocus=false)
+        span(class="search-title"): text "Minimum likes"
+        genNumberInput("min_faves", "", query.minLikes, "Number...", autofocus=false)
 
-proc renderTweetSearch*(results: Result[Tweet]; prefs: Prefs; path: string;
+proc renderTweetSearch*(results: Timeline; prefs: Prefs; path: string;
                         pinned=none(Tweet)): VNode =
   let query = results.query
   buildHtml(tdiv(class="timeline-container")):
+23 -4
@@ -28,14 +28,19 @@ proc renderReplyThread(thread: Chain; prefs: Prefs; path: string): VNode =
     if thread.hasMore:
       renderMoreReplies(thread)

-proc renderReplies*(replies: Result[Chain]; prefs: Prefs; path: string): VNode =
+proc renderReplies*(replies: Result[Chain]; prefs: Prefs; path: string; tweet: Tweet = nil): VNode =
   buildHtml(tdiv(class="replies", id="r")):
+    var hasReplies = false
+    var replyCount = 0
     for thread in replies.content:
       if thread.content.len == 0: continue
+      hasReplies = true
+      replyCount += thread.content.len
       renderReplyThread(thread, prefs, path)
-    if replies.bottom.len > 0:
-      renderMore(Query(), replies.bottom, focus="#r")
+    if hasReplies and replies.bottom.len > 0:
+      if tweet == nil or not replies.beginning or replyCount < tweet.stats.replies:
+        renderMore(Query(), replies.bottom, focus="#r")

 proc renderConversation*(conv: Conversation; prefs: Prefs; path: string): VNode =
   let hasAfter = conv.after.content.len > 0
@@ -70,6 +75,20 @@ proc renderConversation*(conv: Conversation; prefs: Prefs; path: string): VNode
       if not conv.replies.beginning:
         renderNewer(Query(), getLink(conv.tweet), focus="#r")
       if conv.replies.content.len > 0 or conv.replies.bottom.len > 0:
-        renderReplies(conv.replies, prefs, path)
+        renderReplies(conv.replies, prefs, path, conv.tweet)
       renderToTop(focus="#m")
+
+proc renderEditHistory*(edits: EditHistory; prefs: Prefs; path: string): VNode =
+  buildHtml(tdiv(class="edit-history")):
+    tdiv(class="latest-edit"):
+      tdiv(class="edit-history-header"):
+        text "Latest post"
+      renderTweet(edits.latest, prefs, path)
+    tdiv(class="previous-edits"):
+      tdiv(class="edit-history-header"):
+        text "Version history"
+      for tweet in edits.history:
+        tdiv(class="tweet-edit"):
+          renderTweet(tweet, prefs, path)
+28 -31
@@ -1,5 +1,5 @@
 # SPDX-License-Identifier: AGPL-3.0-only
-import strutils, strformat, sequtils, algorithm, uri, options
+import strutils, strformat, algorithm, uri, options
 import karax/[karaxdsl, vdom]

 import ".."/[types, query, formatters]
@@ -39,26 +39,24 @@ proc renderNoneFound(): VNode =
   h2(class="timeline-none"):
     text "No items found"

-proc renderThread(thread: seq[Tweet]; prefs: Prefs; path: string): VNode =
+proc renderThread(thread: Tweets; prefs: Prefs; path: string): VNode =
   buildHtml(tdiv(class="thread-line")):
     let sortedThread = thread.sortedByIt(it.id)
     for i, tweet in sortedThread:
+      # thread has a gap, display "more replies" link
+      if i > 0 and tweet.replyId != sortedThread[i - 1].id:
+        tdiv(class="timeline-item thread more-replies-thread"):
+          tdiv(class="more-replies"):
+            a(class="more-replies-text", href=getLink(tweet)):
+              text "more replies"
       let show = i == thread.high and sortedThread[0].id != tweet.threadId
       let header = if tweet.pinned or tweet.retweet.isSome: "with-header " else: ""
       renderTweet(tweet, prefs, path, class=(header & "thread"),
-                  index=i, last=(i == thread.high), showThread=show)
+                  index=i, last=(i == thread.high))

-proc threadFilter(tweets: openArray[Tweet]; threads: openArray[int64]; it: Tweet): seq[Tweet] =
-  result = @[it]
-  if it.retweet.isSome or it.replyId in threads: return
-  for t in tweets:
-    if t.id == result[0].replyId:
-      result.insert t
-    elif t.replyId == result[0].id:
-      result.add t
-
 proc renderUser(user: User; prefs: Prefs): VNode =
-  buildHtml(tdiv(class="timeline-item")):
+  buildHtml(tdiv(class="timeline-item", data-username=user.username)):
     a(class="tweet-link", href=("/" & user.username))
     tdiv(class="tweet-body profile-result"):
       tdiv(class="tweet-header"):
@@ -68,6 +66,7 @@ proc renderUser(user: User; prefs: Prefs): VNode =
         tdiv(class="tweet-name-row"):
           tdiv(class="fullname-and-username"):
             linkUser(user, class="fullname")
+            verifiedIcon(user)
             linkUser(user, class="username")

       tdiv(class="tweet-content media-body", dir="auto"):
@@ -89,7 +88,7 @@ proc renderTimelineUsers*(results: Result[User]; prefs: Prefs; path=""): VNode =
   else:
     renderNoMore()

-proc renderTimelineTweets*(results: Result[Tweet]; prefs: Prefs; path: string;
+proc renderTimelineTweets*(results: Timeline; prefs: Prefs; path: string;
                            pinned=none(Tweet)): VNode =
   buildHtml(tdiv(class="timeline")):
     if not results.beginning:
@@ -97,7 +96,7 @@ proc renderTimelineTweets*(results: Timeline; prefs: Prefs; path: string;
     if not prefs.hidePins and pinned.isSome:
       let tweet = get pinned
-      renderTweet(tweet, prefs, path, showThread=tweet.hasThread)
+      renderTweet(tweet, prefs, path)

     if results.content.len == 0:
       if not results.beginning:
@@ -105,26 +104,24 @@ proc renderTimelineTweets*(results: Timeline; prefs: Prefs; path: string;
       else:
         renderNoneFound()
     else:
-      var
-        threads: seq[int64]
-        retweets: seq[int64]
+      var retweets: seq[int64]

-      for tweet in results.content:
-        let rt = if tweet.retweet.isSome: get(tweet.retweet).id else: 0
-
-        if tweet.id in threads or rt in retweets or tweet.id in retweets or
-           tweet.pinned and prefs.hidePins: continue
-
-        let thread = results.content.threadFilter(threads, tweet)
-        if thread.len < 2:
-          var hasThread = tweet.hasThread
-          if rt != 0:
-            retweets &= rt
-            hasThread = get(tweet.retweet).hasThread
-          renderTweet(tweet, prefs, path, showThread=hasThread)
+      for thread in results.content:
+        if thread.len == 1:
+          let
+            tweet = thread[0]
+            retweetId = if tweet.retweet.isSome: get(tweet.retweet).id else: 0
+
+          if retweetId in retweets or tweet.id in retweets or
+             tweet.pinned and prefs.hidePins:
+            continue
+
+          if retweetId != 0 and tweet.retweet.isSome:
+            retweets &= retweetId
+          renderTweet(tweet, prefs, path)
         else:
           renderThread(thread, prefs, path)
-          threads &= thread.mapIt(it.id)

-      renderMore(results.query, results.bottom)
+      if results.bottom.len > 0:
+        renderMore(results.query, results.bottom)
     renderToTop()
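The rewritten renderTimelineTweets receives tweets already grouped into threads, so the old threadFilter pass is gone; single-tweet threads are deduplicated against previously shown retweets, and longer threads render as-is. A Python sketch of that filtering (dict fields and names are illustrative, not nitter's types):

```python
def dedup_timeline(threads, hide_pins=False):
    """Mirror the retweet/pin filtering in the new renderTimelineTweets:
    a single-tweet thread is skipped when its id (or the id of the tweet
    it retweets) was already rendered; multi-tweet threads render fully."""
    seen_retweets = set()
    rendered = []
    for thread in threads:
        if len(thread) == 1:
            t = thread[0]
            rt = t.get("retweet_of", 0)
            if rt in seen_retweets or t["id"] in seen_retweets:
                continue  # same underlying tweet was already shown
            if t.get("pinned") and hide_pins:
                continue
            if rt:
                seen_retweets.add(rt)
            rendered.append(t["id"])
        else:
            rendered.extend(t["id"] for t in thread)
    return rendered

threads = [
    [{"id": 10, "retweet_of": 5}],
    [{"id": 11, "retweet_of": 5}],   # same tweet retweeted again: skipped
    [{"id": 1}, {"id": 2}],          # a thread renders fully
]
assert dedup_timeline(threads) == [10, 1, 2]
```

Grouping upstream is what lets the render loop drop the quadratic threadFilter scan over the whole timeline.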
+83 -53
@@ -1,5 +1,5 @@
 # SPDX-License-Identifier: AGPL-3.0-only
-import strutils, sequtils, strformat, options
+import strutils, sequtils, strformat, options, algorithm
 import karax/[karaxdsl, vdom, vstyles]
 from jester import Request
@@ -10,19 +10,16 @@ import general
 const doctype = "<!DOCTYPE html>\n"

 proc renderMiniAvatar(user: User; prefs: Prefs): VNode =
-  let url = getPicUrl(user.getUserPic("_mini"))
-  buildHtml():
-    img(class=(prefs.getAvatarClass & " mini"), src=url, loading="lazy")
+  genImg(user.getUserPic("_mini"), class=(prefs.getAvatarClass & " mini"))

-proc renderHeader(tweet: Tweet; retweet: string; prefs: Prefs): VNode =
+proc renderHeader(tweet: Tweet; retweet: string; pinned: bool; prefs: Prefs): VNode =
   buildHtml(tdiv):
-    if retweet.len > 0:
-      tdiv(class="retweet-header"):
-        span: icon "retweet", retweet & " retweeted"
-    if tweet.pinned:
+    if pinned:
       tdiv(class="pinned"):
         span: icon "pin", "Pinned Tweet"
+    elif retweet.len > 0:
+      tdiv(class="retweet-header"):
+        span: icon "retweet", retweet & " retweeted"

     tdiv(class="tweet-header"):
       a(class="tweet-avatar", href=("/" & tweet.user.username)):
@@ -34,6 +31,7 @@ proc renderHeader(tweet: Tweet; retweet: string; pinned: bool; prefs: Prefs): VNode =
       tdiv(class="tweet-name-row"):
         tdiv(class="fullname-and-username"):
           linkUser(tweet.user, class="fullname")
+          verifiedIcon(tweet.user)
           linkUser(tweet.user, class="username")

         span(class="tweet-date"):
@@ -52,10 +50,12 @@ proc renderAlbum(tweet: Tweet): VNode =
for photo in photos: for photo in photos:
tdiv(class="attachment image"): tdiv(class="attachment image"):
let let
named = "name=" in photo named = "name=" in photo.url
small = if named: photo else: photo & smallWebp small = if named: photo.url else: photo.url & smallWebp
a(href=getOrigPicUrl(photo), class="still-image", target="_blank"): a(href=getOrigPicUrl(photo.url), class="still-image", target="_blank"):
genImg(small) genImg(small, alt=photo.altText)
if photo.altText.len > 0:
p(class="alt-text"): text "ALT " & photo.altText
 proc isPlaybackEnabled(prefs: Prefs; playbackType: VideoType): bool =
   case playbackType
@@ -85,35 +85,35 @@ proc renderVideo*(video: Video; prefs: Prefs; path: string): VNode =
   let
     container = if video.description.len == 0 and video.title.len == 0: ""
                 else: " card-container"
-    playbackType = if prefs.proxyVideos and video.hasMp4Url: mp4
+    playbackType = if not prefs.proxyVideos and video.hasMp4Url: mp4
                    else: video.playbackType

   buildHtml(tdiv(class="attachments card")):
     tdiv(class="gallery-video" & container):
       tdiv(class="attachment video-container"):
         let thumb = getSmallPic(video.thumb)
-        let canPlay = prefs.isPlaybackEnabled(playbackType)
-        if video.available and canPlay:
+        if not video.available:
+          img(src=thumb, loading="lazy")
+          renderVideoUnavailable(video)
+        elif not prefs.isPlaybackEnabled(playbackType):
+          img(src=thumb, loading="lazy")
+          renderVideoDisabled(playbackType, path)
+        else:
           let
-            vidUrl = video.getVidVariant(playbackType).url
+            vars = video.variants.filterIt(it.contentType == playbackType)
+            vidUrl = vars.sortedByIt(it.resolution)[^1].url
             source = if prefs.proxyVideos: getVidUrl(vidUrl)
                      else: vidUrl
           case playbackType
           of mp4:
-            video(src=source, poster=thumb, controls="", muted=prefs.muteVideos, preload="metadata")
+            video(poster=thumb, controls="", muted=prefs.muteVideos):
+              source(src=source, `type`="video/mp4")
           of m3u8, vmap:
             video(poster=thumb, data-url=source, data-autoload="false", muted=prefs.muteVideos)
             verbatim "<div class=\"video-overlay\" onclick=\"playVideo(this)\">"
             tdiv(class="overlay-circle"): span(class="overlay-triangle")
+            tdiv(class="overlay-duration"): text getDuration(video)
             verbatim "</div>"
-        else:
-          img(src=thumb, loading="lazy", decoding="async")
-          if not canPlay:
-            renderVideoDisabled(playbackType, path)
-          else:
-            renderVideoUnavailable(video)
     if container.len > 0:
       tdiv(class="card-content"):
         h2(class="card-title"): text video.title
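The new renderVideo drops the getVidVariant helper: it filters the variant list by the chosen content type and plays the highest-resolution match. A Python sketch of the selection (field names are assumptions for illustration):

```python
def pick_variant(variants, content_type):
    """Filter variants by content type and pick the highest resolution,
    mirroring filterIt + sortedByIt(it.resolution)[^1] in renderVideo."""
    matching = [v for v in variants if v["contentType"] == content_type]
    if not matching:
        return None
    return max(matching, key=lambda v: v["resolution"])

variants = [
    {"contentType": "mp4", "resolution": 320, "url": "a.mp4"},
    {"contentType": "mp4", "resolution": 720, "url": "b.mp4"},
    {"contentType": "m3u8", "resolution": 0, "url": "c.m3u8"},
]
assert pick_variant(variants, "mp4")["url"] == "b.mp4"
```

Note the Nim code indexes `[^1]` after an ascending sort, i.e. the last (largest-resolution) element, which `max` expresses directly here.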
@@ -146,7 +146,7 @@ proc renderPoll(poll: Poll): VNode =
 proc renderCardImage(card: Card): VNode =
   buildHtml(tdiv(class="card-image-container")):
     tdiv(class="card-image"):
-      img(src=getPicUrl(card.image), alt="", loading="lazy")
+      genImg(card.image)
     if card.kind == player:
       tdiv(class="card-overlay"):
         tdiv(class="overlay-circle"):
@@ -182,14 +182,12 @@ func formatStat(stat: int): string =
   if stat > 0: insertSep($stat, ',')
   else: ""

-proc renderStats(stats: TweetStats; views: string): VNode =
+proc renderStats(stats: TweetStats): VNode =
   buildHtml(tdiv(class="tweet-stats")):
     span(class="tweet-stat"): icon "comment", formatStat(stats.replies)
     span(class="tweet-stat"): icon "retweet", formatStat(stats.retweets)
+    span(class="tweet-stat"): icon "quote", formatStat(stats.quotes)
     span(class="tweet-stat"): icon "heart", formatStat(stats.likes)
-    if views.len > 0:
-      span(class="tweet-stat"): icon "play", insertSep(views, ',')
+    span(class="tweet-stat"): icon "views", formatStat(stats.views)

 proc renderReply(tweet: Tweet): VNode =
   buildHtml(tdiv(class="replying-to")):
@@ -202,8 +200,7 @@ proc renderAttribution(user: User; prefs: Prefs): VNode =
   buildHtml(a(class="attribution", href=("/" & user.username))):
     renderMiniAvatar(user, prefs)
     strong: text user.fullname
-    if user.verified:
-      icon "ok", class="verified-icon", title="Verified account"
+    verifiedIcon(user)

 proc renderMediaTags(tags: seq[User]): VNode =
   buildHtml(tdiv(class="media-tag-block")):
@@ -214,6 +211,12 @@ proc renderMediaTags(tags: seq[User]): VNode =
       if i < tags.high:
         text ", "

+proc renderLatestPost(username: string; id: int64): VNode =
+  buildHtml(tdiv(class="latest-post-version")):
+    text "There's a new version of this post. "
+    a(href=getLink(id, username)):
+      text "See the latest post"
proc renderQuoteMedia(quote: Tweet; prefs: Prefs; path: string): VNode = proc renderQuoteMedia(quote: Tweet; prefs: Prefs; path: string): VNode =
buildHtml(tdiv(class="quote-media-container")): buildHtml(tdiv(class="quote-media-container")):
if quote.photos.len > 0: if quote.photos.len > 0:
@@ -223,10 +226,18 @@ proc renderQuoteMedia(quote: Tweet; prefs: Prefs; path: string): VNode =
elif quote.gif.isSome: elif quote.gif.isSome:
renderGif(quote.gif.get(), prefs) renderGif(quote.gif.get(), prefs)
proc renderCommunityNote(note: string; prefs: Prefs): VNode =
buildHtml(tdiv(class="community-note")):
tdiv(class="community-note-header"):
icon "group"
span: text "Community note"
tdiv(class="community-note-text", dir="auto"):
verbatim replaceUrls(note, prefs)
proc renderQuote(quote: Tweet; prefs: Prefs; path: string): VNode = proc renderQuote(quote: Tweet; prefs: Prefs; path: string): VNode =
if not quote.available: if not quote.available:
return buildHtml(tdiv(class="quote unavailable")): return buildHtml(tdiv(class="quote unavailable")):
tdiv(class="unavailable-quote"): a(class="unavailable-quote", href=getLink(quote, focus=false)):
if quote.tombstone.len > 0: if quote.tombstone.len > 0:
text quote.tombstone text quote.tombstone
elif quote.text.len > 0: elif quote.text.len > 0:
@@ -241,6 +252,7 @@ proc renderQuote(quote: Tweet; prefs: Prefs; path: string): VNode =
tdiv(class="fullname-and-username"): tdiv(class="fullname-and-username"):
renderMiniAvatar(quote.user, prefs) renderMiniAvatar(quote.user, prefs)
linkUser(quote.user, class="fullname") linkUser(quote.user, class="fullname")
verifiedIcon(quote.user)
linkUser(quote.user, class="username") linkUser(quote.user, class="username")
span(class="tweet-date"): span(class="tweet-date"):
@@ -254,12 +266,19 @@ proc renderQuote(quote: Tweet; prefs: Prefs; path: string): VNode =
tdiv(class="quote-text", dir="auto"): tdiv(class="quote-text", dir="auto"):
verbatim replaceUrls(quote.text, prefs) verbatim replaceUrls(quote.text, prefs)
if quote.photos.len > 0 or quote.video.isSome or quote.gif.isSome:
renderQuoteMedia(quote, prefs, path)
if quote.note.len > 0 and not prefs.hideCommunityNotes:
renderCommunityNote(quote.note, prefs)
if quote.hasThread: if quote.hasThread:
a(class="show-thread", href=getLink(quote)): a(class="show-thread", href=getLink(quote)):
text "Show this thread" text "Show this thread"
if quote.photos.len > 0 or quote.video.isSome or quote.gif.isSome: if quote.history.len > 0 and quote.id != max(quote.history):
renderQuoteMedia(quote, prefs, path) tdiv(class="quote-latest"):
text "There's a new version of this post"
 proc renderLocation*(tweet: Tweet): string =
   let (place, url) = tweet.getLocation()
@@ -273,14 +292,14 @@ proc renderLocation*(tweet: Tweet): string =
   return $node

 proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
-                  last=false; showThread=false; mainTweet=false; afterTweet=false): VNode =
+                  last=false; mainTweet=false; afterTweet=false): VNode =
   var divClass = class
   if index == -1 or last:
     divClass = "thread-last " & class

   if not tweet.available:
-    return buildHtml(tdiv(class=divClass & "unavailable timeline-item")):
-      tdiv(class="unavailable-box"):
+    return buildHtml(tdiv(class=divClass & "unavailable timeline-item", data-username=tweet.user.username)):
+      a(class="unavailable-box", href=getLink(tweet)):
         if tweet.tombstone.len > 0:
           text tweet.tombstone
         elif tweet.text.len > 0:
@@ -291,23 +310,25 @@ proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
       if tweet.quote.isSome:
         renderQuote(tweet.quote.get(), prefs, path)

-  let fullTweet = tweet
+  let
+    fullTweet = tweet
+    pinned = tweet.pinned

   var retweet: string
   var tweet = fullTweet
   if tweet.retweet.isSome:
     tweet = tweet.retweet.get
     retweet = fullTweet.user.fullname

-  buildHtml(tdiv(class=("timeline-item " & divClass))):
+  buildHtml(tdiv(class=("timeline-item " & divClass), data-username=tweet.user.username)):
     if not mainTweet:
       a(class="tweet-link", href=getLink(tweet))

     tdiv(class="tweet-body"):
-      var views = ""
-      renderHeader(tweet, retweet, prefs)
+      renderHeader(tweet, retweet, pinned, prefs)

       if not afterTweet and index == 0 and tweet.reply.len > 0 and
-         (tweet.reply.len > 1 or tweet.reply[0] != tweet.user.username):
+         (tweet.reply.len > 1 or tweet.reply[0] != tweet.user.username or pinned):
         renderReply(tweet)

       var tweetClass = "tweet-content media-body"
@@ -327,10 +348,8 @@ proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
           renderAlbum(tweet)
         elif tweet.video.isSome:
           renderVideo(tweet.video.get(), prefs, path)
-          views = tweet.video.get().views
         elif tweet.gif.isSome:
           renderGif(tweet.gif.get(), prefs)
-          views = "GIF"

       if tweet.poll.isSome:
         renderPoll(tweet.poll.get())
@@ -338,18 +357,29 @@ proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
       if tweet.quote.isSome:
         renderQuote(tweet.quote.get(), prefs, path)

+      if tweet.note.len > 0 and not prefs.hideCommunityNotes:
+        renderCommunityNote(tweet.note, prefs)
+
+      let
+        hasEdits = tweet.history.len > 1
+        isLatest = hasEdits and tweet.id == max(tweet.history)
+
       if mainTweet:
-        p(class="tweet-published"): text &"{getTime(tweet)}"
+        p(class="tweet-published"):
+          if hasEdits and isLatest:
+            a(href=(getLink(tweet, focus=false) & "/history")):
+              text &"Last edited {getTime(tweet)}"
+          else:
+            text &"{getTime(tweet)}"
+
+      if hasEdits and not isLatest:
+        renderLatestPost(tweet.user.username, max(tweet.history))

       if tweet.mediaTags.len > 0:
         renderMediaTags(tweet.mediaTags)

       if not prefs.hideTweetStats:
-        renderStats(tweet.stats, views)
+        renderStats(tweet.stats)

-      if showThread:
-        a(class="show-thread", href=("/i/status/" & $tweet.threadId)):
-          text "Show this thread"

 proc renderTweetEmbed*(tweet: Tweet; path: string; prefs: Prefs; cfg: Config; req: Request): string =
   let node = buildHtml(html(lang="en")):
+1716
File diff suppressed because it is too large
+2
@@ -0,0 +1,2 @@
+[virtualenvs]
+in-project = true
+8
@@ -0,0 +1,8 @@
+[tool.poetry]
+name = "nitter-tests"
+version = "0.0.0"
+package-mode = false
+
+[tool.poetry.dependencies]
+python = "^3.14"
+seleniumbase = "4.46.5"
+1 -1
@@ -1 +1 @@
-seleniumbase
+seleniumbase==4.46.5
+12 -38
@@ -11,34 +11,29 @@ card = [
     ['voidtarget/status/1094632512926605312',
      'Basic OBS Studio plugin, written in nim, supporting C++ (C fine too)',
      'Basic OBS Studio plugin, written in nim, supporting C++ (C fine too) - obsplugin.nim',
-     'gist.github.com', True],
-    ['FluentAI/status/1116417904831029248',
-     'Amazons Alexa isnt just AI — thousands of humans are listening',
-     'One of the only ways to improve Alexa is to have human beings check it for errors',
-     'theverge.com', True]
+     'gist.github.com', True]
 ]

 no_thumb = [
-    ['FluentAI/status/1116417904831029248',
-     'LinkedIn',
-     'This link will take you to a page thats not on LinkedIn',
-     'lnkd.in'],
     ['Thom_Wolf/status/1122466524860702729',
-     'facebookresearch/fairseq',
-     'Facebook AI Research Sequence-to-Sequence Toolkit written in Python. - GitHub - facebookresearch/fairseq: Facebook AI Research Sequence-to-Sequence Toolkit written in Python.',
+     'GitHub - facebookresearch/fairseq: Facebook AI Research Sequence-to-Sequence Toolkit written in',
+     '',
      'github.com'],
     ['brent_p/status/1088857328680488961',
-     'Hts Nim Sugar',
-     'hts-nim is a library that allows one to use htslib via the nim programming language. Nim is a garbage-collected language that compiles to C and often has similar performance. I have become very...',
-     'brentp.github.io'],
+     'GitHub - brentp/hts-nim: nim wrapper for htslib for parsing genomics data files',
+     '',
+     'github.com'],
     ['voidtarget/status/1133028231672582145',
      'sinkingsugar/nimqt-example',
      'A sample of a Qt app written using mostly nim. Contribute to sinkingsugar/nimqt-example development by creating an account on GitHub.',
-     'github.com'],
-    ['nim_lang/status/1082989146040340480',
-     'Nim in 2018: A short recap',
-     'Posted by u/miran1 - 36 votes and 46 comments',
-     'reddit.com']
+     'github.com']
 ]

 playable = [
@@ -53,17 +48,6 @@ playable = [
      'youtube.com']
 ]

-# promo = [
-#   ['BangOlufsen/status/1145698701517754368',
-#    'Upgrade your journey', '',
-#    'www.bang-olufsen.com'],
-#   ['BangOlufsen/status/1154934429900406784',
-#    'Learn more about Beosound Shape', '',
-#    'www.bang-olufsen.com']
-# ]
-
 class CardTest(BaseTestCase):
     @parameterized.expand(card)
     def test_card(self, tweet, title, description, destination, large):
@@ -98,13 +82,3 @@ class CardTest(BaseTestCase):
         self.assert_element_visible('.card-overlay')
         if len(description) > 0:
             self.assert_text(description, c.description)
-
-#    @parameterized.expand(promo)
-#    def test_card_promo(self, tweet, title, description, destination):
-#        self.open_nitter(tweet)
-#        c = Card(Conversation.main + " ")
-#        self.assert_text(title, c.title)
-#        self.assert_text(destination, c.destination)
-#        self.assert_element_visible('.video-overlay')
-#        if len(description) > 0:
-#            self.assert_text(description, c.description)
+23 -4
@@ -4,7 +4,7 @@ from parameterized import parameterized
 profiles = [
     ['mobile_test', 'Test account',
      'Test Account. test test Testing username with @mobile_test_2 and a #hashtag',
-     'San Francisco, CA', 'example.com/foobar', 'Joined October 2009', '100'],
+     'San Francisco, CA', 'example.com/foobar', 'Joined October 2009', '97'],
     ['mobile_test_2', 'mobile test 2', '', '', '', 'Joined January 2011', '13']
 ]
@@ -15,7 +15,19 @@ protected = [
     ['Poop', 'Randy', 'Social media fanatic.']
 ]

-invalid = [['thisprofiledoesntexist'], ['%']]
+invalid = [['thisprofiledoesntexist']]
+
+malformed = [
+    ['${userId}'],
+    ['$%7BuserId%7D'],  # URL encoded version
+    ['%'],  # Percent sign is invalid
+    ['user@name'],
+    ['user.name'],
+    ['user-name'],
+    ['user$name'],
+    ['user{name}'],
+    ['user name'],  # space
+]

 banner_image = [
     ['mobile_test', 'profile_banners%2F82135242%2F1384108037%2F1500x500']
@@ -65,9 +77,16 @@ class ProfileTest(BaseTestCase):
         self.open_nitter(username)
         self.assert_text(f'User "{username}" not found')

+    @parameterized.expand(malformed)
+    def test_malformed_username(self, username):
+        """Test that malformed usernames (with invalid characters) return 404"""
+        self.open_nitter(username)
+        # Malformed usernames should return 404 page not found, not try to fetch from Twitter
+        self.assert_text('Page not found')
+
     def test_suspended(self):
-        self.open_nitter('user')
-        self.assert_text('User "user" has been suspended')
+        self.open_nitter('suspendme')
+        self.assert_text('User "suspendme" has been suspended')

     @parameterized.expand(banner_image)
     def test_banner_image(self, username, url):
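The new test_malformed_username cases expect the server to return its own 404 page without ever querying Twitter. One plausible validation that rejects every case in the list, assuming the usual 1-15 word-character username rule (this regex is an assumption for illustration, not necessarily nitter's actual check):

```python
import re

# Hypothetical username check: 1-15 characters from [A-Za-z0-9_].
# Chosen so every entry in the malformed list above fails to match.
USERNAME_RE = re.compile(r"\w{1,15}", re.ASCII)

malformed = ["${userId}", "%", "user@name", "user.name", "user-name",
             "user$name", "user{name}", "user name"]
assert all(not USERNAME_RE.fullmatch(u) for u in malformed)
assert USERNAME_RE.fullmatch("mobile_test")
```

Rejecting these up front turns junk paths like `${userId}` (seen in scraper traffic) into cheap 404s instead of upstream API calls.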
+1 -7
@@ -2,14 +2,8 @@ from base import BaseTestCase, Quote, Conversation
 from parameterized import parameterized

 text = [
-    ['elonmusk/status/1138136540096319488',
-     'TREV PAGE', '@Model3Owners',
-     """As of March 58.4% of new car sales in Norway are electric.
-
-What are we doing wrong? reuters.com/article/us-norwa"""],
     ['nim_lang/status/1491461266849808397#m',
-     'Nim language', '@nim_lang',
+     'Nim', '@nim_lang',
      """What's better than Nim 1.6.0?

Nim 1.6.2 :)
+5 -5
@@ -2,8 +2,8 @@ from base import BaseTestCase
 from parameterized import parameterized

-class SearchTest(BaseTestCase):
-    @parameterized.expand([['@mobile_test'], ['@mobile_test_2']])
-    def test_username_search(self, username):
-        self.search_username(username)
-        self.assert_text(f'{username}')
+#class SearchTest(BaseTestCase):
+    #@parameterized.expand([['@mobile_test'], ['@mobile_test_2']])
+    #def test_username_search(self, username):
+        #self.search_username(username)
+        #self.assert_text(f'{username}')
+12 -17
@@ -1,23 +1,18 @@
 from base import BaseTestCase, Timeline
 from parameterized import parameterized

-normal = [['mobile_test'], ['mobile_test_2']]
+normal = [['jack'], ['elonmusk']]

-after = [['mobile_test', 'HBaAgJPsqtGNhA0AAA%3D%3D'],
-         ['mobile_test_2', 'HBaAgJPsqtGNhA0AAA%3D%3D']]
+after = [['jack', '1681686036294803456'],
+         ['elonmusk', '1681686036294803456']]

-no_more = [['mobile_test_8?cursor=HBaAwJCsk%2F6%2FtgQAAA%3D%3D']]
+no_more = [['mobile_test_8?cursor=DAABCgABF4YVAqN___kKAAICNn_4msIQAAgAAwAAAAIAAA']]

 empty = [['emptyuser'], ['mobile_test_10']]
 protected = [['mobile_test_7'], ['Empty_user']]

-photo_rail = [['mobile_test', [
-  'BzUnaDFCUAAmrjs', 'Bo0nDsYIYAIjqVn', 'Bos--KNIQAAA7Li', 'Boq1sDJIYAAxaoi',
-  'BonISmPIEAAhP3G', 'BoQbwJAIUAA0QCY', 'BoQbRQxIIAA3FWD', 'Bn8Qh8iIIAABXrG',
-  'Bn8QIG3IYAA0IGT', 'Bn8O3QeIUAAONai', 'Bn8NGViIAAATNG4', 'BkKovdrCUAAEz79',
-  'BkKoe_oCIAASAqr', 'BkKoRLNCAAAYfDf', 'BkKndxoCQAE1vFt', 'BPEmIbYCMAE44dl'
-]]]
+photo_rail = [['mobile_test', ['Bo0nDsYIYAIjqVn', 'BoQbwJAIUAA0QCY', 'BoQbRQxIIAA3FWD', 'Bn8Qh8iIIAABXrG']]]

 class TweetTest(BaseTestCase):
@@ -60,10 +55,10 @@ class TweetTest(BaseTestCase):
         self.assert_element_absent(Timeline.older)
         self.assert_element_absent(Timeline.end)

-    @parameterized.expand(photo_rail)
-    def test_photo_rail(self, username, images):
-        self.open_nitter(username)
-        self.assert_element_visible(Timeline.photo_rail)
-        for i, url in enumerate(images):
-            img = self.get_attribute(Timeline.photo_rail + f' a:nth-child({i + 1}) img', 'src')
-            self.assertIn(url, img)
+    #@parameterized.expand(photo_rail)
+    #def test_photo_rail(self, username, images):
+        #self.open_nitter(username)
+        #self.assert_element_visible(Timeline.photo_rail)
+        #for i, url in enumerate(images):
+            #img = self.get_attribute(Timeline.photo_rail + f' a:nth-child({i + 1}) img', 'src')
+            #self.assertIn(url, img)
Some files were not shown because too many files have changed in this diff