88 Commits

Author SHA1 Message Date
dependabot[bot]
3d88876a32 [upd] pypi: Bump types-lxml from 2025.11.25 to 2026.1.1 (#5622)
Bumps [types-lxml](https://github.com/abelcheung/types-lxml) from 2025.11.25 to 2026.1.1.

- [Release notes](https://github.com/abelcheung/types-lxml/releases)
- [Commits](https://github.com/abelcheung/types-lxml/compare/2025.11.25...2026.01.01)
2026-01-02 08:45:31 +01:00
github-actions[bot]
09bedef409 [l10n] update translations from Weblate (#5623)
0ebc2ba84 - 2025-12-30 - AndersNordh <andersnordh@noreply.codeberg.org>
012b6aa25 - 2025-12-26 - tace16 <tace16@noreply.codeberg.org>
2026-01-02 08:43:37 +01:00
github-actions[bot]
a5c946a321 [data] update searx.data - update_external_bangs.py (#5607) 2025-12-30 08:29:05 +01:00
Ivan Gabaldon
29042d8e5a [mod] docs: remove libera.chat channel (#5613)
Internal discussion (lack of time/moderation)
2025-12-30 08:21:39 +01:00
Tommaso Colella
c57db45672 [fix] 360search: fix engine by adding cookie caching
Co-authored-by: Bnyro <bnyro@tutanota.com>
2025-12-29 15:26:07 +01:00
searxng-bot
9491b514c9 [data] update searx.data - update_engine_descriptions.py 2025-12-29 15:24:06 +01:00
Tommaso Colella
320c317719 [mod] settings.yml: set engines that require an api key to inactive by default 2025-12-29 15:20:43 +01:00
Tommaso Colella
abae17e6fc [mod] docs: better explanation for search api usage and format support (#5574) 2025-12-29 15:04:33 +01:00
Austin-Olacsi
3baf5c38fc [fix] bilibili engine: send referer header 2025-12-29 14:59:08 +01:00
searxng-bot
ce46f30739 [data] update searx.data - update_currencies.py 2025-12-29 13:48:16 +01:00
searxng-bot
65a95539f1 [data] update searx.data - update_ahmia_blacklist.py 2025-12-29 13:47:12 +01:00
searxng-bot
874dc3f5ea [data] update searx.data - update_firefox_version.py 2025-12-29 13:46:57 +01:00
searxng-bot
7941719371 [data] update searx.data - update_wikidata_units.py 2025-12-29 13:46:46 +01:00
Aadniz
fa9729226b [fix] ahmia engine: increase timeout to 20 seconds 2025-12-26 18:22:15 +01:00
Aadniz
9df177af85 [fix] ahmia engine: Remove comment for EngineCache 2025-12-26 18:22:15 +01:00
Aadniz
f45123356b [fix] ahmia engine: requires rotating tokens to work
Ahmia recently implemented a 60-minute rotating token system for searches.

This fix uses the cache and updates the tokens on every request.
2025-12-26 18:22:15 +01:00
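A minimal sketch of the token handling the commit above describes, assuming a simple in-process cache and a guessed token markup; the real engine uses SearXNG's EngineCache and Ahmia's actual token endpoint, so every name here is illustrative.

```python
# Illustrative only: the token location, regex and cache helper are assumptions,
# not the actual ahmia engine code.
import re
import time

import httpx

_cache: dict[str, tuple[str, float]] = {}  # stand-in for SearXNG's EngineCache
TOKEN_TTL = 60 * 60  # Ahmia rotates its search token roughly every 60 minutes


def get_search_token(base_url: str) -> str:
    """Return a cached token, refreshing it once the old one has expired."""
    token, fetched_at = _cache.get("token", ("", 0.0))
    if token and time.time() - fetched_at < TOKEN_TTL:
        return token
    page = httpx.get(base_url, timeout=10).text
    match = re.search(r'name="token" value="([^"]+)"', page)  # markup pattern is a guess
    token = match.group(1) if match else ""
    _cache["token"] = (token, time.time())
    return token
```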
Aadniz
8851f4d6b1 [fix] searx.network: fix string concatenation of proxy error message 2025-12-26 18:07:51 +01:00
searxng-bot
f954423101 [l10n] update translations from Weblate
3a4b5f36f - 2025-12-24 - Stzyxh <stzyxh@noreply.codeberg.org>
e72be22b9 - 2025-12-23 - gallegonovato <gallegonovato@noreply.codeberg.org>
18a59dd67 - 2025-12-22 - Outbreak2096 <outbreak2096@noreply.codeberg.org>
bf212eb3c - 2025-12-22 - Priit Jõerüüt <jrtcdbrg@noreply.codeberg.org>
3525b547a - 2025-12-19 - MaiuZ <maiuz@noreply.codeberg.org>
2025-12-26 11:51:22 +01:00
dependabot[bot]
95e63ac32d [upd] web-client (simple): Bump vite in /client/simple
Bumps [vite](https://github.com/vitejs/vite/tree/HEAD/packages/vite) from 8.0.0-beta.3 to 8.0.0-beta.5.
- [Release notes](https://github.com/vitejs/vite/releases)
- [Changelog](https://github.com/vitejs/vite/blob/main/packages/vite/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite/commits/v8.0.0-beta.5/packages/vite)

---
updated-dependencies:
- dependency-name: vite
  dependency-version: 8.0.0-beta.5
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-26 11:47:23 +01:00
dependabot[bot]
fc6e59d3ec [upd] pypi: Bump the minor group with 2 updates (#5598)
Bumps the minor group with 2 updates: [typer-slim](https://github.com/fastapi/typer) and [basedpyright](https://github.com/detachhead/basedpyright).


Updates `typer-slim` from 0.20.0 to 0.21.0
- [Release notes](https://github.com/fastapi/typer/releases)
- [Changelog](https://github.com/fastapi/typer/blob/master/docs/release-notes.md)
- [Commits](https://github.com/fastapi/typer/compare/0.20.0...0.21.0)

Updates `basedpyright` from 1.36.1 to 1.36.2
- [Release notes](https://github.com/detachhead/basedpyright/releases)
- [Commits](https://github.com/detachhead/basedpyright/compare/v1.36.1...v1.36.2)

---
updated-dependencies:
- dependency-name: typer-slim
  dependency-version: 0.21.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: minor
- dependency-name: basedpyright
  dependency-version: 1.36.2
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: minor
2025-12-26 10:03:42 +01:00
dependabot[bot]
da45859f32 [upd] pypi: Bump the minor group with 2 updates
Bumps the minor group with 2 updates: [selenium](https://github.com/SeleniumHQ/Selenium) and [basedpyright](https://github.com/detachhead/basedpyright).


Updates `selenium` from 4.38.0 to 4.39.0
- [Release notes](https://github.com/SeleniumHQ/Selenium/releases)
- [Commits](https://github.com/SeleniumHQ/Selenium/compare/selenium-4.38.0...selenium-4.39.0)

Updates `basedpyright` from 1.35.0 to 1.36.1
- [Release notes](https://github.com/detachhead/basedpyright/releases)
- [Commits](https://github.com/detachhead/basedpyright/compare/v1.35.0...v1.36.1)

---
updated-dependencies:
- dependency-name: selenium
  dependency-version: 4.39.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: minor
- dependency-name: basedpyright
  dependency-version: 1.36.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-25 19:55:01 +01:00
Ivan Gabaldon
8bf600cc62 [fix] themes: rebuild static 2025-12-19 17:50:23 +00:00
dependabot[bot]
aa607a379a [upd] web-client (simple): Bump the minor group
Bumps the minor group in /client/simple with 6 updates:

| Package | From | To |
| --- | --- | --- |
| [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.3.8` | `2.3.10` |
| [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) | `25.0.1` | `25.0.3` |
| [edge.js](https://github.com/edge-js/edge) | `6.3.0` | `6.4.0` |
| [less](https://github.com/less/less.js) | `4.4.2` | `4.5.1` |
| [sort-package-json](https://github.com/keithamus/sort-package-json) | `3.5.1` | `3.6.0` |
| [vite-bundle-analyzer](https://github.com/nonzzz/vite-bundle-analyzer) | `1.3.1` | `1.3.2` |

Updates `@biomejs/biome` from 2.3.8 to 2.3.10
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.3.10/packages/@biomejs/biome)

Updates `@types/node` from 25.0.1 to 25.0.3
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

Updates `edge.js` from 6.3.0 to 6.4.0
- [Release notes](https://github.com/edge-js/edge/releases)
- [Changelog](https://github.com/edge-js/edge/blob/6.x/CHANGELOG.md)
- [Commits](https://github.com/edge-js/edge/compare/v6.3.0...v6.4.0)

Updates `less` from 4.4.2 to 4.5.1
- [Release notes](https://github.com/less/less.js/releases)
- [Changelog](https://github.com/less/less.js/blob/master/CHANGELOG.md)
- [Commits](https://github.com/less/less.js/commits)

Updates `sort-package-json` from 3.5.1 to 3.6.0
- [Release notes](https://github.com/keithamus/sort-package-json/releases)
- [Commits](https://github.com/keithamus/sort-package-json/compare/v3.5.1...v3.6.0)

Updates `vite-bundle-analyzer` from 1.3.1 to 1.3.2
- [Release notes](https://github.com/nonzzz/vite-bundle-analyzer/releases)
- [Changelog](https://github.com/nonzzz/vite-bundle-analyzer/blob/master/CHANGELOG.md)
- [Commits](https://github.com/nonzzz/vite-bundle-analyzer/compare/v1.3.1...v1.3.2)

---
updated-dependencies:
- dependency-name: "@biomejs/biome"
  dependency-version: 2.3.10
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: minor
- dependency-name: "@types/node"
  dependency-version: 25.0.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: minor
- dependency-name: edge.js
  dependency-version: 6.4.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: minor
- dependency-name: less
  dependency-version: 4.5.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: minor
- dependency-name: sort-package-json
  dependency-version: 3.6.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: minor
- dependency-name: vite-bundle-analyzer
  dependency-version: 1.3.2
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-19 17:50:23 +00:00
dependabot[bot]
6ebd3f4d35 [upd] web-client (simple): Bump vite in /client/simple
Bumps [vite](https://github.com/vitejs/vite/tree/HEAD/packages/vite) from 8.0.0-beta.2 to 8.0.0-beta.3.
- [Release notes](https://github.com/vitejs/vite/releases)
- [Changelog](https://github.com/vitejs/vite/blob/main/packages/vite/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite/commits/v8.0.0-beta.3/packages/vite)

---
updated-dependencies:
- dependency-name: vite
  dependency-version: 8.0.0-beta.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-19 17:30:19 +00:00
searxng-bot
9072c77aea [l10n] update translations from Weblate
6fd00e66a - 2025-12-18 - dtalens <dtalens@noreply.codeberg.org>
037518f3b - 2025-12-17 - dtalens <dtalens@noreply.codeberg.org>
2025-12-19 08:37:40 +00:00
dependabot[bot]
c32b8100c3 [upd] github-actions: Bump actions/cache from 5.0.0 to 5.0.1
Bumps [actions/cache](https://github.com/actions/cache) from 5.0.0 to 5.0.1.
- [Release notes](https://github.com/actions/cache/releases)
- [Changelog](https://github.com/actions/cache/blob/main/RELEASES.md)
- [Commits](a783357455...9255dc7a25)

---
updated-dependencies:
- dependency-name: actions/cache
  dependency-version: 5.0.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-19 08:35:15 +00:00
dependabot[bot]
f93257941e [upd] github-actions: Bump github/codeql-action from 4.31.7 to 4.31.9
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 4.31.7 to 4.31.9.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](cf1bb45a27...5d4e8d1aca)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-version: 4.31.9
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-19 08:34:32 +00:00
Guanzhong Chen
896863802e [fix] engine: brave - BrotliDecoderDecompressStream encoding issue (#5572)
For some reason, I keep getting this error from the brave engine:

    httpx.DecodingError: BrotliDecoderDecompressStream failed while processing the stream

Forcing the server to use either gzip or deflate fixes this issue.

This makes the brave engine work when the server seems to be encoding brotli incorrectly, or at least in a way incompatible with certain installs.

Related:

- https://github.com/searxng/searxng/pull/1787
- https://github.com/searxng/searxng/pull/5536
2025-12-17 09:39:03 +01:00
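The workaround above boils down to never letting the server answer with brotli. A minimal sketch of how an engine's request() hook could pin the accepted encodings; the exact URL and header value used by the real brave engine may differ.

```python
# Sketch of a SearXNG-style request() hook; illustrative, not the engine's real code.
from urllib.parse import urlencode


def request(query: str, params: dict) -> dict:
    params["url"] = "https://search.brave.com/search?" + urlencode({"q": query})
    # Force gzip/deflate so the server never responds with (mis-encoded) brotli.
    params["headers"]["Accept-Encoding"] = "gzip, deflate"
    return params
```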
searxng-bot
920b40253c [l10n] update translations from Weblate
e23dc20f7 - 2025-12-11 - SomeTr <sometr@noreply.codeberg.org>
eb67f948f - 2025-12-11 - artnay <artnay@noreply.codeberg.org>
d4fdfc449 - 2025-12-10 - SomeTr <sometr@noreply.codeberg.org>
0f02ac7cd - 2025-12-10 - IcewindX <icewindx@noreply.codeberg.org>
533ae3947 - 2025-12-11 - return42 <return42@noreply.codeberg.org>
19fe65dc7 - 2025-12-11 - return42 <return42@noreply.codeberg.org>
bca557cea - 2025-12-09 - Hangry-Studios <hangry-studios@noreply.codeberg.org>
e43e9a299 - 2025-12-10 - Priit Jõerüüt <jrtcdbrg@noreply.codeberg.org>
c98083cef - 2025-12-09 - eudemo <eudemo@noreply.codeberg.org>
316225017 - 2025-12-08 - aindriu80 <aindriu80@noreply.codeberg.org>
1b827e5a4 - 2025-12-08 - Aadniz <aadniz@noreply.codeberg.org>
68e760f9b - 2025-12-09 - kratos <makesocialfoss32@keemail.me>
99945ac31 - 2025-12-07 - lucasmz.dev <lucasmz.dev@noreply.codeberg.org>
56602eb75 - 2025-12-07 - Fjuro <fjuro@alius.cz>
df092e811 - 2025-12-06 - c2qd <c2qd@noreply.codeberg.org>
12c25cd85 - 2025-12-06 - Outbreak2096 <outbreak2096@noreply.codeberg.org>
081243428 - 2025-12-05 - SomeTr <sometr@noreply.codeberg.org>
66362c02d - 2025-12-06 - ghose <ghose@noreply.codeberg.org>
2025-12-12 20:23:43 +00:00
dependabot[bot]
07440e3332 [upd] web-client (simple): Bump the minor group
Bumps the minor group in /client/simple with 2 updates: [sort-package-json](https://github.com/keithamus/sort-package-json) and [vite-bundle-analyzer](https://github.com/nonzzz/vite-bundle-analyzer).

Updates `sort-package-json` from 3.5.0 to 3.5.1
- [Release notes](https://github.com/keithamus/sort-package-json/releases)
- [Commits](https://github.com/keithamus/sort-package-json/compare/v3.5.0...v3.5.1)

Updates `vite-bundle-analyzer` from 1.2.3 to 1.3.1
- [Release notes](https://github.com/nonzzz/vite-bundle-analyzer/releases)
- [Changelog](https://github.com/nonzzz/vite-bundle-analyzer/blob/master/CHANGELOG.md)
- [Commits](https://github.com/nonzzz/vite-bundle-analyzer/compare/v1.2.3...v1.3.1)

---
updated-dependencies:
- dependency-name: sort-package-json
  dependency-version: 3.5.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: minor
- dependency-name: vite-bundle-analyzer
  dependency-version: 1.3.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-12 20:22:48 +00:00
dependabot[bot]
1827dfc071 [upd] web-client (simple): Bump @types/node in /client/simple
Bumps [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) from 24.10.1 to 25.0.1.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

---
updated-dependencies:
- dependency-name: "@types/node"
  dependency-version: 25.0.1
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-12 20:13:11 +00:00
dependabot[bot]
c46aecd4e3 [upd] web-client (simple): Bump vite in /client/simple
Bumps [vite](https://github.com/vitejs/vite/tree/HEAD/packages/vite) from 8.0.0-beta.0 to 8.0.0-beta.2.
- [Release notes](https://github.com/vitejs/vite/releases)
- [Changelog](https://github.com/vitejs/vite/blob/main/packages/vite/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite/commits/v8.0.0-beta.2/packages/vite)

---
updated-dependencies:
- dependency-name: vite
  dependency-version: 8.0.0-beta.2
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-12 19:59:31 +00:00
dependabot[bot]
21bf8a6973 [upd] github-actions: Bump peter-evans/create-pull-request
Bumps [peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request) from 7.0.9 to 8.0.0.
- [Release notes](https://github.com/peter-evans/create-pull-request/releases)
- [Commits](84ae59a2cd...98357b18bf)

---
updated-dependencies:
- dependency-name: peter-evans/create-pull-request
  dependency-version: 8.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-12 19:56:18 +00:00
dependabot[bot]
f5475ba782 [upd] github-actions: Bump JamesIves/github-pages-deploy-action
Bumps [JamesIves/github-pages-deploy-action](https://github.com/jamesives/github-pages-deploy-action) from 4.7.4 to 4.7.6.
- [Release notes](https://github.com/jamesives/github-pages-deploy-action/releases)
- [Commits](4a3abc783e...9d877eea73)

---
updated-dependencies:
- dependency-name: JamesIves/github-pages-deploy-action
  dependency-version: 4.7.6
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-12 19:52:52 +00:00
dependabot[bot]
265f15498c [upd] github-actions: Bump github/codeql-action from 4.31.6 to 4.31.7
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 4.31.6 to 4.31.7.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](fe4161a26a...cf1bb45a27)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-version: 4.31.7
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-12 19:51:40 +00:00
dependabot[bot]
666409ec7e [upd] github-actions: Bump actions/cache from 4.3.0 to 5.0.0
Bumps [actions/cache](https://github.com/actions/cache) from 4.3.0 to 5.0.0.
- [Release notes](https://github.com/actions/cache/releases)
- [Changelog](https://github.com/actions/cache/blob/main/RELEASES.md)
- [Commits](0057852bfa...a783357455)

---
updated-dependencies:
- dependency-name: actions/cache
  dependency-version: 5.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-12 19:51:17 +00:00
Viktor
b719d559b6 [feat] marginalia: switch to the new, improved API version 2025-12-09 18:18:37 +01:00
Austin-Olacsi
9d3ec9a2a2 [feat] pixiv engine: add filter for AI generated images 2025-12-07 23:34:16 +01:00
Ivan Gabaldon
74ec225ad1 [fix] themes: clear search input (#5540)
* [fix] themes: clear search input

Closes https://github.com/searxng/searxng/issues/5539

* [fix] themes: rebuild static
2025-12-07 13:26:01 +00:00
Markus Heiser
b5a1a092f1 [debug] partial revert of 5e0e1c6b3 (#5535)
Issue #5490 was caused by a regression in a GH action in v6.0.0; updating to
v6.0.1 [2] fixed our workflow and the debug messages are no longer needed.

[1] https://github.com/searxng/searxng/issues/5490#issuecomment-3616230451
[2] https://github.com/searxng/searxng/pull/5530

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-12-06 09:15:56 +01:00
dependabot[bot]
ddc6d68114 [upd] pypi: Bump the minor group with 2 updates (#5527)
Bumps the minor group with 2 updates: [pylint](https://github.com/pylint-dev/pylint) and [basedpyright](https://github.com/detachhead/basedpyright).


Updates `pylint` from 4.0.3 to 4.0.4
- [Release notes](https://github.com/pylint-dev/pylint/releases)
- [Commits](https://github.com/pylint-dev/pylint/compare/v4.0.3...v4.0.4)

Updates `basedpyright` from 1.34.0 to 1.35.0
- [Release notes](https://github.com/detachhead/basedpyright/releases)
- [Commits](https://github.com/detachhead/basedpyright/compare/v1.34.0...v1.35.0)
2025-12-05 15:39:35 +01:00
github-actions[bot]
32eb84d6d3 [l10n] update translations from Weblate (#5532) 2025-12-05 15:38:35 +01:00
Ivan Gabaldon
da6c635ea2 [upd] themes: update vite 2025-12-05 11:14:26 +00:00
dependabot[bot]
e34c356e64 [upd] web-client (simple): Bump the minor group
Bumps the minor group in /client/simple with 2 updates: [browserslist](https://github.com/browserslist/browserslist) and [mathjs](https://github.com/josdejong/mathjs).

Updates `browserslist` from 4.28.0 to 4.28.1
- [Release notes](https://github.com/browserslist/browserslist/releases)
- [Changelog](https://github.com/browserslist/browserslist/blob/main/CHANGELOG.md)
- [Commits](https://github.com/browserslist/browserslist/compare/4.28.0...4.28.1)

Updates `mathjs` from 15.0.0 to 15.1.0
- [Changelog](https://github.com/josdejong/mathjs/blob/develop/HISTORY.md)
- [Commits](https://github.com/josdejong/mathjs/compare/v15.0.0...v15.1.0)

---
updated-dependencies:
- dependency-name: browserslist
  dependency-version: 4.28.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: minor
- dependency-name: mathjs
  dependency-version: 15.1.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-05 10:42:18 +00:00
dependabot[bot]
7017393647 [upd] github-actions: Bump actions/setup-node from 6.0.0 to 6.1.0
Bumps [actions/setup-node](https://github.com/actions/setup-node) from 6.0.0 to 6.1.0.
- [Release notes](https://github.com/actions/setup-node/releases)
- [Commits](2028fbc5c2...395ad32622)

---
updated-dependencies:
- dependency-name: actions/setup-node
  dependency-version: 6.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-05 10:26:25 +00:00
dependabot[bot]
aa49f5b933 [upd] github-actions: Bump github/codeql-action from 4.31.5 to 4.31.6
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 4.31.5 to 4.31.6.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](fdbfb4d275...fe4161a26a)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-version: 4.31.6
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-05 10:25:37 +00:00
dependabot[bot]
3f91ac47e6 [upd] github-actions: Bump actions/checkout from 6.0.0 to 6.0.1
Bumps [actions/checkout](https://github.com/actions/checkout) from 6.0.0 to 6.0.1.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](1af3b93b68...8e8c483db8)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: 6.0.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-05 10:14:41 +00:00
Markus Heiser
8c631b92ce [mod] setup.py package_data - use recursive globs for package_data
To test this patch, build a Python wheel::

    $ make clean py.build

and check whether any files are missing from the wheel::

    $ unzip -l dist/searxng-*-py3-none-any.whl

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-12-04 15:04:36 +01:00
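For illustration, this is roughly what recursive package_data globs look like with setuptools (which accepts `**` patterns in recent releases); the concrete patterns in SearXNG's setup.py may differ.

```python
# Sketch only: package names and glob patterns are illustrative.
from setuptools import find_packages, setup

setup(
    name="searxng",
    packages=find_packages(include=["searx", "searx.*"]),
    package_data={
        "searx": [
            "settings.yml",
            # recursive globs pick up nested theme assets (chunk/, img/, ...)
            "static/themes/**",
            "templates/**",
        ],
    },
)
```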
leaty
0ebac144f5 [fix] py: sxng wheel build still broken
The chunk and img directories were not included

Signed-off-by: leaty <dev@leaty.net>
2025-12-04 15:04:36 +01:00
Markus Heiser
5e0e1c6b31 [debug] CI - add debug to trace issue #5490 (#5519)
The error only occurs in the CI action, which is why we need to commit the debug
code.  As soon as the bug is identified and fixed, this commit can be reverted
and the ``set -x`` can be removed from the code.

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-12-04 14:07:22 +01:00
Markus Heiser
3c7545c6ce [fix] plugin unit-converter - remove leftovers (#5517)
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-12-04 07:51:48 +01:00
Austin-Olacsi
aba839195b [fix] findthatmeme: hardening the response against KeyErrors (#5516) 2025-12-04 07:13:05 +01:00
Bnyro
1f6ea41272 [fix] mojeek: first search page is rate-limited
Mojeek blocks all requests where the page offset parameter `s`
is set to 0, so we must not set it when fetching the first results page.
2025-12-03 17:03:27 +01:00
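A minimal sketch of the behaviour described in the commit above: the `s` offset is only sent from the second page onwards. The page size and URL are assumptions.

```python
from urllib.parse import urlencode

PAGE_SIZE = 10  # assumed number of results per page


def request(query: str, params: dict) -> dict:
    args = {"q": query}
    # Mojeek rate-limits requests that explicitly pass s=0, so the offset is
    # only added when we are past the first results page.
    if params["pageno"] > 1:
        args["s"] = (params["pageno"] - 1) * PAGE_SIZE
    params["url"] = "https://www.mojeek.com/search?" + urlencode(args)
    return params
```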
Ivan Gabaldon
5450d22796 [fix] py: sxng wheel build broken (#5510)
Include everything inside `static/themes/` when building the SearXNG wheel.

```
2025-12-03 10:40:43,509 INFO🛞 adding 'searx/static/themes/simple/manifest.json'
2025-12-03 10:40:43,509 INFO🛞 adding 'searx/static/themes/simple/sxng-core.min.js'
2025-12-03 10:40:43,509 INFO🛞 adding 'searx/static/themes/simple/sxng-core.min.js.map'
2025-12-03 10:40:43,510 INFO🛞 adding 'searx/static/themes/simple/sxng-ltr.min.css'
2025-12-03 10:40:43,510 INFO🛞 adding 'searx/static/themes/simple/sxng-rss.min.css'
2025-12-03 10:40:43,511 INFO🛞 adding 'searx/static/themes/simple/sxng-rtl.min.css'
2025-12-03 10:40:43,511 INFO🛞 adding 'searx/static/themes/simple/chunk/13gvpunf.min.js'
2025-12-03 10:40:43,511 INFO🛞 adding 'searx/static/themes/simple/chunk/13gvpunf.min.js.map'
2025-12-03 10:40:43,516 INFO🛞 adding 'searx/static/themes/simple/chunk/BAcZkB_P.min.js'
2025-12-03 10:40:43,548 INFO🛞 adding 'searx/static/themes/simple/chunk/BAcZkB_P.min.js.map'
2025-12-03 10:40:43,551 INFO🛞 adding 'searx/static/themes/simple/chunk/CHkLfdMs.min.js'
2025-12-03 10:40:43,561 INFO🛞 adding 'searx/static/themes/simple/chunk/CHkLfdMs.min.js.map'
2025-12-03 10:40:43,562 INFO🛞 adding 'searx/static/themes/simple/chunk/CyyZ9XJS.min.js'
2025-12-03 10:40:43,562 INFO🛞 adding 'searx/static/themes/simple/chunk/CyyZ9XJS.min.js.map'
2025-12-03 10:40:43,562 INFO🛞 adding 'searx/static/themes/simple/chunk/DBO1tjH7.min.js'
2025-12-03 10:40:43,562 INFO🛞 adding 'searx/static/themes/simple/chunk/DBO1tjH7.min.js.map'
2025-12-03 10:40:43,562 INFO🛞 adding 'searx/static/themes/simple/chunk/Db5v-hxx.min.js'
2025-12-03 10:40:43,563 INFO🛞 adding 'searx/static/themes/simple/chunk/Db5v-hxx.min.js.map'
2025-12-03 10:40:43,563 INFO🛞 adding 'searx/static/themes/simple/chunk/DxJxX49r.min.js'
2025-12-03 10:40:43,563 INFO🛞 adding 'searx/static/themes/simple/chunk/DxJxX49r.min.js.map'
2025-12-03 10:40:43,563 INFO🛞 adding 'searx/static/themes/simple/chunk/KPZlR0ib.min.js'
2025-12-03 10:40:43,563 INFO🛞 adding 'searx/static/themes/simple/chunk/KPZlR0ib.min.js.map'
2025-12-03 10:40:43,563 INFO🛞 adding 'searx/static/themes/simple/chunk/Q2SRo2ED.min.js'
2025-12-03 10:40:43,563 INFO🛞 adding 'searx/static/themes/simple/chunk/Q2SRo2ED.min.js.map'
2025-12-03 10:40:43,563 INFO🛞 adding 'searx/static/themes/simple/chunk/gZqIRpW1.min.js'
2025-12-03 10:40:43,563 INFO🛞 adding 'searx/static/themes/simple/chunk/gZqIRpW1.min.js.map'
2025-12-03 10:40:43,564 INFO🛞 adding 'searx/static/themes/simple/img/empty_favicon.svg'
2025-12-03 10:40:43,564 INFO🛞 adding 'searx/static/themes/simple/img/favicon.png'
2025-12-03 10:40:43,564 INFO🛞 adding 'searx/static/themes/simple/img/favicon.svg'
2025-12-03 10:40:43,564 INFO🛞 adding 'searx/static/themes/simple/img/img_load_error.svg'
2025-12-03 10:40:43,564 INFO🛞 adding 'searx/static/themes/simple/img/searxng.png'
2025-12-03 10:40:43,564 INFO🛞 adding 'searx/static/themes/simple/img/searxng.svg'
2025-12-03 10:40:43,564 INFO🛞 adding 'searx/static/themes/simple/img/select-dark.svg'
2025-12-03 10:40:43,564 INFO🛞 adding 'searx/static/themes/simple/img/select-light.svg'
```
2025-12-03 11:21:18 +00:00
Bnyro
1174fde1f3 [feat] engines: add lucide icons (#5503) 2025-12-03 09:57:42 +01:00
Ivan Gabaldon
fb089ae297 [mod] client/simple: client plugins (#5406)
* [mod] client/simple: client plugins

Defines a new interface for client-side *"plugins"* that coexists with the
server-side plugin system. Each plugin (e.g., `InfiniteScroll`) extends the base
`ts Plugin`. Client-side plugins are independent and lazy‑loaded via `router.ts`
when their `load()` conditions are met. On each navigation request, all
applicable plugins are instantiated.

Since these are client-side plugins, we can only invoke them once the DOM is
fully loaded. E.g. `Calculator` will not render a new `answer` block until it is
fully loaded and executed.

For some plugins, we might want to control their availability in `settings.yml`
and toggle them in the UI, like we do for server-side plugins. In that case, we
extend `py Plugin` to expose only the plugin information and then check on the
client side whether the
[`settings.plugins`](1ad832b1dc/client/simple/src/js/toolkit.ts (L134))
array contains the plugin id.

* [mod] client/simple: rebuild static
2025-12-02 10:18:00 +00:00
Bnyro
ab8224c939 [fix] brave: content description also contains website URL (#5502)
There are other classes like 'site-name-content' that we don't want to match;
however, only using contains(@class, 'content') would e.g. also match `site-name-content`,
thus we explicitly also require the spaces as class separators.
2025-12-01 15:19:06 +01:00
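The note above refers to the standard XPath idiom for matching a whole class token; a small, self-contained example (the selector actually used by the brave engine may be phrased differently):

```python
from lxml import html

doc = html.fromstring(
    '<div><p class="content">snippet</p><p class="site-name-content">example.com</p></div>'
)

# Padding @class with spaces makes ' content ' match only the exact class token,
# not 'site-name-content'.
snippets = doc.xpath("//p[contains(concat(' ', normalize-space(@class), ' '), ' content ')]")
print([p.text for p in snippets])  # -> ['snippet']
```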
github-actions[bot]
c954e71f87 [data] update searx.data - update_engine_descriptions.py (#5496) 2025-11-29 16:04:34 +01:00
Ivan Gabaldon
cbc04a839a [fix] py: missing module sniffio 2025-11-29 14:56:30 +00:00
searxng-bot
cb4a5abc8c [data] update searx.data - update_currencies.py 2025-11-29 14:54:09 +00:00
searxng-bot
07ff6e3ccc [data] update searx.data - update_wikidata_units.py 2025-11-29 09:00:05 +00:00
searxng-bot
cdaab944b4 [data] update searx.data - update_firefox_version.py 2025-11-29 08:58:56 +00:00
searxng-bot
6ecf32fd4a [data] update searx.data - update_ahmia_blacklist.py 2025-11-29 08:58:26 +00:00
Markus Heiser
20de10df4e Revert "[fix:py3.14] Struct fields aren't discovered in Python 3.14"
This reverts commit 8fdc59a760.
2025-11-28 13:38:37 +01:00
dependabot[bot]
673c29efeb [upd] pypi: Bump the minor group with 2 updates
Bumps the minor group with 2 updates: [msgspec](https://github.com/jcrist/msgspec) and [types-lxml](https://github.com/abelcheung/types-lxml).


Updates `msgspec` from 0.19.0 to 0.20.0
- [Release notes](https://github.com/jcrist/msgspec/releases)
- [Changelog](https://github.com/jcrist/msgspec/blob/main/docs/changelog.md)
- [Commits](https://github.com/jcrist/msgspec/compare/0.19.0...0.20.0)

Updates `types-lxml` from 2025.8.25 to 2025.11.25
- [Release notes](https://github.com/abelcheung/types-lxml/releases)
- [Commits](https://github.com/abelcheung/types-lxml/compare/2025.08.25...2025.11.25)

---
updated-dependencies:
- dependency-name: msgspec
  dependency-version: 0.20.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: minor
- dependency-name: types-lxml
  dependency-version: 2025.11.25
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-28 13:38:37 +01:00
dependabot[bot]
c4abf40e6e [upd] web-client (simple): Bump the minor group
Bumps the minor group in /client/simple with 3 updates: [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome), [sort-package-json](https://github.com/keithamus/sort-package-json) and [stylelint](https://github.com/stylelint/stylelint).

Updates `@biomejs/biome` from 2.3.7 to 2.3.8
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.3.8/packages/@biomejs/biome)

Updates `sort-package-json` from 3.4.0 to 3.5.0
- [Release notes](https://github.com/keithamus/sort-package-json/releases)
- [Commits](https://github.com/keithamus/sort-package-json/compare/v3.4.0...v3.5.0)

Updates `stylelint` from 16.25.0 to 16.26.0
- [Release notes](https://github.com/stylelint/stylelint/releases)
- [Changelog](https://github.com/stylelint/stylelint/blob/main/CHANGELOG.md)
- [Commits](https://github.com/stylelint/stylelint/compare/16.25.0...16.26.0)

---
updated-dependencies:
- dependency-name: "@biomejs/biome"
  dependency-version: 2.3.8
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: minor
- dependency-name: sort-package-json
  dependency-version: 3.5.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: minor
- dependency-name: stylelint
  dependency-version: 16.26.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-28 10:13:02 +00:00
dependabot[bot]
39b9922609 [upd] github-actions: Bump actions/setup-python from 6.0.0 to 6.1.0
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 6.0.0 to 6.1.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](e797f83bcb...83679a892e)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-version: 6.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-28 09:28:12 +00:00
dependabot[bot]
7018e6583b [upd] github-actions: Bump peter-evans/create-pull-request
Bumps [peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request) from 7.0.8 to 7.0.9.
- [Release notes](https://github.com/peter-evans/create-pull-request/releases)
- [Commits](271a8d0340...84ae59a2cd)

---
updated-dependencies:
- dependency-name: peter-evans/create-pull-request
  dependency-version: 7.0.9
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-28 09:27:25 +00:00
dependabot[bot]
b957e587da [upd] github-actions: Bump github/codeql-action from 4.31.4 to 4.31.5
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 4.31.4 to 4.31.5.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](e12f017898...fdbfb4d275)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-version: 4.31.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-28 09:25:18 +00:00
Markus Heiser
ebb9ea4571 [fix] brave engines - web, images & videos (#5478)
brave web:
xpath selectors needed to be adjusted

brave images & videos:
  The JS code with the JS object was read incorrectly; not always, but quite
  often, it led to exceptions when the Python data structure was created from it.

BTW: A complete review was conducted and corrections or additions were made to
the type definitions.

To test all brave engines at once::

    !br !brimg !brvid !brnews weather

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-11-25 13:28:47 +01:00
Markus Heiser
54a97e1043 [mod] replace js_variable_to_python by js_obj_str_to_python (#2792) (#5477)
This patch is based on PR #2792 (old PR from 2023)

- js_obj_str_to_python handles more cases
- brings over the tests from chompjs
- tests that do not pass are commented out

The tests from chompjs give an overview of what is not yet implemented.

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-11-25 12:51:08 +01:00
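For context, a toy illustration of the kind of conversion such a helper performs: quoting bare keys and dropping trailing commas before handing the string to a JSON parser. The real js_obj_str_to_python in SearXNG covers far more JavaScript syntax than this.

```python
import json
import re


def toy_js_obj_to_python(js: str):
    """Toy only: normalize a *very* simple JS object literal and parse it as JSON."""
    js = re.sub(r"([{,]\s*)([A-Za-z_][A-Za-z0-9_]*)\s*:", r'\1"\2":', js)  # quote bare keys
    js = re.sub(r",\s*([}\]])", r"\1", js)  # drop trailing commas
    js = js.replace("'", '"')  # naive single -> double quote swap
    return json.loads(js)


print(toy_js_obj_to_python("{title: 'Example', tags: ['a', 'b',],}"))
# -> {'title': 'Example', 'tags': ['a', 'b']}
```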
Markus Heiser
0ee78c19dd [mod] yandex engines: all engines should use one network
- The three Yandex engines should use the same network context.
- There is no reason to set these engines inactive

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-11-25 11:50:18 +01:00
Aadniz
bcc7a5eb2e [mod] yandex engine: add supported languages
Add support for Yandex's supported languages: Russian, English, Belarusian,
French, German, Indonesian, Kazakh, Tatar, Turkish and Ukrainian.
2025-11-25 11:50:18 +01:00
Markus Heiser
2313b972a3 [fix] engines: base URL can be a list or a string, but it's not None!
The code injection and monkey patching examine the names in the engine's
module; if a variable there does not start with an underscore and has the value
None, then this variable needs to be configured. This outdated concept does not
fit engines that may have multiple URLs, at least not as long as the value of
the base URL (list) is None.

The default is now an empty list instead of None.

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-11-25 06:25:45 +01:00
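A minimal sketch of the convention the commit above settles on: an unconfigured base URL is an empty list (or empty string), never None, and engines can pick one of several mirrors. Names other than base_url are illustrative.

```python
import random

# The unconfigured default is an *empty* list now, not None, so the settings
# loader can still detect a missing value while engines may declare one URL or
# a list of mirrors.
base_url: list[str] | str = []


def pick_base_url() -> str:
    if isinstance(base_url, str):
        return base_url
    if not base_url:
        raise ValueError("base_url is not configured")
    return random.choice(base_url)
```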
Markus Heiser
989b49335c [fix] engines initialization - if engine load fails, set to inactive
- if engine load fails, set the engine to inactive
- don't load an engine when the config says it's inactive

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-11-25 06:25:45 +01:00
Markus Heiser
3f30831640 [fix] don't raise fatal exception when engine isn't available
When wikidata or any other active engine with an init method raises an
exception in its init method, the engine is never registered.

[1] https://github.com/searxng/searxng/issues/5456#issuecomment-3567790287

Closes: https://github.com/searxng/searxng/issues/5456
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-11-25 06:25:45 +01:00
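A minimal sketch of the defensive initialization the two commits above describe: an engine whose init() raises is deactivated instead of aborting startup, and engines already marked inactive are not loaded. Function and attribute names are illustrative, not SearXNG's actual API.

```python
import logging

logger = logging.getLogger(__name__)


def initialize_engine(engine) -> None:
    """Illustrative only: run an engine's optional init() and deactivate it on failure."""
    if getattr(engine, "inactive", False):
        return  # don't load an engine the config already marks as inactive
    init = getattr(engine, "init", None)
    if init is None:
        return
    try:
        init()
    except Exception:  # any init failure must not be fatal for the whole instance
        logger.exception("init of engine %s failed, setting it to inactive", engine.name)
        engine.inactive = True
```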
Hermógenes Oliveira
5fcee9bc30 [fix] recoll engine: remove HTML markup from result snippets (#5472)
Recoll inserts markup tags in snippets to indicate matching terms in a
search query.  We remove them so that they don't show to users.
2025-11-24 06:54:45 +01:00
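A minimal sketch of stripping markup from a snippet with lxml; the actual recoll fix may rely on SearXNG's own HTML helpers instead.

```python
from lxml import html


def clean_snippet(snippet: str) -> str:
    """Drop the term-highlighting tags recoll injects and keep only the text."""
    if not snippet:
        return ""
    return html.fromstring(f"<div>{snippet}</div>").text_content()


print(clean_snippet("a <span class='rclmatch'>matching</span> term"))  # -> 'a matching term'
```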
Ivan Gabaldon
2f0e52d6eb [upd] ci: docker secret maintenance
I've narrowed the permissions and rotated the token for the deploy account on the
DockerHub registry. I replaced the secret ref in GitHub so that it's available
organization-wide. No further actions are necessary.
2025-11-23 12:26:40 +00:00
Grant
c0d69cec4e [fix] drop mullvad-leta engine (#5428)
On November 27th, 2025, Mullvad will be shutting down the Leta servers.
For this reason, we also need to remove this engine from SearXNG.

[1] https://mullvad.net/en/blog/shutting-down-our-search-proxy-leta
2025-11-22 10:02:51 +01:00
Austin-Olacsi
c852b9a90a [feat] engine: add grokipedia (#5396) 2025-11-22 09:59:38 +01:00
Ivan Gabaldon
b876d0bed0 [upd] theme/simple: bump rolldown
(no static changes...)
2025-11-21 11:03:04 +00:00
Léon Tiekötter
e245cade25 [fix] engines: typo (#5466)
Fix typo in engine timeout definition: 'timout' -> 'timeout'
2025-11-21 11:20:10 +01:00
dependabot[bot]
7c223b32a7 [upd] web-client (simple): Bump @biomejs/biome
Bumps the minor group in /client/simple with 1 update: [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome).

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-21 10:15:49 +00:00
dependabot[bot]
33a176813d [upd] github-actions: Bump actions/checkout from 5.0.0 to 6.0.0
Bumps [actions/checkout](https://github.com/actions/checkout) from 5.0.0 to 6.0.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](08c6903cd8...1af3b93b68)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-21 09:38:50 +00:00
dependabot[bot]
20ec01c5f7 [upd] github-actions: Bump github/codeql-action from 4.31.3 to 4.31.4
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 4.31.3 to 4.31.4.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](014f16e7ab...e12f017898)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-version: 4.31.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-21 09:21:20 +00:00
dependabot[bot]
6376601ba1 [upd] pypi: Bump the minor group with 4 updates (#5462)
Bumps the minor group with 4 updates: [granian](https://github.com/emmett-framework/granian), [granian[pname]](https://github.com/emmett-framework/granian), [granian[reload]](https://github.com/emmett-framework/granian) and [basedpyright](https://github.com/detachhead/basedpyright).


Updates `granian` from 2.5.7 to 2.6.0
- [Release notes](https://github.com/emmett-framework/granian/releases)
- [Commits](https://github.com/emmett-framework/granian/compare/v2.5.7...v2.6.0)

Updates `granian[pname]` from 2.5.7 to 2.6.0
- [Release notes](https://github.com/emmett-framework/granian/releases)
- [Commits](https://github.com/emmett-framework/granian/compare/v2.5.7...v2.6.0)

Updates `granian[reload]` from 2.5.7 to 2.6.0
- [Release notes](https://github.com/emmett-framework/granian/releases)
- [Commits](https://github.com/emmett-framework/granian/compare/v2.5.7...v2.6.0)

Updates `basedpyright` from 1.33.0 to 1.34.0
- [Release notes](https://github.com/detachhead/basedpyright/releases)
- [Commits](https://github.com/detachhead/basedpyright/compare/v1.33.0...v1.34.0)
2025-11-21 08:31:16 +01:00
Markus Heiser
ca441f419c [fix] engines - set hard timeouts in *sub-requests* (#5460)
The requests changed here all run outside of the network context timeout,
thereby preventing the engine's timeout from being applied (the effective engine
timeout can become longer than configured).

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2025-11-21 08:16:24 +01:00
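The point of the fix above, as a generic sketch: any extra HTTP call an engine makes outside the normal request/response cycle needs its own explicit timeout, otherwise the configured engine timeout can silently be exceeded. Plain httpx is used here for illustration, not SearXNG's network API.

```python
import httpx

ENGINE_TIMEOUT = 3.0  # seconds, as configured for the engine


def fetch_token(url: str) -> str:
    # Without an explicit timeout this sub-request could block far longer than
    # the engine's configured timeout allows.
    resp = httpx.get(url, timeout=ENGINE_TIMEOUT)
    resp.raise_for_status()
    return resp.text
```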
searxng-bot
04e66a2bb4 [l10n] update translations from Weblate 2025-11-20 21:22:43 +00:00
277 changed files with 10894 additions and 8947 deletions

View File

@@ -24,17 +24,17 @@ jobs:
 runs-on: ubuntu-24.04-arm
 steps:
 - name: Setup Python
-uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
+uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
 with:
 python-version: "${{ env.PYTHON_VERSION }}"
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 persist-credentials: "false"
 - name: Setup cache Python
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
 with:
 key: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-${{ hashFiles('./requirements*.txt') }}"
 restore-keys: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-"

View File

@@ -78,18 +78,18 @@ jobs:
 # yamllint enable rule:line-length
 - name: Setup Python
-uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
+uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
 with:
 python-version: "${{ env.PYTHON_VERSION }}"
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 persist-credentials: "false"
 fetch-depth: "0"
 - name: Setup cache Python
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
 with:
 key: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-${{ hashFiles('./requirements*.txt') }}"
 restore-keys: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-"
@@ -100,7 +100,7 @@ jobs:
 run: echo "date=$(date +'%Y%m%d')" >>$GITHUB_OUTPUT
 - name: Setup cache container
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
 with:
 key: "container-${{ matrix.arch }}-${{ steps.date.outputs.date }}-${{ hashFiles('./requirements*.txt') }}"
 restore-keys: |
@@ -145,7 +145,7 @@ jobs:
 steps:
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 persist-credentials: "false"
@@ -179,7 +179,7 @@ jobs:
 steps:
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 persist-credentials: "false"
@@ -194,8 +194,8 @@ jobs:
 uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
 with:
 registry: "docker.io"
-username: "${{ secrets.DOCKERHUB_USERNAME }}"
+username: "${{ secrets.DOCKER_USER }}"
-password: "${{ secrets.DOCKERHUB_TOKEN }}"
+password: "${{ secrets.DOCKER_TOKEN }}"
 - name: Release
 env:

View File

@@ -40,17 +40,17 @@ jobs:
 steps:
 - name: Setup Python
-uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
+uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
 with:
 python-version: "${{ env.PYTHON_VERSION }}"
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 persist-credentials: "false"
 - name: Setup cache Python
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
 with:
 key: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-${{ hashFiles('./requirements*.txt') }}"
 restore-keys: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-"
@@ -64,7 +64,7 @@ jobs:
 - name: Create PR
 id: cpr
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
 with:
 author: "searxng-bot <searxng-bot@users.noreply.github.com>"
 committer: "searxng-bot <searxng-bot@users.noreply.github.com>"

View File

@@ -32,18 +32,18 @@ jobs:
 steps:
 - name: Setup Python
-uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
+uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
 with:
 python-version: "${{ env.PYTHON_VERSION }}"
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 persist-credentials: "false"
 fetch-depth: "0"
 - name: Setup cache Python
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
 with:
 key: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-${{ hashFiles('./requirements*.txt') }}"
 restore-keys: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-"
@@ -57,7 +57,7 @@ jobs:
 - if: github.ref_name == 'master'
 name: Release
-uses: JamesIves/github-pages-deploy-action@4a3abc783e1a24aeb44c16e869ad83caf6b4cc23 # v4.7.4
+uses: JamesIves/github-pages-deploy-action@9d877eea73427180ae43cf98e8914934fe157a1a # v4.7.6
 with:
 folder: "dist/docs"
 branch: "gh-pages"

View File

@@ -35,17 +35,17 @@ jobs:
 steps:
 - name: Setup Python
-uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
+uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
 with:
 python-version: "${{ matrix.python-version }}"
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 persist-credentials: "false"
 - name: Setup cache Python
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
 with:
 key: "python-${{ matrix.python-version }}-${{ runner.arch }}-${{ hashFiles('./requirements*.txt') }}"
 restore-keys: "python-${{ matrix.python-version }}-${{ runner.arch }}-"
@@ -62,28 +62,28 @@ jobs:
 runs-on: ubuntu-24.04-arm
 steps:
 - name: Setup Python
-uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
+uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
 with:
 python-version: "${{ env.PYTHON_VERSION }}"
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 persist-credentials: "false"
 - name: Setup Node.js
-uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6.0.0
+uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
 with:
 node-version-file: "./.nvmrc"
 - name: Setup cache Node.js
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
 with:
 key: "nodejs-${{ runner.arch }}-${{ hashFiles('./.nvmrc', './package.json') }}"
 path: "./client/simple/node_modules/"
 - name: Setup cache Python
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
 with:
 key: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-${{ hashFiles('./requirements*.txt') }}"
 restore-keys: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-"

View File

@@ -35,18 +35,18 @@ jobs:
 steps:
 - name: Setup Python
-uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
+uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
 with:
 python-version: "${{ env.PYTHON_VERSION }}"
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 token: "${{ secrets.WEBLATE_GITHUB_TOKEN }}"
 fetch-depth: "0"
 - name: Setup cache Python
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
 with:
 key: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-${{ hashFiles('./requirements*.txt') }}"
 restore-keys: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-"
@@ -82,18 +82,18 @@ jobs:
 steps:
 - name: Setup Python
-uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
+uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
 with:
 python-version: "${{ env.PYTHON_VERSION }}"
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 token: "${{ secrets.WEBLATE_GITHUB_TOKEN }}"
 fetch-depth: "0"
 - name: Setup cache Python
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
 with:
 key: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-${{ hashFiles('./requirements*.txt') }}"
 restore-keys: "python-${{ env.PYTHON_VERSION }}-${{ runner.arch }}-"
@@ -117,7 +117,7 @@ jobs:
 - name: Create PR
 id: cpr
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
 with:
 author: "searxng-bot <searxng-bot@users.noreply.github.com>"
 committer: "searxng-bot <searxng-bot@users.noreply.github.com>"

View File

@@ -24,7 +24,7 @@ jobs:
 steps:
 - name: Checkout
-uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
 with:
 persist-credentials: "false"
@@ -32,8 +32,8 @@ jobs:
 uses: docker/scout-action@f8c776824083494ab0d56b8105ba2ca85c86e4de # v1.18.2
 with:
 organization: "searxng"
-dockerhub-user: "${{ secrets.DOCKERHUB_USERNAME }}"
+dockerhub-user: "${{ secrets.DOCKER_USER }}"
-dockerhub-password: "${{ secrets.DOCKERHUB_TOKEN }}"
+dockerhub-password: "${{ secrets.DOCKER_TOKEN }}"
 image: "registry://ghcr.io/searxng/searxng:latest"
 command: "cves"
 sarif-file: "./scout.sarif"
@@ -41,6 +41,6 @@ jobs:
 write-comment: "false"
 - name: Upload SARIFs
-uses: github/codeql-action/upload-sarif@014f16e7ab1402f30e7c3329d33797e7948572db # v4.31.3
+uses: github/codeql-action/upload-sarif@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
 with:
 sarif_file: "./scout.sarif"

.nvmrc
View File

@@ -1 +1 @@
-24
+25

View File

@@ -46,11 +46,9 @@ Further information on *how-to* can be found `here <https://docs.searxng.org/adm
 Connect
 =======
-If you have questions or want to connect with others in the community,
-we have two official channels:
+If you have questions or want to connect with others in the community:
 - `#searxng:matrix.org <https://matrix.to/#/#searxng:matrix.org>`_
-- `#searxng @ libera.chat <https://web.libera.chat/?channel=#searxng>`_ (bridged to Matrix)
 Contributing
 ============

View File

@@ -1,5 +1,5 @@
 {
-"$schema": "https://biomejs.dev/schemas/2.3.5/schema.json",
+"$schema": "https://biomejs.dev/schemas/2.3.10/schema.json",
 "files": {
 "ignoreUnknown": true,
 "includes": ["**", "!node_modules"]
@@ -35,12 +35,6 @@
 },
 "correctness": {
 "noGlobalDirnameFilename": "error",
-"noUndeclaredVariables": {
-"level": "error",
-"options": {
-"checkTypes": true
-}
-},
 "useImportExtensions": "error",
 "useJsonImportAttributes": "error",
 "useSingleJsDocAsterisk": "error"
@@ -48,15 +42,22 @@
 "nursery": {
 "noContinue": "warn",
 "noDeprecatedImports": "warn",
+"noEqualsToNull": "warn",
 "noFloatingPromises": "warn",
+"noForIn": "warn",
 "noImportCycles": "warn",
 "noIncrementDecrement": "warn",
 "noMisusedPromises": "warn",
+"noMultiStr": "warn",
 "noParametersOnlyUsedInRecursion": "warn",
 "noUselessCatchBinding": "warn",
 "noUselessUndefined": "warn",
+"useAwaitThenable": "off",
+"useDestructuring": "warn",
 "useExhaustiveSwitchCases": "warn",
-"useExplicitType": "warn"
+"useExplicitType": "warn",
+"useFind": "warn",
+"useRegexpExec": "warn"
 },
 "performance": {
 "noAwaitInLoops": "error",
@@ -158,15 +159,5 @@
 "semicolons": "always",
 "trailingCommas": "none"
 }
-},
-"html": {
-"experimentalFullSupportEnabled": true,
-"formatter": {
-"attributePosition": "auto",
-"bracketSameLine": false,
-"indentScriptAndStyle": true,
-"selfCloseVoidElements": "always",
-"whitespaceSensitivity": "ignore"
-}
 }
 }


@@ -19,33 +19,31 @@
"lint:tsc": "tsc --noEmit" "lint:tsc": "tsc --noEmit"
}, },
"browserslist": [ "browserslist": [
"Chrome >= 93", "baseline 2022",
"Firefox >= 92",
"Safari >= 15.4",
"not dead" "not dead"
], ],
"dependencies": { "dependencies": {
"ionicons": "~8.0.13", "ionicons": "^8.0.13",
"normalize.css": "8.0.1", "normalize.css": "8.0.1",
"ol": "~10.7.0", "ol": "^10.7.0",
"swiped-events": "1.2.0" "swiped-events": "1.2.0"
}, },
"devDependencies": { "devDependencies": {
"@biomejs/biome": "2.3.5", "@biomejs/biome": "2.3.10",
"@types/node": "~24.10.1", "@types/node": "^25.0.3",
"browserslist": "~4.28.0", "browserslist": "^4.28.1",
"browserslist-to-esbuild": "~2.1.1", "browserslist-to-esbuild": "^2.1.1",
"edge.js": "~6.3.0", "edge.js": "^6.4.0",
"less": "~4.4.2", "less": "^4.5.1",
"lightningcss": "~1.30.2", "mathjs": "^15.1.0",
"sharp": "~0.34.5", "sharp": "~0.34.5",
"sort-package-json": "~3.4.0", "sort-package-json": "^3.6.0",
"stylelint": "~16.25.0", "stylelint": "^16.26.0",
"stylelint-config-standard-less": "~3.0.1", "stylelint-config-standard-less": "^3.0.1",
"stylelint-prettier": "~5.0.3", "stylelint-prettier": "^5.0.3",
"svgo": "~4.0.0", "svgo": "^4.0.0",
"typescript": "~5.9.3", "typescript": "~5.9.3",
"vite": "npm:rolldown-vite@7.2.2", "vite": "8.0.0-beta.5",
"vite-bundle-analyzer": "~1.2.3" "vite-bundle-analyzer": "^1.3.2"
} }
} }


@@ -0,0 +1,66 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
/**
* Base class for client-side plugins.
*
* @remarks
* Handle conditional loading of the plugin in:
*
* - client/simple/src/js/router.ts
*
* @abstract
*/
export abstract class Plugin {
/**
* Plugin name.
*/
protected readonly id: string;
/**
* @remarks
* Don't hold references of this instance outside the class.
*/
protected constructor(id: string) {
this.id = id;
void this.invoke();
}
private async invoke(): Promise<void> {
try {
console.debug(`[PLUGIN] ${this.id}: Running...`);
const result = await this.run();
if (!result) return;
console.debug(`[PLUGIN] ${this.id}: Running post-exec...`);
// @ts-expect-error
void (await this.post(result as NonNullable<Awaited<ReturnType<this["run"]>>>));
} catch (error) {
console.error(`[PLUGIN] ${this.id}:`, error);
} finally {
console.debug(`[PLUGIN] ${this.id}: Done.`);
}
}
/**
* Plugin goes here.
*
* @remarks
* The plugin is already loaded at this point. If you wish to execute
* conditions to exit early, consider moving the logic to:
*
* - client/simple/src/js/router.ts
*
* ...to avoid unnecessarily loading this plugin on the client.
*/
protected abstract run(): Promise<unknown>;
/**
* Post-execution hook.
*
* @remarks
* The hook is only executed if `#run()` returns a truthy value.
*/
// @ts-expect-error
protected abstract post(result: NonNullable<Awaited<ReturnType<this["run"]>>>): Promise<void>;
}
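
As a hypothetical illustration of the contract documented above (the class name, plugin id and import paths below are made up for the example and are not part of the codebase): a subclass implements run(), and post() only fires when run() resolves to a truthy value.

// Hypothetical example plugin: counts the terms in the current query.
// Paths assume Plugin.ts and util/ live side by side, as in the layout above.
import { Plugin } from "./Plugin.ts";
import { getElement } from "./util/getElement.ts";

export default class QueryLength extends Plugin {
  public constructor() {
    super("queryLength");
  }

  // run() is triggered by the base constructor; returning undefined skips post().
  protected async run(): Promise<number | undefined> {
    const searchInput = getElement<HTMLInputElement>("q");
    const terms = searchInput.value.trim().split(/\s+/).filter(Boolean);
    return terms.length > 0 ? terms.length : undefined;
  }

  // Only called with the truthy value returned by run().
  protected async post(result: number): Promise<void> {
    console.debug(`[PLUGIN] queryLength: query has ${result} term(s)`);
  }
}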


@@ -1,6 +0,0 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import "./nojs.ts";
import "./router.ts";
import "./toolkit.ts";
import "./listener.ts";


@@ -1,7 +0,0 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import { listen } from "./toolkit.ts";
listen("click", ".close", function (this: HTMLElement) {
(this.parentNode as HTMLElement)?.classList.add("invisible");
});


@@ -1,8 +0,0 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import { ready } from "./toolkit.ts";
ready(() => {
document.documentElement.classList.remove("no-js");
document.documentElement.classList.add("js");
});


@@ -1,40 +0,0 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import { Endpoints, endpoint, ready, settings } from "./toolkit.ts";
ready(
() => {
void import("../main/keyboard.ts");
void import("../main/search.ts");
if (settings.autocomplete) {
void import("../main/autocomplete.ts");
}
},
{ on: [endpoint === Endpoints.index] }
);
ready(
() => {
void import("../main/keyboard.ts");
void import("../main/mapresult.ts");
void import("../main/results.ts");
void import("../main/search.ts");
if (settings.infinite_scroll) {
void import("../main/infinite_scroll.ts");
}
if (settings.autocomplete) {
void import("../main/autocomplete.ts");
}
},
{ on: [endpoint === Endpoints.results] }
);
ready(
() => {
void import("../main/preferences.ts");
},
{ on: [endpoint === Endpoints.preferences] }
);


@@ -0,0 +1,4 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
// core
void import.meta.glob(["./*.ts", "./util/**/*.ts"], { eager: true });


@@ -0,0 +1,36 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import type { Plugin } from "./Plugin.ts";
import { type EndpointsKeys, endpoint } from "./toolkit.ts";
type Options =
| {
on: "global";
}
| {
on: "endpoint";
where: EndpointsKeys[];
};
export const load = <T extends Plugin>(instance: () => Promise<T>, options: Options): void => {
if (!check(options)) return;
void instance();
};
const check = (options: Options): boolean => {
// biome-ignore lint/style/useDefaultSwitchClause: options is typed
switch (options.on) {
case "global": {
return true;
}
case "endpoint": {
if (!options.where.includes(endpoint)) {
// not on the expected endpoint
return false;
}
return true;
}
}
};


@@ -1,6 +1,7 @@
 // SPDX-License-Identifier: AGPL-3.0-or-later
-import { assertElement, http, listen, settings } from "../core/toolkit.ts";
+import { http, listen, settings } from "../toolkit.ts";
+import { assertElement } from "../util/assertElement.ts";
 const fetchResults = async (qInput: HTMLInputElement, query: string): Promise<void> => {
 try {


@@ -1,100 +0,0 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import { assertElement, http, settings } from "../core/toolkit.ts";
const newLoadSpinner = (): HTMLDivElement => {
return Object.assign(document.createElement("div"), {
className: "loader"
});
};
const loadNextPage = async (onlyImages: boolean, callback: () => void): Promise<void> => {
const searchForm = document.querySelector<HTMLFormElement>("#search");
assertElement(searchForm);
const form = document.querySelector<HTMLFormElement>("#pagination form.next_page");
assertElement(form);
const action = searchForm.getAttribute("action");
if (!action) {
throw new Error("Form action not defined");
}
const paginationElement = document.querySelector<HTMLElement>("#pagination");
assertElement(paginationElement);
paginationElement.replaceChildren(newLoadSpinner());
try {
const res = await http("POST", action, { body: new FormData(form) });
const nextPage = await res.text();
if (!nextPage) return;
const nextPageDoc = new DOMParser().parseFromString(nextPage, "text/html");
const articleList = nextPageDoc.querySelectorAll<HTMLElement>("#urls article");
const nextPaginationElement = nextPageDoc.querySelector<HTMLElement>("#pagination");
document.querySelector("#pagination")?.remove();
const urlsElement = document.querySelector<HTMLElement>("#urls");
if (!urlsElement) {
throw new Error("URLs element not found");
}
if (articleList.length > 0 && !onlyImages) {
// do not add <hr> element when there are only images
urlsElement.appendChild(document.createElement("hr"));
}
urlsElement.append(...Array.from(articleList));
if (nextPaginationElement) {
const results = document.querySelector<HTMLElement>("#results");
results?.appendChild(nextPaginationElement);
callback();
}
} catch (error) {
console.error("Error loading next page:", error);
const errorElement = Object.assign(document.createElement("div"), {
textContent: settings.translations?.error_loading_next_page ?? "Error loading next page",
className: "dialog-error"
});
errorElement.setAttribute("role", "alert");
document.querySelector("#pagination")?.replaceChildren(errorElement);
}
};
const resultsElement: HTMLElement | null = document.getElementById("results");
if (!resultsElement) {
throw new Error("Results element not found");
}
const onlyImages: boolean = resultsElement.classList.contains("only_template_images");
const observedSelector = "article.result:last-child";
const intersectionObserveOptions: IntersectionObserverInit = {
rootMargin: "320px"
};
const observer: IntersectionObserver = new IntersectionObserver((entries: IntersectionObserverEntry[]) => {
const [paginationEntry] = entries;
if (paginationEntry?.isIntersecting) {
observer.unobserve(paginationEntry.target);
void loadNextPage(onlyImages, () => {
const nextObservedElement = document.querySelector<HTMLElement>(observedSelector);
if (nextObservedElement) {
observer.observe(nextObservedElement);
}
}).then(() => {
// wait until promise is resolved
});
}
}, intersectionObserveOptions);
const initialObservedElement: HTMLElement | null = document.querySelector<HTMLElement>(observedSelector);
if (initialObservedElement) {
observer.observe(initialObservedElement);
}


@@ -1,6 +1,7 @@
 // SPDX-License-Identifier: AGPL-3.0-or-later
-import { assertElement, listen, mutable, settings } from "../core/toolkit.ts";
+import { listen, mutable, settings } from "../toolkit.ts";
+import { assertElement } from "../util/assertElement.ts";
 export type KeyBindingLayout = "default" | "vim";
@@ -219,7 +220,7 @@ const highlightResult =
 // biome-ignore lint/complexity/noUselessSwitchCase: fallthrough is intended
 case "top":
 default:
-  next = results[0];
+  [next] = results;
 }
 }
@@ -342,7 +343,7 @@ const initHelpContent = (divElement: HTMLElement, keyBindings: typeof baseKeyBin
 const categories: Record<string, KeyBinding[]> = {};
 for (const binding of Object.values(keyBindings)) {
-  const cat = binding.cat;
+  const { cat } = binding;
 categories[cat] ??= [];
 categories[cat].push(binding);
 }
@@ -399,7 +400,7 @@ const toggleHelp = (keyBindings: typeof baseKeyBinding): void => {
 className: "dialog-modal"
 });
 initHelpContent(helpPanel, keyBindings);
-  const body = document.getElementsByTagName("body")[0];
+  const [body] = document.getElementsByTagName("body");
 if (body) {
 body.appendChild(helpPanel);
 }


@@ -1,86 +0,0 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import { listen } from "../core/toolkit.ts";
listen("click", ".searxng_init_map", async function (this: HTMLElement, event: Event) {
event.preventDefault();
this.classList.remove("searxng_init_map");
const {
View,
OlMap,
TileLayer,
VectorLayer,
OSM,
VectorSource,
Style,
Stroke,
Fill,
Circle,
fromLonLat,
GeoJSON,
Feature,
Point
} = await import("../pkg/ol.ts");
void import("ol/ol.css");
const { leafletTarget: target, mapLon, mapLat, mapGeojson } = this.dataset;
const lon = Number.parseFloat(mapLon || "0");
const lat = Number.parseFloat(mapLat || "0");
const view = new View({ maxZoom: 16, enableRotation: false });
const map = new OlMap({
target: target,
layers: [new TileLayer({ source: new OSM({ maxZoom: 16 }) })],
view: view
});
try {
const markerSource = new VectorSource({
features: [
new Feature({
geometry: new Point(fromLonLat([lon, lat]))
})
]
});
const markerLayer = new VectorLayer({
source: markerSource,
style: new Style({
image: new Circle({
radius: 6,
fill: new Fill({ color: "#3050ff" })
})
})
});
map.addLayer(markerLayer);
} catch (error) {
console.error("Failed to create marker layer:", error);
}
if (mapGeojson) {
try {
const geoSource = new VectorSource({
features: new GeoJSON().readFeatures(JSON.parse(mapGeojson), {
dataProjection: "EPSG:4326",
featureProjection: "EPSG:3857"
})
});
const geoLayer = new VectorLayer({
source: geoSource,
style: new Style({
stroke: new Stroke({ color: "#3050ff", width: 2 }),
fill: new Fill({ color: "#3050ff33" })
})
});
map.addLayer(geoLayer);
view.fit(geoSource.getExtent(), { padding: [20, 20, 20, 20] });
} catch (error) {
console.error("Failed to create GeoJSON layer:", error);
}
}
});


@@ -1,6 +1,7 @@
 // SPDX-License-Identifier: AGPL-3.0-or-later
-import { assertElement, http, listen, settings } from "../core/toolkit.ts";
+import { http, listen, settings } from "../toolkit.ts";
+import { assertElement } from "../util/assertElement.ts";
 let engineDescriptions: Record<string, [string, string]> | undefined;
@@ -69,8 +70,7 @@ listen("click", "#copy-hash", async function (this: HTMLElement) {
 }
 }
-  const copiedText = this.dataset.copiedText;
-  if (copiedText) {
-    this.innerText = copiedText;
+  if (this.dataset.copiedText) {
+    this.innerText = this.dataset.copiedText;
 }
 });


@@ -1,7 +1,8 @@
 // SPDX-License-Identifier: AGPL-3.0-or-later
 import "../../../node_modules/swiped-events/src/swiped-events.js";
-import { assertElement, listen, mutable, settings } from "../core/toolkit.ts";
+import { listen, mutable, settings } from "../toolkit.ts";
+import { assertElement } from "../util/assertElement.ts";
 let imgTimeoutID: number;
@@ -134,9 +135,8 @@ listen("click", "#copy_url", async function (this: HTMLElement) {
 }
 }
-  const copiedText = this.dataset.copiedText;
-  if (copiedText) {
-    this.innerText = copiedText;
+  if (this.dataset.copiedText) {
+    this.innerText = this.dataset.copiedText;
 }
 });


@@ -1,88 +1,51 @@
 // SPDX-License-Identifier: AGPL-3.0-or-later
-import { assertElement, listen, settings } from "../core/toolkit.ts";
+import { listen } from "../toolkit.ts";
+import { getElement } from "../util/getElement.ts";
-const submitIfQuery = (qInput: HTMLInputElement): void => {
-  if (qInput.value.length > 0) {
-    const search = document.getElementById("search") as HTMLFormElement | null;
-    search?.submit();
-  }
-};
-const updateClearButton = (qInput: HTMLInputElement, cs: HTMLElement): void => {
-  cs.classList.toggle("empty", qInput.value.length === 0);
-};
-const createClearButton = (qInput: HTMLInputElement): void => {
-  const cs = document.getElementById("clear_search");
-  assertElement(cs);
-  updateClearButton(qInput, cs);
-  listen("click", cs, (event: MouseEvent) => {
-    event.preventDefault();
-    qInput.value = "";
-    qInput.focus();
-    updateClearButton(qInput, cs);
-  });
-  listen("input", qInput, () => updateClearButton(qInput, cs), { passive: true });
-};
-const qInput = document.getElementById("q") as HTMLInputElement | null;
-assertElement(qInput);
+const searchForm: HTMLFormElement = getElement<HTMLFormElement>("search");
+const searchInput: HTMLInputElement = getElement<HTMLInputElement>("q");
+const searchReset: HTMLButtonElement = getElement<HTMLButtonElement>("clear_search");
 const isMobile: boolean = window.matchMedia("(max-width: 50em)").matches;
 const isResultsPage: boolean = document.querySelector("main")?.id === "main_results";
+const categoryButtons: HTMLButtonElement[] = Array.from(
+  document.querySelectorAll<HTMLButtonElement>("#categories_container button.category")
+);
+if (searchInput.value.length === 0) {
+  searchReset.classList.add("empty");
+}
 // focus search input on large screens
 if (!(isMobile || isResultsPage)) {
-  qInput.focus();
+  searchInput.focus();
 }
 // On mobile, move cursor to the end of the input on focus
 if (isMobile) {
-  listen("focus", qInput, () => {
+  listen("focus", searchInput, () => {
     // Defer cursor move until the next frame to prevent a visual jump
     requestAnimationFrame(() => {
-      const end = qInput.value.length;
-      qInput.setSelectionRange(end, end);
-      qInput.scrollLeft = qInput.scrollWidth;
+      const end = searchInput.value.length;
+      searchInput.setSelectionRange(end, end);
+      searchInput.scrollLeft = searchInput.scrollWidth;
     });
   });
 }
-createClearButton(qInput);
-// Additionally to searching when selecting a new category, we also
-// automatically start a new search request when the user changes a search
-// filter (safesearch, time range or language) (this requires JavaScript
-// though)
-if (
-  settings.search_on_category_select &&
-  // If .search_filters is undefined (invisible) we are on the homepage and
-  // hence don't have to set any listeners
-  document.querySelector(".search_filters")
-) {
-  const safesearchElement = document.getElementById("safesearch");
-  if (safesearchElement) {
-    listen("change", safesearchElement, () => submitIfQuery(qInput));
-  }
-  const timeRangeElement = document.getElementById("time_range");
-  if (timeRangeElement) {
-    listen("change", timeRangeElement, () => submitIfQuery(qInput));
-  }
-  const languageElement = document.getElementById("language");
-  if (languageElement) {
-    listen("change", languageElement, () => submitIfQuery(qInput));
-  }
-}
-const categoryButtons: HTMLButtonElement[] = [
-  ...document.querySelectorAll<HTMLButtonElement>("button.category_button")
-];
+listen("input", searchInput, () => {
+  searchReset.classList.toggle("empty", searchInput.value.length === 0);
+});
+listen("click", searchReset, (event: MouseEvent) => {
+  event.preventDefault();
+  searchInput.value = "";
+  searchInput.focus();
+  searchReset.classList.add("empty");
+});
 for (const button of categoryButtons) {
   listen("click", button, (event: MouseEvent) => {
     if (event.shiftKey) {
@@ -98,21 +61,34 @@ for (const button of categoryButtons) {
   });
 }
-const form: HTMLFormElement | null = document.querySelector<HTMLFormElement>("#search");
-assertElement(form);
-// override form submit action to update the actually selected categories
-listen("submit", form, (event: Event) => {
-  event.preventDefault();
-  const categoryValuesInput = document.querySelector<HTMLInputElement>("#selected-categories");
-  if (categoryValuesInput) {
-    const categoryValues = categoryButtons
-      .filter((button) => button.classList.contains("selected"))
-      .map((button) => button.name.replace("category_", ""));
-    categoryValuesInput.value = categoryValues.join(",");
-  }
-  form.submit();
+if (document.querySelector("div.search_filters")) {
+  const safesearchElement = document.getElementById("safesearch");
+  if (safesearchElement) {
+    listen("change", safesearchElement, () => searchForm.submit());
+  }
+  const timeRangeElement = document.getElementById("time_range");
+  if (timeRangeElement) {
+    listen("change", timeRangeElement, () => searchForm.submit());
+  }
+  const languageElement = document.getElementById("language");
+  if (languageElement) {
+    listen("change", languageElement, () => searchForm.submit());
+  }
+}
+// override searchForm submit event
+listen("submit", searchForm, (event: Event) => {
+  event.preventDefault();
+  if (categoryButtons.length > 0) {
+    const searchCategories = getElement<HTMLInputElement>("selected-categories");
+    searchCategories.value = categoryButtons
+      .filter((button) => button.classList.contains("selected"))
+      .map((button) => button.name.replace("category_", ""))
+      .join(",");
+  }
+  searchForm.submit();
 });


@@ -1,28 +0,0 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import { Feature, Map as OlMap, View } from "ol";
import { createEmpty } from "ol/extent";
import { GeoJSON } from "ol/format";
import { Point } from "ol/geom";
import { Tile as TileLayer, Vector as VectorLayer } from "ol/layer";
import { fromLonLat } from "ol/proj";
import { OSM, Vector as VectorSource } from "ol/source";
import { Circle, Fill, Stroke, Style } from "ol/style";
export {
View,
OlMap,
TileLayer,
VectorLayer,
OSM,
createEmpty,
VectorSource,
Style,
Stroke,
Fill,
Circle,
fromLonLat,
GeoJSON,
Feature,
Point
};


@@ -0,0 +1,93 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import {
absDependencies,
addDependencies,
create,
divideDependencies,
eDependencies,
evaluateDependencies,
expDependencies,
factorialDependencies,
gcdDependencies,
lcmDependencies,
log1pDependencies,
log2Dependencies,
log10Dependencies,
logDependencies,
modDependencies,
multiplyDependencies,
nthRootDependencies,
piDependencies,
powDependencies,
roundDependencies,
signDependencies,
sqrtDependencies,
subtractDependencies
} from "mathjs/number";
import { Plugin } from "../Plugin.ts";
import { appendAnswerElement } from "../util/appendAnswerElement.ts";
import { getElement } from "../util/getElement.ts";
/**
* Parses and solves mathematical expressions. Can do basic arithmetic and
* evaluate some functions.
*
* @example
* "(3 + 5) / 2" = "4"
* "e ^ 2 + pi" = "10.530648752520442"
* "gcd(48, 18) + lcm(4, 5)" = "26"
*
* @remarks
* Depends on `mathjs` library.
*/
export default class Calculator extends Plugin {
public constructor() {
super("calculator");
}
/**
* @remarks
* Compare bundle size after adding or removing features.
*/
private static readonly math = create({
...absDependencies,
...addDependencies,
...divideDependencies,
...eDependencies,
...evaluateDependencies,
...expDependencies,
...factorialDependencies,
...gcdDependencies,
...lcmDependencies,
...log10Dependencies,
...log1pDependencies,
...log2Dependencies,
...logDependencies,
...modDependencies,
...multiplyDependencies,
...nthRootDependencies,
...piDependencies,
...powDependencies,
...roundDependencies,
...signDependencies,
...sqrtDependencies,
...subtractDependencies
});
protected async run(): Promise<string | undefined> {
const searchInput = getElement<HTMLInputElement>("q");
const node = Calculator.math.parse(searchInput.value);
try {
return `${node.toString()} = ${node.evaluate()}`;
} catch {
// not a compatible math expression
return;
}
}
protected async post(result: string): Promise<void> {
appendAnswerElement(result);
}
}


@@ -0,0 +1,110 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import { Plugin } from "../Plugin.ts";
import { http, settings } from "../toolkit.ts";
import { assertElement } from "../util/assertElement.ts";
import { getElement } from "../util/getElement.ts";
/**
* Automatically loads the next page when scrolling to bottom of the current page.
*/
export default class InfiniteScroll extends Plugin {
public constructor() {
super("infiniteScroll");
}
protected async run(): Promise<void> {
const resultsElement = getElement<HTMLElement>("results");
const onlyImages: boolean = resultsElement.classList.contains("only_template_images");
const observedSelector = "article.result:last-child";
const spinnerElement = document.createElement("div");
spinnerElement.className = "loader";
const loadNextPage = async (callback: () => void): Promise<void> => {
const searchForm = document.querySelector<HTMLFormElement>("#search");
assertElement(searchForm);
const form = document.querySelector<HTMLFormElement>("#pagination form.next_page");
assertElement(form);
const action = searchForm.getAttribute("action");
if (!action) {
throw new Error("Form action not defined");
}
const paginationElement = document.querySelector<HTMLElement>("#pagination");
assertElement(paginationElement);
paginationElement.replaceChildren(spinnerElement);
try {
const res = await http("POST", action, { body: new FormData(form) });
const nextPage = await res.text();
if (!nextPage) return;
const nextPageDoc = new DOMParser().parseFromString(nextPage, "text/html");
const articleList = nextPageDoc.querySelectorAll<HTMLElement>("#urls article");
const nextPaginationElement = nextPageDoc.querySelector<HTMLElement>("#pagination");
document.querySelector("#pagination")?.remove();
const urlsElement = document.querySelector<HTMLElement>("#urls");
if (!urlsElement) {
throw new Error("URLs element not found");
}
if (articleList.length > 0 && !onlyImages) {
// do not add <hr> element when there are only images
urlsElement.appendChild(document.createElement("hr"));
}
urlsElement.append(...articleList);
if (nextPaginationElement) {
const results = document.querySelector<HTMLElement>("#results");
results?.appendChild(nextPaginationElement);
callback();
}
} catch (error) {
console.error("Error loading next page:", error);
const errorElement = Object.assign(document.createElement("div"), {
textContent: settings.translations?.error_loading_next_page ?? "Error loading next page",
className: "dialog-error"
});
errorElement.setAttribute("role", "alert");
document.querySelector("#pagination")?.replaceChildren(errorElement);
}
};
const intersectionObserveOptions: IntersectionObserverInit = {
rootMargin: "320px"
};
const observer: IntersectionObserver = new IntersectionObserver(async (entries: IntersectionObserverEntry[]) => {
const [paginationEntry] = entries;
if (paginationEntry?.isIntersecting) {
observer.unobserve(paginationEntry.target);
await loadNextPage(() => {
const nextObservedElement = document.querySelector<HTMLElement>(observedSelector);
if (nextObservedElement) {
observer.observe(nextObservedElement);
}
});
}
}, intersectionObserveOptions);
const initialObservedElement: HTMLElement | null = document.querySelector<HTMLElement>(observedSelector);
if (initialObservedElement) {
observer.observe(initialObservedElement);
}
}
protected async post(): Promise<void> {
// noop
}
}


@@ -0,0 +1,90 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import "ol/ol.css?inline";
import { Feature, Map as OlMap, View } from "ol";
import { GeoJSON } from "ol/format";
import { Point } from "ol/geom";
import { Tile as TileLayer, Vector as VectorLayer } from "ol/layer";
import { fromLonLat } from "ol/proj";
import { OSM, Vector as VectorSource } from "ol/source";
import { Circle, Fill, Stroke, Style } from "ol/style";
import { Plugin } from "../Plugin.ts";
/**
* MapView
*/
export default class MapView extends Plugin {
private readonly map: HTMLElement;
public constructor(map: HTMLElement) {
super("mapView");
this.map = map;
}
protected async run(): Promise<void> {
const { leafletTarget: target, mapLon, mapLat, mapGeojson } = this.map.dataset;
const lon = Number.parseFloat(mapLon || "0");
const lat = Number.parseFloat(mapLat || "0");
const view = new View({ maxZoom: 16, enableRotation: false });
const map = new OlMap({
target: target,
layers: [new TileLayer({ source: new OSM({ maxZoom: 16 }) })],
view: view
});
try {
const markerSource = new VectorSource({
features: [
new Feature({
geometry: new Point(fromLonLat([lon, lat]))
})
]
});
const markerLayer = new VectorLayer({
source: markerSource,
style: new Style({
image: new Circle({
radius: 6,
fill: new Fill({ color: "#3050ff" })
})
})
});
map.addLayer(markerLayer);
} catch (error) {
console.error("Failed to create marker layer:", error);
}
if (mapGeojson) {
try {
const geoSource = new VectorSource({
features: new GeoJSON().readFeatures(JSON.parse(mapGeojson), {
dataProjection: "EPSG:4326",
featureProjection: "EPSG:3857"
})
});
const geoLayer = new VectorLayer({
source: geoSource,
style: new Style({
stroke: new Stroke({ color: "#3050ff", width: 2 }),
fill: new Fill({ color: "#3050ff33" })
})
});
map.addLayer(geoLayer);
view.fit(geoSource.getExtent(), { padding: [20, 20, 20, 20] });
} catch (error) {
console.error("Failed to create GeoJSON layer:", error);
}
}
}
protected async post(): Promise<void> {
// noop
}
}


@@ -0,0 +1,69 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import { load } from "./loader.ts";
import { Endpoints, endpoint, listen, ready, settings } from "./toolkit.ts";
ready(() => {
document.documentElement.classList.remove("no-js");
document.documentElement.classList.add("js");
listen("click", ".close", function (this: HTMLElement) {
(this.parentNode as HTMLElement)?.classList.add("invisible");
});
listen("click", ".searxng_init_map", async function (this: HTMLElement, event: Event) {
event.preventDefault();
this.classList.remove("searxng_init_map");
load(() => import("./plugin/MapView.ts").then(({ default: Plugin }) => new Plugin(this)), {
on: "endpoint",
where: [Endpoints.results]
});
});
if (settings.plugins?.includes("infiniteScroll")) {
load(() => import("./plugin/InfiniteScroll.ts").then(({ default: Plugin }) => new Plugin()), {
on: "endpoint",
where: [Endpoints.results]
});
}
if (settings.plugins?.includes("calculator")) {
load(() => import("./plugin/Calculator.ts").then(({ default: Plugin }) => new Plugin()), {
on: "endpoint",
where: [Endpoints.results]
});
}
});
ready(
() => {
void import("./main/keyboard.ts");
void import("./main/search.ts");
if (settings.autocomplete) {
void import("./main/autocomplete.ts");
}
},
{ on: [endpoint === Endpoints.index] }
);
ready(
() => {
void import("./main/keyboard.ts");
void import("./main/results.ts");
void import("./main/search.ts");
if (settings.autocomplete) {
void import("./main/autocomplete.ts");
}
},
{ on: [endpoint === Endpoints.results] }
);
ready(
() => {
void import("./main/preferences.ts");
},
{ on: [endpoint === Endpoints.preferences] }
);


@@ -1,16 +1,16 @@
 // SPDX-License-Identifier: AGPL-3.0-or-later
-import type { KeyBindingLayout } from "../main/keyboard.ts";
+import type { KeyBindingLayout } from "./main/keyboard.ts";
 // synced with searx/webapp.py get_client_settings
 type Settings = {
+  plugins?: string[];
   advanced_search?: boolean;
   autocomplete?: string;
   autocomplete_min?: number;
   doi_resolver?: string;
   favicon_resolver?: string;
   hotkeys?: KeyBindingLayout;
-  infinite_scroll?: boolean;
   method?: "GET" | "POST";
   query_in_title?: boolean;
   results_on_new_tab?: boolean;
@@ -32,8 +32,6 @@ type ReadyOptions = {
   on?: (boolean | undefined)[];
 };
-type AssertElement = (element?: HTMLElement | null) => asserts element is HTMLElement;
 export type EndpointsKeys = keyof typeof Endpoints;
 export const Endpoints = {
@@ -73,12 +71,6 @@ const getSettings = (): Settings => {
   }
 };
-export const assertElement: AssertElement = (element?: HTMLElement | null): asserts element is HTMLElement => {
-  if (!element) {
-    throw new Error("Bad assertion: DOM element not found");
-  }
-};
 export const http = async (method: string, url: string | URL, options?: HTTPOptions): Promise<Response> => {
   const controller = new AbortController();
   const timeoutId = setTimeout(() => controller.abort(), options?.timeout ?? 30_000);


@@ -0,0 +1,34 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import { getElement } from "./getElement.ts";
export const appendAnswerElement = (element: HTMLElement | string | number): void => {
const results = getElement<HTMLDivElement>("results");
// ./searx/templates/elements/answers.html
let answers = getElement<HTMLDivElement>("answers", { assert: false });
if (!answers) {
// what is this?
const answersTitle = document.createElement("h4");
answersTitle.setAttribute("class", "title");
answersTitle.setAttribute("id", "answers-title");
answersTitle.textContent = "Answers : ";
answers = document.createElement("div");
answers.setAttribute("id", "answers");
answers.setAttribute("role", "complementary");
answers.setAttribute("aria-labelledby", "answers-title");
answers.appendChild(answersTitle);
}
if (!(element instanceof HTMLElement)) {
const span = document.createElement("span");
span.innerHTML = element.toString();
// biome-ignore lint/style/noParameterAssign: TODO
element = span;
}
answers.appendChild(element);
results.insertAdjacentElement("afterbegin", answers);
};


@@ -0,0 +1,8 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
type AssertElement = <T>(element?: T | null) => asserts element is T;
export const assertElement: AssertElement = <T>(element?: T | null): asserts element is T => {
if (!element) {
throw new Error("DOM element not found");
}
};


@@ -0,0 +1,21 @@
// SPDX-License-Identifier: AGPL-3.0-or-later
import { assertElement } from "./assertElement.ts";
type Options = {
assert?: boolean;
};
export function getElement<T>(id: string, options?: { assert: true }): T;
export function getElement<T>(id: string, options?: { assert: false }): T | null;
export function getElement<T>(id: string, options: Options = {}): T | null {
options.assert ??= true;
const element = document.getElementById(id) as T | null;
if (options.assert) {
assertElement(element);
}
return element;
}
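
A quick sketch of how the two overloads above behave for a caller (the element ids are only examples): the default asserting call throws if the element is missing, while { assert: false } returns null and leaves the check to the caller.

// With the default assertion: typed as HTMLInputElement, throws if #q is absent.
const searchInput = getElement<HTMLInputElement>("q");

// Without assertion: typed as HTMLDivElement | null, the caller handles null.
const answers = getElement<HTMLDivElement>("answers", { assert: false });
if (answers) {
  console.debug(answers.id);
}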


@@ -1,19 +1,16 @@
 // SPDX-License-Identifier: AGPL-3.0-or-later
-iframe[src^="https://w.soundcloud.com"]
-{
+iframe[src^="https://w.soundcloud.com"] {
   height: 120px;
 }
-iframe[src^="https://www.deezer.com"]
-{
+iframe[src^="https://www.deezer.com"] {
   // The real size is 92px, but 94px are needed to avoid an inner scrollbar of
   // the embedded HTML.
   height: 94px;
 }
-iframe[src^="https://www.mixcloud.com"]
-{
+iframe[src^="https://www.mixcloud.com"] {
   // the embedded player from mixcloud has some quirks: initial there is an
   // issue with an image URL that is blocked since it is an a Cross-Origin
   // request. The alternative text (<img alt='Mixcloud Logo'> then cause an
@@ -23,19 +20,16 @@ iframe[src^="https://www.mixcloud.com"]
   height: 250px;
 }
-iframe[src^="https://bandcamp.com/EmbeddedPlayer"]
-{
+iframe[src^="https://bandcamp.com/EmbeddedPlayer"] {
   // show playlist
   height: 350px;
 }
-iframe[src^="https://bandcamp.com/EmbeddedPlayer/track"]
-{
+iframe[src^="https://bandcamp.com/EmbeddedPlayer/track"] {
   // hide playlist
   height: 120px;
 }
-iframe[src^="https://genius.com/songs"]
-{
+iframe[src^="https://genius.com/songs"] {
   height: 65px;
 }


@@ -8,7 +8,7 @@
 text-align: center;
 .title {
-  background: url("../img/searxng.png") no-repeat;
+  background: url("./img/searxng.png") no-repeat;
   min-height: 4rem;
   margin: 4rem auto;
   background-position: center;


@@ -46,39 +46,34 @@ export default {
 sourcemap: true,
 rolldownOptions: {
   input: {
-    // build CSS files
-    "searxng-ltr.css": `${PATH.src}/less/style-ltr.less`,
-    "searxng-rtl.css": `${PATH.src}/less/style-rtl.less`,
-    "rss.css": `${PATH.src}/less/rss.less`,
-    // build script files
-    "searxng.core": `${PATH.src}/js/core/index.ts`,
-    // ol pkg
-    ol: `${PATH.src}/js/pkg/ol.ts`,
-    "ol.css": `${PATH.modules}/ol/ol.css`
+    // entrypoint
+    core: `${PATH.src}/js/index.ts`,
+    // stylesheets
+    ltr: `${PATH.src}/less/style-ltr.less`,
+    rtl: `${PATH.src}/less/style-rtl.less`,
+    rss: `${PATH.src}/less/rss.less`
   },
   // file naming conventions / pathnames are relative to outDir (PATH.dist)
   output: {
-    entryFileNames: "js/[name].min.js",
-    chunkFileNames: "js/[name].min.js",
+    entryFileNames: "sxng-[name].min.js",
+    chunkFileNames: "chunk/[hash].min.js",
     assetFileNames: ({ names }: PreRenderedAsset): string => {
       const [name] = names;
-      const extension = name?.split(".").pop();
-      switch (extension) {
+      switch (name?.split(".").pop()) {
         case "css":
-          return "css/[name].min[extname]";
-        case "js":
-          return "js/[name].min[extname]";
-        case "png":
-        case "svg":
-          return "img/[name][extname]";
+          return "sxng-[name].min[extname]";
         default:
-          console.warn("Unknown asset:", name);
-          return "[name][extname]";
+          return "sxng-[name][extname]";
       }
-    }
+    },
+    sanitizeFileName: (name: string): string => {
+      return name
+        .normalize("NFD")
+        .replace(/[^a-zA-Z0-9.-]/g, "_")
+        .toLowerCase();
+    }
   }
 }


@@ -124,14 +124,17 @@ engine is shown. Most of the options have a default value or even are optional.
 ``api_key`` : optional
   In a few cases, using an API needs the use of a secret key. How to obtain them
-  is described in the file.
+  is described in the file. Engines that require an API key are set to
+  ``inactive: true`` by default. To enable such an engine, provide the API key
+  and set ``inactive: false``.
 ``disabled`` : optional
   To disable by default the engine, but not deleting it. It will allow the user
   to manually activate it in the settings.
 ``inactive``: optional
-  Remove the engine from the settings (*disabled & removed*).
+  Remove the engine from the settings (*disabled & removed*). This defaults to ``true`` for engines
+  that require an API key, please see the ``api_key`` section if you want to enable such an engine.
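
As a minimal sketch, enabling such an engine in ``settings.yml`` could look like this (the engine name and key value are placeholders):

.. code-block:: yaml

   - name: example engine
     engine: example
     api_key: "YOUR-API-KEY"
     inactive: false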
 ``language`` : optional
   If you want to use another language for a specific engine, you can define it


@@ -69,6 +69,9 @@ The built-in plugins are all located in the namespace `searx.plugins`.
   searx.plugins.calculator.SXNGPlugin:
     active: true
+  searx.plugins.infinite_scroll.SXNGPlugin:
+    active: false
   searx.plugins.hash_plugin.SXNGPlugin:
     active: true


@@ -12,7 +12,6 @@
 ui:
   default_locale: ""
   query_in_title: false
-  infinite_scroll: false
   center_alignment: false
   cache_url: https://web.archive.org/web/
   default_theme: simple
@@ -32,9 +31,6 @@
   When true, the result page's titles contains the query it decreases the
   privacy, since the browser can records the page titles.
-``infinite_scroll``:
-  When true, automatically loads the next page when scrolling to bottom of the current page.
 ``center_alignment`` : default ``false``
   When enabled, the results are centered instead of being in the left (or RTL)
   side of the screen. This setting only affects the *desktop layout*


@@ -1,8 +0,0 @@
.. _voidlinux mullvad_leta:
============
Mullvad-Leta
============
.. automodule:: searx.engines.mullvad_leta
:members:


@@ -10,6 +10,7 @@ Built-in Plugins
   calculator
   hash_plugin
   hostnames
+  infinite_scroll
   self_info
   tor_check
   unit_converter


@@ -0,0 +1,8 @@
.. _plugins.infinite_scroll:
===============
Infinite scroll
===============
.. automodule:: searx.plugins.infinite_scroll
:members:


@@ -4,15 +4,33 @@
 Search API
 ==========
-The search supports both ``GET`` and ``POST``.
-Furthermore, two endpoints ``/`` and ``/search`` are available for querying.
+SearXNG supports querying via a simple HTTP API.
+Two endpoints, ``/`` and ``/search``, are supported for both GET and POST methods.
+The GET method expects parameters as URL query parameters, while the POST method expects parameters as form data.
+If you want to consume the results as JSON, CSV, or RSS, you need to set the
+``format`` parameter accordingly. Supported formats are defined in ``settings.yml``, under the ``search`` section.
+Requesting an unset format will return a 403 Forbidden error. Be aware that many public instances have these formats disabled.
+Endpoints:
 ``GET /``
 ``GET /search``
+``POST /``
+``POST /search``
+example cURL calls:
+.. code-block:: bash
+
+   curl 'https://searx.example.org/search?q=searxng&format=json'
+   curl -X POST 'https://searx.example.org/search' -d 'q=searxng&format=csv'
+   curl -L -X POST -d 'q=searxng&format=json' 'https://searx.example.org/'
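
As a sketch of consuming the JSON format from a client (assuming ``json`` is enabled on the instance; ``results`` entries with ``title``, ``url`` and ``content`` are the usual fields of the JSON response):

.. code-block:: typescript

   // Query an instance and print result titles and URLs.
   const response = await fetch("https://searx.example.org/search?q=searxng&format=json");
   if (!response.ok) {
     // e.g. 403 Forbidden when the json format is not enabled
     throw new Error(`Search failed: ${response.status}`);
   }
   const data = (await response.json()) as { results: { title: string; url: string; content?: string }[] };
   for (const result of data.results) {
     console.log(`${result.title} - ${result.url}`);
   }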
 Parameters
 ==========


@@ -1,7 +1,7 @@
 [tools]
 # minimal version we support
 python = "3.10"
-node = "24.3.0"
+node = "25"
 go = "1.24.5"
 shellcheck = "0.11.0"
 # python 3.10 uses 3.40.1 (on mac and win)


@@ -2,9 +2,9 @@ mock==5.2.0
 nose2[coverage_plugin]==0.15.1
 cov-core==1.15.0
 black==25.9.0
-pylint==4.0.3
+pylint==4.0.4
 splinter==0.21.0
-selenium==4.38.0
+selenium==4.39.0
 Pallets-Sphinx-Themes==2.3.0
 Sphinx==8.2.3 ; python_version >= '3.11'
 Sphinx==8.1.3 ; python_version < '3.11'
@@ -23,6 +23,6 @@ wlc==1.16.1
 coloredlogs==15.0.1
 docutils>=0.21.2
 parameterized==0.9.0
-granian[reload]==2.5.7
-basedpyright==1.33.0
-types-lxml==2025.8.25
+granian[reload]==2.6.0
+basedpyright==1.36.2
+types-lxml==2026.1.1


@@ -1,2 +1,2 @@
-granian==2.5.7
-granian[pname]==2.5.7
+granian==2.6.0
+granian[pname]==2.6.0


@@ -9,12 +9,13 @@ python-dateutil==2.9.0.post0
 pyyaml==6.0.3
 httpx[http2]==0.28.1
 httpx-socks[asyncio]==0.10.0
+sniffio==1.3.1
 valkey==6.1.1
 markdown-it-py==3.0.0
 fasttext-predict==0.9.2.4
 tomli==2.3.0; python_version < '3.11'
-msgspec==0.19.0
+msgspec==0.20.0
-typer-slim==0.20.0
+typer-slim==0.21.0
 isodate==0.7.2
 whitenoise==6.11.0
 typing-extensions==4.15.0


@@ -5,10 +5,6 @@
 ----
 """
-# Struct fields aren't discovered in Python 3.14
-# - https://github.com/searxng/searxng/issues/5284
-from __future__ import annotations
 __all__ = ["ExpireCacheCfg", "ExpireCacheStats", "ExpireCache", "ExpireCacheSQLite"]
 import abc


@@ -321,6 +321,7 @@
"ja": "アルゼンチン・ペソ", "ja": "アルゼンチン・ペソ",
"ko": "아르헨티나 페소", "ko": "아르헨티나 페소",
"lt": "Argentinos pesas", "lt": "Argentinos pesas",
"lv": "Argentīnas peso",
"ms": "Peso Argentina", "ms": "Peso Argentina",
"nl": "Argentijnse peso", "nl": "Argentijnse peso",
"oc": "Peso", "oc": "Peso",
@@ -344,6 +345,7 @@
"af": "Australiese dollar", "af": "Australiese dollar",
"ar": "دولار أسترالي", "ar": "دولار أسترالي",
"bg": "Австралийски долар", "bg": "Австралийски долар",
"bn": "অস্ট্রেলীয় ডলার",
"ca": "dòlar australià", "ca": "dòlar australià",
"cs": "australský dolar", "cs": "australský dolar",
"cy": "Doler Awstralia", "cy": "Doler Awstralia",
@@ -565,6 +567,7 @@
"fi": "Bangladeshin taka", "fi": "Bangladeshin taka",
"fr": "taka", "fr": "taka",
"ga": "taka na Banglaidéise", "ga": "taka na Banglaidéise",
"gl": "taka",
"he": "טאקה", "he": "טאקה",
"hr": "Bangladeška taka", "hr": "Bangladeška taka",
"hu": "bangladesi taka", "hu": "bangladesi taka",
@@ -645,6 +648,7 @@
"fi": "Bahrainin dinaari", "fi": "Bahrainin dinaari",
"fr": "dinar bahreïnien", "fr": "dinar bahreïnien",
"ga": "dinar Bhairéin", "ga": "dinar Bhairéin",
"gl": "dinar de Bahrain",
"he": "דינר בחרייני", "he": "דינר בחרייני",
"hr": "Bahreinski dinar", "hr": "Bahreinski dinar",
"hu": "bahreini dinár", "hu": "bahreini dinár",
@@ -803,6 +807,7 @@
"ja": "ボリビアーノ", "ja": "ボリビアーノ",
"ko": "볼리비아 볼리비아노", "ko": "볼리비아 볼리비아노",
"lt": "Bolivianas", "lt": "Bolivianas",
"lv": "Bolīvijas boliviano",
"ms": "Boliviano", "ms": "Boliviano",
"nl": "Boliviaanse boliviano", "nl": "Boliviaanse boliviano",
"oc": "Boliviano", "oc": "Boliviano",
@@ -967,7 +972,7 @@
"et": "Botswana pula", "et": "Botswana pula",
"eu": "Pula", "eu": "Pula",
"fi": "Pula", "fi": "Pula",
"fr": "pula", "fr": "Pula",
"ga": "pula na Botsuáine", "ga": "pula na Botsuáine",
"gl": "Pula", "gl": "Pula",
"he": "פולה", "he": "פולה",
@@ -1273,6 +1278,10 @@
"uk": "Чилійський песо", "uk": "Чилійський песо",
"vi": "Peso Chile" "vi": "Peso Chile"
}, },
"CNH": {
"en": "renminbi (offshore)",
"es": "yuan offshore"
},
"CNY": { "CNY": {
"af": "Renminbi", "af": "Renminbi",
"ar": "رنمينبي", "ar": "رنمينبي",
@@ -1285,7 +1294,7 @@
"da": "Renminbi", "da": "Renminbi",
"de": "Renminbi", "de": "Renminbi",
"dv": "ރެންމިބީ", "dv": "ރެންމިބީ",
"en": "renminbi", "en": "CNY",
"eo": "Renminbio", "eo": "Renminbio",
"es": "yuan chino", "es": "yuan chino",
"et": "Renminbi", "et": "Renminbi",
@@ -1329,7 +1338,7 @@
"cs": "Kolumbijské peso", "cs": "Kolumbijské peso",
"da": "Colombiansk peso", "da": "Colombiansk peso",
"de": "kolumbianischer Peso", "de": "kolumbianischer Peso",
"en": "peso", "en": "Colombian peso",
"eo": "kolombia peso", "eo": "kolombia peso",
"es": "peso", "es": "peso",
"et": "Colombia peeso", "et": "Colombia peeso",
@@ -1413,7 +1422,7 @@
"cy": "peso (Ciwba)", "cy": "peso (Ciwba)",
"da": "Cubanske pesos", "da": "Cubanske pesos",
"de": "kubanischer Peso", "de": "kubanischer Peso",
"en": "peso", "en": "Cuban peso",
"eo": "kuba peso", "eo": "kuba peso",
"es": "peso", "es": "peso",
"fi": "Kuuban peso", "fi": "Kuuban peso",
@@ -1459,6 +1468,7 @@
"fi": "Kap Verden escudo", "fi": "Kap Verden escudo",
"fr": "escudo cap-verdien", "fr": "escudo cap-verdien",
"ga": "escudo Rinn Verde", "ga": "escudo Rinn Verde",
"gl": "escudo caboverdiano",
"he": "אשקודו כף ורדי", "he": "אשקודו כף ורדי",
"hr": "Zelenortski eskudo", "hr": "Zelenortski eskudo",
"hu": "zöld-foki köztársasági escudo", "hu": "zöld-foki köztársasági escudo",
@@ -1467,6 +1477,7 @@
"ja": "カーボベルデ・エスクード", "ja": "カーボベルデ・エスクード",
"ko": "카보베르데 이스쿠두", "ko": "카보베르데 이스쿠두",
"lt": "Žaliojo Kyšulio eskudas", "lt": "Žaliojo Kyšulio eskudas",
"lv": "Kaboverdes eskudo",
"nl": "Kaapverdische escudo", "nl": "Kaapverdische escudo",
"oc": "Escut de Cap Verd", "oc": "Escut de Cap Verd",
"pl": "escudo Zielonego Przylądka", "pl": "escudo Zielonego Przylądka",
@@ -1567,7 +1578,7 @@
"ar": "كرونة دنماركية", "ar": "كرونة دنماركية",
"bg": "Датска крона", "bg": "Датска крона",
"ca": "corona danesa", "ca": "corona danesa",
"cs": "Dánská koruna", "cs": "dánská koruna",
"cy": "Krone Danaidd", "cy": "Krone Danaidd",
"da": "dansk krone", "da": "dansk krone",
"de": "dänische Krone", "de": "dänische Krone",
@@ -1806,7 +1817,7 @@
"bg": "евро", "bg": "евро",
"bn": "ইউরো", "bn": "ইউরো",
"ca": "euro", "ca": "euro",
"cs": "Euro", "cs": "euro",
"cy": "Ewro", "cy": "Ewro",
"da": "Euro", "da": "Euro",
"de": "Euro", "de": "Euro",
@@ -1955,7 +1966,7 @@
"lt": "svaras sterlingų", "lt": "svaras sterlingų",
"lv": "sterliņu mārciņa", "lv": "sterliņu mārciņa",
"ms": "paun sterling", "ms": "paun sterling",
"nl": "pond sterling", "nl": "Britse pond",
"oc": "liure esterlina", "oc": "liure esterlina",
"pa": "ਪਾਊਂਡ ਸਟਰਲਿੰਗ", "pa": "ਪਾਊਂਡ ਸਟਰਲਿੰਗ",
"pl": "funt szterling", "pl": "funt szterling",
@@ -2026,7 +2037,7 @@
"eo": "ganaa cedio", "eo": "ganaa cedio",
"es": "cedi", "es": "cedi",
"fi": "Cedi", "fi": "Cedi",
"fr": "cedi", "fr": "Cedi",
"ga": "cedi", "ga": "cedi",
"gl": "Cedi", "gl": "Cedi",
"he": "סדי גאני", "he": "סדי גאני",
@@ -2037,6 +2048,7 @@
"ja": "セディ", "ja": "セディ",
"ko": "가나 세디", "ko": "가나 세디",
"lt": "Sedis", "lt": "Sedis",
"lv": "Ganas sedi",
"ms": "Cedi Ghana", "ms": "Cedi Ghana",
"nl": "Ghanese cedi", "nl": "Ghanese cedi",
"oc": "Cedi", "oc": "Cedi",
@@ -2064,7 +2076,7 @@
"es": "libra gibraltareña", "es": "libra gibraltareña",
"et": "Gibraltari nael", "et": "Gibraltari nael",
"fi": "Gibraltarin punta", "fi": "Gibraltarin punta",
"fr": "livre de Gibraltar", "fr": "Livre de Gibraltar",
"ga": "punt Ghiobráltar", "ga": "punt Ghiobráltar",
"gl": "Libra de Xibraltar", "gl": "Libra de Xibraltar",
"he": "לירה גיברלטרית", "he": "לירה גיברלטרית",
@@ -2151,6 +2163,7 @@
"ja": "ギニア・フラン", "ja": "ギニア・フラン",
"ko": "기니 프랑", "ko": "기니 프랑",
"lt": "Gvinėjos frankas", "lt": "Gvinėjos frankas",
"lv": "Gvinejas franks",
"ms": "Franc Guinea", "ms": "Franc Guinea",
"nl": "Guineese frank", "nl": "Guineese frank",
"oc": "Franc guinean", "oc": "Franc guinean",
@@ -2967,6 +2980,7 @@
"ms": "Won Korea Utara", "ms": "Won Korea Utara",
"nl": "Noord-Koreaanse won", "nl": "Noord-Koreaanse won",
"pa": "ਉੱਤਰੀ ਕੋਰੀਆਈ ਵੌਨ", "pa": "ਉੱਤਰੀ ਕੋਰੀਆਈ ਵੌਨ",
"pap": "won nortkoreano",
"pl": "Won północnokoreański", "pl": "Won północnokoreański",
"pt": "won norte-coreano", "pt": "won norte-coreano",
"ro": "Won nord-coreean", "ro": "Won nord-coreean",
@@ -3576,7 +3590,7 @@
"cy": "tögrög Mongolia", "cy": "tögrög Mongolia",
"da": "Tugrik", "da": "Tugrik",
"de": "Tögrög", "de": "Tögrög",
"en": "tugrik", "en": "Mongolian tögrög",
"eo": "mongola tugriko", "eo": "mongola tugriko",
"es": "tugrik mongol", "es": "tugrik mongol",
"fi": "Mongolian tugrik", "fi": "Mongolian tugrik",
@@ -3793,7 +3807,7 @@
"bg": "Мексиканско песо", "bg": "Мексиканско песо",
"ca": "peso mexicà", "ca": "peso mexicà",
"cs": "Mexické peso", "cs": "Mexické peso",
"cy": "peso (Mecsico)", "cy": "peso",
"de": "Mexikanischer Peso", "de": "Mexikanischer Peso",
"en": "peso", "en": "peso",
"eo": "meksika peso", "eo": "meksika peso",
@@ -3813,6 +3827,7 @@
"ja": "メキシコ・ペソ", "ja": "メキシコ・ペソ",
"ko": "멕시코 페소", "ko": "멕시코 페소",
"lt": "Meksikos pesas", "lt": "Meksikos pesas",
"lv": "Meksikas peso",
"ms": "Peso Mexico", "ms": "Peso Mexico",
"nl": "Mexicaanse peso", "nl": "Mexicaanse peso",
"pa": "ਮੈਕਸੀਕੀ ਪੇਸੋ", "pa": "ਮੈਕਸੀਕੀ ਪੇਸੋ",
@@ -3828,7 +3843,7 @@
"tr": "Meksika pesosu", "tr": "Meksika pesosu",
"tt": "Миксикә писысы", "tt": "Миксикә писысы",
"uk": "мексиканський песо", "uk": "мексиканський песо",
"vi": "Peso Mexico" "vi": "peso"
}, },
"MXV": { "MXV": {
"de": "UNIDAD DE INVERSION", "de": "UNIDAD DE INVERSION",
@@ -3841,7 +3856,7 @@
"ar": "رينغيت ماليزي", "ar": "رينغيت ماليزي",
"bg": "Малайзийски рингит", "bg": "Малайзийски рингит",
"ca": "ringgit", "ca": "ringgit",
"cs": "Malajsijský ringgit", "cs": "malajsijský ringgit",
"cy": "ringgit Maleisia", "cy": "ringgit Maleisia",
"de": "Ringgit", "de": "Ringgit",
"en": "Malaysian ringgit", "en": "Malaysian ringgit",
@@ -3882,7 +3897,7 @@
"MZN": { "MZN": {
"ar": "مثقال موزنبيقي", "ar": "مثقال موزنبيقي",
"ca": "metical", "ca": "metical",
"cs": "Mosambický metical", "cs": "mosambický metical",
"cy": "Metical Mosambic", "cy": "Metical Mosambic",
"da": "Metical", "da": "Metical",
"de": "Metical", "de": "Metical",
@@ -3975,6 +3990,7 @@
"ja": "ナイラ", "ja": "ナイラ",
"ko": "나이지리아 나이라", "ko": "나이지리아 나이라",
"lt": "Naira", "lt": "Naira",
"lv": "Nigērijas naira",
"ms": "Naira Nigeria", "ms": "Naira Nigeria",
"nl": "Nigeriaanse naira", "nl": "Nigeriaanse naira",
"oc": "Naira", "oc": "Naira",
@@ -4031,7 +4047,7 @@
"ar": "كرونة نروجية", "ar": "كرونة نروجية",
"bg": "норвежка крона", "bg": "норвежка крона",
"ca": "corona noruega", "ca": "corona noruega",
"cs": "Norská koruna", "cs": "norská koruna",
"cy": "krone Norwy", "cy": "krone Norwy",
"da": "norsk krone", "da": "norsk krone",
"de": "norwegische Krone", "de": "norwegische Krone",
@@ -4258,6 +4274,7 @@
"ja": "ヌエボ・ソル", "ja": "ヌエボ・ソル",
"ko": "페루 솔", "ko": "페루 솔",
"lt": "Naujasis solis", "lt": "Naujasis solis",
"lv": "Peru sols",
"ms": "Nuevo Sol Peru", "ms": "Nuevo Sol Peru",
"nl": "Peruviaanse sol", "nl": "Peruviaanse sol",
"oc": "Nuevo Sol", "oc": "Nuevo Sol",
@@ -4660,7 +4677,7 @@
"eo": "rusa rublo", "eo": "rusa rublo",
"es": "rublo ruso", "es": "rublo ruso",
"et": "Venemaa rubla", "et": "Venemaa rubla",
"eu": "Errusiar errublo", "eu": "errusiar errublo",
"fi": "Venäjän rupla", "fi": "Venäjän rupla",
"fr": "rouble russe", "fr": "rouble russe",
"ga": "rúbal na Rúise", "ga": "rúbal na Rúise",
@@ -4744,6 +4761,7 @@
"fi": "Saudi-Arabian rial", "fi": "Saudi-Arabian rial",
"fr": "riyal saoudien", "fr": "riyal saoudien",
"ga": "riyal na hAraibe Sádaí", "ga": "riyal na hAraibe Sádaí",
"gl": "riyal saudita",
"he": "ריאל סעודי", "he": "ריאל סעודי",
"hr": "Saudijski rijal", "hr": "Saudijski rijal",
"hu": "szaúdi riál", "hu": "szaúdi riál",
@@ -4782,7 +4800,7 @@
"en": "Solomon Islands dollar", "en": "Solomon Islands dollar",
"eo": "salomona dolaro", "eo": "salomona dolaro",
"es": "dólar de las Islas Salomón", "es": "dólar de las Islas Salomón",
"fi": "Salomonsaarten dollari", "fi": "Salomoninsaarten dollari",
"fr": "dollar des îles Salomon", "fr": "dollar des îles Salomon",
"ga": "dollar Oileáin Sholaimh", "ga": "dollar Oileáin Sholaimh",
"gl": "Dólar das Illas Salomón", "gl": "Dólar das Illas Salomón",
@@ -5018,6 +5036,7 @@
"ja": "レオン", "ja": "レオン",
"ko": "시에라리온 레온", "ko": "시에라리온 레온",
"lt": "leonė", "lt": "leonė",
"lv": "Sjerraleones leone",
"ms": "leone", "ms": "leone",
"nl": "Sierra Leoonse leone", "nl": "Sierra Leoonse leone",
"oc": "leone", "oc": "leone",
@@ -5055,6 +5074,7 @@
"ja": "ソマリア・シリング", "ja": "ソマリア・シリング",
"ko": "소말리아 실링", "ko": "소말리아 실링",
"lt": "Somalio šilingas", "lt": "Somalio šilingas",
"lv": "Somālijas šiliņš",
"ms": "Shilling Somalia", "ms": "Shilling Somalia",
"nl": "Somalische shilling", "nl": "Somalische shilling",
"pl": "Szyling somalijski", "pl": "Szyling somalijski",
@@ -5404,6 +5424,7 @@
"oc": "dinar tunisian", "oc": "dinar tunisian",
"pl": "Dinar tunezyjski", "pl": "Dinar tunezyjski",
"pt": "dinar tunisiano", "pt": "dinar tunisiano",
"ro": "dinar tunisian",
"ru": "тунисский динар", "ru": "тунисский динар",
"sk": "Tuniský dinár", "sk": "Tuniský dinár",
"sl": "tunizijski dinar", "sl": "tunizijski dinar",
@@ -5500,7 +5521,7 @@
"TTD": { "TTD": {
"ar": "دولار ترينيداد وتوباغو", "ar": "دولار ترينيداد وتوباغو",
"bg": "Тринидадски и тобагски долар", "bg": "Тринидадски и тобагски долар",
"ca": "dòlar de Trinitat i Tobago", "ca": "dòlar de Trinidad i Tobago",
"cs": "Dolar Trinidadu a Tobaga", "cs": "Dolar Trinidadu a Tobaga",
"cy": "doler Trinidad a Thobago", "cy": "doler Trinidad a Thobago",
"de": "Trinidad-und-Tobago-Dollar", "de": "Trinidad-und-Tobago-Dollar",
@@ -5718,7 +5739,7 @@
"lv": "ASV dolārs", "lv": "ASV dolārs",
"ml": "യുണൈറ്റഡ് സ്റ്റേറ്റ്സ് ഡോളർ", "ml": "യുണൈറ്റഡ് സ്റ്റേറ്റ്സ് ഡോളർ",
"ms": "Dolar Amerika Syarikat", "ms": "Dolar Amerika Syarikat",
"nl": "US dollar", "nl": "Amerikaanse dollar",
"oc": "dolar american", "oc": "dolar american",
"pa": "ਸੰਯੁਕਤ ਰਾਜ ਡਾਲਰ", "pa": "ਸੰਯੁਕਤ ਰਾਜ ਡਾਲਰ",
"pap": "Dollar merikano", "pap": "Dollar merikano",
@@ -5744,7 +5765,7 @@
"en": "US Dollar (Next day)" "en": "US Dollar (Next day)"
}, },
"UYI": { "UYI": {
"en": "Uruguay peso en Unidades Indexadas" "en": "Uruguay Peso en Unidades Indexadas"
}, },
"UYU": { "UYU": {
"af": "Uruguaanse Peso", "af": "Uruguaanse Peso",
@@ -5813,6 +5834,7 @@
"nl": "Oezbeekse sum", "nl": "Oezbeekse sum",
"oc": "som ozbèc", "oc": "som ozbèc",
"pa": "ਉਜ਼ਬੇਕਿਸਤਾਨੀ ਸੋਮ", "pa": "ਉਜ਼ਬੇਕਿਸਤਾਨੀ ਸੋਮ",
"pap": "som usbekistani",
"pl": "Sum", "pl": "Sum",
"pt": "som usbeque", "pt": "som usbeque",
"ro": "Som uzbec", "ro": "Som uzbec",
@@ -5838,6 +5860,7 @@
"en": "sovereign bolivar", "en": "sovereign bolivar",
"es": "bolívar soberano", "es": "bolívar soberano",
"fr": "bolivar souverain", "fr": "bolivar souverain",
"gl": "bolívar soberano",
"hu": "venezuelai bolívar", "hu": "venezuelai bolívar",
"ja": "ボリバル・ソベラノ", "ja": "ボリバル・ソベラノ",
"pt": "Bolívar soberano", "pt": "Bolívar soberano",
@@ -6578,10 +6601,13 @@
"R": "ZAR", "R": "ZAR",
"R$": "BRL", "R$": "BRL",
"RD$": "DOP", "RD$": "DOP",
"RF": "RWF",
"RM": "MYR", "RM": "MYR",
"RWF": "RWF",
"Rf": "MVR", "Rf": "MVR",
"Rp": "IDR", "Rp": "IDR",
"Rs": "LKR", "Rs": "LKR",
"R₣": "RWF",
"S$": "SGD", "S$": "SGD",
"S/.": "PEN", "S/.": "PEN",
"SI$": "SBD", "SI$": "SBD",
@@ -6601,6 +6627,7 @@
"Ush": "UGX", "Ush": "UGX",
"VT": "VUV", "VT": "VUV",
"WS$": "WST", "WS$": "WST",
"XAF": "XAF",
"XCG": "XCG", "XCG": "XCG",
"XDR": "XDR", "XDR": "XDR",
"Z$": "ZWL", "Z$": "ZWL",
@@ -6726,6 +6753,7 @@
"argentinské peso": "ARS", "argentinské peso": "ARS",
"argentinski peso": "ARS", "argentinski peso": "ARS",
"argentinski pezo": "ARS", "argentinski pezo": "ARS",
"argentīnas peso": "ARS",
"ariari": "MGA", "ariari": "MGA",
"ariari de madagascar": "MGA", "ariari de madagascar": "MGA",
"ariari de madagáscar": "MGA", "ariari de madagáscar": "MGA",
@@ -7008,6 +7036,7 @@
"birr etiopia": "ETB", "birr etiopia": "ETB",
"birr etíope": "ETB", "birr etíope": "ETB",
"birr éthiopien": "ETB", "birr éthiopien": "ETB",
"birr éthiopienne": "ETB",
"birr habsyah": "ETB", "birr habsyah": "ETB",
"birr na haetóipe": "ETB", "birr na haetóipe": "ETB",
"birre da etiópia": "ETB", "birre da etiópia": "ETB",
@@ -7045,6 +7074,7 @@
"bolívar soberano": "VES", "bolívar soberano": "VES",
"bolívar sobirà": "VES", "bolívar sobirà": "VES",
"bolíviai boliviano": "BOB", "bolíviai boliviano": "BOB",
"bolīvijas boliviano": "BOB",
"bosenská konvertibilní marka": "BAM", "bosenská konvertibilní marka": "BAM",
"bosna hersek değiştirilebilir markı": "BAM", "bosna hersek değiştirilebilir markı": "BAM",
"bosnia and herzegovina convertible mark": "BAM", "bosnia and herzegovina convertible mark": "BAM",
@@ -7193,6 +7223,7 @@
"ceatsal": "GTQ", "ceatsal": "GTQ",
"cebelitarık sterlini": "GIP", "cebelitarık sterlini": "GIP",
"cedi": "GHS", "cedi": "GHS",
"cedi du ghana": "GHS",
"cedi ghana": "GHS", "cedi ghana": "GHS",
"cedi ghanese": "GHS", "cedi ghanese": "GHS",
"centr afrika franko": "XAF", "centr afrika franko": "XAF",
@@ -7260,7 +7291,10 @@
"chilensk peso": "CLP", "chilensk peso": "CLP",
"chilské peso": "CLP", "chilské peso": "CLP",
"chinese renminbi": "CNY", "chinese renminbi": "CNY",
"chinese yuan": "CNY", "chinese yuan": [
"CNY",
"CNH"
],
"chinesischer renminbi": "CNY", "chinesischer renminbi": "CNY",
"ci$": "KYD", "ci$": "KYD",
"cibuti frangı": "DJF", "cibuti frangı": "DJF",
@@ -7270,6 +7304,7 @@
"clp": "CLP", "clp": "CLP",
"clp$": "CLP", "clp$": "CLP",
"clps": "CLP", "clps": "CLP",
"cnh": "CNH",
"cny": "CNY", "cny": "CNY",
"co $": "COP", "co $": "COP",
"co$": "COP", "co$": "COP",
@@ -7510,7 +7545,6 @@
"203" "203"
], ],
"cирійський фунт": "SYP", "cирійський фунт": "SYP",
"d.r.": "EGP",
"da": "DZD", "da": "DZD",
"dalase": "GMD", "dalase": "GMD",
"dalasi": "GMD", "dalasi": "GMD",
@@ -8190,6 +8224,7 @@
"HKD", "HKD",
"AUD" "AUD"
], ],
"dollars barbados": "BBD",
"dom$": "DOP", "dom$": "DOP",
"dominga peso": "DOP", "dominga peso": "DOP",
"dominicaanse peso": "DOP", "dominicaanse peso": "DOP",
@@ -8242,9 +8277,7 @@
"dòlar de singapur": "SGD", "dòlar de singapur": "SGD",
"dòlar de surinam": "SRD", "dòlar de surinam": "SRD",
"dòlar de taiwan": "TWD", "dòlar de taiwan": "TWD",
"dòlar de trinitat": "TTD", "dòlar de trinidad i tobago": "TTD",
"dòlar de trinitat i tobago": "TTD",
"dòlar de trinitat tobago": "TTD",
"dòlar de zimbàbue": "ZWL", "dòlar de zimbàbue": "ZWL",
"dòlar del canadà": "CAD", "dòlar del canadà": "CAD",
"dòlar del carib oriental": "XCD", "dòlar del carib oriental": "XCD",
@@ -8406,7 +8439,6 @@
"dólares canadenses": "CAD", "dólares canadenses": "CAD",
"dólares estadounidenses": "USD", "dólares estadounidenses": "USD",
"dólares neozelandeses": "NZD", "dólares neozelandeses": "NZD",
"dr": "EGP",
"dram": "AMD", "dram": "AMD",
"dram armean": "AMD", "dram armean": "AMD",
"dram armenia": "AMD", "dram armenia": "AMD",
@@ -8434,7 +8466,6 @@
"džibučio frankas": "DJF", "džibučio frankas": "DJF",
"džibutski franak": "DJF", "džibutski franak": "DJF",
"džibutský frank": "DJF", "džibutský frank": "DJF",
"d£": "EGP",
"e": "SZL", "e": "SZL",
"e rupee": "INR", "e rupee": "INR",
"e.m.u. 6": "XBB", "e.m.u. 6": "XBB",
@@ -8493,6 +8524,7 @@
"ermenistan dramı": "AMD", "ermenistan dramı": "AMD",
"ern": "ERN", "ern": "ERN",
"erreal brasildar": "BRL", "erreal brasildar": "BRL",
"errublo": "RUB",
"errublo errusiar": "RUB", "errublo errusiar": "RUB",
"errupia indiar": "INR", "errupia indiar": "INR",
"errupia indonesiar": "IDR", "errupia indonesiar": "IDR",
@@ -8905,6 +8937,7 @@
"gambijski dalasi": "GMD", "gambijski dalasi": "GMD",
"gambijský dalasi": "GMD", "gambijský dalasi": "GMD",
"ganaa cedio": "GHS", "ganaa cedio": "GHS",
"ganas sedi": "GHS",
"ganski cedi": "GHS", "ganski cedi": "GHS",
"gbp": "GBP", "gbp": "GBP",
"gbp£": "GBP", "gbp£": "GBP",
@@ -9054,6 +9087,7 @@
"gvatemalski kvecal": "GTQ", "gvatemalski kvecal": "GTQ",
"gvatemalski quetzal": "GTQ", "gvatemalski quetzal": "GTQ",
"gvinea franko": "GNF", "gvinea franko": "GNF",
"gvinejas franks": "GNF",
"gvinejski franak": "GNF", "gvinejski franak": "GNF",
"gvinejski frank": "GNF", "gvinejski frank": "GNF",
"gvinėjos frankas": "GNF", "gvinėjos frankas": "GNF",
@@ -9381,6 +9415,7 @@
"kaaimaneilandse dollar": "KYD", "kaaimaneilandse dollar": "KYD",
"kaapverdische escudo": "CVE", "kaapverdische escudo": "CVE",
"kaboverda eskudo": "CVE", "kaboverda eskudo": "CVE",
"kaboverdes eskudo": "CVE",
"kaiman dollar": "KYD", "kaiman dollar": "KYD",
"kaimanu dolārs": "KYD", "kaimanu dolārs": "KYD",
"kaimanu salu dolārs": "KYD", "kaimanu salu dolārs": "KYD",
@@ -9790,6 +9825,7 @@
"lari na seoirsia": "GEL", "lari na seoirsia": "GEL",
"lario": "GEL", "lario": "GEL",
"laris": "GEL", "laris": "GEL",
"lári": "GEL",
"länsi afrikan cfa frangi": "XOF", "länsi afrikan cfa frangi": "XOF",
"lbp": "LBP", "lbp": "LBP",
"ld": "LYD", "ld": "LYD",
@@ -10214,6 +10250,7 @@
"manat de turkmenistan": "TMT", "manat de turkmenistan": "TMT",
"manat de turkmenistán": "TMT", "manat de turkmenistán": "TMT",
"manat del turkmenistan": "TMT", "manat del turkmenistan": "TMT",
"manat di azerbeidjan": "AZN",
"manat do azerbaijão": "AZN", "manat do azerbaijão": "AZN",
"manat na hasarbaiseáine": "AZN", "manat na hasarbaiseáine": "AZN",
"manat newydd tyrcmenestan": "TMT", "manat newydd tyrcmenestan": "TMT",
@@ -10316,6 +10353,7 @@
"meksika peso": "MXN", "meksika peso": "MXN",
"meksika pesosu": "MXN", "meksika pesosu": "MXN",
"meksikaanse peso": "MXN", "meksikaanse peso": "MXN",
"meksikas peso": "MXN",
"meksikon peso": "MXN", "meksikon peso": "MXN",
"meksikos pesas": "MXN", "meksikos pesas": "MXN",
"meticais": "MZN", "meticais": "MZN",
@@ -10552,6 +10590,7 @@
"nigerijská naira": "NGN", "nigerijská naira": "NGN",
"nigériai naira": "NGN", "nigériai naira": "NGN",
"nigérijská naira": "NGN", "nigérijská naira": "NGN",
"nigērijas naira": "NGN",
"niĝera najro": "NGN", "niĝera najro": "NGN",
"niĝeria najro": "NGN", "niĝeria najro": "NGN",
"nijerya nairası": "NGN", "nijerya nairası": "NGN",
@@ -10680,7 +10719,6 @@
"nuevo dólar taiwanes": "TWD", "nuevo dólar taiwanes": "TWD",
"nuevo dólar taiwanés": "TWD", "nuevo dólar taiwanés": "TWD",
"nuevo peso": [ "nuevo peso": [
"UYU",
"MXN", "MXN",
"ARS" "ARS"
], ],
@@ -10878,6 +10916,7 @@
"penny": "GBP", "penny": "GBP",
"perak sebagai pelaburan": "XAG", "perak sebagai pelaburan": "XAG",
"peru nueva solü": "PEN", "peru nueva solü": "PEN",
"peru sols": "PEN",
"perua nova suno": "PEN", "perua nova suno": "PEN",
"peruanischer nuevo sol": "PEN", "peruanischer nuevo sol": "PEN",
"peruanischer sol": "PEN", "peruanischer sol": "PEN",
@@ -10952,7 +10991,6 @@
"peso de méxico": "MXN", "peso de méxico": "MXN",
"peso de republica dominicana": "DOP", "peso de republica dominicana": "DOP",
"peso de república dominicana": "DOP", "peso de república dominicana": "DOP",
"peso de uruguay": "UYU",
"peso de xile": "CLP", "peso de xile": "CLP",
"peso do chile": "CLP", "peso do chile": "CLP",
"peso do uruguai": "UYU", "peso do uruguai": "UYU",
@@ -11231,7 +11269,10 @@
"rends": "ZAR", "rends": "ZAR",
"renmibi": "CNY", "renmibi": "CNY",
"renminb": "CNY", "renminb": "CNY",
"renminbi": "CNY", "renminbi": [
"CNH",
"CNY"
],
"renminbi cinese": "CNY", "renminbi cinese": "CNY",
"renminbi yuan": "CNY", "renminbi yuan": "CNY",
"renminbio": "CNY", "renminbio": "CNY",
@@ -11599,7 +11640,6 @@
"rúpia indiana": "INR", "rúpia indiana": "INR",
"rúpies": "INR", "rúpies": "INR",
"rūpija": "IDR", "rūpija": "IDR",
"rwanda franc": "RWF",
"rwanda frank": "RWF", "rwanda frank": "RWF",
"rwandan franc": "RWF", "rwandan franc": "RWF",
"rwandan frank": "RWF", "rwandan frank": "RWF",
@@ -11840,6 +11880,7 @@
"sistema unificato di compensazione regionale": "XSU", "sistema unificato di compensazione regionale": "XSU",
"sistema único de compensación regional": "XSU", "sistema único de compensación regional": "XSU",
"sjekel": "ILS", "sjekel": "ILS",
"sjerraleones leone": "SLE",
"sjevernokorejski von": "KPW", "sjevernokorejski von": "KPW",
"sle": "SLE", "sle": "SLE",
"sll": "SLE", "sll": "SLE",
@@ -11884,6 +11925,7 @@
"som ozbèc": "UZS", "som ozbèc": "UZS",
"som quirguiz": "KGS", "som quirguiz": "KGS",
"som usbeco": "UZS", "som usbeco": "UZS",
"som usbekistani": "UZS",
"som usbeque": "UZS", "som usbeque": "UZS",
"som uzbec": "UZS", "som uzbec": "UZS",
"som uzbeco": "UZS", "som uzbeco": "UZS",
@@ -11906,6 +11948,7 @@
"somas": "KGS", "somas": "KGS",
"somálsky šiling": "SOS", "somálsky šiling": "SOS",
"somálský šilink": "SOS", "somálský šilink": "SOS",
"somālijas šiliņš": "SOS",
"some": "KGS", "some": "KGS",
"somoni": "TJS", "somoni": "TJS",
"somoni na táidsíceastáine": "TJS", "somoni na táidsíceastáine": "TJS",
@@ -12675,6 +12718,7 @@
"won nord coréen": "KPW", "won nord coréen": "KPW",
"won nordcoreano": "KPW", "won nordcoreano": "KPW",
"won norte coreano": "KPW", "won norte coreano": "KPW",
"won nortkoreano": "KPW",
"won południowokoreański": "KRW", "won południowokoreański": "KRW",
"won północnokoreański": "KPW", "won północnokoreański": "KPW",
"won sud corean": "KRW", "won sud corean": "KRW",
@@ -12745,10 +12789,14 @@
"yhdistyneiden arabiemiraattien dirhami": "AED", "yhdistyneiden arabiemiraattien dirhami": "AED",
"yhdysvaltain dollari": "USD", "yhdysvaltain dollari": "USD",
"ytl": "TRY", "ytl": "TRY",
"yuan": "CNY", "yuan": [
"CNH",
"CNY"
],
"yuan chinezesc": "CNY", "yuan chinezesc": "CNY",
"yuan chino": "CNY", "yuan chino": "CNY",
"yuan cinese": "CNY", "yuan cinese": "CNY",
"yuan offshore": "CNH",
"yuan renmimbi": "CNY", "yuan renmimbi": "CNY",
"yuan renminbi": "CNY", "yuan renminbi": "CNY",
"yuan rmb": "CNY", "yuan rmb": "CNY",
@@ -12949,7 +12997,8 @@
"£s": "SYP", "£s": "SYP",
"¥": [ "¥": [
"JPY", "JPY",
"CNY" "CNY",
"CNH"
], ],
"đài tệ": "TWD", "đài tệ": "TWD",
"đại hàn dân quốc weon": "KRW", "đại hàn dân quốc weon": "KRW",
@@ -15043,6 +15092,7 @@
"ޕާކިސްތާނީ ރުޕީ": "PKR", "ޕާކިސްތާނީ ރުޕީ": "PKR",
"रू": "NPR", "रू": "NPR",
"रू.": "INR", "रू.": "INR",
"অস্ট্রেলীয় ডলার": "AUD",
"অ্যাঙ্গোলীয় কুয়াঞ্জা": "AOA", "অ্যাঙ্গোলীয় কুয়াঞ্জা": "AOA",
"আইসল্যান্ডীয় ক্রোনা": "ISK", "আইসল্যান্ডীয় ক্রোনা": "ISK",
"আজারবাইজানি মানাত": "AZN", "আজারবাইজানি মানাত": "AZN",
@@ -15372,7 +15422,6 @@
"యునైటెడ్ స్టేట్స్ డాలర్": "USD", "యునైటెడ్ స్టేట్స్ డాలర్": "USD",
"యూరో": "EUR", "యూరో": "EUR",
"రూపాయి": "INR", "రూపాయి": "INR",
"సంయుక్త రాష్ట్రాల డాలర్": "USD",
"స్విస్ ఫ్రాంక్": "CHF", "స్విస్ ఫ్రాంక్": "CHF",
"അൾജീരിയൻ ദിനാർ": "DZD", "അൾജീരിയൻ ദിനാർ": "DZD",
"ഇന്തോനേഷ്യൻ റുപിയ": "IDR", "ഇന്തോനേഷ്യൻ റുപിയ": "IDR",

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large


@@ -5,7 +5,7 @@
], ],
"ua": "Mozilla/5.0 ({os}; rv:{version}) Gecko/20100101 Firefox/{version}", "ua": "Mozilla/5.0 ({os}; rv:{version}) Gecko/20100101 Firefox/{version}",
"versions": [ "versions": [
"144.0", "146.0",
"143.0" "145.0"
] ]
} }


@@ -2319,11 +2319,6 @@
"symbol": "kJ/(kg K)", "symbol": "kJ/(kg K)",
"to_si_factor": 1000.0 "to_si_factor": 1000.0
}, },
"Q108888186": {
"si_name": "Q11570",
"symbol": "eV/c²",
"to_si_factor": 1.782661921627898e-36
},
"Q108888198": { "Q108888198": {
"si_name": "Q11570", "si_name": "Q11570",
"symbol": "keV/c²", "symbol": "keV/c²",
@@ -4394,6 +4389,11 @@
"symbol": "m²", "symbol": "m²",
"to_si_factor": 1.0 "to_si_factor": 1.0
}, },
"Q25376902": {
"si_name": null,
"symbol": "Mbp",
"to_si_factor": null
},
"Q25377184": { "Q25377184": {
"si_name": "Q25377184", "si_name": "Q25377184",
"symbol": "kg/m²", "symbol": "kg/m²",
@@ -5344,11 +5344,6 @@
"symbol": "bhp EDR", "symbol": "bhp EDR",
"to_si_factor": 12.958174 "to_si_factor": 12.958174
}, },
"Q3984193": {
"si_name": "Q25269",
"symbol": "TeV",
"to_si_factor": 1.602176634e-07
},
"Q39978339": { "Q39978339": {
"si_name": "Q25377184", "si_name": "Q25377184",
"symbol": "kg/cm²", "symbol": "kg/cm²",
@@ -5459,6 +5454,11 @@
"symbol": "T", "symbol": "T",
"to_si_factor": 907.18474 "to_si_factor": 907.18474
}, },
"Q4741": {
"si_name": null,
"symbol": "RF",
"to_si_factor": null
},
"Q474533": { "Q474533": {
"si_name": null, "si_name": null,
"symbol": "At", "symbol": "At",


@@ -2,10 +2,18 @@
# pylint: disable=invalid-name # pylint: disable=invalid-name
"""360Search search engine for searxng""" """360Search search engine for searxng"""
import typing as t
from urllib.parse import urlencode from urllib.parse import urlencode
from lxml import html from lxml import html
from searx import logger
from searx.enginelib import EngineCache
from searx.utils import extract_text from searx.utils import extract_text
from searx.network import get as http_get
if t.TYPE_CHECKING:
from searx.extended_types import SXNG_Response
# Metadata # Metadata
about = { about = {
@@ -26,6 +34,35 @@ time_range_dict = {'day': 'd', 'week': 'w', 'month': 'm', 'year': 'y'}
# Base URL # Base URL
base_url = "https://www.so.com" base_url = "https://www.so.com"
COOKIE_CACHE_KEY = "cookie"
COOKIE_CACHE_EXPIRATION_SECONDS = 3600
CACHE: EngineCache
"""Stores cookies from 360search to avoid re-fetching them on every request."""
def setup(engine_settings: dict[str, t.Any]) -> bool:
"""Initialization of the engine.
- Instantiate a cache for this engine (:py:obj:`CACHE`).
"""
global CACHE # pylint: disable=global-statement
# a table name starting with a digit would have to be quoted in SQLite, so the "cache" prefix is added to avoid that
CACHE = EngineCache("cache" + engine_settings["name"])
return True
def get_cookie(url: str) -> str:
cookie: str | None = CACHE.get(COOKIE_CACHE_KEY)
if cookie:
return cookie
resp: SXNG_Response = http_get(url, timeout=10, allow_redirects=False)
headers = resp.headers
cookie = headers['set-cookie'].split(";")[0]
CACHE.set(key=COOKIE_CACHE_KEY, value=cookie, expire=COOKIE_CACHE_EXPIRATION_SECONDS)
return cookie
def request(query, params): def request(query, params):
@@ -36,8 +73,13 @@ def request(query, params):
if time_range_dict.get(params['time_range']): if time_range_dict.get(params['time_range']):
query_params["adv_t"] = time_range_dict.get(params['time_range']) query_params["adv_t"] = time_range_dict.get(params['time_range'])
params["url"] = f"{base_url}/s?{urlencode(query_params)}" params["url"] = f"{base_url}/s?{urlencode(query_params)}"
# get the cookie by requesting the query page
logger.debug("querying url: %s", params["url"])
cookie = get_cookie(params["url"])
logger.debug("obtained cookie: %s", cookie)
params['headers'] = {'Cookie': cookie}
return params return params
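A minimal sketch, not part of the commit, of why setup() above prefixes the cache name: assuming the EngineCache is SQLite-backed (as the comment suggests), an unquoted identifier that starts with a digit, like the engine name 360search, is rejected, while the prefixed name needs no quoting.

import sqlite3

con = sqlite3.connect(":memory:")
try:
    con.execute("CREATE TABLE 360search (k TEXT, v TEXT)")   # digit-leading name, unquoted
except sqlite3.OperationalError as exc:
    print("unquoted digit-leading name fails:", exc)
con.execute("CREATE TABLE cache360search (k TEXT, v TEXT)")  # prefixed name works without quotes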


@@ -270,7 +270,14 @@ def load_engines(engine_list: list[dict[str, t.Any]]):
categories.clear() categories.clear()
categories['general'] = [] categories['general'] = []
for engine_data in engine_list: for engine_data in engine_list:
if engine_data.get("inactive") is True:
continue
engine = load_engine(engine_data) engine = load_engine(engine_data)
if engine: if engine:
register_engine(engine) register_engine(engine)
else:
# if an engine can't be loaded (for example because it is missing tor
# or some other requirement), it's set to inactive!
logger.error("loading engine %s failed: set engine to inactive!", engine_data.get("name", "???"))
engine_data["inactive"] = True
return engines return engines
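A minimal illustration (engine names made up) of the skip logic added above: an entry flagged inactive, e.g. in settings.yml, is never handed to load_engine().

engine_list = [
    {"name": "needs api key", "engine": "some_engine", "inactive": True},
    {"name": "regular engine", "engine": "other_engine"},
]
# mirrors the new check: inactive entries are skipped before loading
to_load = [e for e in engine_list if e.get("inactive") is not True]
print([e["name"] for e in to_load])   # ['regular engine']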


@@ -3,9 +3,14 @@
Ahmia (Onions) Ahmia (Onions)
""" """
import typing as t
from urllib.parse import urlencode, urlparse, parse_qs from urllib.parse import urlencode, urlparse, parse_qs
from lxml.html import fromstring from lxml.html import fromstring
from searx.utils import gen_useragent, ElementType
from searx.engines.xpath import extract_url, extract_text, eval_xpath_list, eval_xpath from searx.engines.xpath import extract_url, extract_text, eval_xpath_list, eval_xpath
from searx.network import get
from searx.enginelib import EngineCache
# about # about
about = { about = {
@@ -23,6 +28,7 @@ paging = True
page_size = 10 page_size = 10
# search url # search url
base_url = 'http://juhanurmihxlp77nkq76byazcldy2hlmovfu2epvl5ankdibsot4csyd.onion'
search_url = 'http://juhanurmihxlp77nkq76byazcldy2hlmovfu2epvl5ankdibsot4csyd.onion/search/?{query}' search_url = 'http://juhanurmihxlp77nkq76byazcldy2hlmovfu2epvl5ankdibsot4csyd.onion/search/?{query}'
time_range_support = True time_range_support = True
time_range_dict = {'day': 1, 'week': 7, 'month': 30} time_range_dict = {'day': 1, 'week': 7, 'month': 30}
@@ -34,10 +40,42 @@ title_xpath = './h4/a[1]'
content_xpath = './/p[1]' content_xpath = './/p[1]'
correction_xpath = '//*[@id="didYouMean"]//a' correction_xpath = '//*[@id="didYouMean"]//a'
number_of_results_xpath = '//*[@id="totalResults"]' number_of_results_xpath = '//*[@id="totalResults"]'
name_token_xpath = '//form[@id="searchForm"]/input[@type="hidden"]/@name'
value_token_xpath = '//form[@id="searchForm"]/input[@type="hidden"]/@value'
CACHE: EngineCache
def setup(engine_settings: dict[str, t.Any]) -> bool:
global CACHE # pylint: disable=global-statement
CACHE = EngineCache(engine_settings["name"])
return True
def _get_tokens(dom: ElementType | None = None) -> str:
"""
The tokens are carried in a hidden input field of the search form.
They rotate every minute, but tokens up to one hour old are still accepted.
To keep the number of requests low, the newest tokens are taken from every
response. In the worst case, when the cached token has expired, a total of
two requests is needed (over Tor this can be painfully slow).
"""
if dom is None:
resp = get(base_url, headers={'User-Agent': gen_useragent()})
dom = fromstring(resp.text)
name_token = extract_text(dom.xpath(name_token_xpath))
value_token = extract_text(dom.xpath(value_token_xpath))
return f"{name_token}:{value_token}"
def request(query, params): def request(query, params):
params['url'] = search_url.format(query=urlencode({'q': query})) token_str: str | None = CACHE.get('ahmia-tokens')
if not token_str:
token_str = _get_tokens()
CACHE.set('ahmia-tokens', token_str, expire=60 * 60)
name_token, value_token = token_str.split(":")
params['url'] = search_url.format(query=urlencode({'q': query, name_token: value_token}))
if params['time_range'] in time_range_dict: if params['time_range'] in time_range_dict:
params['url'] += '&' + urlencode({'d': time_range_dict[params['time_range']]}) params['url'] += '&' + urlencode({'d': time_range_dict[params['time_range']]})
@@ -77,4 +115,8 @@ def response(resp):
except: # pylint: disable=bare-except except: # pylint: disable=bare-except
pass pass
# Update the tokens to the newest ones
token_str = _get_tokens(dom)
CACHE.set('ahmia-tokens', token_str, expire=60 * 60)
return results return results
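A hedged sketch of how the cached token pair ends up in Ahmia's search URL; the field name "t" and value "3f9c0a" are invented here, the real pair is read from the hidden inputs of the search form.

from urllib.parse import urlencode

search_url = 'http://juhanurmihxlp77nkq76byazcldy2hlmovfu2epvl5ankdibsot4csyd.onion/search/?{query}'
token_str = "t:3f9c0a"                  # as stored in the engine cache ("name:value")
name_token, value_token = token_str.split(":")
url = search_url.format(query=urlencode({'q': 'privacy', name_token: value_token}))
print(url)  # .../search/?q=privacy&t=3f9c0a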


@@ -120,7 +120,7 @@ def fetch_traits(engine_traits: EngineTraits):
'zh': 'Special:搜索', 'zh': 'Special:搜索',
} }
resp = get('https://wiki.archlinux.org/') resp = get('https://wiki.archlinux.org/', timeout=3)
if not resp.ok: # type: ignore if not resp.ok: # type: ignore
print("ERROR: response from wiki.archlinux.org is not OK.") print("ERROR: response from wiki.archlinux.org is not OK.")


@@ -50,7 +50,7 @@ def response(resp):
pos = script.index(end_tag) + len(end_tag) - 1 pos = script.index(end_tag) + len(end_tag) - 1
script = script[:pos] script = script[:pos]
json_resp = utils.js_variable_to_python(script) json_resp = utils.js_obj_str_to_python(script)
results = [] results = []


@@ -95,7 +95,7 @@ def authenticate(t_id: str, c_id: str, c_secret: str) -> str:
"scope": "https://management.azure.com/.default", "scope": "https://management.azure.com/.default",
} }
resp: SXNG_Response = http_post(url, body) resp: SXNG_Response = http_post(url, body, timeout=5)
if resp.status_code != 200: if resp.status_code != 200:
raise RuntimeError(f"Azure authentication failed (status {resp.status_code}): {resp.text}") raise RuntimeError(f"Azure authentication failed (status {resp.status_code}): {resp.text}")
return resp.json()["access_token"] return resp.json()["access_token"]


@@ -51,6 +51,7 @@ def request(query, params):
} }
params["url"] = f"{base_url}?{urlencode(query_params)}" params["url"] = f"{base_url}?{urlencode(query_params)}"
params["headers"]["Referer"] = "https://www.bilibili.com"
params["cookies"] = cookie params["cookies"] = cookie
return params return params


@@ -124,17 +124,17 @@ from urllib.parse import (
urlparse, urlparse,
) )
import json
from dateutil import parser from dateutil import parser
from lxml import html from lxml import html
from searx import locales from searx import locales
from searx.utils import ( from searx.utils import (
extr,
extract_text, extract_text,
eval_xpath,
eval_xpath_list, eval_xpath_list,
eval_xpath_getindex, eval_xpath_getindex,
js_variable_to_python, js_obj_str_to_python,
js_obj_str_to_json_str,
get_embeded_stream_url, get_embeded_stream_url,
) )
from searx.enginelib.traits import EngineTraits from searx.enginelib.traits import EngineTraits
@@ -142,17 +142,17 @@ from searx.result_types import EngineResults
from searx.extended_types import SXNG_Response from searx.extended_types import SXNG_Response
about = { about = {
"website": 'https://search.brave.com/', "website": "https://search.brave.com/",
"wikidata_id": 'Q22906900', "wikidata_id": "Q22906900",
"official_api_documentation": None, "official_api_documentation": None,
"use_official_api": False, "use_official_api": False,
"require_api_key": False, "require_api_key": False,
"results": 'HTML', "results": "HTML",
} }
base_url = "https://search.brave.com/" base_url = "https://search.brave.com/"
categories = [] categories = []
brave_category: t.Literal["search", "videos", "images", "news", "goggles"] = 'search' brave_category: t.Literal["search", "videos", "images", "news", "goggles"] = "search"
"""Brave supports common web-search, videos, images, news, and goggles search. """Brave supports common web-search, videos, images, news, and goggles search.
- ``search``: Common WEB search - ``search``: Common WEB search
@@ -182,71 +182,87 @@ to do more won't return any result and you will most likely be flagged as a bot.
""" """
safesearch = True safesearch = True
safesearch_map = {2: 'strict', 1: 'moderate', 0: 'off'} # cookie: safesearch=off safesearch_map = {2: "strict", 1: "moderate", 0: "off"} # cookie: safesearch=off
time_range_support = False time_range_support = False
"""Brave only supports time-range in :py:obj:`brave_category` ``search`` (UI """Brave only supports time-range in :py:obj:`brave_category` ``search`` (UI
category All) and in the goggles category.""" category All) and in the goggles category."""
time_range_map: dict[str, str] = { time_range_map: dict[str, str] = {
'day': 'pd', "day": "pd",
'week': 'pw', "week": "pw",
'month': 'pm', "month": "pm",
'year': 'py', "year": "py",
} }
def request(query: str, params: dict[str, t.Any]) -> None: def request(query: str, params: dict[str, t.Any]) -> None:
args: dict[str, t.Any] = { args: dict[str, t.Any] = {
'q': query, "q": query,
'source': 'web', "source": "web",
} }
if brave_spellcheck: if brave_spellcheck:
args['spellcheck'] = '1' args["spellcheck"] = "1"
if brave_category in ('search', 'goggles'): if brave_category in ("search", "goggles"):
if params.get('pageno', 1) - 1: if params.get("pageno", 1) - 1:
args['offset'] = params.get('pageno', 1) - 1 args["offset"] = params.get("pageno", 1) - 1
if time_range_map.get(params['time_range']): if time_range_map.get(params["time_range"]):
args['tf'] = time_range_map.get(params['time_range']) args["tf"] = time_range_map.get(params["time_range"])
if brave_category == 'goggles': if brave_category == "goggles":
args['goggles_id'] = Goggles args["goggles_id"] = Goggles
params["headers"]["Accept-Encoding"] = "gzip, deflate"
params["url"] = f"{base_url}{brave_category}?{urlencode(args)}" params["url"] = f"{base_url}{brave_category}?{urlencode(args)}"
logger.debug("url %s", params["url"])
# set properties in the cookies # set properties in the cookies
params['cookies']['safesearch'] = safesearch_map.get(params['safesearch'], 'off') params["cookies"]["safesearch"] = safesearch_map.get(params["safesearch"], "off")
# the useLocation is IP based, we use cookie 'country' for the region # the useLocation is IP based, we use cookie "country" for the region
params['cookies']['useLocation'] = '0' params["cookies"]["useLocation"] = "0"
params['cookies']['summarizer'] = '0' params["cookies"]["summarizer"] = "0"
engine_region = traits.get_region(params['searxng_locale'], 'all') engine_region = traits.get_region(params["searxng_locale"], "all")
params['cookies']['country'] = engine_region.split('-')[-1].lower() # type: ignore params["cookies"]["country"] = engine_region.split("-")[-1].lower() # type: ignore
ui_lang = locales.get_engine_locale(params['searxng_locale'], traits.custom["ui_lang"], 'en-us') ui_lang = locales.get_engine_locale(params["searxng_locale"], traits.custom["ui_lang"], "en-us")
params['cookies']['ui_lang'] = ui_lang params["cookies"]["ui_lang"] = ui_lang
logger.debug("cookies %s", params["cookies"])
logger.debug("cookies %s", params['cookies'])
params['headers']['Sec-Fetch-Dest'] = "document"
params['headers']['Sec-Fetch-Mode'] = "navigate"
params['headers']['Sec-Fetch-Site'] = "same-origin"
params['headers']['Sec-Fetch-User'] = "?1"
def _extract_published_date(published_date_raw): def _extract_published_date(published_date_raw: str | None):
if published_date_raw is None: if published_date_raw is None:
return None return None
try: try:
return parser.parse(published_date_raw) return parser.parse(published_date_raw)
except parser.ParserError: except parser.ParserError:
return None return None
def extract_json_data(text: str) -> dict[str, t.Any]:
# Example script source containing the data:
#
# kit.start(app, element, {
# node_ids: [0, 19],
# data: [{type:"data",data: .... ["q","goggles_id"],route:1,url:1}}]
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
text = text[text.index("<script") : text.index("</script")]
if not text:
raise ValueError("can't find JS/JSON data in the given text")
start = text.index("data: [{")
end = text.rindex("}}]")
js_obj_str = text[start:end]
js_obj_str = "{" + js_obj_str + "}}]}"
# js_obj_str = js_obj_str.replace("\xa0", "") # remove ASCII for &nbsp;
# js_obj_str = js_obj_str.replace(r"\u003C", "<").replace(r"\u003c", "<") # fix broken HTML tags in strings
json_str = js_obj_str_to_json_str(js_obj_str)
data: dict[str, t.Any] = json.loads(json_str)
return data
def response(resp: SXNG_Response) -> EngineResults: def response(resp: SXNG_Response) -> EngineResults:
if brave_category in ('search', 'goggles'): if brave_category in ('search', 'goggles'):
@@ -261,11 +277,8 @@ def response(resp: SXNG_Response) -> EngineResults:
# node_ids: [0, 19], # node_ids: [0, 19],
# data: [{type:"data",data: .... ["q","goggles_id"],route:1,url:1}}] # data: [{type:"data",data: .... ["q","goggles_id"],route:1,url:1}}]
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ # ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
js_object = "[{" + extr(resp.text, "data: [{", "}}],") + "}}]" json_data: dict[str, t.Any] = extract_json_data(resp.text)
json_data = js_variable_to_python(js_object) json_resp: dict[str, t.Any] = json_data['data'][1]["data"]['body']['response']
# json_data is a list and at the second position (0,1) in this list we find the "response" data we need ..
json_resp = json_data[1]['data']['body']['response']
if brave_category == 'images': if brave_category == 'images':
return _parse_images(json_resp) return _parse_images(json_resp)
@@ -275,150 +288,124 @@ def response(resp: SXNG_Response) -> EngineResults:
raise ValueError(f"Unsupported brave category: {brave_category}") raise ValueError(f"Unsupported brave category: {brave_category}")
def _parse_search(resp) -> EngineResults: def _parse_search(resp: SXNG_Response) -> EngineResults:
result_list = EngineResults() res = EngineResults()
dom = html.fromstring(resp.text) dom = html.fromstring(resp.text)
# I doubt that Brave is still providing the "answer" class / I haven't seen for result in eval_xpath_list(dom, "//div[contains(@class, 'snippet ')]"):
# answers in brave for a long time.
answer_tag = eval_xpath_getindex(dom, '//div[@class="answer"]', 0, default=None)
if answer_tag:
url = eval_xpath_getindex(dom, '//div[@id="featured_snippet"]/a[@class="result-header"]/@href', 0, default=None)
answer = extract_text(answer_tag)
if answer is not None:
result_list.add(result_list.types.Answer(answer=answer, url=url))
# xpath_results = '//div[contains(@class, "snippet fdb") and @data-type="web"]' url: str | None = eval_xpath_getindex(result, ".//a/@href", 0, default=None)
xpath_results = '//div[contains(@class, "snippet ")]' title_tag = eval_xpath_getindex(result, ".//div[contains(@class, 'title')]", 0, default=None)
for result in eval_xpath_list(dom, xpath_results):
url = eval_xpath_getindex(result, './/a[contains(@class, "h")]/@href', 0, default=None)
title_tag = eval_xpath_getindex(
result, './/a[contains(@class, "h")]//div[contains(@class, "title")]', 0, default=None
)
if url is None or title_tag is None or not urlparse(url).netloc: # partial url likely means it's an ad if url is None or title_tag is None or not urlparse(url).netloc: # partial url likely means it's an ad
continue continue
content: str = extract_text( content: str = ""
eval_xpath_getindex(result, './/div[contains(@class, "snippet-description")]', 0, default='') pub_date = None
) # type: ignore
pub_date_raw = eval_xpath(result, 'substring-before(.//div[contains(@class, "snippet-description")], "-")')
pub_date = _extract_published_date(pub_date_raw)
if pub_date and content.startswith(pub_date_raw):
content = content.lstrip(pub_date_raw).strip("- \n\t")
thumbnail = eval_xpath_getindex(result, './/img[contains(@class, "thumb")]/@src', 0, default='') # there are other classes like 'site-name-content' we don't want to match,
# however only using contains(@class, 'content') would e.g. also match `site-name-content`
# thus, we explicitly also require the spaces as class separator
_content = eval_xpath_getindex(result, ".//div[contains(concat(' ', @class, ' '), ' content ')]", 0, default="")
if len(_content):
content = extract_text(_content) # type: ignore
_pub_date = extract_text(
eval_xpath_getindex(_content, ".//span[contains(@class, 't-secondary')]", 0, default="")
)
if _pub_date:
pub_date = _extract_published_date(_pub_date)
content = content.lstrip(_pub_date).strip("- \n\t")
item = { thumbnail: str = eval_xpath_getindex(result, ".//a[contains(@class, 'thumbnail')]//img/@src", 0, default="")
'url': url,
'title': extract_text(title_tag), item = res.types.LegacyResult(
'content': content, template="default.html",
'publishedDate': pub_date, url=url,
'thumbnail': thumbnail, title=extract_text(title_tag),
} content=content,
publishedDate=pub_date,
thumbnail=thumbnail,
)
res.add(item)
video_tag = eval_xpath_getindex( video_tag = eval_xpath_getindex(
result, './/div[contains(@class, "video-snippet") and @data-macro="video"]', 0, default=None result, ".//div[contains(@class, 'video-snippet') and @data-macro='video']", 0, default=[]
) )
if video_tag is not None: if len(video_tag):
# In my tests a video tag in the WEB search was most often not a # In my tests a video tag in the WEB search was most often not a
# video, except the ones from youtube .. # video, except the ones from youtube ..
iframe_src = get_embeded_stream_url(url) iframe_src = get_embeded_stream_url(url)
if iframe_src: if iframe_src:
item['iframe_src'] = iframe_src item["iframe_src"] = iframe_src
item['template'] = 'videos.html' item["template"] = "videos.html"
item['thumbnail'] = eval_xpath_getindex(video_tag, './/img/@src', 0, default='')
pub_date_raw = extract_text(
eval_xpath(video_tag, './/div[contains(@class, "snippet-attributes")]/div/text()')
)
item['publishedDate'] = _extract_published_date(pub_date_raw)
else:
item['thumbnail'] = eval_xpath_getindex(video_tag, './/img/@src', 0, default='')
result_list.append(item) return res
return result_list
def _parse_news(resp) -> EngineResults: def _parse_news(resp: SXNG_Response) -> EngineResults:
res = EngineResults()
result_list = EngineResults()
dom = html.fromstring(resp.text) dom = html.fromstring(resp.text)
for result in eval_xpath_list(dom, '//div[contains(@class, "results")]//div[@data-type="news"]'): for result in eval_xpath_list(dom, "//div[contains(@class, 'results')]//div[@data-type='news']"):
# import pdb url = eval_xpath_getindex(result, ".//a[contains(@class, 'result-header')]/@href", 0, default=None)
# pdb.set_trace()
url = eval_xpath_getindex(result, './/a[contains(@class, "result-header")]/@href', 0, default=None)
if url is None: if url is None:
continue continue
title = extract_text(eval_xpath_list(result, './/span[contains(@class, "snippet-title")]')) title = eval_xpath_list(result, ".//span[contains(@class, 'snippet-title')]")
content = extract_text(eval_xpath_list(result, './/p[contains(@class, "desc")]')) content = eval_xpath_list(result, ".//p[contains(@class, 'desc')]")
thumbnail = eval_xpath_getindex(result, './/div[contains(@class, "image-wrapper")]//img/@src', 0, default='') thumbnail = eval_xpath_getindex(result, ".//div[contains(@class, 'image-wrapper')]//img/@src", 0, default="")
item = { item = res.types.LegacyResult(
"url": url, template="default.html",
"title": title, url=url,
"content": content, title=extract_text(title),
"thumbnail": thumbnail, thumbnail=thumbnail,
} content=extract_text(content),
)
res.add(item)
result_list.append(item) return res
return result_list
def _parse_images(json_resp) -> EngineResults: def _parse_images(json_resp: dict[str, t.Any]) -> EngineResults:
result_list = EngineResults() res = EngineResults()
for result in json_resp["results"]: for result in json_resp["results"]:
item = { item = res.types.LegacyResult(
'url': result['url'], template="images.html",
'title': result['title'], url=result["url"],
'content': result['description'], title=result["title"],
'template': 'images.html', source=result["source"],
'resolution': result['properties']['format'], img_src=result["properties"]["url"],
'source': result['source'], thumbnail_src=result["thumbnail"]["src"],
'img_src': result['properties']['url'], )
'thumbnail_src': result['thumbnail']['src'], res.add(item)
}
result_list.append(item)
return result_list return res
def _parse_videos(json_resp) -> EngineResults: def _parse_videos(json_resp: dict[str, t.Any]) -> EngineResults:
result_list = EngineResults() res = EngineResults()
for result in json_resp["results"]: for result in json_resp["results"]:
item = res.types.LegacyResult(
url = result['url'] template="videos.html",
item = { url=result["url"],
'url': url, title=result["title"],
'title': result['title'], content=result["description"],
'content': result['description'], length=result["video"]["duration"],
'template': 'videos.html', duration=result["video"]["duration"],
'length': result['video']['duration'], publishedDate=_extract_published_date(result["age"]),
'duration': result['video']['duration'], )
'publishedDate': _extract_published_date(result['age']), if result["thumbnail"] is not None:
} item["thumbnail"] = result["thumbnail"]["src"]
iframe_src = get_embeded_stream_url(result["url"])
if result['thumbnail'] is not None:
item['thumbnail'] = result['thumbnail']['src']
iframe_src = get_embeded_stream_url(url)
if iframe_src: if iframe_src:
item['iframe_src'] = iframe_src item["iframe_src"] = iframe_src
result_list.append(item) res.add(item)
return result_list return res
def fetch_traits(engine_traits: EngineTraits): def fetch_traits(engine_traits: EngineTraits):
@@ -439,25 +426,25 @@ def fetch_traits(engine_traits: EngineTraits):
resp = get('https://search.brave.com/settings') resp = get('https://search.brave.com/settings')
if not resp.ok: # type: ignore if not resp.ok:
print("ERROR: response from Brave is not OK.") print("ERROR: response from Brave is not OK.")
dom = html.fromstring(resp.text) # type: ignore dom = html.fromstring(resp.text)
for option in dom.xpath('//section//option[@value="en-us"]/../option'): for option in dom.xpath("//section//option[@value='en-us']/../option"):
ui_lang = option.get('value') ui_lang = option.get("value")
try: try:
l = babel.Locale.parse(ui_lang, sep='-') l = babel.Locale.parse(ui_lang, sep="-")
if l.territory: if l.territory:
sxng_tag = region_tag(babel.Locale.parse(ui_lang, sep='-')) sxng_tag = region_tag(babel.Locale.parse(ui_lang, sep="-"))
else: else:
sxng_tag = language_tag(babel.Locale.parse(ui_lang, sep='-')) sxng_tag = language_tag(babel.Locale.parse(ui_lang, sep="-"))
except babel.UnknownLocaleError: except babel.UnknownLocaleError:
print("ERROR: can't determine babel locale of Brave's (UI) language %s" % ui_lang) print("ERROR: can't determine babel locale of Brave's (UI) language %s" % ui_lang)
continue continue
conflict = engine_traits.custom["ui_lang"].get(sxng_tag) conflict = engine_traits.custom["ui_lang"].get(sxng_tag) # type: ignore
if conflict: if conflict:
if conflict != ui_lang: if conflict != ui_lang:
print("CONFLICT: babel %s --> %s, %s" % (sxng_tag, conflict, ui_lang)) print("CONFLICT: babel %s --> %s, %s" % (sxng_tag, conflict, ui_lang))
@@ -466,26 +453,26 @@ def fetch_traits(engine_traits: EngineTraits):
# search regions of brave # search regions of brave
resp = get('https://cdn.search.brave.com/serp/v2/_app/immutable/chunks/parameters.734c106a.js') resp = get("https://cdn.search.brave.com/serp/v2/_app/immutable/chunks/parameters.734c106a.js")
if not resp.ok: # type: ignore if not resp.ok:
print("ERROR: response from Brave is not OK.") print("ERROR: response from Brave is not OK.")
country_js = resp.text[resp.text.index("options:{all") + len('options:') :] # type: ignore country_js = resp.text[resp.text.index("options:{all") + len("options:") :]
country_js = country_js[: country_js.index("},k={default")] country_js = country_js[: country_js.index("},k={default")]
country_tags = js_variable_to_python(country_js) country_tags = js_obj_str_to_python(country_js)
for k, v in country_tags.items(): for k, v in country_tags.items():
if k == 'all': if k == "all":
engine_traits.all_locale = 'all' engine_traits.all_locale = "all"
continue continue
country_tag = v['value'] country_tag = v["value"]
# add official languages of the country .. # add official languages of the country ..
for lang_tag in babel.languages.get_official_languages(country_tag, de_facto=True): for lang_tag in babel.languages.get_official_languages(country_tag, de_facto=True):
lang_tag = lang_map.get(lang_tag, lang_tag) lang_tag = lang_map.get(lang_tag, lang_tag)
sxng_tag = region_tag(babel.Locale.parse('%s_%s' % (lang_tag, country_tag.upper()))) sxng_tag = region_tag(babel.Locale.parse("%s_%s" % (lang_tag, country_tag.upper())))
# print("%-20s: %s <-- %s" % (v['label'], country_tag, sxng_tag)) # print("%-20s: %s <-- %s" % (v["label"], country_tag, sxng_tag))
conflict = engine_traits.regions.get(sxng_tag) conflict = engine_traits.regions.get(sxng_tag)
if conflict: if conflict:


@@ -407,7 +407,7 @@ def fetch_traits(engine_traits: EngineTraits):
""" """
# pylint: disable=too-many-branches, too-many-statements, disable=import-outside-toplevel # pylint: disable=too-many-branches, too-many-statements, disable=import-outside-toplevel
from searx.utils import js_variable_to_python from searx.utils import js_obj_str_to_python
# fetch regions # fetch regions
@@ -455,7 +455,7 @@ def fetch_traits(engine_traits: EngineTraits):
js_code = extr(resp.text, 'languages:', ',regions') # type: ignore js_code = extr(resp.text, 'languages:', ',regions') # type: ignore
languages = js_variable_to_python(js_code) languages: dict[str, str] = js_obj_str_to_python(js_code)
for eng_lang, name in languages.items(): for eng_lang, name in languages.items():
if eng_lang == 'wt_WT': if eng_lang == 'wt_WT':


@@ -42,8 +42,8 @@ def response(resp):
results.append( results.append(
{ {
'url': item['source_page_url'], 'url': item.get('source_page_url'),
'title': item['source_site'], 'title': item.get('source_site'),
'img_src': img if item['type'] == 'IMAGE' else thumb, 'img_src': img if item['type'] == 'IMAGE' else thumb,
'filesize': humanize_bytes(item['meme_file_size']), 'filesize': humanize_bytes(item['meme_file_size']),
'publishedDate': formatted_date, 'publishedDate': formatted_date,


@@ -0,0 +1,52 @@
# SPDX-License-Identifier: AGPL-3.0-or-later
"""Grokipedia (general)"""
from urllib.parse import urlencode
from searx.utils import html_to_text
from searx.result_types import EngineResults
about = {
"website": 'https://grokipedia.com',
"wikidata_id": "Q136410803",
"official_api_documentation": None,
"use_official_api": False,
"require_api_key": False,
"results": "JSON",
}
base_url = "https://grokipedia.com/api/full-text-search"
categories = ['general']
paging = True
results_per_page = 10
def request(query, params):
start_index = (params["pageno"] - 1) * results_per_page
query_params = {
"query": query,
"limit": results_per_page,
"offset": start_index,
}
params["url"] = f"{base_url}?{urlencode(query_params)}"
return params
def response(resp) -> EngineResults:
results = EngineResults()
search_res = resp.json()
for item in search_res["results"]:
results.add(
results.types.MainResult(
url='https://grokipedia.com/page/' + item["slug"],
title=item["title"],
content=html_to_text(item["snippet"]),
)
)
return results
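A small sketch (query chosen arbitrarily) of the offset-based paging used by the new Grokipedia engine above; page 2 translates to offset=10.

from urllib.parse import urlencode

base_url = "https://grokipedia.com/api/full-text-search"
results_per_page = 10
pageno = 2
query_params = {"query": "open source", "limit": results_per_page, "offset": (pageno - 1) * results_per_page}
print(f"{base_url}?{urlencode(query_params)}")
# https://grokipedia.com/api/full-text-search?query=open+source&limit=10&offset=10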


@@ -31,7 +31,7 @@ paging = True
time_range_support = True time_range_support = True
# base_url can be overwritten by a list of URLs in the settings.yml # base_url can be overwritten by a list of URLs in the settings.yml
base_url: list | str = [] base_url: list[str] | str = []
def init(_): def init(_):

searx/engines/lucide.py (new file, 69 lines)

@@ -0,0 +1,69 @@
# SPDX-License-Identifier: AGPL-3.0-or-later
"""Browse one of the largest collections of copyleft icons
that can be used in your own projects (e.g. apps, websites).
.. _Website: https://lucide.dev
"""
import typing as t
from searx.result_types import EngineResults
if t.TYPE_CHECKING:
from searx.extended_types import SXNG_Response
from searx.search.processors.online import OnlineParams
about = {
"website": "https://lucide.dev/",
"wikidata_id": None,
"official_api_documentation": None,
"use_official_api": True,
"results": "JSON",
}
cdn_base_url = "https://cdn.jsdelivr.net/npm/lucide-static"
categories = ["images", "icons"]
def request(query: str, params: "OnlineParams"):
params["url"] = f"{cdn_base_url}/tags.json"
params['query'] = query
return params
def response(resp: "SXNG_Response") -> EngineResults:
res = EngineResults()
query_parts = resp.search_params["query"].lower().split(" ")
def is_result_match(result: tuple[str, list[str]]) -> bool:
icon_name, tags = result
for part in query_parts:
if part in icon_name:
return True
for tag in tags:
if part in tag:
return True
return False
filtered_results = filter(is_result_match, resp.json().items())
for icon_name, tags in filtered_results:
img_src = f"{cdn_base_url}/icons/{icon_name}.svg"
res.add(
res.types.LegacyResult(
{
"template": "images.html",
"url": img_src,
"title": icon_name,
"content": ", ".join(tags),
"img_src": img_src,
"img_format": "SVG",
}
)
)
return res
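A toy run (icon data made up) of the matching helper above: the query parts are OR-ed, so any part hitting either the icon name or one of its tags is enough.

query_parts = "arrow photo".lower().split(" ")

def is_result_match(result: tuple[str, list[str]]) -> bool:
    icon_name, tags = result
    for part in query_parts:
        if part in icon_name:
            return True
        for tag in tags:
            if part in tag:
                return True
    return False

icons = {"arrow-left": ["back", "previous"], "camera": ["photo", "image"]}
print([name for name, tags in icons.items() if is_result_match((name, tags))])
# ['arrow-left', 'camera']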


@@ -28,7 +28,7 @@ Implementations
""" """
import typing as t import typing as t
from urllib.parse import urlencode, quote_plus from urllib.parse import urlencode
from searx.utils import searxng_useragent from searx.utils import searxng_useragent
from searx.result_types import EngineResults from searx.result_types import EngineResults
from searx.extended_types import SXNG_Response from searx.extended_types import SXNG_Response
@@ -42,7 +42,7 @@ about = {
"results": "JSON", "results": "JSON",
} }
base_url = "https://api.marginalia.nu" base_url = "https://api2.marginalia-search.com"
safesearch = True safesearch = True
categories = ["general"] categories = ["general"]
paging = False paging = False
@@ -85,13 +85,11 @@ class ApiSearchResults(t.TypedDict):
def request(query: str, params: dict[str, t.Any]): def request(query: str, params: dict[str, t.Any]):
query_params = { query_params = {"count": results_per_page, "nsfw": min(params["safesearch"], 1), "query": query}
"count": results_per_page,
"nsfw": min(params["safesearch"], 1),
}
params["url"] = f"{base_url}/{api_key}/search/{quote_plus(query)}?{urlencode(query_params)}" params["url"] = f"{base_url}/search?{urlencode(query_params)}"
params["headers"]["User-Agent"] = searxng_useragent() params["headers"]["User-Agent"] = searxng_useragent()
params["headers"]["API-Key"] = api_key
def response(resp: SXNG_Response): def response(resp: SXNG_Response):
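A hedged sketch of the request the hunk above now builds against the new Marginalia endpoint; the API key and query are placeholders, and the key travels in a header instead of the URL path.

from urllib.parse import urlencode

base_url = "https://api2.marginalia-search.com"
results_per_page = 10
safesearch = 2                                   # SearXNG safesearch level
query_params = {"count": results_per_page, "nsfw": min(safesearch, 1), "query": "rust async"}
url = f"{base_url}/search?{urlencode(query_params)}"
headers = {"API-Key": "my-api-key"}              # placeholder key
print(url)  # https://api2.marginalia-search.com/search?count=10&nsfw=1&query=rust+async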


@@ -65,7 +65,8 @@ def request(query, params):
if search_type: if search_type:
args['fmt'] = search_type args['fmt'] = search_type
if search_type == '': # setting the page number on the first page (i.e. s=0) triggers a rate-limit
if search_type == '' and params['pageno'] > 1:
args['s'] = 10 * (params['pageno'] - 1) args['s'] = 10 * (params['pageno'] - 1)
if params['time_range'] and search_type != 'images': if params['time_range'] and search_type != 'images':


@@ -1,276 +0,0 @@
# SPDX-License-Identifier: AGPL-3.0-or-later
"""Mullvad Leta is a search engine proxy. Currently Leta only offers text
search results not image, news or any other types of search result. Leta acts
as a proxy to Google and Brave search results. You can select which backend
search engine you wish to use, see (:py:obj:`leta_engine`).
.. hint::
Leta caches each search for up to 30 days. For example, if you use search
terms like ``news``, contrary to your intention you'll get very old results!
Configuration
=============
The engine has the following additional settings:
- :py:obj:`leta_engine` (:py:obj:`LetaEnginesType`)
You can configure one Leta engine for Google and one for Brave:
.. code:: yaml
- name: mullvadleta
engine: mullvad_leta
leta_engine: google
shortcut: ml
- name: mullvadleta brave
engine: mullvad_leta
network: mullvadleta # use network from engine "mullvadleta" configured above
leta_engine: brave
shortcut: mlb
Implementations
===============
"""
import typing as t
from urllib.parse import urlencode
import babel
from httpx import Response
from lxml import html
from searx.enginelib.traits import EngineTraits
from searx.extended_types import SXNG_Response
from searx.locales import get_official_locales, language_tag, region_tag
from searx.utils import eval_xpath_list
from searx.result_types import EngineResults, MainResult
from searx.network import raise_for_httperror
search_url = "https://leta.mullvad.net"
# about
about = {
"website": search_url,
"wikidata_id": 'Q47008412', # the Mullvad id - not leta, but related
"official_api_documentation": 'https://leta.mullvad.net/faq',
"use_official_api": False,
"require_api_key": False,
"results": 'HTML',
}
# engine dependent config
categories = ["general", "web"]
paging = True
max_page = 10
time_range_support = True
time_range_dict = {
"day": "d",
"week": "w",
"month": "m",
"year": "y",
}
LetaEnginesType = t.Literal["google", "brave"]
"""Engine types supported by mullvadleta."""
leta_engine: LetaEnginesType = "google"
"""Select Leta's engine type from :py:obj:`LetaEnginesType`."""
def init(_):
l = t.get_args(LetaEnginesType)
if leta_engine not in l:
raise ValueError(f"leta_engine '{leta_engine}' is invalid, use one of {', '.join(l)}")
class DataNodeQueryMetaDataIndices(t.TypedDict):
"""Indices into query metadata."""
success: int
q: int # pylint: disable=invalid-name
country: int
language: int
lastUpdated: int
engine: int
items: int
infobox: int
news: int
timestamp: int
altered: int
page: int
next: int # if -1, there no more results are available
previous: int
class DataNodeResultIndices(t.TypedDict):
"""Indices into query resultsdata."""
link: int
snippet: int
title: int
favicon: int
def request(query: str, params: dict[str, t.Any]) -> None:
params["raise_for_httperror"] = False
params["method"] = "GET"
args = {
"q": query,
"engine": leta_engine,
"x-sveltekit-invalidated": "001", # hardcoded from all requests seen
}
country = traits.get_region(params.get("searxng_locale"), traits.all_locale) # type: ignore
if country:
args["country"] = country
language = traits.get_language(params.get("searxng_locale"), traits.all_locale) # type: ignore
if language:
args["language"] = language
if params["time_range"] in time_range_dict:
args["lastUpdated"] = time_range_dict[params["time_range"]]
if params["pageno"] > 1:
args["page"] = params["pageno"]
params["url"] = f"{search_url}/search/__data.json?{urlencode(args)}"
def response(resp: SXNG_Response) -> EngineResults:
results = EngineResults()
if resp.status_code in (403, 429):
# It doesn't matter if you're using Mullvad's VPN and a proper browser,
# you'll still get blocked for specific searches with a 403 or 429 HTTP
# status code.
# https://github.com/searxng/searxng/issues/5328#issue-3518337233
return results
# raise for other errors
raise_for_httperror(resp)
json_response = resp.json()
nodes = json_response["nodes"]
# 0: is None
# 1: has "connected=True", not useful
# 2: query results within "data"
data_nodes = nodes[2]["data"]
# Instead of nested object structure, all objects are flattened into a
# list. Rather, the first object in data_node provides indices into the
# "data_nodes" to access each searchresult (which is an object of more
# indices)
#
# Read the relative TypedDict definitions for details
query_meta_data: DataNodeQueryMetaDataIndices = data_nodes[0]
query_items_indices = query_meta_data["items"]
for idx in data_nodes[query_items_indices]:
query_item_indices: DataNodeResultIndices = data_nodes[idx]
results.add(
MainResult(
url=data_nodes[query_item_indices["link"]],
title=data_nodes[query_item_indices["title"]],
content=data_nodes[query_item_indices["snippet"]],
)
)
return results
def fetch_traits(engine_traits: EngineTraits) -> None:
"""Fetch languages and regions from Mullvad-Leta"""
def extract_table_data(table):
for row in table.xpath(".//tr")[2:]:
cells = row.xpath(".//td | .//th") # includes headers and data
if len(cells) > 1: # ensure the column exists
cell0 = cells[0].text_content().strip()
cell1 = cells[1].text_content().strip()
yield [cell0, cell1]
# pylint: disable=import-outside-toplevel
# see https://github.com/searxng/searxng/issues/762
from searx.network import get as http_get
# pylint: enable=import-outside-toplevel
resp = http_get(f"{search_url}/documentation")
if not isinstance(resp, Response):
print("ERROR: failed to get response from mullvad-leta. Are you connected to the VPN?")
return
if not resp.ok:
print("ERROR: response from mullvad-leta is not OK. Are you connected to the VPN?")
return
dom = html.fromstring(resp.text)
# There are 4 HTML tables on the documentation page for extracting information:
# 0. Keyboard Shortcuts
# 1. Query Parameters (shoutout to Mullvad for accessible docs for integration)
# 2. Country Codes [Country, Code]
# 3. Language Codes [Language, Code]
tables = eval_xpath_list(dom.body, "//table")
if tables is None or len(tables) <= 0:
print("ERROR: could not find any tables. Was the page updated?")
language_table = tables[3]
lang_map = {
"zh-hant": "zh_Hans",
"zh-hans": "zh_Hant",
"jp": "ja",
}
for language, code in extract_table_data(language_table):
locale_tag = lang_map.get(code, code).replace("-", "_") # type: ignore
try:
locale = babel.Locale.parse(locale_tag)
except babel.UnknownLocaleError:
print(f"ERROR: Mullvad-Leta language {language} ({code}) is unknown by babel")
continue
sxng_tag = language_tag(locale)
engine_traits.languages[sxng_tag] = code
country_table = tables[2]
country_map = {
"cn": "zh-CN",
"hk": "zh-HK",
"jp": "ja-JP",
"my": "ms-MY",
"tw": "zh-TW",
"uk": "en-GB",
"us": "en-US",
}
for country, code in extract_table_data(country_table):
sxng_tag = country_map.get(code)
if sxng_tag:
engine_traits.regions[sxng_tag] = code
continue
try:
locale = babel.Locale.parse(f"{code.lower()}_{code.upper()}")
except babel.UnknownLocaleError:
locale = None
if locale:
engine_traits.regions[region_tag(locale)] = code
continue
official_locales = get_official_locales(code, engine_traits.languages.keys(), regional=True)
if not official_locales:
print(f"ERROR: Mullvad-Leta country '{code}' ({country}) could not be mapped as expected.")
continue
for locale in official_locales:
engine_traits.regions[region_tag(locale)] = code


@@ -15,7 +15,7 @@ from searx.utils import (
extr, extr,
html_to_text, html_to_text,
parse_duration_string, parse_duration_string,
js_variable_to_python, js_obj_str_to_python,
get_embeded_stream_url, get_embeded_stream_url,
) )
@@ -125,7 +125,7 @@ def parse_images(data):
match = extr(data, '<script>var imageSearchTabData=', '</script>') match = extr(data, '<script>var imageSearchTabData=', '</script>')
if match: if match:
json = js_variable_to_python(match.strip()) json = js_obj_str_to_python(match.strip())
items = json.get('content', {}).get('items', []) items = json.get('content', {}).get('items', [])
for item in items: for item in items:


@@ -40,8 +40,8 @@ Known Quirks
The implementation to support :py:obj:`paging <searx.enginelib.Engine.paging>` The implementation to support :py:obj:`paging <searx.enginelib.Engine.paging>`
is based on the *nextpage* method of Piped's REST API / the :py:obj:`frontend is based on the *nextpage* method of Piped's REST API / the :py:obj:`frontend
API <frontend_url>`. This feature is *next page driven* and plays well with the API <frontend_url>`. This feature is *next page driven* and plays well with the
:ref:`infinite_scroll <settings ui>` setting in SearXNG but it does not really :ref:`infinite_scroll <settings plugins>` plugin in SearXNG but it does not
fit into SearXNG's UI to select a page by number. really fit into SearXNG's UI to select a page by number.
Implementations Implementations
=============== ===============
@@ -72,7 +72,7 @@ categories = []
paging = True paging = True
# search-url # search-url
backend_url: list[str] | str | None = None backend_url: list[str] | str = []
"""Piped-Backend_: The core component behind Piped. The value is an URL or a """Piped-Backend_: The core component behind Piped. The value is an URL or a
list of URLs. In the latter case instance will be selected randomly. For a list of URLs. In the latter case instance will be selected randomly. For a
complete list of official instances see Piped-Instances (`JSON complete list of official instances see Piped-Instances (`JSON


@@ -17,10 +17,11 @@ about = {
# Engine configuration # Engine configuration
paging = True paging = True
categories = ['images'] categories = ['images']
remove_ai_images = False
# Search URL # Search URL
base_url = "https://www.pixiv.net/ajax/search/illustrations" base_url = "https://www.pixiv.net/ajax/search/illustrations"
pixiv_image_proxies: list = [] pixiv_image_proxies: list[str] = []
def request(query, params): def request(query, params):
@@ -34,6 +35,9 @@ def request(query, params):
"lang": "en", "lang": "en",
} }
if remove_ai_images is True:
query_params.update({"ai_type": 1})
params["url"] = f"{base_url}/{query}?{urlencode(query_params)}" params["url"] = f"{base_url}/{query}?{urlencode(query_params)}"
return params return params


@@ -140,7 +140,7 @@ def _get_request_id(query, params):
     if l.territory:
         headers['Accept-Language'] = f"{l.language}-{l.territory},{l.language};" "q=0.9,*;" "q=0.5"

-    resp = get(url, headers=headers)
+    resp = get(url, headers=headers, timeout=5)

     for line in resp.text.split("\n"):
         if "window.searchId = " in line:

View File

@@ -64,7 +64,7 @@ def _get_algolia_api_url():
         return __CACHED_API_URL

     # fake request to extract api url
-    resp = get(f"{pdia_base_url}/search/?q=")
+    resp = get(f"{pdia_base_url}/search/?q=", timeout=3)
     if resp.status_code != 200:
         raise LookupError("Failed to fetch config location (and as such the API url) for PDImageArchive")
     pdia_config_filepart = extr(resp.text, pdia_config_start, pdia_config_end)

View File

@@ -73,7 +73,7 @@ def request(query: str, params: "OnlineParams") -> None:
     )
     esearch_url = f"{eutils_api}/esearch.fcgi?{args}"

     # DTD: https://eutils.ncbi.nlm.nih.gov/eutils/dtd/20060628/esearch.dtd
-    esearch_resp: "SXNG_Response" = get(esearch_url)
+    esearch_resp: "SXNG_Response" = get(esearch_url, timeout=3)
     pmids_results = etree.XML(esearch_resp.content)
     pmids: list[str] = [i.text for i in pmids_results.xpath("//eSearchResult/IdList/Id")]

View File

@@ -41,6 +41,7 @@ from datetime import date, timedelta
 from urllib.parse import urlencode

 from searx.result_types import EngineResults
+from searx.utils import html_to_text

 if t.TYPE_CHECKING:
     from searx.extended_types import SXNG_Response
@@ -133,11 +134,14 @@ def response(resp: "SXNG_Response") -> EngineResults:
         if mtype in ["image"] and subtype in ["bmp", "gif", "jpeg", "png"]:
             thumbnail = url

+        # remove HTML from snippet
+        content = html_to_text(result.get("snippet", ""))
+
         res.add(
             res.types.File(
                 title=result.get("label", ""),
                 url=url,
-                content=result.get("snippet", ""),
+                content=content,
                 size=result.get("size", ""),
                 filename=result.get("filename", ""),
                 abstract=result.get("abstract", ""),

View File

@@ -32,8 +32,8 @@ Known Quirks
 The implementation to support :py:obj:`paging <searx.enginelib.Engine.paging>`
 is based on the *nextpage* method of Seekr's REST API. This feature is *next
-page driven* and plays well with the :ref:`infinite_scroll <settings ui>`
-setting in SearXNG but it does not really fit into SearXNG's UI to select a page
+page driven* and plays well with the :ref:`infinite_scroll <settings plugins>`
+plugin in SearXNG but it does not really fit into SearXNG's UI to select a page
 by number.

 Implementations

View File

@@ -66,7 +66,7 @@ def setup(engine_settings: dict[str, t.Any]) -> bool:
 def get_ui_version() -> str:
     ret_val: str = CACHE.get("X-S2-UI-Version")
     if not ret_val:
-        resp = get(base_url)
+        resp = get(base_url, timeout=3)
         if not resp.ok:
             raise RuntimeError("Can't determine Semantic Scholar UI version")

View File

@@ -27,7 +27,7 @@ base_url = 'https://search.seznam.cz/'

 def request(query, params):
-    response_index = get(base_url, headers=params['headers'], raise_for_httperror=True)
+    response_index = get(base_url, headers=params['headers'], raise_for_httperror=True, timeout=3)
     dom = html.fromstring(response_index.text)

     url_params = {

View File

@@ -124,7 +124,7 @@ def get_client_id() -> str | None:
     client_id = ""
     url = "https://soundcloud.com"
-    resp = http_get(url, timeout=10)
+    resp = http_get(url, timeout=3)

     if not resp.ok:
         logger.error("init: GET %s failed", url)

View File

@@ -96,7 +96,7 @@ search_type = 'text'
 ``video`` are not yet implemented (Pull-Requests are welcome).
 """

-base_url: list[str] | str | None = None
+base_url: list[str] | str = []
 """The value is an URL or a list of URLs. In the latter case instance will be
 selected randomly.
 """

View File

@@ -28,6 +28,20 @@ search_type = ""
 base_url_web = 'https://yandex.com/search/site/'
 base_url_images = 'https://yandex.com/images/search'

+# Supported languages
+yandex_supported_langs = [
+    "ru",  # Russian
+    "en",  # English
+    "be",  # Belarusian
+    "fr",  # French
+    "de",  # German
+    "id",  # Indonesian
+    "kk",  # Kazakh
+    "tt",  # Tatar
+    "tr",  # Turkish
+    "uk",  # Ukrainian
+]
+
 results_xpath = '//li[contains(@class, "serp-item")]'
 url_xpath = './/a[@class="b-serp-item__title-link"]/@href'
 title_xpath = './/h3[@class="b-serp-item__title"]/a[@class="b-serp-item__title-link"]/span'
@@ -48,6 +62,10 @@ def request(query, params):
         "searchid": "3131712",
     }

+    lang = params["language"].split("-")[0]
+    if lang in yandex_supported_langs:
+        query_params_web["lang"] = lang
+
     query_params_images = {
         "text": query,
         "uinfo": "sw-1920-sh-1080-ww-1125-wh-999",

View File

@@ -17,10 +17,6 @@
 """

-# Struct fields aren't discovered in Python 3.14
-# - https://github.com/searxng/searxng/issues/5284
-from __future__ import annotations
-
 import typing as t
 import os

View File

@@ -1,9 +1,6 @@
# SPDX-License-Identifier: AGPL-3.0-or-later # SPDX-License-Identifier: AGPL-3.0-or-later
# pylint: disable=missing-module-docstring # pylint: disable=missing-module-docstring
# Struct fields aren't discovered in Python 3.14
# - https://github.com/searxng/searxng/issues/5284
from __future__ import annotations
import pathlib import pathlib
import msgspec import msgspec

View File

@@ -1,9 +1,6 @@
 # SPDX-License-Identifier: AGPL-3.0-or-later
 """Implementations for a favicon proxy"""

-# Struct fields aren't discovered in Python 3.14
-# - https://github.com/searxng/searxng/issues/5284
-from __future__ import annotations

 from typing import Callable

View File

@@ -106,9 +106,9 @@ class AsyncProxyTransportFixed(AsyncProxyTransport):
         except ProxyConnectionError as e:
             raise httpx.ProxyError("ProxyConnectionError: " + str(e.strerror), request=request) from e
         except ProxyTimeoutError as e:
-            raise httpx.ProxyError("ProxyTimeoutError: " + e.args[0], request=request) from e
+            raise httpx.ProxyError("ProxyTimeoutError: " + str(e.args[0]), request=request) from e
         except ProxyError as e:
-            raise httpx.ProxyError("ProxyError: " + e.args[0], request=request) from e
+            raise httpx.ProxyError("ProxyError: " + str(e.args[0]), request=request) from e


 def get_transport_for_socks_proxy(
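The str() wrapping matters because, at least by assumption here, the proxy library does not guarantee that e.args[0] is a string; concatenating a non-string to the literal would itself raise TypeError and hide the real proxy failure. A tiny illustration:

    prefix = "ProxyError: "
    arg0 = OSError(111, "Connection refused")  # hypothetical non-string args[0]
    try:
        message = prefix + arg0                # what the old code effectively did
    except TypeError:
        message = prefix + str(arg0)           # the fixed variant always works
    print(message)  # ProxyError: [Errno 111] Connection refused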

View File

@@ -1,31 +1,19 @@
 # SPDX-License-Identifier: AGPL-3.0-or-later
-"""Calculate mathematical expressions using :py:obj:`ast.parse` (mode="eval")."""
-
-import typing
-import ast
-import math
-import re
-import operator
-import multiprocessing
-
-import babel
-import babel.numbers
-from flask_babel import gettext
-
-from searx.result_types import EngineResults
+# pylint: disable=missing-module-docstring
+
+import typing as t
+
+from flask_babel import gettext  # pyright: ignore[reportUnknownVariableType]
+
 from searx.plugins import Plugin, PluginInfo

-if typing.TYPE_CHECKING:
-    from searx.search import SearchWithPlugins
-    from searx.extended_types import SXNG_Request
+if t.TYPE_CHECKING:
     from searx.plugins import PluginCfg

+
+@t.final
 class SXNGPlugin(Plugin):
-    """Plugin converts strings to different hash digests. The results are
-    displayed in area for the "answers".
-    """
+    """Parses and solves mathematical expressions."""

     id = "calculator"
@@ -34,200 +22,7 @@ class SXNGPlugin(Plugin):
         self.info = PluginInfo(
             id=self.id,
-            name=gettext("Basic Calculator"),
-            description=gettext("Calculate mathematical expressions via the search bar"),
-            preference_section="general",
+            name=gettext("Calculator"),
+            description=gettext("Parses and solves mathematical expressions."),
+            preference_section="query",
         )
-
-    def timeout_func(self, timeout, func, *args, **kwargs):
-        que = mp_fork.Queue()
-        p = mp_fork.Process(target=handler, args=(que, func, args), kwargs=kwargs)
-        p.start()
-        p.join(timeout=timeout)
-        ret_val = None
-        # pylint: disable=used-before-assignment,undefined-variable
-        if not p.is_alive():
-            ret_val = que.get()
-        else:
-            self.log.debug("terminate function (%s: %s // %s) after timeout is exceeded", func.__name__, args, kwargs)
-            p.terminate()
-        p.join()
-        p.close()
-        return ret_val
-
-    def post_search(self, request: "SXNG_Request", search: "SearchWithPlugins") -> EngineResults:
-        results = EngineResults()
-        # only show the result of the expression on the first page
-        if search.search_query.pageno > 1:
-            return results
-        query = search.search_query.query
-        # in order to avoid DoS attacks with long expressions, ignore long expressions
-        if len(query) > 100:
-            return results
-        # replace commonly used math operators with their proper Python operator
-        query = query.replace("x", "*").replace(":", "/")
-        # Is this a term that can be calculated?
-        word, constants = "", set()
-        for x in query:
-            # Alphabetic characters are defined as "Letters" in the Unicode
-            # character database and are the constants in an equation.
-            if x.isalpha():
-                word += x.strip()
-            elif word:
-                constants.add(word)
-                word = ""
-        # In the term of an arithmetic operation there should be no other
-        # alphabetic characters besides the constants
-        if constants - set(math_constants):
-            return results
-        # use UI language
-        ui_locale = babel.Locale.parse(request.preferences.get_value("locale"), sep="-")
-        # parse the number system in a localized way
-        def _decimal(match: re.Match) -> str:
-            val = match.string[match.start() : match.end()]
-            val = babel.numbers.parse_decimal(val, ui_locale, numbering_system="latn")
-            return str(val)
-        decimal = ui_locale.number_symbols["latn"]["decimal"]
-        group = ui_locale.number_symbols["latn"]["group"]
-        query = re.sub(f"[0-9]+[{decimal}|{group}][0-9]+[{decimal}|{group}]?[0-9]?", _decimal, query)
-        # in python, powers are calculated via **
-        query_py_formatted = query.replace("^", "**")
-        # Prevent the runtime from being longer than 50 ms
-        res = self.timeout_func(0.05, _eval_expr, query_py_formatted)
-        if res is None or res[0] == "":
-            return results
-        res, is_boolean = res
-        if is_boolean:
-            res = "True" if res != 0 else "False"
-        else:
-            res = babel.numbers.format_decimal(res, locale=ui_locale)
-        results.add(results.types.Answer(answer=f"{search.search_query.query} = {res}"))
-        return results
-
-def _compare(ops: list[ast.cmpop], values: list[int | float]) -> int:
-    """
-    2 < 3 becomes ops=[ast.Lt] and values=[2,3]
-    2 < 3 <= 4 becomes ops=[ast.Lt, ast.LtE] and values=[2,3, 4]
-    """
-    for op, a, b in zip(ops, values, values[1:]):  # pylint: disable=invalid-name
-        if isinstance(op, ast.Eq) and a == b:
-            continue
-        if isinstance(op, ast.NotEq) and a != b:
-            continue
-        if isinstance(op, ast.Lt) and a < b:
-            continue
-        if isinstance(op, ast.LtE) and a <= b:
-            continue
-        if isinstance(op, ast.Gt) and a > b:
-            continue
-        if isinstance(op, ast.GtE) and a >= b:
-            continue
-        # Ignore impossible ops:
-        # * ast.Is
-        # * ast.IsNot
-        # * ast.In
-        # * ast.NotIn
-        # the result is False for a and b and operation op
-        return 0
-    # the results for all the ops are True
-    return 1
-
-operators: dict[type, typing.Callable] = {
-    ast.Add: operator.add,
-    ast.Sub: operator.sub,
-    ast.Mult: operator.mul,
-    ast.Div: operator.truediv,
-    ast.Pow: operator.pow,
-    ast.BitXor: operator.xor,
-    ast.BitOr: operator.or_,
-    ast.BitAnd: operator.and_,
-    ast.USub: operator.neg,
-    ast.RShift: operator.rshift,
-    ast.LShift: operator.lshift,
-    ast.Mod: operator.mod,
-    ast.Compare: _compare,
-}
-
-math_constants = {
-    'e': math.e,
-    'pi': math.pi,
-}
-
-# with multiprocessing.get_context("fork") we are ready for Py3.14 (by emulating
-# the old behavior "fork") but it will not solve the core problem of fork, nor
-# will it remove the deprecation warnings in py3.12 & py3.13. Issue is
-# discussed here: https://github.com/searxng/searxng/issues/4159
-mp_fork = multiprocessing.get_context("fork")
-
-def _eval_expr(expr):
-    """
-    Evaluates the given textual expression.
-    Returns a tuple of (numericResult, isBooleanResult).
-    >>> _eval_expr('2^6')
-    64, False
-    >>> _eval_expr('2**6')
-    64, False
-    >>> _eval_expr('1 + 2*3**(4^5) / (6 + -7)')
-    -5.0, False
-    >>> _eval_expr('1 < 3')
-    1, True
-    >>> _eval_expr('5 < 3')
-    0, True
-    >>> _eval_expr('17 == 11+1+5 == 7+5+5')
-    1, True
-    """
-    try:
-        root_expr = ast.parse(expr, mode='eval').body
-        return _eval(root_expr), isinstance(root_expr, ast.Compare)
-    except (SyntaxError, TypeError, ZeroDivisionError):
-        # Expression that can't be evaluated (i.e. not a math expression)
-        return "", False
-
-def _eval(node):
-    if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
-        return node.value
-    if isinstance(node, ast.BinOp):
-        return operators[type(node.op)](_eval(node.left), _eval(node.right))
-    if isinstance(node, ast.UnaryOp):
-        return operators[type(node.op)](_eval(node.operand))
-    if isinstance(node, ast.Compare):
-        return _compare(node.ops, [_eval(node.left)] + [_eval(c) for c in node.comparators])
-    if isinstance(node, ast.Name) and node.id in math_constants:
-        return math_constants[node.id]
-    raise TypeError(node)
-
-def handler(q: multiprocessing.Queue, func, args, **kwargs):  # pylint:disable=invalid-name
-    try:
-        q.put(func(*args, **kwargs))
-    except:
-        q.put(None)
-        raise
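For readers who want the gist of what was dropped here: the old plugin walked the AST of the query and only evaluated numeric constants plus a whitelist of operators. A compact, stand-alone sketch of that technique, trimmed to a handful of operators:

    import ast
    import operator

    OPS = {
        ast.Add: operator.add,
        ast.Sub: operator.sub,
        ast.Mult: operator.mul,
        ast.Div: operator.truediv,
        ast.Pow: operator.pow,
        ast.USub: operator.neg,
    }

    def safe_eval(expr: str) -> float:
        """Evaluate a purely arithmetic expression without calling eval()."""
        def _eval(node):
            if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
                return node.value
            if isinstance(node, ast.BinOp):
                return OPS[type(node.op)](_eval(node.left), _eval(node.right))
            if isinstance(node, ast.UnaryOp):
                return OPS[type(node.op)](_eval(node.operand))
            raise TypeError(node)  # anything else is rejected
        return _eval(ast.parse(expr, mode="eval").body)

    print(safe_eval("1 + 2*3**4 / (6 + -7)"))  # -161.0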

View File

@@ -0,0 +1,28 @@
+# SPDX-License-Identifier: AGPL-3.0-or-later
+# pylint: disable=missing-module-docstring
+
+import typing as t
+
+from flask_babel import gettext  # pyright: ignore[reportUnknownVariableType]
+
+from searx.plugins import Plugin, PluginInfo
+
+if t.TYPE_CHECKING:
+    from searx.plugins import PluginCfg
+
+
+@t.final
+class SXNGPlugin(Plugin):
+    """Automatically loads the next page when scrolling to bottom of the current page."""
+
+    id = "infiniteScroll"
+
+    def __init__(self, plg_cfg: "PluginCfg") -> None:
+        super().__init__(plg_cfg)
+        self.info = PluginInfo(
+            id=self.id,
+            name=gettext("Infinite scroll"),
+            description=gettext("Automatically loads the next page when scrolling to bottom of the current page"),
+            preference_section="ui",
+        )

View File

@@ -24,12 +24,6 @@ if typing.TYPE_CHECKING:
     from searx.plugins import PluginCfg

-name = ""
-description = gettext("")
-
-plugin_id = ""
-preference_section = ""
-
 CONVERT_KEYWORDS = ["in", "to", "as"]

View File

@@ -476,10 +476,6 @@ class Preferences:
                 settings['ui']['query_in_title'],
                 locked=is_locked('query_in_title')
             ),
-            'infinite_scroll': BooleanSetting(
-                settings['ui']['infinite_scroll'],
-                locked=is_locked('infinite_scroll')
-            ),
             'search_on_category_select': BooleanSetting(
                 settings['ui']['search_on_category_select'],
                 locked=is_locked('search_on_category_select')

View File

@@ -16,10 +16,6 @@
     :members:
 """

-# Struct fields aren't discovered in Python 3.14
-# - https://github.com/searxng/searxng/issues/5284
-from __future__ import annotations
-
 __all__ = ["Result"]

 import typing as t

View File

@@ -28,9 +28,6 @@ template.
 """

 # pylint: disable=too-few-public-methods

-# Struct fields aren't discovered in Python 3.14
-# - https://github.com/searxng/searxng/issues/5284
-from __future__ import annotations

 __all__ = ["AnswerSet", "Answer", "Translations", "WeatherAnswer"]

View File

@@ -14,10 +14,6 @@ template. For highlighting the code passages, Pygments_ is used.
 """

 # pylint: disable=too-few-public-methods, disable=invalid-name

-# Struct fields aren't discovered in Python 3.14
-# - https://github.com/searxng/searxng/issues/5284
-from __future__ import annotations
-
 __all__ = ["Code"]

 import typing as t

View File

@@ -13,9 +13,6 @@ template.
 """

 # pylint: disable=too-few-public-methods

-# Struct fields aren't discovered in Python 3.14
-# - https://github.com/searxng/searxng/issues/5284
-from __future__ import annotations

 __all__ = ["KeyValue"]

View File

@@ -21,10 +21,6 @@ Related topics:
 """

 # pylint: disable=too-few-public-methods, disable=invalid-name

-# Struct fields aren't discovered in Python 3.14
-# - https://github.com/searxng/searxng/issues/5284
-from __future__ import annotations
-
 __all__ = ["Paper"]

 import typing as t

View File

@@ -22,7 +22,7 @@ from searx.network import initialize as initialize_network, check_network_config
 from searx.results import ResultContainer
 from searx.search.checker import initialize as initialize_checker
 from searx.search.processors import PROCESSORS
+from searx.search.processors.abstract import RequestParams

 if t.TYPE_CHECKING:
     from .models import SearchQuery
@@ -79,16 +79,20 @@ class Search:
         return bool(results)

     # do search-request
-    def _get_requests(self) -> tuple[list[tuple[str, str, dict[str, t.Any]]], int]:
+    def _get_requests(self) -> tuple[list[tuple[str, str, RequestParams]], float]:
         # init vars
-        requests: list[tuple[str, str, dict[str, t.Any]]] = []
+        requests: list[tuple[str, str, RequestParams]] = []

         # max of all selected engine timeout
         default_timeout = 0

         # start search-request for all selected engines
         for engineref in self.search_query.engineref_list:
-            processor = PROCESSORS[engineref.name]
+            processor = PROCESSORS.get(engineref.name)
+            if not processor:
+                # engine does not exist: either it is not registered yet, or its
+                # 'init' method failed and the engine was never registered.
+                continue

             # stop the request now if the engine is suspend
             if processor.extend_container_if_suspended(self.result_container):
@@ -133,7 +137,7 @@
         return requests, actual_timeout

-    def search_multiple_requests(self, requests: list[tuple[str, str, dict[str, t.Any]]]):
+    def search_multiple_requests(self, requests: list[tuple[str, str, RequestParams]]):
         # pylint: disable=protected-access
         search_id = str(uuid4())
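A minimal illustration of the defensive lookup introduced above: indexing PROCESSORS with [] raises KeyError for an engine that was never registered, while .get() lets the search simply skip it (the dict below is a stand-in for the real processor map):

    PROCESSORS = {"wikipedia": object()}  # stand-in: real values are EngineProcessor instances

    for name in ("wikipedia", "broken_engine"):
        processor = PROCESSORS.get(name)
        if not processor:
            # engine never registered, e.g. because its setup failed
            continue
        print("dispatching request to", name)
    # only "wikipedia" is dispatched; "broken_engine" is silently skipped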

View File

@@ -51,7 +51,6 @@ class ProcessorMap(dict[str, EngineProcessor]):
         eng_name: str = eng_settings["name"]

         if eng_settings.get("inactive", False) is True:
-            logger.info("Engine of name '%s' is inactive.", eng_name)
             continue

         eng_obj = engines.engines.get(eng_name)

Some files were not shown because too many files have changed in this diff.