In the past, some files were linted with the standard profile, others with a
profile in which most of the messages were switched off, and some files were
not checked at all.
- ``PYLINT_SEARXNG_DISABLE_OPTION`` has been removed
- the ``# lint: pylint`` marker is no longer necessary
- the pylint tasks have been reduced from three to two:
1. ./searx/engines -> lint engines with additional builtins
2. ./searx ./searxng_extra ./tests -> lint all other python files
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
In the master branch, the checker starts to stream the response
and then cuts the connection. However, this creates a lot of read errors,
which are false negatives, and I don't know how to fix that issue.
This commit changes the checker to download the whole image.
The error reporting is also changed to report only one line
instead of the whole stack trace.
Also, if a timeout occurs, the checker waits for one second
before retrying.
Do note that I have not tested the checker running in the background.
This feature seems forgotten and lacks interest despite the
initial work a few years ago.
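A minimal sketch of the behaviour described above, assuming plain ``httpx`` is
used directly (the real checker goes through ``searx.network``): download the
whole image, report a single error line, and wait one second before one retry
after a timeout::

    import time
    from typing import Optional

    import httpx

    def fetch_image(url: str, timeout: float = 10.0) -> Optional[bytes]:
        """Download the whole image; on timeout wait one second and retry once."""
        for _ in range(2):
            try:
                response = httpx.get(url, timeout=timeout)
                response.raise_for_status()
                return response.content        # whole image, no streaming
            except httpx.TimeoutException:
                time.sleep(1)                  # wait one second before retrying
            except httpx.HTTPError as exc:
                # report a single line instead of the whole stack trace
                print(f'{url}: {exc.__class__.__name__}: {exc}')
                return None
        return None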
This patch fixes an issue reported by ``make test.unit``::

    searx/search/checker/impl.py:39: SyntaxWarning: invalid escape sequence '\>'
      rep = ['<' + tag + '[^\>]*>' for tag in HTML_TAGS]
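One way to get rid of the warning (whether the actual fix looks exactly like
this is an assumption): ``>`` does not need to be escaped inside a character
class, and a raw string avoids escape-sequence warnings altogether::

    # '>' needs no escaping inside [...]; the raw string silences the warning
    rep = ['<' + tag + r'[^>]*>' for tag in HTML_TAGS]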
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Since https://github.com/searxng/searxng/pull/354
searx.network.stream(...) returns a tuple.
This commit updates the checker code according to
this signature change.
Disable the python-black code formatting where it hurts the readability of
the code.
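As an illustration (not taken from the actual change), black can be told to
leave a hand-formatted block alone with its ``fmt`` marker comments; the
constant below is only an example::

    # fmt: off
    EXAMPLE_TABLE = (
        ('af', 'Afrikaans'), ('de', 'German'),
        ('en', 'English'),   ('fr', 'French'),
    )
    # fmt: on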
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
* download images using the "image_proxy" network (HTTP/1 instead of HTTP/2)
* don't cache data: URLs (reduces memory usage)
* after each test: purge the image URL cache, then call the garbage collector
* download only the first 64kb of each image (see the sketch after this list)
* display the median time instead of the average
* add a "Reliability" column (sums up the metrics and the checker results)
* display the "selected language", "SafeSearch" and "Time range" values as "broken" when the checker tests fail
Report suspended engines to the user.
searx.search.processor.abstract (sketched after this list):
* manages the suspend time (per network).
* reports the suspend time to the ResultContainer (method extend_container_if_suspended)
* adds the results to the ResultContainer (method extend_container)
* handles exceptions (method handle_exception)
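A simplified sketch of the suspend handling described above; the method names
come from the list, but the class name, signatures and bodies are illustrative,
not the real implementation::

    from time import time

    class Processor:
        """Illustrative only: suspend an engine after an error, report it later."""

        def __init__(self, engine_name):
            self.engine_name = engine_name
            self.suspend_end_time = 0   # per network in the real code

        def handle_exception(self, result_container, reason, suspend_seconds=10):
            # an HTTP request failed: suspend the engine for a while
            self.suspend_end_time = time() + suspend_seconds
            result_container.add_unresponsive_engine(self.engine_name, reason)

        def extend_container_if_suspended(self, result_container):
            # report the suspension instead of sending new requests
            if time() < self.suspend_end_time:
                result_container.add_unresponsive_engine(
                    self.engine_name, 'suspended', suspended=True)
                return True
            return False

        def extend_container(self, result_container, results):
            # add the results of a successful request to the container
            result_container.extend(self.engine_name, results)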
settings.yml:
* outgoing.networks (see the example below):
  * can contain network definitions
  * properties: enable_http, verify, http2, max_connections, max_keepalive_connections,
    keepalive_expiry, local_addresses, support_ipv4, support_ipv6, proxies, max_redirects, retries
  * retries: 0 by default, the number of times searx retries sending the HTTP request (using a different IP & proxy each time)
  * local_addresses can be "192.168.0.1/24" (IPv6 is supported too)
  * support_ipv4 & support_ipv6: both True by default,
    see https://github.com/searx/searx/pull/1034
* each engine can define a "network" section:
  * either a full network description
  * or a reference to an existing network
* all HTTP requests of an engine use the same HTTP configuration (this was not the case before, see the proxy configuration in master)
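A hedged sketch of what such a configuration could look like; the network name,
the engine name and the chosen values are illustrative, only the property names
come from the list above::

    outgoing:
      networks:
        my_network:                       # illustrative network name
          enable_http: true
          retries: 1
          max_redirects: 5
          local_addresses: "192.168.0.1/24"
          support_ipv6: true

    engines:
      - name: example engine
        network: my_network               # reference an existing network definition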