sear.c scrapes search results of popular engines, caches them and creates a simple HTML UI


sear.c

sear.c is a lightweight replacement for SearX that proxies and caches search results from the Google web search engine. Its main advantages over SearX are speed and simplicity.

instructions for debian and ubuntu systems

First, add my software distribution repository, prog.sijanec.eu, to your APT sources list.

apt install sear.c
service sear.c start
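
The sources list entry will be along these lines; the suite and component names here are assumptions, so check prog.sijanec.eu for the authoritative line:

```
deb https://prog.sijanec.eu/ stable main
```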

requirements

  • a POSIX system
  • GNU C library
  • GNU compiler collection (it's written in GNU C - it uses nested functions)
  • GNU Make
  • libxml2-dev (for the simple HTTP/1.0 client and the HTML parser)
  • libmicrohttpd-dev (for serving results - use a reverse proxy, such as nginx, for HTTPS)
  • xxd (for converting HTML pages into C arrays when compiling from source)
  • php-cli, needed for a single line of the Makefile (and I complain about bloat)

compiling from source

make prepare
make

instructions

  • run the daemon - it starts listening on HTTP port 7327 (remember it by spelling SEAR on a phone keypad (; )
  • optional: create a reverse proxy for HTTPS
  • navigate to http://localhost:7327 and do a couple of searches to see if everything works
  • the horseshoe button redirects directly to the first result without wasting time on the results page. use it if you feel lucky. (BP)
  • the painting button performs a search for images. PRIVACY WARNING: images are loaded directly from their original servers (not from google)
  • the program writes all logs to standard error
  • setting the h parameter rewrites links from HTTPS to HTTP
  • setting the l parameter to a number limits the number of displayed links to that number
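
For the optional HTTPS step, a minimal nginx reverse-proxy sketch; the server name and certificate paths are placeholders, not part of sear.c:

```nginx
server {
    listen 443 ssl;
    server_name search.example.org;              # placeholder
    ssl_certificate     /etc/ssl/fullchain.pem;  # placeholder
    ssl_certificate_key /etc/ssl/privkey.pem;    # placeholder

    location / {
        # sear.c listens on plain HTTP port 7327 on the same host
        proxy_pass http://127.0.0.1:7327;
        proxy_set_header Host $host;
    }
}
```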

prebuilt binaries

Apart from the usual Debian distribution, there are also prebuilt binaries for amd64 and arm64, as well as Debian packages.

Before downloading, check that the build passed, as indicated by the badge below:

Build Status

screenshots

screenshot in chromium 0 screenshot in chromium 2 screenshot in chromium 3 screenshot in chromium 4 screenshot in chromium 5

additional information

  • valgrind reports a memory leak that grows with every API search query; run make valgrind and you'll see it. I was unable to find the bug, and it bothers me. I wrote a small PoC (test/bug), but could not replicate the leak there (cd test/bug; make; make valgrind; less valgrind-out.txt - the process exits with no leaks possible). Example valgrind output from sear.c after one request is included in test/bug/example-valgrind.txt. Such a small memory leak is not a real problem, since we store all data extracted from queries indefinitely anyway, but it's still pretty dumb to leak memory.
  • memory allocations are not checked for failure. This needs to be done before -fanalyzer can be used.
  • __attribute__s such as nonnull are not set on struct members of the query types or on functions such as htmlspecialchars; instead, an if (!arg) return NULL; check is done, which is poor coding style and prevents meaningful -fanalyzer runs. This also needs to be fixed to use -fanalyzer.