uname ....
kernel 5.10.0-9-amd64 #1 SMP Debian 5.10.70-1 (2021-09-30) x86_64 GNU/Linux
cat /etc/debian_version
11.1
cat /etc/devuan_version
chimaera
I currently use Recoll as a local search engine. It does a good job, with my own cron job rebuilding the database nightly as a local user.
However, the desktop app sometimes freezes or blocks the X server completely and then has to be killed from another terminal.
Or sometimes it appears to do nothing, judging from e.g. top.
After an indeterminate time there may or may not be search results.
I am not happy with this Recoll.
Edit: meanwhile I have seen that the Recoll desktop app has a CPU load of at most 1%. Something is restricting the application.
Its niceness is 20, like all the others.
Since I am willing to wait for the result, I do not need cpulimit or similar.
What is a recommended search engine, ideally with a browser interface, that can run as root?
What do the experts say?
Last edited by bai4Iej2need (2022-01-24 15:26:04)
The devil, you know, is better than the angel, you don't know. by a British Citizen, I don't know too good.
One generation abandons the enterprises of another like stranded vessels. By Henry David Thoreau, WALDEN, Economy. Line 236 (Gutenberg text Version)
broken by design :
https://bugs.debian.org/cgi-bin/bugrepo … bug=958390
There's a new one called "Cerebro" that has a free software license (github.com/cerebroapp/cerebro)
It looks kind of cool. I tried to use the AppImage today, but it was throwing up errors. I see that there are some instructions on how to overcome the errors, but I haven't tried them yet. There's also a .deb package.
It would be nice if something a bit more modern than Recoll finally became available.
For command line search utilities, I've fallen in love with "fd" (on Devuan it's the 'fd-find' package). Soooo fast, and doesn't need to build an index.
Last edited by andyprough (2022-01-19 02:31:32)
Offline
I looked now at recollindex, which I run from a cron job that calls a bash script, which in turn calls recollindex -x.
It runs as user 1000 (with that user's $HOME) and uses /usr/share.
#!/bin/bash
# recollindex indexes all files
. /usr/local/bin/log_functions.source
open_log
RECOLLINDEX=/usr/bin/recollindex
[ -x "${RECOLLINDEX}" ] || exit 1
# move the old indices to *.old, to keep them for comparison
cd ~/.recoll/xapiandb/ || exit 2
for datei in *.glass ; do
    [ -f "${datei}" ] && mv "${datei}" "${datei}.old"
done
# echo $(date +'%F %T') $(hostname) old indices moved to *.old
LOG=~/.recoll/recollindex.log
if [ -s "${LOG}" ] ; then
    mv "${LOG}" "${LOG}.old"
    touch "${LOG}"
fi
cd ~ || echo "cd home failed"
# the indexing run
$RECOLLINDEX -x
echo recollindex $?
echo "$(date +'%F %T') $(hostname) recollindex finished"
close_log_exit
The log_functions do some logging and record-keeping.
When I look at the processes,
the bash shell has niceness 0,
but /usr/bin/recollindex -x and all its subprocesses run with niceness 19.
Because of this it sometimes does not finish, even while the CPUs are idling.
It is possible to manually renice recollindex and its subprocesses to 0 as root,
but I could not find where this niceness is set to such an extreme value.
The point of a cron job is not to have to intervene manually.
There is no mention of niceness in man recollindex or info recollindex, nor on the web, nor in the conf files.
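For reference, raising a process's niceness needs no privileges, but lowering it again does (that is why the renice back to 0 has to happen as root). A small stand-alone sketch, with `sleep` standing in for recollindex, to reproduce and inspect the situation:

```shell
# Start a stand-in process at niceness 19, the value recollindex ends up with.
nice -n 19 sleep 30 &
pid=$!
# Inspect its niceness (NI column) -- the same view top gives.
ps -o pid,ni,comm -p "$pid"
# As root, one could lower it again:  renice -n 0 -p "$pid"
# (unprivileged users may only raise niceness, never lower it)
kill "$pid"
```

To renice a whole subtree the way described above, the child PIDs can be collected with `pgrep -P "$pid"` before passing them to renice.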
:wq