Profile for adele

Display name
Adële 🐁
Username
@adele@social.pollux.casa
Role
admin

About adele

Fields

Static hosting
https://pages.casa/
Gemini hosting
https://pollux.casa/
Email/XMPP hosting
https://message.casa/

Bio

aka 아델
#French 🇫🇷​ #PHP / #JavaScript and #Java developer
#Korean 🇰🇷 ancestry (but I don’t speak the language)
Into #SmolWeb, #GeminiProtocol, #Smolnet, #LowTech
#ArchLinux / #Debian user
#Markdown 🇲⬇️ enthusiast
Instance running #GotToSocial 🦥
fr / en
:straightally:

Stats

Joined
Posts
930
Followed by
1137
Following
299

Recent posts

include boosts

So much money to protect nations… from the people who need help

The world is turning into something I don’t recognise, and certainly not something I like[...]

➡️​ https://adele.pages.casa/md/blog/so_much_mony_to_protect_nations.md

➡️​ gemini://adele.pollux.casa/gemlog/2025-05-31_so_much_mony_to_protect_nations.gmi

The pollux.casa server sometimes experiences load peaks due to AI training crawlers and search engines.

pollux.casa allows hosted Gemini capsules to be accessible through both https and gemini protocols. Of course, it's on the https side that the crawlers come into play.

Since the main reason for having a #capsule on pollux.casa is to offer it on the #smolnet #gemini, I'm going to take the liberty of installing a crawler blocker on the https version.

#smolweb #geminiprotocol

Do you know a good CLI text editor for writing text (blog posts in Markdown) in a Linux terminal?

I like vim, but I hate how the cursor moves when a long line (a paragraph) wraps over several screen lines.
I expect the up arrow to move up one screen line, but vim jumps up a full paragraph (one logical line). It makes it hard to move around in your text.

What do you use? I need something easy to use (no time to learn many key combinations).

#cli #vim #texteditor #markdown

edit: got a solution with vim (gj/gk mapped to the cursor arrows), see the replies below
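The gj/gk fix mentioned in the edit can be sketched as a few lines of vimrc; this is an illustration of the remap, not necessarily the poster's exact configuration:

```vim
" Make the arrow keys move by screen line (gj/gk) instead of logical line.
nnoremap <Up>   gk
nnoremap <Down> gj
" In insert mode, <C-o> runs a single normal-mode command.
inoremap <Up>   <C-o>gk
inoremap <Down> <C-o>gj
```

With wrapped paragraphs, gj/gk move through the text as it appears on screen, which is exactly the behaviour the post asks for.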

How to backup a live gotosocial database

rm /gotosocial/backup.db

sqlite3 /gotosocial/sqlite.db "VACUUM INTO '/gotosocial/backup.db'" 

#gotosocial #gts

I'm surprised not to find any crowdfunding project to run a new community-driven top-level domain (TLD).
All TLDs seem to be owned by countries or U.S. orgs/firms.

I haven't found an open and independent TLD. Am I missing something?

My old 7" tablet can also run #Tusky !!!

I think that my attempt to use it ONLY for reading is actually failing 😬

I would like to acquire an eBook reader. I see that the #Kobo readers can connect to nextcloud with app #KoboCloud.

Do you use this solution?

Any advice for Linux users on picking the right reader: compatible with Linux, Nextcloud, or ownCloud (neither Dropbox nor GDrive for me), and able to read non-DRM EPUB and PDF.

Thanks for your help (RT if you think your followers could help)

https://github.com/fsantini/KoboCloud

You don't know what #religion I am?

That's normal, I keep that to myself. Maybe it's because I'm French and in my country secularism is a dogma. For me, religion is part of intimacy and privacy.

Block non-human crawlers with lighttpd

2025-04-20 19:05

Recently, I put a copy of some ZIM files online with kiwix-server. I posted the URL of this site on the Fediverse and, a few days later, the little server was a bit overloaded. The logs showed that the site was being crawled by search engines and AI training bots. There was no reason to let them. A robots.txt file calmed some, but not others.

Analysing user agents and IP addresses is not the answer, because everything is done to make that complicated (randomisation, many datacentre origins). I thought about Cloudflare protection, Google captcha, and the open-source solution Anubis. All of them require JavaScript to be enabled in human visitors' browsers.

After several tests, I have found a simple method to stop these crawlers.

The principle

When a connection arrives on the web server, it checks whether the request comes with a cookie. If it does not, the web server redirects the browser to an HTML form that asks the user to tick a checkbox and submit. If the user submits the form correctly, he or she receives a cookie and is redirected to the previously requested page. The new request is made with the cookie, so the web server does its job and sends the expected content.
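As a rough illustration of that principle (not the author's actual configuration), a lighttpd setup could redirect cookie-less requests to the form; the cookie name `human` and the path `/check.html` are made-up placeholders:

```
# Needs mod_redirect. Requests without the "human" cookie go to the form;
# the form page itself must stay reachable, or the redirect would loop.
$HTTP["cookie"] !~ "human=1" {
    $HTTP["url"] !~ "^/check\.html" {
        url.redirect = ( ".*" => "/check.html" )
    }
}
```

The form handler (a small CGI script, for instance) would then set the cookie and send the visitor back to the page they originally requested.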

Detail on my blog