I am working on a simple static website that gives visitors basic information about myself and the work I do. I want it as a way to introduce myself to potential clients, collaborators, etc., rather than relying solely on LinkedIn as my visiting card.

This may sound rather oxymoronic given that I am literally going to be placing (some relevant) details about myself and my work on the internet, but I want to limit the website's exposure to bots, web scraping, and content collection for LLMs.

Is this a realistic expectation?

Also, any suggestions for privacy-respecting yet inexpensive domain registrars available in Europe would be a great help.

  • Fuck Work@slrpnk.net · 6 months ago

    Look in your access logs (/var/log/nginx/access.log) for user agents that indicate AI crawlers, such as GPTBot. Then add if ($http_user_agent ~* "useragent1|useragent2|...") { return 403; } to the server block of your website's config file in /etc/nginx/sites-enabled/. You can also add a robots.txt that forbids scraping; ChatGPT's crawler generally checks and respects that… for now. This, paired with some of the stuff above, should work.
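
    A minimal sketch of what that could look like, assuming nginx and a purely illustrative list of bot user agents (the domain, web root, and bot names below are just examples; check your own access log for the agents actually hitting your site):

        server {
            listen 80;
            server_name example.com;            # hypothetical domain
            root /var/www/example;              # hypothetical web root

            # Return 403 to self-identified AI crawlers (example list, not exhaustive)
            if ($http_user_agent ~* "GPTBot|ChatGPT-User|ClaudeBot|CCBot|Bytespider") {
                return 403;
            }

            location / {
                try_files $uri $uri/ =404;
            }
        }

    And a robots.txt in the web root asking well-behaved crawlers to stay out, for example:

        User-agent: GPTBot
        Disallow: /

        User-agent: CCBot
        Disallow: /

    Keep in mind that both measures only catch crawlers that identify themselves honestly or respect robots.txt; anything that spoofs a normal browser user agent will still get through.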