SRE working in email. Gay. Married. Doggy daddy.

I like Star Trek, genealogy, O scale model trains, history, Pokemon, LEGO, coin collecting, books, music, board gaming, video gaming, camping, 420, and more.

Mastodon: @leopardboy@netmonkey.xyz

  • 5 Posts
  • 23 Comments
Joined 2 years ago
Cake day: June 14th, 2023

  • Depends on the context, I think. For me, I rarely do it for personal stuff. If I wanted to be perfect, I could do it, assuming a signature is available to verify, but I’m lazy. I would venture to say most folks don’t do it either.

    That said, where I have been consistent about it is in config management code at work. If that code has to download an installer from an untrusted source, verifying the signature before installation lets me confirm I'm putting the same package on every server. It doesn't work well in every circumstance, though. (A rough sketch of the idea is below.)
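
    To make that concrete, here is a minimal Python sketch of the pattern, not the actual tooling: it downloads a file and refuses to install it unless its SHA-256 digest matches a known-good value (a checksum check standing in for a full GPG signature verification). The URL, checksum, and install command are placeholders.

    ```python
    # Hypothetical example: download an installer from an untrusted source and
    # only run it if the file matches a digest published out-of-band.
    import hashlib
    import subprocess
    import urllib.request

    INSTALLER_URL = "https://example.com/vendor-installer.sh"       # placeholder URL
    EXPECTED_SHA256 = "replace-with-the-published-sha256-digest"    # placeholder digest
    LOCAL_PATH = "/tmp/vendor-installer.sh"

    # Fetch the installer.
    urllib.request.urlretrieve(INSTALLER_URL, LOCAL_PATH)

    # Hash what we actually received, in chunks so large files stay cheap.
    digest = hashlib.sha256()
    with open(LOCAL_PATH, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 16), b""):
            digest.update(chunk)

    # Refuse to install anything that doesn't match, so every server gets
    # a byte-identical package.
    if digest.hexdigest() != EXPECTED_SHA256:
        raise SystemExit("checksum mismatch: refusing to install")

    subprocess.run(["sh", LOCAL_PATH], check=True)
    ```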

  • If scraping is against their terms of service (and it probably is), then it doesn’t seem like much of a legal gray zone to me. I think they would sue the people running the scrapers.

    I used to work for a company that was constantly fighting scrapers. They loved our data! I have no idea how successful the bad guys ultimately were, but we had ways to slow them down, block them, and so on. Also, if you spend enough money with your CDN, there are lots of ways to deal with bots and scrapers. None of it is 100% effective, but you can sure make it a pain in the ass for a casual Lemmy admin.

    I say we make our own content here, instead of pulling it from Reddit.