

That seems like a flight of stairs up.


Fetterman never said he was a socialist or anti-Zionist. People only ever liked him because he was pro-M4A and because the elites hated him for wearing hoodies in the Senate. No one knew just how much of a raging Zionist he was until after 10/7. Fetterman is not the point of reference for Mamdani; that is AOC.


There’s nothing to indicate he will about-face like Fetterman. He will endlessly compromise, concede, and triangulate like AOC does, and that’s what he has been doing (e.g., Tisch). That’s the inevitable reality of being a socialist politician in a capitalist state, and it’s why Marx advised against actually getting elected, especially to an executive office. Believing he was secretly a fraud all along is Hollywood-brained.


With all his experience, I’m genuinely surprised he only managed to kill one guard.


His model would be AOC, not Fetterman. Platner is the next Fetterman.


🤙


Anubis forces the site to reload when doing the normal PoW challenge! Meta Refresh is a sufficient measure to block 99% of all bot traffic without being any more burdensome than PoW.
You’ve failed to demonstrate why meta-refresh is more burdensome than PoW and have pivoted to arguing the point I was making from the start as though it were your own. I’m not arguing with you any further. I’m satisfied that I’ve convinced any readers of our discussion.


You will have people complain about their anti-fingerprinting being blocked with every bot-management solution. Your ability to navigate the internet anonymously is directly correlated with a bot’s ability to scrape. That has never been my complaint about Anubis.
My complaint is that the calculations Anubis forces you to do are an absolutely negligible burden for a bot to solve. The hardest part is just having a JavaScript interpreter available. Making the author of the scraper write custom code to deal with your website is the most effective way to prevent bots.
Think about how much computing power AI data centers have. Do you think they give a shit about hashing some values for Anubis? No. They burn more compute generating a single LLM answer than a thousand Anubis challenges would cost. PoW is a backwards solution.
Please think. CAPTCHAs worked because they’re supposed to be hard for a computer to solve but easy for a human. PoW is the opposite.
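To make it concrete, here’s roughly what solving a PoW challenge looks like with no browser in the picture. I’m assuming a generic SHA-256 leading-zero-digits scheme here, not Anubis’s exact wire format, but the shape of the work is the same:

```python
import hashlib
import time

def solve_pow(challenge: str, difficulty: int = 4) -> int:
    """Brute-force a nonce so sha256(challenge + nonce) starts with
    `difficulty` zero hex digits. Simplified, assumed format -- not
    Anubis's exact one."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

start = time.perf_counter()
nonce = solve_pow("challenge-string-from-the-server")
print(f"nonce={nonce}, took {(time.perf_counter() - start) * 1000:.0f} ms")
# Finishes well under a second on any laptop, and a scraper can run the
# same loop in Rust or C even faster than a real user's browser runs JS.
```

A few lines of plain Python and it’s done. The only real obstacle Anubis puts in front of a scraper is needing something that runs the challenge script at all.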
In the current shape Anubis has zero impact on usability for 99% of the site visitors, not so with meta refresh.
Again, I ask you: what extra burden does meta-refresh impose on users? How does setting a cookie and immediately refreshing the page burden the user more than making them wait longer while draining their battery before doing the exact same thing? It’s strictly less intrusive.


And how do you actually check for working JS in a way that can’t be easily spoofed? Hint: PoW is a good way to do that.
Accessing the browser’s APIs in any way is far harder to spoof than some hashing. I already suggested checking whether the browser has graphics acceleration; that would filter out the vast majority of headless browsers too. PoW is just math and is easy to spoof without running any JavaScript. You can even do it faster than real JavaScript users with something like Rust or C.
Meta refresh is a downgrade in usability for everyone but a tiny minority that has disabled JS.
What are you talking about? It just refreshes the page without doing any of the extra computation that PoW does. What extra burden does it put on users?


LOL


scrapers (currently) don’t want to spend extra on running headless chromium
WTF, that’s what I already said!? That was my entire point from the start! You don’t need PoW to force headless usage; any JavaScript challenge will suffice. I even said the Meta Refresh challenge Anubis provides is sufficient and explicitly recommended it.


Well, in most cases it would be Python requests, not curl. But yes, forcing them to use a browser is the real cost, not just in CPU time but in programmer labor. PoW is overkill for that, though.


Anubis is that it has a graded tier system of how sketchy a client is and changing the kind of challenge based on a weighted priority system.
Last I checked, that was just User-Agent regexes and IP lists. But that’s where Anubis should continue development, and hopefully they’ve improved since. Discerning real users from bots is how you do proper bot management, not imposing a flat tax on all connections.
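If they lean into that, the scoring side doesn’t need to be complicated. Here’s a toy sketch of the idea; the signals, weights, and thresholds are invented for illustration and are not Anubis’s actual configuration:

```python
import re

# Hypothetical signals and weights, purely illustrative.
BAD_UA_PATTERNS = [r"python-requests", r"curl/", r"Go-http-client"]

def sketchiness(user_agent: str, ip_on_blocklist: bool, tls_ua_mismatch: bool) -> float:
    """Weight a few cheap signals into a single suspicion score."""
    score = 0.0
    if any(re.search(p, user_agent, re.I) for p in BAD_UA_PATTERNS):
        score += 0.5
    if ip_on_blocklist:
        score += 0.3
    if tls_ua_mismatch:  # TLS fingerprint doesn't match the claimed browser
        score += 0.4
    return score

def pick_challenge(score: float) -> str:
    if score < 0.3:
        return "none"          # looks like a normal browser: let it straight through
    if score < 0.7:
        return "meta-refresh"  # cheap cookie-and-reload check
    return "js-challenge"      # heaviest check reserved for the sketchiest clients

print(pick_challenge(sketchiness("python-requests/2.32", False, False)))  # meta-refresh
```

The point is that the challenge scales with how suspicious the client actually looks instead of taxing every visitor the same.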


Then there was a paper arguing that PoW can still work, as long as you scale the difficulty in such a way that a legit user
Telling a legit user from a fake user is the entire game; if you can do that, you just block the fake user. Professional bot blockers like Cloudflare or Akamai have machine learning systems that analyze trends in network traffic and serve JS challenges to suspicious clients. Last I checked, all Anubis uses is User-Agent filters, which is extremely behind the curve. Bots can already fake TLS fingerprints and match them to their User-Agents.


It’s like you didn’t understand anything I said. Anubis does work. I said it works. But it works because most AI crawlers don’t have a headless browser to solve the PoW. To operate efficiently at the high volume required, they use raw HTTP requests. The vast majority are probably using the basic Python requests module.
You don’t need PoW to throttle general access to your site, and that’s not the fundamental assumption of PoW anyway. PoW assumes (incorrectly) that bots won’t pay the extra FLOPs to scrape the website. But bots are paid to scrape the website; users aren’t. They’ll just scale horizontally and open more parallel connections. They have the money.


I’ve repeatedly stated this before: Proof-of-Work bot management is only proof-of-JavaScript bot management. It is nothing for a headless browser to bypass. Proof of JavaScript does work and will stop the vast majority of bot traffic; that’s how Anubis actually works. You don’t need to punish actual users by abusing their CPUs. PoW is a far higher cost on your actual users than on the bots.
Last I checked, Anubis has a JavaScript-less strategy called “Meta Refresh”. It first serves you a blank HTML page with a <meta> tag instructing the browser to refresh and load the real page. I highly advise using the Meta Refresh strategy; it should be the default.
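The whole pattern fits in a few lines. This is my own minimal reconstruction of the idea, not Anubis’s actual implementation: set a cookie, serve a page whose only job is to make the browser reload, and only return real content once the cookie comes back.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import secrets

ISSUED = set()  # tokens we've handed out; a real deployment would sign and expire these

INTERSTITIAL = """<!doctype html>
<html><head>
  <meta http-equiv="refresh" content="0">
  <title>Checking your browser...</title>
</head><body>Loading...</body></html>"""

class MetaRefreshGate(BaseHTTPRequestHandler):
    def do_GET(self):
        cookies = self.headers.get("Cookie", "")
        token = next((c.split("=", 1)[1] for c in cookies.split("; ")
                      if c.startswith("gate=")), None)
        if token in ISSUED:
            # Client honored the cookie and the refresh: serve the real page.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>Real content</h1>")
        else:
            # First contact: set a cookie and tell the browser to reload itself.
            token = secrets.token_hex(16)
            ISSUED.add(token)
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Set-Cookie", f"gate={token}; Path=/; HttpOnly")
            self.end_headers()
            self.wfile.write(INTERSTITIAL.encode())

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), MetaRefreshGate).serve_forever()
```

A plain python-requests scraper keeps the cookie if it uses a Session, but it never re-requests the page on its own because nothing parses the <meta> tag. The scraper author has to write custom handling for your site, which is exactly the cost you want to impose.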
I’m glad someone is finally making an open-source, self-hostable bot-management solution. And I don’t give a shit about the cat-girls, nor should you. But Techaro admitted they had little idea what they were doing when they started and went for the “nuclear option”. Fuck Proof of Work. It was a dead-on-arrival idea decades ago. Techaro should strip it from Anubis.
I haven’t caught up with what’s new in Anubis, but if they want stricter bot management, they should check for actual graphics acceleration.


To build on this, it would help to install some sort of system monitoring to check temps, fan speed, and system usage, and have those logging constantly so OP can check for any red flags during a freeze.
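Something like this, left running in a terminal, would give a timeline to look back over after the next freeze. Rough sketch: it needs the third-party psutil package, and which sensors show up depends on the motherboard and drivers:

```python
import time
import psutil  # third-party: pip install psutil

def snapshot() -> str:
    temps = psutil.sensors_temperatures()  # Linux-only; {} if no sensors are exposed
    fans = psutil.sensors_fans()           # Linux-only; {} if no fan sensors are exposed
    temp_str = ", ".join(f"{name} {entries[0].current:.0f}C"
                         for name, entries in temps.items() if entries)
    fan_str = ", ".join(f"{name} {entries[0].current}rpm"
                        for name, entries in fans.items() if entries)
    return (f"{time.strftime('%H:%M:%S')} "
            f"cpu={psutil.cpu_percent():.0f}% "
            f"mem={psutil.virtual_memory().percent:.0f}% "
            f"temps=[{temp_str}] fans=[{fan_str}]")

# Line-buffered append so each entry reaches the OS right away;
# the last lines written before a hard freeze are the ones worth reading.
with open("freeze-log.txt", "a", buffering=1) as log:
    while True:
        log.write(snapshot() + "\n")
        time.sleep(1)
```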


Idk about that. In my case I believe my CPU was defective from the start, and I lived with it because I always assumed the problem was my OS somehow.
If your CPU has seven years of not randomly freezing and it’s just now doing this, then I wouldn’t suspect the CPU.
However, unless you find some clues from journalctl -xeb1 or dmesg, I would assume it’s faulty hardware somewhere.
Last time for me it was a bad CPU. Lived with it until I upgraded my CPU and recycled the old one into a new build. Then that one was having the same issue.


Kraft American Singles are normal American cheese. All American cheese is “cheese product” because it’s solidified cheese sauce, not a cheese itself.