Sonarr and Radarr keep grabbing releases from a couple of specific groups (‘SuccessfulCrab’ and ‘ELiTE’) for items that clearly haven’t even aired yet. These almost always contain only .scr or .lnk files, which are blocklisted in my torrent client. This leaves Sonarr/Radarr awaiting manual intervention for ‘complete’ downloads that contain no files.
How do I get them to block anything and everything that contains the strings ‘SuccessfulCrab’ and ‘ELiTE’? I want them to stop even trying to grab anything released by those two groups.
I’m so sick of dealing with these.
[EDIT]
OK, so I have been looking at this from the wrong angle.
It is not these groups that I’m upset with, but malware uploaders masquerading as release groups. These names can and will change, making this a game of whack-a-mole if I try to fight it this way.
Initially I’d followed the instructions below to block these names/strings from being grabbed by the arrs, and that works fantastically; but as above, whack-a-mole. Plus both SuccessfulCrab and ELiTE have plenty of good releases out there, and it’s not their fault someone’s using their names.
So, I’m now running Cleanuparr.
This maintains a large list of unwanted filetypes in qBittorrent. Then, when qBit marks a torrent as complete because there are no wanted files, Cleanuparr removes it from both qBit and the arr that requested it, while also triggering a new search for the item.
It can also clean up items failing to import, stalled downloads, torrents stuck downloading metadata, or things that are just absurdly slow, with varying time scales/stringency.
I’ll run this for a bit and see how it goes.
I recommend just blocking those file types from being downloaded in your torrent client. It’ll still end up grabbing the torrents, but it won’t download the files and will just go straight to seeding at 0% downloaded. Easy to notice and delete before it poisons Sonarr/Radarr.
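If you’d rather script that than click through the qBittorrent options, something like this should do it; a rough sketch using the qbittorrent-api package, with the host/credentials as placeholders and the preference keys written from memory, so check them against your qBittorrent version:

```python
# Sketch: tell qBittorrent to never download .scr/.lnk (and similar) payloads.
# Requires the qbittorrent-api package and a qBittorrent version that has the
# "Excluded file names" option. Host and credentials below are placeholders.
import qbittorrentapi

client = qbittorrentapi.Client(host="localhost", port=8080,
                               username="admin", password="adminadmin")
client.auth_log_in()

client.app_set_preferences(prefs={
    "excluded_file_names_enabled": True,
    # Same syntax as the WebUI field: one glob pattern per line
    "excluded_file_names": "*.scr\n*.lnk\n*.exe\n*.bat\n*.cmd",
})
```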
I made this exact post on GitHub lol.
The takeaway is this: Sonarr/Radarr v5 will solve this by setting the air time and restricting torrent grabs before then. There may also be a manual offset override (in hours).
I still think the easier solution is to set the indexer to manually choose what qualifies as a dangerous file type.
That will solve part of the problem, preventing downloads before an item has even released; but there’s still lots of potential to grab unwanted torrents and leave the arrs asking for intervention when they can’t import it.
Ideally the indexers would be filtering out this junk before users can even grab it, but failing that I think we’ve got a decent solution. Check out the edited OP.
SuccessfulCrab is a legitimate scene group and ELiTE appears to be some sort of P2P x265-1080p transcode bot/group (their releases on IPT/TL look fine and go back quite a ways). I’d stop using whatever you’re indexing from that’s either serving you malware or failing to regulate the malware in its users’ uploads. The real problem is that someone is mimicking these groups and putting out fake releases, so playing whack-a-mole with the fake tags that person is using only treats the symptoms, and they can easily change the tag again.
A lot of people don’t like scene releases. They’re usually of lower quality compared to the best of what’s available. If you use the Scene custom format from the TRaSH guides, it filters out the LQ scene groups like SuccessfulCrab.
SuccessfulCrab only does WEB-DLs so “subjective quality” isn’t as much of an issue as it would be with the encoding groups, but yeah I agree that scene is usually best avoided if you have access to reliable P2P sources. Quality > speed for me any day.
Check out Cleanuparr! I started using it after seeing it recommended somewhere on Lemmy and haven’t seen these annoying downloads since. It also deals with stalled/slow/stuck-downloading-metadata torrents.
I’m taking a look at this. It looks like the malware blocker portion is what I’m interested in, but if I enable it and ‘delete known malware’, it just complains every minute that there are no blocklists enabled (though the documentation says it’s supposed to fetch one from a pages.dev URL that has almost no content).
Do you have a specific malware blocklist configured? Enabling the service-specific blocklists requires a URL for one.
I can host/build a list over time for these to use if that’s what I’ve gotta do; just wondering if there’s a public collaboration on one already on the go.
/edit: found it
https://raw.githubusercontent.com/Cleanuparr/Cleanuparr/refs/heads/main/blacklist
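If anyone wants to sanity-check what’s actually in that list before pointing Cleanuparr at it, a quick fetch works (nothing Cleanuparr-specific here, just pulling the raw file):

```python
# Fetch the Cleanuparr blacklist linked above and show how many entries it has.
import requests

url = ("https://raw.githubusercontent.com/Cleanuparr/Cleanuparr/"
       "refs/heads/main/blacklist")
entries = [ln.strip() for ln in requests.get(url, timeout=30).text.splitlines()
           if ln.strip()]
print(f"{len(entries)} entries; first few:", entries[:5])
```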
Same here 🙌
The TRaSH Guides should block those. They have SuccessfulCrab in one of the Scene formats. IDK about ELiTE, but you could add them into the format as well to filter them. I followed the guides and set up a single Sonarr/Radarr to fetch both anime and normal shows/movies, and I don’t have any problems with getting bad quality stuff. I don’t think it’s pulled something I didn’t like anytime in the last couple of years since I turned it on.
https://trash-guides.info/Sonarr/sonarr-collection-of-custom-formats/#scene
The .scr and .lnk files are more an issue with the trackers you’re using than with those release groups, but to answer your question: create a custom format that looks for those groups in the release group field and then score it something like -10000 in your quality profile (rough sketch below).
Sonarr also has a new setting to fail dangerous downloads like those so they won’t stay in your activity queue; it’s in the indexer settings.
You can also approach this by blocking file types at the download client.
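For the custom format route, the JSON you’d import looks roughly like this; a sketch only, with the field layout mirroring typical TRaSH-style exports, so compare against an export from your own instance before trusting it:

```python
# Sketch: build a custom format JSON that matches the two release groups, for
# import via Settings > Custom Formats > Import. Field names mirror typical
# custom format exports; verify against an export from your own instance.
import json

custom_format = {
    "name": "Blocked fake groups",
    "includeCustomFormatWhenRenaming": False,
    "specifications": [
        {
            "name": group,
            "implementation": "ReleaseGroupSpecification",
            "negate": False,
            "required": False,  # any single spec matching is enough
            "fields": {"value": group},
        }
        for group in ("SuccessfulCrab", "ELiTE")
    ],
}

with open("blocked-fake-groups.json", "w") as f:
    json.dump(custom_format, f, indent=2)
# After importing, give this format a score of around -10000 in each quality profile.
```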
That’s what I’d already done as per the OP, but it leaves Sonarr/Radarr wanting manual intervention for the ‘complete’ download that doesn’t have any files to import.
Oh, I missed that. My bad.
This comment prompted me to look a little deeper at this. I looked at the history for each show where I’ve had failed downloads from those groups.
For SuccessfulCrab: any time a release has come from a torrent tracker (I only have free public torrent trackers) it’s been garbage. I have, however, had a number of perfectly fine downloads with that group label whenever they were retrieved from NZBgeek. I’ve narrowed that filter to block the string ‘SuccessfulCrab’ on all torrent trackers, but allow NZBs. Perhaps there’s an impersonator trying to smear them or something, idk.
ELiTE on the other hand, I’ve only got history of grabbing their torrents and every one of them was trash. That’s going to stay blocked everywhere.
The block-potentially-dangerous setting is interesting, but what exactly is it looking for? The torrent client is already set to not download file types I don’t want, so will it recognize and remove torrents that are empty (everything’s marked ‘do not download’)? I’m having a hard time finding documentation for that.
The fail-dangerous setting doesn’t work if you block the files in the torrent client, so you’d have to actually download them; Sonarr would then mark the download as failed and wouldn’t be stopped from grabbing new items for that show. Neither option is perfect.
See if you’re getting these bad torrents more from one particular tracker, and then reconsider using that tracker.
The best option is trying to get into some private trackers; it’s pretty easy to find open signups on Lemmy or the dreaded Reddit.
I think in quality profiles you can set up block words. I did that to block Dolby Vision “DoVi” releases, as I couldn’t get them to play reliably.
Ok, I think I’ve got this right?
Settings > Profiles > Release Profiles.
Created one, set up ‘must not contain’ words, indexer ‘any’, enabled.
That should just apply globally? I’m not seeing anywhere else I’ve got to enable it in specific series, clients, or indexers.
Yes.
Keep in mind it doesn’t apply to file extensions. I forget if there’s a feature request outstanding or if it was rejected.
Awesome. Thanks, you two, I appreciate the help. :)
I think if you don’t assign a tag on the Release Profile, it applies to all series.
I think that’s right. Maybe the release profile needs to be applied to the quality profile? I’ll see if I can open my configs up and find where it applies.
Thank you, you three, setting this up right away.
Check out the edited OP.
Thanks for the warning, I’ll do the same then. I was left wondering as I made the rules: why would they keep using the same release name, making it so easy to find? This makes more sense. :)
This is how I do it and it’s very successful. I have a group blocklist and a terms blocklist. Just a nice baseline to have before you set up your regular quality profiles.
Hm, I thought there was a way to do this with tags, but this is probably it.
Take a look at Profilarr. You can set fine custom conditions to block anything you want.
Honestly, we should just start a list we maintain, and then ask Sonarr/Radarr to offer a feature to pull from a URL of our choosing periodically. That way we can curate the blocklists as a collective rather than this manual bullshit.
Check out Trash Guides.
I am familiar, thanks though
This kind of functionality belongs in Sonarr/Radarr imo, and not in separate service(s).
The functionality is in the apps. That’s why TRaSH guides work.
You’re not supposed to just use Sonarr/Radarr without customizing your filters first. It’s also not too hard to make your own, if you don’t want to use the TRaSH formats.
I appreciate the advice, but I’m not referring solely to customizing the filtering; of course the apps can do that, and almost every comment in this thread is about it. The feature I’m raising is syncing community-maintained text blobs. TRaSH tells us exactly how to sync the guide with Sonarr/Radarr, but not without additional software or manual effort.
TRaSH may offer excellent filters, but they are often incomplete and do not promote community involvement. A URL pointed at a single source of truth inherently would; this is a popular approach with blocklists all over the web. If a user who wants to modify the list has to submit to a single moderated source, everyone benefits. But currently, as a user, you either set up syncing orchestration or you manually copy TRaSH, neither of which lends itself to keeping the community up to date.
If you are aware of a way to have Sonarr/Radarr pull directly from a single source of truth and update itself, that would be great though.
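For what it’s worth, you can approximate that today with a small cron job that pulls a list from a URL and rewrites a release profile through the API. A rough sketch, assuming a plain-text list with one term per line; the blocklist URL, Sonarr URL, API key, profile name, and the exact release profile field names are all placeholders/assumptions (do a GET on /api/v3/releaseprofile from your own instance first to confirm the shape):

```python
# Sketch: sync a community-maintained blocklist (one term per line) into an
# existing Sonarr release profile's "must not contain" list. URL, API key and
# profile name are placeholders; field names are assumptions, check a GET first.
import requests

BLOCKLIST_URL = "https://example.com/blocked-terms.txt"  # placeholder
SONARR = "http://localhost:8989"                         # placeholder
API_KEY = "your-api-key"                                 # placeholder
PROFILE_NAME = "Community blocklist"                     # an existing release profile

headers = {"X-Api-Key": API_KEY}

# Skip blank lines and comments in the fetched list
terms = [ln.strip() for ln in
         requests.get(BLOCKLIST_URL, timeout=30).text.splitlines()
         if ln.strip() and not ln.lstrip().startswith("#")]

# Find the release profile by name and replace its ignored ("must not contain") terms
profiles = requests.get(f"{SONARR}/api/v3/releaseprofile",
                        headers=headers, timeout=30).json()
profile = next(p for p in profiles if p.get("name") == PROFILE_NAME)
profile["ignored"] = terms

requests.put(f"{SONARR}/api/v3/releaseprofile/{profile['id']}",
             headers=headers, json=profile, timeout=30).raise_for_status()
```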
Ah, I see what you mean now. Something like that would be nice, but I don’t think it would work for a lot of people. I know I’m pretty particular about the stuff I download, and I doubt a community-maintained list would tick the boxes I want. Especially since a lot of people seem to use separate instances of Sonarr for shows and anime, while I combine them into a single one and then assign formats and libraries accordingly using Jellyseerr so that my family can request media without effort.
I fucking hate SuccessfulCrab so much
How come?
I’ve never had a download tagged as SuccessfulCrab that wasn’t malware. I don’t know enough about them to know if that’s their fault or the indexers’.
It’s the indexers. I’ve had many day 1 releases from them that were perfectly fine.
I just did some digging and found I do have some good quality content from them, but they were all grabbed via NZBGeek.
Every torrent I’ve gotten with that label has been garbage/malware.
The one release group I always struggle with Sonarr over is MeM.GP; they re-release other people’s content as dual-language ITA/ENG with Italian forced as the default. Even blocking them as a release group in a custom format messes up, because Sonarr doesn’t parse the release group correctly with how they name their releases, so it’ll still snatch it and then complain on import. Arseholes.
You can do a custom format against the release title.
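Something along these lines as the specification, if the group name is literally in the title; same caveat as the custom format sketch earlier in the thread, ‘ReleaseTitleSpecification’ and the regex are my assumptions here:

```python
# Sketch: a custom format specification matching the group name in the release
# title itself, for cases where the group field isn't parsed correctly.
title_spec = {
    "name": "MeM.GP in release title",
    "implementation": "ReleaseTitleSpecification",
    "negate": False,
    "required": False,
    "fields": {"value": r"\bMeM\.GP\b"},
}
```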
I have Recyclarr set up to update things dynamically, and one thing it does is keep a current bad-release-group list.