Thanks for the reminder to grab the subs for this; sonarr downloaded the ollonborre release a while back and I couldn't find subs.
Looks like there’s now a season pack on 1337x from deadorbit with muxed eng subs, and all in DD5.1 too.
I'm pretty sure this will be a lot of work. Even assuming you didn't rename any files and all the torrents are in a single folder, you'd still need to re-search for every torrent, add each one to a torrent client pointed at the existing files, and have the client check that the data is already downloaded. I'm not sure every torrent client handles this well, so I'd advise testing it first - when re-adding a torrent over existing files I usually add it paused and then force a re-check to be safe.
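If you want to script that step, this is roughly what it looks like against qBittorrent's web UI API - a sketch using the qbittorrent-api package, with host, login, and paths as placeholders (other clients need a different approach):

```python
# Sketch: re-add a .torrent paused, point it at the folder that already holds the data,
# then force a re-check so nothing gets re-downloaded. Assumes qBittorrent's WebUI is
# enabled and the qbittorrent-api package is installed; host/login/paths are placeholders.
import qbittorrentapi

client = qbittorrentapi.Client(host="localhost:8080", username="admin", password="adminadmin")
client.auth_log_in()

with open("/path/to/readded.torrent", "rb") as f:
    client.torrents_add(
        torrent_files=f,
        save_path="/mnt/media/downloads",  # folder that already contains the payload
        is_paused=True,                    # add paused so nothing starts downloading
    )

# Force a re-check on everything paused; only resume torrents once they verify at 100%.
for torrent in client.torrents_info(status_filter="paused"):
    client.torrents_recheck(torrent_hashes=torrent.hash)
```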
In theory it should be possible to make a program that does this - automatically index and crawl torrent sites and DHT, fetch torrent metainfo, then scan your HD for matching files to seed. This is a proposed feature for the DHT crawler bitmagnet, so I'm hopeful that one day there will be a program that can do it.
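Just to illustrate the matching part of that idea, here's a toy sketch that reads a torrent's metainfo and lists local files whose sizes match its payload (assumes the bencodepy package and placeholder paths; a real tool like the proposed bitmagnet feature would verify piece hashes too, since size matching only finds candidates):

```python
# Toy sketch of "scan my disk for files I could seed": parse a .torrent's metainfo and
# report local files whose sizes match. Size matching only finds candidates - a real tool
# would verify piece hashes before seeding. Assumes the bencodepy package; paths are placeholders.
import bencodepy
from pathlib import Path

def torrent_file_sizes(torrent_path: Path) -> list[tuple[str, int]]:
    info = bencodepy.decode(torrent_path.read_bytes())[b"info"]
    if b"files" in info:  # multi-file torrent
        return [(b"/".join(f[b"path"]).decode(), f[b"length"]) for f in info[b"files"]]
    return [(info[b"name"].decode(), info[b"length"])]  # single-file torrent

def candidates(torrent_path: Path, scan_root: Path) -> dict[str, list[Path]]:
    by_size: dict[int, list[Path]] = {}
    for p in scan_root.rglob("*"):  # index every local file by size
        if p.is_file():
            by_size.setdefault(p.stat().st_size, []).append(p)
    return {name: by_size.get(size, []) for name, size in torrent_file_sizes(torrent_path)}

print(candidates(Path("some.torrent"), Path("/mnt/media")))
```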
Keep in mind that if you set up raid using zfs or btrfs (idk how it works with other systems but that's what I've used) then you also get scrubs, which detect and fix bit rot and unrecoverable read errors. Without that or a similar system, those errors will go undetected and your backup system will back up the corrupted files as well.
Personally, one of the main reasons I used zfs and now btrfs with redundancy is to protect irreplaceable files (family memories and stuff) from those kinds of errors. I used to just keep stuff on a hard drive until I discovered that loads of my irreplaceable vacation photos were corrupted, including the backups, which had faithfully backed up the corruption.
If your files can be reacquired, then I don't think it's a big deal. But if they can't, then I think scrubs or integrity checks with redundancy (so issues can be repaired), plus backups with snapshots (so errors or mistakes can't mess up your backups), are a necessity. It just depends on how much you value your files.
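If you can't run real scrubs, even a dumb checksum manifest you re-verify every so often will at least tell you when something has silently rotted before it poisons your backups - rough sketch below (paths are placeholders, and it only detects corruption, it can't repair anything without a good copy):

```python
# "Poor man's scrub": build a SHA-256 manifest of a folder once, then re-run in verify
# mode to spot silent corruption. Detection only - restoring still needs a good backup.
import hashlib, json, sys
from pathlib import Path

MANIFEST = Path("checksums.json")  # placeholder location

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build(root: Path) -> None:
    manifest = {str(p): sha256(p) for p in root.rglob("*") if p.is_file()}
    MANIFEST.write_text(json.dumps(manifest, indent=2))

def verify() -> None:
    for name, digest in json.loads(MANIFEST.read_text()).items():
        p = Path(name)
        if not p.exists():
            print(f"MISSING  {name}")
        elif sha256(p) != digest:
            print(f"CORRUPT  {name}")  # restore this one from a known-good backup

if __name__ == "__main__":
    mode = sys.argv[1]  # usage: script.py build /some/folder | script.py verify
    build(Path(sys.argv[2])) if mode == "build" else verify()
```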
I see, but couldn't they just sign up for a provider and then hook up their bots to the same search that you use? Or is the search obfuscated for you too? In other words, how do they obfuscate it for the bots but not for the customers? That's what I never really understood. If the answer is just that the people running the bots are too lazy to hook them up through the same unobfuscated search that paying customers use, then that makes sense, but I always assumed there was more of a barrier, since Usenet seems to have evaded legal action since forever.
What stops Usenet from being attacked legally in the same way - aren't they straight up hosting copyrighted content? I've always stuck to torrents because it seemed more decentralized, especially if you use DHT instead of an indexer.
In addition to the possibility of reinforcement bars, it could also have to do with the thickness of the tactile pad, or it could have recessed spots underneath (to save on material if it was injection molded) that create an insulating air gap.
ensuring greater security and privacy for users
Don't worry guys, they're just concerned for the users' security and privacy
My primary use case is safeguarding my important personal artifacts (family photos, digitized paperwork, encryption key / account recovery / 2FA backups - about 2TB in total) against drive failure, followed by my decently sized Plex server (23TB), immich, nextcloud, and various other small things like selfhosted bitwarden, grocy, ollama, and stuff like that.
I run all of my stuff off of a 6-bay Synology (more drives help with capacity efficiency - double redundancy with 6 drives only costs you about a third of raw capacity - and I wanted to be protected against drive failures during rebuilds), with an Intel nuc on top to handle plex/jellyfin transcoding using quicksync instead of loading the poor nas with cpu transcoding. I also run ollama on the nuc since it has faster cores than the nas.
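To put rough numbers on the capacity-efficiency point (just the raw parity arithmetic - it ignores filesystem overhead and mixed drive sizes):

```python
# Raw-capacity cost of double redundancy (RAID6/SHR-2 style: two drives' worth of parity)
# for a few array sizes. Ignores filesystem overhead and mixed drive sizes.
parity = 2
for total_drives in (4, 6, 8):
    overhead = parity / total_drives
    print(f"{total_drives} drives: {overhead:.0%} lost to redundancy, {1 - overhead:.0%} usable")
# 4 drives: 50% lost to redundancy, 50% usable
# 6 drives: 33% lost to redundancy, 67% usable
# 8 drives: 25% lost to redundancy, 75% usable
```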
Lmao the first thing that came to mind was the “is there anyone else you forgot to ask” meme with apple in between the user and app developer.
I also canceled my subscription because it's been months and half of the posts with comments still show no comments, forcing me to open every post in Firefox just to read them. Too many bugs that leave the app unusable seem to be getting ignored.
First I'll say: if you aren't able to boot a windows installer off of a flash drive (and nothing's wrong with your flash drive, you created it without errors, and there isn't a bios setting like disabled USB boot preventing you from booting it), then it could be a hardware issue that Linux won't fix (it's not clear how you tried to reinstall windows). But if you're able to get to the windows logo or the safe mode menu, then it sounds like it "POSTs" (gets past the bios screen) and windows should be reinstallable. In short, if you want to stick with windows, I think it's unlikely to be a problem that only Linux can fix.
That said, if you are otherwise interested in trying Linux and create a USB installer, most will let you exit the installer or boot into a temporary "live desktop" where nothing is saved, so it's a good opportunity to try out how that distro feels to use - just don't save anything important. And if you do end up installing either OS, you can use that "live mode" to run programs that back up any files from your main hard drive to another plugged-in drive.
Oh nice where was it? Might help anyone else having the same problem.
That sounds like an ombi issue. As I understand it, the minimum availability is only a per-movie setting and can't be set globally, so you'll need to figure out why ombi isn't setting it correctly when adding to radarr. Unfortunately I can't offer any tips for ombi because I use overseerr.
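If it turns out you just need to clean up entries ombi already added, a one-off script against radarr's v3 API can bulk-fix the setting - a hedged sketch, with the URL, API key, and the "released" value as placeholders (check your instance's API docs and try it on a single movie first):

```python
# Hedged one-off sketch: bulk-set minimumAvailability on existing radarr movies via the
# v3 API, e.g. to fix entries ombi added with the wrong value. URL, API key, and the
# target value are placeholders - verify field names against your radarr's API docs first.
import requests

RADARR = "http://localhost:7878"
HEADERS = {"X-Api-Key": "YOUR_API_KEY"}

movies = requests.get(f"{RADARR}/api/v3/movie", headers=HEADERS, timeout=30).json()
for movie in movies:
    if movie.get("minimumAvailability") != "released":
        movie["minimumAvailability"] = "released"
        resp = requests.put(f"{RADARR}/api/v3/movie/{movie['id']}", json=movie,
                            headers=HEADERS, timeout=30)
        resp.raise_for_status()
        print(f"updated {movie['title']}")
```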
At least with radicle all the forks will still exist even if the authoritative copy is taken down. And even then I think because radicle is like BitTorrent, anybody who pinned the main repo would still be seeding it so it would be very hard to scrub it completely. The main challenge in using radicle is getting an active contributor with some reputation to maintain their copy on there. Otherwise there’s no momentum and nobody will pin the countless mirrors published by randos.
Hah, I wish we could ignore them. It seems to just vary from ISP to ISP in the US, but our small-town ISP turns off your connection and puts you behind a captive portal, forcing you to click through and accept what you did wrong before your connection is turned back on.
Our ISP sends 3 strike letters :(
I've done a backup swap with friends a couple times. Security wasn't much of a worry since we connected to each other's boxes over ssh or wireguard or similar and used tools that supported encryption. The biggest challenge was that in my selfhosting friend group we all prefer different protocols, so we had to figure out what each of us wanted to use to connect and access filesystems, and set that up. The second challenge was ensuring uptime and keeping the remote access we set up for each other working - that's what killed the project, as we all eventually stopped maintaining it and nobody seemed to care. So if I were to do it again, I would make sure all participants have alerts monitoring their shared endpoint.
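For the alerting part, it doesn't need to be fancier than a cron'd reachability check that nags you when a peer's endpoint goes dark - rough sketch, with hostnames, ports, and the mail setup as placeholders (note it's a TCP check, so a UDP-only wireguard port would need a different probe, like a ping across the tunnel):

```python
# Dumb-but-effective uptime check for backup partners' endpoints (ssh/sftp/etc).
# Run from cron on each side. Hostnames, ports, and the alert mechanism are placeholders;
# this does TCP connects, so UDP-only services (e.g. a bare wireguard port) need a different probe.
import socket, smtplib, sys
from email.message import EmailMessage

PEERS = {"friend-nas.example.com": 22, "other-friend.example.net": 2222}  # placeholders

def reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

down = [f"{host}:{port}" for host, port in PEERS.items() if not reachable(host, port)]
if down:
    msg = EmailMessage()
    msg["Subject"] = "backup peer unreachable: " + ", ".join(down)
    msg["From"] = "alerts@example.com"
    msg["To"] = "me@example.com"
    msg.set_content("Check the backup swap endpoints:\n" + "\n".join(down))
    with smtplib.SMTP("localhost") as s:  # assumes a local MTA; swap for whatever alerting you use
        s.send_message(msg)
    sys.exit(1)
```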
I tried the .ps one and it worked for me
You can achieve a similar thing using vlans - they're usually isolated by default, but you can add specific rules that allow traffic between vlans if it meets certain criteria (specific ports, specific types of traffic, traffic to or from specific hosts, or any combination of those). So yeah, you can think of client isolation as being like having each client on their own vlan - except without needing a different subnet for each client.
So I realized that the season pack from deadorbit uses subs from opensubtitles, which seem to be missing on-screen text translations and title cards, so I remuxed in the subs OP linked - which appear to have been edited to include title cards and on-screen text - to replace the ones in the deadorbit pack. Here's a base64-ed link to a paste with the magnet for that if anyone wants it, since I already went through the work for my own collection - it's 1080p web-dl x264 8-bit AC3 5.1
Edit: since torrents can be slow to start, here's a direct download link on mega. If you download that way, consider grabbing the torrent from the first link and pointing your torrent client at the downloaded files. And of course feel free to repost and reshare everything:
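Separately, for anyone who'd rather redo that kind of remux themselves (say, with different subs), here's a rough sketch of one way to do it per episode with ffmpeg wrapped in Python - not necessarily how this pack was made, and the folder names / one-srt-per-episode layout are assumptions:

```python
# Rough sketch: keep video/audio from each original episode, drop its subtitle tracks,
# and mux in an external .srt instead. Assumes one .srt per episode with the same stem;
# folder names are placeholders. No re-encode happens, it's a pure remux.
import subprocess
from pathlib import Path

episodes = Path("deadorbit_pack")   # original .mkv files
subs = Path("fixed_subs")           # matching .srt files (same stem as each episode)
out = Path("remuxed"); out.mkdir(exist_ok=True)

for mkv in sorted(episodes.glob("*.mkv")):
    srt = subs / (mkv.stem + ".srt")
    subprocess.run([
        "ffmpeg", "-i", str(mkv), "-i", str(srt),
        "-map", "0:v", "-map", "0:a",   # keep original video + audio untouched
        "-map", "1:0",                  # take subtitles only from the external .srt
        "-c", "copy",                   # stream copy, no re-encoding
        "-metadata:s:s:0", "language=eng",
        str(out / mkv.name),
    ], check=True)
```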