Sonarr and Radarr are popular self-hosted, browser-based applications that organize, rename, monitor, and download movies and TV shows for media libraries (often used in conjunction with Plex or Emby).
Because they are typically granted read/write permissions to manage files and hold credentials for public and private torrent sites/trackers, the commonly accepted practice is to password-protect them when port forwarding or reverse proxying for external access. This prevents strangers who stumble upon the application online from editing files or accessing private tracker credentials.
Unfortunately, Sonarr and Radarr do not enable authentication by default; users must enable it manually by specifying a username and password before their data is protected. While most users do this, many don’t.
In this post, I’m going to demonstrate how easy it is for anyone with internet access and basic scripting skills to scrape information from users who haven’t protected their instances of Sonarr and Radarr.
Note that I will not reveal any of the specific information or methods used to access these publicly accessible websites. This post is meant to serve as a warning and word of caution about the damage that can be done to users who choose not to secure their applications.
As of the time of publishing, I’ve already deleted the identifying information from my own records and have retained only the aggregated data shown below.
Method of Retrieval
As mentioned earlier, I will not specifically provide any information that will easily allow someone to replicate the process, but I think it’s worth spending a little time outlining my process to help readers understand just how easy it is to replicate.
To begin, a somewhat popular website exists (whose name I’ve chosen not to reveal) that allows security professionals to search for specific ports or terms and returns the public IP addresses with open ports matching those criteria. The site conveniently has an API to perform searches from the command line, so I was able to easily search “8989” and “Sonarr” for a list of IPs with accessible Sonarr instances, and “7878” and “Radarr” for a list of IPs with accessible Radarr instances.
Notably, if you only port forwarded either application rather than reverse proxying it, even changing the default port numbers didn’t allow your instance to escape my search — it was still picked up by the ‘sonarr’ and ‘radarr’ search terms.
After executing the query through the command line, I exported these results to a CSV file with a column of public IP addresses and a column of open ports. In total, the search tool returned 2,355 results for available Sonarr instances and 347 results for available Radarr instances. Cool.
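As a purely illustrative sketch of that export step — the search service, its endpoint, and its response shape are placeholders here, not the actual site or API used — turning a list of search matches into the IP/port CSV might look like this:

```python
import csv
import io

# Placeholder search result. The real service's response format is
# deliberately not shown; this only illustrates the CSV export step.
sample_matches = [
    {"ip": "203.0.113.10", "port": 8989},
    {"ip": "203.0.113.24", "port": 7878},
]

def matches_to_csv(matches):
    """Render the matches as CSV text with an ip/port header row."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["ip", "port"])
    for m in matches:
        writer.writerow([m["ip"], m["port"]])
    return buf.getvalue()

print(matches_to_csv(sample_matches))
```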
Now that I had a list of IP addresses and ports, I needed to determine which ones actually had open instances of Sonarr and which were either locked or unrelated. To do this, I wrote a Python script that loaded each page and scraped the page’s source for an API key, which, by default, is exposed in the page source when authentication isn’t enabled.
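The scraping itself reduces to a pattern match over the fetched page source. This sketch (the regex and the sample snippet are my own assumptions about how the key is embedded, not the author’s exact script) is also a quick way to check whether your own instance leaks its key:

```python
import re

# Unauthenticated Sonarr/Radarr pages have historically embedded the API key
# (a 32-character hex string) directly in the page source.
API_KEY_RE = re.compile(r"apiKey['\"]?\s*[:=]\s*['\"]([0-9a-f]{32})['\"]")

def find_api_key(page_source: str):
    """Return the first embedded API key found in the page source, or None."""
    match = API_KEY_RE.search(page_source)
    return match.group(1) if match else None

# Hypothetical page-source fragment; fetch your OWN instance's page and pass
# it here -- if a key comes back, the instance is exposed.
sample = "window.Sonarr = { apiKey: '0123456789abcdef0123456789abcdef' };"
print(find_api_key(sample))
```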
After running the script, I had a list of public IP addresses and ports with open instances of Sonarr or Radarr, along with their API keys, which can be used to retrieve almost anything from the application once you’ve familiarized yourself with its documentation.
Using this information, I wrote another quick Python script to scrape through each site using the API key to obtain a list of indexers on each site and the information needed to access their queries/links.
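That second script boils down to one authenticated API call per host. A minimal sketch, assuming a Sonarr-style REST API (the `/api/v3/indexer` path and `X-Api-Key` header are how current Sonarr versions expose this; the exact path may differ by version) pointed at your own instance:

```python
import json
import urllib.request

def list_indexers(base_url: str, api_key: str):
    """Fetch the configured indexers from a Sonarr instance (v3 API assumed)."""
    req = urllib.request.Request(
        f"{base_url}/api/v3/indexer",
        headers={"X-Api-Key": api_key},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

def indexer_names(indexers):
    """Pull just the display names out of the API response."""
    return sorted(i["name"] for i in indexers)

# Hypothetical response, trimmed to the one field we read:
sample = [{"name": "NZB Geek"}, {"name": "TorrentLeech"}]
print(indexer_names(sample))
```

Run against a live instance it would be `list_indexers("http://localhost:8989", "your-key")`; the point is that a valid key is all the endpoint asks for.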
What I Found
According to each application’s documentation, there’s a lot of destruction one can cause with access to the application and an API key – file deletion, searches, bulk queries, downloads, etc. I wasn’t personally interested in doing any of this – I only retrieved indexer information because I think it’s effective data to aggregate and display as an example of what can be retrieved with the right know-how.
Below is a table of the indexers – public and private – that I found through my Sonarr search alone. Some were direct links while others were references to Jackett instances. The capitalization/spelling of some is goofy because they’re the actual titles people entered when populating the indexer in Sonarr.
After scraping each IP address and port, I had enough information about each indexer to enter it into my own instances of Sonarr and Radarr and perform queries and downloads through it.
While most of the trackers are public and easily accessible, there were at least two exclusive trackers that anyone could theoretically gain access to through this exploit. There are obvious negative implications to someone having access to private tracker information — most of which would lead to a ban or suspension for misuse.
| | | | |
|---|---|---|---|
| Avistaz – Torznab | FreedomHD | Ninja Central | PrivateHD |
| Batwoman | Generation Free | NZB finder | RU Tracker |
| Beta Nzb | gingadaddy | NZB Geek | sabnzb |
| Beyond HD | girotorrent | NZB Hydra | Scenetime |
| bigFANgroup | gK TORRENT | NZB IS | Sharewood |
| BitMeTV | HD Torrents | NZB Tortuga | ShowRSS |
| Black Lightning | Horrible Subs | nzbindex.in | SimplyNZBs |
| c pas bien | il corsario red | nzbplanet.net | Superbits |
| chilebt | il corsaro blu | nzbs.chica.be | Supergirl |
| Concen | Il Corsaro Nero | nzbs.in | Tabula Rasa |
| DB RSS TEST | IsoHunt | NZB-SA | TorrentLeech |
As I was working on this article, I tried to think about the best solution to this problem. The easiest seems to be forcing users to create a username and password at installation (a default username/password combo would have been trivial for my script to bypass), but I’m not sure the Sonarr and Radarr developers have enough incentive to care about what happens to users who don’t bother to secure their applications.
The developers could also allow users to change the site title of their application so queries for “sonarr” and “radarr” wouldn’t yield their sites as results.
You could also choose not to open Sonarr and Radarr externally and access them via VPN when you’re not on your home network.
But I digress. For now, I’m hoping this article will serve as a warning to those who haven’t enabled authentication, and to future users who may not be aware of the dangers of leaving it disabled.
Please do not contact me for any additional details about the methods used to obtain application data for this article.