Selling [SCRIPT] Reupload System – Multi-Host Video Distribution Service (Mega, Byse, LuluStream + many more hosts)

Mizore95

Reupload System – Multi-Host Video Distribution Service

SCRIPT price:
$250
[ ===== BUY NOW ===== ]

Upload once—our system handles the rest. The platform pushes your video to every configured host, then keeps a 24/7 watch: if an embed goes down on Doodstream, Filemoon, Streamtape or any other adapter, the system auto-re-uploads the file within minutes. Your original file stays safe in your own S3 bucket while you get a single, permanent URL that always points to the fastest working mirror. Zero manual work, zero dead links.






Main Features
  • Remote Upload – Push videos to multiple hosts simultaneously through their APIs
  • Private S3 Storage – Each user connects their own bucket for secure, redundant backups
  • Smart Failure Recovery – Detects broken embeds and re-uploads automatically
  • Folder & Sub-folder Organization – Keep large libraries neatly categorized
  • Unified Mirror Links – One short URL displays every active mirror for each video
  • Asynchronous Queue Engine – BullMQ + Redis handle thousands of jobs without blocking
  • Custom MultiEmbed Domain – Brand your player with your own domain
  • Embed Parameter Control – Tweak Filemoon and other hosts' player settings at will
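The remote-upload feature above boils down to fanning one file out to every configured host at the same time. A minimal sketch of that idea in Node.js — the adapter objects, URLs, and the `uploadToAll` helper are all illustrative assumptions, not the script's actual code:

```javascript
// Illustrative only: fan one video out to several host adapters in
// parallel. Adapter names/URLs are made up for this sketch.

// Each adapter exposes the same minimal interface: upload a file,
// return the embed URL the host assigned to it.
const adapters = [
  { name: 'doodstream', upload: async (file) => `https://dood.example/e/${file.id}-dd` },
  { name: 'streamtape', upload: async (file) => `https://tape.example/e/${file.id}-st` },
  { name: 'lulustream', upload: async (file) => `https://lulu.example/e/${file.id}-ls` },
];

// Push the file to every adapter at once. A failure on one host must
// not block the others, hence Promise.allSettled rather than all.
async function uploadToAll(adapters, file) {
  const results = await Promise.allSettled(
    adapters.map((a) => a.upload(file).then((url) => ({ host: a.name, url })))
  );
  return {
    mirrors: results.filter((r) => r.status === 'fulfilled').map((r) => r.value),
    failed: results.filter((r) => r.status === 'rejected').length,
  };
}
```

In the real system each adapter would wrap the host's actual upload API and failed jobs would be retried through the BullMQ queue; here the adapters just echo fake embed URLs so the fan-out shape is visible.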

Advanced Customization
  • White-label MultiEmbed Domain – Replace our domain with yours for a fully branded experience (requires a simple PHP MiniScript on your end)
  • Priority Queues – Rush critical uploads ahead of the regular queue

Supported Hosts
  • Doodstream
  • Byse
  • Lulustream
  • Savefiles
  • Streamtape
  • Streamwish
  • Vidhide
  • Voe
  • Vtube
  • Mega.nz

Ideal for
  • Streaming sites that need redundancy and fast load times worldwide
  • Course platforms requiring stable video delivery at scale
  • Content creators who want to stop worrying about dead links

Start today and turn video distribution into a background task that just works.

PS: This version is not SaaS.

* Support for ZIP and RAR files: archives are uploaded only to the file hosts.

* New embed added for the file hosts.

Offer available until December 26th.

$200

* Support for ZIP and RAR files: Users can now upload ZIP and RAR files. These files will only be uploaded to hosts that support the archive file type.
* Support for multiple accounts per host/adapter: It is possible to add multiple accounts per adapter/host, offering greater flexibility and management options.
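One plausible reading of "multiple accounts per adapter" is simple rotation: each new upload for a host picks the next configured account. A hedged sketch of that policy — the account shape and round-robin choice are assumptions, not necessarily how the script manages accounts:

```javascript
// Hypothetical per-host account rotation. The real script's account
// model may differ; this only illustrates the idea of spreading
// uploads across several accounts on the same host.
function makeAccountPicker(accountsByHost) {
  const cursors = {}; // next account index per host

  return function pick(host) {
    const accounts = accountsByHost[host] || [];
    if (accounts.length === 0) throw new Error(`no accounts configured for ${host}`);
    // Advance the cursor and wrap around (round-robin).
    const i = (cursors[host] = ((cursors[host] ?? -1) + 1) % accounts.length);
    return accounts[i];
  };
}
```

With two Streamwish accounts configured, successive calls would alternate between them, which spreads quota usage without any extra bookkeeping.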

This looks good. Is it encoded? Is it PHP? How often are hosts added/fixed?

I see 'expires' in the header; does this ever need a licence renewal to run?

1. The code is not encrypted.

2. It's built with Node.js and Redis for the queue system.

3. Hosts are fixed the same day they are reported by the user. New hosts are added only at the user's request, for a cost of $10/$25. The system also comes with a modular adapter system: you can pass a single file to ChatGPT and easily add a new host, with just one file per host.

4. There is no licensing system; instead, there is a system for generating registration links. If you want to create more accounts so that others can upload files (using the host configuration already set up by the admin), you can use this system.
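The "one file per host" adapter idea mentioned above can be pictured as every host module exporting the same small interface, with a registry mapping host names to modules. The method names and registry shape below are assumptions for illustration, not the script's real contract:

```javascript
// Hypothetical shape of a single-file host adapter. Everything
// host-specific (API calls, auth, URL formats) would live in this
// one file; names here are illustrative only.
const examplehostAdapter = {
  name: 'examplehost',

  // Upload a file and return the code + embed URL the host assigned.
  async upload(account, file) {
    // A real adapter would call the host's upload API with account.apiKey.
    return { code: `${file.id}-eh`, embedUrl: `https://examplehost.example/e/${file.id}-eh` };
  },

  // List the file codes currently alive on the host, so the dead-link
  // monitor can diff them against the local database.
  async listRemoteCodes(account) {
    return ['abc123-eh'];
  },
};

// Adding a new host is then just dropping in one more file and
// registering its adapter.
const registry = new Map([[examplehostAdapter.name, examplehostAdapter]]);
```

Because each adapter is self-contained and the interface is tiny, handing a single existing adapter file to an LLM as a template for a new host (as the seller describes) is plausible.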

--------- Modular Adapter HOST:

I purchased the script. It looks well made and polished.
There aren't install instructions, but the seller is on Telegram and is very helpful with any questions; he would have installed the script if I had asked.
A small VPS will do, as the files are remotely managed; don't get an expensive storage VPS.
You need to add a cloud storage account (e.g. Cloudflare storage is cheap per TB). Add your host accounts in the panel, then upload a file using the form and it's mirrored to all hosts. Each file gets a landing-page URL containing the links, and these auto-update when they die.
The seller will add additional hosts on request for a fair price.

Added >>
There was a bug in the script config that became apparent with file hosts. The dev was on holiday, and fixed and tested the problems when he was back.
All hosts are working, so everything is good with the script.


Overall, very good for the price.
 
Independently of the issues reported here, I’m very interested in your solution but I don’t fully understand the core workflow yet: How does your system permanently map hoster embed URLs (e.g. domain/e/xxxx) to the correct original file in my S3 (e.g. MovieTitle.Year.1080p.BluRay.mp4), so that an automatic re-upload reliably picks the right file again?
 

The logic relies on treating your S3 file as the permanent 'Source of Truth' and the hoster URLs as temporary 'Distribution Copies'.

Here is the exact workflow:

1. The Master Record (Permanent Map)
When you import a file, we create a permanent entry in our database that links directly to your S3 object (e.g., MovieTitle.mp4). This ID never changes, regardless of what happens on the external hosts.

2. Smart Monitoring (The 'Sync' Method)
Instead of checking links one by one (which is slow and inefficient), we use a List Comparison method:

- We periodically fetch the full list of active file codes directly from the host's API (e.g., Rapidgator).
- We compare this external list against our internal database.
- The Detection: if a file code exists in our database but is missing from the host's API list, the system instantly identifies it as deleted.

3. Automatic Re-upload
Once a file is flagged as missing, the system uses the Master Record to grab the original file from your S3 and uploads it again. We then simply update the database with the new embed URL.

This ensures 100% reliability: as long as the file exists in your S3, the system will always generate a new link when the old one dies.
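The 'Sync' comparison in step 2 reduces to a set difference: any code on record locally that the host's API no longer lists is dead. A minimal sketch of that check — function and field names (`findDeadMirrors`, `code`, `s3Key`) are assumed for illustration:

```javascript
// Sketch of the list-comparison check described above: any file code
// we have on record that the host's API no longer reports is treated
// as deleted and should be queued for re-upload from S3.
function findDeadMirrors(dbRecords, remoteCodes) {
  const alive = new Set(remoteCodes); // codes the host still lists
  // Each record links a host file code back to the master S3 object,
  // so a dead record already knows which source file to re-push.
  return dbRecords.filter((rec) => !alive.has(rec.code));
}
```

Because each returned record still carries its S3 key (the "Master Record" of step 1), the re-upload worker knows exactly which original file to fetch and push again — which is what makes the embed-URL-to-source mapping survive host deletions.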
 