This tool's primary goal is to sync your backends' users play state without relying on third-party services. Out of the box, this tool supports Jellyfin, Plex and Emby media servers.
The new webhooks system is now available; please start migrating your systems to use it, as we have deprecated the old webhook system and it will be removed in the next release. The new system is more robust and user-friendly compared to the old one.
Please refer to NEWS for the latest updates and changes.
- Management via WebUI.
- Sub-users support.
- Sync backends' play state (Many-to-Many or One-Way).
- Backup your backends' play state into a portable format.
- Receive webhook events from media backends.
- Find un-matched or mis-matched items.
- Search your backend metadata.
- Check if your media servers are reporting the same data via the parity checks.
- Sync your watch progress/play state via webhooks or scheduled tasks.
- Check if your media backends have stale references to old files.
If you like my work, you might also like my other project YTPTube, which is a simple, to-the-point yt-dlp frontend to help download content from all sites supported by yt-dlp.
If you prefer video format, the AlienTech42 YouTube Channel has a video about installing WatchState using Unraid at this link. Much appreciated.

PS: I don't know the channel owner, but I appreciate the effort. There is a small mistake in the video regarding the webhook URL: please copy the URL directly from the backends page. Also, this tool does support multi-users.
First, create a directory to store the data. To follow along with this setup, create a directory called `data` in your working directory, then proceed to use your preferred method to install the tool.

Create your `compose.yaml` next to the `data` directory, and add the following content to it.
```yaml
services:
  watchstate:
    image: ghcr.io/arabcoders/watchstate:latest
    # To change the user/group id associated with the tool, change the following line.
    user: "${UID:-1000}:${GID:-1000}"
    container_name: watchstate
    restart: unless-stopped
    ports:
      - "8080:8080" # The port on which the WebUI will be available.
    volumes:
      - ./data:/config:rw # Mount the data directory to the container /config directory.
```
Next, to run the container via compose, use the following command:

```bash
$ docker compose up -d
```

Or run it directly with the docker cli:

```bash
$ docker run -d --user "${UID:-1000}:${GID:-1000}" --name watchstate --restart unless-stopped -p 8080:8080 -v ./data:/config:rw ghcr.io/arabcoders/watchstate:latest
```
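The compose file above interpolates `${UID:-1000}:${GID:-1000}` from the environment, and many shells do not export these variables by default. One way to pin them, assuming you run compose from the directory containing `compose.yaml`, is a `.env` file, which compose reads automatically:

```shell
# Write the current user/group ids into .env so compose's
# ${UID:-1000}:${GID:-1000} expansion resolves to the real owner
# of the ./data directory instead of the 1000 fallback.
printf 'UID=%s\nGID=%s\n' "$(id -u)" "$(id -g)" > .env
cat .env
```
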
Important

It's really important to match the `user:` / `--user` value to the owner of the `data` directory. The container is rootless, so it will crash if it's unable to write to the data directory.

Running containers as root is not recommended, but if the container fails to start, you can try setting `user: "0:0"` or `--user '0:0'`; if that works, it means you have a permissions issue. Refer to the FAQ to troubleshoot the problem.
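As a quick pre-check before reaching for root, you can compare the data directory's owner with the uid you intend to pass to the container. This is a minimal sketch; `check_owner` is an illustrative helper, not part of WatchState, and `stat -c` assumes GNU coreutils:

```shell
# Illustrative permissions pre-check: compare the data directory's
# owner uid with the uid the container will run as.
check_owner() {
  dir="$1"; want_uid="$2"
  # GNU stat; on BSD/macOS use: stat -f '%u'
  [ "$(stat -c '%u' "$dir")" = "$want_uid" ]
}

mkdir -p data
if check_owner data "$(id -u)"; then
  echo "data is owned by uid $(id -u); the container should be able to write"
else
  echo "ownership mismatch; try: chown -R $(id -u):$(id -g) data"
fi
```
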
For Unraid users, you can install the Community Applications plugin and search for watchstate; it comes preconfigured. Otherwise, to manually install it, you need to add `--user 99:100` to the Extra Parameters section in the advanced tab/view.

This has to happen before you start the container; otherwise it will run with the old user id, and you will then have to run the following command from the terminal: `chown -R 99:100 /mnt/user/appdata/watchstate`.
To use this container with podman, set `user` in your `compose.yaml` to `0:0`. It will appear to be running as root inside the container, but it will be mapped to the user who ran the command.
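For podman, the only change from the compose file shown earlier is the `user` line; a minimal sketch of the override:

```yaml
services:
  watchstate:
    # With rootless podman, "root" inside the container is mapped
    # to the invoking user on the host, so 0:0 is safe here.
    user: "0:0"
```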
After starting the container, you can access the WebUI by visiting `http://localhost:8080` in your browser.
Note

The first time you access the WebUI, you will be prompted to create a new system user; this is a one-time operation.
To add your backends, click on the help button in the top right corner, choose which sync method you want (one-way or two-way), and follow the instructions.
Once you have set up your backends and imported your data, you should see something like this:
Currently, the tool supports three methods to import data from backends.
- Scheduled Tasks.
A scheduled job that pulls data from the backends on a schedule.
- On demand.
Pull data from backends on demand. By running the import task manually.
- Webhooks.
Receive events from backends and update the database accordingly.
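For the on-demand method, the import task can be wrapped in a small helper. This is a sketch assuming the `watchstate` container from the compose example above, and assuming the image ships a `console` entrypoint with a `state:import` command; verify the exact command name against the tool's own help output:

```shell
# Hypothetical helper for triggering an on-demand import; assumes the
# container name 'watchstate' and the 'console state:import' command.
run_import() {
  docker exec -ti watchstate console state:import -v
}

# Invoke manually whenever you want to pull fresh data:
# run_import
```
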
Note
Even if all your backends support webhooks, you should keep the import task enabled. This helps keep a healthy relationship with the backends and picks up any missed events. For more information, please check the webhook guide to understand webhooks' limitations.
Take a look at the frequently asked questions page, or the guides, for more in-depth instructions on how to configure things.
If you have short or quick questions, or just want to chat with other users, feel free to join this discord server. Keep in mind it's a solo project, so it might take me a bit of time to reply to questions; I operate in the UTC+3 timezone.
If you appreciate my work and feel like donating, you can do so by donating to a children's charity, for example the International Make-A-Wish foundation.