Merge branch 'dev' into Config-Filenames
@@ -11,6 +11,7 @@ repos:
     rev: v0.4.0
     hooks:
       - id: poetry-ruff-check
+        args: [--fix]
   - repo: https://github.com/pycqa/isort
     rev: 6.0.1
     hooks:

104 docs/ADVANCED_CONFIG.md Normal file

@@ -0,0 +1,104 @@
# Advanced & System Configuration

This document covers advanced features, debugging, and system-level configuration options.

## serve (dict)

Configuration data for pywidevine's serve functionality, run through unshackle.
This effectively allows you to run `unshackle serve` to start serving pywidevine Serve-compliant CDMs right from your
local Widevine device files.

- `api_secret` - Secret key for REST API authentication. When set, enables the REST API server alongside the CDM serve functionality. This key is required for authenticating API requests.
- `devices` - List of Widevine device files (.wvd). If not specified, auto-populated from the WVDs directory.
- `playready_devices` - List of PlayReady device files (.prd). If not specified, auto-populated from the PRDs directory.
- `users` - Dictionary mapping user secret keys to their access configuration:
  - `devices` - List of Widevine devices this user can access
  - `playready_devices` - List of PlayReady devices this user can access
  - `username` - Internal logging name for the user (not visible to users)

For example,

```yaml
serve:
  api_secret: "your-secret-key-here"
  users:
    secret_key_for_jane: # 32-bit hex recommended, case-sensitive
      devices: # list of allowed Widevine devices for this user
        - generic_nexus_4464_l3
      playready_devices: # list of allowed PlayReady devices for this user
        - my_playready_device
      username: jane # only for internal logging, users will not see this name
    secret_key_for_james:
      devices:
        - generic_nexus_4464_l3
      username: james
    secret_key_for_john:
      devices:
        - generic_nexus_4464_l3
      username: john
  # devices can be manually specified by path if you don't want to add them to
  # unshackle's WVDs directory for whatever reason
  # devices:
  #   - 'C:\Users\john\Devices\test_devices_001.wvd'
```

---

## debug (bool)

Enables comprehensive debug logging. Default: `false`

When enabled (either via config or the `-d`/`--debug` CLI flag):

- Sets console log level to DEBUG for verbose output
- Creates JSON Lines (`.jsonl`) debug log files with structured logging
- Logs detailed information about sessions, service configuration, DRM operations, and errors with full stack traces

For example,

```yaml
debug: true
```

---

## debug_keys (bool)

Controls whether actual decryption keys (CEKs) are included in debug logs. Default: `false`

When enabled:

- Content encryption keys are logged in debug output
- Only affects the `content_key` and `key` fields (the actual CEKs)
- Key metadata (`kid`, `keys_count`, `key_id`) is always logged regardless of this setting
- Passwords, tokens, cookies, and session tokens remain redacted even when enabled

For example,

```yaml
debug_keys: true
```

---

## set_terminal_bg (bool)

Controls whether unshackle should set the terminal background color. Default: `false`

For example,

```yaml
set_terminal_bg: true
```

---

## update_checks (bool)

Check for updates from the GitHub repository on startup. Default: `true`.

---

## update_check_interval (int)

How often to check for updates, in hours. Default: `24`.
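
For example, to keep update checks enabled but check only twice a day (the interval value here is illustrative):

```yaml
update_checks: true
update_check_interval: 12 # hours; default is 24
```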

---

174 docs/DOWNLOAD_CONFIG.md Normal file

@@ -0,0 +1,174 @@
# Download & Processing Configuration

This document covers configuration options related to downloading and processing media content.

## aria2c (dict)

- `max_concurrent_downloads`
  Maximum number of parallel downloads. Default: `min(32,(cpu_count+4))`
  Note: Overrides the `max_workers` parameter of the aria2(c) downloader function.
- `max_connection_per_server`
  Maximum number of connections to one server for each download. Default: `1`
- `split`
  Split a file into N chunks and download each chunk on its own connection. Default: `5`
- `file_allocation`
  Specify the file allocation method. Default: `"prealloc"`

  - `"none"` doesn't pre-allocate file space.
  - `"prealloc"` pre-allocates file space before the download begins. This may take some time depending on the size of the file.
  - `"falloc"` is your best choice if you are using newer file systems such as ext4 (with extents support), btrfs, xfs, or NTFS (MinGW build only). It allocates large (few-GiB) files almost instantly. Don't use falloc with legacy file systems such as ext3 and FAT32 because it takes almost the same time as prealloc, and it blocks aria2 entirely until allocation finishes. falloc may not be available if your system doesn't have the posix_fallocate(3) function.
  - `"trunc"` uses the ftruncate(2) system call or a platform-specific counterpart to truncate a file to the specified length.

---

## curl_impersonate (dict)

- `browser` - The browser to impersonate. A list of available browsers and versions can be found here:
  <https://github.com/yifeikong/curl_cffi#sessions>

  Default: `"chrome124"`

For example,

```yaml
curl_impersonate:
  browser: "chrome120"
```

---

## downloader (str | dict)

Choose what software to use to download data throughout unshackle where needed.
You may provide a single downloader globally or a mapping of service tags to downloaders.

Options:

- `requests` (default) - <https://github.com/psf/requests>
- `aria2c` - <https://github.com/aria2/aria2>
- `curl_impersonate` - <https://github.com/yifeikong/curl-impersonate> (via <https://github.com/yifeikong/curl_cffi>)
- `n_m3u8dl_re` - <https://github.com/nilaoda/N_m3u8DL-RE>

Note that aria2c can reach the highest speeds as it utilizes threading and more connections than the other downloaders. However, aria2c can also be one of the more unstable downloaders; it may work one day and fail the next. It also does not support HTTP(S) proxies, while the other downloaders do.

Example mapping:

```yaml
downloader:
  NF: requests
  AMZN: n_m3u8dl_re
  DSNP: n_m3u8dl_re
  default: requests
```

The `default` entry is optional. If omitted, `requests` will be used for services not listed.

---

## n_m3u8dl_re (dict)

Configuration for the N_m3u8DL-RE downloader. This downloader is particularly useful for HLS streams.

- `thread_count`
  Number of threads to use for downloading. Default: uses the same value as `max_workers` from the command.
- `ad_keyword`
  Keyword to identify and potentially skip advertisement segments. Default: `None`
- `use_proxy`
  Whether to use a proxy when downloading. Default: `true`
- `retry_count`
  Number of times to retry failed downloads. Default: `10`

For example,

```yaml
n_m3u8dl_re:
  thread_count: 16
  ad_keyword: "advertisement"
  use_proxy: true
  retry_count: 10
```

---

## dl (dict)

Pre-define default options and switches of the `dl` command.
The values will be ignored if explicitly set in the CLI call.

The Key must be the same value Python click would resolve it to as an argument.
E.g., `@click.option("-r", "--range", "range_", type=...)` actually resolves to the `range_` variable.

For example, to set the default primary download language to German,

```yaml
lang: de
```

You can also set multiple preferred languages using a list, e.g.,

```yaml
lang:
  - en
  - fr
```

to set how many tracks to download concurrently to 4 and download threads to 16,

```yaml
downloads: 4
workers: 16
```

to set `--bitrate=CVBR` for the AMZN service,

```yaml
lang: de
AMZN:
  bitrate: CVBR
```

or to change the output subtitle format from the default (original format) to WebVTT,

```yaml
sub_format: vtt
```

---

## decryption (str | dict)

Choose what software to use to decrypt DRM-protected content throughout unshackle where needed.
You may provide a single decryption method globally or a mapping of service tags to decryption methods.

Options:

- `shaka` (default) - Shaka Packager - <https://github.com/shaka-project/shaka-packager>
- `mp4decrypt` - mp4decrypt from Bento4 - <https://github.com/axiomatic-systems/Bento4>

Note that Shaka Packager is the traditional method and works with most services. mp4decrypt is an alternative that may work better with certain services that have specific encryption formats.

Example mapping:

```yaml
decryption:
  ATVP: mp4decrypt
  AMZN: shaka
  default: shaka
```

The `default` entry is optional. If omitted, `shaka` will be used for services not listed.

Simple configuration (single method for all services):

```yaml
decryption: mp4decrypt
```

---

403 docs/DRM_CONFIG.md Normal file

@@ -0,0 +1,403 @@
# DRM & CDM Configuration

This document covers Digital Rights Management (DRM) and Content Decryption Module (CDM) configuration options.

## cdm (dict)

Pre-define which Widevine or PlayReady device to use for each Service, with the Service Tag as the key (case-sensitive).
The value should be a WVD or PRD filename without the file extension. When loading the device, unshackle will look in both the `WVDs` and `PRDs` directories for a matching file.

For example,

```yaml
AMZN: chromecdm_903_l3
NF: nexus_6_l1
```

You may also specify the device based on the profile used.

For example,

```yaml
AMZN: chromecdm_903_l3
NF: nexus_6_l1
DSNP:
  john_sd: chromecdm_903_l3
  jane_uhd: nexus_5_l1
```

You can also specify a fallback value to use if no match was made.
This can be done using the `default` key, which helps reduce redundancy in your specifications.

For example, the following has the same result as the previous example, with all other services and profiles pre-defined to use `chromecdm_903_l3`.

```yaml
NF: nexus_6_l1
DSNP:
  jane_uhd: nexus_5_l1
default: chromecdm_903_l3
```

---

## remote_cdm (list\[dict])

Configure remote CDM (Content Decryption Module) APIs to use for decrypting DRM-protected content.
Remote CDMs allow you to use high-security CDMs (L1/L2 for Widevine, SL2000/SL3000 for PlayReady) without having the physical device files locally.

unshackle supports multiple types of remote CDM providers:

1. **DecryptLabs CDM** - Official DecryptLabs KeyXtractor API with intelligent caching
2. **Custom API CDM** - Highly configurable adapter for any third-party CDM API
3. **Legacy PyWidevine Serve** - Standard pywidevine serve-compliant APIs

The name of each defined remote CDM can be referenced in the `cdm` configuration as if it were a local device file.

### DecryptLabs Remote CDM

DecryptLabs provides a professional CDM API service with support for multiple device types and intelligent key caching.

**Supported Devices:**

- **Widevine**: `ChromeCDM` (L3), `L1` (Security Level 1), `L2` (Security Level 2)
- **PlayReady**: `SL2` (SL2000), `SL3` (SL3000)

**Configuration:**

```yaml
remote_cdm:
  # Widevine L1 Device
  - name: decrypt_labs_l1
    type: decrypt_labs # Required: identifies as DecryptLabs CDM
    device_name: L1 # Required: must match exactly (L1, L2, ChromeCDM, SL2, SL3)
    host: https://keyxtractor.decryptlabs.com
    secret: YOUR_API_KEY # Your DecryptLabs API key

  # Widevine L2 Device
  - name: decrypt_labs_l2
    type: decrypt_labs
    device_name: L2
    host: https://keyxtractor.decryptlabs.com
    secret: YOUR_API_KEY

  # Chrome CDM (L3)
  - name: decrypt_labs_chrome
    type: decrypt_labs
    device_name: ChromeCDM
    host: https://keyxtractor.decryptlabs.com
    secret: YOUR_API_KEY

  # PlayReady SL2000
  - name: decrypt_labs_playready_sl2
    type: decrypt_labs
    device_name: SL2
    device_type: PLAYREADY # Required for PlayReady
    host: https://keyxtractor.decryptlabs.com
    secret: YOUR_API_KEY

  # PlayReady SL3000
  - name: decrypt_labs_playready_sl3
    type: decrypt_labs
    device_name: SL3
    device_type: PLAYREADY
    host: https://keyxtractor.decryptlabs.com
    secret: YOUR_API_KEY
```

**Features:**

- Intelligent key caching system (reduces API calls)
- Automatic integration with unshackle's vault system
- Support for both Widevine and PlayReady
- Multiple security levels (L1, L2, L3, SL2000, SL3000)

**Note:** The `device_type` and `security_level` fields are optional metadata. They don't affect API communication but are used for internal device identification.

### Custom API Remote CDM

A highly configurable CDM adapter that can work with virtually any third-party CDM API through YAML configuration.
This allows you to integrate custom CDM services without writing code.

**Basic Example:**

```yaml
remote_cdm:
  - name: custom_chrome_cdm
    type: custom_api # Required: identifies as Custom API CDM
    host: https://your-cdm-api.com
    timeout: 30 # Optional: request timeout in seconds

    device:
      name: ChromeCDM
      type: CHROME # CHROME, ANDROID, PLAYREADY
      system_id: 27175
      security_level: 3

    auth:
      type: bearer # bearer, header, basic, body
      key: YOUR_API_TOKEN

    endpoints:
      get_request:
        path: /get-challenge
        method: POST
      decrypt_response:
        path: /get-keys
        method: POST

    caching:
      enabled: true # Enable key caching
      use_vaults: true # Integrate with vault system
```

**Advanced Example with Field Mapping:**

```yaml
remote_cdm:
  - name: advanced_custom_api
    type: custom_api
    host: https://api.example.com
    device:
      name: L1
      type: ANDROID
      security_level: 1

    # Authentication configuration
    auth:
      type: header
      header_name: X-API-Key
      key: YOUR_SECRET_KEY
      custom_headers:
        User-Agent: Unshackle/2.0.0
        X-Client-Version: "1.0"

    # Endpoint configuration
    endpoints:
      get_request:
        path: /v2/challenge
        method: POST
        timeout: 30
      decrypt_response:
        path: /v2/decrypt
        method: POST
        timeout: 30

    # Request parameter mapping
    request_mapping:
      get_request:
        param_names:
          init_data: pssh # Rename 'init_data' to 'pssh'
          scheme: device_type # Rename 'scheme' to 'device_type'
        static_params:
          api_version: "2.0" # Add static parameter
      decrypt_response:
        param_names:
          license_request: challenge
          license_response: license

    # Response field mapping
    response_mapping:
      get_request:
        fields:
          challenge: data.challenge # Deep field access
          session_id: session.id
        success_conditions:
          - status == 'ok' # Validate response
      decrypt_response:
        fields:
          keys: data.keys
        key_fields:
          kid: key_id # Map 'kid' field
          key: content_key # Map 'key' field

    caching:
      enabled: true
      use_vaults: true
      check_cached_first: true # Check cache before API calls
```

**Supported Authentication Types:**

- `bearer` - Bearer token authentication
- `header` - Custom header authentication
- `basic` - HTTP Basic authentication
- `body` - Credentials in request body

### Legacy PyWidevine Serve Format

Standard [pywidevine] serve-compliant remote CDM configuration (backwards compatibility).

```yaml
remote_cdm:
  - name: legacy_chrome_cdm
    device_name: chrome
    device_type: CHROME
    system_id: 27175
    security_level: 3
    host: https://domain.com/api
    secret: secret_key
```

**Note:** If the `type` field is not specified, the entry is treated as a legacy pywidevine serve CDM.

[pywidevine]: https://github.com/rlaphoenix/pywidevine

---

## decrypt_labs_api_key (str)

API key for DecryptLabs CDM service integration.

When set, enables the use of DecryptLabs remote CDM services in your `remote_cdm` configuration.
This is used specifically for `type: "decrypt_labs"` entries in the remote CDM list.

For example,

```yaml
decrypt_labs_api_key: "your_api_key_here"
```

**Note**: This is different from the per-CDM `secret` field in `remote_cdm` entries. This provides a global API key that can be referenced across multiple DecryptLabs CDM configurations. If a `remote_cdm` entry with `type: "decrypt_labs"` does not have a `secret` field specified, the global `decrypt_labs_api_key` will be used as a fallback.
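
A minimal sketch of that fallback, with the entry omitting its own `secret` (device fields taken from the examples above):

```yaml
decrypt_labs_api_key: "your_api_key_here"

remote_cdm:
  - name: decrypt_labs_l1
    type: decrypt_labs
    device_name: L1
    host: https://keyxtractor.decryptlabs.com
    # no `secret` here, so the global decrypt_labs_api_key is used
```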

---

## key_vaults (list\[dict])

Key Vaults store your obtained Content Encryption Keys (CEKs) and Key IDs per-service.

This can help reduce unnecessary License calls even during the first download, because a Service may provide the same Key ID and CEK for both Video and Audio, as well as for multiple resolutions or bitrates.

You can have as many Key Vaults as you would like. Sharing Key Vaults, or using a unified Vault within a Team, is worthwhile, as sharing CEKs immediately can drastically reduce License calls.

Four types of Vaults are in the Core codebase: API, SQLite, MySQL, and HTTP. API and HTTP make HTTP requests to a RESTful API, whereas SQLite and MySQL connect directly to an SQLite or MySQL Database.

Note: SQLite and MySQL vaults have to connect directly to the Host/IP; they cannot sit behind a PHP API or similar.
Beware that some Hosting Providers do not let you access the MySQL server from outside their intranet, so it may not be accessible outside their hosting platform.

Additional behavior:

- `no_push` (bool): Optional per-vault flag. When `true`, the vault will not receive pushed keys (writes) but will still be queried and can provide keys for lookups. Useful for read-only/backup vaults.

### Using an API Vault

API vaults use a specific HTTP request format, so API or HTTP Key Vault APIs from other projects or services may not work in unshackle. The API format can be seen in the [API Vault Code](unshackle/vaults/API.py).

```yaml
- type: API
  name: "John#0001's Vault" # arbitrary vault name
  uri: "https://key-vault.example.com" # api base uri (can also be an IP or IP:Port)
  # uri: "127.0.0.1:80/key-vault"
  # uri: "https://api.example.com/key-vault"
  token: "random secret key" # authorization token
  # no_push: true # optional; make this API vault read-only (lookups only)
```

### Using a MySQL Vault

MySQL vaults can be either MySQL or MariaDB servers. I recommend MariaDB.
A MySQL Vault can be on a local or remote network, but I recommend SQLite for local Vaults.

```yaml
- type: MySQL
  name: "John#0001's Vault" # arbitrary vault name
  host: "127.0.0.1" # host/ip
  # port: 3306 # port (defaults to 3306)
  database: vault # database used for unshackle
  username: jane11
  password: Doe123
  # no_push: false # optional; defaults to false
```

I recommend giving only a trustable user (or yourself) CREATE permission and then using unshackle to cache at least one CEK per Service so it creates the tables. If you don't give any user permission to create tables, you will need to make the tables yourself.

- Use a password on all user accounts.
- Never use the root account with unshackle (even if it's you).
- Do not give multiple users the same username and/or password.
- Only give users access to the database used for unshackle.
- You may give trusted users CREATE permission so unshackle can create tables if needed.
- Other users should only be given SELECT and INSERT permissions.

### Using an SQLite Vault

SQLite Vaults are usually only used for locally stored vaults. This vault may be stored on a mounted Cloud storage drive, but I recommend using SQLite exclusively as an offline-only vault. Effectively this is your backup vault in case something happens to your MySQL Vault.

```yaml
- type: SQLite
  name: "My Local Vault" # arbitrary vault name
  path: "C:/Users/Jane11/Documents/unshackle/data/key_vault.db"
  # no_push: true # optional; commonly true for local backup vaults
```

**Note**: You do not need to create the file at the specified path.
SQLite will create a new SQLite database at that path if one does not exist.
Try not to accidentally move the `db` file once created without reflecting the change in the config, or you will end up with multiple databases.

If you work in a Team, I recommend every team member have their own SQLite Vault, even if you all use a MySQL Vault together.

### Using an HTTP Vault

HTTP Vaults provide flexible HTTP-based key storage with support for multiple API modes. This vault type is useful for integrating with various third-party key vault APIs.

```yaml
- type: HTTP
  name: "My HTTP Vault"
  host: "https://vault-api.example.com"
  api_key: "your_api_key" # or use 'password' field
  api_mode: "json" # query, json, or decrypt_labs
  # username: "user" # required for query mode only
  # no_push: false # optional; defaults to false
```

**Supported API Modes:**

- `query` - Uses GET requests with query parameters. Requires the `username` field.
- `json` - Uses POST requests with JSON payloads. Token-based authentication.
- `decrypt_labs` - DecryptLabs API format. Read-only mode (`no_push` is forced to `true`).

**Example configurations:**

```yaml
# Query mode (requires username)
- type: HTTP
  name: "Query Vault"
  host: "https://api.example.com/keys"
  username: "myuser"
  password: "mypassword"
  api_mode: "query"

# JSON mode
- type: HTTP
  name: "JSON Vault"
  host: "https://api.example.com/vault"
  api_key: "secret_token"
  api_mode: "json"

# DecryptLabs mode (read-only)
- type: HTTP
  name: "DecryptLabs Cache"
  host: "https://keyxtractor.decryptlabs.com/cache"
  api_key: "your_decrypt_labs_api_key"
  api_mode: "decrypt_labs"
```

**Note**: The `decrypt_labs` mode is always read-only and cannot receive pushed keys.

---

183 docs/GLUETUN.md Normal file

@@ -0,0 +1,183 @@
# Gluetun VPN Proxy

Gluetun provides Docker-managed VPN proxies supporting 50+ VPN providers.

## Prerequisites

**Docker must be installed and running.**

```bash
# Linux
curl -fsSL https://get.docker.com | sh
sudo usermod -aG docker $USER # Then log out/in

# Windows/Mac
# Install Docker Desktop: https://www.docker.com/products/docker-desktop/
```

## Quick Start

### 1. Configuration

Add to `~/.config/unshackle/unshackle.yaml`:

```yaml
proxy_providers:
  gluetun:
    providers:
      windscribe:
        vpn_type: openvpn
        credentials:
          username: "YOUR_OPENVPN_USERNAME"
          password: "YOUR_OPENVPN_PASSWORD"
```

### 2. Usage

Use 2-letter country codes directly:

```bash
unshackle dl SERVICE CONTENT --proxy gluetun:windscribe:us
unshackle dl SERVICE CONTENT --proxy gluetun:windscribe:uk
```

Format: `gluetun:provider:region`

## Provider Credential Requirements

**OpenVPN (Recommended)**: Most providers support OpenVPN with just `username` and `password` - the simplest setup.

**WireGuard**: Requires private keys and varies by provider. See the [Gluetun Wiki](https://github.com/qdm12/gluetun-wiki/tree/main/setup/providers) for provider-specific requirements.

## Getting Your Credentials

### Windscribe (OpenVPN)

1. Go to [windscribe.com/getconfig/openvpn](https://windscribe.com/getconfig/openvpn)
2. Log in with your Windscribe account
3. Select any location and click "Get Config"
4. Copy the username and password shown

### NordVPN (OpenVPN)

1. Go to [NordVPN Service Credentials](https://my.nordaccount.com/dashboard/nordvpn/manual-configuration/service-credentials/)
2. Log in with your NordVPN account
3. Generate or view your service credentials
4. Copy the username and password

> **Note**: Use service credentials, NOT your account email/password.

### WireGuard Credentials (Advanced)

WireGuard requires private keys instead of username/password. See the [Gluetun Wiki](https://github.com/qdm12/gluetun-wiki/tree/main/setup/providers) for provider-specific WireGuard setup.

## Configuration Examples

**OpenVPN (Recommended)**

Most providers support OpenVPN with just username and password:

```yaml
providers:
  windscribe:
    vpn_type: openvpn
    credentials:
      username: YOUR_OPENVPN_USERNAME
      password: YOUR_OPENVPN_PASSWORD

  nordvpn:
    vpn_type: openvpn
    credentials:
      username: YOUR_SERVICE_USERNAME
      password: YOUR_SERVICE_PASSWORD
```

**WireGuard (Advanced)**

WireGuard can be faster but requires more complex credential setup:

```yaml
# NordVPN/ProtonVPN (only private_key needed)
providers:
  nordvpn:
    vpn_type: wireguard
    credentials:
      private_key: YOUR_PRIVATE_KEY

  # Windscribe (all three credentials required)
  windscribe:
    vpn_type: wireguard
    credentials:
      private_key: YOUR_PRIVATE_KEY
      addresses: 10.x.x.x/32
      preshared_key: YOUR_PRESHARED_KEY
```

## Server Selection

Most providers use `SERVER_COUNTRIES`, but some use `SERVER_REGIONS`:

| Variable | Providers |
|----------|-----------|
| `SERVER_COUNTRIES` | NordVPN, ProtonVPN, Surfshark, Mullvad, ExpressVPN, and most others |
| `SERVER_REGIONS` | Windscribe, VyprVPN, VPN Secure |

Unshackle handles this automatically - just use 2-letter country codes.

## Global Settings

```yaml
proxy_providers:
  gluetun:
    providers: {...}
    base_port: 8888 # Starting port (default: 8888)
    auto_cleanup: true # Remove containers on exit (default: true)
    verify_ip: true # Verify IP matches region (default: true)
    container_prefix: "unshackle-gluetun"
    auth_user: username # Proxy auth (optional)
    auth_password: password
```

## Features

- **Container Reuse**: First request takes 10-30s; subsequent requests are instant
- **IP Verification**: Automatically verifies the VPN exit IP matches the requested region
- **Concurrent Sessions**: Multiple downloads share the same container
- **Specific Servers**: Use `--proxy gluetun:nordvpn:us1239` for specific server selection

## Container Management

```bash
# View containers
docker ps | grep unshackle-gluetun

# Check logs
docker logs unshackle-gluetun-nordvpn-us

# Remove all containers
docker ps -a | grep unshackle-gluetun | awk '{print $1}' | xargs docker rm -f
```

## Troubleshooting

### Docker Permission Denied (Linux)

```bash
sudo usermod -aG docker $USER
# Then log out and log back in
```

### VPN Connection Failed

Check container logs for specific errors:

```bash
docker logs unshackle-gluetun-nordvpn-us
```

Common issues:

- Invalid/missing credentials
- Windscribe requires `preshared_key` (can be an empty string)
- VPN provider server issues

## Resources

- [Gluetun Wiki](https://github.com/qdm12/gluetun-wiki) - Official provider documentation
- [Gluetun GitHub](https://github.com/qdm12/gluetun)

154 docs/NETWORK_CONFIG.md Normal file

@@ -0,0 +1,154 @@
# Network & Proxy Configuration

This document covers network and proxy configuration options for bypassing geofencing and managing connections.

## proxy_providers (dict)

Enable external proxy provider services. These proxies will be used automatically where needed, as defined by the Service's GEOFENCE class property, but can also be explicitly used with `--proxy`. You can specify which provider to use by prefixing it with the provider key name, e.g., `--proxy basic:de` or `--proxy nordvpn:de`. Some providers support specific query formats for selecting a country/server.

### basic (dict[str, str|list])

Define a mapping of country to proxy to use where required.
The keys are region Alpha-2 Country Codes. Alpha-2 Country Codes are `[a-z]{2}` codes, e.g., `us`, `gb`, and `jp`.
Don't get these mixed up with language codes, like `en` vs. `gb`, or `ja` vs. `jp`.

Do note that each key's value can be a list of strings, or a string. For example,

```yaml
us:
  - "http://john%40email.tld:password123@proxy-us.domain.tld:8080"
  - "http://jane%40email.tld:password456@proxy-us.domain2.tld:8080"
de: "https://127.0.0.1:8080"
```

Note that if multiple proxies are defined for a region, then by default one will be randomly chosen.
You can choose a specific one by specifying its number, e.g., `--proxy basic:us2` will choose the second proxy of the US list.

### nordvpn (dict)

Set your NordVPN Service credentials with `username` and `password` keys to automate the use of NordVPN as a Proxy system where required.

You can also specify specific servers to use per-region with the `server_map` key.
Sometimes a specific server works better for a service than others, so hard-coding one for a day or two helps.

For example,

```yaml
username: zxqsR7C5CyGwmGb6KSvk8qsZ # example of the login format
password: wXVHmht22hhRKUEQ32PQVjCZ
server_map:
  us: 12 # force US server #12 for US proxies
```

The username and password should NOT be your normal NordVPN Account Credentials.
They should be the `Service credentials`, which can be found on your Nord Account Dashboard.

Once set, you can also specifically opt in to use a NordVPN proxy by specifying `--proxy=gb` or such.
You can even set a specific server number this way, e.g., `--proxy=gb2366`.

Note that `gb` is used instead of `uk` to be more consistent across regional systems.

### surfsharkvpn (dict)

Enable the Surfshark VPN proxy service using Surfshark Service credentials (not your login password).
You may pin specific server IDs per region using `server_map`.

```yaml
username: your_surfshark_service_username # https://my.surfshark.com/vpn/manual-setup/main/openvpn
password: your_surfshark_service_password # service credentials, not account password
server_map:
  us: 3844 # force US server #3844
  gb: 2697 # force GB server #2697
  au: 4621 # force AU server #4621
```

### hola (dict)

Enable the Hola VPN proxy service. Requires the `hola-proxy` binary to be installed and available in your PATH.

```yaml
proxy_providers:
  hola: {}
```

Once configured, use `--proxy hola:us` or similar to connect through Hola.

### windscribevpn (dict)

Enable the Windscribe VPN proxy service using static OpenVPN service credentials.

Use the service credentials from https://windscribe.com/getconfig/openvpn (not your account login credentials).

```yaml
proxy_providers:
  windscribevpn:
    username: openvpn_username # From https://windscribe.com/getconfig/openvpn
    password: openvpn_password # Service credentials, NOT your account password
```

#### Server Mapping

You can optionally pin specific servers using `server_map`:

```yaml
proxy_providers:
  windscribevpn:
    username: openvpn_username
    password: openvpn_password
    server_map:
      us: us-central-096.totallyacdn.com # Force specific US server
      gb: uk-london-001.totallyacdn.com # Force specific UK server
```

Once configured, use `--proxy windscribe:us`, `--proxy windscribe:gb`, etc. to connect through Windscribe.

### Legacy nordvpn Configuration

**Legacy configuration. Use `proxy_providers.nordvpn` instead.**

Set your NordVPN Service credentials with `username` and `password` keys to automate the use of NordVPN as a Proxy system where required.

You can also specify specific servers to use per-region with the `server_map` key.
Sometimes a specific server works better for a service than others, so hard-coding one for a day or two helps.

For example,

```yaml
nordvpn:
  username: zxqsR7C5CyGwmGb6KSvk8qsZ # example of the login format
  password: wXVHmht22hhRKUEQ32PQVjCZ
  server_map:
    us: 12 # force US server #12 for US proxies
```

The username and password should NOT be your normal NordVPN Account Credentials.
They should be the `Service credentials`, which can be found on your Nord Account Dashboard.

Note that `gb` is used instead of `uk` to be more consistent across regional systems.

---

## headers (dict)

Case-insensitive dictionary of headers that all Services begin their Request Session state with.
All requests will use these unless changed explicitly or implicitly via a Server response.
These should be sane defaults, and anything that would only be useful for some Services should not be put here.

Avoid headers like 'Accept-Encoding', as that is a compatibility header that Python-requests will set for you.

I recommend using,

```yaml
Accept-Language: "en-US,en;q=0.8"
User-Agent: "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3865.75 Safari/537.36"
```

---

123 docs/OUTPUT_CONFIG.md Normal file

@@ -0,0 +1,123 @@
# Output & Naming Configuration

This document covers output file organization and naming configuration options.

## filenames (dict)

Override the default filenames used across unshackle.
The filenames use various variables that are replaced during runtime.

The following filenames are available and may be overridden:

- `log` - Log filenames. Uses `{name}` and `{time}` variables.
- `debug_log` - Debug log filenames. Uses `{service}` and `{time}` variables.
- `config` - Service configuration filenames.
- `root_config` - Root configuration filename.
- `chapters` - Chapter export filenames. Uses `{title}` and `{random}` variables.
- `subtitle` - Subtitle export filenames. Uses `{id}` and `{language}` variables.

For example,

```yaml
filenames:
  log: "unshackle_{name}_{time}.log"
  debug_log: "unshackle_debug_{service}_{time}.jsonl"
  config: "config.yaml"
  root_config: "unshackle.yaml"
  chapters: "Chapters_{title}_{random}.txt"
  subtitle: "Subtitle_{id}_{language}.srt"
```

---

## scene_naming (bool)

Set scene-style naming for titles. When `true`, uses scene naming patterns (e.g., `Prime.Suspect.S07E01...`); when `false`, uses a more human-readable style (e.g., `Prime Suspect S07E01 ...`). Default: `true`.
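
For example, to switch to the human-readable style:

```yaml
scene_naming: false
```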

---

## series_year (bool)

Whether to include the series year in series names for episodes and folders. Default: `true`.

---

## tag (str)

Group or Username to postfix to the end of download filenames, following a dash.
Only applies when `scene_naming` is enabled.
For example, `tag: "J0HN"` will have `-J0HN` at the end of all download filenames.

---

## tag_group_name (bool)

Enable/disable tagging downloads with your group name when `tag` is set. Default: `true`.

---

## tag_imdb_tmdb (bool)

Enable/disable tagging downloaded files with IMDB/TMDB/TVDB identifiers (when available). Default: `true`.

---

## muxing (dict)

- `set_title`
  Set the container title to `Show SXXEXX Episode Name` or `Movie (Year)`. Default: `true`
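
For example, to disable setting the container title:

```yaml
muxing:
  set_title: false
```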

---

## chapter_fallback_name (str)

The Chapter Name to use when exporting a Chapter without a Name.
The default is no fallback name at all, meaning no Chapter name will be set.

The fallback name can use the following variables in f-string style:

- `{i}`: The Chapter number, starting at 1.
  E.g., `"Chapter {i}"`: "Chapter 1", "Intro", "Chapter 3".
- `{j}`: A number starting at 1 that increments only when a Chapter has no title.
  E.g., `"Chapter {j}"`: "Chapter 1", "Intro", "Chapter 2".

These are formatted with f-strings, and directives are supported.
For example, `"Chapter {i:02}"` will result in `"Chapter 01"`.

---

## directories (dict)

Override the default directories used across unshackle.
The directories are set to common values by default.

The following directories are available and may be overridden,

- `commands` - CLI Command Classes.
- `services` - Service Classes.
- `vaults` - Vault Classes.
- `fonts` - Font files (ttf or otf).
- `downloads` - Downloads.
- `temp` - Temporary files or conversions during download.
- `cache` - Expiring data like Authorization tokens, or other misc data.
- `cookies` - Expiring Cookie data.
- `logs` - Logs.
- `wvds` - Widevine Devices.
- `prds` - PlayReady Devices.
- `dcsl` - Device Certificate Status List.

Notes:

- `services` accepts either a single directory or a list of directories to search for service modules (see the second example below).

For example,

```yaml
downloads: "D:/Downloads/unshackle"
temp: "D:/Temp/unshackle"
```
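
And, as noted above, `services` may also be given a list of directories to search; the paths here are hypothetical:

```yaml
services:
  - "C:/Users/Jane11/unshackle/services"
  - "D:/Shared/unshackle-services"
```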

There are directories not listed that cannot be modified, as they are crucial to the operation of unshackle.

---

116 docs/SERVICE_CONFIG.md Normal file

@@ -0,0 +1,116 @@
# Service Integration & Authentication Configuration

This document covers service-specific configuration, authentication, and metadata integration options.

## services (dict)

Configuration data for each Service. The Service will have the data within this section merged into the `config.yaml` before it is provided to the Service class.

Think of this config as the place for more sensitive configuration data, like user- or device-specific API keys, IDs, device attributes, and so on. A `config.yaml` file is typically shared and not meant to be modified, so use this for any sensitive configuration data.

The Key is the Service Tag, but the value can take any arbitrary form. It's expected to begin as either a list or a dictionary.

For example,

```yaml
NOW:
  client:
    auth_scheme: MESSO
  # ... more sensitive data
```

---

## credentials (dict[str, str|list|dict])

Specify login credentials to use for each Service, and optionally per-profile.

For example,

```yaml
ALL4: jane@gmail.com:LoremIpsum100 # directly
AMZN: # or per-profile, optionally with a default
  default: jane@example.tld:LoremIpsum99 # <-- used by default if -p/--profile is not used
  james: james@gmail.com:TheFriend97
  john: john@example.tld:LoremIpsum98
NF: # the `default` key is not necessary, but no credential will be used by default
  john: john@gmail.com:TheGuyWhoPaysForTheNetflix69420
```

The value should be in string form, i.e. `john@gmail.com:password123` or `john:password123`.
Any arbitrary values can be used on the left (username/password/phone) and right (password/secret).
You can also specify these in list form, i.e., `["john@gmail.com", ":PasswordWithAColon"]`.

If you specify multiple credentials with keys like the `AMZN` and `NF` examples above, then you should use a `default` key, or no credential will be loaded automatically unless you use `-p/--profile`. You do not have to use a `default` key at all.

Please be aware that this information is sensitive; keep it safe and do not share your config.

---

## tmdb_api_key (str)

API key for The Movie Database (TMDB). This is used for tagging downloaded files with TMDB, IMDB, and TVDB identifiers. Leave empty to disable automatic lookups.

To obtain a TMDB API key:

1. Create an account at <https://www.themoviedb.org/>
2. Go to <https://www.themoviedb.org/settings/api> to register for API access
3. Fill out the API application form with your project details
4. Once approved, you'll receive your API key

For example,

```yaml
tmdb_api_key: cf66bf18956kca5311ada3bebb84eb9a # Not a real key
```

**Note**: Keep your API key secure and do not share it publicly. This key is used by the `core/utils/tags.py` module to fetch metadata from TMDB for proper file tagging.

---

## simkl_client_id (str)

Client ID for SIMKL API integration. SIMKL is used as a metadata source for improved title matching and tagging, especially when a TMDB API key is not configured.

To obtain a SIMKL Client ID:

1. Create an account at <https://simkl.com/>
2. Go to <https://simkl.com/settings/developer/>
3. Register a new application to receive your Client ID

For example,

```yaml
simkl_client_id: "your_client_id_here"
```

**Note**: While optional, having a SIMKL Client ID improves metadata lookup reliability. SIMKL serves as an alternative or fallback metadata source to TMDB. This is used by the `core/utils/tags.py` module.

---

## title_cache_enabled (bool)

Enable/disable caching of title metadata to reduce redundant API calls. Default: `true`.

---

## title_cache_time (int)

Cache duration in seconds for title metadata. Default: `1800` (30 minutes).

---

## title_cache_max_retention (int)

Maximum retention time in seconds for serving slightly stale cached title metadata when API calls fail.
Default: `86400` (24 hours). Effective retention is `min(title_cache_time + grace, title_cache_max_retention)`.
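
For example, the three cache settings together (the values are illustrative):

```yaml
title_cache_enabled: true
title_cache_time: 3600 # cache title metadata for 1 hour
title_cache_max_retention: 86400 # never serve entries older than 24 hours
```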

---

39 docs/SUBTITLE_CONFIG.md Normal file

@@ -0,0 +1,39 @@
# Subtitle Processing Configuration

This document covers subtitle processing and formatting options.

## subtitle (dict)

Control subtitle conversion and SDH (hearing-impaired) stripping behavior.

- `conversion_method`: How to convert subtitles between formats. Default: `pysubs2`.
  - `auto`: Use subby for WebVTT/SAMI, standard conversion for others.
  - `subby`: Always use subby with CommonIssuesFixer.
  - `subtitleedit`: Prefer SubtitleEdit when available; otherwise fall back to standard conversion.
  - `pycaption`: Use only the pycaption library (no SubtitleEdit, no subby).
  - `pysubs2`: Use the pysubs2 library (supports SRT, SSA, ASS, WebVTT, TTML, SAMI, MicroDVD, MPL2, and TMP formats).

- `sdh_method`: How to strip SDH cues. Default: `auto`.
  - `auto`: Try subby for SRT first, then SubtitleEdit, then filter-subs.
  - `subby`: Use subby's SDHStripper. **Note:** Only works with SRT files; other formats will fall back to alternative methods.
  - `subtitleedit`: Use SubtitleEdit's RemoveTextForHI when available.
  - `filter-subs`: Use the subtitle-filter library.

- `strip_sdh`: Enable/disable automatic SDH (hearing-impaired) cue stripping. Default: `true`.

- `convert_before_strip`: When using the `filter-subs` SDH method, automatically convert subtitles to SRT format first for better compatibility. Default: `true`.

- `preserve_formatting`: Keep original subtitle tags and positioning during conversion. Default: `true`.

Example:

```yaml
subtitle:
  conversion_method: pysubs2
  sdh_method: auto
  strip_sdh: true
  convert_before_strip: true
  preserve_formatting: true
```

---

@@ -64,6 +64,9 @@ dependencies = [
     "aiohttp-swagger3>=0.9.0,<1",
     "pysubs2>=1.7.0,<2",
     "PyExecJS>=1.5.1,<2",
+    "pycountry>=24.6.1",
+    "language-data>=1.4.0",
+    "wasmtime>=41.0.0",
 ]

 [project.urls]
@@ -87,7 +90,6 @@ dev = [
     "types-requests>=2.31.0.20240406,<3",
     "isort>=5.13.2,<8",
     "ruff>=0.3.7,<0.15",
-    "unshackle",
 ]

 [tool.hatch.build.targets.wheel]
@@ -27,6 +27,7 @@ from construct import ConstError
|
|||||||
from pymediainfo import MediaInfo
|
from pymediainfo import MediaInfo
|
||||||
from pyplayready.cdm import Cdm as PlayReadyCdm
|
from pyplayready.cdm import Cdm as PlayReadyCdm
|
||||||
from pyplayready.device import Device as PlayReadyDevice
|
from pyplayready.device import Device as PlayReadyDevice
|
||||||
|
from pyplayready.remote.remotecdm import RemoteCdm as PlayReadyRemoteCdm
|
||||||
from pywidevine.cdm import Cdm as WidevineCdm
|
from pywidevine.cdm import Cdm as WidevineCdm
|
||||||
from pywidevine.device import Device
|
from pywidevine.device import Device
|
||||||
from pywidevine.remotecdm import RemoteCdm
|
from pywidevine.remotecdm import RemoteCdm
|
||||||
@@ -46,9 +47,9 @@ from unshackle.core.config import config
 from unshackle.core.console import console
 from unshackle.core.constants import DOWNLOAD_LICENCE_ONLY, AnyTrack, context_settings
 from unshackle.core.credential import Credential
-from unshackle.core.drm import DRM_T, PlayReady, Widevine
+from unshackle.core.drm import DRM_T, MonaLisa, PlayReady, Widevine
 from unshackle.core.events import events
-from unshackle.core.proxies import Basic, Hola, NordVPN, SurfsharkVPN, WindscribeVPN
+from unshackle.core.proxies import Basic, Gluetun, Hola, NordVPN, SurfsharkVPN, WindscribeVPN
 from unshackle.core.service import Service
 from unshackle.core.services import Services
 from unshackle.core.title_cacher import get_account_hash
@@ -60,8 +61,8 @@ from unshackle.core.tracks.hybrid import Hybrid
 from unshackle.core.utilities import (find_font_with_fallbacks, get_debug_logger, get_system_fonts, init_debug_logger,
                                       is_close_match, suggest_font_packages, time_elapsed_since)
 from unshackle.core.utils import tags
-from unshackle.core.utils.click_types import (LANGUAGE_RANGE, QUALITY_LIST, SEASON_RANGE, ContextData, MultipleChoice,
-                                              SubtitleCodecChoice, VideoCodecChoice)
+from unshackle.core.utils.click_types import (AUDIO_CODEC_LIST, LANGUAGE_RANGE, QUALITY_LIST, SEASON_RANGE,
+                                              ContextData, MultipleChoice, SubtitleCodecChoice, VideoCodecChoice)
 from unshackle.core.utils.collections import merge_dict
 from unshackle.core.utils.subprocess import ffprobe
 from unshackle.core.vaults import Vaults
@@ -97,11 +98,7 @@ class dl:
         return None

     def prepare_temp_font(
-        self,
-        font_name: str,
-        matched_font: Path,
-        system_fonts: dict[str, Path],
-        temp_font_files: list[Path]
+        self, font_name: str, matched_font: Path, system_fonts: dict[str, Path], temp_font_files: list[Path]
     ) -> Path:
         """
         Copy system font to temp and log if using fallback.
@@ -116,10 +113,7 @@ class dl:
             Path to temp font file
         """
         # Find the matched name for logging
-        matched_name = next(
-            (name for name, path in system_fonts.items() if path == matched_font),
-            None
-        )
+        matched_name = next((name for name, path in system_fonts.items() if path == matched_font), None)

         if matched_name and matched_name.lower() != font_name.lower():
             self.log.info(f"Using '{matched_name}' as fallback for '{font_name}'")
@@ -136,10 +130,7 @@ class dl:
         return temp_path

     def attach_subtitle_fonts(
-        self,
-        font_names: list[str],
-        title: Title_T,
-        temp_font_files: list[Path]
+        self, font_names: list[str], title: Title_T, temp_font_files: list[Path]
     ) -> tuple[int, list[str]]:
         """
         Attach fonts for subtitle rendering.
@@ -188,6 +179,99 @@ class dl:
             self.log.info(f"  $ sudo apt install {package_cmd}")
             self.log.info(f"  → Provides: {', '.join(fonts)}")

+    def generate_sidecar_subtitle_path(
+        self,
+        subtitle: Subtitle,
+        base_filename: str,
+        output_dir: Path,
+        target_codec: Optional[Subtitle.Codec] = None,
+        source_path: Optional[Path] = None,
+    ) -> Path:
+        """Generate sidecar path: {base}.{lang}[.forced][.sdh].{ext}"""
+        lang_suffix = str(subtitle.language) if subtitle.language else "und"
+        forced_suffix = ".forced" if subtitle.forced else ""
+        sdh_suffix = ".sdh" if (subtitle.sdh or subtitle.cc) else ""
+
+        extension = (target_codec or subtitle.codec or Subtitle.Codec.SubRip).extension
+        if (
+            not target_codec
+            and not subtitle.codec
+            and source_path
+            and source_path.suffix
+        ):
+            extension = source_path.suffix.lstrip(".")
+
+        filename = f"{base_filename}.{lang_suffix}{forced_suffix}{sdh_suffix}.{extension}"
+        return output_dir / filename
+
+    def output_subtitle_sidecars(
+        self,
+        subtitles: list[Subtitle],
+        base_filename: str,
+        output_dir: Path,
+        sidecar_format: str,
+        original_paths: Optional[dict[str, Path]] = None,
+    ) -> list[Path]:
+        """Output subtitles as sidecar files, converting if needed."""
+        created_paths: list[Path] = []
+        config.directories.temp.mkdir(parents=True, exist_ok=True)
+
+        for subtitle in subtitles:
+            source_path = subtitle.path
+            if sidecar_format == "original" and original_paths and subtitle.id in original_paths:
+                source_path = original_paths[subtitle.id]
+
+            if not source_path or not source_path.exists():
+                continue
+
+            # Determine target codec
+            if sidecar_format == "original":
+                target_codec = None
+                if source_path.suffix:
+                    try:
+                        target_codec = Subtitle.Codec.from_mime(source_path.suffix.lstrip("."))
+                    except ValueError:
+                        target_codec = None
+            else:
+                target_codec = Subtitle.Codec.from_mime(sidecar_format)
+
+            sidecar_path = self.generate_sidecar_subtitle_path(
+                subtitle, base_filename, output_dir, target_codec, source_path=source_path
+            )
+
+            # Copy or convert
+            if not target_codec or subtitle.codec == target_codec:
+                shutil.copy2(source_path, sidecar_path)
+            else:
+                # Create temp copy for conversion to preserve original
+                temp_path = config.directories.temp / f"sidecar_{subtitle.id}{source_path.suffix}"
+                shutil.copy2(source_path, temp_path)
+
+                temp_sub = Subtitle(
+                    subtitle.url,
+                    subtitle.language,
+                    is_original_lang=subtitle.is_original_lang,
+                    descriptor=subtitle.descriptor,
+                    codec=subtitle.codec,
+                    forced=subtitle.forced,
+                    sdh=subtitle.sdh,
+                    cc=subtitle.cc,
+                    id_=f"{subtitle.id}_sc",
+                )
+                temp_sub.path = temp_path
+                try:
+                    temp_sub.convert(target_codec)
+                    if temp_sub.path and temp_sub.path.exists():
+                        shutil.copy2(temp_sub.path, sidecar_path)
+                finally:
+                    if temp_sub.path and temp_sub.path.exists():
+                        temp_sub.path.unlink(missing_ok=True)
+                    temp_path.unlink(missing_ok=True)
+
+            created_paths.append(sidecar_path)
+
+        return created_paths
+

 @click.command(
     short_help="Download, Decrypt, and Mux tracks for titles from a Service.",
     cls=Services,
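
The sidecar naming scheme these new methods implement is easy to verify in isolation. A minimal sketch, re-implementing only the `{base}.{lang}[.forced][.sdh].{ext}` pattern with plain values (no unshackle types):

```python
def sidecar_name(base: str, lang: str | None, forced: bool, sdh: bool, ext: str) -> str:
    """Mirror of the naming pattern above, using plain arguments."""
    lang_suffix = lang or "und"
    forced_suffix = ".forced" if forced else ""
    sdh_suffix = ".sdh" if sdh else ""
    return f"{base}.{lang_suffix}{forced_suffix}{sdh_suffix}.{ext}"

print(sidecar_name("Show.S01E01", "en", False, True, "srt"))
# Show.S01E01.en.sdh.srt
```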
@@ -213,9 +297,9 @@ class dl:
 @click.option(
     "-a",
     "--acodec",
-    type=click.Choice(Audio.Codec, case_sensitive=False),
-    default=None,
-    help="Audio Codec to download, defaults to any codec.",
+    type=AUDIO_CODEC_LIST,
+    default=[],
+    help="Audio Codec(s) to download (comma-separated), e.g., 'AAC,EC3'. Defaults to any.",
 )
 @click.option(
     "-vb",
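
`AUDIO_CODEC_LIST` itself is not shown in this diff, but the help text implies a click parameter type that splits a comma-separated string into a list of codecs. A hypothetical stand-in with that behaviour (the class name and error wording are assumptions, not the real implementation):

```python
import click


class CommaSeparatedChoice(click.ParamType):
    """Hypothetical stand-in for AUDIO_CODEC_LIST: parses 'AAC,EC3' into a list."""

    name = "codec_list"

    def __init__(self, choices: list[str]):
        self.choices = choices

    def convert(self, value, param, ctx):
        if isinstance(value, list):  # already converted (e.g., the [] default)
            return value
        items = [v.strip().upper() for v in str(value).split(",") if v.strip()]
        for item in items:
            if item not in self.choices:
                self.fail(f"{item!r} is not one of: {', '.join(self.choices)}", param, ctx)
        return items
```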
@@ -254,6 +338,13 @@ class dl:
     default=False,
     help="Exclude Dolby Atmos audio tracks when selecting audio.",
 )
+@click.option(
+    "--split-audio",
+    "split_audio",
+    is_flag=True,
+    default=None,
+    help="Create separate output files per audio codec instead of merging all audio.",
+)
 @click.option(
     "-w",
     "--wanted",
@@ -261,13 +352,6 @@ class dl:
     default=None,
     help="Wanted episodes, e.g. `S01-S05,S07`, `S01E01-S02E03`, `S02-S02E03`, e.t.c, defaults to all.",
 )
-@click.option(
-    "-le",
-    "--latest-episode",
-    is_flag=True,
-    default=False,
-    help="Download only the single most recent episode available.",
-)
 @click.option(
     "-l",
     "--lang",
@@ -275,6 +359,12 @@ class dl:
     default="orig",
     help="Language wanted for Video and Audio. Use 'orig' to select the original language, e.g. 'orig,en' for both original and English.",
 )
+@click.option(
+    "--latest-episode",
+    is_flag=True,
+    default=False,
+    help="Download only the single most recent episode available.",
+)
 @click.option(
     "-vl",
     "--v-lang",
@@ -638,12 +728,17 @@ class dl:
                 "device_type": self.cdm.device_type.name,
             }
         else:
-            self.log.info(
-                f"Loaded PlayReady CDM: {self.cdm.certificate_chain.get_name()} (L{self.cdm.security_level})"
-            )
+            # Handle both local PlayReady CDM and RemoteCdm (which has certificate_chain=None)
+            is_remote = self.cdm.certificate_chain is None and hasattr(self.cdm, "device_name")
+            if is_remote:
+                cdm_name = self.cdm.device_name
+                self.log.info(f"Loaded PlayReady Remote CDM: {cdm_name} (L{self.cdm.security_level})")
+            else:
+                cdm_name = self.cdm.certificate_chain.get_name() if self.cdm.certificate_chain else "Unknown"
+                self.log.info(f"Loaded PlayReady CDM: {cdm_name} (L{self.cdm.security_level})")
             cdm_info = {
                 "type": "PlayReady",
-                "certificate": self.cdm.certificate_chain.get_name(),
+                "certificate": cdm_name,
                 "security_level": self.cdm.security_level,
             }

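
The remote-CDM detection above relies only on object shape, so it can be sanity-checked without pyplayready. A minimal sketch with stand-in classes (both class names here are assumptions for illustration):

```python
class LocalCdmStub:
    """Local CDMs carry a certificate chain."""
    certificate_chain = object()


class RemoteCdmStub:
    """Remote CDMs expose a device_name but no certificate chain."""
    certificate_chain = None
    device_name = "my_remote_device"


for cdm in (LocalCdmStub(), RemoteCdmStub()):
    is_remote = cdm.certificate_chain is None and hasattr(cdm, "device_name")
    print(type(cdm).__name__, "->", "remote" if is_remote else "local")
```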
@@ -665,6 +760,8 @@ class dl:
             self.proxy_providers.append(SurfsharkVPN(**config.proxy_providers["surfsharkvpn"]))
         if config.proxy_providers.get("windscribevpn"):
             self.proxy_providers.append(WindscribeVPN(**config.proxy_providers["windscribevpn"]))
+        if config.proxy_providers.get("gluetun"):
+            self.proxy_providers.append(Gluetun(**config.proxy_providers["gluetun"]))
         if binaries.HolaProxy:
             self.proxy_providers.append(Hola())
         for proxy_provider in self.proxy_providers:
@@ -675,9 +772,17 @@ class dl:
                if re.match(r"^[a-z]+:.+$", proxy, re.IGNORECASE):
                    # requesting proxy from a specific proxy provider
                    requested_provider, proxy = proxy.split(":", maxsplit=1)
-                if re.match(r"^[a-z]{2}(?:\d+)?$", proxy, re.IGNORECASE):
+                # Match simple region codes (us, ca, uk1) or provider:region format (nordvpn:ca, windscribe:us)
+                if re.match(r"^[a-z]{2}(?:\d+)?$", proxy, re.IGNORECASE) or re.match(
+                    r"^[a-z]+:[a-z]{2}(?:\d+)?$", proxy, re.IGNORECASE
+                ):
                    proxy = proxy.lower()
-                    with console.status(f"Getting a Proxy to {proxy}...", spinner="dots"):
+                    status_msg = (
+                        f"Connecting to VPN ({proxy})..."
+                        if requested_provider == "gluetun"
+                        else f"Getting a Proxy to {proxy}..."
+                    )
+                    with console.status(status_msg, spinner="dots"):
                        if requested_provider:
                            proxy_provider = next(
                                (x for x in self.proxy_providers if x.__class__.__name__.lower() == requested_provider),
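
The widened region check accepts two query shapes. A quick standalone test of the same two patterns:

```python
import re

simple = re.compile(r"^[a-z]{2}(?:\d+)?$", re.IGNORECASE)        # us, ca, uk1
scoped = re.compile(r"^[a-z]+:[a-z]{2}(?:\d+)?$", re.IGNORECASE)  # nordvpn:ca

for query in ("us", "uk1", "nordvpn:ca", "windscribe:us", "http://1.2.3.4:8080"):
    matched = bool(simple.match(query) or scoped.match(query))
    print(f"{query!r}: {'region query' if matched else 'treated as explicit proxy'}")
```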
@@ -686,21 +791,49 @@ class dl:
                            if not proxy_provider:
                                self.log.error(f"The proxy provider '{requested_provider}' was not recognised.")
                                sys.exit(1)
+                            proxy_query = proxy  # Save query before overwriting with URI
                            proxy_uri = proxy_provider.get_proxy(proxy)
                            if not proxy_uri:
                                self.log.error(f"The proxy provider {requested_provider} had no proxy for {proxy}")
                                sys.exit(1)
                            proxy = ctx.params["proxy"] = proxy_uri
-                            self.log.info(f"Using {proxy_provider.__class__.__name__} Proxy: {proxy}")
+                            # Show connection info for Gluetun (IP, location) instead of proxy URL
+                            if hasattr(proxy_provider, "get_connection_info"):
+                                conn_info = proxy_provider.get_connection_info(proxy_query)
+                                if conn_info and conn_info.get("public_ip"):
+                                    location_parts = [conn_info.get("city"), conn_info.get("country")]
+                                    location = ", ".join(p for p in location_parts if p)
+                                    self.log.info(f"VPN Connected: {conn_info['public_ip']} ({location})")
+                                else:
+                                    self.log.info(f"Using {proxy_provider.__class__.__name__} Proxy: {proxy}")
+                            else:
+                                self.log.info(f"Using {proxy_provider.__class__.__name__} Proxy: {proxy}")
                        else:
                            for proxy_provider in self.proxy_providers:
+                                proxy_query = proxy  # Save query before overwriting with URI
                                proxy_uri = proxy_provider.get_proxy(proxy)
                                if proxy_uri:
                                    proxy = ctx.params["proxy"] = proxy_uri
-                                    self.log.info(f"Using {proxy_provider.__class__.__name__} Proxy: {proxy}")
+                                    # Show connection info for Gluetun (IP, location) instead of proxy URL
+                                    if hasattr(proxy_provider, "get_connection_info"):
+                                        conn_info = proxy_provider.get_connection_info(proxy_query)
+                                        if conn_info and conn_info.get("public_ip"):
+                                            location_parts = [conn_info.get("city"), conn_info.get("country")]
+                                            location = ", ".join(p for p in location_parts if p)
+                                            self.log.info(f"VPN Connected: {conn_info['public_ip']} ({location})")
+                                        else:
+                                            self.log.info(f"Using {proxy_provider.__class__.__name__} Proxy: {proxy}")
+                                    else:
+                                        self.log.info(f"Using {proxy_provider.__class__.__name__} Proxy: {proxy}")
                                    break
+                    # Store proxy query info for service-specific overrides
+                    ctx.params["proxy_query"] = proxy
+                    ctx.params["proxy_provider"] = requested_provider
                else:
                    self.log.info(f"Using explicit Proxy: {proxy}")
+                    # For explicit proxies, store None for query/provider
+                    ctx.params["proxy_query"] = None
+                    ctx.params["proxy_provider"] = None

        ctx.obj = ContextData(
            config=self.service_config, cdm=self.cdm, proxy_providers=self.proxy_providers, profile=self.profile
@@ -718,7 +851,7 @@ class dl:
        service: Service,
        quality: list[int],
        vcodec: Optional[Video.Codec],
-        acodec: Optional[Audio.Codec],
+        acodec: list[Audio.Codec],
        vbitrate: int,
        abitrate: int,
        range_: list[Video.Range],
@@ -756,6 +889,7 @@ class dl:
        workers: Optional[int],
        downloads: int,
        best_available: bool,
+        split_audio: Optional[bool] = None,
        *_: Any,
        **__: Any,
    ) -> None:
@@ -763,6 +897,15 @@ class dl:
        self.search_source = None
        start_time = time.time()

+        if not acodec:
+            acodec = []
+        elif isinstance(acodec, Audio.Codec):
+            acodec = [acodec]
+        elif isinstance(acodec, str) or (
+            isinstance(acodec, list) and not all(isinstance(v, Audio.Codec) for v in acodec)
+        ):
+            acodec = AUDIO_CODEC_LIST.convert(acodec)
+
        if require_subs and s_lang != ["all"]:
            self.log.error("--require-subs and --s-lang cannot be used together")
            sys.exit(1)
@@ -1059,7 +1202,9 @@ class dl:
                    title.tracks.add(non_sdh_sub)
                    events.subscribe(
                        events.Types.TRACK_MULTIPLEX,
-                        lambda track, sub_id=non_sdh_sub.id: (track.strip_hearing_impaired()) if track.id == sub_id else None,
+                        lambda track, sub_id=non_sdh_sub.id: (track.strip_hearing_impaired())
+                        if track.id == sub_id
+                        else None,
                    )

        with console.status("Sorting tracks by language and bitrate...", spinner="dots"):
@@ -1272,9 +1417,10 @@ class dl:
            if not audio_description:
                title.tracks.select_audio(lambda x: not x.descriptive)  # exclude descriptive audio
            if acodec:
-                title.tracks.select_audio(lambda x: x.codec == acodec)
+                title.tracks.select_audio(lambda x: x.codec in acodec)
                if not title.tracks.audio:
-                    self.log.error(f"There's no {acodec.name} Audio Tracks...")
+                    codec_names = ", ".join(c.name for c in acodec)
+                    self.log.error(f"No audio tracks matching codecs: {codec_names}")
                    sys.exit(1)
            if channels:
                title.tracks.select_audio(lambda x: math.ceil(x.channels) == math.ceil(channels))
@@ -1313,15 +1459,27 @@ class dl:
            if "best" in processed_lang:
                unique_languages = {track.language for track in title.tracks.audio}
                selected_audio = []
-                for language in unique_languages:
-                    highest_quality = max(
-                        (track for track in title.tracks.audio if track.language == language),
-                        key=lambda x: x.bitrate or 0,
-                    )
-                    selected_audio.append(highest_quality)
+                if acodec and len(acodec) > 1:
+                    for language in unique_languages:
+                        for codec in acodec:
+                            candidates = [
+                                track
+                                for track in title.tracks.audio
+                                if track.language == language and track.codec == codec
+                            ]
+                            if not candidates:
+                                continue
+                            selected_audio.append(max(candidates, key=lambda x: x.bitrate or 0))
+                else:
+                    for language in unique_languages:
+                        highest_quality = max(
+                            (track for track in title.tracks.audio if track.language == language),
+                            key=lambda x: x.bitrate or 0,
+                        )
+                        selected_audio.append(highest_quality)
                title.tracks.audio = selected_audio
            elif "all" not in processed_lang:
-                per_language = 1
+                per_language = 0 if acodec and len(acodec) > 1 else 1
                title.tracks.audio = title.tracks.by_language(
                    title.tracks.audio, processed_lang, per_language=per_language, exact_match=exact_lang
                )
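
With multiple codecs requested, the selection above keeps the highest-bitrate track per (language, codec) pair rather than one per language. The same logic on plain tuples (the tuple shape here is an assumption standing in for unshackle's Audio tracks):

```python
tracks = [
    ("en", "AAC", 128), ("en", "AAC", 256), ("en", "EC3", 640), ("de", "AAC", 192),
]
acodec = ["AAC", "EC3"]
languages = {lang for lang, _, _ in tracks}

selected = []
for language in sorted(languages):
    for codec in acodec:
        candidates = [t for t in tracks if t[0] == language and t[1] == codec]
        if not candidates:
            continue  # e.g., no German EC3 track exists
        selected.append(max(candidates, key=lambda t: t[2]))

print(selected)  # [('de', 'AAC', 192), ('en', 'AAC', 256), ('en', 'EC3', 640)]
```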
@@ -1329,7 +1487,16 @@ class dl:
                    self.log.error(f"There's no {processed_lang} Audio Track, cannot continue...")
                    sys.exit(1)

-        if video_only or audio_only or subs_only or chapters_only or no_subs or no_audio or no_chapters or no_video:
+        if (
+            video_only
+            or audio_only
+            or subs_only
+            or chapters_only
+            or no_subs
+            or no_audio
+            or no_chapters
+            or no_video
+        ):
            keep_videos = False
            keep_audio = False
            keep_subtitles = False
@@ -1552,6 +1719,25 @@ class dl:
                    break
                video_track_n += 1

+        # Subtitle output mode configuration (for sidecar originals)
+        subtitle_output_mode = config.subtitle.get("output_mode", "mux")
+        sidecar_format = config.subtitle.get("sidecar_format", "srt")
+        skip_subtitle_mux = (
+            subtitle_output_mode == "sidecar" and (title.tracks.videos or title.tracks.audio)
+        )
+        sidecar_subtitles: list[Subtitle] = []
+        sidecar_original_paths: dict[str, Path] = {}
+        if subtitle_output_mode in ("sidecar", "both") and not no_mux:
+            sidecar_subtitles = [s for s in title.tracks.subtitles if s.path and s.path.exists()]
+            if sidecar_format == "original":
+                config.directories.temp.mkdir(parents=True, exist_ok=True)
+                for subtitle in sidecar_subtitles:
+                    original_path = (
+                        config.directories.temp / f"sidecar_original_{subtitle.id}{subtitle.path.suffix}"
+                    )
+                    shutil.copy2(subtitle.path, original_path)
+                    sidecar_original_paths[subtitle.id] = original_path
+
        with console.status("Converting Subtitles..."):
            for subtitle in title.tracks.subtitles:
                if sub_format:
@@ -1569,9 +1755,7 @@ class dl:
                        if line.startswith("Style: "):
                            font_names.append(line.removeprefix("Style: ").split(",")[1].strip())

-            font_count, missing_fonts = self.attach_subtitle_fonts(
-                font_names, title, temp_font_files
-            )
+            font_count, missing_fonts = self.attach_subtitle_fonts(font_names, title, temp_font_files)

            if font_count:
                self.log.info(f"Attached {font_count} fonts for the Subtitles")
@@ -1592,7 +1776,8 @@ class dl:
                    drm = track.get_drm_for_cdm(self.cdm)
                    if drm and hasattr(drm, "decrypt"):
                        drm.decrypt(track.path)
-                        has_decrypted = True
+                        if not isinstance(drm, MonaLisa):
+                            has_decrypted = True
                        events.emit(events.Types.TRACK_REPACKED, track=track)
                    else:
                        self.log.warning(
@@ -1614,6 +1799,7 @@ class dl:
            self.log.info("Repacked one or more tracks with FFMPEG")

        muxed_paths = []
+        muxed_audio_codecs: dict[Path, Optional[Audio.Codec]] = {}

        if no_mux:
            # Skip muxing, handle individual track files
@@ -1630,7 +1816,40 @@ class dl:
                console=console,
            )

-            multiplex_tasks: list[tuple[TaskID, Tracks]] = []
+            if split_audio is not None:
+                merge_audio = not split_audio
+            else:
+                merge_audio = config.muxing.get("merge_audio", True)
+
+            multiplex_tasks: list[tuple[TaskID, Tracks, Optional[Audio.Codec]]] = []
+
+            def clone_tracks_for_audio(base_tracks: Tracks, audio_tracks: list[Audio]) -> Tracks:
+                task_tracks = Tracks()
+                task_tracks.videos = list(base_tracks.videos)
+                task_tracks.audio = audio_tracks
+                task_tracks.subtitles = list(base_tracks.subtitles)
+                task_tracks.chapters = base_tracks.chapters
+                task_tracks.attachments = list(base_tracks.attachments)
+                return task_tracks
+
+            def enqueue_mux_tasks(task_description: str, base_tracks: Tracks) -> None:
+                if merge_audio or not base_tracks.audio:
+                    task_id = progress.add_task(f"{task_description}...", total=None, start=False)
+                    multiplex_tasks.append((task_id, base_tracks, None))
+                    return
+
+                audio_by_codec: dict[Optional[Audio.Codec], list[Audio]] = {}
+                for audio_track in base_tracks.audio:
+                    audio_by_codec.setdefault(audio_track.codec, []).append(audio_track)
+
+                for audio_codec, codec_audio_tracks in audio_by_codec.items():
+                    description = task_description
+                    if audio_codec:
+                        description = f"{task_description} {audio_codec.name}"
+
+                    task_id = progress.add_task(f"{description}...", total=None, start=False)
+                    task_tracks = clone_tracks_for_audio(base_tracks, codec_audio_tracks)
+                    multiplex_tasks.append((task_id, task_tracks, audio_codec))

            # Check if we're in hybrid mode
            if any(r == Video.Range.HYBRID for r in range_) and title.tracks.videos:
@@ -1670,11 +1889,8 @@ class dl:
                    if default_output.exists():
                        shutil.move(str(default_output), str(hybrid_output_path))

-                    # Create a mux task for this resolution
-                    task_description = f"Multiplexing Hybrid HDR10+DV {resolution}p"
-                    task_id = progress.add_task(f"{task_description}...", total=None, start=False)
-
                    # Create tracks with the hybrid video output for this resolution
+                    task_description = f"Multiplexing Hybrid HDR10+DV {resolution}p"
                    task_tracks = Tracks(title.tracks) + title.tracks.chapters + title.tracks.attachments

                    # Create a new video track for the hybrid output
@@ -1684,7 +1900,7 @@ class dl:
                    hybrid_track.needs_duration_fix = True
                    task_tracks.videos = [hybrid_track]

-                    multiplex_tasks.append((task_id, task_tracks))
+                    enqueue_mux_tasks(task_description, task_tracks)

                console.print()
            else:
@@ -1697,16 +1913,15 @@ class dl:
                    if len(range_) > 1:
                        task_description += f" {video_track.range.name}"

-                    task_id = progress.add_task(f"{task_description}...", total=None, start=False)
-
                    task_tracks = Tracks(title.tracks) + title.tracks.chapters + title.tracks.attachments
                    if video_track:
                        task_tracks.videos = [video_track]

-                    multiplex_tasks.append((task_id, task_tracks))
+                    enqueue_mux_tasks(task_description, task_tracks)

            with Live(Padding(progress, (0, 5, 1, 5)), console=console):
-                for task_id, task_tracks in multiplex_tasks:
+                mux_index = 0
+                for task_id, task_tracks, audio_codec in multiplex_tasks:
                    progress.start_task(task_id)  # TODO: Needed?
                    audio_expected = not video_only and not no_audio
                    muxed_path, return_code, errors = task_tracks.mux(
@@ -1715,8 +1930,18 @@ class dl:
                        delete=False,
                        audio_expected=audio_expected,
                        title_language=title.language,
+                        skip_subtitles=skip_subtitle_mux,
                    )
+                    if muxed_path.exists():
+                        mux_index += 1
+                        unique_path = muxed_path.with_name(
+                            f"{muxed_path.stem}.{mux_index}{muxed_path.suffix}"
+                        )
+                        if unique_path != muxed_path:
+                            shutil.move(muxed_path, unique_path)
+                            muxed_path = unique_path
                    muxed_paths.append(muxed_path)
+                    muxed_audio_codecs[muxed_path] = audio_codec
                    if return_code >= 2:
                        self.log.error(f"Failed to Mux video to Matroska file ({return_code}):")
                    elif return_code == 1 or errors:
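
The uniquifying step above renames every muxed output with a running index so per-codec outputs of the same title cannot collide. Standalone, the pathlib call behaves like this:

```python
from pathlib import Path

muxed_path = Path("Show.S01E01.1080p.WEB-DL.mkv")
mux_index = 2  # second mux task for this title
unique_path = muxed_path.with_name(f"{muxed_path.stem}.{mux_index}{muxed_path.suffix}")
print(unique_path)  # Show.S01E01.1080p.WEB-DL.2.mkv
```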
@@ -1728,8 +1953,31 @@ class dl:
                            self.log.warning(line)
                    if return_code >= 2:
                        sys.exit(1)
-                for video_track in task_tracks.videos:
-                    video_track.delete()
+
+                # Output sidecar subtitles before deleting track files
+                if sidecar_subtitles and not no_mux:
+                    media_info = MediaInfo.parse(muxed_paths[0]) if muxed_paths else None
+                    if media_info:
+                        base_filename = title.get_filename(media_info, show_service=not no_source)
+                    else:
+                        base_filename = str(title)
+
+                    sidecar_dir = config.directories.downloads
+                    if not no_folder and isinstance(title, (Episode, Song)) and media_info:
+                        sidecar_dir /= title.get_filename(media_info, show_service=not no_source, folder=True)
+                    sidecar_dir.mkdir(parents=True, exist_ok=True)
+
+                    with console.status("Saving subtitle sidecar files..."):
+                        created = self.output_subtitle_sidecars(
+                            sidecar_subtitles,
+                            base_filename,
+                            sidecar_dir,
+                            sidecar_format,
+                            original_paths=sidecar_original_paths or None,
+                        )
+                        if created:
+                            self.log.info(f"Saved {len(created)} sidecar subtitle files")
+
                for track in title.tracks:
                    track.delete()

@@ -1743,6 +1991,8 @@ class dl:
                # Clean up temp fonts
                for temp_path in temp_font_files:
                    temp_path.unlink(missing_ok=True)
+                for temp_path in sidecar_original_paths.values():
+                    temp_path.unlink(missing_ok=True)

        else:
            # dont mux
@@ -1804,6 +2054,9 @@ class dl:
            media_info = MediaInfo.parse(muxed_path)
            final_dir = config.directories.downloads
            final_filename = title.get_filename(media_info, show_service=not no_source)
+            audio_codec_suffix = muxed_audio_codecs.get(muxed_path)
+            if audio_codec_suffix:
+                final_filename = f"{final_filename}.{audio_codec_suffix.name}"

            if not no_folder and isinstance(title, (Episode, Song)):
                final_dir /= title.get_filename(media_info, show_service=not no_source, folder=True)
@@ -2208,6 +2461,26 @@ class dl:

                export.write_text(jsonpickle.dumps(keys, indent=4), encoding="utf8")

+        elif isinstance(drm, MonaLisa):
+            with self.DRM_TABLE_LOCK:
+                display_id = drm.content_id or drm.pssh
+                pssh_display = self.truncate_pssh_for_display(display_id, "MonaLisa")
+                cek_tree = Tree(Text.assemble(("MonaLisa", "cyan"), (f"({pssh_display})", "text"), overflow="fold"))
+                pre_existing_tree = next(
+                    (x for x in table.columns[0].cells if isinstance(x, Tree) and x.label == cek_tree.label), None
+                )
+                if pre_existing_tree:
+                    cek_tree = pre_existing_tree
+
+                for kid_, key in drm.content_keys.items():
+                    label = f"[text2]{kid_.hex}:{key}"
+                    if not any(f"{kid_.hex}:{key}" in x.label for x in cek_tree.children):
+                        cek_tree.add(label)
+
+                if cek_tree.children and not pre_existing_tree:
+                    table.add_row()
+                    table.add_row(cek_tree)
+
    @staticmethod
    def get_cookie_path(service: str, profile: Optional[str]) -> Optional[Path]:
        """Get Service Cookie File Path for Profile."""
@@ -2390,14 +2663,23 @@ class dl:
                return CustomRemoteCDM(service_name=service, vaults=self.vaults, **cdm_api)
            else:
-                return RemoteCdm(
-                    device_type=cdm_api["Device Type"],
-                    system_id=cdm_api["System ID"],
-                    security_level=cdm_api["Security Level"],
-                    host=cdm_api["Host"],
-                    secret=cdm_api["Secret"],
-                    device_name=cdm_api["Device Name"],
-                )
+                device_type = cdm_api.get("Device Type", cdm_api.get("device_type", ""))
+                if str(device_type).upper() == "PLAYREADY":
+                    return PlayReadyRemoteCdm(
+                        security_level=cdm_api.get("Security Level", cdm_api.get("security_level", 3000)),
+                        host=cdm_api.get("Host", cdm_api.get("host")),
+                        secret=cdm_api.get("Secret", cdm_api.get("secret")),
+                        device_name=cdm_api.get("Device Name", cdm_api.get("device_name")),
+                    )
+                else:
+                    return RemoteCdm(
+                        device_type=cdm_api["Device Type"],
+                        system_id=cdm_api["System ID"],
+                        security_level=cdm_api["Security Level"],
+                        host=cdm_api["Host"],
+                        secret=cdm_api["Secret"],
+                        device_name=cdm_api["Device Name"],
+                    )

        prd_path = config.directories.prds / f"{cdm_name}.prd"
        if not prd_path.is_file():
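
Each `cdm_api.get("Title Case", cdm_api.get("snake_case", ...))` pair above tolerates both key spellings in the CDM API config. A small helper with the same fallback behaviour (hypothetical, not part of the codebase):

```python
from typing import Any


def cdm_api_get(cdm_api: dict[str, Any], title_key: str, default: Any = None) -> Any:
    """Look up 'Device Name' first, then its snake_case form 'device_name'."""
    snake_key = title_key.lower().replace(" ", "_")
    return cdm_api.get(title_key, cdm_api.get(snake_key, default))


print(cdm_api_get({"Host": "https://cdm.example"}, "Host"))  # https://cdm.example
print(cdm_api_get({"host": "https://cdm.example"}, "Host"))  # https://cdm.example
print(cdm_api_get({}, "Security Level", 3000))               # 3000
```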
@@ -52,6 +52,13 @@ def check() -> None:
            "desc": "DRM decryption",
            "cat": "DRM",
        },
+        {
+            "name": "ML-Worker",
+            "binary": binaries.ML_Worker,
+            "required": False,
+            "desc": "DRM licensing",
+            "cat": "DRM",
+        },
        # HDR Processing
        {"name": "dovi_tool", "binary": binaries.DoviTool, "required": False, "desc": "Dolby Vision", "cat": "HDR"},
        {
@@ -97,6 +104,7 @@ def check() -> None:
            "cat": "Network",
        },
        {"name": "Caddy", "binary": binaries.Caddy, "required": False, "desc": "Web server", "cat": "Network"},
+        {"name": "Docker", "binary": binaries.Docker, "required": False, "desc": "Gluetun VPN", "cat": "Network"},
    ]

    # Track overall status
@@ -16,7 +16,7 @@ from unshackle.core import binaries
 from unshackle.core.config import config
 from unshackle.core.console import console
 from unshackle.core.constants import context_settings
-from unshackle.core.proxies import Basic, Hola, NordVPN, SurfsharkVPN
+from unshackle.core.proxies import Basic, Gluetun, Hola, NordVPN, SurfsharkVPN, WindscribeVPN
 from unshackle.core.service import Service
 from unshackle.core.services import Services
 from unshackle.core.utils.click_types import ContextData
@@ -71,6 +71,10 @@ def search(ctx: click.Context, no_proxy: bool, profile: Optional[str] = None, pr
        proxy_providers.append(NordVPN(**config.proxy_providers["nordvpn"]))
    if config.proxy_providers.get("surfsharkvpn"):
        proxy_providers.append(SurfsharkVPN(**config.proxy_providers["surfsharkvpn"]))
+    if config.proxy_providers.get("windscribevpn"):
+        proxy_providers.append(WindscribeVPN(**config.proxy_providers["windscribevpn"]))
+    if config.proxy_providers.get("gluetun"):
+        proxy_providers.append(Gluetun(**config.proxy_providers["gluetun"]))
    if binaries.HolaProxy:
        proxy_providers.append(Hola())
    for proxy_provider in proxy_providers:
@@ -81,7 +85,8 @@ def search(ctx: click.Context, no_proxy: bool, profile: Optional[str] = None, pr
            if re.match(r"^[a-z]+:.+$", proxy, re.IGNORECASE):
                # requesting proxy from a specific proxy provider
                requested_provider, proxy = proxy.split(":", maxsplit=1)
-            if re.match(r"^[a-z]{2}(?:\d+)?$", proxy, re.IGNORECASE):
+            # Match simple region codes (us, ca, uk1) or provider:region format (nordvpn:ca, windscribe:us)
+            if re.match(r"^[a-z]{2}(?:\d+)?$", proxy, re.IGNORECASE) or re.match(r"^[a-z]+:[a-z]{2}(?:\d+)?$", proxy, re.IGNORECASE):
                proxy = proxy.lower()
                with console.status(f"Getting a Proxy to {proxy}...", spinner="dots"):
                    if requested_provider:
@@ -11,12 +11,17 @@ from unshackle.core.constants import context_settings


 @click.command(
-    short_help="Serve your Local Widevine Devices and REST API for Remote Access.", context_settings=context_settings
+    short_help="Serve your Local Widevine/PlayReady Devices and REST API for Remote Access.",
+    context_settings=context_settings,
 )
-@click.option("-h", "--host", type=str, default="0.0.0.0", help="Host to serve from.")
+@click.option("-h", "--host", type=str, default="127.0.0.1", help="Host to serve from.")
 @click.option("-p", "--port", type=int, default=8786, help="Port to serve from.")
 @click.option("--caddy", is_flag=True, default=False, help="Also serve with Caddy.")
-@click.option("--api-only", is_flag=True, default=False, help="Serve only the REST API, not pywidevine CDM.")
+@click.option(
+    "--api-only", is_flag=True, default=False, help="Serve only the REST API, not pywidevine/pyplayready CDM."
+)
+@click.option("--no-widevine", is_flag=True, default=False, help="Disable Widevine CDM endpoints.")
+@click.option("--no-playready", is_flag=True, default=False, help="Disable PlayReady CDM endpoints.")
 @click.option("--no-key", is_flag=True, default=False, help="Disable API key authentication (allows all requests).")
 @click.option(
     "--debug-api",
@@ -24,13 +29,30 @@ from unshackle.core.constants import context_settings
     default=False,
     help="Include technical debug information (tracebacks, stderr) in API error responses.",
 )
-def serve(host: str, port: int, caddy: bool, api_only: bool, no_key: bool, debug_api: bool) -> None:
+@click.option(
+    "--debug",
+    is_flag=True,
+    default=False,
+    help="Enable debug logging for API operations.",
+)
+def serve(
+    host: str,
+    port: int,
+    caddy: bool,
+    api_only: bool,
+    no_widevine: bool,
+    no_playready: bool,
+    no_key: bool,
+    debug_api: bool,
+    debug: bool,
+) -> None:
     """
-    Serve your Local Widevine Devices and REST API for Remote Access.
+    Serve your Local Widevine and PlayReady Devices and REST API for Remote Access.

     \b
-    Host as 127.0.0.1 may block remote access even if port-forwarded.
-    Instead, use 0.0.0.0 and ensure the TCP port you choose is forwarded.
+    CDM ENDPOINTS:
+    - Widevine: /{device}/open, /{device}/close/{session_id}, etc.
+    - PlayReady: /playready/{device}/open, /playready/{device}/close/{session_id}, etc.

     \b
     You may serve with Caddy at the same time with --caddy. You can use Caddy
@@ -38,14 +60,31 @@ def serve(host: str, port: int, caddy: bool, api_only: bool, no_key: bool, debug
     next to the unshackle config.

     \b
-    The REST API provides programmatic access to unshackle functionality.
-    Configure authentication in your config under serve.users and serve.api_secret.
+    DEVICE CONFIGURATION:
+    WVD files are auto-loaded from the WVDs directory, PRD files from the PRDs directory.
+    Configure user access in unshackle.yaml:
+
+    \b
+    serve:
+      api_secret: "your-api-secret"
+      users:
+        your-secret-key:
+          devices: ["device_name"]  # Widevine devices
+          playready_devices: ["device_name"]  # PlayReady devices
+          username: user
     """
+    from pyplayready.remote import serve as pyplayready_serve
     from pywidevine import serve as pywidevine_serve

     log = logging.getLogger("serve")

-    # Validate API secret for REST API routes (unless --no-key is used)
+    if debug:
+        logging.basicConfig(level=logging.DEBUG, format="%(name)s - %(levelname)s - %(message)s")
+        log.info("Debug logging enabled for API operations")
+    else:
+        logging.getLogger("api").setLevel(logging.WARNING)
+        logging.getLogger("api.remote").setLevel(logging.WARNING)
+
     if not no_key:
         api_secret = config.serve.get("api_secret")
         if not api_secret:
@@ -59,6 +98,9 @@ def serve(host: str, port: int, caddy: bool, api_only: bool, no_key: bool, debug
     if debug_api:
         log.warning("Running with --debug-api: Error responses will include technical debug information!")

+    if api_only and (no_widevine or no_playready):
+        raise click.ClickException("Cannot use --api-only with --no-widevine or --no-playready.")
+
     if caddy:
         if not binaries.Caddy:
             raise click.ClickException('Caddy executable "caddy" not found but is required for --caddy.')
@@ -73,9 +115,12 @@ def serve(host: str, port: int, caddy: bool, api_only: bool, no_key: bool, debug
             config.serve["devices"] = []
         config.serve["devices"].extend(list(config.directories.wvds.glob("*.wvd")))

+    if not config.serve.get("playready_devices"):
+        config.serve["playready_devices"] = []
+        config.serve["playready_devices"].extend(list(config.directories.prds.glob("*.prd")))
+
     if api_only:
-        # API-only mode: serve just the REST API
-        log.info("Starting REST API server (pywidevine CDM disabled)")
+        log.info("Starting REST API server (pywidevine/pyplayready CDM disabled)")
         if no_key:
             app = web.Application(middlewares=[cors_middleware])
             app["config"] = {"users": []}
@@ -90,35 +135,108 @@ def serve(host: str, port: int, caddy: bool, api_only: bool, no_key: bool, debug
         log.info("(Press CTRL+C to quit)")
         web.run_app(app, host=host, port=port, print=None)
     else:
-        # Integrated mode: serve both pywidevine + REST API
-        log.info("Starting integrated server (pywidevine CDM + REST API)")
+        serve_widevine = not no_widevine
+        serve_playready = not no_playready
+
+        serve_config = dict(config.serve)
+        wvd_devices = serve_config.get("devices", []) if serve_widevine else []
+        prd_devices = serve_config.get("playready_devices", []) if serve_playready else []
+
+        cdm_parts = []
+        if serve_widevine:
+            cdm_parts.append("pywidevine CDM")
+        if serve_playready:
+            cdm_parts.append("pyplayready CDM")
+        log.info(f"Starting integrated server ({' + '.join(cdm_parts)} + REST API)")
+
+        wvd_device_names = [d.stem if hasattr(d, "stem") else str(d) for d in wvd_devices]
+        prd_device_names = [d.stem if hasattr(d, "stem") else str(d) for d in prd_devices]
+
+        if not serve_config.get("users") or not isinstance(serve_config["users"], dict):
+            serve_config["users"] = {}
+
+        if not no_key and api_secret not in serve_config["users"]:
+            serve_config["users"][api_secret] = {
+                "devices": wvd_device_names,
+                "playready_devices": prd_device_names,
+                "username": "api_user",
+            }
+
+        for user_key, user_config in serve_config["users"].items():
+            if "playready_devices" not in user_config:
+                user_config["playready_devices"] = prd_device_names
+
+        def create_serve_authentication(serve_playready_flag: bool):
+            @web.middleware
+            async def serve_authentication(request: web.Request, handler) -> web.Response:
+                if serve_playready_flag and request.path in ("/playready", "/playready/"):
+                    response = await handler(request)
+                else:
+                    response = await pywidevine_serve.authentication(request, handler)
+
+                if serve_playready_flag and request.path.startswith("/playready"):
+                    from pyplayready import __version__ as pyplayready_version
+                    response.headers["Server"] = f"https://git.gay/ready-dl/pyplayready serve v{pyplayready_version}"
+
+                return response
+            return serve_authentication
+
-        # Create integrated app with both pywidevine and API routes
         if no_key:
             app = web.Application(middlewares=[cors_middleware])
-            app["config"] = dict(config.serve)
-            app["config"]["users"] = []
         else:
-            app = web.Application(middlewares=[cors_middleware, pywidevine_serve.authentication])
-            # Setup config - add API secret to users for authentication
-            serve_config = dict(config.serve)
-            if not serve_config.get("users") or not isinstance(serve_config["users"], dict):
-                serve_config["users"] = {}
-            if api_secret not in serve_config["users"]:
-                device_names = [d.stem if hasattr(d, "stem") else str(d) for d in serve_config.get("devices", [])]
-                serve_config["users"][api_secret] = {
-                    "devices": device_names,
-                    "username": "api_user"
-                }
-            app["config"] = serve_config
+            serve_auth = create_serve_authentication(serve_playready and bool(prd_devices))
+            app = web.Application(middlewares=[cors_middleware, serve_auth])

-        app.on_startup.append(pywidevine_serve._startup)
-        app.on_cleanup.append(pywidevine_serve._cleanup)
-        app.add_routes(pywidevine_serve.routes)
+        app["config"] = serve_config
         app["debug_api"] = debug_api
+
+        if serve_widevine:
+            app.on_startup.append(pywidevine_serve._startup)
+            app.on_cleanup.append(pywidevine_serve._cleanup)
+            app.add_routes(pywidevine_serve.routes)
+
+        if serve_playready and prd_devices:
+            if no_key:
+                playready_app = web.Application()
+            else:
+                playready_app = web.Application(middlewares=[pyplayready_serve.authentication])
+
+            # PlayReady subapp config maps playready_devices to "devices" for pyplayready compatibility
+            playready_config = {
+                "devices": prd_devices,
+                "users": {
+                    user_key: {
+                        "devices": user_cfg.get("playready_devices", prd_device_names),
+                        "username": user_cfg.get("username", "user"),
+                    }
+                    for user_key, user_cfg in serve_config["users"].items()
+                }
+                if not no_key
+                else [],
+            }
+            playready_app["config"] = playready_config
+            playready_app.on_startup.append(pyplayready_serve._startup)
+            playready_app.on_cleanup.append(pyplayready_serve._cleanup)
+            playready_app.add_routes(pyplayready_serve.routes)
+
+            async def playready_ping(_: web.Request) -> web.Response:
+                from pyplayready import __version__ as pyplayready_version
+                response = web.json_response({"message": "OK"})
+                response.headers["Server"] = f"https://git.gay/ready-dl/pyplayready serve v{pyplayready_version}"
+                return response
+
+            app.router.add_route("*", "/playready", playready_ping)
+
+            app.add_subapp("/playready", playready_app)
+            log.info(f"PlayReady CDM endpoints available at http://{host}:{port}/playready/")
+        elif serve_playready:
+            log.info("No PlayReady devices found, skipping PlayReady CDM endpoints")

         setup_routes(app)
         setup_swagger(app)
+
+        if serve_widevine:
+            log.info(f"Widevine CDM endpoints available at http://{host}:{port}/{{device}}/open")
         log.info(f"REST API endpoints available at http://{host}:{port}/api/")
         log.info(f"Swagger UI available at http://{host}:{port}/api/docs/")
         log.info("(Press CTRL+C to quit)")
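
The PlayReady integration above leans on aiohttp's sub-application support: the pyplayready routes are mounted under a `/playready` prefix inside the main app. The pattern in isolation (the handler and prefix here are illustrative only):

```python
from aiohttp import web


async def ping(_: web.Request) -> web.Response:
    return web.json_response({"message": "OK"})


sub = web.Application()            # would carry pyplayready's routes and config
sub.router.add_get("/ping", ping)

app = web.Application()            # the main integrated app
app.add_subapp("/playready", sub)  # now served as GET /playready/ping

# web.run_app(app, host="127.0.0.1", port=8786)
```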
unshackle/core/api/api_keys.py (new file, 145 lines)
@@ -0,0 +1,145 @@
|
|||||||
|
"""API key tier management for remote services."""
|
||||||
|
|
||||||
|
import logging
|
||||||
|
from typing import Any, Dict, List, Optional
|
||||||
|
|
||||||
|
from aiohttp import web
|
||||||
|
|
||||||
|
log = logging.getLogger("api.keys")
|
||||||
|
|
||||||
|
|
||||||
|
def get_api_key_from_request(request: web.Request) -> Optional[str]:
|
||||||
|
"""
|
||||||
|
Extract API key from request headers.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
request: aiohttp request object
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
API key string or None
|
||||||
|
"""
|
||||||
|
api_key = request.headers.get("X-API-Key")
|
||||||
|
if api_key:
|
||||||
|
return api_key
|
||||||
|
|
||||||
|
auth_header = request.headers.get("Authorization", "")
|
||||||
|
if auth_header.startswith("Bearer "):
|
||||||
|
return auth_header[7:] # len("Bearer ") == 7
|
||||||
|
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
def get_api_key_config(app: web.Application, api_key: str) -> Optional[Dict[str, Any]]:
|
||||||
|
"""
|
||||||
|
Get configuration for a specific API key.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
app: aiohttp application
|
||||||
|
api_key: API key to look up
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
API key configuration dict or None if not found
|
||||||
|
"""
|
||||||
|
config = app.get("config", {})
|
||||||
|
|
||||||
|
# Check new-style tiered API keys
|
||||||
|
api_keys = config.get("api_keys", [])
|
||||||
|
for key_config in api_keys:
|
||||||
|
if isinstance(key_config, dict) and key_config.get("key") == api_key:
|
||||||
|
return key_config
|
||||||
|
|
||||||
|
# Check legacy users list (backward compatibility)
|
||||||
|
users = config.get("users", [])
|
||||||
|
if api_key in users:
|
||||||
|
return {
|
||||||
|
"key": api_key,
|
||||||
|
"tier": "basic",
|
||||||
|
"allowed_cdms": []
|
||||||
|
}
|
||||||
|
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
def is_premium_user(app: web.Application, api_key: str) -> bool:
|
||||||
|
"""
|
||||||
|
Check if an API key belongs to a premium user.
|
||||||
|
|
||||||
|
Premium users can use server-side CDM for decryption.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
app: aiohttp application
|
||||||
|
api_key: API key to check
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
True if premium user, False otherwise
|
||||||
|
"""
|
||||||
|
key_config = get_api_key_config(app, api_key)
|
||||||
|
if not key_config:
|
||||||
|
return False
|
||||||
|
|
||||||
|
tier = key_config.get("tier", "basic")
|
||||||
|
return tier == "premium"
|
||||||
|
|
||||||
|
|
||||||
|
def get_allowed_cdms(app: web.Application, api_key: str) -> List[str]:
|
||||||
|
"""
|
||||||
|
Get list of CDMs that an API key is allowed to use.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
app: aiohttp application
|
||||||
|
api_key: API key to check
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
List of allowed CDM names, or empty list if not premium
|
||||||
|
"""
|
||||||
|
key_config = get_api_key_config(app, api_key)
|
||||||
|
if not key_config:
|
||||||
|
return []
|
||||||
|
|
||||||
|
allowed_cdms = key_config.get("allowed_cdms", [])
|
||||||
|
|
||||||
|
# Handle wildcard
|
||||||
|
if allowed_cdms == "*" or allowed_cdms == ["*"]:
|
||||||
|
return ["*"]
|
||||||
|
|
||||||
|
return allowed_cdms if isinstance(allowed_cdms, list) else []
|
||||||
|
|
||||||
|
|
||||||
|
def get_default_cdm(app: web.Application, api_key: str) -> Optional[str]:
|
||||||
|
"""
|
||||||
|
Get default CDM for an API key.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
app: aiohttp application
|
||||||
|
api_key: API key to check
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Default CDM name or None
|
||||||
|
"""
|
||||||
|
key_config = get_api_key_config(app, api_key)
|
||||||
|
if not key_config:
|
||||||
|
return None
|
||||||
|
|
||||||
|
return key_config.get("default_cdm")
|
||||||
|
|
||||||
|
|
||||||
|
def can_use_cdm(app: web.Application, api_key: str, cdm_name: str) -> bool:
|
||||||
|
"""
|
||||||
|
Check if an API key can use a specific CDM.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
app: aiohttp application
|
||||||
|
api_key: API key to check
|
||||||
|
cdm_name: CDM name to check access for
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
True if allowed, False otherwise
|
||||||
|
"""
|
||||||
|
allowed_cdms = get_allowed_cdms(app, api_key)
|
||||||
|
|
||||||
|
# Wildcard access
|
||||||
|
if "*" in allowed_cdms:
|
||||||
|
return True
|
||||||
|
|
||||||
|
# Specific CDM access
|
||||||
|
return cdm_name in allowed_cdms
|
||||||
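A minimal sketch of how these helpers can gate a handler. The endpoint itself and its route are hypothetical; only the imports and helper semantics come from this file:

```python
from aiohttp import web

from unshackle.core.api.api_keys import can_use_cdm, get_api_key_from_request


async def guarded_decrypt(request: web.Request) -> web.Response:
    # Hypothetical handler: reject unknown keys, then check CDM access.
    api_key = get_api_key_from_request(request)
    if not api_key:
        return web.json_response({"error": "missing API key"}, status=401)

    cdm_name = request.match_info.get("cdm", "")
    if not can_use_cdm(request.app, api_key, cdm_name):
        return web.json_response({"error": "CDM not allowed for this key"}, status=403)

    return web.json_response({"ok": True})
```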
@@ -227,6 +227,7 @@ def _perform_download(
         range_=params.get("range", ["SDR"]),
         channels=params.get("channels"),
         no_atmos=params.get("no_atmos", False),
+        split_audio=params.get("split_audio"),
         wanted=params.get("wanted", []),
         latest_episode=params.get("latest_episode", False),
         lang=params.get("lang", ["orig"]),
@@ -191,12 +191,73 @@ def serialize_title(title: Title_T) -> Dict[str, Any]:
     return result
 
 
-def serialize_video_track(track: Video) -> Dict[str, Any]:
+def serialize_drm(drm_list) -> Optional[List[Dict[str, Any]]]:
+    """Serialize DRM objects to JSON-serializable list."""
+    if not drm_list:
+        return None
+
+    if not isinstance(drm_list, list):
+        drm_list = [drm_list]
+
+    result = []
+    for drm in drm_list:
+        drm_info = {}
+        drm_class = drm.__class__.__name__
+        drm_info["type"] = drm_class.lower()
+
+        # Get PSSH - handle both Widevine and PlayReady
+        if hasattr(drm, "_pssh") and drm._pssh:
+            try:
+                pssh_obj = drm._pssh
+                # Try to get base64 representation
+                if hasattr(pssh_obj, "dumps"):
+                    # pywidevine PSSH has dumps() method
+                    drm_info["pssh"] = pssh_obj.dumps()
+                elif hasattr(pssh_obj, "__bytes__"):
+                    # Convert to base64
+                    import base64
+                    drm_info["pssh"] = base64.b64encode(bytes(pssh_obj)).decode()
+                elif hasattr(pssh_obj, "to_base64"):
+                    drm_info["pssh"] = pssh_obj.to_base64()
+                else:
+                    # Fallback - str() works for pywidevine PSSH
+                    pssh_str = str(pssh_obj)
+                    # Check if it's already base64-like or an object repr
+                    if not pssh_str.startswith("<"):
+                        drm_info["pssh"] = pssh_str
+            except Exception:
+                pass
+
+        # Get KIDs
+        if hasattr(drm, "kids") and drm.kids:
+            drm_info["kids"] = [str(kid) for kid in drm.kids]
+
+        # Get content keys if available
+        if hasattr(drm, "content_keys") and drm.content_keys:
+            drm_info["content_keys"] = {str(k): v for k, v in drm.content_keys.items()}
+
+        # Get license URL - essential for remote licensing
+        if hasattr(drm, "license_url") and drm.license_url:
+            drm_info["license_url"] = str(drm.license_url)
+        elif hasattr(drm, "_license_url") and drm._license_url:
+            drm_info["license_url"] = str(drm._license_url)
+
+        result.append(drm_info)
+
+    return result if result else None
+
+
+def serialize_video_track(track: Video, include_url: bool = False) -> Dict[str, Any]:
     """Convert video track to JSON-serializable dict."""
     codec_name = track.codec.name if hasattr(track.codec, "name") else str(track.codec)
     range_name = track.range.name if hasattr(track.range, "name") else str(track.range)
 
-    return {
+    # Get descriptor for N_m3u8DL-RE compatibility (HLS, DASH, URL, etc.)
+    descriptor_name = None
+    if hasattr(track, "descriptor") and track.descriptor:
+        descriptor_name = track.descriptor.name if hasattr(track.descriptor, "name") else str(track.descriptor)
+
+    result = {
         "id": str(track.id),
         "codec": codec_name,
         "codec_display": VIDEO_CODEC_MAP.get(codec_name, codec_name),
@@ -208,15 +269,24 @@ def serialize_video_track(track: Video) -> Dict[str, Any]:
         "range": range_name,
         "range_display": DYNAMIC_RANGE_MAP.get(range_name, range_name),
         "language": str(track.language) if track.language else None,
-        "drm": str(track.drm) if hasattr(track, "drm") and track.drm else None,
+        "drm": serialize_drm(track.drm) if hasattr(track, "drm") and track.drm else None,
+        "descriptor": descriptor_name,
     }
+    if include_url and hasattr(track, "url") and track.url:
+        result["url"] = str(track.url)
+    return result
 
 
-def serialize_audio_track(track: Audio) -> Dict[str, Any]:
+def serialize_audio_track(track: Audio, include_url: bool = False) -> Dict[str, Any]:
     """Convert audio track to JSON-serializable dict."""
     codec_name = track.codec.name if hasattr(track.codec, "name") else str(track.codec)
 
-    return {
+    # Get descriptor for N_m3u8DL-RE compatibility
+    descriptor_name = None
+    if hasattr(track, "descriptor") and track.descriptor:
+        descriptor_name = track.descriptor.name if hasattr(track.descriptor, "name") else str(track.descriptor)
+
+    result = {
         "id": str(track.id),
         "codec": codec_name,
         "codec_display": AUDIO_CODEC_MAP.get(codec_name, codec_name),
@@ -225,20 +295,33 @@ def serialize_audio_track(track: Audio) -> Dict[str, Any]:
         "language": str(track.language) if track.language else None,
         "atmos": track.atmos if hasattr(track, "atmos") else False,
         "descriptive": track.descriptive if hasattr(track, "descriptive") else False,
-        "drm": str(track.drm) if hasattr(track, "drm") and track.drm else None,
+        "drm": serialize_drm(track.drm) if hasattr(track, "drm") and track.drm else None,
+        "descriptor": descriptor_name,
     }
+    if include_url and hasattr(track, "url") and track.url:
+        result["url"] = str(track.url)
+    return result
 
 
-def serialize_subtitle_track(track: Subtitle) -> Dict[str, Any]:
+def serialize_subtitle_track(track: Subtitle, include_url: bool = False) -> Dict[str, Any]:
     """Convert subtitle track to JSON-serializable dict."""
-    return {
+    # Get descriptor for compatibility
+    descriptor_name = None
+    if hasattr(track, "descriptor") and track.descriptor:
+        descriptor_name = track.descriptor.name if hasattr(track.descriptor, "name") else str(track.descriptor)
+
+    result = {
         "id": str(track.id),
         "codec": track.codec.name if hasattr(track.codec, "name") else str(track.codec),
         "language": str(track.language) if track.language else None,
        "forced": track.forced if hasattr(track, "forced") else False,
         "sdh": track.sdh if hasattr(track, "sdh") else False,
         "cc": track.cc if hasattr(track, "cc") else False,
+        "descriptor": descriptor_name,
     }
+    if include_url and hasattr(track, "url") and track.url:
+        result["url"] = str(track.url)
+    return result
 
 
 async def list_titles_handler(data: Dict[str, Any], request: Optional[web.Request] = None) -> web.Response:
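To make the output shape concrete, here is a small stand-in object run through `serialize_drm`. The class and its attribute values are hypothetical test doubles, not part of the diff, and it assumes `serialize_drm` is in scope from the module patched above:

```python
# Sketch only: observe the dict shape serialize_drm() produces.
class DummyDRM:
    _pssh = None                      # no PSSH on this double, so the field is omitted
    kids = ["0123456789abcdef0123456789abcdef"]
    content_keys = {}                 # falsy, so omitted from the output
    license_url = "https://license.example.com/wv"  # hypothetical URL

print(serialize_drm(DummyDRM()))
# [{'type': 'dummydrm',
#   'kids': ['0123456789abcdef0123456789abcdef'],
#   'license_url': 'https://license.example.com/wv'}]
```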
@@ -665,9 +748,17 @@ def validate_download_parameters(data: Dict[str, Any]) -> Optional[str]:
         return f"Invalid vcodec: {data['vcodec']}. Must be one of: {', '.join(valid_vcodecs)}"
 
     if "acodec" in data and data["acodec"]:
-        valid_acodecs = ["AAC", "AC3", "EAC3", "OPUS", "FLAC", "ALAC", "VORBIS", "DTS"]
-        if data["acodec"].upper() not in valid_acodecs:
-            return f"Invalid acodec: {data['acodec']}. Must be one of: {', '.join(valid_acodecs)}"
+        valid_acodecs = ["AAC", "AC3", "EC3", "EAC3", "DD", "DD+", "AC4", "OPUS", "FLAC", "ALAC", "VORBIS", "OGG", "DTS"]
+        if isinstance(data["acodec"], str):
+            acodec_values = [v.strip() for v in data["acodec"].split(",") if v.strip()]
+        elif isinstance(data["acodec"], list):
+            acodec_values = [str(v).strip() for v in data["acodec"] if str(v).strip()]
+        else:
+            return "acodec must be a string or list"
+
+        invalid = [value for value in acodec_values if value.upper() not in valid_acodecs]
+        if invalid:
+            return f"Invalid acodec: {', '.join(invalid)}. Must be one of: {', '.join(valid_acodecs)}"
 
     if "sub_format" in data and data["sub_format"]:
         valid_sub_formats = ["SRT", "VTT", "ASS", "SSA"]
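The accepted `acodec` shapes after this hunk, mirrored in a hypothetical helper for illustration (`parse_acodec` does not exist in the codebase; it just restates the split logic above):

```python
def parse_acodec(value):
    # Mirror of the validation's normalization step, for illustration only.
    if isinstance(value, str):
        return [v.strip() for v in value.split(",") if v.strip()]
    if isinstance(value, list):
        return [str(v).strip() for v in value if str(v).strip()]
    raise TypeError("acodec must be a string or list")

assert parse_acodec("AAC") == ["AAC"]
assert parse_acodec("AAC,EC3") == ["AAC", "EC3"]
assert parse_acodec(["AAC", "EC3"]) == ["AAC", "EC3"]
```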
2195	unshackle/core/api/remote_handlers.py	(new file)
File diff suppressed because it is too large.
@@ -8,6 +8,9 @@ from unshackle.core import __version__
 from unshackle.core.api.errors import APIError, APIErrorCode, build_error_response, handle_api_exception
 from unshackle.core.api.handlers import (cancel_download_job_handler, download_handler, get_download_job_handler,
                                          list_download_jobs_handler, list_titles_handler, list_tracks_handler)
+from unshackle.core.api.remote_handlers import (remote_decrypt, remote_get_chapters, remote_get_license,
+                                                remote_get_manifest, remote_get_titles, remote_get_tracks,
+                                                remote_list_services, remote_search)
 from unshackle.core.services import Services
 from unshackle.core.update_checker import UpdateChecker
@@ -413,7 +416,7 @@ async def download(request: web.Request) -> web.Response:
           description: Video codec to download (e.g., H264, H265, VP9, AV1) (default - None)
         acodec:
           type: string
-          description: Audio codec to download (e.g., AAC, AC3, EAC3) (default - None)
+          description: Audio codec(s) to download (e.g., AAC or AAC,EC3) (default - None)
         vbitrate:
           type: integer
           description: Video bitrate in kbps (default - None)
@@ -730,6 +733,16 @@ def setup_routes(app: web.Application) -> None:
     app.router.add_get("/api/download/jobs/{job_id}", download_job_detail)
     app.router.add_delete("/api/download/jobs/{job_id}", cancel_download_job)
 
+    # Remote service endpoints
+    app.router.add_get("/api/remote/services", remote_list_services)
+    app.router.add_post("/api/remote/{service}/search", remote_search)
+    app.router.add_post("/api/remote/{service}/titles", remote_get_titles)
+    app.router.add_post("/api/remote/{service}/tracks", remote_get_tracks)
+    app.router.add_post("/api/remote/{service}/manifest", remote_get_manifest)
+    app.router.add_post("/api/remote/{service}/chapters", remote_get_chapters)
+    app.router.add_post("/api/remote/{service}/license", remote_get_license)
+    app.router.add_post("/api/remote/{service}/decrypt", remote_decrypt)
+
 
 def setup_swagger(app: web.Application) -> None:
     """Setup Swagger UI documentation."""
@@ -754,5 +767,14 @@ def setup_swagger(app: web.Application) -> None:
             web.get("/api/download/jobs", download_jobs),
             web.get("/api/download/jobs/{job_id}", download_job_detail),
             web.delete("/api/download/jobs/{job_id}", cancel_download_job),
+            # Remote service routes
+            web.get("/api/remote/services", remote_list_services),
+            web.post("/api/remote/{service}/search", remote_search),
+            web.post("/api/remote/{service}/titles", remote_get_titles),
+            web.post("/api/remote/{service}/tracks", remote_get_tracks),
+            web.post("/api/remote/{service}/manifest", remote_get_manifest),
+            web.post("/api/remote/{service}/chapters", remote_get_chapters),
+            web.post("/api/remote/{service}/license", remote_get_license),
+            web.post("/api/remote/{service}/decrypt", remote_decrypt),
         ]
     )
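A rough client-side sketch against the new remote routes. The base URL, the service name `EXAMPLE`, the request body, and the API-key header are assumptions; only the route paths come from this diff:

```python
import requests

base = "http://127.0.0.1:8786"                  # assumed serve address
headers = {"X-API-Key": "secret_key_for_jane"}  # a key from the serve config

# Discover which services the server exposes remotely.
services = requests.get(f"{base}/api/remote/services", headers=headers).json()

# Search a (hypothetical) service by name.
results = requests.post(
    f"{base}/api/remote/EXAMPLE/search",
    headers=headers,
    json={"query": "some title"},
).json()
```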
236	unshackle/core/api/session_serializer.py	(new file)
@@ -0,0 +1,236 @@
"""Session serialization helpers for remote services."""

from http.cookiejar import CookieJar
from typing import Any, Dict, Optional

import requests

from unshackle.core.credential import Credential


def serialize_session(session: requests.Session) -> Dict[str, Any]:
    """
    Serialize a requests.Session into a JSON-serializable dictionary.

    Extracts cookies, headers, and other session data that can be
    transferred to a remote client for downloading.

    Args:
        session: The requests.Session to serialize

    Returns:
        Dictionary containing serialized session data
    """
    session_data = {
        "cookies": {},
        "headers": {},
        "proxies": session.proxies.copy() if session.proxies else {},
    }

    # Serialize cookies
    if session.cookies:
        for cookie in session.cookies:
            session_data["cookies"][cookie.name] = {
                "value": cookie.value,
                "domain": cookie.domain,
                "path": cookie.path,
                "secure": cookie.secure,
                "expires": cookie.expires,
            }

    # Serialize headers (exclude proxy-authorization for security)
    if session.headers:
        for key, value in session.headers.items():
            # Skip proxy-related headers as they're server-specific
            if key.lower() not in ["proxy-authorization"]:
                session_data["headers"][key] = value

    return session_data


def deserialize_session(
    session_data: Dict[str, Any], target_session: Optional[requests.Session] = None
) -> requests.Session:
    """
    Deserialize session data into a requests.Session.

    Applies cookies, headers, and other session data from a remote server
    to a local session for downloading.

    Args:
        session_data: Dictionary containing serialized session data
        target_session: Optional existing session to update (creates new if None)

    Returns:
        requests.Session with applied session data
    """
    if target_session is None:
        target_session = requests.Session()

    # Apply cookies
    if "cookies" in session_data:
        for cookie_name, cookie_data in session_data["cookies"].items():
            target_session.cookies.set(
                name=cookie_name,
                value=cookie_data["value"],
                domain=cookie_data.get("domain"),
                path=cookie_data.get("path", "/"),
                secure=cookie_data.get("secure", False),
                expires=cookie_data.get("expires"),
            )

    # Apply headers
    if "headers" in session_data:
        target_session.headers.update(session_data["headers"])

    # Note: We don't apply proxies from remote as the local client
    # should use its own proxy configuration

    return target_session


def extract_session_tokens(session: requests.Session) -> Dict[str, Any]:
    """
    Extract authentication tokens and similar data from a session.

    Looks for common authentication patterns like Bearer tokens,
    API keys in headers, etc.

    Args:
        session: The requests.Session to extract tokens from

    Returns:
        Dictionary containing extracted tokens
    """
    tokens = {}

    # Check for Authorization header
    if "Authorization" in session.headers:
        tokens["authorization"] = session.headers["Authorization"]

    # Check for common API key headers
    for key in ["X-API-Key", "Api-Key", "X-Auth-Token"]:
        if key in session.headers:
            tokens[key.lower().replace("-", "_")] = session.headers[key]

    return tokens


def apply_session_tokens(tokens: Dict[str, Any], target_session: requests.Session) -> None:
    """
    Apply authentication tokens to a session.

    Args:
        tokens: Dictionary containing tokens to apply
        target_session: Session to apply tokens to
    """
    # Apply Authorization header
    if "authorization" in tokens:
        target_session.headers["Authorization"] = tokens["authorization"]

    # Apply other token headers
    token_header_map = {
        "x_api_key": "X-API-Key",
        "api_key": "Api-Key",
        "x_auth_token": "X-Auth-Token",
    }

    for token_key, header_name in token_header_map.items():
        if token_key in tokens:
            target_session.headers[header_name] = tokens[token_key]


def serialize_cookies(cookie_jar: Optional[CookieJar]) -> Dict[str, Any]:
    """
    Serialize a CookieJar into a JSON-serializable dictionary.

    Args:
        cookie_jar: The CookieJar to serialize

    Returns:
        Dictionary containing serialized cookies
    """
    if not cookie_jar:
        return {}

    cookies = {}
    for cookie in cookie_jar:
        cookies[cookie.name] = {
            "value": cookie.value,
            "domain": cookie.domain,
            "path": cookie.path,
            "secure": cookie.secure,
            "expires": cookie.expires,
        }

    return cookies


def deserialize_cookies(cookies_data: Dict[str, Any]) -> CookieJar:
    """
    Deserialize cookies into a CookieJar.

    Args:
        cookies_data: Dictionary containing serialized cookies

    Returns:
        CookieJar with cookies
    """
    import http.cookiejar

    cookie_jar = http.cookiejar.CookieJar()

    for cookie_name, cookie_data in cookies_data.items():
        cookie = http.cookiejar.Cookie(
            version=0,
            name=cookie_name,
            value=cookie_data["value"],
            port=None,
            port_specified=False,
            domain=cookie_data.get("domain", ""),
            domain_specified=bool(cookie_data.get("domain")),
            domain_initial_dot=cookie_data.get("domain", "").startswith("."),
            path=cookie_data.get("path", "/"),
            path_specified=True,
            secure=cookie_data.get("secure", False),
            expires=cookie_data.get("expires"),
            discard=False,
            comment=None,
            comment_url=None,
            rest={},
        )
        cookie_jar.set_cookie(cookie)

    return cookie_jar


def serialize_credential(credential: Optional[Credential]) -> Optional[Dict[str, str]]:
    """
    Serialize a Credential into a JSON-serializable dictionary.

    Args:
        credential: The Credential to serialize

    Returns:
        Dictionary containing username and password, or None
    """
    if not credential:
        return None

    return {"username": credential.username, "password": credential.password}


def deserialize_credential(credential_data: Optional[Dict[str, str]]) -> Optional[Credential]:
    """
    Deserialize credential data into a Credential object.

    Args:
        credential_data: Dictionary containing username and password

    Returns:
        Credential object or None
    """
    if not credential_data:
        return None

    return Credential(username=credential_data["username"], password=credential_data["password"])
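A round-trip sketch of the session helpers; the cookie and header values are made up, and note that proxies are intentionally not carried over to the receiving side:

```python
import requests

from unshackle.core.api.session_serializer import deserialize_session, serialize_session

src = requests.Session()
src.cookies.set("sid", "abc123", domain="example.com")  # illustrative cookie
src.headers["Authorization"] = "Bearer example-token"   # illustrative header

payload = serialize_session(src)    # JSON-safe dict of cookies/headers/proxies
dst = deserialize_session(payload)  # fresh Session carrying the same state
assert dst.cookies.get("sid") == "abc123"
assert dst.headers["Authorization"] == "Bearer example-token"
```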
@@ -17,6 +17,10 @@ def find(*names: str) -> Optional[Path]:
     if local_binaries_dir.exists():
         candidate_paths = [local_binaries_dir / f"{name}{ext}", local_binaries_dir / name / f"{name}{ext}"]
 
+        for subdir in local_binaries_dir.iterdir():
+            if subdir.is_dir():
+                candidate_paths.append(subdir / f"{name}{ext}")
+
         for path in candidate_paths:
             if path.is_file():
                 # On Unix-like systems, check if file is executable
@@ -52,6 +56,8 @@ Mkvpropedit = find("mkvpropedit")
 DoviTool = find("dovi_tool")
 HDR10PlusTool = find("hdr10plus_tool", "HDR10Plus_tool")
 Mp4decrypt = find("mp4decrypt")
+Docker = find("docker")
+ML_Worker = find("ML-Worker")
 
 
 __all__ = (
@@ -71,5 +77,7 @@ __all__ = (
     "DoviTool",
     "HDR10PlusTool",
     "Mp4decrypt",
+    "Docker",
+    "ML_Worker",
     "find",
 )
@@ -1,4 +1,5 @@
 from .custom_remote_cdm import CustomRemoteCDM
 from .decrypt_labs_remote_cdm import DecryptLabsRemoteCDM
+from .monalisa import MonaLisaCDM
 
-__all__ = ["DecryptLabsRemoteCDM", "CustomRemoteCDM"]
+__all__ = ["DecryptLabsRemoteCDM", "CustomRemoteCDM", "MonaLisaCDM"]
3	unshackle/core/cdm/monalisa/__init__.py	(new file)
@@ -0,0 +1,3 @@
from .monalisa_cdm import MonaLisaCDM

__all__ = ["MonaLisaCDM"]
371	unshackle/core/cdm/monalisa/monalisa_cdm.py	(new file)
@@ -0,0 +1,371 @@
"""
MonaLisa CDM - WASM-based Content Decryption Module wrapper.

This module provides key extraction from MonaLisa-protected content using
a WebAssembly module that runs locally via wasmtime.
"""

import base64
import ctypes
import json
import re
import uuid
from pathlib import Path
from typing import Dict, Optional, Union

import wasmtime

from unshackle.core import binaries


class MonaLisaCDM:
    """
    MonaLisa CDM wrapper for WASM-based key extraction.

    This CDM differs from Widevine/PlayReady in that it does not use a
    challenge/response flow with a license server. Instead, the license
    (ticket) is provided directly by the service API, and keys are extracted
    locally via the WASM module.
    """

    DYNAMIC_BASE = 6065008
    DYNAMICTOP_PTR = 821968
    LICENSE_KEY_OFFSET = 0x5C8C0C
    LICENSE_KEY_LENGTH = 16

    ENV_STRINGS = (
        "USER=web_user",
        "LOGNAME=web_user",
        "PATH=/",
        "PWD=/",
        "HOME=/home/web_user",
        "LANG=zh_CN.UTF-8",
        "_=./this.program",
    )

    def __init__(self, device_path: Path):
        """
        Initialize the MonaLisa CDM.

        Args:
            device_path: Path to the device file (.mld).
        """
        device_path = Path(device_path)

        self.device_path = device_path
        self.base_dir = device_path.parent

        if not self.device_path.is_file():
            raise FileNotFoundError(f"Device file not found at: {self.device_path}")

        try:
            data = json.loads(self.device_path.read_text(encoding="utf-8", errors="replace"))
        except Exception as e:
            raise ValueError(f"Invalid device file (JSON): {e}")

        wasm_path_str = data.get("wasm_path")
        if not wasm_path_str:
            raise ValueError("Device file missing 'wasm_path'")

        wasm_filename = Path(wasm_path_str).name
        wasm_path = self.base_dir / wasm_filename

        if not wasm_path.exists():
            raise FileNotFoundError(f"WASM file not found at: {wasm_path}")

        try:
            self.engine = wasmtime.Engine()
            if wasm_path.suffix.lower() == ".wat":
                self.module = wasmtime.Module.from_file(self.engine, str(wasm_path))
            else:
                self.module = wasmtime.Module(self.engine, wasm_path.read_bytes())
        except Exception as e:
            raise RuntimeError(f"Failed to load WASM module: {e}")

        self.store = None
        self.memory = None
        self.instance = None
        self.exports = {}
        self.ctx = None

    @staticmethod
    def get_worker_path() -> Optional[Path]:
        """Get ML-Worker binary path from the unshackle binaries system."""
        if binaries.ML_Worker:
            return Path(binaries.ML_Worker)
        return None

    def open(self) -> int:
        """
        Open a CDM session.

        Returns:
            Session ID (always 1 for MonaLisa).

        Raises:
            RuntimeError: If session initialization fails.
        """
        try:
            self.store = wasmtime.Store(self.engine)
            memory_type = wasmtime.MemoryType(wasmtime.Limits(256, 256))
            self.memory = wasmtime.Memory(self.store, memory_type)

            self._write_i32(self.DYNAMICTOP_PTR, self.DYNAMIC_BASE)
            imports = self._build_imports()
            self.instance = wasmtime.Instance(self.store, self.module, imports)

            ex = self.instance.exports(self.store)
            self.exports = {
                "___wasm_call_ctors": ex["s"],
                "_monalisa_context_alloc": ex["D"],
                "monalisa_set_license": ex["F"],
                "_monalisa_set_canvas_id": ex["t"],
                "_monalisa_version_get": ex["A"],
                "monalisa_get_line_number": ex["v"],
                "stackAlloc": ex["N"],
                "stackSave": ex["L"],
                "stackRestore": ex["M"],
            }

            self.exports["___wasm_call_ctors"](self.store)
            self.ctx = self.exports["_monalisa_context_alloc"](self.store)
            return 1
        except Exception as e:
            raise RuntimeError(f"Failed to initialize session: {e}")

    def close(self, session_id: int = 1) -> None:
        """
        Close the CDM session and release resources.

        Args:
            session_id: The session ID to close (unused, for API compatibility).
        """
        self.store = None
        self.memory = None
        self.instance = None
        self.exports = {}
        self.ctx = None

    def extract_keys(self, license_data: Union[str, bytes]) -> Dict:
        """
        Extract decryption keys from license/ticket data.

        Args:
            license_data: The license ticket, either as base64 string or raw bytes.

        Returns:
            Dictionary with keys: kid (hex), key (hex), type ("CONTENT").

        Raises:
            RuntimeError: If session not open or license validation fails.
            ValueError: If license_data is empty.
        """
        if not self.instance or not self.memory or self.ctx is None:
            raise RuntimeError("Session not open. Call open() first.")

        if not license_data:
            raise ValueError("license_data is empty")

        if isinstance(license_data, bytes):
            license_b64 = base64.b64encode(license_data).decode("utf-8")
        else:
            license_b64 = license_data

        ret = self._ccall(
            "monalisa_set_license",
            int,
            self.ctx,
            license_b64,
            len(license_b64),
            "0",
        )

        if ret != 0:
            raise RuntimeError(f"License validation failed with code: {ret}")

        key_bytes = self._extract_license_key_bytes()

        # Extract DCID from license to generate KID
        try:
            decoded = base64.b64decode(license_b64).decode("ascii", errors="ignore")
        except Exception:
            decoded = ""

        m = re.search(
            r"DCID-[A-Z0-9]+-[A-Z0-9]+-\d{8}-\d{6}-[A-Z0-9]+-\d{10}-[A-Z0-9]+",
            decoded,
        )
        if m:
            kid_bytes = uuid.uuid5(uuid.NAMESPACE_DNS, m.group()).bytes
        else:
            kid_bytes = uuid.UUID(int=0).bytes

        return {"kid": kid_bytes.hex(), "key": key_bytes.hex(), "type": "CONTENT"}

    def _extract_license_key_bytes(self) -> bytes:
        """Extract the 16-byte decryption key from WASM memory."""
        data_ptr = self.memory.data_ptr(self.store)
        data_len = self.memory.data_len(self.store)

        if self.LICENSE_KEY_OFFSET + self.LICENSE_KEY_LENGTH > data_len:
            raise RuntimeError("License key offset beyond memory bounds")

        mem_ptr = ctypes.cast(data_ptr, ctypes.POINTER(ctypes.c_ubyte * data_len))
        start = self.LICENSE_KEY_OFFSET
        end = self.LICENSE_KEY_OFFSET + self.LICENSE_KEY_LENGTH

        return bytes(mem_ptr.contents[start:end])

    def _ccall(self, func_name: str, return_type: type, *args):
        """Call a WASM function with automatic string conversion."""
        stack = 0
        converted_args = []

        for arg in args:
            if isinstance(arg, str):
                if stack == 0:
                    stack = self.exports["stackSave"](self.store)
                max_length = (len(arg) << 2) + 1
                ptr = self.exports["stackAlloc"](self.store, max_length)
                self._string_to_utf8(arg, ptr, max_length)
                converted_args.append(ptr)
            else:
                converted_args.append(arg)

        result = self.exports[func_name](self.store, *converted_args)

        if stack != 0:
            self.exports["stackRestore"](self.store, stack)

        if return_type is bool:
            return bool(result)
        return result

    def _write_i32(self, addr: int, value: int) -> None:
        """Write a 32-bit integer to WASM memory."""
        data = self.memory.data_ptr(self.store)
        mem_ptr = ctypes.cast(data, ctypes.POINTER(ctypes.c_int32))
        mem_ptr[addr >> 2] = value

    def _string_to_utf8(self, data: str, ptr: int, max_length: int) -> int:
        """Convert string to UTF-8 and write to WASM memory."""
        encoded = data.encode("utf-8")
        write_length = min(len(encoded), max_length - 1)

        mem_data = self.memory.data_ptr(self.store)
        mem_ptr = ctypes.cast(mem_data, ctypes.POINTER(ctypes.c_ubyte))

        for i in range(write_length):
            mem_ptr[ptr + i] = encoded[i]
        mem_ptr[ptr + write_length] = 0
        return write_length

    def _write_ascii_to_memory(self, string: str, buffer: int, dont_add_null: int = 0) -> None:
        """Write ASCII string to WASM memory."""
        mem_data = self.memory.data_ptr(self.store)
        mem_ptr = ctypes.cast(mem_data, ctypes.POINTER(ctypes.c_ubyte))

        encoded = string.encode("utf-8")
        for i, byte_val in enumerate(encoded):
            mem_ptr[buffer + i] = byte_val

        if dont_add_null == 0:
            mem_ptr[buffer + len(encoded)] = 0

    def _build_imports(self):
        """Build the WASM import stubs required by the MonaLisa module."""

        def sys_fcntl64(a, b, c):
            return 0

        def fd_write(a, b, c, d):
            return 0

        def fd_close(a):
            return 0

        def sys_ioctl(a, b, c):
            return 0

        def sys_open(a, b, c):
            return 0

        def sys_rmdir(a):
            return 0

        def sys_unlink(a):
            return 0

        def clock():
            return 0

        def time(a):
            return 0

        def emscripten_run_script(a):
            return None

        def fd_seek(a, b, c, d, e):
            return 0

        def emscripten_resize_heap(a):
            return 0

        def fd_read(a, b, c, d):
            return 0

        def emscripten_run_script_string(a):
            return 0

        def emscripten_run_script_int(a):
            return 1

        def emscripten_memcpy_big(dest, src, num):
            mem_data = self.memory.data_ptr(self.store)
            data_len = self.memory.data_len(self.store)
            if num is None:
                num = data_len - 1
            mem_ptr = ctypes.cast(mem_data, ctypes.POINTER(ctypes.c_ubyte))
            for i in range(num):
                if dest + i < data_len and src + i < data_len:
                    mem_ptr[dest + i] = mem_ptr[src + i]
            return dest

        def environ_get(environ_ptr, environ_buf):
            buf_size = 0
            for index, string in enumerate(self.ENV_STRINGS):
                ptr = environ_buf + buf_size
                self._write_i32(environ_ptr + index * 4, ptr)
                self._write_ascii_to_memory(string, ptr)
                buf_size += len(string) + 1
            return 0

        def environ_sizes_get(penviron_count, penviron_buf_size):
            self._write_i32(penviron_count, len(self.ENV_STRINGS))
            buf_size = sum(len(s) + 1 for s in self.ENV_STRINGS)
            self._write_i32(penviron_buf_size, buf_size)
            return 0

        i32 = wasmtime.ValType.i32()

        return [
            wasmtime.Func(self.store, wasmtime.FuncType([i32, i32, i32], [i32]), sys_fcntl64),
            wasmtime.Func(self.store, wasmtime.FuncType([i32, i32, i32, i32], [i32]), fd_write),
            wasmtime.Func(self.store, wasmtime.FuncType([i32], [i32]), fd_close),
            wasmtime.Func(self.store, wasmtime.FuncType([i32, i32, i32], [i32]), sys_ioctl),
            wasmtime.Func(self.store, wasmtime.FuncType([i32, i32, i32], [i32]), sys_open),
            wasmtime.Func(self.store, wasmtime.FuncType([i32], [i32]), sys_rmdir),
            wasmtime.Func(self.store, wasmtime.FuncType([i32], [i32]), sys_unlink),
            wasmtime.Func(self.store, wasmtime.FuncType([], [i32]), clock),
            wasmtime.Func(self.store, wasmtime.FuncType([i32], [i32]), time),
            wasmtime.Func(self.store, wasmtime.FuncType([i32], []), emscripten_run_script),
            wasmtime.Func(self.store, wasmtime.FuncType([i32, i32, i32, i32, i32], [i32]), fd_seek),
            wasmtime.Func(self.store, wasmtime.FuncType([i32, i32, i32], [i32]), emscripten_memcpy_big),
            wasmtime.Func(self.store, wasmtime.FuncType([i32], [i32]), emscripten_resize_heap),
            wasmtime.Func(self.store, wasmtime.FuncType([i32, i32], [i32]), environ_get),
            wasmtime.Func(self.store, wasmtime.FuncType([i32, i32], [i32]), environ_sizes_get),
            wasmtime.Func(self.store, wasmtime.FuncType([i32, i32, i32, i32], [i32]), fd_read),
            wasmtime.Func(self.store, wasmtime.FuncType([i32], [i32]), emscripten_run_script_string),
            wasmtime.Func(self.store, wasmtime.FuncType([i32], [i32]), emscripten_run_script_int),
            self.memory,
        ]
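A minimal usage sketch for the new CDM; the device path and ticket value are placeholders, and the no-license-server flow follows the class docstring above:

```python
from pathlib import Path

from unshackle.core.cdm.monalisa import MonaLisaCDM

cdm = MonaLisaCDM(Path("devices/example.mld"))  # placeholder .mld path
session_id = cdm.open()                         # always returns 1
try:
    ticket_b64 = "..."                          # ticket from the service API, not a license server
    key = cdm.extract_keys(ticket_b64)
    print(key["kid"], key["key"], key["type"])  # hex KID, hex key, "CONTENT"
finally:
    cdm.close(session_id)
```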
@@ -94,6 +94,7 @@ class Config:
         self.update_checks: bool = kwargs.get("update_checks", True)
         self.update_check_interval: int = kwargs.get("update_check_interval", 24)
         self.scene_naming: bool = kwargs.get("scene_naming", True)
+        self.dash_naming: bool = kwargs.get("dash_naming", False)
         self.series_year: bool = kwargs.get("series_year", True)
         self.unicode_filenames: bool = kwargs.get("unicode_filenames", False)
         self.insert_episodename_into_filenames: bool = kwargs.get("insert_episodename_into_filenames", True)
@@ -1,6 +1,7 @@
 import os
 import subprocess
 import textwrap
+import threading
 import time
 from functools import partial
 from http.cookiejar import CookieJar
@@ -49,6 +50,138 @@ def rpc(caller: Callable, secret: str, method: str, params: Optional[list[Any]]
         return
 
 
+class _Aria2Manager:
+    """Singleton manager to run one aria2c process and enqueue downloads via RPC."""
+
+    def __init__(self) -> None:
+        self._proc: Optional[subprocess.Popen] = None
+        self._rpc_port: Optional[int] = None
+        self._rpc_secret: Optional[str] = None
+        self._rpc_uri: Optional[str] = None
+        self._session: Session = Session()
+        self._max_concurrent_downloads: int = 0
+        self._max_connection_per_server: int = 1
+        self._split_default: int = 5
+        self._file_allocation: str = "prealloc"
+        self._proxy: Optional[str] = None
+        self._lock: threading.Lock = threading.Lock()
+
+    def _build_args(self) -> list[str]:
+        args = [
+            "--continue=true",
+            f"--max-concurrent-downloads={self._max_concurrent_downloads}",
+            f"--max-connection-per-server={self._max_connection_per_server}",
+            f"--split={self._split_default}",
+            "--max-file-not-found=5",
+            "--max-tries=5",
+            "--retry-wait=2",
+            "--allow-overwrite=true",
+            "--auto-file-renaming=false",
+            "--console-log-level=warn",
+            "--download-result=default",
+            f"--file-allocation={self._file_allocation}",
+            "--summary-interval=0",
+            "--enable-rpc=true",
+            f"--rpc-listen-port={self._rpc_port}",
+            f"--rpc-secret={self._rpc_secret}",
+        ]
+        if self._proxy:
+            args.extend(["--all-proxy", self._proxy])
+        return args
+
+    def ensure_started(
+        self,
+        proxy: Optional[str],
+        max_workers: Optional[int],
+    ) -> None:
+        with self._lock:
+            if self._proc and self._proc.poll() is None:
+                return
+
+            if not binaries.Aria2:
+                debug_logger = get_debug_logger()
+                if debug_logger:
+                    debug_logger.log(
+                        level="ERROR",
+                        operation="downloader_aria2c_binary_missing",
+                        message="Aria2c executable not found in PATH or local binaries directory",
+                        context={"searched_names": ["aria2c", "aria2"]},
+                    )
+                raise EnvironmentError("Aria2c executable not found...")
+
+            if not max_workers:
+                max_workers = min(32, (os.cpu_count() or 1) + 4)
+            elif not isinstance(max_workers, int):
+                raise TypeError(f"Expected max_workers to be {int}, not {type(max_workers)}")
+
+            self._rpc_port = get_free_port()
+            self._rpc_secret = get_random_bytes(16).hex()
+            self._rpc_uri = f"http://127.0.0.1:{self._rpc_port}/jsonrpc"
+
+            self._max_concurrent_downloads = int(config.aria2c.get("max_concurrent_downloads", max_workers))
+            self._max_connection_per_server = int(config.aria2c.get("max_connection_per_server", 1))
+            self._split_default = int(config.aria2c.get("split", 5))
+            self._file_allocation = config.aria2c.get("file_allocation", "prealloc")
+            self._proxy = proxy or None
+
+            args = self._build_args()
+            self._proc = subprocess.Popen(
+                [binaries.Aria2, *args], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL
+            )
+            # Give aria2c a moment to start up and bind to the RPC port
+            time.sleep(0.5)
+
+    @property
+    def rpc_uri(self) -> str:
+        assert self._rpc_uri
+        return self._rpc_uri
+
+    @property
+    def rpc_secret(self) -> str:
+        assert self._rpc_secret
+        return self._rpc_secret
+
+    @property
+    def session(self) -> Session:
+        return self._session
+
+    def add_uris(self, uris: list[str], options: dict[str, Any]) -> str:
+        """Add a single download with multiple URIs via RPC."""
+        gid = rpc(
+            caller=partial(self._session.post, url=self.rpc_uri),
+            secret=self.rpc_secret,
+            method="aria2.addUri",
+            params=[uris, options],
+        )
+        return gid or ""
+
+    def get_global_stat(self) -> dict[str, Any]:
+        return rpc(
+            caller=partial(self.session.post, url=self.rpc_uri),
+            secret=self.rpc_secret,
+            method="aria2.getGlobalStat",
+        ) or {}
+
+    def tell_status(self, gid: str) -> Optional[dict[str, Any]]:
+        return rpc(
+            caller=partial(self.session.post, url=self.rpc_uri),
+            secret=self.rpc_secret,
+            method="aria2.tellStatus",
+            params=[gid, ["status", "errorCode", "errorMessage", "files", "completedLength", "totalLength"]],
+        )
+
+    def remove(self, gid: str) -> None:
+        rpc(
+            caller=partial(self.session.post, url=self.rpc_uri),
+            secret=self.rpc_secret,
+            method="aria2.forceRemove",
+            params=[gid],
+        )
+
+
+_manager = _Aria2Manager()
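Inside the module, the enqueue flow the manager enables looks roughly like this (illustrative values; `_manager` is module-private and normally driven only by `download()` below):

```python
# Illustrative only: push one file through the shared aria2c instance.
_manager.ensure_started(proxy=None, max_workers=4)
gid = _manager.add_uris(
    ["https://example.com/file.bin"],             # placeholder URL
    {"dir": "/tmp/downloads", "out": "file.bin"},
)
status = _manager.tell_status(gid)                # poll until "complete" or "error"
```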
 def download(
     urls: Union[str, list[str], dict[str, Any], list[dict[str, Any]]],
     output_dir: Path,
@@ -58,6 +191,7 @@ def download(
     proxy: Optional[str] = None,
     max_workers: Optional[int] = None,
 ) -> Generator[dict[str, Any], None, None]:
+    """Enqueue downloads to the singleton aria2c instance and track per-call progress via RPC."""
     debug_logger = get_debug_logger()
 
     if not urls:
@@ -92,102 +226,10 @@ def download(
     if not isinstance(urls, list):
         urls = [urls]
 
-    if not binaries.Aria2:
-        if debug_logger:
-            debug_logger.log(
-                level="ERROR",
-                operation="downloader_aria2c_binary_missing",
-                message="Aria2c executable not found in PATH or local binaries directory",
-                context={"searched_names": ["aria2c", "aria2"]},
-            )
-        raise EnvironmentError("Aria2c executable not found...")
-
-    if proxy and not proxy.lower().startswith("http://"):
-        raise ValueError("Only HTTP proxies are supported by aria2(c)")
-
     if cookies and not isinstance(cookies, CookieJar):
         cookies = cookiejar_from_dict(cookies)
 
-    url_files = []
-    for i, url in enumerate(urls):
-        if isinstance(url, str):
-            url_data = {"url": url}
-        else:
-            url_data: dict[str, Any] = url
-        url_filename = filename.format(i=i, ext=get_extension(url_data["url"]))
-        url_text = url_data["url"]
-        url_text += f"\n\tdir={output_dir}"
-        url_text += f"\n\tout={url_filename}"
-        if cookies:
-            mock_request = requests.Request(url=url_data["url"])
-            cookie_header = get_cookie_header(cookies, mock_request)
-            if cookie_header:
-                url_text += f"\n\theader=Cookie: {cookie_header}"
-        for key, value in url_data.items():
-            if key == "url":
-                continue
-            if key == "headers":
-                for header_name, header_value in value.items():
-                    url_text += f"\n\theader={header_name}: {header_value}"
-            else:
-                url_text += f"\n\t{key}={value}"
-        url_files.append(url_text)
-    url_file = "\n".join(url_files)
-
-    rpc_port = get_free_port()
-    rpc_secret = get_random_bytes(16).hex()
-    rpc_uri = f"http://127.0.0.1:{rpc_port}/jsonrpc"
-    rpc_session = Session()
-
-    max_concurrent_downloads = int(config.aria2c.get("max_concurrent_downloads", max_workers))
-    max_connection_per_server = int(config.aria2c.get("max_connection_per_server", 1))
-    split = int(config.aria2c.get("split", 5))
-    file_allocation = config.aria2c.get("file_allocation", "prealloc")
-    if len(urls) > 1:
-        split = 1
-        file_allocation = "none"
-
-    arguments = [
-        # [Basic Options]
-        "--input-file",
-        "-",
-        "--all-proxy",
-        proxy or "",
-        "--continue=true",
-        # [Connection Options]
-        f"--max-concurrent-downloads={max_concurrent_downloads}",
-        f"--max-connection-per-server={max_connection_per_server}",
-        f"--split={split}",  # each split uses their own connection
-        "--max-file-not-found=5",  # counted towards --max-tries
-        "--max-tries=5",
-        "--retry-wait=2",
-        # [Advanced Options]
-        "--allow-overwrite=true",
-        "--auto-file-renaming=false",
-        "--console-log-level=warn",
-        "--download-result=default",
-        f"--file-allocation={file_allocation}",
-        "--summary-interval=0",
-        # [RPC Options]
-        "--enable-rpc=true",
-        f"--rpc-listen-port={rpc_port}",
-        f"--rpc-secret={rpc_secret}",
-    ]
-
-    for header, value in (headers or {}).items():
-        if header.lower() == "cookie":
-            raise ValueError("You cannot set Cookies as a header manually, please use the `cookies` param.")
-        if header.lower() == "accept-encoding":
-            # we cannot set an allowed encoding, or it will return compressed
-            # and the code is not set up to uncompress the data
-            continue
-        if header.lower() == "referer":
-            arguments.extend(["--referer", value])
-            continue
-        if header.lower() == "user-agent":
-            arguments.extend(["--user-agent", value])
-            continue
-        arguments.extend(["--header", f"{header}: {value}"])
+    _manager.ensure_started(proxy=proxy, max_workers=max_workers)
 
     if debug_logger:
         first_url = urls[0] if isinstance(urls[0], str) else urls[0].get("url", "")
@@ -202,128 +244,151 @@ def download(
                 "first_url": url_display,
                 "output_dir": str(output_dir),
                 "filename": filename,
-                "max_concurrent_downloads": max_concurrent_downloads,
-                "max_connection_per_server": max_connection_per_server,
-                "split": split,
-                "file_allocation": file_allocation,
                 "has_proxy": bool(proxy),
-                "rpc_port": rpc_port,
             },
         )
 
-    yield dict(total=len(urls))
+    # Build options for each URI and add via RPC
+    gids: list[str] = []
+
+    for i, url in enumerate(urls):
+        if isinstance(url, str):
+            url_data = {"url": url}
+        else:
+            url_data: dict[str, Any] = url
+
+        url_filename = filename.format(i=i, ext=get_extension(url_data["url"]))
+
+        opts: dict[str, Any] = {
+            "dir": str(output_dir),
+            "out": url_filename,
+            "split": str(1 if len(urls) > 1 else int(config.aria2c.get("split", 5))),
+        }
+
+        # Cookies as header
+        if cookies:
+            mock_request = requests.Request(url=url_data["url"])
+            cookie_header = get_cookie_header(cookies, mock_request)
+            if cookie_header:
+                opts.setdefault("header", []).append(f"Cookie: {cookie_header}")
+
+        # Global headers
+        for header, value in (headers or {}).items():
+            if header.lower() == "cookie":
+                raise ValueError("You cannot set Cookies as a header manually, please use the `cookies` param.")
+            if header.lower() == "accept-encoding":
+                continue
+            if header.lower() == "referer":
+                opts["referer"] = str(value)
+                continue
+            if header.lower() == "user-agent":
+                opts["user-agent"] = str(value)
+                continue
+            opts.setdefault("header", []).append(f"{header}: {value}")
+
+        # Per-url extra args
+        for key, value in url_data.items():
+            if key == "url":
+                continue
+            if key == "headers":
+                for header_name, header_value in value.items():
+                    opts.setdefault("header", []).append(f"{header_name}: {header_value}")
+            else:
+                opts[key] = str(value)
+
+        # Add via RPC
+        gid = _manager.add_uris([url_data["url"]], opts)
+        if gid:
+            gids.append(gid)
+
+    yield dict(total=len(gids))
+
+    completed: set[str] = set()
 
     try:
-        p = subprocess.Popen([binaries.Aria2, *arguments], stdin=subprocess.PIPE, stdout=subprocess.DEVNULL)
-
-        p.stdin.write(url_file.encode())
-        p.stdin.close()
-
-        while p.poll() is None:
-            global_stats: dict[str, Any] = (
-                rpc(caller=partial(rpc_session.post, url=rpc_uri), secret=rpc_secret, method="aria2.getGlobalStat")
-                or {}
-            )
-
-            number_stopped = int(global_stats.get("numStoppedTotal", 0))
-            download_speed = int(global_stats.get("downloadSpeed", -1))
-
-            if number_stopped:
-                yield dict(completed=number_stopped)
-            if download_speed != -1:
-                yield dict(downloaded=f"{filesize.decimal(download_speed)}/s")
-
-            stopped_downloads: list[dict[str, Any]] = (
-                rpc(
-                    caller=partial(rpc_session.post, url=rpc_uri),
-                    secret=rpc_secret,
-                    method="aria2.tellStopped",
-                    params=[0, 999999],
-                )
-                or []
-            )
-
-            for dl in stopped_downloads:
-                if dl["status"] == "error":
-                    used_uri = next(
-                        uri["uri"]
-                        for file in dl["files"]
-                        if file["selected"] == "true"
-                        for uri in file["uris"]
-                        if uri["status"] == "used"
-                    )
-                    error = f"Download Error (#{dl['gid']}): {dl['errorMessage']} ({dl['errorCode']}), {used_uri}"
-                    error_pretty = "\n ".join(
-                        textwrap.wrap(error, width=console.width - 20, initial_indent="")
-                    )
-                    console.log(Text.from_ansi("\n[Aria2c]: " + error_pretty))
-                    if debug_logger:
-                        debug_logger.log(
-                            level="ERROR",
-                            operation="downloader_aria2c_download_error",
-                            message=f"Aria2c download failed: {dl['errorMessage']}",
-                            context={
-                                "gid": dl["gid"],
-                                "error_code": dl["errorCode"],
-                                "error_message": dl["errorMessage"],
-                                "used_uri": used_uri[:200] + "..." if len(used_uri) > 200 else used_uri,
-                                "completed_length": dl.get("completedLength"),
-                                "total_length": dl.get("totalLength"),
-                            },
-                        )
-                    raise ValueError(error)
-
-            if number_stopped == len(urls):
-                rpc(caller=partial(rpc_session.post, url=rpc_uri), secret=rpc_secret, method="aria2.shutdown")
-                break
+        while len(completed) < len(gids):
+            if DOWNLOAD_CANCELLED.is_set():
+                # Remove tracked downloads on cancel
+                for gid in gids:
+                    if gid not in completed:
+                        _manager.remove(gid)
+                yield dict(downloaded="[yellow]CANCELLED")
+                raise KeyboardInterrupt()
+
+            stats = _manager.get_global_stat()
+            dl_speed = int(stats.get("downloadSpeed", -1))
+
+            # Aggregate progress across all GIDs for this call
+            total_completed = 0
+            total_size = 0
+
+            # Check each tracked GID
+            for gid in gids:
+                if gid in completed:
+                    continue
+
+                status = _manager.tell_status(gid)
+                if not status:
+                    continue
+
+                completed_length = int(status.get("completedLength", 0))
+                total_length = int(status.get("totalLength", 0))
+                total_completed += completed_length
+                total_size += total_length
+
+                state = status.get("status")
+                if state in ("complete", "error"):
+                    completed.add(gid)
+                    yield dict(completed=len(completed))
+
+                if state == "error":
+                    used_uri = None
+                    try:
+                        used_uri = next(
+                            uri["uri"]
+                            for file in status.get("files", [])
+                            for uri in file.get("uris", [])
+                            if uri.get("status") == "used"
+                        )
+                    except Exception:
+                        used_uri = "unknown"
+                    error = f"Download Error (#{gid}): {status.get('errorMessage')} ({status.get('errorCode')}), {used_uri}"
+                    error_pretty = "\n ".join(textwrap.wrap(error, width=console.width - 20, initial_indent=""))
|
||||||
|
console.log(Text.from_ansi("\n[Aria2c]: " + error_pretty))
|
||||||
|
if debug_logger:
|
||||||
|
debug_logger.log(
|
||||||
|
level="ERROR",
|
||||||
|
operation="downloader_aria2c_download_error",
|
||||||
|
message=f"Aria2c download failed: {status.get('errorMessage')}",
|
||||||
|
context={
|
||||||
|
"gid": gid,
|
||||||
|
"error_code": status.get("errorCode"),
|
||||||
|
"error_message": status.get("errorMessage"),
|
||||||
|
"used_uri": used_uri[:200] + "..." if used_uri and len(used_uri) > 200 else used_uri,
|
||||||
|
"completed_length": status.get("completedLength"),
|
||||||
|
"total_length": status.get("totalLength"),
|
||||||
|
},
|
||||||
|
)
|
||||||
|
raise ValueError(error)
|
||||||
|
|
||||||
|
# Yield aggregate progress for this call's downloads
|
||||||
|
if total_size > 0:
|
||||||
|
# Yield both advance (bytes downloaded this iteration) and total for rich progress
|
||||||
|
if dl_speed != -1:
|
||||||
|
yield dict(downloaded=f"{filesize.decimal(dl_speed)}/s", advance=0, completed=total_completed, total=total_size)
|
||||||
|
else:
|
||||||
|
yield dict(advance=0, completed=total_completed, total=total_size)
|
||||||
|
elif dl_speed != -1:
|
||||||
|
yield dict(downloaded=f"{filesize.decimal(dl_speed)}/s")
|
||||||
|
|
||||||
time.sleep(1)
|
time.sleep(1)
|
||||||
|
|
||||||
p.wait()
|
|
||||||
|
|
||||||
if p.returncode != 0:
|
|
||||||
if debug_logger:
|
|
||||||
debug_logger.log(
|
|
||||||
level="ERROR",
|
|
||||||
operation="downloader_aria2c_failed",
|
|
||||||
message=f"Aria2c exited with code {p.returncode}",
|
|
||||||
context={
|
|
||||||
"returncode": p.returncode,
|
|
||||||
"url_count": len(urls),
|
|
||||||
"output_dir": str(output_dir),
|
|
||||||
},
|
|
||||||
)
|
|
||||||
raise subprocess.CalledProcessError(p.returncode, arguments)
|
|
||||||
|
|
||||||
if debug_logger:
|
|
||||||
debug_logger.log(
|
|
||||||
level="DEBUG",
|
|
||||||
operation="downloader_aria2c_complete",
|
|
||||||
message="Aria2c download completed successfully",
|
|
||||||
context={
|
|
||||||
"url_count": len(urls),
|
|
||||||
"output_dir": str(output_dir),
|
|
||||||
"filename": filename,
|
|
||||||
},
|
|
||||||
)
|
|
||||||
|
|
||||||
except ConnectionResetError:
|
|
||||||
# interrupted while passing URI to download
|
|
||||||
raise KeyboardInterrupt()
|
|
||||||
except subprocess.CalledProcessError as e:
|
|
||||||
if e.returncode in (7, 0xC000013A):
|
|
||||||
# 7 is when Aria2(c) handled the CTRL+C
|
|
||||||
# 0xC000013A is when it never got the chance to
|
|
||||||
raise KeyboardInterrupt()
|
|
||||||
raise
|
|
||||||
except KeyboardInterrupt:
|
except KeyboardInterrupt:
|
||||||
DOWNLOAD_CANCELLED.set() # skip pending track downloads
|
DOWNLOAD_CANCELLED.set()
|
||||||
yield dict(downloaded="[yellow]CANCELLED")
|
|
||||||
raise
|
raise
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
DOWNLOAD_CANCELLED.set() # skip pending track downloads
|
DOWNLOAD_CANCELLED.set()
|
||||||
yield dict(downloaded="[red]FAILED")
|
yield dict(downloaded="[red]FAILED")
|
||||||
if debug_logger and not isinstance(e, (subprocess.CalledProcessError, ValueError)):
|
if debug_logger and not isinstance(e, ValueError):
|
||||||
debug_logger.log(
|
debug_logger.log(
|
||||||
level="ERROR",
|
level="ERROR",
|
||||||
operation="downloader_aria2c_exception",
|
operation="downloader_aria2c_exception",
|
||||||
@@ -335,8 +400,6 @@ def download(
|
|||||||
},
|
},
|
||||||
)
|
)
|
||||||
raise
|
raise
|
||||||
finally:
|
|
||||||
rpc(caller=partial(rpc_session.post, url=rpc_uri), secret=rpc_secret, method="aria2.shutdown")
|
|
||||||
|
|
||||||
|
|
||||||
def aria2c(
|
def aria2c(
|
||||||
|
|||||||
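The rewrite above swaps the one-shot `aria2c` subprocess for downloads queued and polled over aria2's JSON-RPC interface via `_manager`. A minimal sketch of the round trips such a manager presumably wraps — `aria2.addUri`, `aria2.tellStatus`, and `aria2.getGlobalStat` are aria2's documented RPC methods, while the endpoint and secret below are placeholder assumptions:

```python
# Minimal sketch of aria2 JSON-RPC usage; assumes an aria2c daemon started
# with --enable-rpc --rpc-secret=example-secret on the default port 6800.
import requests

RPC_URI = "http://localhost:6800/jsonrpc"  # assumed local endpoint
SECRET = "token:example-secret"            # hypothetical --rpc-secret value


def rpc_call(method: str, params: list):
    # aria2 expects the secret token as the first positional parameter
    response = requests.post(
        RPC_URI,
        json={"jsonrpc": "2.0", "id": "unshackle", "method": method, "params": [SECRET, *params]},
    )
    response.raise_for_status()
    return response.json()["result"]


# Queue one URI with per-download options, then poll it by GID,
# mirroring the add_uris/tell_status loop in the diff above.
gid = rpc_call("aria2.addUri", [["https://example.com/file.bin"], {"dir": "/tmp", "out": "file.bin"}])
status = rpc_call("aria2.tellStatus", [gid, ["status", "completedLength", "totalLength"]])
print(status["status"], status["completedLength"], status["totalLength"])
```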
@@ -10,6 +10,7 @@ import requests
 from requests.cookies import cookiejar_from_dict, get_cookie_header

 from unshackle.core import binaries
+from unshackle.core.binaries import FFMPEG, Mp4decrypt, ShakaPackager
 from unshackle.core.config import config
 from unshackle.core.console import console
 from unshackle.core.constants import DOWNLOAD_CANCELLED
@@ -19,7 +20,7 @@ PERCENT_RE = re.compile(r"(\d+\.\d+%)")
 SPEED_RE = re.compile(r"(\d+\.\d+(?:MB|KB)ps)")
 SIZE_RE = re.compile(r"(\d+\.\d+(?:MB|GB|KB)/\d+\.\d+(?:MB|GB|KB))")
 WARN_RE = re.compile(r"(WARN : Response.*|WARN : One or more errors occurred.*)")
-ERROR_RE = re.compile(r"(ERROR.*)")
+ERROR_RE = re.compile(r"(\bERROR\b.*|\bFAILED\b.*|\bException\b.*)")

 DECRYPTION_ENGINE = {
     "shaka": "SHAKA_PACKAGER",
@@ -181,17 +182,33 @@ def build_download_args(
         "--tmp-dir": output_dir,
         "--thread-count": thread_count,
         "--download-retry-count": retry_count,
-        "--write-meta-json": False,
     }
+    if FFMPEG:
+        args["--ffmpeg-binary-path"] = str(FFMPEG)
     if proxy:
         args["--custom-proxy"] = proxy
     if skip_merge:
         args["--skip-merge"] = skip_merge
     if ad_keyword:
         args["--ad-keyword"] = ad_keyword

     if content_keys:
         args["--key"] = next((f"{kid.hex}:{key.lower()}" for kid, key in content_keys.items()), None)
-        args["--decryption-engine"] = DECRYPTION_ENGINE.get(config.decryption.lower()) or "SHAKA_PACKAGER"
+        decryption_config = config.decryption.lower()
+        engine_name = DECRYPTION_ENGINE.get(decryption_config) or "SHAKA_PACKAGER"
+        args["--decryption-engine"] = engine_name
+
+        binary_path = None
+        if engine_name == "SHAKA_PACKAGER":
+            if ShakaPackager:
+                binary_path = str(ShakaPackager)
+        elif engine_name == "MP4DECRYPT":
+            if Mp4decrypt:
+                binary_path = str(Mp4decrypt)
+        if binary_path:
+            args["--decryption-binary-path"] = binary_path

     if custom_args:
         args.update(custom_args)

@@ -288,7 +305,10 @@ def download(
     log_file_path: Path | None = None
     if debug_logger:
         log_file_path = output_dir / f".n_m3u8dl_re_{filename}.log"
-        arguments.extend(["--log-file-path", str(log_file_path)])
+        arguments.extend([
+            "--log-file-path", str(log_file_path),
+            "--log-level", "DEBUG",
+        ])

         track_url_display = track.url[:200] + "..." if len(track.url) > 200 else track.url
         debug_logger.log(
@@ -376,6 +396,14 @@ def download(
             raise subprocess.CalledProcessError(process.returncode, arguments)

         if debug_logger:
+            output_dir_exists = output_dir.exists()
+            output_files = []
+            if output_dir_exists:
+                try:
+                    output_files = [f.name for f in output_dir.iterdir() if f.is_file()][:20]
+                except Exception:
+                    output_files = ["<error listing files>"]
+
             debug_logger.log(
                 level="DEBUG",
                 operation="downloader_n_m3u8dl_re_complete",
@@ -384,10 +412,38 @@ def download(
                     "track_id": getattr(track, "id", None),
                     "track_type": track.__class__.__name__,
                     "output_dir": str(output_dir),
+                    "output_dir_exists": output_dir_exists,
+                    "output_files_count": len(output_files),
+                    "output_files": output_files,
                     "filename": filename,
                 },
             )

+            # Warn if no output was produced - include N_m3u8DL-RE's logs for diagnosis
+            if not output_dir_exists or not output_files:
+                # Read N_m3u8DL-RE's log file for debugging
+                n_m3u8dl_log = ""
+                if log_file_path and log_file_path.exists():
+                    try:
+                        n_m3u8dl_log = log_file_path.read_text(encoding="utf-8", errors="replace")
+                    except Exception:
+                        n_m3u8dl_log = "<failed to read log file>"
+
+                debug_logger.log(
+                    level="WARNING",
+                    operation="downloader_n_m3u8dl_re_no_output",
+                    message="N_m3u8DL-RE exited successfully but produced no output files",
+                    context={
+                        "track_id": getattr(track, "id", None),
+                        "track_type": track.__class__.__name__,
+                        "output_dir": str(output_dir),
+                        "output_dir_exists": output_dir_exists,
+                        "selection_args": selection_args,
+                        "track_url": track.url[:200] + "..." if len(track.url) > 200 else track.url,
+                        "n_m3u8dl_re_log": n_m3u8dl_log,
+                    },
+                )

     except ConnectionResetError:
         # interrupted while passing URI to download
         raise KeyboardInterrupt()
@@ -419,6 +475,7 @@ def download(
             )
         raise
     finally:
+        # Clean up temporary debug files
         if log_file_path and log_file_path.exists():
             try:
                 log_file_path.unlink()
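The `build_download_args` change above resolves the decryption engine once and pins the matching binary path when one was located. A standalone sketch of that resolution, with the `shaka`/`mp4decrypt` paths passed in as assumptions rather than discovered on the system:

```python
# Sketch of the engine/binary resolution added above: map the configured
# backend to N_m3u8DL-RE's --decryption-engine value and, when the matching
# binary is known, pin its path via --decryption-binary-path.
from pathlib import Path
from typing import Optional

DECRYPTION_ENGINE = {"shaka": "SHAKA_PACKAGER", "mp4decrypt": "MP4DECRYPT"}


def resolve_decryption_args(configured: str, shaka: Optional[Path], mp4decrypt: Optional[Path]) -> dict[str, str]:
    engine_name = DECRYPTION_ENGINE.get(configured.lower()) or "SHAKA_PACKAGER"
    args = {"--decryption-engine": engine_name}
    binary_path = None
    if engine_name == "SHAKA_PACKAGER" and shaka:
        binary_path = str(shaka)
    elif engine_name == "MP4DECRYPT" and mp4decrypt:
        binary_path = str(mp4decrypt)
    if binary_path:
        args["--decryption-binary-path"] = binary_path
    return args


print(resolve_decryption_args("shaka", Path("/usr/bin/packager"), None))
# {'--decryption-engine': 'SHAKA_PACKAGER', '--decryption-binary-path': '/usr/bin/packager'}
```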
@@ -122,7 +122,7 @@ def download(
                     last_speed_refresh = now
                     download_sizes.clear()

-    if content_length and written < content_length:
+    if not segmented and content_length and written < content_length:
         raise IOError(f"Failed to read {content_length} bytes from the track URI.")

     yield dict(file_downloaded=save_path, written=written)
@@ -264,7 +264,7 @@ def requests(

     try:
         with ThreadPoolExecutor(max_workers=max_workers) as pool:
-            for future in as_completed(pool.submit(download, session=session, segmented=False, **url) for url in urls):
+            for future in as_completed(pool.submit(download, session=session, segmented=True, **url) for url in urls):
                 try:
                     yield from future.result()
                 except KeyboardInterrupt:
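The `segmented=True` flip above relaxes the Content-Length check for per-segment requests, presumably because a short read of one segment is caught later when the merged output is validated, rather than failing the whole pool. A minimal sketch of the guard with illustrative values:

```python
# Sketch of the size check relaxed above: a Content-Length mismatch is only
# fatal for whole-file downloads; segmented calls defer the check.
def verify_written(written: int, content_length: int | None, segmented: bool) -> None:
    if not segmented and content_length and written < content_length:
        raise IOError(f"Failed to read {content_length} bytes from the track URI.")


verify_written(written=512, content_length=1024, segmented=True)   # ok: segment check deferred
# verify_written(written=512, content_length=1024, segmented=False)  # would raise IOError
```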
@@ -1,10 +1,11 @@
 from typing import Union

 from unshackle.core.drm.clearkey import ClearKey
+from unshackle.core.drm.monalisa import MonaLisa
 from unshackle.core.drm.playready import PlayReady
 from unshackle.core.drm.widevine import Widevine

-DRM_T = Union[ClearKey, Widevine, PlayReady]
+DRM_T = Union[ClearKey, Widevine, PlayReady, MonaLisa]


-__all__ = ("ClearKey", "Widevine", "PlayReady", "DRM_T")
+__all__ = ("ClearKey", "Widevine", "PlayReady", "MonaLisa", "DRM_T")
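Extending `DRM_T` means any `isinstance`-based dispatch over the union now has a MonaLisa branch. A toy sketch with stand-in stub classes (not the real implementations):

```python
# Illustrative only: the classes here are stubs standing in for the real
# DRM implementations, to show how a DRM_T union extension plays out.
from typing import Union


class ClearKey: ...
class Widevine: ...
class PlayReady: ...
class MonaLisa: ...


DRM_T = Union[ClearKey, Widevine, PlayReady, MonaLisa]


def describe(drm: DRM_T) -> str:
    if isinstance(drm, MonaLisa):
        return "MonaLisa: local key extraction, per-segment decryption"
    return f"{drm.__class__.__name__}: license-server challenge/response"


print(describe(MonaLisa()))
```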
280 unshackle/core/drm/monalisa.py Normal file
@@ -0,0 +1,280 @@
+"""
+MonaLisa DRM System.
+
+A WASM-based DRM system that uses local key extraction and two-stage
+segment decryption (ML-Worker binary + AES-ECB).
+"""
+
+from __future__ import annotations
+
+import os
+import subprocess
+import sys
+from pathlib import Path
+from typing import Any, Optional, Union
+from uuid import UUID
+
+from Cryptodome.Cipher import AES
+from Cryptodome.Util.Padding import unpad
+
+
+class MonaLisa:
+    """
+    MonaLisa DRM System.
+
+    Unlike Widevine/PlayReady, MonaLisa does not use a challenge/response flow
+    with a license server. Instead, the PSSH value (ticket) is provided directly
+    by the service API, and keys are extracted locally via a WASM module.
+
+    Decryption is performed in two stages:
+    1. ML-Worker binary: Removes MonaLisa encryption layer (bbts -> ents)
+    2. AES-ECB decryption: Final decryption with service-provided key
+    """
+
+    class Exceptions:
+        class TicketNotFound(Exception):
+            """Raised when no PSSH/ticket data is provided."""
+
+        class KeyExtractionFailed(Exception):
+            """Raised when key extraction from the ticket fails."""
+
+        class WorkerNotFound(Exception):
+            """Raised when the ML-Worker binary is not found."""
+
+        class DecryptionFailed(Exception):
+            """Raised when segment decryption fails."""
+
+    def __init__(
+        self,
+        ticket: Union[str, bytes],
+        aes_key: Union[str, bytes],
+        device_path: Path,
+        **kwargs: Any,
+    ):
+        """
+        Initialize MonaLisa DRM.
+
+        Args:
+            ticket: PSSH value from service API (base64 string or raw bytes).
+            aes_key: AES-ECB key for second-stage decryption (hex string or bytes).
+            device_path: Path to the CDM device file (.mld).
+            **kwargs: Additional metadata stored in self.data.
+
+        Raises:
+            TicketNotFound: If ticket/PSSH is empty.
+            KeyExtractionFailed: If key extraction fails.
+        """
+        if not ticket:
+            raise MonaLisa.Exceptions.TicketNotFound("No PSSH/ticket data provided.")
+
+        self._ticket = ticket
+
+        # Store AES key for second-stage decryption
+        if isinstance(aes_key, str):
+            self._aes_key = bytes.fromhex(aes_key)
+        else:
+            self._aes_key = aes_key
+
+        self._device_path = device_path
+        self._kid: Optional[UUID] = None
+        self._key: Optional[str] = None
+        self.data: dict = kwargs or {}
+
+        # Extract keys immediately
+        self._extract_keys()
+
+    def _extract_keys(self) -> None:
+        """Extract keys from the ticket using the MonaLisa CDM."""
+        # Import here to avoid circular import
+        from unshackle.core.cdm.monalisa import MonaLisaCDM
+
+        try:
+            cdm = MonaLisaCDM(device_path=self._device_path)
+            session_id = cdm.open()
+            try:
+                keys = cdm.extract_keys(self._ticket)
+                if keys:
+                    kid_hex = keys.get("kid")
+                    if kid_hex:
+                        self._kid = UUID(hex=kid_hex)
+                    self._key = keys.get("key")
+            finally:
+                cdm.close(session_id)
+        except Exception as e:
+            raise MonaLisa.Exceptions.KeyExtractionFailed(f"Failed to extract keys: {e}")
+
+    @classmethod
+    def from_ticket(
+        cls,
+        ticket: Union[str, bytes],
+        aes_key: Union[str, bytes],
+        device_path: Path,
+    ) -> MonaLisa:
+        """
+        Create a MonaLisa DRM instance from a PSSH/ticket.
+
+        Args:
+            ticket: PSSH value from service API.
+            aes_key: AES-ECB key for second-stage decryption.
+            device_path: Path to the CDM device file (.mld).
+
+        Returns:
+            MonaLisa DRM instance with extracted keys.
+        """
+        return cls(ticket=ticket, aes_key=aes_key, device_path=device_path)
+
+    @property
+    def kid(self) -> Optional[UUID]:
+        """Get the Key ID."""
+        return self._kid
+
+    @property
+    def key(self) -> Optional[str]:
+        """Get the content key as hex string."""
+        return self._key
+
+    @property
+    def pssh(self) -> str:
+        """
+        Get the raw PSSH/ticket value as a string.
+
+        Returns:
+            The raw PSSH value as a base64 string.
+        """
+        if isinstance(self._ticket, bytes):
+            return self._ticket.decode("utf-8")
+        return self._ticket
+
+    @property
+    def content_id(self) -> Optional[str]:
+        """
+        Extract the Content ID from the PSSH for display.
+
+        The PSSH contains an embedded Content ID at bytes 21-75 with format:
+        H5DCID-V3-P1-YYYYMMDD-HHMMSS-MEDIAID-TIMESTAMP-SUFFIX
+
+        Returns:
+            The Content ID string if extractable, None otherwise.
+        """
+        import base64
+
+        try:
+            # Decode base64 PSSH to get raw bytes
+            if isinstance(self._ticket, bytes):
+                data = self._ticket
+            else:
+                data = base64.b64decode(self._ticket)
+
+            # Content ID is at bytes 21-75 (55 bytes)
+            if len(data) >= 76:
+                content_id = data[21:76].decode("ascii")
+                # Validate it looks like a content ID
+                if content_id.startswith("H5DCID-"):
+                    return content_id
+        except Exception:
+            pass
+
+        return None
+
+    @property
+    def content_keys(self) -> dict[UUID, str]:
+        """
+        Get content keys in the same format as Widevine/PlayReady.
+
+        Returns:
+            Dictionary mapping KID to key hex string.
+        """
+        if self._kid and self._key:
+            return {self._kid: self._key}
+        return {}
+
+    def decrypt_segment(self, segment_path: Path) -> None:
+        """
+        Decrypt a single segment using two-stage decryption.
+
+        Stage 1: ML-Worker binary (bbts -> ents)
+        Stage 2: AES-ECB decryption (ents -> ts)
+
+        Args:
+            segment_path: Path to the encrypted segment file.
+
+        Raises:
+            WorkerNotFound: If ML-Worker binary is not available.
+            DecryptionFailed: If decryption fails at any stage.
+        """
+        if not self._key:
+            return
+
+        # Import here to avoid circular import
+        from unshackle.core.cdm.monalisa import MonaLisaCDM
+
+        worker_path = MonaLisaCDM.get_worker_path()
+        if not worker_path or not worker_path.exists():
+            raise MonaLisa.Exceptions.WorkerNotFound("ML-Worker not found.")
+
+        bbts_path = segment_path.with_suffix(".bbts")
+        ents_path = segment_path.with_suffix(".ents")
+
+        try:
+            if segment_path.exists():
+                segment_path.replace(bbts_path)
+            else:
+                raise MonaLisa.Exceptions.DecryptionFailed(f"Segment file does not exist: {segment_path}")
+
+            # Stage 1: ML-Worker decryption
+            cmd = [str(worker_path), self._key, str(bbts_path), str(ents_path)]
+
+            startupinfo = None
+            if sys.platform == "win32":
+                startupinfo = subprocess.STARTUPINFO()
+                startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
+
+            process = subprocess.run(
+                cmd,
+                stdout=subprocess.PIPE,
+                stderr=subprocess.PIPE,
+                text=True,
+                startupinfo=startupinfo,
+            )
+
+            if process.returncode != 0:
+                raise MonaLisa.Exceptions.DecryptionFailed(
+                    f"ML-Worker failed for {segment_path.name}: {process.stderr}"
+                )
+
+            if not ents_path.exists():
+                raise MonaLisa.Exceptions.DecryptionFailed(
+                    f"Decrypted .ents file was not created for {segment_path.name}"
+                )
+
+            # Stage 2: AES-ECB decryption
+            with open(ents_path, "rb") as f:
+                ents_data = f.read()
+
+            crypto = AES.new(self._aes_key, AES.MODE_ECB)
+            decrypted_data = unpad(crypto.decrypt(ents_data), AES.block_size)
+
+            # Write decrypted segment back to original path
+            with open(segment_path, "wb") as f:
+                f.write(decrypted_data)
+
+        except MonaLisa.Exceptions.DecryptionFailed:
+            raise
+        except Exception as e:
+            raise MonaLisa.Exceptions.DecryptionFailed(f"Failed to decrypt segment {segment_path.name}: {e}")
+        finally:
+            if ents_path.exists():
+                os.remove(ents_path)
+            if bbts_path != segment_path and bbts_path.exists():
+                os.remove(bbts_path)
+
+    def decrypt(self, _path: Path) -> None:
+        """
+        MonaLisa uses per-segment decryption during download via the
+        on_segment_downloaded callback. By the time this method is called,
+        the content has already been decrypted and muxed into a container.
+
+        Args:
+            path: Path to the file (ignored).
+        """
+        pass
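A hypothetical usage of the class above, assuming a valid `.mld` device file is available; the ticket, key, and paths are placeholders, not working values:

```python
# Hypothetical MonaLisa usage sketch; all values are illustrative.
from pathlib import Path

from unshackle.core.drm import MonaLisa

drm = MonaLisa.from_ticket(
    ticket="AAAA...base64-ticket...",            # PSSH/ticket from the service API
    aes_key="00112233445566778899aabbccddeeff",  # hex AES-ECB key (placeholder)
    device_path=Path("~/.unshackle/MLDs/device.mld").expanduser(),
)
print(drm.kid, drm.content_id)

# Per-segment decryption runs during download (bbts -> ents -> ts):
drm.decrypt_segment(Path("/tmp/segment_00001.ts"))
```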
@@ -151,6 +151,11 @@ class DASH:
                         if not track_fps and segment_base is not None:
                             track_fps = segment_base.get("timescale")

+                        scan_type = None
+                        scan_type_str = get("scanType")
+                        if scan_type_str and scan_type_str.lower() == "interlaced":
+                            scan_type = Video.ScanType.INTERLACED
+
                         track_args = dict(
                             range_=self.get_video_range(
                                 codecs, findall("SupplementalProperty"), findall("EssentialProperty")
@@ -159,6 +164,7 @@ class DASH:
                             width=get("width") or 0,
                             height=get("height") or 0,
                             fps=track_fps or None,
+                            scan_type=scan_type,
                         )
                     elif content_type == "audio":
                         track_type = Audio
@@ -366,6 +372,9 @@ class DASH:

         if not end_number:
             end_number = len(segment_durations)
+        # Handle high startNumber in DVR/catch-up manifests where startNumber > segment count
+        if start_number > end_number:
+            end_number = start_number + len(segment_durations) - 1

         for t, n in zip(segment_durations, range(start_number, end_number + 1)):
             segments.append(
@@ -467,8 +476,9 @@ class DASH:
             track.data["dash"]["timescale"] = int(segment_timescale)
             track.data["dash"]["segment_durations"] = segment_durations

-        if init_data and isinstance(track, (Video, Audio)):
-            if isinstance(cdm, PlayReadyCdm):
+        if not track.drm and init_data and isinstance(track, (Video, Audio)):
+            prefers_playready = isinstance(cdm, PlayReadyCdm) or (hasattr(cdm, "is_playready") and cdm.is_playready)
+            if prefers_playready:
                 try:
                     track.drm = [PlayReady.from_init_data(init_data)]
                 except PlayReady.Exceptions.PSSHNotFound:
@@ -572,8 +582,64 @@ class DASH:
         for control_file in save_dir.glob("*.aria2__temp"):
             control_file.unlink()

+        # Verify output directory exists and contains files
+        if not save_dir.exists():
+            error_msg = f"Output directory does not exist: {save_dir}"
+            if debug_logger:
+                debug_logger.log(
+                    level="ERROR",
+                    operation="manifest_dash_download_output_missing",
+                    message=error_msg,
+                    context={
+                        "track_id": getattr(track, "id", None),
+                        "track_type": track.__class__.__name__,
+                        "save_dir": str(save_dir),
+                        "save_path": str(save_path),
+                        "downloader": downloader.__name__,
+                        "skip_merge": skip_merge,
+                    },
+                )
+            raise FileNotFoundError(error_msg)
+
         segments_to_merge = [x for x in sorted(save_dir.iterdir()) if x.is_file()]

+        if debug_logger:
+            debug_logger.log(
+                level="DEBUG",
+                operation="manifest_dash_download_complete",
+                message="DASH download complete, preparing to merge",
+                context={
+                    "track_id": getattr(track, "id", None),
+                    "track_type": track.__class__.__name__,
+                    "save_dir": str(save_dir),
+                    "save_dir_exists": save_dir.exists(),
+                    "segments_found": len(segments_to_merge),
+                    "segment_files": [f.name for f in segments_to_merge[:10]],  # Limit to first 10
+                    "downloader": downloader.__name__,
+                    "skip_merge": skip_merge,
+                },
+            )
+
+        if not segments_to_merge:
+            error_msg = f"No segment files found in output directory: {save_dir}"
+            if debug_logger:
+                # List all contents of the directory for debugging
+                all_contents = list(save_dir.iterdir()) if save_dir.exists() else []
+                debug_logger.log(
+                    level="ERROR",
+                    operation="manifest_dash_download_no_segments",
+                    message=error_msg,
+                    context={
+                        "track_id": getattr(track, "id", None),
+                        "track_type": track.__class__.__name__,
+                        "save_dir": str(save_dir),
+                        "directory_contents": [str(p) for p in all_contents],
+                        "downloader": downloader.__name__,
+                        "skip_merge": skip_merge,
+                    },
+                )
+            raise FileNotFoundError(error_msg)
+
         if skip_merge:
             # N_m3u8DL-RE handles merging and decryption internally
             shutil.move(segments_to_merge[0], save_path)
@@ -800,7 +866,7 @@ class DASH:
             urn = (protection.get("schemeIdUri") or "").lower()

             if urn == WidevineCdm.urn:
-                pssh_text = protection.findtext("pssh")
+                pssh_text = protection.findtext("pssh") or protection.findtext("{urn:mpeg:cenc:2013}pssh")
                 if not pssh_text:
                     continue
                 pssh = PSSH(pssh_text)
@@ -831,6 +897,7 @@ class DASH:
             elif urn in ("urn:uuid:9a04f079-9840-4286-ab92-e65be0885f95", "urn:microsoft:playready"):
                 pr_pssh_b64 = (
                     protection.findtext("pssh")
+                    or protection.findtext("{urn:mpeg:cenc:2013}pssh")
                     or protection.findtext("pro")
                     or protection.findtext("{urn:microsoft:playready}pro")
                 )
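A worked example of the DVR/catch-up normalization added above: with a high `startNumber` and only a handful of durations, the default `end_number` would be smaller than `start_number`, so the `zip()` over segment numbers would yield nothing. Values below are illustrative:

```python
# Worked example of the start-number fix: startNumber=100 with five
# segment durations must pair all five durations with numbers 100..104.
segment_durations = [6000, 6000, 6000, 6000, 6000]
start_number = 100
end_number = len(segment_durations)  # 5 — would make the range empty

if start_number > end_number:
    end_number = start_number + len(segment_durations) - 1  # 104

print(list(zip(segment_durations, range(start_number, end_number + 1))))
# [(6000, 100), (6000, 101), (6000, 102), (6000, 103), (6000, 104)]
```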
@@ -30,7 +30,7 @@ from requests import Session
 from unshackle.core import binaries
 from unshackle.core.constants import DOWNLOAD_CANCELLED, DOWNLOAD_LICENCE_ONLY, AnyTrack
 from unshackle.core.downloaders import requests as requests_downloader
-from unshackle.core.drm import DRM_T, ClearKey, PlayReady, Widevine
+from unshackle.core.drm import DRM_T, ClearKey, MonaLisa, PlayReady, Widevine
 from unshackle.core.events import events
 from unshackle.core.tracks import Audio, Subtitle, Tracks, Video
 from unshackle.core.utilities import get_debug_logger, get_extension, is_close_match, try_ensure_utf8
@@ -316,6 +316,10 @@ class HLS:
                 progress(downloaded="[red]FAILED")
                 raise

+        if not initial_drm_licensed and session_drm and isinstance(session_drm, MonaLisa):
+            if license_widevine:
+                license_widevine(session_drm)
+
         if DOWNLOAD_LICENCE_ONLY.is_set():
             progress(downloaded="[yellow]SKIPPED")
             return
@@ -591,7 +595,11 @@ class HLS:

             segment_keys = getattr(segment, "keys", None)
             if segment_keys:
-                key = HLS.get_supported_key(segment_keys)
+                if cdm:
+                    cdm_segment_keys = HLS.filter_keys_for_cdm(segment_keys, cdm)
+                    key = HLS.get_supported_key(cdm_segment_keys) if cdm_segment_keys else HLS.get_supported_key(segment_keys)
+                else:
+                    key = HLS.get_supported_key(segment_keys)
                 if encryption_data and encryption_data[0] != key and i != 0 and segment not in unwanted_segments:
                     decrypt(include_this_segment=False)

@@ -650,6 +658,44 @@ class HLS:

         # finally merge all the discontinuity save files together to the final path
         segments_to_merge = find_segments_recursively(save_dir)

+        if debug_logger:
+            debug_logger.log(
+                level="DEBUG",
+                operation="manifest_hls_download_complete",
+                message="HLS download complete, preparing to merge",
+                context={
+                    "track_id": getattr(track, "id", None),
+                    "track_type": track.__class__.__name__,
+                    "save_dir": str(save_dir),
+                    "save_dir_exists": save_dir.exists(),
+                    "segments_found": len(segments_to_merge),
+                    "segment_files": [f.name for f in segments_to_merge[:10]],  # Limit to first 10
+                    "downloader": downloader.__name__,
+                    "skip_merge": skip_merge,
+                },
+            )
+
+        if not segments_to_merge:
+            error_msg = f"No segment files found in output directory: {save_dir}"
+            if debug_logger:
+                all_contents = list(save_dir.iterdir()) if save_dir.exists() else []
+                debug_logger.log(
+                    level="ERROR",
+                    operation="manifest_hls_download_no_segments",
+                    message=error_msg,
+                    context={
+                        "track_id": getattr(track, "id", None),
+                        "track_type": track.__class__.__name__,
+                        "save_dir": str(save_dir),
+                        "save_dir_exists": save_dir.exists(),
+                        "directory_contents": [str(p) for p in all_contents],
+                        "downloader": downloader.__name__,
+                        "skip_merge": skip_merge,
+                    },
+                )
+            raise FileNotFoundError(error_msg)
+
         if len(segments_to_merge) == 1:
             shutil.move(segments_to_merge[0], save_path)
         else:
@@ -889,7 +935,8 @@ class HLS:
         elif key.keyformat and key.keyformat.lower() == WidevineCdm.urn:
             return key
         elif key.keyformat and key.keyformat.lower() in {
-            f"urn:uuid:{PR_PSSH.SYSTEM_ID}", "com.microsoft.playready"
+            f"urn:uuid:{PR_PSSH.SYSTEM_ID}",
+            "com.microsoft.playready",
         }:
             return key
         else:
@@ -927,9 +974,7 @@ class HLS:
                 pssh=WV_PSSH(key.uri.split(",")[-1]),
                 **key._extra_params,  # noqa
             )
-        elif key.keyformat and key.keyformat.lower() in {
-            f"urn:uuid:{PR_PSSH.SYSTEM_ID}", "com.microsoft.playready"
-        }:
+        elif key.keyformat and key.keyformat.lower() in {f"urn:uuid:{PR_PSSH.SYSTEM_ID}", "com.microsoft.playready"}:
             drm = PlayReady(
                 pssh=PR_PSSH(key.uri.split(",")[-1]),
                 pssh_b64=key.uri.split(",")[-1],
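The segment-key change above prefers keys the active CDM can actually serve, via a `filter_keys_for_cdm` helper that is referenced here but defined elsewhere in the module. A hedged sketch of what such a filter plausibly looks like — the keyformat URNs are the real Widevine/PlayReady system IDs, while the `Key` shape and the boolean CDM flag are simplified stand-ins:

```python
# Hedged sketch of a filter_keys_for_cdm-style helper; the real signature
# takes a CDM object, simplified here to a boolean for illustration.
from dataclasses import dataclass

WIDEVINE_URN = "urn:uuid:edef8ba9-79d6-4ace-a3c8-27dcd51d21ed"
PLAYREADY_URN = "urn:uuid:9a04f079-9840-4286-ab92-e65be0885f95"


@dataclass
class Key:
    keyformat: str
    uri: str


def filter_keys_for_cdm(keys: list[Key], cdm_is_playready: bool) -> list[Key]:
    # Keep only keys whose keyformat the active CDM can service; callers
    # fall back to the unfiltered list when nothing matches.
    wanted = PLAYREADY_URN if cdm_is_playready else WIDEVINE_URN
    return [k for k in keys if (k.keyformat or "").lower() == wanted]


keys = [Key(WIDEVINE_URN, "data:...wv"), Key(PLAYREADY_URN, "data:...pr")]
print([k.uri for k in filter_keys_for_cdm(keys, cdm_is_playready=True)])  # ['data:...pr']
```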
@@ -314,8 +314,63 @@ class ISM:
         for control_file in save_dir.glob("*.aria2__temp"):
             control_file.unlink()

+        # Verify output directory exists and contains files
+        if not save_dir.exists():
+            error_msg = f"Output directory does not exist: {save_dir}"
+            if debug_logger:
+                debug_logger.log(
+                    level="ERROR",
+                    operation="manifest_ism_download_output_missing",
+                    message=error_msg,
+                    context={
+                        "track_id": getattr(track, "id", None),
+                        "track_type": track.__class__.__name__,
+                        "save_dir": str(save_dir),
+                        "save_path": str(save_path),
+                        "downloader": downloader.__name__,
+                        "skip_merge": skip_merge,
+                    },
+                )
+            raise FileNotFoundError(error_msg)
+
         segments_to_merge = [x for x in sorted(save_dir.iterdir()) if x.is_file()]

+        if debug_logger:
+            debug_logger.log(
+                level="DEBUG",
+                operation="manifest_ism_download_complete",
+                message="ISM download complete, preparing to merge",
+                context={
+                    "track_id": getattr(track, "id", None),
+                    "track_type": track.__class__.__name__,
+                    "save_dir": str(save_dir),
+                    "save_dir_exists": save_dir.exists(),
+                    "segments_found": len(segments_to_merge),
+                    "segment_files": [f.name for f in segments_to_merge[:10]],  # Limit to first 10
+                    "downloader": downloader.__name__,
+                    "skip_merge": skip_merge,
+                },
+            )
+
+        if not segments_to_merge:
+            error_msg = f"No segment files found in output directory: {save_dir}"
+            if debug_logger:
+                all_contents = list(save_dir.iterdir()) if save_dir.exists() else []
+                debug_logger.log(
+                    level="ERROR",
+                    operation="manifest_ism_download_no_segments",
+                    message=error_msg,
+                    context={
+                        "track_id": getattr(track, "id", None),
+                        "track_type": track.__class__.__name__,
+                        "save_dir": str(save_dir),
+                        "directory_contents": [str(p) for p in all_contents],
+                        "downloader": downloader.__name__,
+                        "skip_merge": skip_merge,
+                    },
+                )
+            raise FileNotFoundError(error_msg)
+
         if skip_merge:
             shutil.move(segments_to_merge[0], save_path)
         else:
@@ -1,7 +1,8 @@
 from .basic import Basic
+from .gluetun import Gluetun
 from .hola import Hola
 from .nordvpn import NordVPN
 from .surfsharkvpn import SurfsharkVPN
 from .windscribevpn import WindscribeVPN

-__all__ = ("Basic", "Hola", "NordVPN", "SurfsharkVPN", "WindscribeVPN")
+__all__ = ("Basic", "Gluetun", "Hola", "NordVPN", "SurfsharkVPN", "WindscribeVPN")
1338 unshackle/core/proxies/gluetun.py Normal file
File diff suppressed because it is too large
@@ -1,4 +1,5 @@
 import json
+import random
 import re
 from typing import Optional

@@ -46,8 +47,21 @@ class NordVPN(Proxy):

         HTTP proxies under port 80 were disabled on the 15th of Feb, 2021:
         https://nordvpn.com/blog/removing-http-proxies
+
+        Supports:
+        - Country code: "us", "ca", "gb"
+        - Country ID: "228"
+        - Specific server: "us1234"
+        - City selection: "us:seattle", "ca:calgary"
         """
         query = query.lower()
+        city = None
+
+        # Check if query includes city specification (e.g., "ca:calgary")
+        if ":" in query:
+            query, city = query.split(":", maxsplit=1)
+            city = city.strip()
+
         if re.match(r"^[a-z]{2}\d+$", query):
             # country and nordvpn server id, e.g., us1, fr1234
             hostname = f"{query}.nordvpn.com"
@@ -64,7 +78,12 @@ class NordVPN(Proxy):
             # NordVPN doesnt have servers in this region
             return

-        server_mapping = self.server_map.get(country["code"].lower())
+        # Check server_map for pinned servers (can include city)
+        server_map_key = f"{country['code'].lower()}:{city}" if city else country["code"].lower()
+        server_mapping = self.server_map.get(server_map_key) or (
+            self.server_map.get(country["code"].lower()) if not city else None
+        )
+
         if server_mapping:
             # country was set to a specific server ID in config
             hostname = f"{country['code'].lower()}{server_mapping}.nordvpn.com"
@@ -76,7 +95,19 @@ class NordVPN(Proxy):
                     f"The NordVPN Country {query} currently has no recommended servers. "
                     "Try again later. If the issue persists, double-check the query."
                 )
-            hostname = recommended_servers[0]["hostname"]
+
+            # Filter by city if specified
+            if city:
+                city_servers = self.filter_servers_by_city(recommended_servers, city)
+                if not city_servers:
+                    raise ValueError(
+                        f"No servers found in city '{city}' for country '{country['name']}'. "
+                        "Try a different city or check the city name spelling."
+                    )
+                recommended_servers = city_servers
+
+            # Pick a random server from the filtered list
+            hostname = random.choice(recommended_servers)["hostname"]

         if hostname.startswith("gb"):
             # NordVPN uses the alpha2 of 'GB' in API responses, but 'UK' in the hostname
@@ -95,6 +126,41 @@ class NordVPN(Proxy):
         ):
             return country

+    @staticmethod
+    def filter_servers_by_city(servers: list[dict], city: str) -> list[dict]:
+        """
+        Filter servers by city name.
+
+        The API returns servers with location data that includes city information.
+        This method filters servers to only those in the specified city.
+
+        Args:
+            servers: List of server dictionaries from the NordVPN API
+            city: City name to filter by (case-insensitive)
+
+        Returns:
+            List of servers in the specified city
+        """
+        city_lower = city.lower()
+        filtered = []
+
+        for server in servers:
+            # Each server has a 'locations' list with location data
+            locations = server.get("locations", [])
+            for location in locations:
+                # City data can be in different formats:
+                # - {"city": {"name": "Seattle", ...}}
+                # - {"city": "Seattle"}
+                city_data = location.get("city")
+                if city_data:
+                    # Handle both dict and string formats
+                    city_name = city_data.get("name") if isinstance(city_data, dict) else city_data
+                    if city_name and city_name.lower() == city_lower:
+                        filtered.append(server)
+                        break  # Found a match, no need to check other locations for this server
+
+        return filtered
+
     @staticmethod
     def get_recommended_servers(country_id: int) -> list[dict]:
         """
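A quick illustration of the `country[:city]` query grammar the docstring above documents — the query is lowercased and split once on `:` before any server lookup happens:

```python
# Standalone demo of the query parsing added to get_proxy above.
def parse_query(query: str) -> tuple[str, str | None]:
    query = query.lower()
    city = None
    if ":" in query:
        query, city = query.split(":", maxsplit=1)
        city = city.strip()
    return query, city


print(parse_query("ca:Calgary"))  # ('ca', 'calgary')
print(parse_query("us1234"))      # ('us1234', None)
```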
@@ -44,8 +44,21 @@ class SurfsharkVPN(Proxy):
     def get_proxy(self, query: str) -> Optional[str]:
         """
         Get an HTTP(SSL) proxy URI for a SurfsharkVPN server.
+
+        Supports:
+        - Country code: "us", "ca", "gb"
+        - Country ID: "228"
+        - Specific server: "us-bos" (Boston)
+        - City selection: "us:seattle", "ca:toronto"
         """
         query = query.lower()
+        city = None
+
+        # Check if query includes city specification (e.g., "us:seattle")
+        if ":" in query:
+            query, city = query.split(":", maxsplit=1)
+            city = city.strip()
+
         if re.match(r"^[a-z]{2}\d+$", query):
             # country and surfsharkvpn server id, e.g., au-per, be-anr, us-bos
             hostname = f"{query}.prod.surfshark.com"
@@ -62,13 +75,18 @@ class SurfsharkVPN(Proxy):
             # SurfsharkVPN doesnt have servers in this region
             return

-        server_mapping = self.server_map.get(country["countryCode"].lower())
+        # Check server_map for pinned servers (can include city)
+        server_map_key = f"{country['countryCode'].lower()}:{city}" if city else country["countryCode"].lower()
+        server_mapping = self.server_map.get(server_map_key) or (
+            self.server_map.get(country["countryCode"].lower()) if not city else None
+        )
+
         if server_mapping:
             # country was set to a specific server ID in config
             hostname = f"{country['code'].lower()}{server_mapping}.prod.surfshark.com"
         else:
             # get the random server ID
-            random_server = self.get_random_server(country["countryCode"])
+            random_server = self.get_random_server(country["countryCode"], city)
             if not random_server:
                 raise ValueError(
                     f"The SurfsharkVPN Country {query} currently has no random servers. "
@@ -92,18 +110,44 @@ class SurfsharkVPN(Proxy):
         ):
             return country

-    def get_random_server(self, country_id: str):
+    def get_random_server(self, country_id: str, city: Optional[str] = None):
         """
-        Get the list of random Server for a Country.
+        Get a random server for a Country, optionally filtered by city.

-        Note: There may not always be more than one recommended server.
+        Args:
+            country_id: The country code (e.g., "US", "CA")
+            city: Optional city name to filter by (case-insensitive)
+
+        Note: The API may include a 'location' field with city information.
+        If not available, this will return any server from the country.
         """
-        country = [x["connectionName"] for x in self.countries if x["countryCode"].lower() == country_id.lower()]
+        servers = [x for x in self.countries if x["countryCode"].lower() == country_id.lower()]
+
+        # Filter by city if specified
+        if city:
+            city_lower = city.lower()
+            # Check if servers have a 'location' field for city filtering
+            city_servers = [
+                x
+                for x in servers
+                if x.get("location", "").lower() == city_lower or x.get("city", "").lower() == city_lower
+            ]
+
+            if city_servers:
+                servers = city_servers
+            else:
+                raise ValueError(
+                    f"No servers found in city '{city}' for country '{country_id}'. "
+                    "Try a different city or check the city name spelling."
+                )
+
+        # Get connection names from filtered servers
+        connection_names = [x["connectionName"] for x in servers]
+
         try:
-            country = random.choice(country)
-            return country
-        except Exception:
-            raise ValueError("Could not get random countrycode from the countries list.")
+            return random.choice(connection_names)
+        except (IndexError, KeyError):
+            raise ValueError(f"Could not get random server for country '{country_id}'.")

     @staticmethod
     def get_countries() -> list[dict]:
@@ -45,22 +45,27 @@ class WindscribeVPN(Proxy):
         """
         Get an HTTPS proxy URI for a WindscribeVPN server.

-        Note: Windscribe's static OpenVPN credentials work reliably on US, AU, and NZ servers.
+        Supports:
+        - Country code: "us", "ca", "gb"
+        - City selection: "us:seattle", "ca:toronto"
         """
         query = query.lower()
-        supported_regions = {"us", "au", "nz"}
+        city = None

-        if query not in supported_regions and query not in self.server_map:
-            raise ValueError(
-                f"Windscribe proxy does not currently support the '{query.upper()}' region. "
-                f"Supported regions with reliable credentials: {', '.join(sorted(supported_regions))}. "
-            )
+        # Check if query includes city specification (e.g., "ca:toronto")
+        if ":" in query:
+            query, city = query.split(":", maxsplit=1)
+            city = city.strip()

-        if query in self.server_map:
+        # Check server_map for pinned servers (can include city)
+        server_map_key = f"{query}:{city}" if city else query
+        if server_map_key in self.server_map:
+            hostname = self.server_map[server_map_key]
+        elif query in self.server_map and not city:
             hostname = self.server_map[query]
         else:
             if re.match(r"^[a-z]+$", query):
-                hostname = self.get_random_server(query)
+                hostname = self.get_random_server(query, city)
             else:
                 raise ValueError(f"The query provided is unsupported and unrecognized: {query}")

@@ -70,22 +75,42 @@ class WindscribeVPN(Proxy):
             hostname = hostname.split(':')[0]
         return f"https://{self.username}:{self.password}@{hostname}:443"

-    def get_random_server(self, country_code: str) -> Optional[str]:
+    def get_random_server(self, country_code: str, city: Optional[str] = None) -> Optional[str]:
         """
-        Get a random server hostname for a country.
+        Get a random server hostname for a country, optionally filtered by city.

-        Returns None if no servers are available for the country.
+        Args:
+            country_code: The country code (e.g., "us", "ca")
+            city: Optional city name to filter by (case-insensitive)
+
+        Returns:
+            A random hostname from matching servers, or None if none available.
         """
+        hostnames = []
+
+        # Collect hostnames from ALL locations matching the country code
         for location in self.countries:
             if location.get("country_code", "").lower() == country_code.lower():
-                hostnames = []
                 for group in location.get("groups", []):
+                    # Filter by city if specified
+                    if city:
+                        group_city = group.get("city", "")
+                        if group_city.lower() != city.lower():
+                            continue
+
+                    # Collect hostnames from this group
                     for host in group.get("hosts", []):
                         if hostname := host.get("hostname"):
                             hostnames.append(hostname)

         if hostnames:
             return random.choice(hostnames)
+        elif city:
+            # No servers found for the specified city
+            raise ValueError(
+                f"No servers found in city '{city}' for country code '{country_code}'. "
+                "Try a different city or check the city name spelling."
+            )

         return None
@@ -53,8 +53,55 @@ class Service(metaclass=ABCMeta):
         if not ctx.parent or not ctx.parent.params.get("no_proxy"):
             if ctx.parent:
                 proxy = ctx.parent.params["proxy"]
+                proxy_query = ctx.parent.params.get("proxy_query")
+                proxy_provider_name = ctx.parent.params.get("proxy_provider")
             else:
                 proxy = None
+                proxy_query = None
+                proxy_provider_name = None

+            # Check for service-specific proxy mapping
+            service_name = self.__class__.__name__
+            service_config_dict = config.services.get(service_name, {})
+            proxy_map = service_config_dict.get("proxy_map", {})
+
+            if proxy_map and proxy_query:
+                # Build the full proxy query key (e.g., "nordvpn:ca" or "us")
+                if proxy_provider_name:
+                    full_proxy_key = f"{proxy_provider_name}:{proxy_query}"
+                else:
+                    full_proxy_key = proxy_query
+
+                # Check if there's a mapping for this query
+                mapped_value = proxy_map.get(full_proxy_key)
+                if mapped_value:
+                    self.log.info(f"Found service-specific proxy mapping: {full_proxy_key} -> {mapped_value}")
+                    # Query the proxy provider with the mapped value
+                    if proxy_provider_name:
+                        # Specific provider requested
+                        proxy_provider = next(
+                            (x for x in ctx.obj.proxy_providers if x.__class__.__name__.lower() == proxy_provider_name),
+                            None,
+                        )
+                        if proxy_provider:
+                            mapped_proxy_uri = proxy_provider.get_proxy(mapped_value)
+                            if mapped_proxy_uri:
+                                proxy = mapped_proxy_uri
+                                self.log.info(f"Using mapped proxy from {proxy_provider.__class__.__name__}: {proxy}")
+                            else:
+                                self.log.warning(f"Failed to get proxy for mapped value '{mapped_value}', using default")
+                        else:
+                            self.log.warning(f"Proxy provider '{proxy_provider_name}' not found, using default proxy")
+                    else:
+                        # No specific provider, try all providers
+                        for proxy_provider in ctx.obj.proxy_providers:
+                            mapped_proxy_uri = proxy_provider.get_proxy(mapped_value)
+                            if mapped_proxy_uri:
+                                proxy = mapped_proxy_uri
+                                self.log.info(f"Using mapped proxy from {proxy_provider.__class__.__name__}: {proxy}")
+                                break
+                        else:
+                            self.log.warning(f"No provider could resolve mapped value '{mapped_value}', using default")
+
             if not proxy:
                 # don't override the explicit proxy set by the user, even if they may be geoblocked
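
The mapping resolution above boils down to a two-step rule: build the lookup key from the provider and query, then keep the default proxy when the map has no entry. A self-contained sketch (the map values mirror the `proxy_map` example added to the EXAMPLE service config later in this commit):

```python
from typing import Optional

proxy_map = {"nordvpn:ca": "ca1577", "us": "123"}


def resolve_mapped_value(proxy_query: Optional[str], proxy_provider_name: Optional[str]) -> Optional[str]:
    """Build the full key ("provider:query" or bare "query") and look it up."""
    if not proxy_query:
        return None
    key = f"{proxy_provider_name}:{proxy_query}" if proxy_provider_name else proxy_query
    return proxy_map.get(key)


print(resolve_mapped_value("ca", "nordvpn"))  # 'ca1577'
print(resolve_mapped_value("us", None))       # '123'
print(resolve_mapped_value("gb", None))       # None -> default selection is kept
```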
@@ -58,6 +58,7 @@ class Services(click.MultiCommand):
     def get_path(name: str) -> Path:
         """Get the directory path of a command."""
         tag = Services.get_tag(name)
+
         for service in _SERVICES:
             if service.parent.stem == tag:
                 return service.parent
@@ -72,19 +73,22 @@ class Services(click.MultiCommand):
         """
         original_value = value
         value = value.lower()

        for path in _SERVICES:
             tag = path.parent.stem
             if value in (tag.lower(), *_ALIASES.get(tag, [])):
                 return tag

         return original_value

     @staticmethod
     def load(tag: str) -> Service:
         """Load a Service module by Service tag."""
         module = _MODULES.get(tag)
-        if not module:
-            raise KeyError(f"There is no Service added by the Tag '{tag}'")
-        return module
+        if module:
+            return module
+
+        raise KeyError(f"There is no Service added by the Tag '{tag}'")


 __all__ = ("Services",)
@@ -47,6 +47,8 @@ class Movie(Title):

     def __str__(self) -> str:
         if self.year:
+            if config.dash_naming:
+                return f"{self.name} - {self.year}"
             return f"{self.name} ({self.year})"
         return self.name

@@ -86,11 +88,21 @@ class Movie(Title):
             # likely a movie or HD source, so it's most likely widescreen so
             # 16:9 canvas makes the most sense.
             resolution = int(primary_video_track.width * (9 / 16))
-            name += f" {resolution}p"
+            # Determine scan type suffix - default to "p", use "i" only if explicitly interlaced
+            scan_suffix = "p"
+            scan_type = getattr(primary_video_track, 'scan_type', None)
+            if scan_type and str(scan_type).lower() == "interlaced":
+                scan_suffix = "i"
+            name += f" {resolution}{scan_suffix}"

-        # Service
+        # Service (use track source if available)
         if show_service:
-            name += f" {self.service.__name__}"
+            source_name = None
+            if self.tracks:
+                first_track = next(iter(self.tracks), None)
+                if first_track and hasattr(first_track, "source") and first_track.source:
+                    source_name = first_track.source
+            name += f" {source_name or self.service.__name__}"

         # 'WEB-DL'
         name += " WEB-DL"
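
The scan-type suffix logic is self-contained enough to check in isolation. A minimal sketch (here the plain `scan_type` string stands in for the new `Video.ScanType` attribute introduced later in this commit):

```python
from typing import Optional


def resolution_suffix(width: int, scan_type: Optional[str] = None) -> str:
    # 16:9 canvas assumption from the diff: derive height from width
    resolution = int(width * (9 / 16))
    suffix = "i" if scan_type and str(scan_type).lower() == "interlaced" else "p"
    return f"{resolution}{suffix}"


print(resolution_suffix(1920))                # 1080p
print(resolution_suffix(1920, "interlaced"))  # 1080i
```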
@@ -101,9 +101,14 @@ class Song(Title):
         name = str(self).split(" / ")[1]

         if config.scene_naming:
-            # Service
+            # Service (use track source if available)
             if show_service:
-                name += f" {self.service.__name__}"
+                source_name = None
+                if self.tracks:
+                    first_track = next(iter(self.tracks), None)
+                    if first_track and hasattr(first_track, "source") and first_track.source:
+                        source_name = first_track.source
+                name += f" {source_name or self.service.__name__}"

             # 'WEB-DL'
             name += " WEB-DL"
@@ -8,7 +8,7 @@ from pathlib import Path
 from rich.padding import Padding
 from rich.rule import Rule

-from unshackle.core.binaries import DoviTool, HDR10PlusTool
+from unshackle.core.binaries import FFMPEG, DoviTool, HDR10PlusTool
 from unshackle.core.config import config
 from unshackle.core.console import console

@@ -109,7 +109,7 @@ class Hybrid:
         """Simple ffmpeg execution without progress tracking"""
         p = subprocess.run(
             [
-                "ffmpeg",
+                str(FFMPEG) if FFMPEG else "ffmpeg",
                 "-nostdin",
                 "-i",
                 str(save_path),
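
The swap from a bare "ffmpeg" string to `str(FFMPEG) if FFMPEG else "ffmpeg"` means the resolved binary path is preferred, with PATH lookup as the fallback. A stand-alone equivalent, using `shutil.which` as a stand-in for `unshackle.core.binaries.FFMPEG` (an assumption about how that constant is populated):

```python
import shutil
import subprocess

# Stand-in: assume FFMPEG is a resolved path string, or None when the binary was not found
FFMPEG = shutil.which("ffmpeg")

cmd = [str(FFMPEG) if FFMPEG else "ffmpeg", "-version"]
subprocess.run(cmd, check=False)  # falls back to PATH resolution when FFMPEG is None
```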
@@ -314,6 +314,7 @@ class Tracks:
         progress: Optional[partial] = None,
         audio_expected: bool = True,
         title_language: Optional[Language] = None,
+        skip_subtitles: bool = False,
     ) -> tuple[Path, int, list[str]]:
         """
         Multiplex all the Tracks into a Matroska Container file.
@@ -328,6 +329,7 @@ class Tracks:
                 if embedded audio metadata should be added.
             title_language: The title's intended language. Used to select the best video track
                 for audio metadata when multiple video tracks exist.
+            skip_subtitles: Skip muxing subtitle tracks into the container.
         """
         if self.videos and not self.audio and audio_expected:
             video_track = None
@@ -439,34 +441,35 @@ class Tracks:
                     ]
                 )

-        for st in self.subtitles:
-            if not st.path or not st.path.exists():
-                raise ValueError("Text Track must be downloaded before muxing...")
-            events.emit(events.Types.TRACK_MULTIPLEX, track=st)
-            default = bool(self.audio and is_close_match(st.language, [self.audio[0].language]) and st.forced)
-            cl.extend(
-                [
-                    "--track-name",
-                    f"0:{st.get_track_name() or ''}",
-                    "--language",
-                    f"0:{st.language}",
-                    "--sub-charset",
-                    "0:UTF-8",
-                    "--forced-track",
-                    f"0:{st.forced}",
-                    "--default-track",
-                    f"0:{default}",
-                    "--hearing-impaired-flag",
-                    f"0:{st.sdh}",
-                    "--original-flag",
-                    f"0:{st.is_original_lang}",
-                    "--compression",
-                    "0:none",  # disable extra compression (probably zlib)
-                    "(",
-                    str(st.path),
-                    ")",
-                ]
-            )
+        if not skip_subtitles:
+            for st in self.subtitles:
+                if not st.path or not st.path.exists():
+                    raise ValueError("Text Track must be downloaded before muxing...")
+                events.emit(events.Types.TRACK_MULTIPLEX, track=st)
+                default = bool(self.audio and is_close_match(st.language, [self.audio[0].language]) and st.forced)
+                cl.extend(
+                    [
+                        "--track-name",
+                        f"0:{st.get_track_name() or ''}",
+                        "--language",
+                        f"0:{st.language}",
+                        "--sub-charset",
+                        "0:UTF-8",
+                        "--forced-track",
+                        f"0:{st.forced}",
+                        "--default-track",
+                        f"0:{default}",
+                        "--hearing-impaired-flag",
+                        f"0:{st.sdh}",
+                        "--original-flag",
+                        f"0:{st.is_original_lang}",
+                        "--compression",
+                        "0:none",  # disable extra compression (probably zlib)
+                        "(",
+                        str(st.path),
+                        ")",
+                    ]
+                )

         if self.chapters:
             chapters_path = config.directories.temp / config.filenames.chapters.format(
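
The new flag simply fences the whole subtitle block, so when it is set the mkvmerge command line gains no subtitle arguments at all. A reduced sketch of that control flow (the dict-based `st` records and the small flag subset are simplifications of the real Subtitle tracks):

```python
def build_subtitle_args(subtitles: list[dict], skip_subtitles: bool = False) -> list[str]:
    """Reduced sketch of the mkvmerge argument assembly guarded above."""
    cl: list[str] = []
    if not skip_subtitles:
        for st in subtitles:
            # each subtitle contributes its own mkvmerge flags, e.g. --language and the input file
            cl.extend(["--language", f"0:{st['language']}", "(", st["path"], ")"])
    return cl


print(build_subtitle_args([{"language": "en", "path": "sub.srt"}]))                       # full args
print(build_subtitle_args([{"language": "en", "path": "sub.srt"}], skip_subtitles=True))  # []
```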
@@ -186,6 +186,10 @@ class Video(Track):
             # for some reason there's no Dolby Vision info tag
             raise ValueError(f"The M3U Range Tag '{tag}' is not a supported Video Range")

+    class ScanType(str, Enum):
+        PROGRESSIVE = "progressive"
+        INTERLACED = "interlaced"
+
     def __init__(
         self,
         *args: Any,
@@ -195,6 +199,7 @@ class Video(Track):
         width: Optional[int] = None,
         height: Optional[int] = None,
         fps: Optional[Union[str, int, float]] = None,
+        scan_type: Optional[Video.ScanType] = None,
         **kwargs: Any,
     ) -> None:
         """
@@ -232,6 +237,8 @@ class Video(Track):
             raise TypeError(f"Expected height to be a {int}, not {height!r}")
         if not isinstance(fps, (str, int, float, type(None))):
             raise TypeError(f"Expected fps to be a {str}, {int}, or {float}, not {fps!r}")
+        if not isinstance(scan_type, (Video.ScanType, type(None))):
+            raise TypeError(f"Expected scan_type to be a {Video.ScanType}, not {scan_type!r}")

         self.codec = codec
         self.range = range_ or Video.Range.SDR
@@ -256,6 +263,7 @@ class Video(Track):
         except Exception as e:
             raise ValueError("Expected fps to be a number, float, or a string as numerator/denominator form, " + str(e))

+        self.scan_type = scan_type
         self.needs_duration_fix = False

     def __str__(self) -> str:
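
Because `ScanType` mixes in `str`, its members can be looked up by value and compared against plain strings, which is what the `Movie.__str__` suffix check relies on. A quick demonstration (re-declaring the enum from the hunk above so the snippet is self-contained):

```python
from enum import Enum


class ScanType(str, Enum):
    PROGRESSIVE = "progressive"
    INTERLACED = "interlaced"


print(ScanType("interlaced") is ScanType.INTERLACED)  # True: lookup by value
print(ScanType.INTERLACED == "interlaced")            # True: str mixin comparison
print(ScanType.INTERLACED.value)                      # 'interlaced'
```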
@@ -19,6 +19,7 @@ from urllib.parse import ParseResult, urlparse
 from uuid import uuid4

 import chardet
+import pycountry
 import requests
 from construct import ValidationError
 from fontTools import ttLib
@@ -277,6 +278,80 @@ def ap_case(text: str, keep_spaces: bool = False, stop_words: tuple[str] = None)
     )


+# Common country code aliases that differ from ISO 3166-1 alpha-2
+COUNTRY_CODE_ALIASES = {
+    "uk": "gb",  # United Kingdom -> Great Britain
+}
+
+
+def get_country_name(code: str) -> Optional[str]:
+    """
+    Convert a 2-letter country code to full country name.
+
+    Args:
+        code: ISO 3166-1 alpha-2 country code (e.g., 'ca', 'us', 'gb', 'uk')
+
+    Returns:
+        Full country name (e.g., 'Canada', 'United States', 'United Kingdom') or None if not found
+
+    Examples:
+        >>> get_country_name('ca')
+        'Canada'
+        >>> get_country_name('US')
+        'United States'
+        >>> get_country_name('uk')
+        'United Kingdom'
+    """
+    # Handle common aliases
+    code = COUNTRY_CODE_ALIASES.get(code.lower(), code.lower())
+
+    try:
+        country = pycountry.countries.get(alpha_2=code.upper())
+        if country:
+            return country.name
+    except (KeyError, LookupError):
+        pass
+    return None
+
+
+def get_country_code(name: str) -> Optional[str]:
+    """
+    Convert a country name to its 2-letter ISO 3166-1 alpha-2 code.
+
+    Args:
+        name: Full country name (e.g., 'Canada', 'United States', 'United Kingdom')
+
+    Returns:
+        2-letter country code in uppercase (e.g., 'CA', 'US', 'GB') or None if not found
+
+    Examples:
+        >>> get_country_code('Canada')
+        'CA'
+        >>> get_country_code('united states')
+        'US'
+        >>> get_country_code('United Kingdom')
+        'GB'
+    """
+    try:
+        # Try exact name match first
+        country = pycountry.countries.get(name=name.title())
+        if country:
+            return country.alpha_2.upper()
+
+        # Try common name (e.g., "Bolivia" vs "Bolivia, Plurinational State of")
+        country = pycountry.countries.get(common_name=name.title())
+        if country:
+            return country.alpha_2.upper()
+
+        # Try fuzzy search as fallback
+        results = pycountry.countries.search_fuzzy(name)
+        if results:
+            return results[0].alpha_2.upper()
+    except (KeyError, LookupError):
+        pass
+    return None
+
+
 def get_ip_info(session: Optional[requests.Session] = None) -> dict:
     """
     Use ipinfo.io to get IP location information.
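
Both helpers delegate the real work to `pycountry`, and the doctests double as a usage reference. The underlying lookups can also be exercised directly (requires the `pycountry` package; the fuzzy result below is the expected, not guaranteed, first match):

```python
import pycountry

# Exact alpha-2 lookup, the path taken by get_country_name after alias handling
print(pycountry.countries.get(alpha_2="GB").name)  # United Kingdom

# Fuzzy search, the last-resort path in get_country_code
print(pycountry.countries.search_fuzzy("Britain")[0].alpha_2)  # GB (expected)
```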
@@ -5,6 +5,8 @@ import click
 from click.shell_completion import CompletionItem
 from pywidevine.cdm import Cdm as WidevineCdm

+from unshackle.core.tracks.audio import Audio
+

 class VideoCodecChoice(click.Choice):
     """
@@ -241,6 +243,52 @@ class QualityList(click.ParamType):
         return sorted(resolutions, reverse=True)


+class AudioCodecList(click.ParamType):
+    """Parses comma-separated audio codecs like 'AAC,EC3'."""
+
+    name = "audio_codec_list"
+
+    def __init__(self, codec_enum):
+        self.codec_enum = codec_enum
+        self._name_to_codec: dict[str, Audio.Codec] = {}
+        for codec in codec_enum:
+            self._name_to_codec[codec.name.lower()] = codec
+            self._name_to_codec[codec.value.lower()] = codec
+
+        aliases = {
+            "eac3": "EC3",
+            "ddp": "EC3",
+            "vorbis": "OGG",
+        }
+        for alias, target in aliases.items():
+            if target in codec_enum.__members__:
+                self._name_to_codec[alias] = codec_enum[target]
+
+    def convert(self, value: Any, param: Optional[click.Parameter] = None, ctx: Optional[click.Context] = None) -> list:
+        if not value:
+            return []
+        if isinstance(value, self.codec_enum):
+            return [value]
+        if isinstance(value, list):
+            if all(isinstance(v, self.codec_enum) for v in value):
+                return value
+            values = [str(v).strip() for v in value]
+        else:
+            values = [v.strip() for v in str(value).split(",")]
+
+        codecs = []
+        for val in values:
+            if not val:
+                continue
+            key = val.lower()
+            if key in self._name_to_codec:
+                codecs.append(self._name_to_codec[key])
+            else:
+                valid = sorted(set(self._name_to_codec.keys()))
+                self.fail(f"'{val}' is not valid. Choices: {', '.join(valid)}", param, ctx)
+        return list(dict.fromkeys(codecs))  # Remove duplicates, preserve order
+
+
 class MultipleChoice(click.Choice):
     """
     The multiple choice type allows multiple values to be checked against
@@ -288,5 +336,6 @@ class MultipleChoice(click.Choice):
 SEASON_RANGE = SeasonRange()
 LANGUAGE_RANGE = LanguageRange()
 QUALITY_LIST = QualityList()
+AUDIO_CODEC_LIST = AudioCodecList(Audio.Codec)

 # VIDEO_CODEC_CHOICE will be created dynamically when imported
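
Exercising the parser with a minimal stand-in for `Audio.Codec` shows the alias and de-duplication behavior. The member names and values below are assumptions chosen to match the alias table; the real enum lives in `unshackle.core.tracks.audio`, and `AudioCodecList` from the hunk above is assumed to be in scope:

```python
from enum import Enum


class Codec(Enum):  # stand-in for Audio.Codec
    AAC = "aac"
    EC3 = "ec3"
    OGG = "ogg"


acl = AudioCodecList(Codec)
print(acl.convert("aac, ddp"))  # AAC and EC3: 'ddp' resolves through the alias table
print(acl.convert("AAC,aac"))   # AAC only: duplicates removed, order preserved
```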
@@ -66,6 +66,11 @@ debug_keys:
 # Muxing configuration
 muxing:
   set_title: false
+  # merge_audio: Merge all audio tracks into each output file
+  #   true (default): All selected audio in one MKV per quality
+  #   false: Separate MKV per (quality, audio_codec) combination
+  #   Example: Title.1080p.AAC.mkv, Title.1080p.EC3.mkv
+  merge_audio: true

 # Login credentials for each Service
 credentials:
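
The filename example in the comment corresponds to a per-(quality, codec) fan-out when `merge_audio` is false. A tiny sketch of the naming behavior (the template string is illustrative, not unshackle's exact filename formatter):

```python
qualities = [1080]
audio_codecs = ["AAC", "EC3"]
merge_audio = False

if merge_audio:
    outputs = [f"Title.{q}p.mkv" for q in qualities]
else:
    outputs = [f"Title.{q}p.{codec}.mkv" for q in qualities for codec in audio_codecs]

print(outputs)  # ['Title.1080p.AAC.mkv', 'Title.1080p.EC3.mkv']
```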
@@ -268,6 +273,15 @@ remote_cdm:
     host: "https://keyxtractor.decryptlabs.com"
     secret: "your_decrypt_labs_api_key_here"

+  # PyPlayReady RemoteCdm - connects to an unshackle serve instance
+  - name: "playready_remote"
+    Device Type: PLAYREADY
+    System ID: 0
+    Security Level: 3000  # 2000 for SL2000, 3000 for SL3000
+    Host: "http://127.0.0.1:8786/playready"  # Include /playready path
+    Secret: "your-api-secret-key"
+    Device Name: "my_prd_device"  # Device name on the serve instance
+
 # Key Vaults store your obtained Content Encryption Keys (CEKs)
 # Use 'no_push: true' to prevent a vault from receiving pushed keys
 # while still allowing it to provide keys when requested
@@ -368,17 +382,29 @@ subtitle:
   # When true, skips pycaption processing for WebVTT files to keep tags like <i>, <b>, positioning intact
   # Combined with no sub_format setting, ensures subtitles remain in their original format (default: true)
   preserve_formatting: true
+  # output_mode: Output mode for subtitles
+  #   - mux: Embed subtitles in MKV container only (default)
+  #   - sidecar: Save subtitles as separate files only
+  #   - both: Embed in MKV AND save as sidecar files
+  output_mode: mux
+  # sidecar_format: Format for sidecar subtitle files
+  #   Options: srt, vtt, ass, original (keep current format)
+  sidecar_format: srt

-# Configuration for pywidevine's serve functionality
+# Configuration for pywidevine and pyplayready's serve functionality
 serve:
   api_secret: "your-secret-key-here"
   users:
     secret_key_for_user:
-      devices:
+      devices:  # Widevine devices (WVDs) this user can access
         - generic_nexus_4464_l3
+      playready_devices:  # PlayReady devices (PRDs) this user can access
+        - playready_device_sl3000
       username: user
-      # devices:
+      # devices:  # Widevine device paths (auto-populated from directories.wvds)
       #   - '/path/to/device.wvd'
+      # playready_devices:  # PlayReady device paths (auto-populated from directories.prds)
+      #   - '/path/to/device.prd'

 # Configuration data for each Service
 services:
@@ -412,6 +438,19 @@ services:
     app_name: "AIV"
     device_model: "Fire TV Stick 4K"

+    # Service-specific proxy mappings
+    # Override global proxy selection with specific servers for this service
+    # When --proxy matches a key in proxy_map, the mapped server will be used
+    # instead of the default/random server selection
+    proxy_map:
+      nordvpn:ca: ca1577  # Use ca1577 when --proxy nordvpn:ca is specified
+      nordvpn:us: us9842  # Use us9842 when --proxy nordvpn:us is specified
+      us: 123  # Use server 123 (from any provider) when --proxy us is specified
+      gb: 456  # Use server 456 (from any provider) when --proxy gb is specified
+      # Without this mapping, --proxy nordvpn:ca picks a random CA server
+      # With it, --proxy nordvpn:ca uses ca1577 specifically for the EXAMPLE service
+      # Other services, or runs with no service specified, still use random selection
+
     # NEW: Configuration overrides (can be combined with profiles and certificates)
     # Override dl command defaults for this service
     dl:
@@ -482,8 +521,15 @@ proxy_providers:
   nordvpn:
     username: username_from_service_credentials
     password: password_from_service_credentials
+    # server_map: global mapping that applies to ALL services
+    # Difference from service-specific proxy_map:
+    #   - server_map: applies to ALL services when --proxy nordvpn:us is used
+    #   - proxy_map: only applies to the specific service configured (see services: EXAMPLE: proxy_map above)
+    #   - proxy_map takes precedence over server_map for that service
     server_map:
       us: 12  # force US server #12 for US proxies
+      ca:calgary: 2534  # force CA server #2534 for Calgary proxies
+      us:seattle: 7890  # force US server #7890 for Seattle proxies
   surfsharkvpn:
     username: your_surfshark_service_username  # Service credentials from https://my.surfshark.com/vpn/manual-setup/main/openvpn
     password: your_surfshark_service_password  # Service credentials (not your login password)
@@ -491,12 +537,81 @@ proxy_providers:
       us: 3844  # force US server #3844 for US proxies
       gb: 2697  # force GB server #2697 for GB proxies
       au: 4621  # force AU server #4621 for AU proxies
+      us:seattle: 5678  # force US server #5678 for Seattle proxies
+      ca:toronto: 1234  # force CA server #1234 for Toronto proxies
   windscribevpn:
     username: your_windscribe_username  # Service credentials from https://windscribe.com/getconfig/openvpn
     password: your_windscribe_password  # Service credentials (not your login password)
     server_map:
       us: "us-central-096.totallyacdn.com"  # force US server
       gb: "uk-london-055.totallyacdn.com"  # force GB server
+      us:seattle: "us-west-011.totallyacdn.com"  # force US Seattle server
+      ca:toronto: "ca-toronto-012.totallyacdn.com"  # force CA Toronto server
+
+  # Gluetun: Dynamic Docker-based VPN proxy (supports 50+ VPN providers)
+  # Creates Docker containers running Gluetun to bridge VPN connections to HTTP proxies
+  # Requires Docker to be installed and running
+  # Usage: --proxy gluetun:windscribe:us or --proxy gluetun:nordvpn:de
+  gluetun:
+    # Global settings
+    base_port: 8888  # Starting port for HTTP proxies (increments for each container)
+    auto_cleanup: true  # Automatically remove containers when done
+    container_prefix: "unshackle-gluetun"  # Docker container name prefix
+    verify_ip: true  # Verify VPN IP matches expected region
+    # Optional HTTP proxy authentication (for the proxy itself, not VPN)
+    # auth_user: proxy_user
+    # auth_password: proxy_password
+
+    # VPN provider configurations
+    providers:
+      # Windscribe (WireGuard) - Get credentials from https://windscribe.com/getconfig/wireguard
+      windscribe:
+        vpn_type: wireguard
+        credentials:
+          private_key: "YOUR_WIREGUARD_PRIVATE_KEY"
+          addresses: "YOUR_WIREGUARD_ADDRESS"  # e.g., "10.x.x.x/32"
+        # Map friendly names to country codes
+        server_countries:
+          us: US
+          uk: GB
+          ca: CA
+          de: DE
+
+      # NordVPN (OpenVPN) - Get service credentials from https://my.nordaccount.com/dashboard/nordvpn/manual-configuration/
+      # Note: Service credentials are NOT your email+password - generate them from the link above
+      # nordvpn:
+      #   vpn_type: openvpn
+      #   credentials:
+      #     username: "YOUR_NORDVPN_SERVICE_USERNAME"
+      #     password: "YOUR_NORDVPN_SERVICE_PASSWORD"
+      #   server_countries:
+      #     us: US
+      #     uk: GB
+
+      # ExpressVPN (OpenVPN) - Get credentials from ExpressVPN setup page
+      # expressvpn:
+      #   vpn_type: openvpn
+      #   credentials:
+      #     username: "YOUR_EXPRESSVPN_USERNAME"
+      #     password: "YOUR_EXPRESSVPN_PASSWORD"
+      #   server_countries:
+      #     us: US
+      #     uk: GB
+
+      # Surfshark (WireGuard) - Get credentials from https://my.surfshark.com/vpn/manual-setup/main/wireguard
+      # surfshark:
+      #   vpn_type: wireguard
+      #   credentials:
+      #     private_key: "YOUR_SURFSHARK_PRIVATE_KEY"
+      #     addresses: "YOUR_SURFSHARK_ADDRESS"
+      #   server_countries:
+      #     us: US
+      #     uk: GB
+
+      # Specific server selection: Use format like "us1239" to select specific servers
+      # Example: --proxy gluetun:nordvpn:us1239 connects to us1239.nordvpn.com
+      # Supported providers: nordvpn, surfshark, expressvpn, cyberghost
+
   basic:
     GB:
       - "socks5://username:password@bhx.socks.ipvanish.com:1080"  # 1 (Birmingham)