* Yandex Music: Add configurable My Wave settings
Add 6 new configuration options for Yandex Music provider:
- My Wave maximum tracks (default: 150) - Control total number of tracks fetched
- My Wave batch count (default: 3) - Number of API calls for initial load
- Track details batch size (default: 50) - Batch size for track detail requests
- Discovery initial tracks (default: 5) - Initial display limit for Discover
- Browse initial tracks (default: 15) - Initial display limit for Browse
- Enable Discover (default: true) - Toggle recommendations on/off
Implemented duplicate protection for My Wave tracks using set-based tracking.
Recommendations now refresh every 60 seconds instead of 3 hours for fresher discoveries.
All new settings have sensible defaults that maintain current behavior.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
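The set-based duplicate protection mentioned above can be sketched as follows; the function name and track shape are illustrative, not the provider's actual code:

```python
def dedupe_batches(batches):
    """Yield tracks from successive My Wave batches, skipping IDs already seen."""
    seen_ids: set[str] = set()
    for batch in batches:
        for track in batch:
            track_id = str(track["id"])
            if track_id in seen_ids:
                continue  # duplicate across batches -> skip
            seen_ids.add(track_id)
            yield track

tracks = list(dedupe_batches([
    [{"id": 1}, {"id": 2}],
    [{"id": 2}, {"id": 3}],  # id 2 repeats in the second batch
]))
print([t["id"] for t in tracks])  # -> [1, 2, 3]
```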
* feat(yandex_music): add configurable base URL
Add an Advanced setting for the API base URL in the Yandex Music provider,
allowing users to change the API endpoint if the service updates its URL.
Changes:
- Add CONF_BASE_URL and DEFAULT_BASE_URL constants
- Add Advanced ConfigEntry for base_url in provider settings
- Update YandexMusicClient to accept base_url parameter
- Pass base_url from config to ClientAsync initialization
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* feat(yandex_music): improve Liked Tracks with virtual playlist and sorting
This commit adds three major improvements to the Liked Tracks feature:
1. Reverse chronological sorting:
- Liked tracks are now sorted by timestamp (most recent first)
- Matches mobile app behavior for better UX
- Applied automatically in get_liked_tracks() method
2. Browse folder visibility toggle:
- Added CONF_ENABLE_LIKED_TRACKS_BROWSE config option
- Allows hiding Liked Tracks folder from Browse section
- Default: True (backward compatible)
3. Virtual playlist for Liked Tracks:
- Added LIKED_TRACKS_PLAYLIST_ID virtual playlist
- Appears in library playlists (similar to My Wave)
- Supports full MA playlist features (radio, favorites, etc.)
- Configurable via CONF_ENABLE_LIKED_TRACKS_PLAYLIST
- Respects CONF_LIKED_TRACKS_MAX_TRACKS limit (default: 500)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
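The reverse-chronological sort can be sketched as follows; the track dict shape here is illustrative, not the real model:

```python
def sort_liked_tracks(tracks: list[dict]) -> list[dict]:
    """Most recently liked first, matching the mobile app behavior."""
    return sorted(tracks, key=lambda t: t["timestamp"], reverse=True)

liked = [
    {"title": "old", "timestamp": "2023-01-01T00:00:00"},
    {"title": "new", "timestamp": "2024-06-01T12:00:00"},
]
# ISO-8601 timestamps sort lexicographically in chronological order.
print([t["title"] for t in sort_liked_tracks(liked)])  # -> ['new', 'old']
```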
* refactor(yandex_music): organize configuration settings with categories and advanced flags
Restructures the 16 Yandex Music provider configuration entries to improve UX:
- Groups My Wave settings (6 entries) in "my_wave" category
- Groups Liked Tracks settings (3 entries) in "liked_tracks" category
- Marks performance tuning settings as advanced (7 entries total)
- Maintains authentication/quality settings at top level
This reduces visible clutter from 16 to ~8 settings by default, with advanced
options hidden behind a toggle. No breaking changes - all config keys, defaults,
and functionality remain unchanged.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* feat(yandex_music): Implement FLAC lossless playback with AES-256 decryption
This commit fully implements lossless FLAC playback for Yandex Music, fixing the
issue where only MP3 320kbps was available despite the user having a premium subscription.
## Changes
### Quality Levels Restructure
- Replaced HIGH/LOSSLESS quality options with three-tier system matching reference implementation
- EFFICIENT (AAC ~64kbps) - Low quality, efficient bandwidth
- BALANCED (AAC ~192kbps) - Medium quality (new default)
- SUPERB (FLAC Lossless) - Highest quality
- Updated manifest.json and config entries to reflect new quality options
### API Authentication Fix
- Implemented manual HMAC-SHA256 sign calculation matching yandex-music-downloader-realflac
- Critical fix: Remove last character from base64-encoded sign ([:-1])
- Fixed HTTP 403 "not-allowed" errors from /get-file-info endpoint
- Uses DEFAULT_SIGN_KEY from yandex-music library
### FLAC Decryption Implementation
- Added _decrypt_track_url() method using AES-256 CTR mode
- Uses pycryptodome, which supports a 12-byte nonce for CTR mode
- Key is HEX-encoded (bytes.fromhex), not base64 as initially attempted
- Downloads encrypted stream, decrypts on-the-fly, saves to temp file
- Returns StreamDetails with StreamType.LOCAL_FILE pointing to decrypted temp file
### Streaming Logic Updates
- Enhanced get_stream_details() to handle encrypted URLs from encraw transport
- Detects needs_decryption flag in API response
- Falls back gracefully to MP3 if decryption fails
- Supports both encrypted and unencrypted FLAC URLs
- Updated _select_best_quality() to intelligently select based on three quality tiers
### Dependencies
- Added pycryptodome==3.21.0 to support AES CTR mode with 12-byte nonce
- Uses aiohttp for direct HTTP download of encrypted streams
### Testing
- All existing tests pass (7/7 in test_streaming.py)
- Type checking passes (mypy success)
- Code quality checks pass (ruff linter/formatter)
## Technical Details
The Yandex Music API returns encrypted URLs when using transports=encraw. The decryption
process matches the working reference implementation:
1. Calculate HMAC-SHA256 sign with all param values joined
2. Base64 encode and remove last character (critical!)
3. Request /get-file-info with quality=lossless, codecs=flac-mp4,flac,...
4. Download encrypted stream from returned URL
5. Decrypt using AES-256 CTR with 12-byte null nonce and HEX-decoded key
6. Save decrypted FLAC to temporary file
7. Return stream details pointing to temp file
## Tested
- Server starts successfully with Yandex Music provider loaded
- FLAC codec detected from API (codec=flac-mp4)
- Encryption detected and decryption executes (55MB encrypted → decrypted)
- StreamType.LOCAL_FILE allows playback from temp file
- Graceful fallback to MP3 if decryption fails
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
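The sign calculation described above can be sketched like this; the param names and key are placeholders (the real code uses DEFAULT_SIGN_KEY from the yandex-music library):

```python
import base64
import hashlib
import hmac

def build_sign(params: dict[str, str], sign_key: str) -> str:
    """HMAC-SHA256 over the joined param values, base64-encoded with the
    final character stripped -- the 'critical' [:-1] step from the commit."""
    message = "".join(params.values()).encode()
    digest = hmac.new(sign_key.encode(), message, hashlib.sha256).digest()
    # A 32-byte SHA-256 digest base64-encodes to 44 chars ending in one "=";
    # the API expects that trailing character removed.
    return base64.b64encode(digest).decode()[:-1]

sign = build_sign({"ts": "1700000000", "trackId": "123"}, "example-key")
print(len(sign))  # -> 43
```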
* feat(yandex_music): Implement on-the-fly streaming decryption for FLAC
Replace temp file approach with memory-based streaming decryption for better
performance and reduced disk I/O.
## Changes
### StreamType.CUSTOM Implementation
- Use AsyncGenerator[bytes, None] for streaming decryption
- No temporary files on disk - all processing in memory
- Store encrypted URL and key in StreamDetails.data
### Streaming Decryption Pipeline
- Download encrypted stream in 64KB chunks using aiohttp
- Decrypt incrementally with AES-256 CTR mode
- Counter auto-increments for each block (streaming-friendly)
- Yield decrypted chunks directly to audio pipeline
### Performance Improvements
- **First chunk:** 0.39s vs 5+ seconds with temp file approach
- **No disk I/O:** Streaming directly from memory
- **Lower latency:** Start playback while downloading/decrypting
- **Efficient:** 64KB chunks balance memory and throughput
### Implementation Details
- Added get_audio_stream() method to YandexMusicStreamingManager
- Cipher initialization with 12-byte null nonce (pycryptodome)
- ClientTimeout: connect=30s, sock_read=600s for stable streaming
- Proper error handling and logging throughout pipeline
### Technical Notes
AES CTR mode is ideal for streaming because:
- Each block can be encrypted/decrypted independently
- Counter increments automatically - no state management needed
- Supports arbitrary chunk sizes (not just 16-byte blocks)
## Tested
- All 7 unit tests pass
- Type checking passes (mypy)
- Code quality checks pass (ruff)
- Live streaming confirmed: codec=flac, streamtype=custom
- First audio chunk in <0.4s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
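The on-the-fly pipeline can be illustrated with a toy XOR cipher standing in for AES-256-CTR (so the sketch runs without pycryptodome); the structure mirrors the AsyncGenerator approach described above:

```python
import asyncio
from collections.abc import AsyncGenerator

async def decrypt_stream(
    encrypted_chunks: AsyncGenerator[bytes, None], decrypt_chunk
) -> AsyncGenerator[bytes, None]:
    """Yield decrypted chunks as they arrive -- no temp file on disk.
    In the provider, decrypt_chunk would be an AES-256-CTR cipher's
    .decrypt(); CTR tracks its own counter, so any chunk size works."""
    async for chunk in encrypted_chunks:
        yield decrypt_chunk(chunk)

def xor_cipher(data: bytes) -> bytes:
    # Trivial symmetric stand-in for AES so the sketch is self-contained.
    return bytes(b ^ 0x5A for b in data)

async def demo() -> bytes:
    async def source():
        for part in (b"hel", b"lo"):  # pretend these are 64KB download chunks
            yield xor_cipher(part)    # "encrypted" on the way in
    return b"".join([c async for c in decrypt_stream(source(), xor_cipher)])

print(asyncio.run(demo()))  # -> b'hello'
```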
* fix(deps): Update pycryptodome to 3.23.0 to resolve dependency conflict
pywidevine==1.9.0 requires pycryptodome>=3.23.0
Updated from 3.21.0 to 3.23.0 to satisfy both dependencies
* feat(yandex_music): add High quality level (MP3 320kbps)
Add new "High" quality level between Balanced and Superb:
- Efficient (AAC ~64kbps) - Low quality
- Balanced (AAC ~192kbps) - Medium quality
- High (MP3 ~320kbps) - High quality lossy (NEW)
- Superb (FLAC Lossless) - Highest quality
Implementation:
- Add QUALITY_HIGH constant in constants.py
- Add "High (MP3 ~320kbps)" option in config UI
- Update _select_best_quality() logic to prefer MP3 >=256kbps
- Fallback chain: high bitrate MP3 → any MP3 → highest non-FLAC
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
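The fallback chain can be sketched as follows; the download-option dict shape ("codec", "bitrate") is an assumption for illustration, not the yandex-music library's actual model:

```python
def select_high_quality(downloads):
    """Fallback chain from the commit: MP3 >= 256 kbps, else any MP3,
    else the highest-bitrate non-FLAC option (None if nothing matches)."""
    mp3s = [d for d in downloads if d["codec"] == "mp3"]
    high_mp3s = [d for d in mp3s if d["bitrate"] >= 256]
    if high_mp3s:
        return max(high_mp3s, key=lambda d: d["bitrate"])
    if mp3s:
        return max(mp3s, key=lambda d: d["bitrate"])
    non_flac = [d for d in downloads if not d["codec"].startswith("flac")]
    return max(non_flac, key=lambda d: d["bitrate"]) if non_flac else None

options = [
    {"codec": "mp3", "bitrate": 320},
    {"codec": "mp3", "bitrate": 192},
    {"codec": "aac", "bitrate": 256},
    {"codec": "flac", "bitrate": 1411},
]
print(select_high_quality(options)["bitrate"])  # -> 320
```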
* feat(yandex_music): add configurable FLAC streaming modes
On low-power devices, encrypted FLAC streaming breaks after ~1 minute
because AES decryption can't keep up with download speed, causing the
server to drop the connection (ClientPayloadError).
Add 3 configurable streaming modes to decouple download from decryption:
- Direct: on-the-fly decrypt (original behavior, fast devices)
- Buffered: async queue with backpressure (recommended, default)
- Preload: full download then decrypt via SpooledTemporaryFile (slow devices)
Closes #29
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
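The Buffered mode's queue-with-backpressure idea, as a minimal sketch (names and queue size are illustrative):

```python
import asyncio

async def buffered_pipe(source, maxsize: int = 4):
    """Decouple download (producer) from decryption/consumption: put()
    blocks when the bounded queue is full, applying backpressure."""
    queue: asyncio.Queue[bytes | None] = asyncio.Queue(maxsize=maxsize)

    async def producer():
        async for chunk in source:
            await queue.put(chunk)  # blocks if the consumer lags behind
        await queue.put(None)       # sentinel: end of stream

    task = asyncio.create_task(producer())
    try:
        while (chunk := await queue.get()) is not None:
            yield chunk
    finally:
        task.cancel()  # no-op if the producer already finished

async def demo():
    async def source():
        for i in range(5):
            yield f"c{i}".encode()
    return [c async for c in buffered_pipe(source())]

print(asyncio.run(demo()))  # -> [b'c0', b'c1', b'c2', b'c3', b'c4']
```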
* fix(yandex_music): auto-reconnect API client after network outage
_ensure_connected() now attempts reconnect when _client is None instead
of raising immediately. Added _call_with_retry() helper that wraps API
calls with one reconnect attempt on connection errors. Refactored all
API methods to use it, eliminating ad-hoc retry loops.
Fixes permanent provider death after temporary network outage where
_reconnect() failure set _client=None with no recovery path.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
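A hypothetical sketch of the _call_with_retry pattern; the flaky-call simulation below is illustrative:

```python
import asyncio

async def call_with_retry(call, reconnect):
    """On a connection error, reconnect once and retry the call once;
    a second failure propagates to the caller."""
    try:
        return await call()
    except ConnectionError:
        await reconnect()  # single reconnect attempt
        return await call()

state = {"connected": False, "calls": 0}

async def flaky_call():
    state["calls"] += 1
    if not state["connected"]:
        raise ConnectionError("socket dropped")
    return "ok"

async def reconnect():
    state["connected"] = True

print(asyncio.run(call_with_retry(flaky_call, reconnect)))  # -> ok
```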
* feat(yandex_music): extend recommendations with feed, chart, releases, and playlists
Add 4 new recommendation sections beyond My Wave: Made for You (personalized
feed playlists), Chart (top tracks), New Releases (albums), and New Playlists
(editorial). Each section has its own config toggle under a "Discovery" category
and independent cache TTLs (30min for feed, 1h for chart/releases/playlists).
Closes #34
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* fix(yandex_music): enable seek in Preload streaming mode for FLAC
- Preload mode now downloads and decrypts to temp file before playback
- Returns StreamType.LOCAL_FILE with can_seek=True for proper navigation
- Files exceeding size limit (config) fall back to Buffered mode
- Rename config 'Preload memory limit' to 'Preload max file size' with updated description
- Add temp file cleanup on stream completion and provider unload
Fixes: seek not working when using Superb quality + Preload mode
Fixes: progress bar starting before audio is ready
Co-authored-by: factory-droid[bot] <138933559+factory-droid[bot]@users.noreply.github.com>
* feat(yandex_music): add Picks & Mixes browse and discovery sections
Browse:
- picks/ folder with mood, activity, era, genres subfolders
- mixes/ folder with seasonal collections (winter, summer, autumn, newyear)
- Each tag returns curated playlists from Yandex Music tags API
Discovery (home page):
- Top Picks: curated playlists (tag: top)
- Mood Mix: rotating mood playlists (chill, sad, romantic, party, relax)
- Activity Mix: rotating activity playlists (workout, focus, morning, evening, driving)
- Seasonal Mix: playlists based on current season
Configuration (new picks_mixes category):
- Enable Picks in Browse
- Enable Mixes in Browse
- Enable Top Picks on Home
- Enable Mood/Activity/Seasonal Mix on Home
Localization:
- All folder and tag names in Russian and English
Co-authored-by: factory-droid[bot] <138933559+factory-droid[bot]@users.noreply.github.com>
* feat(yandex_music): add lyrics support
- Add ProviderFeature.LYRICS to supported features
- Add get_track_lyrics() method to api_client.py
- Extend parse_track() to accept lyrics and lyrics_synced parameters
- Fetch lyrics in get_track() and attach to track metadata
- Support both synced LRC format and plain text lyrics
- Handle geo-restrictions and unavailable lyrics gracefully
Co-authored-by: factory-droid[bot] <138933559+factory-droid[bot]@users.noreply.github.com>
* fix(yandex_music): fix locale-dependent playlist names caching issue
- Split get_playlist() to separate virtual playlists (not cached) from real ones
- Virtual playlists (My Wave, Liked Tracks) now always use current locale
- Real playlists continue to be cached for 30 days
- Add debug logging for locale detection
Co-authored-by: factory-droid[bot] <138933559+factory-droid[bot]@users.noreply.github.com>
* fix(yandex_music): fix Browse Picks path construction
- Fix base path calculation in _browse_picks() and _browse_mixes()
- Previously: base was incorrectly trimming to parent path
- Now: base correctly appends to current path
- Add debug logging for troubleshooting
Co-authored-by: factory-droid[bot] <138933559+factory-droid[bot]@users.noreply.github.com>
* fix(yandex_music): add missing get_track_lyrics mock to integration tests
The lyrics feature added in 4fc11fda introduced a get_track_lyrics call
in get_track, but the test fixtures were not updated with the mock,
causing ValueError on tuple unpacking.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Address review comments: simplify Yandex Music provider configuration (#41)
* Initial plan
* Fix manifest.json URL and move pycryptodome to provider requirements
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Drastically reduce config entries and fix logging format
- Keep only 8 essential config entries: token, clear auth, quality, streaming mode, preload buffer, My Wave max tracks, Liked Tracks max tracks, and base URL
- Remove 20+ config entries and hardcode sane defaults
- Remove all category attributes from config entries
- Remove conditional recommendation check in setup()
- Replace all f-string logging with %s formatting (17 instances)
- Update constants.py to keep only necessary constants
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Address review feedback: fix parameter naming, temp file leak, and double close (#42)
* Initial plan
* Fix critical review issues: rename key_base64, fix double close, add temp file cleanup
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Fix linter issues: use cleanup_temp_file method and add noqa comment
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* [WIP] Fix file descriptor leak in Yandex Music provider (#44)
* Initial plan
* Fix streaming issues: FD leak, buffered fallback, size limits, task cleanup, stale files
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Fix provider issues: change INFO to DEBUG logs, fix track_map by ID
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Rename misnamed test to reflect QUALITY_BALANCED behavior
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Fix linting issues: move import to top, use Path.stat(), fix line lengths
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Fix Yandex Music AAC codec mapping and timezone awareness (#45)
* Initial plan
* Fix Yandex Music provider: AAC codec mapping, caching issues, and datetime timezone
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Restore @use_cache decorators for mood and activity mix recommendations to reduce API load
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Add test coverage for Yandex Music recommendation methods (#46)
* Initial plan
* Add comprehensive test coverage for Yandex Music recommendations
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Fix linting issues in test_recommendations.py
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Fix mypy type annotations in test_recommendations.py
Add return type annotations to async helper functions to resolve mypy errors:
- return_folder() -> RecommendationFolder (lines 783, 810)
- return_none() -> None (lines 813, 837)
Fixes CI mypy check failure.
* Initial plan
* Restore full test_recommendations.py file and add mypy return type annotations
Co-authored-by: trudenboy0 <261913410+trudenboy0@users.noreply.github.com>
* Increase Discovery initial tracks from 5 to 20
Show more tracks in the Discovery section on Home for better content visibility.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix Yandex Music browse: "Invalid subpath" error and empty tag folders
When pressing Play on a Picks tag folder (e.g. "chill"), the BrowseFolder
URI reconstruction loses the full path (picks/mood/chill → just chill),
causing browse() to fall through to the base class which raises
"Invalid subpath". Add tag recognition fallback before the base class call.
Empty tag folders were caused by hardcoded tag slugs not matching actual
Yandex API slugs. Add dynamic tag discovery via the landing("mixes") API
which returns real MixLink entities with valid tag URLs. Results are cached
for 1 hour with hardcoded tags as fallback.
Fixes #34
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Address PR review feedback: dead code, infinite loop, lyrics optimization
- Remove unreachable _stream_preload() from get_audio_stream() (preload
handled in get_stream_details via LOCAL_FILE)
- Add aac-mp4/he-aac-mp4 codec variants to Efficient and Balanced
quality selection for consistency with _get_content_type()
- Switch _get_content_length() to shared http_session; add comments
explaining why streaming methods use dedicated sessions
- Fix My Wave infinite re-fetch when all batch tracks are duplicates
by breaking when first_track_id_this_batch is None
- Simplify _get_discovered_tags() return type from dict to list
- Add get_track_lyrics_from_track() to eliminate duplicate API call
when track object is already available
- Update test mocks for new lyrics method
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix Picks & Mixes: use dynamic tag discovery, filter empty/useless tags
Replace hardcoded tag lists in browse with dynamically discovered tags
from Landing API. Only tags actually returned by the API are shown,
eliminating empty folders. Blacklist non-musical tags like
"albomy-s-videoshotami". Set is_playable=False on tag/category folders
to prevent loading thousands of tracks on Play. Reduce tag playlist
cache TTL from 1h to 10min for faster recovery.
Fixes #34
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Validate tags via API before showing in browse and recommendations
Replace blacklist-based filtering with runtime tag validation via
client.tags() to ensure only tags with actual playlists are displayed.
Adds new tags (top, newbies, in the mood, background) and filters out
editorial /post/ URLs from landing discovery.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix exception handling, docstring, and log noise in Yandex Music provider (#52)
* Initial plan
* Fix 3 Copilot review issues in Yandex Music provider
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Fix race conditions, dead code, caching bugs, and performance issues in Yandex Music provider
Critical fixes:
- Add asyncio.Lock to protect My Wave shared mutable state from concurrent access
- Fix cache key collision in get_track by normalizing composite IDs before caching
- Fix random.choice + @use_cache interaction: tag selection now happens outside cached methods
- Lower "No liked tracks found" from warning to debug (normal for new accounts)
Dead code cleanup:
- Remove unused _decrypt_track_url from api_client.py (and unused aiohttp/AES imports)
- Remove dead _get_content_type stub from parsers.py, inline ContentType.UNKNOWN
Performance:
- Parallelize tag validation with asyncio.gather + Semaphore(8) instead of sequential
- Reuse shared aiohttp session for streaming instead of creating one per request
- Reduce search cache from 14 days to 3 hours
- Remove excessive per-batch/per-ID debug logging in liked tracks and library playlists
Code quality:
- Extract _parse_my_wave_track helper to eliminate 3x duplicated parsing logic
- Fix LRC detection: use regex r'\[\d{2}:\d{2}' instead of startswith('[')
- Replace hardcoded spring→autumn fallback with dynamic _validate_tag check
- Fix _validate_tag docstring (client.tags → client.get_tag_playlists)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
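The improved LRC detection can be sketched as follows (the exact pattern in the provider may differ slightly):

```python
import re

# Match an LRC timestamp like "[01:23.45]" at the start of the lyrics,
# rather than naively checking startswith("[") which matches "[Chorus]".
LRC_TIMESTAMP = re.compile(r"\[\d{2}:\d{2}")

def is_synced_lyrics(text: str) -> bool:
    return bool(LRC_TIMESTAMP.match(text.lstrip()))

print(is_synced_lyrics("[00:12.50]First line"))  # -> True
print(is_synced_lyrics("[Chorus]\nLa la la"))    # -> False
```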
* Address PR review: retry for lossless API, restore search cache, add quality tests
- Wrap get_track_file_info_lossless in _call_with_retry for automatic
reconnection on transient API failures (extracted _build_signed_params
and _do_request helpers)
- Revert search cache from 3h back to 14 days (search results are stable)
- Add 8 unit tests for Efficient and High quality selection branches
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix Yandex Music provider review issues: regex, HMAC signing, temp file cleanup (#54)
* Initial plan
* Fix LRC regex, HMAC sign construction, and temp file cleanup order
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Fix linting issues in tests
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Fix spelling in test comments
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: trudenboy <139659391+trudenboy@users.noreply.github.com>
* Address PR review comments: fix dict equality check and remove duplicate import
- Use == instead of is for dict comparison (BROWSE_NAMES_RU)
- Remove duplicate AsyncGenerator import from TYPE_CHECKING block
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix MinimalProvider stub missing client/mass attributes in test
The test_temp_file_replacement_order test was failing because
MinimalProvider lacked client and mass attributes expected by
YandexMusicStreamingManager.__init__.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix My Wave over-fetching and locale-aware tag caching
Address Copilot review comments on PR #3147:
- _browse_my_wave: cap fetch loop to BROWSE_INITIAL_TRACKS on initial browse
instead of post-loop slicing, preventing tracks from being marked as "seen"
but never shown to the user
- _get_discovered_tags: add locale parameter to cache key so tag titles are
re-fetched when locale changes
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix flac-mp4 container handling: separate MP4 container from FLAC codec
Yandex API returns codec "flac-mp4" meaning FLAC audio inside an MP4
container. Previously _get_content_type mapped this to ContentType.FLAC,
causing ffmpeg to misidentify the container format. Now correctly returns
(ContentType.MP4, ContentType.FLAC) as container/codec pair, matching
how Apple Music handles MP4+AAC. Also fixes temp file extension for
preload mode (.mp4 instead of .flac).
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
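The container/codec split can be illustrated with plain strings standing in for Music Assistant's ContentType enum (entries beyond flac-mp4 are assumptions):

```python
def get_content_type(codec: str) -> tuple[str, str]:
    """Return a (container, codec) pair: "flac-mp4" means FLAC audio inside
    an MP4 container, so ffmpeg needs (MP4, FLAC), not (FLAC, FLAC)."""
    mapping = {
        "flac-mp4": ("MP4", "FLAC"),
        "aac-mp4": ("MP4", "AAC"),
        "flac": ("FLAC", "FLAC"),
        "mp3": ("MP3", "MP3"),
    }
    return mapping.get(codec, ("UNKNOWN", "UNKNOWN"))

print(get_content_type("flac-mp4"))  # -> ('MP4', 'FLAC')
```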
* Improve flac-mp4 downstream: update logs, temp prefix, and comments
Follow-up to b6b11a22 (flac-mp4 container fix). Replaces hardcoded
"FLAC" in log messages with the actual codec string so logs correctly
reflect flac-mp4 vs flac. Renames TEMP_FILE_PREFIX to "yandex_audio_"
since temp files may now be .mp4. Adds clarifying comment about
flac-mp4 in _select_best_quality.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Set correct sample_rate/bit_depth for Yandex Music AudioFormat
The Yandex API does not return sample rate or bit depth, so AudioFormat
used model defaults (44100/16) even for flac-mp4 streams that are
actually 48kHz/24bit. This caused unnecessary downsampling in
_select_pcm_format. Add codec-based defaults via _get_audio_params and
_build_audio_format helpers to set the correct values.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
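A sketch of the codec-based defaults idea; only the flac-mp4 48kHz/24bit value comes from the commit above, the rest are assumed CD-quality fallbacks:

```python
# Codec string -> (sample_rate, bit_depth); the API returns neither value.
CODEC_AUDIO_PARAMS: dict[str, tuple[int, int]] = {
    "flac-mp4": (48000, 24),  # hi-res FLAC in an MP4 container
    "flac": (44100, 16),
    "aac": (44100, 16),
    "mp3": (44100, 16),
}

def get_audio_params(codec: str) -> tuple[int, int]:
    """Fall back to CD-quality defaults for unknown codecs."""
    return CODEC_AUDIO_PARAMS.get(codec, (44100, 16))

print(get_audio_params("flac-mp4"))  # -> (48000, 24)
```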
* Add configurable stream buffer for buffered streaming mode
Increase default buffer from 2MB (32 chunks) to 8MB (128 chunks)
to reduce stuttering on slow/unstable connections (~45s of FLAC audio).
Add CONF_STREAM_BUFFER_MB config entry (1-32 MB, default 8) so users
can tune the buffer size for their network conditions.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix unresolved Copilot review comments in PR #3147
- Add implementation to test_temp_file_replacement_order() verifying
new path is stored before old file deletion to prevent leaks
- Remove misplaced MinimalProvider stub code from test_get_audio_params_none()
- Add "spring" to TAG_MIXES and TAG_SLUG_CATEGORY to align with TAG_SEASONAL_MAP
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* Fix stale comments flagged by Copilot review #3818153899
- Correct base64 padding note: SHA-256 (32 bytes) encodes with "=" not "=="
- Remove hard-coded line number reference from LRC regex test comment
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* Fix naming consistency for spring and liked_tracks in BROWSE_NAMES
Add missing "spring" and "liked_tracks" entries to both BROWSE_NAMES_RU
and BROWSE_NAMES_EN dicts, and use LIKED_TRACKS_PLAYLIST_ID constant
instead of hardcoded "tracks" key in get_playlist().
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix concurrency issues and harden config entries per Copilot review
- Add asyncio.Lock to _get_streaming_session to prevent session leak on races
- Add threading.Lock to _temp_files to prevent concurrent modification
- Extract _replace_temp_file helper for atomic temp file replacement
- Clarify AES CTR cipher safety in _prepare_cipher docstring
- Clarify HMAC timestamp regeneration in _build_signed_params docstring
- Add range constraints to My Wave and Liked Tracks max tracks config
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix temp_fd double-close bug and clarify seasonal fallback comment
- Move temp_fd = -1 before os.fdopen() to prevent double-close when an
exception occurs inside the with block (Copilot #2823816163)
- Clarify constants.py comment to reflect that fallback is dynamic and
applies to any unavailable seasonal tag (Copilot #2823816236)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* Fix FLAC seek in buffered/direct mode and improve My Wave reliability
- Enable ffmpeg seek (allow_seek=True) for encrypted CUSTOM streams so
that seeking fast-forwards through the decrypted pipe instead of
restarting from the beginning; can_seek stays False since the provider
can't reposition the AES-CTR cipher mid-stream
- Batch multiple Rotor API calls per playlist-page request to reduce
round-trips and ensure enough tracks are returned per pagination page
- Guard radioStarted feedback so the flag is only set on confirmed send
- Promote rotor feedback errors from debug to warning for visibility
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* Increase max stream buffer size to 64 MB for buffered FLAC streaming
Raises the upper bound of the stream buffer config from 32 MB to 64 MB
to accommodate longer tracks on slow/weak hardware without dropping the
connection mid-stream.
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* Address Copilot review comments (round 3)
- Clarify stream buffer description to mention Preload fallback scenario
- Improve HMAC sign comment: state concatenation order and reference
yandex-music-downloader-realflac as the source of the comma-strip rule
- Add inline comments to LRC regex tests documenting \d{2,3} lower/upper bounds
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* Address Copilot review comments: unquote tag slugs, skip known folders, show buffer for preload
- URL-decode tag slugs from get_landing_tags() via urllib.parse.unquote
- Skip _get_discovered_tag_slugs() API call for known top-level browse folders
- Show CONF_STREAM_BUFFER_MB config for both Buffered and Preload modes
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* Fix issue #57 (owner name locale) and issue #56 (stream retry + buffered seek)
- Fix #57: normalize all known system owner name variants to locale-aware
canonical form ("Yandex Music" / "Яндекс Музыка") so playlist filtering
works regardless of API locale
- Fix #56 retry: resume encrypted streams on ClientPayloadError via HTTP
Range + AES-CTR counter resume in both direct and buffered modes (3 retries)
- Fix #56 seek: enable can_seek=True for flac-mp4 tracks; new mp4_seek.py
parses moov/stts/stsc/stco to find byte offset; _stream_buffered_seek
serves patched moov prefix + Range-resumed mdat from seek position
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
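The owner-name normalization from the #57 fix, as a hedged sketch (the variant list is illustrative):

```python
# Known system owner-name variants, lowercased for comparison (illustrative).
SYSTEM_OWNER_VARIANTS = {
    "yandex music", "yandex.music", "яндекс музыка", "яндекс.музыка",
}

def normalize_owner(name: str, locale: str) -> str:
    """Map any system owner variant to the locale-aware canonical form."""
    if name.strip().lower() in SYSTEM_OWNER_VARIANTS:
        return "Яндекс Музыка" if locale.startswith("ru") else "Yandex Music"
    return name  # regular user names pass through unchanged

print(normalize_owner("яндекс.музыка", "en-US"))  # -> Yandex Music
```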
* Add .worktrees/ to .gitignore
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* Add backward-compat shim for advanced= vs category= in Yandex Music config
Replace 6 hardcoded advanced=True with **_ADVANCED that detects at runtime
whether models >=1.1.87 (advanced= field) or 1.1.86 (category= field) is
installed. This eliminates manual patching when backporting to stable.
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* refactor(yandex_music): remove preload streaming mode
Remove the Preload streaming mode which downloaded and cached entire
encrypted tracks to temp files on disk. This approach is incompatible
with a streaming service model and was flagged in upstream code review.
Direct and Buffered modes are sufficient: Direct decrypts on-the-fly
chunk-by-chunk, Buffered decouples download from consumption via a
bounded async queue.
Also removes the backward-compat _ADVANCED shim (replaced with
advanced=True directly) since upstream models >=1.1.87 supports it.
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* refactor(yandex_music): remove direct streaming mode, buffered only
Simplify FLAC streaming to a single buffered mode. The direct
on-the-fly mode adds complexity without meaningful benefit over
the buffered async-queue approach.
Removes CONF_STREAMING_MODE config entry entirely. The stream
buffer size (MB) remains as an Advanced setting.
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* fix(yandex_music): fix FLAC seek by delegating to ffmpeg (-ss)
The _stream_buffered_seek approach was broken: it patched chunk offsets
in the moov atom but ffmpeg still tried to decode from sample 0 (time=0),
whose adjusted offsets pointed into the moov itself, producing empty output.
Fix: set can_seek=False, allow_seek=True for encrypted FLAC streams.
The provider always streams from position 0; ffmpeg uses -ss to seek.
Seeking now works correctly at the cost of streaming from the beginning,
which is acceptable and consistent with non-encrypted HTTP streams.
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* fix(yandex_music): include liked editorial playlists in library sync
Add get_liked_playlists() to api_client.py calling users_likes_playlists()
endpoint. Update get_library_playlists() to yield liked editorial playlists
alongside user-created ones, with deduplication via seen_ids. Fixes provider
filter showing only virtual playlists (closes #57).
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* refactor(yandex_music): remove buffering/seeking infrastructure from streaming
Per maintainer feedback on PR #3147, remove all custom downloading,
buffering and seeking from the provider. Replace the ~430-line buffered
streaming implementation with a minimal on-the-fly AES-256-CTR decrypt
wrapper (~15 lines) that uses mass.http_session.
Changes:
- Remove _stream_buffered, _stream_buffered_seek, _fetch_and_decrypt_partial,
_patch_stco, _prepare_cipher, _get_streaming_session, close()
- Delete mp4_seek.py (MP4 moov/stco parser, ~250 lines)
- Remove CONF_STREAM_BUFFER_MB config entry
- Use mass.http_session instead of custom aiohttp.ClientSession
- Seeking delegated to ffmpeg via can_seek=False / allow_seek=True
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
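The commit message doesn't include the wrapper itself, but an on-the-fly AES-256-CTR decrypt of a chunked HTTP stream is roughly this shape (using the `cryptography` package; the chunk source and key/nonce handling are assumptions, not the provider's code):

```python
from collections.abc import AsyncIterator

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes


async def decrypt_stream(
    chunks: AsyncIterator[bytes], key: bytes, nonce: bytes
) -> AsyncIterator[bytes]:
    """Decrypt chunks as they arrive from the HTTP session.

    CTR is a stream mode, so no padding or whole-file buffering is
    needed and chunk boundaries can fall anywhere in the ciphertext.
    """
    decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    async for chunk in chunks:
        yield decryptor.update(chunk)
```

Because the cipher keeps its own counter state across `update()` calls, the wrapper stays a thin pass-through over `mass.http_session` chunks, which is what makes the ~15-line replacement possible.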
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.0 (#81)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.1 (#82)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.2 (#83)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.3 (#84)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.4 (#85)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.5 (#86)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.6 (#87)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.7 (#88)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.8 (#89)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.9 (#91)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.2.10 (#94)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* fix(yandex_music): address PR review comments
- provider.py: fix docstring for _get_valid_tags_for_category — tags are
validated via _validate_tag() → client.get_tag_playlists(), not client.tags()
- .gitignore: remove .worktrees/ (not applicable to upstream repo)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.3.0 (#95)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.3.1 (#96)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.3.2 (#97)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.3.3 (#98)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.4.1 (#99)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.4.2 (#100)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.5.1 (#101)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.5.2 (#102)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.5.3 (#103)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.5.4 (#104)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* feat(yandex_music): sync provider from ma-provider-yandex-music v2.5.5 (#105)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
---------
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: factory-droid[bot] <138933559+factory-droid[bot]@users.noreply.github.com>
Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
Co-authored-by: trudenboy0 <trudenboy0@gmail.com>
Co-authored-by: trudenboy0 <261913410+trudenboy0@users.noreply.github.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
from .constants import (
CONF_ACTION_CLEAR_AUTH,
+ CONF_BASE_URL,
+ CONF_LIKED_TRACKS_MAX_TRACKS,
+ CONF_MY_WAVE_MAX_TRACKS,
CONF_QUALITY,
CONF_TOKEN,
+ DEFAULT_BASE_URL,
+ QUALITY_BALANCED,
+ QUALITY_EFFICIENT,
QUALITY_HIGH,
- QUALITY_LOSSLESS,
+ QUALITY_SUPERB,
)
from .provider import YandexMusicProvider
from music_assistant.mass import MusicAssistant
from music_assistant.models import ProviderInstanceType
-
SUPPORTED_FEATURES = {
ProviderFeature.LIBRARY_ARTISTS,
ProviderFeature.LIBRARY_ALBUMS,
ProviderFeature.BROWSE,
ProviderFeature.SIMILAR_TRACKS,
ProviderFeature.RECOMMENDATIONS,
+ ProviderFeature.LYRICS,
}
is_authenticated = bool(values.get(CONF_TOKEN))
return (
+ # Authentication
ConfigEntry(
key=CONF_TOKEN,
type=ConfigEntryType.SECURE_STRING,
action=CONF_ACTION_CLEAR_AUTH,
hidden=not is_authenticated,
),
+ # Quality
ConfigEntry(
key=CONF_QUALITY,
type=ConfigEntryType.STRING,
label="Audio quality",
description="Select preferred audio quality.",
options=[
- ConfigValueOption("High (320 kbps)", QUALITY_HIGH),
- ConfigValueOption("Lossless (FLAC)", QUALITY_LOSSLESS),
+ ConfigValueOption("Efficient (AAC ~64kbps)", QUALITY_EFFICIENT),
+ ConfigValueOption("Balanced (AAC ~192kbps)", QUALITY_BALANCED),
+ ConfigValueOption("High (MP3 ~320kbps)", QUALITY_HIGH),
+ ConfigValueOption("Superb (FLAC Lossless)", QUALITY_SUPERB),
],
- default_value=QUALITY_HIGH,
+ default_value=QUALITY_BALANCED,
+ ),
+ # My Wave maximum tracks (advanced)
+ ConfigEntry(
+ key=CONF_MY_WAVE_MAX_TRACKS,
+ type=ConfigEntryType.INTEGER,
+ label="My Wave maximum tracks",
+ description="Maximum number of tracks to fetch for My Wave playlist. "
+ "Lower values load faster but provide fewer tracks. Default: 150.",
+ range=(10, 1000),
+ default_value=150,
+ required=False,
+ advanced=True,
+ ),
+ # Liked Tracks maximum tracks (advanced)
+ ConfigEntry(
+ key=CONF_LIKED_TRACKS_MAX_TRACKS,
+ type=ConfigEntryType.INTEGER,
+ label="Liked Tracks maximum tracks",
+ description="Maximum number of tracks to show in Liked Tracks virtual playlist. "
+ "Higher values may significantly increase load time. "
+ "Lower values load faster. Default: 500.",
+ range=(50, 2000),
+ default_value=500,
+ required=False,
+ advanced=True,
+ ),
+ # API Base URL (advanced)
+ ConfigEntry(
+ key=CONF_BASE_URL,
+ type=ConfigEntryType.STRING,
+ label="API Base URL",
+ description="API endpoint base URL. "
+ "Only change if Yandex Music changes their API endpoint. "
+ "Default: https://api.music.yandex.net",
+ default_value=DEFAULT_BASE_URL,
+ required=False,
+ advanced=True,
),
)
from __future__ import annotations
+import asyncio
+import base64
+import hashlib
+import hmac
import logging
+import re
+import time
+from collections.abc import Awaitable, Callable
from datetime import UTC, datetime
-from typing import TYPE_CHECKING, Any, cast
+from typing import TYPE_CHECKING, Any, TypeVar, cast
from music_assistant_models.errors import (
LoginFailed,
)
from yandex_music import Album as YandexAlbum
from yandex_music import Artist as YandexArtist
-from yandex_music import ClientAsync, Search, TrackShort
+from yandex_music import ClientAsync, MixLink, Search, TrackShort
from yandex_music import Playlist as YandexPlaylist
from yandex_music import Track as YandexTrack
from yandex_music.exceptions import BadRequestError, NetworkError, UnauthorizedError
-from yandex_music.utils.sign_request import get_sign_request
+from yandex_music.utils.sign_request import DEFAULT_SIGN_KEY
if TYPE_CHECKING:
from yandex_music import DownloadInfo
+ from yandex_music.feed.feed import Feed
+ from yandex_music.landing.chart_info import ChartInfo
+ from yandex_music.landing.landing import Landing
+ from yandex_music.landing.landing_list import LandingList
+ from yandex_music.rotor.dashboard import Dashboard
+ from yandex_music.rotor.station_result import StationResult
from .constants import DEFAULT_LIMIT, ROTOR_STATION_MY_WAVE
# get-file-info with quality=lossless returns FLAC; default /tracks/.../download-info often does not
# Prefer flac-mp4/aac-mp4 (Yandex API moved to these formats around 2025)
GET_FILE_INFO_CODECS = "flac-mp4,flac,aac-mp4,aac,he-aac,mp3,he-aac-mp4"
-# get-file-info: same host as library (all requests go through one API)
-GET_FILE_INFO_BASE_URL = "https://api.music.yandex.net"
LOGGER = logging.getLogger(__name__)
+_T = TypeVar("_T")
+
class YandexMusicClient:
"""Wrapper around yandex-music-api ClientAsync."""
- def __init__(self, token: str) -> None:
+ def __init__(self, token: str, base_url: str | None = None) -> None:
"""Initialize the Yandex Music client.
:param token: Yandex Music OAuth token.
+ :param base_url: Optional API base URL (defaults to Yandex Music API).
"""
self._token = token
+ self._base_url = base_url
self._client: ClientAsync | None = None
self._user_id: int | None = None
+ self._last_reconnect_at: float = -30.0 # allow first reconnect immediately
+ self._reconnect_lock = asyncio.Lock()
@property
def user_id(self) -> int:
:raises LoginFailed: If the token is invalid.
"""
try:
- self._client = await ClientAsync(self._token).init()
+ self._client = await ClientAsync(self._token, base_url=self._base_url).init()
if self._client.me is None or self._client.me.account is None:
raise LoginFailed("Failed to get account info")
self._user_id = self._client.me.account.uid
self._client = None
self._user_id = None
- def _ensure_connected(self) -> ClientAsync:
- """Ensure the client is connected and return it."""
- if self._client is None:
- raise ProviderUnavailableError("Client not connected, call connect() first")
- return self._client
+ async def _ensure_connected(self) -> ClientAsync:
+ """Ensure the client is connected, attempting reconnect if needed."""
+ if self._client is not None:
+ return self._client
+ async with self._reconnect_lock:
+ # Re-check after acquiring lock — another task may have connected already
+ if self._client is not None:
+ return self._client # type: ignore[unreachable]
+ LOGGER.info("Client disconnected, attempting to reconnect...")
+ try:
+ await self.connect()
+ except LoginFailed:
+ raise
+ except Exception as err:
+ raise ProviderUnavailableError("Client not connected and reconnect failed") from err
+ return cast("ClientAsync", self._client)
def _is_connection_error(self, err: Exception) -> bool:
"""Return True if the exception indicates a connection or server drop."""
return "disconnect" in msg or "connection" in msg or "timeout" in msg
async def _reconnect(self) -> None:
- """Disconnect and connect again to recover from Server disconnected / connection errors."""
- await self.disconnect()
- await self.connect()
+ """Disconnect and connect again to recover from Server disconnected / connection errors.
+
+ Enforces a 30-second cooldown between reconnect attempts to avoid hammering Yandex
+ and triggering rate limiting. A lock ensures concurrent callers don't bypass the cooldown.
+ """
+ async with self._reconnect_lock:
+ now = time.monotonic()
+ if now - self._last_reconnect_at < 30.0:
+ raise ProviderUnavailableError("Reconnect cooldown active, skipping")
+ self._last_reconnect_at = now
+ await self.disconnect()
+ await self.connect()
+
+ async def _call_with_retry(self, func: Callable[[ClientAsync], Awaitable[_T]]) -> _T:
+ """Execute an async API call with one reconnect attempt on connection error.
+
+ :param func: Async callable that takes a ClientAsync and returns a result.
+ :return: The result of the API call.
+ """
+ client = await self._ensure_connected()
+ try:
+ return await func(client)
+ except Exception as err:
+ if not self._is_connection_error(err):
+ raise
+ LOGGER.warning("Connection error, reconnecting and retrying: %s", err)
+ try:
+ await self._reconnect()
+ except Exception as recon_err:
+ raise ProviderUnavailableError("Reconnect failed") from recon_err
+ client = cast("ClientAsync", self._client)
+ return await func(client)
+
+ async def _call_no_retry(self, func: Callable[[ClientAsync], Awaitable[_T]]) -> _T:
+ """Execute an async API call without reconnect retry on call failure.
+
+ Used for fire-and-forget calls (e.g. rotor feedback) where a failed request
+ should be silently dropped rather than triggering a reconnect cycle that
+ could cause rate limiting. Note: _ensure_connected() is still called to
+ establish the initial connection if needed; only the reconnect-on-error
+ path is skipped.
+
+ :param func: Async callable that takes a ClientAsync and returns a result.
+ :return: The result of the API call.
+ """
+ client = await self._ensure_connected()
+ return await func(client)
# Rotor (radio station) methods
:param queue: Optional track ID for pagination (first track of previous batch).
:return: Tuple of (list of track objects, batch_id for feedback or None).
"""
- for attempt in range(2):
- client = self._ensure_connected()
- try:
- result = await client.rotor_station_tracks(station_id, settings2=True, queue=queue)
- if not result or not result.sequence:
- return ([], result.batch_id if result else None)
- track_ids = []
- for seq in result.sequence:
- if seq.track is None:
- continue
- tid = getattr(seq.track, "id", None) or getattr(seq.track, "track_id", None)
- if tid is not None:
- track_ids.append(str(tid))
- if not track_ids:
- return ([], result.batch_id if result else None)
- full_tracks = await self.get_tracks(track_ids)
- order_map = {str(t.id): t for t in full_tracks if hasattr(t, "id") and t.id}
- ordered = [order_map[tid] for tid in track_ids if tid in order_map]
- return (ordered, result.batch_id if result else None)
- except BadRequestError as err:
- LOGGER.warning("Error fetching rotor station %s tracks: %s", station_id, err)
- return ([], None)
- except (NetworkError, Exception) as err:
- if attempt == 0 and self._is_connection_error(err):
- LOGGER.warning(
- "Connection error fetching rotor tracks, reconnecting: %s",
- err,
- )
- try:
- await self._reconnect()
- except Exception as recon_err:
- LOGGER.warning("Reconnect failed: %s", recon_err)
- return ([], None)
- else:
- LOGGER.warning("Error fetching rotor station tracks: %s", err)
- return ([], None)
- return ([], None)
+ try:
+ result = await self._call_with_retry(
+ lambda c: c.rotor_station_tracks(station_id, settings2=True, queue=queue)
+ )
+ except BadRequestError as err:
+ LOGGER.warning("Error fetching rotor station %s tracks: %s", station_id, err)
+ return ([], None)
+ except (NetworkError, ProviderUnavailableError) as err:
+ LOGGER.warning("Error fetching rotor station tracks: %s", err)
+ return ([], None)
+
+ if not result or not result.sequence:
+ return ([], result.batch_id if result else None)
+ track_ids = []
+ for seq in result.sequence:
+ if seq.track is None:
+ continue
+ tid = getattr(seq.track, "id", None) or getattr(seq.track, "track_id", None)
+ if tid is not None:
+ track_ids.append(str(tid))
+ if not track_ids:
+ return ([], result.batch_id if result else None)
+ try:
+ full_tracks = await self.get_tracks(track_ids)
+ except ResourceTemporarilyUnavailable as err:
+ LOGGER.warning("Error fetching rotor station track details: %s", err)
+ return ([], result.batch_id if result else None)
+ order_map = {str(t.id): t for t in full_tracks if hasattr(t, "id") and t.id}
+ ordered = [order_map[tid] for tid in track_ids if tid in order_map]
+ return (ordered, result.batch_id if result else None)
async def get_my_wave_tracks(
self, queue: str | int | None = None
) -> tuple[list[YandexTrack], str | None]:
- """Get tracks from the My Wave (Моя волна) radio station.
+ """Get tracks from the My Wave radio station.
:param queue: Optional track ID of the last track from the previous batch (API uses it for
pagination; do not pass batch_id).
:param total_played_seconds: Seconds played (for trackFinished, skip).
:return: True if the request succeeded.
"""
- client = self._ensure_connected()
payload: dict[str, Any] = {
"type": feedback_type,
"timestamp": datetime.now(UTC).isoformat().replace("+00:00", "Z"),
if batch_id is not None:
payload["batchId"] = batch_id
- url = f"{client.base_url}/rotor/station/{station_id}/feedback"
- for attempt in range(2):
- client = self._ensure_connected()
- try:
- await client._request.post(url, payload)
- return True
- except BadRequestError as err:
- LOGGER.debug("Rotor feedback %s failed: %s", feedback_type, err)
- return False
- except (NetworkError, Exception) as err:
- if attempt == 0 and self._is_connection_error(err):
- LOGGER.warning(
- "Connection error on rotor feedback %s, reconnecting: %s",
- feedback_type,
- err,
- )
- try:
- await self._reconnect()
- except Exception as recon_err:
- LOGGER.debug("Reconnect failed: %s", recon_err)
- return False
- else:
- LOGGER.debug("Rotor feedback %s failed: %s", feedback_type, err)
- return False
- return False
+ async def _post(c: ClientAsync) -> bool:
+ url = f"{c.base_url}/rotor/station/{station_id}/feedback"
+ await c._request.post(url, payload)
+ return True
+
+ try:
+ result = await self._call_no_retry(_post)
+ LOGGER.debug(
+ "Rotor feedback %s track_id=%s total_played_seconds=%s",
+ feedback_type,
+ track_id,
+ total_played_seconds,
+ )
+ return result
+        except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+            LOGGER.warning("Rotor feedback %s failed: %s", feedback_type, err)
+            return False
# Library methods
async def get_liked_tracks(self) -> list[TrackShort]:
- """Get user's liked tracks.
+ """Get user's liked tracks sorted by timestamp (most recent first).
- :return: List of liked track objects.
+ :return: List of liked track objects sorted in reverse chronological order.
"""
- client = self._ensure_connected()
try:
- result = await client.users_likes_tracks()
+ result = await self._call_with_retry(lambda c: c.users_likes_tracks())
if result is None:
return []
- return result.tracks or []
- except (BadRequestError, NetworkError) as err:
+ tracks = result.tracks or []
+ # Sort by timestamp in descending order (most recently liked first)
+            # TrackShort.timestamp is an ISO-8601 string (when the track was liked), so a
+            # lexicographic sort is chronological; default to "" so the sort key stays a
+            # consistent type for entries without a timestamp
+            return sorted(
+                tracks,
+                key=lambda t: getattr(t, "timestamp", "") or "",
+ reverse=True,
+ )
+        except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error fetching liked tracks: %s", err)
raise ResourceTemporarilyUnavailable("Failed to fetch liked tracks") from err
- async def get_liked_albums(self) -> list[YandexAlbum]:
+ async def get_liked_albums(self, batch_size: int = 50) -> list[YandexAlbum]:
"""Get user's liked albums with full details (including cover art).
The users_likes_albums endpoint returns minimal album data without
:return: List of liked album objects with full details.
"""
- client = self._ensure_connected()
try:
- result = await client.users_likes_albums()
- if result is None:
- return []
- album_ids = [
- str(like.album.id) for like in result if like.album is not None and like.album.id
- ]
- if not album_ids:
- return []
- # Fetch full album details in batches to get cover_uri and other metadata
- batch_size = 50
- full_albums: list[YandexAlbum] = []
- for i in range(0, len(album_ids), batch_size):
- batch = album_ids[i : i + batch_size]
- try:
- batch_result = await client.albums(batch)
- if batch_result:
- full_albums.extend(batch_result)
- except (BadRequestError, NetworkError) as batch_err:
- LOGGER.warning("Error fetching album details batch: %s", batch_err)
- # Fall back to minimal data for this batch
- batch_set = set(batch)
- for like in result:
- if (
- like.album is not None
- and like.album.id
- and str(like.album.id) in batch_set
- ):
- full_albums.append(like.album)
- return full_albums
- except (BadRequestError, NetworkError) as err:
+ result = await self._call_with_retry(lambda c: c.users_likes_albums())
+        except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
            LOGGER.error("Error fetching liked albums: %s", err)
            raise ResourceTemporarilyUnavailable("Failed to fetch liked albums") from err
+
+ if result is None:
+ return []
+ album_ids = [
+ str(like.album.id) for like in result if like.album is not None and like.album.id
+ ]
+ if not album_ids:
+ return []
+ # Fetch full album details in batches to get cover_uri and other metadata
+ full_albums: list[YandexAlbum] = []
+ for i in range(0, len(album_ids), batch_size):
+ batch = album_ids[i : i + batch_size]
+ try:
+ batch_result = await self._call_with_retry(
+ lambda c, _b=batch: c.albums(_b) # type: ignore[misc]
+ )
+ if batch_result:
+ full_albums.extend(batch_result)
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as batch_err:
+ LOGGER.warning("Error fetching album details batch: %s", batch_err)
+ # Fall back to minimal data for this batch
+ batch_set = set(batch)
+ for like in result:
+ if like.album is not None and like.album.id and str(like.album.id) in batch_set:
+ full_albums.append(like.album)
+ return full_albums
async def get_liked_artists(self) -> list[YandexArtist]:
"""Get user's liked artists.
:return: List of liked artist objects.
"""
- client = self._ensure_connected()
try:
- result = await client.users_likes_artists()
+ result = await self._call_with_retry(lambda c: c.users_likes_artists())
if result is None:
return []
return [like.artist for like in result if like.artist is not None]
- except (BadRequestError, NetworkError) as err:
+        except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error fetching liked artists: %s", err)
raise ResourceTemporarilyUnavailable("Failed to fetch liked artists") from err
:return: List of playlist objects.
"""
- client = self._ensure_connected()
try:
- result = await client.users_playlists_list()
+ result = await self._call_with_retry(lambda c: c.users_playlists_list())
if result is None:
return []
return list(result)
- except (BadRequestError, NetworkError) as err:
+        except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error fetching playlists: %s", err)
raise ResourceTemporarilyUnavailable("Failed to fetch playlists") from err
+ async def get_liked_playlists(self) -> list[YandexPlaylist]:
+ """Get user's liked/saved editorial playlists.
+
+ :return: List of liked playlist objects.
+ """
+ try:
+ result = await self._call_with_retry(lambda c: c.users_likes_playlists())
+ if result is None:
+ return []
+ playlists = []
+ for like in result:
+ if like.playlist is not None:
+ playlists.append(like.playlist)
+ return playlists
+        except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+            LOGGER.error("Error fetching liked playlists: %s", err)
+            raise ResourceTemporarilyUnavailable("Failed to fetch liked playlists") from err
+
# Search
async def search(
:param limit: Maximum number of results per type.
:return: Search results object.
"""
- client = self._ensure_connected()
try:
- return await client.search(query, type_=search_type, page=0, nocorrect=False)
- except (BadRequestError, NetworkError) as err:
+ return await self._call_with_retry(
+ lambda c: c.search(query, type_=search_type, page=0, nocorrect=False)
+ )
+        except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Search error: %s", err)
raise ResourceTemporarilyUnavailable("Search failed") from err
:param track_id: Track ID.
:return: Track object or None if not found.
"""
- client = self._ensure_connected()
try:
- tracks = await client.tracks([track_id])
+ tracks = await self._call_with_retry(lambda c: c.tracks([track_id]))
return tracks[0] if tracks else None
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error fetching track %s: %s", track_id, err)
return None
+ async def get_track_lyrics(self, track_id: str) -> tuple[str | None, bool]:
+ """Get lyrics for a track.
+
+ Fetches lyrics from Yandex Music API. Returns the lyrics text and whether
+ it's in synced LRC format (with timestamps) or plain text.
+
+ Note: This method fetches the track first to check lyrics_available. If you
+ already have the YandexTrack object, use get_track_lyrics_from_track() to
+ avoid a redundant API call.
+
+ :param track_id: Track ID.
+ :return: Tuple of (lyrics_text, is_synced). Returns (None, False) if unavailable.
+ """
+ try:
+ tracks = await self._call_with_retry(lambda c: c.tracks([track_id]))
+ if not tracks:
+ return None, False
+
+ return await self.get_track_lyrics_from_track(tracks[0])
+
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching lyrics for track %s: %s", track_id, err)
+ return None, False
+ except Exception as err:
+ # Catch any other errors (e.g., geo-restrictions, API changes)
+ LOGGER.debug("Unexpected error fetching lyrics for track %s: %s", track_id, err)
+ return None, False
+
+ async def get_track_lyrics_from_track(self, track: YandexTrack) -> tuple[str | None, bool]:
+ """Get lyrics for an already-fetched track.
+
+ Avoids the extra tracks([track_id]) API call when the YandexTrack object
+ is already available.
+
+ :param track: YandexTrack object (already fetched).
+ :return: Tuple of (lyrics_text, is_synced). Returns (None, False) if unavailable.
+ """
+ track_id = getattr(track, "id", None) or getattr(track, "track_id", "unknown")
+ try:
+ if not getattr(track, "lyrics_available", False):
+ LOGGER.debug("Lyrics not available for track %s", track_id)
+ return None, False
+
+ track_lyrics = await track.get_lyrics_async()
+ if not track_lyrics:
+ LOGGER.debug("Failed to get lyrics metadata for track %s", track_id)
+ return None, False
+
+ lyrics_text = await track_lyrics.fetch_lyrics_async()
+ if not lyrics_text:
+ return None, False
+
+ # Check if it's LRC format (synced lyrics have timestamps like [00:12.34])
+ # Use re.search without ^ so metadata lines like [ar:Artist] don't prevent detection
+ is_synced = bool(re.search(r"\[\d{2}:\d{2}(?:\.\d{2,3})?\]", lyrics_text))
+ return lyrics_text, is_synced
+
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching lyrics for track %s: %s", track_id, err)
+ return None, False
+ except Exception as err:
+ # Catch any other errors (e.g., geo-restrictions, API changes)
+ LOGGER.debug("Unexpected error fetching lyrics for track %s: %s", track_id, err)
+ return None, False
+
async def get_tracks(self, track_ids: list[str]) -> list[YandexTrack]:
"""Get multiple tracks by IDs.
:return: List of track objects.
:raises ResourceTemporarilyUnavailable: On network errors after retry.
"""
- client = self._ensure_connected()
try:
- result = await client.tracks(track_ids)
+ result = await self._call_with_retry(lambda c: c.tracks(track_ids))
return result or []
- except NetworkError as err:
- # Retry once on network errors (timeout, disconnect, etc.)
- LOGGER.warning("Network error fetching tracks, retrying once: %s", err)
- try:
- result = await client.tracks(track_ids)
- return result or []
- except NetworkError as retry_err:
- LOGGER.error("Error fetching tracks (retry failed): %s", retry_err)
- raise ResourceTemporarilyUnavailable("Failed to fetch tracks") from retry_err
except BadRequestError as err:
LOGGER.error("Error fetching tracks: %s", err)
return []
+ except (NetworkError, ProviderUnavailableError) as err:
+ LOGGER.error("Error fetching tracks (retry failed): %s", err)
+ raise ResourceTemporarilyUnavailable("Failed to fetch tracks") from err
async def get_album(self, album_id: str) -> YandexAlbum | None:
"""Get a single album by ID.
:param album_id: Album ID.
:return: Album object or None if not found.
"""
- client = self._ensure_connected()
try:
- albums = await client.albums([album_id])
+ albums = await self._call_with_retry(lambda c: c.albums([album_id]))
return albums[0] if albums else None
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error fetching album %s: %s", album_id, err)
return None
:param album_id: Album ID.
:return: Album object with tracks or None if not found.
"""
- client = self._ensure_connected()
+
+ async def _fetch(c: ClientAsync) -> YandexAlbum | None:
+ try:
+ return await c.albums_with_tracks(
+ album_id,
+ resumeStream=True,
+ richTracks=True,
+ withListeningFinished=True,
+ )
+ except TypeError:
+ # Older yandex-music may not accept these kwargs
+ return await c.albums_with_tracks(album_id)
+
try:
- return await client.albums_with_tracks(
- album_id,
- resumeStream=True,
- richTracks=True,
- withListeningFinished=True,
- )
- except TypeError:
- # Older yandex-music may not accept these kwargs
- return await client.albums_with_tracks(album_id)
- except (BadRequestError, NetworkError) as err:
+ return await self._call_with_retry(_fetch)
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error fetching album with tracks %s: %s", album_id, err)
return None
:param artist_id: Artist ID.
:return: Artist object or None if not found.
"""
- client = self._ensure_connected()
try:
- artists = await client.artists([artist_id])
+ artists = await self._call_with_retry(lambda c: c.artists([artist_id]))
return artists[0] if artists else None
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error fetching artist %s: %s", artist_id, err)
return None
:param limit: Maximum number of albums.
:return: List of album objects.
"""
- client = self._ensure_connected()
try:
- result = await client.artists_direct_albums(artist_id, page=0, page_size=limit)
+ result = await self._call_with_retry(
+ lambda c: c.artists_direct_albums(artist_id, page=0, page_size=limit)
+ )
if result is None:
return []
return result.albums or []
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error fetching artist albums %s: %s", artist_id, err)
return []
:param limit: Maximum number of tracks.
:return: List of track objects.
"""
- client = self._ensure_connected()
try:
- result = await client.artists_tracks(artist_id, page=0, page_size=limit)
+ result = await self._call_with_retry(
+ lambda c: c.artists_tracks(artist_id, page=0, page_size=limit)
+ )
if result is None:
return []
return result.tracks or []
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error fetching artist tracks %s: %s", artist_id, err)
return []
:return: Playlist object or None if not found.
:raises ResourceTemporarilyUnavailable: On network errors.
"""
- client = self._ensure_connected()
try:
- result = await client.users_playlists(kind=int(playlist_id), user_id=user_id)
+ result = await self._call_with_retry(
+ lambda c: c.users_playlists(kind=int(playlist_id), user_id=user_id)
+ )
if isinstance(result, list):
return result[0] if result else None
return result
- except NetworkError as err:
- LOGGER.warning("Network error fetching playlist %s/%s: %s", user_id, playlist_id, err)
- raise ResourceTemporarilyUnavailable("Failed to fetch playlist") from err
except BadRequestError as err:
LOGGER.error("Error fetching playlist %s/%s: %s", user_id, playlist_id, err)
return None
+ except (NetworkError, ProviderUnavailableError) as err:
+ LOGGER.warning("Network error fetching playlist %s/%s: %s", user_id, playlist_id, err)
+ raise ResourceTemporarilyUnavailable("Failed to fetch playlist") from err
# Streaming
:param get_direct_links: Whether to get direct download links.
:return: List of download info objects.
"""
- client = self._ensure_connected()
try:
- result = await client.tracks_download_info(track_id, get_direct_links=get_direct_links)
+ result = await self._call_with_retry(
+ lambda c: c.tracks_download_info(track_id, get_direct_links=get_direct_links)
+ )
return result or []
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error fetching download info for track %s: %s", track_id, err)
return []
The /tracks/{id}/download-info endpoint often returns only MP3; get-file-info
with quality=lossless and codecs=flac,... returns FLAC when available.
+ Uses manual sign calculation matching yandex-music-downloader-realflac.
+ Uses _call_with_retry for automatic reconnection on transient failures.
+
:param track_id: Track ID.
:return: Parsed downloadInfo dict (url, codec, urls, ...) or None on error.
"""
- client = self._ensure_connected()
- sign = get_sign_request(track_id)
- base_params = {
- "ts": sign.timestamp,
- "trackId": track_id,
- "quality": "lossless",
- "codecs": GET_FILE_INFO_CODECS,
- "sign": sign.value,
- }
+
+ def _build_signed_params(client: ClientAsync) -> tuple[str, dict[str, Any]]:
+ """Build URL and signed params using current client and timestamp.
+
+ Called on each attempt by _call_with_retry, so the HMAC signature
+ is recomputed with a fresh timestamp on every retry.
+ """
+ timestamp = int(time.time())
+ params = {
+ "ts": timestamp,
+ "trackId": track_id,
+ "quality": "lossless",
+ "codecs": GET_FILE_INFO_CODECS,
+ "transports": "encraw",
+ }
+ # Build sign string explicitly matching Yandex API specification:
+ # concatenate ts + trackId + quality + codecs (commas stripped) + transports.
+ # Comma stripping matches yandex-music-downloader-realflac reference implementation
+ # (see get_file_info signing in that project).
+ codecs_for_sign = GET_FILE_INFO_CODECS.replace(",", "")
+ param_string = f"{timestamp}{track_id}lossless{codecs_for_sign}encraw"
+ hmac_sign = hmac.new(
+ DEFAULT_SIGN_KEY.encode(),
+ param_string.encode(),
+ hashlib.sha256,
+ )
+ # SHA-256 (32 bytes) -> base64 = 44 chars with "=" padding; the Yandex API
+ # expects exactly 43 chars, so the trailing "=" is removed.
+ params["sign"] = base64.b64encode(hmac_sign.digest()).decode()[:-1]
+ url = f"{client.base_url}/get-file-info"
+ return url, params
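The signing step above can be exercised in isolation. A minimal sketch follows; the key and codec list are placeholders, not the real `DEFAULT_SIGN_KEY` / `GET_FILE_INFO_CODECS` values, and `build_sign` is an illustrative helper name:

```python
import base64
import hashlib
import hmac

# Placeholder values -- the real key and codec list live in constants.py
SIGN_KEY = "example-sign-key"
CODECS = "flac,mp3,aac"


def build_sign(track_id: str, timestamp: int) -> str:
    """HMAC-SHA256 over ts + trackId + 'lossless' + codecs (commas stripped)
    + 'encraw', base64-encoded with the trailing '=' padding removed."""
    param_string = f"{timestamp}{track_id}lossless{CODECS.replace(',', '')}encraw"
    digest = hmac.new(SIGN_KEY.encode(), param_string.encode(), hashlib.sha256).digest()
    # A 32-byte digest always base64-encodes to 44 chars with one '=' pad;
    # stripping the last char yields the 43-char form the endpoint accepts.
    return base64.b64encode(digest).decode()[:-1]
```

Because the timestamp enters the signed string, the signature must be recomputed on every retry, which is why `_build_signed_params` runs inside the retry callback.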
def _parse_file_info_result(raw: dict[str, Any] | None) -> dict[str, Any] | None:
if not raw or not isinstance(raw, dict):
return None
download_info = raw.get("download_info")
if not download_info or not download_info.get("url"):
return None
- return cast("dict[str, Any]", download_info)
- url = f"{GET_FILE_INFO_BASE_URL}/get-file-info"
- params_encraw = {**base_params, "transports": "encraw"}
+ result = cast("dict[str, Any]", download_info)
+
+ if "key" in download_info:
+ result["needs_decryption"] = True
+ LOGGER.debug(
+ "Encrypted URL received for track %s, will require decryption",
+ track_id,
+ )
+ else:
+ result["needs_decryption"] = False
+
+ return result
+
+ async def _do_request(c: ClientAsync) -> dict[str, Any] | None:
+ url, params = _build_signed_params(c)
+ return await c._request.get(url, params=params) # type: ignore[no-any-return]
+
try:
- result = await client._request.get(url, params=params_encraw)
- return _parse_file_info_result(result)
+ result = await self._call_with_retry(_do_request)
+ parsed = _parse_file_info_result(result)
+ if parsed:
+ LOGGER.debug(
+ "get-file-info lossless for track %s: Success, codec=%s",
+ track_id,
+ parsed.get("codec"),
+ )
+ return parsed
except (BadRequestError, NetworkError) as err:
LOGGER.debug(
"get-file-info lossless for track %s: %s %s",
track_id,
type(err).__name__,
getattr(err, "message", str(err)) or repr(err),
)
- return None
except UnauthorizedError as err:
LOGGER.debug(
- "get-file-info lossless for track %s (transports=encraw): %s %s",
+ "get-file-info lossless for track %s: UnauthorizedError %s",
track_id,
- type(err).__name__,
getattr(err, "message", str(err)) or repr(err),
)
- LOGGER.debug(
- "If you have Yandex Music Plus and this track has lossless, "
- "try a token from the web client (music.yandex.ru)."
+ except Exception as err:
+ LOGGER.warning(
+ "get-file-info lossless for track %s: Unexpected error: %s",
+ track_id,
+ err,
+ exc_info=True,
)
- params_raw = {**base_params, "transports": "raw"}
- try:
- result = await client._request.get(url, params=params_raw)
- return _parse_file_info_result(result)
- except (BadRequestError, NetworkError, UnauthorizedError) as retry_err:
+
+ return None
+
+ # Discovery / recommendations
+
+ async def get_feed(self) -> Feed | None:
+ """Get personalized feed with generated playlists (Playlist of the Day, etc.).
+
+ :return: Feed object with generated_playlists, or None on error.
+ """
+ try:
+ return await self._call_with_retry(lambda c: c.feed())
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching feed: %s", err)
+ return None
+
+ async def get_chart(self, chart_option: str = "") -> ChartInfo | None:
+ """Get chart data.
+
+ :param chart_option: Optional chart variant (e.g. 'world', 'russia').
+ :return: ChartInfo object or None on error.
+ """
+ try:
+ return await self._call_with_retry(lambda c: c.chart(chart_option))
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching chart: %s", err)
+ return None
+
+ async def get_new_releases(self) -> LandingList | None:
+ """Get new album releases.
+
+ :return: LandingList with new_releases (list of album IDs) or None on error.
+ """
+ try:
+ return await self._call_with_retry(lambda c: c.new_releases())
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching new releases: %s", err)
+ return None
+
+ async def get_new_playlists(self) -> LandingList | None:
+ """Get new editorial playlists.
+
+ :return: LandingList with new_playlists (list of PlaylistId) or None on error.
+ """
+ try:
+ return await self._call_with_retry(lambda c: c.new_playlists())
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching new playlists: %s", err)
+ return None
+
+ async def get_albums(self, album_ids: list[str]) -> list[YandexAlbum]:
+ """Get multiple albums by IDs.
+
+ :param album_ids: List of album IDs.
+ :return: List of album objects.
+ """
+ try:
+ result = await self._call_with_retry(lambda c: c.albums(album_ids))
+ return result or []
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching albums: %s", err)
+ return []
+
+ async def get_playlists(self, playlist_ids: list[str]) -> list[YandexPlaylist]:
+ """Get multiple playlists by IDs (format: 'uid:kind').
+
+ :param playlist_ids: List of playlist IDs in 'uid:kind' format.
+ :return: List of playlist objects.
+ """
+ try:
+ result = await self._call_with_retry(lambda c: c.playlists_list(playlist_ids))
+ return result or []
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching playlists: %s", err)
+ return []
+
+ async def get_tag_playlists(self, tag_id: str) -> list[YandexPlaylist]:
+ """Get playlists for a specific tag (mood, era, activity, genre, etc.).
+
+ Tags are used for curated collections like 'chill', '80s', 'workout', 'rock', etc.
+ The API returns playlist IDs which are then fetched in full.
+
+ :param tag_id: Tag identifier (e.g. 'chill', '80s', 'workout', 'rock').
+ :return: List of playlist objects with full details.
+ """
+ try:
+ tag_result = await self._call_with_retry(lambda c: c.tags(tag_id))
+ if not tag_result or not tag_result.ids:
+ LOGGER.debug("No playlists found for tag: %s", tag_id)
+ return []
+
+ # Convert PlaylistId objects to 'uid:kind' format
+ playlist_ids = [f"{pid.uid}:{pid.kind}" for pid in tag_result.ids]
+
+ # Fetch full playlist details
+ return await self.get_playlists(playlist_ids)
+ except BadRequestError as err:
+ LOGGER.debug("Tag %s not found: %s", tag_id, err)
+ return []
+ except (NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching tag %s playlists: %s", tag_id, err)
+ return []
+
+ async def get_landing_tags(self) -> list[tuple[str, str]]:
+ """Discover available tag slugs from the landing mixes block.
+
+ Uses the landing("mixes") API which returns MixLink entities
+ containing tag URLs (e.g., /tag/chill/) and display titles.
+ Filters out editorial post entries (/post/ URLs) which have no playlists.
+
+ :return: List of (tag_slug, title) tuples for real tag entries only.
+ """
+ try:
+ landing: Landing | None = await self._call_with_retry(lambda c: c.landing("mixes"))
+ if not landing or not landing.blocks:
+ return []
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching landing tags: %s", err)
+ return []
+
+ tags: list[tuple[str, str]] = []
+ for block in landing.blocks:
+ if not block.entities:
+ continue
+ for entity in block.entities:
+ if entity.type == "mix-link" and isinstance(entity.data, MixLink):
+ url = entity.data.url # e.g., "/tag/chill/" or "/post/..."
+ # Filter out editorial posts — only include /tag/ URLs
+ if not url.startswith("/tag/"):
+ continue
+ slug = url.strip("/").split("/")[-1]
+ if slug:
+ tags.append((slug, entity.data.title))
+ return tags
+
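The URL filtering in `get_landing_tags` can be sketched with plain dicts standing in for the library's `MixLink` entities (the helper name and dict shape here are illustrative):

```python
def extract_tag_slugs(entities: list[dict[str, str]]) -> list[tuple[str, str]]:
    """Keep only /tag/ URLs (editorial /post/ entries carry no playlists)
    and take the last path segment as the tag slug."""
    tags: list[tuple[str, str]] = []
    for entity in entities:
        url = entity["url"]
        if not url.startswith("/tag/"):
            continue
        # "/tag/chill/" -> "tag/chill" -> "chill"
        slug = url.strip("/").split("/")[-1]
        if slug:
            tags.append((slug, entity["title"]))
    return tags
```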
+ async def get_mixes_waves(self) -> list[dict[str, Any]] | None:
+ """Get AI Wave Set stations from /landing-blocks/mixes-waves endpoint.
+
+ Returns structured mix data with categories and station items, each
+ containing station_id, title, seeds, and visual metadata.
+
+ :return: List of mix category dicts, or None on error.
+ """
+ return await self._get_landing_waves("mixes-waves")
+
+ async def get_waves_landing(self) -> list[dict[str, Any]] | None:
+ """Get featured wave stations from /landing-blocks/waves endpoint.
+
+ Returns Yandex-curated wave categories with station items — the "Волны"
+ (Waves) landing page content, separate from the full rotor/stations/list
+ and from the AI mixes-waves sets.
+
+ :return: List of wave category dicts, or None on error.
+ """
+ return await self._get_landing_waves("waves")
+
+ async def _get_landing_waves(self, block: str) -> list[dict[str, Any]] | None:
+ """Fetch wave categories from a /landing-blocks/<block> endpoint.
+
+ Note: Response keys are auto-converted from camelCase to snake_case
+ by the yandex-music library's JSON parser.
+
+ :param block: Block name, e.g. 'waves' or 'mixes-waves'.
+ :return: List of wave category dicts, or None on error.
+ """
+
+ async def _get(c: ClientAsync) -> dict[str, Any] | None:
+ url = f"{c.base_url}/landing-blocks/{block}"
+ return await c._request.get(url) # type: ignore[no-any-return]
+
+ try:
+ result = await self._call_with_retry(_get)
+ if result and isinstance(result, dict):
+ waves = result.get("waves", [])
LOGGER.debug(
- "get-file-info lossless for track %s (transports=raw): %s %s",
- track_id,
- type(retry_err).__name__,
- getattr(retry_err, "message", str(retry_err)) or repr(retry_err),
+ "landing-blocks/%s returned %d categories",
+ block,
+ len(waves) if isinstance(waves, list) else -1,
)
- return None
+ return waves if isinstance(waves, list) else []
+ return None
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.debug("Error fetching landing-blocks/%s: %s", block, err)
+ return None
+
+ async def get_wave_stations(
+ self, language: str | None = None
+ ) -> list[tuple[str, str, str, str | None]]:
+ """Get available rotor wave stations grouped by category.
+
+ Calls rotor_stations_list() — equivalent to the rotor/stations/list API endpoint.
+ Filters out personal stations (type 'user') since My Wave is handled separately.
+
+ :param language: Language for station names (e.g. 'ru', 'en'). Defaults to API default.
+ :return: List of (station_id, category, name, image_url) tuples,
+ e.g. ('genre:rock', 'genre', 'Рок', 'https://...').
+ """
+ try:
+ results: list[StationResult] = await self._call_with_retry(
+ lambda c: c.rotor_stations_list(language)
+ )
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.warning("Error fetching wave stations: %s", err)
+ return []
+
+ stations: list[tuple[str, str, str, str | None]] = []
+ for result in results or []:
+ station = result.station
+ if station is None or station.id is None:
+ continue
+ category = station.id.type
+ tag = station.id.tag
+ if not category or not tag:
+ continue
+ if category in ("user", "local-language"):
+ # Skip personal stations (My Wave is handled separately)
+ # and local-language stations (Yandex returns overlapping tracks across them)
+ continue
+ station_id = f"{category}:{tag}"
+ name = station.name or result.rup_title or tag
+ image_url: str | None = None
+ raw_url = station.full_image_url or (station.icon.image_url if station.icon else None)
+ if raw_url:
+ # Yandex avatar URIs use '%%' as a size placeholder; replace it with
+ # the desired size. If no placeholder, append the size as a suffix
+ # since these URLs return HTTP 400 without a size component.
+ if not raw_url.startswith("http"):
+ raw_url = f"https://{raw_url}"
+ if "%%" in raw_url:
+ image_url = raw_url.replace("%%", "400x400")
+ else:
+ image_url = f"{raw_url}/400x400"
+ stations.append((station_id, category, name, image_url))
+ return stations
+
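The avatar-URL normalization used for station images can be pulled out into a standalone helper (the function name is illustrative; the logic mirrors the inline code above):

```python
def normalize_cover_url(raw_url: str, size: str = "400x400") -> str:
    """Turn a Yandex avatar URI into a fetchable image URL.

    Avatar URIs may lack a scheme and use '%%' as a size placeholder;
    URLs without a size component return HTTP 400, so a size suffix is
    appended when no placeholder is present.
    """
    if not raw_url.startswith("http"):
        raw_url = f"https://{raw_url}"
    if "%%" in raw_url:
        return raw_url.replace("%%", size)
    return f"{raw_url}/{size}"
```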
+ async def get_dashboard_stations(self) -> list[tuple[str, str, str | None]]:
+ """Get personalized recommended stations for the current user.
+
+ Calls rotor_stations_dashboard() — returns user-specific stations based
+ on listening history, unlike rotor_stations_list() which is non-personalized.
+
+ :return: List of (station_id, name, image_url) tuples,
+ e.g. ('genre:rock', 'Рок', 'https://...').
+ """
+ try:
+ dashboard: Dashboard | None = await self._call_with_retry(
+ lambda c: c.rotor_stations_dashboard()
+ )
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
+ LOGGER.warning("Error fetching dashboard stations: %s", err)
+ return []
+
+ if not dashboard or not dashboard.stations:
+ return []
+
+ stations: list[tuple[str, str, str | None]] = []
+ for result in dashboard.stations:
+ station = result.station
+ if station is None or station.id is None:
+ continue
+ category = station.id.type
+ tag = station.id.tag
+ if not category or not tag:
+ continue
+ if category == "user":
+ continue
+ station_id = f"{category}:{tag}"
+ name = station.name or result.rup_title or tag
+ image_url: str | None = None
+ raw_url = station.full_image_url or (station.icon.image_url if station.icon else None)
+ if raw_url:
+ if not raw_url.startswith("http"):
+ raw_url = f"https://{raw_url}"
+ if "%%" in raw_url:
+ image_url = raw_url.replace("%%", "400x400")
+ else:
+ image_url = f"{raw_url}/400x400"
+ stations.append((station_id, name, image_url))
+ return stations
# Library modifications
:param track_id: Track ID to like.
:return: True if successful.
"""
- client = self._ensure_connected()
try:
- result = await client.users_likes_tracks_add(track_id)
+ result = await self._call_with_retry(lambda c: c.users_likes_tracks_add(track_id))
return result is not None
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error liking track %s: %s", track_id, err)
return False
:param track_id: Track ID to unlike.
:return: True if successful.
"""
- client = self._ensure_connected()
try:
- result = await client.users_likes_tracks_remove(track_id)
+ result = await self._call_with_retry(lambda c: c.users_likes_tracks_remove(track_id))
return result is not None
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error unliking track %s: %s", track_id, err)
return False
:param album_id: Album ID to like.
:return: True if successful.
"""
- client = self._ensure_connected()
try:
- result = await client.users_likes_albums_add(album_id)
+ result = await self._call_with_retry(lambda c: c.users_likes_albums_add(album_id))
return result is not None
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error liking album %s: %s", album_id, err)
return False
:param album_id: Album ID to unlike.
:return: True if successful.
"""
- client = self._ensure_connected()
try:
- result = await client.users_likes_albums_remove(album_id)
+ result = await self._call_with_retry(lambda c: c.users_likes_albums_remove(album_id))
return result is not None
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error unliking album %s: %s", album_id, err)
return False
:param artist_id: Artist ID to like.
:return: True if successful.
"""
- client = self._ensure_connected()
try:
- result = await client.users_likes_artists_add(artist_id)
+ result = await self._call_with_retry(lambda c: c.users_likes_artists_add(artist_id))
return result is not None
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error liking artist %s: %s", artist_id, err)
return False
:param artist_id: Artist ID to unlike.
:return: True if successful.
"""
- client = self._ensure_connected()
try:
- result = await client.users_likes_artists_remove(artist_id)
+ result = await self._call_with_retry(lambda c: c.users_likes_artists_remove(artist_id))
return result is not None
- except (BadRequestError, NetworkError) as err:
+ except (BadRequestError, NetworkError, ProviderUnavailableError) as err:
LOGGER.error("Error unliking artist %s: %s", artist_id, err)
return False
# Configuration Keys
CONF_TOKEN = "token"
CONF_QUALITY = "quality"
+CONF_BASE_URL = "base_url"
# Actions
CONF_ACTION_AUTH = "auth"
# API defaults
DEFAULT_LIMIT: Final[int] = 50
+DEFAULT_BASE_URL: Final[str] = "https://api.music.yandex.net"
-# Quality options
-QUALITY_HIGH = "high"
-QUALITY_LOSSLESS = "lossless"
+# Quality options (matching reference implementation)
+QUALITY_EFFICIENT = "efficient" # Low quality, efficient bandwidth (~64kbps AAC)
+QUALITY_BALANCED = "balanced" # Medium quality, balanced performance (~192kbps AAC)
+QUALITY_HIGH = "high" # High quality, lossy (~320kbps MP3)
+QUALITY_SUPERB = "superb" # Highest quality, lossless (FLAC)
+
+# Configuration keys for My Wave behavior (kept)
+CONF_MY_WAVE_MAX_TRACKS: Final[str] = "my_wave_max_tracks"
+
+# Configuration keys for Liked Tracks behavior (kept)
+CONF_LIKED_TRACKS_MAX_TRACKS: Final[str] = "liked_tracks_max_tracks"
+
+# Hardcoded default values for removed config entries
+MY_WAVE_BATCH_SIZE: Final[int] = 3
+TRACK_BATCH_SIZE: Final[int] = 50
+DISCOVERY_INITIAL_TRACKS: Final[int] = 20
+BROWSE_INITIAL_TRACKS: Final[int] = 15
# Image sizes
IMAGE_SIZE_SMALL = "200x200"
IMAGE_SIZE_MEDIUM = "400x400"
IMAGE_SIZE_LARGE = "1000x1000"
+# Locale-aware provider display names for owner normalization
+PROVIDER_DISPLAY_NAME_RU: Final[str] = "Яндекс Музыка"
+PROVIDER_DISPLAY_NAME_EN: Final[str] = "Yandex Music"
+
+# Known API-returned system owner name variants (all locales/capitalizations)
+# All entries are lowercase; compare with owner_name.lower() for case-insensitive lookup
+YANDEX_SYSTEM_OWNER_NAMES: Final[frozenset[str]] = frozenset(
+ {
+ "яндекс музыка",
+ "яндекс.музыка",
+ "yandex.music",
+ "yandexmusic",
+ "yandex music",
+ }
+)
+
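The normalization these constants support can be sketched as a pure function (the set below is a subset of `YANDEX_SYSTEM_OWNER_NAMES`, and `canonical_owner` is a hypothetical helper, not a function in the diff):

```python
SYSTEM_OWNER_NAMES = frozenset({"яндекс музыка", "яндекс.музыка", "yandex.music", "yandex music"})


def canonical_owner(owner_name: str, locale: str) -> str:
    """Map any known system-account spelling (case-insensitive) to one
    locale-aware display name; leave regular user names untouched."""
    if owner_name.lower() in SYSTEM_OWNER_NAMES:
        return "Яндекс Музыка" if locale.lower().startswith("ru") else "Yandex Music"
    return owner_name
```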
# ID separators
PLAYLIST_ID_SPLITTER: Final[str] = ":"
# Virtual playlist ID for My Wave (used in get_playlist / get_playlist_tracks; not owner_id:kind)
MY_WAVE_PLAYLIST_ID: Final[str] = "my_wave"
+# Virtual playlist ID for Liked Tracks
+LIKED_TRACKS_PLAYLIST_ID: Final[str] = "liked_tracks"
+
# Composite item_id for My Wave tracks: track_id + separator + station_id (for rotor feedback)
RADIO_TRACK_ID_SEP: Final[str] = "@"
"albums": "Мои альбомы",
"tracks": "Мне нравится",
"playlists": "Мои плейлисты",
+ "feed": "Для вас",
+ "chart": "Чарт",
+ "new_releases": "Новинки",
+ "new_playlists": "Новые плейлисты",
+ # Picks & Mixes
+ "picks": "Подборки",
+ "mixes": "Миксы",
+ "mood": "Настроение",
+ "activity": "Активность",
+ "era": "Эпоха",
+ "genres": "Жанры",
+ # Mood tags
+ "chill": "Расслабляющее",
+ "sad": "Грустное",
+ "romantic": "Романтическое",
+ "party": "Вечеринка",
+ "relax": "Релакс",
+ # Activity tags
+ "workout": "Тренировка",
+ "focus": "Концентрация",
+ "morning": "Утро",
+ "evening": "Вечер",
+ "driving": "В дороге", # noqa: RUF001
+ # Era tags
+ "80s": "80-е", # noqa: RUF001
+ "90s": "90-е", # noqa: RUF001
+ "2000s": "2000-е", # noqa: RUF001
+ "retro": "Ретро",
+ # Genre tags
+ "rock": "Рок",
+ "jazz": "Джаз",
+ "classical": "Классика",
+ "electronic": "Электроника",
+ "rnb": "R&B",
+ "hiphop": "Хип-хоп",
+ "top": "Топ",
+ "newbies": "По жанру",
+ # Landing-discovered tags
+ "in the mood": "В настроение", # noqa: RUF001
+ "background": "Послушать фоном",
+ # Seasonal tags
+ "winter": "Зима",
+ "summer": "Лето",
+ "autumn": "Осень",
+ "spring": "Весна",
+ "newyear": "Новый год",
+ # Liked Tracks
+ "liked_tracks": "Мне нравится",
+ # Discovery
+ "top_picks": "Топ подборки",
+ "mood_mix": "Настроение",
+ "activity_mix": "Активность",
+ "seasonal_mix": "Сезонное",
+ # Top-level browse groups
+ "for_you": "Для вас",
+ "collection": "Коллекция",
+ # Waves / Radio (rotor station categories)
+ "waves": "Радио",
+ "radio": "Радио",
+ "my_waves": "Персональные",
+ "my_waves_set": "AI Сеты",
+ "waves_landing": "Избранные волны",
+ "genre": "Жанры",
+ "epoch": "Эпоха",
+ "local": "Местное",
}
BROWSE_NAMES_EN: Final[dict[str, str]] = {
"my_wave": "My Wave",
"albums": "My Albums",
"tracks": "My Favorites",
"playlists": "My Playlists",
+ "feed": "Made for You",
+ "chart": "Chart",
+ "new_releases": "New Releases",
+ "new_playlists": "New Playlists",
+ # Picks & Mixes
+ "picks": "Picks",
+ "mixes": "Mixes",
+ "mood": "Mood",
+ "activity": "Activity",
+ "era": "Era",
+ "genres": "Genres",
+ # Mood tags
+ "chill": "Chill",
+ "sad": "Sad",
+ "romantic": "Romantic",
+ "party": "Party",
+ "relax": "Relax",
+ # Activity tags
+ "workout": "Workout",
+ "focus": "Focus",
+ "morning": "Morning",
+ "evening": "Evening",
+ "driving": "Driving",
+ # Era tags
+ "80s": "80s",
+ "90s": "90s",
+ "2000s": "2000s",
+ "retro": "Retro",
+ # Genre tags
+ "rock": "Rock",
+ "jazz": "Jazz",
+ "classical": "Classical",
+ "electronic": "Electronic",
+ "rnb": "R&B",
+ "hiphop": "Hip-Hop",
+ "top": "Top",
+ "newbies": "By Genre",
+ # Landing-discovered tags
+ "in the mood": "In the Mood",
+ "background": "Background",
+ # Seasonal tags
+ "winter": "Winter",
+ "summer": "Summer",
+ "autumn": "Autumn",
+ "spring": "Spring",
+ "newyear": "New Year",
+ # Liked Tracks
+ "liked_tracks": "My Favorites",
+ # Discovery
+ "top_picks": "Top Picks",
+ "mood_mix": "Mood Mix",
+ "activity_mix": "Activity Mix",
+ "seasonal_mix": "Seasonal",
+ # Top-level browse groups
+ "for_you": "For You",
+ "collection": "Collection",
+ # Waves / Radio (rotor station categories)
+ "waves": "Radio",
+ "radio": "Radio",
+ "my_waves": "Personal",
+ "my_waves_set": "AI Wave Sets",
+ "waves_landing": "Featured Waves",
+ "genre": "Genres",
+ "epoch": "Era",
+ "local": "Local",
+}
+
+# Tag categories for Picks and Recommendations
+# Used by _get_valid_tags_for_category to validate tags at runtime.
+TAG_CATEGORY_MOOD: Final[list[str]] = [
+ "chill",
+ "sad",
+ "romantic",
+ "party",
+ "relax",
+ "in the mood",
+]
+TAG_CATEGORY_ACTIVITY: Final[list[str]] = [
+ "workout",
+ "focus",
+ "morning",
+ "evening",
+ "driving",
+ "background",
+]
+TAG_CATEGORY_ERA: Final[list[str]] = ["80s", "90s", "2000s", "retro"]
+TAG_CATEGORY_GENRES: Final[list[str]] = [
+ "rock",
+ "jazz",
+ "classical",
+ "electronic",
+ "rnb",
+ "hiphop",
+ "top",
+ "newbies",
+]
+
+# Tag slug -> display category mapping
+# Used to categorize dynamically discovered tags into browse folders.
+# Tags not in this mapping default to "mood" category.
+TAG_SLUG_CATEGORY: Final[dict[str, str]] = {
+ # Mood
+ "chill": "mood",
+ "sad": "mood",
+ "romantic": "mood",
+ "party": "mood",
+ "relax": "mood",
+ "in the mood": "mood",
+ # Activity
+ "workout": "activity",
+ "focus": "activity",
+ "morning": "activity",
+ "evening": "activity",
+ "driving": "activity",
+ "background": "activity",
+ # Era
+ "80s": "era",
+ "90s": "era",
+ "2000s": "era",
+ "retro": "era",
+ # Genres
+ "rock": "genres",
+ "jazz": "genres",
+ "classical": "genres",
+ "electronic": "genres",
+ "rnb": "genres",
+ "hiphop": "genres",
+ "top": "genres",
+ "newbies": "genres",
+ # Seasonal (for mixes)
+ "winter": "seasonal",
+ "spring": "seasonal",
+ "summer": "seasonal",
+ "autumn": "seasonal",
+ "newyear": "seasonal",
+}
+
+# Preferred tag order within categories (discovered tags sorted by this)
+TAG_CATEGORY_ORDER: Final[dict[str, list[str]]] = {
+ "mood": ["chill", "sad", "romantic", "party", "relax", "in the mood"],
+ "activity": ["workout", "focus", "morning", "evening", "driving", "background"],
+ "era": ["80s", "90s", "2000s", "retro"],
+ "genres": ["rock", "jazz", "classical", "electronic", "rnb", "hiphop", "top", "newbies"],
+}
+
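How `TAG_CATEGORY_ORDER` might be applied to dynamically discovered tags can be sketched as follows (`sort_discovered_tags` is a hypothetical helper; only one category is shown):

```python
TAG_CATEGORY_ORDER = {
    "era": ["80s", "90s", "2000s", "retro"],
}


def sort_discovered_tags(category: str, slugs: list[str]) -> list[str]:
    """Order discovered tag slugs by the category's preferred order; slugs
    not in the preference list sort after it, keeping their discovery
    order (sorted() is stable)."""
    preferred = TAG_CATEGORY_ORDER.get(category, [])
    rank = {slug: i for i, slug in enumerate(preferred)}
    return sorted(slugs, key=lambda s: rank.get(s, len(preferred)))
```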
+# Seasonal tags mapped to months (month number -> tag)
+TAG_SEASONAL_MAP: Final[dict[int, str]] = {
+ 1: "winter", # January
+ 2: "winter", # February
+ 3: "spring", # March (validated at runtime; falls back to autumn if unavailable)
+ 4: "spring", # April
+ 5: "spring", # May
+ 6: "summer", # June
+ 7: "summer", # July
+ 8: "summer", # August
+ 9: "autumn", # September
+ 10: "autumn", # October
+ 11: "autumn", # November
+ 12: "winter", # December
}
+
+# Tags for Mixes (seasonal collections)
+TAG_MIXES: Final[list[str]] = ["winter", "spring", "summer", "autumn", "newyear"]
+
+# Waves by tag (rotor stations) — canonical ID is "waves", "radio" is an alias
+WAVES_FOLDER_ID: Final[str] = "waves"
+RADIO_FOLDER_ID: Final[str] = "radio"
+
+# Personalized waves subfolder (rotor/stations/dashboard)
+MY_WAVES_FOLDER_ID: Final[str] = "my_waves"
+
+# AI Wave Sets subfolder (from /landing-blocks/mixes-waves)
+MY_WAVES_SET_FOLDER_ID: Final[str] = "my_waves_set"
+
+# Featured Waves subfolder inside Radio (from /landing-blocks/waves)
+WAVES_LANDING_FOLDER_ID: Final[str] = "waves_landing"
+
+# Top-level browse group folders
+FOR_YOU_FOLDER_ID: Final[str] = "for_you"
+COLLECTION_FOLDER_ID: Final[str] = "collection"
+
+# Preferred display order for wave categories (rotor station types)
+WAVE_CATEGORY_DISPLAY_ORDER: Final[list[str]] = [
+ "genre",
+ "mood",
+ "activity",
+ "epoch",
+ "local",
+]
from music_assistant.helpers.util import parse_title_and_version
-from .constants import IMAGE_SIZE_LARGE
+from .constants import (
+ IMAGE_SIZE_LARGE,
+ PROVIDER_DISPLAY_NAME_EN,
+ PROVIDER_DISPLAY_NAME_RU,
+ YANDEX_SYSTEM_OWNER_NAMES,
+)
if TYPE_CHECKING:
from yandex_music import Album as YandexAlbum
from .provider import YandexMusicProvider
-def _get_content_type(provider: YandexMusicProvider) -> ContentType:
- """Get content type based on provider quality setting.
+def get_canonical_provider_name(provider: YandexMusicProvider) -> str:
+ """Return the locale-aware canonical display name for the Yandex Music system account.
:param provider: The Yandex Music provider instance.
- :return: ContentType.UNKNOWN as actual codec is determined at stream time.
+ :return: Localized provider display name.
"""
- # Actual codec is determined when getting stream details
- # Suppress unused argument warning
- _ = provider
- return ContentType.UNKNOWN
+ with suppress(Exception):
+ locale = (provider.mass.metadata.locale or "en_US").lower()
+ if locale.startswith("ru"):
+ return PROVIDER_DISPLAY_NAME_RU
+ return PROVIDER_DISPLAY_NAME_EN
def _get_image_url(cover_uri: str | None, size: str = IMAGE_SIZE_LARGE) -> str | None:
provider_domain=provider.domain,
provider_instance=provider.instance_id,
audio_format=AudioFormat(
- content_type=_get_content_type(provider),
+ content_type=ContentType.UNKNOWN,
),
url=f"https://music.yandex.ru/album/{album_id}",
available=available,
return album
-def parse_track(provider: YandexMusicProvider, track_obj: YandexTrack) -> Track:
+def parse_track(
+ provider: YandexMusicProvider,
+ track_obj: YandexTrack,
+ lyrics: str | None = None,
+ lyrics_synced: bool = False,
+) -> Track:
"""Parse Yandex track object to MA Track model.
:param provider: The Yandex Music provider instance.
:param track_obj: Yandex track object.
+ :param lyrics: Optional lyrics text.
+ :param lyrics_synced: Whether lyrics are in synced LRC format.
:return: Music Assistant Track model.
"""
name, version = parse_title_and_version(
provider_domain=provider.domain,
provider_instance=provider.instance_id,
audio_format=AudioFormat(
- content_type=_get_content_type(provider),
+ content_type=ContentType.UNKNOWN,
),
url=f"https://music.yandex.ru/track/{track_id}",
available=available,
if track_obj.content_warning:
track.metadata.explicit = track_obj.content_warning == "explicit"
+ # Lyrics
+ if lyrics:
+ if lyrics_synced:
+ track.metadata.lrc_lyrics = lyrics
+ else:
+ track.metadata.lyrics = lyrics
+
return track
elif is_editable:
owner_name = "Me"
else:
- owner_name = "Yandex Music"
+ owner_name = get_canonical_provider_name(provider)
+
+ # Normalize all known system account name variants to locale-aware canonical form
+ if owner_name and owner_name.lower() in YANDEX_SYSTEM_OWNER_NAMES:
+ owner_name = get_canonical_provider_name(provider)
playlist = Playlist(
item_id=playlist_id,
from __future__ import annotations
+import asyncio
import logging
-from collections.abc import Sequence
-from typing import TYPE_CHECKING
+import random
+from collections.abc import AsyncGenerator, Sequence
+from datetime import UTC, datetime
+from io import BytesIO
+from typing import TYPE_CHECKING, Any
-from music_assistant_models.enums import MediaType, ProviderFeature
+from music_assistant_models.enums import ImageType, MediaType, ProviderFeature
from music_assistant_models.errors import (
InvalidDataError,
LoginFailed,
Artist,
BrowseFolder,
ItemMapping,
+ MediaItemImage,
MediaItemType,
Playlist,
ProviderMapping,
Track,
UniqueList,
)
+from PIL import Image as PilImage
from music_assistant.controllers.cache import use_cache
from music_assistant.models.music_provider import MusicProvider
from .api_client import YandexMusicClient
from .constants import (
+ BROWSE_INITIAL_TRACKS,
BROWSE_NAMES_EN,
BROWSE_NAMES_RU,
+ COLLECTION_FOLDER_ID,
+ CONF_BASE_URL,
+ CONF_LIKED_TRACKS_MAX_TRACKS,
+ CONF_MY_WAVE_MAX_TRACKS,
CONF_TOKEN,
+ DEFAULT_BASE_URL,
+ DISCOVERY_INITIAL_TRACKS,
+ FOR_YOU_FOLDER_ID,
+ IMAGE_SIZE_MEDIUM,
+ LIKED_TRACKS_PLAYLIST_ID,
+ MY_WAVE_BATCH_SIZE,
MY_WAVE_PLAYLIST_ID,
+ MY_WAVES_FOLDER_ID,
+ MY_WAVES_SET_FOLDER_ID,
PLAYLIST_ID_SPLITTER,
+ RADIO_FOLDER_ID,
RADIO_TRACK_ID_SEP,
ROTOR_STATION_MY_WAVE,
+ TAG_CATEGORY_ACTIVITY,
+ TAG_CATEGORY_ERA,
+ TAG_CATEGORY_GENRES,
+ TAG_CATEGORY_MOOD,
+ TAG_CATEGORY_ORDER,
+ TAG_MIXES,
+ TAG_SEASONAL_MAP,
+ TAG_SLUG_CATEGORY,
+ TRACK_BATCH_SIZE,
+ WAVE_CATEGORY_DISPLAY_ORDER,
+ WAVES_FOLDER_ID,
+ WAVES_LANDING_FOLDER_ID,
+)
+from .parsers import (
+ _get_image_url as get_image_url,
+ get_canonical_provider_name,
+ parse_album,
+ parse_artist,
+ parse_playlist,
+ parse_track,
)
-from .parsers import parse_album, parse_artist, parse_playlist, parse_track
from .streaming import YandexMusicStreamingManager
if TYPE_CHECKING:
- from collections.abc import AsyncGenerator
-
from music_assistant_models.streamdetails import StreamDetails
return (item_id, None)
+class _WaveState:
+ """Per-station mutable state for rotor wave playback."""
+
+ def __init__(self) -> None:
+ self.batch_id: str | None = None
+ self.last_track_id: str | None = None
+ self.seen_track_ids: set[str] = set()
+ self.radio_started_sent: bool = False
+ self.lock: asyncio.Lock = asyncio.Lock()
+
+
class YandexMusicProvider(MusicProvider):
"""Implementation of a Yandex Music MusicProvider."""
_my_wave_last_track_id: str | None = None # last track id for "Load more" (API queue param)
_my_wave_playlist_next_cursor: str | None = None # first_track_id for next playlist page
_my_wave_radio_started_sent: bool = False
+ _my_wave_seen_track_ids: set[str] # Track IDs seen in current My Wave session
+ _my_wave_lock: asyncio.Lock # Protects My Wave mutable state
+ _wave_states: dict[str, _WaveState] # Per-station state for tagged wave stations
+ _wave_bg_colors: dict[str, str] # image_url -> hex bg color for transparent covers
@property
def client(self) -> YandexMusicClient:
try:
locale = (self.mass.metadata.locale or "en_US").lower()
use_russian = locale.startswith("ru")
- except Exception:
+ self.logger.debug("Locale detection: locale=%s, use_russian=%s", locale, use_russian)
+ except Exception as err:
+ self.logger.debug("Locale detection failed: %s", err)
use_russian = False
return BROWSE_NAMES_RU if use_russian else BROWSE_NAMES_EN
if not token:
raise LoginFailed("No Yandex Music token provided")
- self._client = YandexMusicClient(str(token))
+ base_url = self.config.get_value(CONF_BASE_URL, DEFAULT_BASE_URL)
+ self._client = YandexMusicClient(str(token), base_url=str(base_url))
await self._client.connect()
# Suppress yandex_music library DEBUG dumps (full API request/response JSON)
logging.getLogger("yandex_music").setLevel(self.logger.level + 10)
self._streaming = YandexMusicStreamingManager(self)
+ # Initialize My Wave duplicate tracking
+ self._my_wave_seen_track_ids = set()
+ self._my_wave_lock = asyncio.Lock()
+ # Initialize per-station wave state dict
+ self._wave_states = {}
+ self._wave_bg_colors = {}
self.logger.info("Successfully connected to Yandex Music")
async def unload(self, is_removed: bool = False) -> None:
name=name,
)
- async def browse( # noqa: PLR0915
- self, path: str
- ) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
+ async def browse(self, path: str) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
"""Browse provider items with locale-based folder names and My Wave.
Root level shows My Wave, artists, albums, liked tracks, playlists. Names
sub_subpath = path_parts[1] if len(path_parts) > 1 else None
if subpath == MY_WAVE_PLAYLIST_ID:
- # Root my_wave: fetch up to 3 batches so Play adds more tracks.
- # "Load more" uses single next batch.
- max_batches = 3 if sub_subpath != "next" else 1
- queue: str | int | None = None
- if sub_subpath == "next":
- queue = self._my_wave_last_track_id
- elif sub_subpath:
- queue = sub_subpath
-
- all_tracks: list[Track | BrowseFolder] = []
- last_batch_id: str | None = None
- first_track_id_this_batch: str | None = None
-
- for _ in range(max_batches):
- yandex_tracks, batch_id = await self.client.get_my_wave_tracks(queue=queue)
- if batch_id:
- self._my_wave_batch_id = batch_id
- last_batch_id = batch_id
- if not self._my_wave_radio_started_sent and yandex_tracks:
- self._my_wave_radio_started_sent = True
- await self.client.send_rotor_station_feedback(
- ROTOR_STATION_MY_WAVE,
- "radioStarted",
- batch_id=batch_id,
- )
- first_track_id_this_batch = None
- for yt in yandex_tracks:
- try:
- t = parse_track(self, yt)
- track_id = (
- str(yt.id)
- if hasattr(yt, "id") and yt.id
- else getattr(yt, "track_id", None)
- )
- if track_id:
- if first_track_id_this_batch is None:
- first_track_id_this_batch = track_id
- t.item_id = f"{track_id}{RADIO_TRACK_ID_SEP}{ROTOR_STATION_MY_WAVE}"
- for pm in t.provider_mappings:
- if pm.provider_instance == self.instance_id:
- pm.item_id = t.item_id
- break
- all_tracks.append(t)
- except InvalidDataError as err:
- self.logger.debug("Error parsing My Wave track: %s", err)
- if first_track_id_this_batch is not None:
- self._my_wave_last_track_id = first_track_id_this_batch
- if not batch_id or not yandex_tracks:
- break
- queue = first_track_id_this_batch
-
- if last_batch_id:
- names = self._get_browse_names()
- next_name = "Ещё" if names is BROWSE_NAMES_RU else "Load more"
- all_tracks.append(
- BrowseFolder(
- item_id="next",
- provider=self.instance_id,
- path=f"{path.rstrip('/')}/next",
- name=next_name,
- is_playable=False,
- )
- )
- return all_tracks
+ async with self._my_wave_lock:
+ return await self._browse_my_wave(path, sub_subpath)
+
+ # For You folder (picks + mixes)
+ if subpath == FOR_YOU_FOLDER_ID:
+ return await self._browse_for_you(path, path_parts)
+
+ # Collection folder (library items)
+ if subpath == COLLECTION_FOLDER_ID:
+ return await self._browse_collection(path)
+
+ # Handle picks/ path (mood, activity, era, genres)
+ if subpath == "picks":
+ return await self._browse_picks(path, path_parts)
+
+ # Handle mixes/ path (seasonal collections)
+ if subpath == "mixes":
+ return await self._browse_mixes(path, path_parts)
+
+ # Handle waves/ and radio/ paths (rotor stations by genre/mood/activity)
+ if subpath in (WAVES_FOLDER_ID, RADIO_FOLDER_ID):
+ return await self._browse_waves(path, path_parts)
+
+ # Handle my_waves_set/ path (AI Wave Sets from /landing-blocks/mixes-waves)
+ if subpath == MY_WAVES_SET_FOLDER_ID:
+ return await self._browse_vibe_sets(path, path_parts)
+
+ # Handle waves_landing/ path (Featured Waves from /landing-blocks/waves)
+ if subpath == WAVES_LANDING_FOLDER_ID:
+ return await self._browse_waves_landing(path, path_parts)
+
+ # Handle direct tag subpath (when folder is played by URI, the full path
+ # "picks/category/tag" is lost and only the tag slug arrives as subpath).
+ # Skip the API call for standard top-level folders that are never tag slugs.
+ _known_folders = {
+ "artists",
+ "albums",
+ "tracks",
+ "playlists",
+ LIKED_TRACKS_PLAYLIST_ID,
+ WAVES_FOLDER_ID,
+ RADIO_FOLDER_ID,
+ MY_WAVES_FOLDER_ID,
+ MY_WAVES_SET_FOLDER_ID,
+ WAVES_LANDING_FOLDER_ID,
+ FOR_YOU_FOLDER_ID,
+ COLLECTION_FOLDER_ID,
+ }
+ if subpath and subpath not in _known_folders:
+ # Handle direct wave station_id (e.g. "activity:workout") passed when
+ # MA plays a wave station folder using its item_id as the path subpath.
+ # Station IDs have format "category:tag" where category is non-numeric.
+ if ":" in subpath:
+ cat_part = subpath.split(":", 1)[0]
+ if not cat_part.isdigit():
+ return await self._browse_wave_station(subpath)
+
+ discovered_tags = await self._get_discovered_tag_slugs()
+ if subpath in discovered_tags:
+ return await self._get_tag_playlists_as_browse(subpath)
if subpath:
return await super().browse(path)
folders: list[BrowseFolder] = []
base = path if path.endswith("//") else path.rstrip("/") + "/"
+ # My Wave folder (always enabled — Яндекс «Моя волна»)
folders.append(
BrowseFolder(
item_id=MY_WAVE_PLAYLIST_ID,
is_playable=True,
)
)
+ # For You folder — Picks + Mixes (Яндекс «Для вас»)
+ folders.append(
+ BrowseFolder(
+ item_id=FOR_YOU_FOLDER_ID,
+ provider=self.instance_id,
+ path=f"{base}{FOR_YOU_FOLDER_ID}",
+ name=names.get(FOR_YOU_FOLDER_ID, "For You"),
+ is_playable=False,
+ )
+ )
+ # Collection folder — library items (Яндекс «Коллекция»)
+ has_library = any(
+ f in self.supported_features
+ for f in (
+ ProviderFeature.LIBRARY_ARTISTS,
+ ProviderFeature.LIBRARY_ALBUMS,
+ ProviderFeature.LIBRARY_TRACKS,
+ ProviderFeature.LIBRARY_PLAYLISTS,
+ )
+ )
+ if has_library:
+ folders.append(
+ BrowseFolder(
+ item_id=COLLECTION_FOLDER_ID,
+ provider=self.instance_id,
+ path=f"{base}{COLLECTION_FOLDER_ID}",
+ name=names.get(COLLECTION_FOLDER_ID, "Collection"),
+ is_playable=False,
+ )
+ )
+ # Radio folder — rotor stations (Яндекс волны, renamed to Radio)
+ folders.append(
+ BrowseFolder(
+ item_id=RADIO_FOLDER_ID,
+ provider=self.instance_id,
+ path=f"{base}{RADIO_FOLDER_ID}",
+ name=names.get(RADIO_FOLDER_ID, "Radio"),
+ is_playable=False,
+ )
+ )
+ # AI Wave Sets — parametric stations from /landing-blocks/mixes-waves
+ folders.append(
+ BrowseFolder(
+ item_id=MY_WAVES_SET_FOLDER_ID,
+ provider=self.instance_id,
+ path=f"{base}{MY_WAVES_SET_FOLDER_ID}",
+ name=names.get(MY_WAVES_SET_FOLDER_ID, "AI Wave Sets"),
+ is_playable=False,
+ )
+ )
+ if len(folders) == 1:
+ return await self.browse(folders[0].path)
+ return folders
+
+ async def _browse_my_wave(
+ self, path: str, sub_subpath: str | None
+ ) -> list[Track | BrowseFolder]:
+ """Browse My Wave tracks (must be called under _my_wave_lock).
+
+ :param path: Full browse path.
+ :param sub_subpath: Sub-path part ('next' for load more, or track_id cursor).
+ :return: List of Track and optional BrowseFolder for "Load more".
+ """
+ max_tracks_config = int(
+ self.config.get_value(CONF_MY_WAVE_MAX_TRACKS) or 150 # type: ignore[arg-type]
+ )
+ batch_count = MY_WAVE_BATCH_SIZE # number of rotor API calls for the initial load
+
+ # Effective limit on tracks to collect for this call:
+ # initial browse is capped to BROWSE_INITIAL_TRACKS to avoid marking
+ # extra tracks as "seen" that are never shown to the user.
+ effective_limit = min(
+ BROWSE_INITIAL_TRACKS if sub_subpath != "next" else max_tracks_config,
+ max_tracks_config,
+ )
+
+ # Root my_wave: fetch up to batch_count batches so Play adds more tracks.
+ # "Load more" always uses a single next batch.
+ max_batches = batch_count if sub_subpath != "next" else 1
+
+ # Reset seen tracks on fresh browse (not "load more")
+ if sub_subpath != "next":
+ self._my_wave_seen_track_ids = set()
+
+ queue: str | int | None = None
+ if sub_subpath == "next":
+ queue = self._my_wave_last_track_id
+ elif sub_subpath:
+ queue = sub_subpath
+
+ all_tracks: list[Track | BrowseFolder] = []
+ last_batch_id: str | None = None
+ first_track_id_this_batch: str | None = None
+ total_track_count = 0
+
+ for _ in range(max_batches):
+ if total_track_count >= effective_limit:
+ break
+
+ yandex_tracks, batch_id = await self.client.get_my_wave_tracks(queue=queue)
+ if batch_id:
+ self._my_wave_batch_id = batch_id
+ last_batch_id = batch_id
+ if not self._my_wave_radio_started_sent and yandex_tracks:
+ sent = await self.client.send_rotor_station_feedback(
+ ROTOR_STATION_MY_WAVE,
+ "radioStarted",
+ batch_id=batch_id,
+ )
+ if sent:
+ self._my_wave_radio_started_sent = True
+ first_track_id_this_batch = None
+ for yt in yandex_tracks:
+ if total_track_count >= effective_limit:
+ break
+
+ track = self._parse_my_wave_track(yt, self._my_wave_seen_track_ids)
+ if track is None:
+ continue
+ all_tracks.append(track)
+ total_track_count += 1
+
+ track_id = track.item_id.split(RADIO_TRACK_ID_SEP, 1)[0]
+ if first_track_id_this_batch is None:
+ first_track_id_this_batch = track_id
+
+ if first_track_id_this_batch is not None:
+ self._my_wave_last_track_id = first_track_id_this_batch
+ if (
+ first_track_id_this_batch is None
+ or not batch_id
+ or not yandex_tracks
+ or total_track_count >= effective_limit
+ ):
+ break
+ queue = first_track_id_this_batch
+
+ # Only show "Load more" if we haven't reached the limit and there's more data
+ if last_batch_id and total_track_count < max_tracks_config:
+ names = self._get_browse_names()
+ next_name = "Ещё" if names is BROWSE_NAMES_RU else "Load more"
+ all_tracks.append(
+ BrowseFolder(
+ item_id="next",
+ provider=self.instance_id,
+ path=f"{path.rstrip('/')}/next",
+ name=next_name,
+ is_playable=False,
+ )
+ )
+ return all_tracks
+
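The batching loop in `_browse_my_wave` above bounds work twice: at most `max_batches` rotor calls, and at most `effective_limit` collected tracks, with early exit when a batch comes back empty. The control flow can be sketched in isolation; `collect_batches` and `fake_fetch` below are illustrative stand-ins, not the provider's API:

```python
def collect_batches(fetch_batch, max_batches: int, effective_limit: int) -> list[str]:
    """Collect up to effective_limit items across at most max_batches fetches,
    stopping early when a fetch returns nothing or the cursor runs out."""
    collected: list[str] = []
    cursor = None
    for _ in range(max_batches):
        if len(collected) >= effective_limit:
            break
        batch, cursor = fetch_batch(cursor)
        for item in batch:
            if len(collected) >= effective_limit:
                break
            collected.append(item)
        if not batch or cursor is None:
            break
    return collected


def fake_fetch(cursor):
    # Stand-in for the rotor client: 5 items per call, integer cursor chains pages
    page = 0 if cursor is None else cursor
    return [f"t{page * 5 + i}" for i in range(5)], page + 1


tracks = collect_batches(fake_fetch, max_batches=3, effective_limit=12)
```

With three batches of five available but a limit of twelve, the third batch is cut short, which mirrors why the real loop checks the limit both before and inside each batch.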
+ def _parse_my_wave_track(self, yt: Any, seen_ids: set[str]) -> Track | None:
+ """Parse a Yandex track into a My Wave Track with composite item_id.
+
+ Extracts the track_id, checks for duplicates in the seen_ids set,
+ sets composite item_id (track_id@station_id), and updates provider_mappings.
+ Callers using shared state must hold _my_wave_lock.
+
+ :param yt: Yandex track object from rotor station response.
+ :param seen_ids: Set of already-seen track IDs to check and update.
+ :return: Parsed Track with composite item_id, or None if duplicate/invalid.
+ """
+ try:
+ t = parse_track(self, yt)
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing My Wave track: %s", err)
+ return None
+
+ track_id = str(yt.id) if hasattr(yt, "id") and yt.id else getattr(yt, "track_id", None)
+ if not track_id:
+ return t
+
+ if track_id in seen_ids:
+ self.logger.debug("Skipping duplicate My Wave track: %s", track_id)
+ return None
+
+ seen_ids.add(track_id)
+ t.item_id = f"{track_id}{RADIO_TRACK_ID_SEP}{ROTOR_STATION_MY_WAVE}"
+ for pm in t.provider_mappings:
+ if pm.provider_instance == self.instance_id:
+ pm.item_id = t.item_id
+ break
+ return t
+
+ @use_cache(3600)
+ async def _validate_tag(self, tag_slug: str) -> bool:
+ """Check if a tag has playlists by calling client.get_tag_playlists().
+
+ :param tag_slug: Tag identifier (e.g. 'chill', '80s').
+ :return: True if the tag has at least one playlist.
+ """
+ try:
+ playlists = await self.client.get_tag_playlists(tag_slug)
+ return len(playlists) > 0
+ except Exception as err:
+ self.logger.debug("Tag validation failed for %s: %s", tag_slug, err)
+ return False
+
+ @use_cache(3600)
+ async def _get_valid_tags_for_category(self, category: str) -> list[str]:
+ """Get validated tags for a category (only those with playlists).
+
+ Combines hardcoded tags from the category lists with any landing-discovered
+ tags, validates each via _validate_tag() (which calls
+ client.get_tag_playlists()), and returns only those with playlists.
+
+ :param category: Category name ('mood', 'activity', 'era', 'genres').
+ :return: List of valid tag slugs.
+ """
+ category_lists: dict[str, list[str]] = {
+ "mood": list(TAG_CATEGORY_MOOD),
+ "activity": list(TAG_CATEGORY_ACTIVITY),
+ "era": list(TAG_CATEGORY_ERA),
+ "genres": list(TAG_CATEGORY_GENRES),
+ }
+ tags = category_lists.get(category, [])
+
+ # Add landing-discovered tags for this category
+ try:
+ landing_tags = await self.client.get_landing_tags()
+ for slug, _title in landing_tags:
+ cat = TAG_SLUG_CATEGORY.get(slug, "mood")
+ if cat == category and slug not in tags:
+ tags.append(slug)
+ except Exception as err:
+ self.logger.debug("Landing tag discovery failed: %s", err)
+
+ # Validate tags in parallel with bounded concurrency
+ sem = asyncio.Semaphore(8)
+
+ async def _check(tag: str) -> str | None:
+ async with sem:
+ return tag if await self._validate_tag(tag) else None
+
+ results = await asyncio.gather(*[_check(tag) for tag in tags])
+ return [tag for tag in results if tag is not None]
+
+ @use_cache(3600)
+ async def _get_discovered_tags(self, locale: str) -> list[tuple[str, str]]:
+ """Get all available tags by combining hardcoded tags with landing discovery.
+
+ Starts with all hardcoded tags from category lists, adds landing-discovered
+ tags, validates each via _validate_tag() (client.get_tag_playlists()), and
+ returns only those with playlists.
+ Results are cached for 1 hour. The locale parameter is included in the cache
+ key so that a locale change invalidates the cached result.
+
+ :param locale: Current metadata locale (used as part of cache key).
+ :return: List of (slug, title) tuples for tags that have playlists.
+ """
+ names = self._get_browse_names()
+
+ # Collect all hardcoded tags (non-seasonal)
+ all_tags: dict[str, str] = {}
+ for slug, cat in TAG_SLUG_CATEGORY.items():
+ if cat != "seasonal":
+ all_tags[slug] = names.get(slug, slug.title())
+
+ # Add landing-discovered tags
+ try:
+ landing_tags = await self.client.get_landing_tags()
+ for slug, title in landing_tags:
+ if slug not in all_tags:
+ all_tags[slug] = title
+ except Exception as err:
+ self.logger.debug("Failed to discover tags from landing API: %s", err)
+
+ # Validate tags in parallel with bounded concurrency
+ sem = asyncio.Semaphore(8)
+
+ async def _check(slug: str) -> bool:
+ async with sem:
+ return await self._validate_tag(slug)
+
+ tag_items = list(all_tags.items())
+ results = await asyncio.gather(*[_check(slug) for slug, _ in tag_items])
+ return [
+ (slug, title) for (slug, title), valid in zip(tag_items, results, strict=True) if valid
+ ]
+
+ async def _get_discovered_tag_slugs(self) -> set[str]:
+ """Get set of all valid tag slugs (cached).
+
+ :return: Set of tag slug strings that have playlists.
+ """
+ discovered = await self._get_discovered_tags(self.mass.metadata.locale or "en_US")
+ return {slug for slug, _title in discovered}
+
+ async def _browse_for_you(
+ self, path: str, path_parts: list[str]
+ ) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
+ """Browse «For You» folder — shows Picks and Mixes sub-folders.
+
+ :param path: Full browse path.
+ :param path_parts: Split path parts after ://.
+ :return: List of sub-folders (Picks, Mixes).
+ """
+ names = self._get_browse_names()
+ # Strip the for_you segment to build child paths that route to picks/mixes
+ # Path format: ...//for_you → child paths should be ...//picks, ...//mixes
+ # We build base from the root (before for_you) by dropping the last segment.
+ base_parts = path.split("//", 1)
+ root_base = (base_parts[0] + "//") if len(base_parts) > 1 else path.rstrip("/") + "/"
+
+ if len(path_parts) == 1:
+ return [
+ BrowseFolder(
+ item_id="picks",
+ provider=self.instance_id,
+ path=f"{root_base}picks",
+ name=names.get("picks", "Picks"),
+ is_playable=False,
+ ),
+ BrowseFolder(
+ item_id="mixes",
+ provider=self.instance_id,
+ path=f"{root_base}mixes",
+ name=names.get("mixes", "Mixes"),
+ is_playable=False,
+ ),
+ ]
+ # Deeper path: delegate to picks or mixes handler via canonical paths
+ return await super().browse(path)
+
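The root-base derivation used by `_browse_for_you` and `_browse_collection` splits the browse path on the `//` marker so that child folders route from the provider root rather than nesting under the current folder. A minimal sketch; the `yandex://` scheme below is only illustrative:

```python
def derive_root_base(path: str) -> str:
    """Derive the provider root ('...//') from a browse path, falling back to a
    trailing slash when the path has no '//' marker."""
    parts = path.split("//", 1)
    return (parts[0] + "//") if len(parts) > 1 else path.rstrip("/") + "/"


# illustrative paths, not the provider's actual URI scheme
base_for_you = derive_root_base("yandex://for_you")
base_plain = derive_root_base("yandex")
```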
+ async def _browse_collection(
+ self, path: str
+ ) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
+ """Browse «Collection» folder — shows library sub-folders (tracks/artists/albums/playlists).
+
+ :param path: Full browse path.
+ :return: List of library sub-folders.
+ """
+ names = self._get_browse_names()
+ base_parts = path.split("//", 1)
+ root_base = (base_parts[0] + "//") if len(base_parts) > 1 else path.rstrip("/") + "/"
+
+ folders: list[BrowseFolder] = []
+ if ProviderFeature.LIBRARY_TRACKS in self.supported_features:
+ folders.append(
+ BrowseFolder(
+ item_id="tracks",
+ provider=self.instance_id,
+ path=f"{root_base}tracks",
+ name=names["tracks"],
+ is_playable=True,
+ )
+ )
if ProviderFeature.LIBRARY_ARTISTS in self.supported_features:
folders.append(
BrowseFolder(
item_id="artists",
provider=self.instance_id,
- path=f"{base}artists",
+ path=f"{root_base}artists",
name=names["artists"],
is_playable=True,
)
BrowseFolder(
item_id="albums",
provider=self.instance_id,
- path=f"{base}albums",
+ path=f"{root_base}albums",
name=names["albums"],
is_playable=True,
)
)
- if ProviderFeature.LIBRARY_TRACKS in self.supported_features:
+ if ProviderFeature.LIBRARY_PLAYLISTS in self.supported_features:
folders.append(
BrowseFolder(
- item_id="tracks",
+ item_id="playlists",
provider=self.instance_id,
- path=f"{base}tracks",
- name=names["tracks"],
+ path=f"{root_base}playlists",
+ name=names["playlists"],
is_playable=True,
)
)
- if ProviderFeature.LIBRARY_PLAYLISTS in self.supported_features:
+ return folders
+
+ async def _browse_picks(
+ self, path: str, path_parts: list[str]
+ ) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
+ """Browse picks folder using hardcoded tags validated against the API.
+
+ Tags are sourced from hardcoded category lists and landing API discovery,
+ then validated via _validate_tag() (client.get_tag_playlists()) to ensure
+ they have playlists.
+ Only categories with at least one valid tag are shown.
+
+ :param path: Full browse path.
+ :param path_parts: Split path parts after ://.
+ :return: List of folders or playlists.
+ """
+ names = self._get_browse_names()
+ base = path.rstrip("/") + "/"
+
+ # Get validated tags
+ discovered = await self._get_discovered_tags(self.mass.metadata.locale or "en_US")
+
+ # Categorize valid tags
+ categorized: dict[str, list[tuple[str, str]]] = {}
+ for slug, title in discovered:
+ cat = TAG_SLUG_CATEGORY.get(slug, "mood")
+ # Skip seasonal tags — they belong in mixes, not picks
+ if cat == "seasonal":
+ continue
+ categorized.setdefault(cat, []).append((slug, title))
+
+ # Sort tags within each category by preferred order
+ for cat, cat_tags in categorized.items():
+ order = TAG_CATEGORY_ORDER.get(cat, [])
+ order_map = {s: i for i, s in enumerate(order)}
+ cat_tags.sort(key=lambda t: order_map.get(t[0], len(order)))
+
+ # picks/ - show category folders (only those with valid tags)
+ if len(path_parts) == 1:
+ category_display_order = ["mood", "activity", "era", "genres"]
+ folders: list[BrowseFolder] = []
+ for cat in category_display_order:
+ if cat in categorized:
+ folders.append(
+ BrowseFolder(
+ item_id=cat,
+ provider=self.instance_id,
+ path=f"{base}{cat}",
+ name=names.get(cat, cat.title()),
+ is_playable=False,
+ )
+ )
+ # Show any extra categories not in the standard order
+ for cat in categorized:
+ if cat not in category_display_order:
+ folders.append(
+ BrowseFolder(
+ item_id=cat,
+ provider=self.instance_id,
+ path=f"{base}{cat}",
+ name=names.get(cat, cat.title()),
+ is_playable=False,
+ )
+ )
+ return folders
+
+ category: str | None = path_parts[1] if len(path_parts) > 1 else None
+ tag: str | None = path_parts[2] if len(path_parts) > 2 else None
+
+ self.logger.debug(
+ "Browse picks: path=%s, category=%s, tag=%s",
+ path,
+ category,
+ tag,
+ )
+
+ # picks/category/ - show valid tag folders for this category
+ if category and not tag:
+ category_tags = categorized.get(category, [])
+ folders = []
+ for slug, title in category_tags:
+ folders.append(
+ BrowseFolder(
+ item_id=slug,
+ provider=self.instance_id,
+ path=f"{base}{slug}",
+ name=names.get(slug, title),
+ is_playable=False,
+ )
+ )
+ self.logger.debug("Returning %d tag folders for category %s", len(folders), category)
+ return folders
+
+ # picks/category/tag - show playlists for the tag
+ if tag:
+ discovered_slugs = {slug for slug, _ in discovered}
+ if tag in discovered_slugs:
+ self.logger.debug("Fetching playlists for tag: %s", tag)
+ return await self._get_tag_playlists_as_browse(tag)
+
+ self.logger.debug("No match found, returning empty list")
+ return []
+
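The preferred-order sort used in `_browse_picks` maps each slug to its index and sends unknown slugs to the end. A standalone sketch with illustrative slugs (the real order lists live in `TAG_CATEGORY_ORDER`):

```python
def sort_by_preferred(
    tags: list[tuple[str, str]], order: list[str]
) -> list[tuple[str, str]]:
    """Sort (slug, title) pairs by a preferred slug order; unknown slugs sort
    last, keeping their relative order (sorted() is stable)."""
    order_map = {slug: i for i, slug in enumerate(order)}
    return sorted(tags, key=lambda t: order_map.get(t[0], len(order)))


# illustrative slugs and titles
tags = [("80s", "80s"), ("chill", "Chill"), ("focus", "Focus")]
ordered = sort_by_preferred(tags, ["chill", "focus"])
```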
+ async def _browse_mixes(
+ self, path: str, path_parts: list[str]
+ ) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
+ """Browse mixes folder (seasonal collections) using hardcoded tags.
+
+ Uses TAG_MIXES directly and validates each tag via _validate_tag()
+ (client.get_tag_playlists()) to check that it has playlists. Does not
+ depend on landing API discovery.
+
+ :param path: Full browse path.
+ :param path_parts: Split path parts after ://.
+ :return: List of folders or playlists.
+ """
+ names = self._get_browse_names()
+ base = path.rstrip("/") + "/"
+
+ # Validate seasonal tags in parallel (no landing dependency)
+ sem = asyncio.Semaphore(5)
+
+ async def _check(tag: str) -> str | None:
+ async with sem:
+ return tag if await self._validate_tag(tag) else None
+
+ results = await asyncio.gather(*[_check(t) for t in TAG_MIXES])
+ available_mixes = [t for t in results if t is not None]
+
+ # mixes/ - show seasonal folders (only valid ones)
+ if len(path_parts) == 1:
+ folders = []
+ for t in available_mixes:
+ folders.append(
+ BrowseFolder(
+ item_id=t,
+ provider=self.instance_id,
+ path=f"{base}{t}",
+ name=names.get(t, t.title()),
+ is_playable=False,
+ )
+ )
+ return folders
+
+ # mixes/tag - show playlists for the tag
+ tag = path_parts[1] if len(path_parts) > 1 else None
+ if tag and tag in TAG_MIXES:
+ return await self._get_tag_playlists_as_browse(tag)
+
+ return []
+
+ def _get_wave_state(self, station_id: str) -> _WaveState:
+ """Get or create per-station wave state.
+
+ :param station_id: Rotor station ID (e.g. 'genre:rock', 'mood:chill').
+ :return: _WaveState instance for this station.
+ """
+ return self._wave_states.setdefault(station_id, _WaveState())
+
+ async def _browse_waves(
+ self, path: str, path_parts: list[str]
+ ) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
+ """Browse waves folder (rotor stations by genre/mood/activity/epoch/local).
+
+ Fetches available stations from the Yandex rotor API and groups them by category.
+
+ :param path: Full browse path.
+ :param path_parts: Split path parts after ://.
+ :return: List of folders or tracks.
+ """
+ names = self._get_browse_names()
+ base = path.rstrip("/") + "/"
+
+ locale = (self.mass.metadata.locale or "en_US").lower()
+ language = "ru" if locale.startswith("ru") else "en"
+
+ all_stations = await self.client.get_wave_stations(language)
+
+ # Group stations by category, preserving image_url
+ categorized: dict[str, list[tuple[str, str, str | None]]] = {}
+ for station_id, cat_key, name, image_url in all_stations:
+ categorized.setdefault(cat_key, []).append((station_id, name, image_url))
+
+ # waves/ — show category folders
+ if len(path_parts) == 1:
+ folders: list[BrowseFolder] = []
+ # Personalized "My Waves" first — only show if dashboard returns stations
+ dashboard_stations = await self._get_dashboard_stations_cached()
+ if dashboard_stations:
+ folders.append(
+ BrowseFolder(
+ item_id=MY_WAVES_FOLDER_ID,
+ provider=self.instance_id,
+ path=f"{base}{MY_WAVES_FOLDER_ID}",
+ name=names.get(MY_WAVES_FOLDER_ID, "My Waves"),
+ is_playable=False,
+ )
+ )
+ # Featured Waves — only show if landing-blocks/waves returns data
+ waves_landing = await self._get_waves_landing_cached()
+ if waves_landing:
+ folders.append(
+ BrowseFolder(
+ item_id=WAVES_LANDING_FOLDER_ID,
+ provider=self.instance_id,
+ path=f"{base}{WAVES_LANDING_FOLDER_ID}",
+ name=names.get(WAVES_LANDING_FOLDER_ID, "Featured Waves"),
+ is_playable=False,
+ )
+ )
+ for cat in WAVE_CATEGORY_DISPLAY_ORDER:
+ if cat in categorized:
+ folders.append(
+ BrowseFolder(
+ item_id=cat,
+ provider=self.instance_id,
+ path=f"{base}{cat}",
+ name=names.get(cat, cat.title()),
+ is_playable=False,
+ )
+ )
+ # Append any categories returned by API that aren't in the predefined order
+ for cat in categorized:
+ if cat not in WAVE_CATEGORY_DISPLAY_ORDER:
+ folders.append(
+ BrowseFolder(
+ item_id=cat,
+ provider=self.instance_id,
+ path=f"{base}{cat}",
+ name=names.get(cat, cat.title()),
+ is_playable=False,
+ )
+ )
+ return folders
+
+ category: str | None = path_parts[1] if len(path_parts) > 1 else None
+ tag: str | None = path_parts[2] if len(path_parts) > 2 else None
+
+ # waves/my_waves/ — show personalized stations from dashboard
+ if category == MY_WAVES_FOLDER_ID and not tag:
+ return await self._browse_my_waves_stations(path)
+
+ # waves/waves_landing/... — redirect to Featured Waves browse
+ if category == WAVES_LANDING_FOLDER_ID:
+ return await self._browse_waves_landing(path, path_parts[1:])
+
+ # waves/my_waves/<tag>[/next] — play a specific personal station
+ # The full station_id has format "genre:allrock", not "my_waves:allrock".
+ # Resolve by matching against dashboard stations cache.
+ if category == MY_WAVES_FOLDER_ID and tag:
+ dashboard_stations = await self._get_dashboard_stations_cached()
+ for sid, _, _ in dashboard_stations:
+ sid_tag = sid.split(":", 1)[1] if ":" in sid else sid
+ if sid_tag == tag:
+ return await self._browse_wave_station(sid, path=path)
+ # Fallback: try tag as direct station_id (e.g. "genre:allrock" passed verbatim)
+ if ":" in tag:
+ return await self._browse_wave_station(tag, path=path)
+ return []
+
+ # waves/<category>/ — show station folders with artwork
+ if category and not tag:
+ cat_stations = categorized.get(category, [])
+ folders = []
+ for station_id, station_name, image_url in cat_stations:
+ tag_part = station_id.split(":", 1)[1] if ":" in station_id else station_id
+ station_image: MediaItemImage | None = None
+ if image_url:
+ station_image = MediaItemImage(
+ type=ImageType.THUMB,
+ path=image_url,
+ provider=self.instance_id,
+ remotely_accessible=True,
+ )
+ folders.append(
+ BrowseFolder(
+ item_id=station_id,
+ provider=self.instance_id,
+ path=f"{base}{tag_part}",
+ name=station_name,
+ is_playable=True,
+ image=station_image,
+ )
+ )
+ return folders
+
+ # waves/<category>/<tag>[/next] — stream tracks from rotor station
+ if category and tag:
+ station_id = f"{category}:{tag}"
+ return await self._browse_wave_station(station_id, path=path)
+
+ return []
+
+ @use_cache(600)
+ async def _get_dashboard_stations_cached(self) -> list[tuple[str, str, str | None]]:
+ """Get personalized dashboard stations, cached for 10 minutes.
+
+ :return: List of (station_id, name, image_url) tuples.
+ """
+ return await self.client.get_dashboard_stations()
+
+ async def _browse_my_waves_stations(self, path: str) -> list[BrowseFolder]:
+ """Browse personalized wave stations from rotor/stations/dashboard.
+
+ Names are resolved from the non-personalized station list so that
+ stations show their actual genre/mood name (e.g. "Рок") rather than
+ the generic "Моя волна" label that the dashboard API returns.
+
+ :param path: Full browse path (used to build sub-paths).
+ :return: List of playable BrowseFolder items, one per station.
+ """
+ stations = await self._get_dashboard_stations_cached()
+
+ # Build a name map from the non-personalized list for proper localized names.
+ locale = (self.mass.metadata.locale or "en_US").lower()
+ language = "ru" if locale.startswith("ru") else "en"
+ all_stations = await self.client.get_wave_stations(language)
+ station_name_map: dict[str, str] = {sid: name for sid, _, name, _ in all_stations}
+
+ base = path.rstrip("/") + "/"
+ folders: list[BrowseFolder] = []
+ for station_id, fallback_name, image_url in stations:
+ # Use full station_id (e.g. "genre:rock") in path to avoid collisions
+ # when two stations share the same tag but differ by category.
+ # The routing fallback (if ":" in tag) handles this correctly.
+ name = station_name_map.get(station_id, fallback_name)
+ station_image: MediaItemImage | None = None
+ if image_url:
+ station_image = MediaItemImage(
+ type=ImageType.THUMB,
+ path=image_url,
+ provider=self.instance_id,
+ remotely_accessible=True,
+ )
folders.append(
BrowseFolder(
- item_id="playlists",
+ item_id=station_id,
provider=self.instance_id,
- path=f"{base}playlists",
- name=names["playlists"],
+ path=f"{base}{station_id}",
+ name=name,
is_playable=True,
+ image=station_image,
+ )
+ )
+ return folders
+
+ async def _browse_wave_station(
+ self, station_id: str, path: str = ""
+ ) -> list[Track | BrowseFolder]:
+ """Browse a rotor wave station and return tracks.
+
+ Fetches tracks from the rotor station, deduplicates within the current session,
+ and sends radioStarted feedback on first call. Appends a "Load more" BrowseFolder
+ at the end so MA can continue fetching the next batch automatically (radio mode).
+
+ :param station_id: Rotor station ID (e.g. 'genre:rock', 'mood:chill').
+ :param path: Current browse path, used to construct the "Load more" next path.
+ :return: List of Track objects with composite item_id (track_id@station_id),
+ followed by a "Load more" BrowseFolder if more tracks are available.
+ """
+ state = self._get_wave_state(station_id)
+ async with state.lock:
+ max_tracks = int(
+ self.config.get_value(CONF_MY_WAVE_MAX_TRACKS) or 150 # type: ignore[arg-type]
+ )
+
+ self.logger.debug(
+ "Browse wave station: station_id=%s path=%s last_track_id=%s",
+ station_id,
+ path,
+ state.last_track_id,
+ )
+ yandex_tracks, batch_id = await self.client.get_rotor_station_tracks(
+ station_id, queue=state.last_track_id
+ )
+ if batch_id:
+ state.batch_id = batch_id
+
+ if not state.radio_started_sent and yandex_tracks:
+ sent = await self.client.send_rotor_station_feedback(
+ station_id,
+ "radioStarted",
+ batch_id=batch_id,
+ )
+ if sent:
+ state.radio_started_sent = True
+
+ tracks: list[Track] = []
+ first_track_id: str | None = None
+ for yt in yandex_tracks:
+ if len(state.seen_track_ids) >= max_tracks:
+ break
+ track = self._parse_my_wave_track(yt, state.seen_track_ids)
+ if track is None:
+ continue
+ # Override station_id in composite item_id to reflect this specific station
+ old_item_id = track.item_id
+ track_id = old_item_id.split(RADIO_TRACK_ID_SEP, 1)[0]
+ track.item_id = f"{track_id}{RADIO_TRACK_ID_SEP}{station_id}"
+ # Keep provider mappings in sync with the new item_id
+ for pm in getattr(track, "provider_mappings", []):
+ if (
+ getattr(pm, "item_id", None) == old_item_id
+ and getattr(pm, "provider_instance", None) == self.instance_id
+ ):
+ pm.item_id = track.item_id
+ if first_track_id is None:
+ first_track_id = track_id
+ tracks.append(track)
+
+ if first_track_id is not None:
+ state.last_track_id = first_track_id
+
+ self.logger.debug(
+ "Wave station %s returned %d tracks: %s",
+ station_id,
+ len(tracks),
+ [t.item_id.split(RADIO_TRACK_ID_SEP, 1)[0] for t in tracks[:5]],
+ )
+ result: list[Track | BrowseFolder] = list(tracks)
+
+ # Append "Load more" sentinel so MA knows to call browse again for next batch.
+ # This mirrors the My Wave mechanism and enables continuous radio playback.
+ if tracks and len(state.seen_track_ids) < max_tracks and path:
+ names = self._get_browse_names()
+ next_name = "Ещё" if names == BROWSE_NAMES_RU else "Load more"
+ # Append /next to the current path (same pattern as _browse_my_wave).
+ # This makes each "Load more" path unique (e.g. /next/next/next...)
+ # so MA never serves a cached result for subsequent presses.
+ result.append(
+ BrowseFolder(
+ item_id="next",
+ provider=self.instance_id,
+ path=f"{path.rstrip('/')}/next",
+ name=next_name,
+ is_playable=False,
+ )
+ )
+
+ return result
+
+ @staticmethod
+ def _extract_wave_item_cover(item: dict[str, Any]) -> tuple[str | None, str | None]:
+ """Extract cover URI and background color from a wave/mix item.
+
+ :param item: Wave or mix item dict from the API.
+ :return: (cover_uri, bg_color) tuple where bg_color is a hex string or None.
+ """
+ agent_uri = item.get("agent", {}).get("cover", {}).get("uri", "")
+ cover_uri = agent_uri or item.get("compact_image_url")
+ bg_color = item.get("colors", {}).get("average")
+ return cover_uri, bg_color
+
+ @use_cache(3600)
+ async def _get_mixes_waves_cached(self) -> list[dict[str, Any]] | None:
+ """Get AI Wave Set data from /landing-blocks/mixes-waves, cached for 1 hour.
+
+ :return: List of mix category dicts from the API, or None on error.
+ """
+ return await self.client.get_mixes_waves()
+
+ @use_cache(3600)
+ async def _get_waves_landing_cached(self) -> list[dict[str, Any]] | None:
+ """Get Featured Waves data from /landing-blocks/waves, cached for 1 hour.
+
+ :return: List of wave category dicts from the API, or None on error.
+ """
+ return await self.client.get_waves_landing()
+
+ async def _browse_waves_landing(
+ self, path: str, path_parts: list[str]
+ ) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
+ """Browse Featured Waves (from /landing-blocks/waves).
+
+ :param path: Full browse path.
+ :param path_parts: Split path parts after ://.
+ :return: List of folders or tracks.
+ """
+ waves_data = await self._get_waves_landing_cached()
+ return await self._browse_wave_categories(
+ path, path_parts, waves_data or [], WAVES_LANDING_FOLDER_ID
+ )
+
+ async def _browse_wave_categories(
+ self,
+ path: str,
+ path_parts: list[str],
+ categories_data: list[dict[str, Any]],
+ id_prefix: str,
+ ) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
+ """Browse wave-like category folders and their station items.
+
+ Shared logic for both wave browse trees (AI Wave Sets and Featured Waves):
+ - Level 1 (e.g. my_waves_set/): category folders
+ - Level 2 (e.g. my_waves_set/ai-sets/): playable station folders with artwork
+ - Level 3+ (e.g. my_waves_set/ai-sets/genre:rock[/next]): track listing
+
+ :param path: Full browse path.
+ :param path_parts: Split path parts after ://.
+ :param categories_data: List of category dicts from the API.
+ :param id_prefix: Prefix for BrowseFolder item_id (e.g. 'my_waves_set').
+ :return: List of folders or tracks.
+ """
+ base = path.rstrip("/") + "/"
+
+ if not categories_data:
+ return []
+
+ # Level 1 → category folders
+ if len(path_parts) == 1:
+ folders: list[BrowseFolder] = []
+ for wave_category in categories_data:
+ cat_id = wave_category.get("id", "")
+ cat_title = wave_category.get("title", "")
+ items = wave_category.get("items", [])
+ if not items or not cat_id:
+ continue
+ display_name = cat_title.capitalize() if cat_title else cat_id.capitalize()
+ folders.append(
+ BrowseFolder(
+ item_id=f"{id_prefix}_{cat_id}",
+ provider=self.instance_id,
+ path=f"{base}{cat_id}",
+ name=display_name,
+ is_playable=False,
+ )
)
- )
- if len(folders) == 1:
- return await self.browse(folders[0].path)
- return folders
+ return folders
+
+ category_id = path_parts[1] if len(path_parts) > 1 else None
+ if not category_id:
+ return []
+
+ # Level 3+ → stream tracks from rotor station
+ if len(path_parts) > 2:
+ station_id = path_parts[2]
+ return await self._browse_wave_station(station_id, path=path)
+
+ # Level 2 → playable station folders with artwork
+ for wave_category in categories_data:
+ if wave_category.get("id") == category_id:
+ items = wave_category.get("items", [])
+ result: list[BrowseFolder] = []
+ for item in items:
+ station_id = item.get("station_id", "")
+ title = item.get("title", "")
+ if not station_id or not title:
+ continue
+ cover_uri, bg_color = self._extract_wave_item_cover(item)
+ image: MediaItemImage | None = None
+ if cover_uri:
+ if cover_uri.startswith("http"):
+ img_url: str = cover_uri.replace("%%", IMAGE_SIZE_MEDIUM)
+ else:
+ raw = get_image_url(cover_uri)
+ img_url = "" if raw is None else raw
+ if img_url:
+ if bg_color:
+ # Append bg_color as URL fragment for cache-key uniqueness.
+ # MA will call resolve_image() to composite the transparent PNG.
+ if len(self._wave_bg_colors) > 200:
+ self._wave_bg_colors.clear()
+ img_url = f"{img_url}#{bg_color.lstrip('#')}"
+ self._wave_bg_colors[img_url] = bg_color
+ image = MediaItemImage(
+ type=ImageType.THUMB,
+ path=img_url,
+ provider=self.instance_id,
+ remotely_accessible=bg_color is None,
+ )
+ result.append(
+ BrowseFolder(
+ item_id=station_id,
+ provider=self.instance_id,
+ path=f"{base}{station_id}",
+ name=title,
+ is_playable=True,
+ image=image,
+ )
+ )
+ return result
+
+ return []
+
+ async def _browse_vibe_sets(
+ self, path: str, path_parts: list[str]
+ ) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
+ """Browse AI Wave Sets (from /landing-blocks/mixes-waves).
+
+ :param path: Full browse path.
+ :param path_parts: Split path parts after ://.
+ :return: List of folders or tracks.
+ """
+ mixes_data = await self._get_mixes_waves_cached()
+ return await self._browse_wave_categories(
+ path, path_parts, mixes_data or [], MY_WAVES_SET_FOLDER_ID
+ )
+
+ @use_cache(600)
+ async def _get_tag_playlists_as_browse(
+ self, tag_id: str
+ ) -> Sequence[MediaItemType | ItemMapping | BrowseFolder]:
+ """Get playlists for a tag and return as browse items.
+
+ :param tag_id: Tag identifier (e.g. 'chill', '80s').
+ :return: List of Playlist objects.
+ """
+ self.logger.debug("Fetching playlists for tag: %s", tag_id)
+ playlists = await self.client.get_tag_playlists(tag_id)
+ self.logger.debug("Got %d playlists for tag %s", len(playlists), tag_id)
+ result: list[Playlist] = []
+ for playlist in playlists:
+ try:
+ result.append(parse_playlist(self, playlist))
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing tag playlist: %s", err)
+ self.logger.debug("Parsed %d playlists for tag %s", len(result), tag_id)
+ return result
# Search
raise MediaNotFoundError(f"Album {prov_album_id} not found")
return parse_album(self, album)
- @use_cache(3600 * 24 * 30)
async def get_track(self, prov_track_id: str) -> Track:
"""Get track details by ID.
Supports composite item_id (track_id@station_id) for My Wave tracks;
- only the track_id part is used for the API.
+ only the track_id part is used for the API. Normalizes the ID before
+ caching to avoid duplicate cache entries.
:param prov_track_id: The provider track ID (or track_id@station_id).
:return: Track object.
:raises MediaNotFoundError: If track not found.
"""
track_id, _ = _parse_radio_item_id(prov_track_id)
+ return await self._get_track_cached(track_id)
+
+ @use_cache(3600 * 24 * 30)
+ async def _get_track_cached(self, track_id: str) -> Track:
+ """Get track details by normalized ID (cached).
+
+ :param track_id: Normalized track ID (without station suffix).
+ :return: Track object.
+ :raises MediaNotFoundError: If track not found.
+ """
yandex_track = await self.client.get_track(track_id)
if not yandex_track:
- raise MediaNotFoundError(f"Track {prov_track_id} not found")
- return parse_track(self, yandex_track)
+ raise MediaNotFoundError(f"Track {track_id} not found")
+
+ # Use the already-fetched track object to avoid a duplicate API call
+ lyrics, lyrics_synced = await self.client.get_track_lyrics_from_track(yandex_track)
+
+ return parse_track(self, yandex_track, lyrics=lyrics, lyrics_synced=lyrics_synced)
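The split into a public normalizer and a cached private method can be sketched as follows; `lru_cache` stands in for the provider's `use_cache` decorator, and the names are illustrative. Normalizing first means `"123"` and `"123@my_wave"` share one cache entry.

```python
# Minimal sketch of "normalize before caching"; lru_cache stands in for
# the provider's use_cache decorator.
from functools import lru_cache

@lru_cache(maxsize=None)
def get_track_cached(track_id: str) -> str:
    return f"track:{track_id}"  # placeholder for the real API fetch

def get_track(prov_track_id: str) -> str:
    """Strip the station suffix so all id variants hit the same cache slot."""
    track_id = prov_track_id.split("@", 1)[0]
    return get_track_cached(track_id)
```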
- @use_cache(3600 * 24 * 30)
async def get_playlist(self, prov_playlist_id: str) -> Playlist:
"""Get playlist details by ID.
- Supports virtual playlist MY_WAVE_PLAYLIST_ID (My Wave). Real playlists
- use format "owner_id:kind".
+ Supports virtual playlists MY_WAVE_PLAYLIST_ID (My Wave) and
+ LIKED_TRACKS_PLAYLIST_ID (Liked Tracks). Real playlists use format "owner_id:kind".
- :param prov_playlist_id: The provider playlist ID (format: "owner_id:kind" or my_wave).
+ :param prov_playlist_id: The provider playlist ID (format: "owner_id:kind",
+ my_wave, or liked_tracks).
:return: Playlist object.
:raises MediaNotFoundError: If playlist not found.
"""
+ # Virtual playlists - not cached (locale-dependent names)
if prov_playlist_id == MY_WAVE_PLAYLIST_ID:
names = self._get_browse_names()
return Playlist(
item_id=MY_WAVE_PLAYLIST_ID,
provider=self.instance_id,
name=names[MY_WAVE_PLAYLIST_ID],
- owner="Yandex Music",
+ owner=get_canonical_provider_name(self),
provider_mappings={
ProviderMapping(
item_id=MY_WAVE_PLAYLIST_ID,
is_editable=False,
)
+ if prov_playlist_id == LIKED_TRACKS_PLAYLIST_ID:
+ names = self._get_browse_names()
+ return Playlist(
+ item_id=LIKED_TRACKS_PLAYLIST_ID,
+ provider=self.instance_id,
+ name=names[LIKED_TRACKS_PLAYLIST_ID],
+ owner=get_canonical_provider_name(self),
+ provider_mappings={
+ ProviderMapping(
+ item_id=LIKED_TRACKS_PLAYLIST_ID,
+ provider_domain=self.domain,
+ provider_instance=self.instance_id,
+ is_unique=True,
+ )
+ },
+ is_editable=False,
+ )
+
+ # Real playlists - use cached method
+ return await self._get_real_playlist(prov_playlist_id)
+
+ @use_cache(3600 * 24 * 30)
+ async def _get_real_playlist(self, prov_playlist_id: str) -> Playlist:
+ """Get real playlist details by ID (cached).
+
+ :param prov_playlist_id: The provider playlist ID (format: "owner_id:kind").
+ :return: Playlist object.
+ :raises MediaNotFoundError: If playlist not found.
+ """
# Parse the playlist ID (format: owner_id:kind)
if PLAYLIST_ID_SPLITTER in prov_playlist_id:
owner_id, kind = prov_playlist_id.split(PLAYLIST_ID_SPLITTER, 1)
async def _get_my_wave_playlist_tracks(self, page: int) -> list[Track]:
"""Get My Wave tracks for virtual playlist (uncached; uses cursor for page > 0).
+ Fetches MY_WAVE_BATCH_SIZE Rotor API batches per page call to reduce
+ the number of round-trips when the player controller paginates through pages.
+
:param page: Page number (0 = first batch, 1+ = next batches via queue cursor).
:return: List of Track objects for this page.
"""
- queue: str | int | None = None
- if page > 0:
- queue = self._my_wave_playlist_next_cursor
- if not queue:
- return []
- yandex_tracks, batch_id = await self.client.get_my_wave_tracks(queue=queue)
- if batch_id:
- self._my_wave_batch_id = batch_id
- if not self._my_wave_radio_started_sent and yandex_tracks:
- self._my_wave_radio_started_sent = True
- await self.client.send_rotor_station_feedback(
- ROTOR_STATION_MY_WAVE,
- "radioStarted",
- batch_id=batch_id,
+ async with self._my_wave_lock:
+ max_tracks_config = int(
+ self.config.get_value(CONF_MY_WAVE_MAX_TRACKS) or 150 # type: ignore[arg-type]
)
- first_track_id_this_batch = None
- tracks = []
- for yt in yandex_tracks:
- try:
- t = parse_track(self, yt)
- track_id = (
- str(yt.id) if hasattr(yt, "id") and yt.id else getattr(yt, "track_id", None)
- )
- if track_id:
+
+ # Reset seen tracks on first page
+ if page == 0:
+ self._my_wave_seen_track_ids = set()
+
+ queue: str | int | None = None
+ if page > 0:
+ queue = self._my_wave_playlist_next_cursor
+ if not queue:
+ return []
+
+ # Check if we've already reached the limit
+ if len(self._my_wave_seen_track_ids) >= max_tracks_config:
+ return []
+
+ tracks: list[Track] = []
+ next_cursor: str | None = None
+
+ # Fetch MY_WAVE_BATCH_SIZE Rotor API batches per page to reduce API round-trips
+ for _ in range(MY_WAVE_BATCH_SIZE):
+ if len(self._my_wave_seen_track_ids) >= max_tracks_config:
+ break
+
+ yandex_tracks, batch_id = await self.client.get_my_wave_tracks(queue=queue)
+ if batch_id:
+ self._my_wave_batch_id = batch_id
+ if not self._my_wave_radio_started_sent and yandex_tracks:
+ sent = await self.client.send_rotor_station_feedback(
+ ROTOR_STATION_MY_WAVE,
+ "radioStarted",
+ batch_id=batch_id,
+ )
+ if sent:
+ self._my_wave_radio_started_sent = True
+
+ if not yandex_tracks:
+ break
+
+ first_track_id_this_batch = None
+ for yt in yandex_tracks:
+ if len(self._my_wave_seen_track_ids) >= max_tracks_config:
+ break
+
+ track = self._parse_my_wave_track(yt, self._my_wave_seen_track_ids)
+ if track is None:
+ continue
+
+ tracks.append(track)
+ track_id = track.item_id.split(RADIO_TRACK_ID_SEP, 1)[0]
if first_track_id_this_batch is None:
first_track_id_this_batch = track_id
- t.item_id = f"{track_id}{RADIO_TRACK_ID_SEP}{ROTOR_STATION_MY_WAVE}"
- for pm in t.provider_mappings:
- if pm.provider_instance == self.instance_id:
- pm.item_id = t.item_id
- break
- tracks.append(t)
- except InvalidDataError as err:
- self.logger.debug("Error parsing My Wave track: %s", err)
- if first_track_id_this_batch is not None:
- self._my_wave_playlist_next_cursor = first_track_id_this_batch
+
+ if first_track_id_this_batch is not None:
+ next_cursor = first_track_id_this_batch
+ queue = first_track_id_this_batch
+ else:
+ # All tracks in this batch were duplicates or failed to parse
+ break
+
+ # Store cursor for next page call (None clears pagination so next call returns [])
+ self._my_wave_playlist_next_cursor = next_cursor
+ return tracks
+
+ async def _get_liked_tracks_playlist_tracks(self, page: int) -> list[Track]:
+ """Get liked tracks for virtual playlist (sorted in reverse chronological order).
+
+ :param page: Page number (0 = full list, capped by config; page > 0 returns [] to end pagination).
+ :return: List of Track objects.
+ """
+ # Liked tracks API returns all tracks at once, so only return tracks on page 0
+ if page > 0:
+ return []
+
+ max_tracks_config = int(
+ self.config.get_value(CONF_LIKED_TRACKS_MAX_TRACKS) or 500 # type: ignore[arg-type]
+ )
+
+ # Fetch liked tracks (already sorted in reverse chronological order by api_client)
+ track_shorts = await self.client.get_liked_tracks()
+ if not track_shorts:
+ self.logger.debug("No liked tracks found")
+ return []
+
+ # Apply max tracks limit
+ track_shorts = track_shorts[:max_tracks_config]
+
+ # Fetch full track details in batches
+ track_ids = [str(ts.track_id) for ts in track_shorts if ts.track_id]
+
+ batch_size = TRACK_BATCH_SIZE
+ full_tracks = []
+ for i in range(0, len(track_ids), batch_size):
+ batch_ids = track_ids[i : i + batch_size]
+ batch_result = await self.client.get_tracks(batch_ids)
+ full_tracks.extend(batch_result)
+
+ # Map track IDs to their full track objects
+ track_map: dict[str, Any] = {}
+ for t in full_tracks:
+ if hasattr(t, "id") and t.id:
+ track_map[str(t.id)] = t
+
+ # Parse tracks in the original order (reverse chronological)
+ tracks = []
+ for track_id in track_ids:
+ # track_id may be compound "trackId:albumId", extract base ID for lookup
+ base_id = track_id.split(":")[0] if ":" in track_id else track_id
+ found = track_map.get(track_id) or track_map.get(base_id)
+ if found:
+ try:
+ tracks.append(parse_track(self, found))
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing liked track %s: %s", track_id, err)
+
+ self.logger.debug("Liked tracks: fetched %s, parsed %s", len(track_shorts), len(tracks))
return tracks
# Get related items
self.logger.debug("Error parsing similar track: %s", err)
return tracks
- @use_cache(3600 * 3)
async def recommendations(self) -> list[RecommendationFolder]:
- """Get recommendations; includes My Wave (Моя волна) as first folder.
+ """Get recommendations with multiple discovery folders.
- :return: List of recommendation folders (My Wave with first batch of tracks).
+ Returns My Wave, Feed (Made for You), Chart, New Releases, and
+ New Playlists sections.
+
+ :return: List of recommendation folders.
"""
- names = self._get_browse_names()
- yandex_tracks, _ = await self.client.get_my_wave_tracks(queue=None)
+ folders: list[RecommendationFolder] = []
+
+ folder = await self._get_my_wave_recommendations()
+ if folder:
+ folders.append(folder)
+
+ folder = await self._get_feed_recommendations()
+ if folder:
+ folders.append(folder)
+
+ folder = await self._get_chart_recommendations()
+ if folder:
+ folders.append(folder)
+
+ folder = await self._get_new_releases_recommendations()
+ if folder:
+ folders.append(folder)
+
+ folder = await self._get_new_playlists_recommendations()
+ if folder:
+ folders.append(folder)
+
+ # Picks & Mixes recommendations
+ folder = await self._get_top_picks_recommendations()
+ if folder:
+ folders.append(folder)
+
+ # Mood mix: select tag outside cache so rotation actually works
+ mood_tag = await self._pick_random_tag_for_category("mood")
+ if mood_tag:
+ folder = await self._get_mood_mix_recommendations(mood_tag)
+ if folder:
+ folders.append(folder)
+
+ # Activity mix: select tag outside cache so rotation actually works
+ activity_tag = await self._pick_random_tag_for_category("activity")
+ if activity_tag:
+ folder = await self._get_activity_mix_recommendations(activity_tag)
+ if folder:
+ folders.append(folder)
+
+ folder = await self._get_seasonal_mix_recommendations()
+ if folder:
+ folders.append(folder)
+
+ return folders
+
+ @use_cache(600)
+ async def _get_my_wave_recommendations(self) -> RecommendationFolder | None:
+ """Get My Wave recommendation folder with personalized tracks.
+
+ :return: RecommendationFolder with My Wave tracks, or None if empty.
+ """
+ max_tracks_config = int(
+ self.config.get_value(CONF_MY_WAVE_MAX_TRACKS) or 150 # type: ignore[arg-type]
+ )
+ batch_size_config = MY_WAVE_BATCH_SIZE
+
+ seen_track_ids: set[str] = set()
items: list[Track] = []
- for yt in yandex_tracks:
+ queue: str | int | None = None
+
+ for _ in range(batch_size_config):
+ if len(seen_track_ids) >= max_tracks_config:
+ break
+
+ yandex_tracks, _ = await self.client.get_my_wave_tracks(queue=queue)
+ if not yandex_tracks:
+ break
+
+ first_track_id_this_batch = None
+ for yt in yandex_tracks:
+ if len(seen_track_ids) >= max_tracks_config:
+ break
+
+ track = self._parse_my_wave_track(yt, seen_ids=seen_track_ids)
+ if track is None:
+ continue
+
+ items.append(track)
+ track_id = track.item_id.split(RADIO_TRACK_ID_SEP, 1)[0]
+ if first_track_id_this_batch is None:
+ first_track_id_this_batch = track_id
+
+ queue = first_track_id_this_batch
+ if not queue:
+ break
+
+ if not items:
+ return None
+
+ items = items[:DISCOVERY_INITIAL_TRACKS]
+
+ names = self._get_browse_names()
+ return RecommendationFolder(
+ item_id=MY_WAVE_PLAYLIST_ID,
+ provider=self.instance_id,
+ name=names[MY_WAVE_PLAYLIST_ID],
+ items=UniqueList(items),
+ icon="mdi-waveform",
+ )
+
+ @use_cache(1800)
+ async def _get_feed_recommendations(self) -> RecommendationFolder | None:
+ """Get personalized feed playlists (Playlist of the Day, DejaVu, etc.).
+
+ :return: RecommendationFolder with generated playlists, or None if unavailable.
+ """
+ feed = await self.client.get_feed()
+ if not feed or not feed.generated_playlists:
+ return None
+ items: list[Playlist] = []
+ for gen_playlist in feed.generated_playlists:
+ if gen_playlist.data and gen_playlist.ready:
+ try:
+ items.append(parse_playlist(self, gen_playlist.data))
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing feed playlist: %s", err)
+ if not items:
+ return None
+ names = self._get_browse_names()
+ return RecommendationFolder(
+ item_id="feed",
+ provider=self.instance_id,
+ name=names["feed"],
+ items=UniqueList(items),
+ icon="mdi-account-music",
+ )
+
+ @use_cache(3600)
+ async def _get_chart_recommendations(self) -> RecommendationFolder | None:
+ """Get chart tracks (hot tracks of the month).
+
+ :return: RecommendationFolder with chart tracks, or None if unavailable.
+ """
+ chart_info = await self.client.get_chart()
+ if not chart_info or not chart_info.chart:
+ return None
+ playlist = chart_info.chart
+ if not playlist.tracks:
+ return None
+ # TrackShort objects in chart context have .track (full Track) and .chart (position)
+ tracks: list[Track] = []
+ for track_short in playlist.tracks[:20]:
+ track_obj = getattr(track_short, "track", None)
+ if not track_obj:
+ continue
try:
- t = parse_track(self, yt)
- track_id = (
- str(yt.id) if hasattr(yt, "id") and yt.id else getattr(yt, "track_id", None)
- )
- if track_id:
- t.item_id = f"{track_id}{RADIO_TRACK_ID_SEP}{ROTOR_STATION_MY_WAVE}"
- for pm in t.provider_mappings:
- if pm.provider_instance == self.instance_id:
- pm.item_id = t.item_id
- break
- items.append(t)
+ tracks.append(parse_track(self, track_obj))
except InvalidDataError as err:
- self.logger.debug("Error parsing My Wave track for recommendations: %s", err)
- return [
- RecommendationFolder(
- item_id=MY_WAVE_PLAYLIST_ID,
- provider=self.instance_id,
- name=names[MY_WAVE_PLAYLIST_ID],
- items=UniqueList(items),
- icon="mdi-waveform",
- )
+ self.logger.debug("Error parsing chart track: %s", err)
+ if not tracks:
+ return None
+ names = self._get_browse_names()
+ return RecommendationFolder(
+ item_id="chart",
+ provider=self.instance_id,
+ name=names["chart"],
+ items=UniqueList(tracks),
+ icon="mdi-chart-line",
+ )
+
+ @use_cache(3600)
+ async def _get_new_releases_recommendations(self) -> RecommendationFolder | None:
+ """Get new album releases.
+
+ :return: RecommendationFolder with new albums, or None if unavailable.
+ """
+ releases = await self.client.get_new_releases()
+ if not releases or not releases.new_releases:
+ return None
+ # new_releases is a list of album IDs (int); batch-fetch full details
+ album_ids = [str(aid) for aid in releases.new_releases[:20]]
+ if not album_ids:
+ return None
+ full_albums = await self.client.get_albums(album_ids)
+ if not full_albums:
+ return None
+ albums: list[Album] = []
+ for album in full_albums:
+ try:
+ albums.append(parse_album(self, album))
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing new release album: %s", err)
+ if not albums:
+ return None
+ names = self._get_browse_names()
+ return RecommendationFolder(
+ item_id="new_releases",
+ provider=self.instance_id,
+ name=names["new_releases"],
+ items=UniqueList(albums),
+ icon="mdi-new-box",
+ )
+
+ @use_cache(3600)
+ async def _get_new_playlists_recommendations(self) -> RecommendationFolder | None:
+ """Get new editorial playlists.
+
+ :return: RecommendationFolder with new playlists, or None if unavailable.
+ """
+ result = await self.client.get_new_playlists()
+ if not result or not result.new_playlists:
+ return None
+ # new_playlists is a list of PlaylistId objects (uid, kind); fetch full details
+ playlist_ids = [
+ f"{pid.uid}:{pid.kind}"
+ for pid in result.new_playlists[:20]
+ if hasattr(pid, "uid") and hasattr(pid, "kind")
]
+ if not playlist_ids:
+ return None
+ full_playlists = await self.client.get_playlists(playlist_ids)
+ if not full_playlists:
+ return None
+ playlists: list[Playlist] = []
+ for playlist in full_playlists:
+ try:
+ playlists.append(parse_playlist(self, playlist))
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing new playlist: %s", err)
+ if not playlists:
+ return None
+ names = self._get_browse_names()
+ return RecommendationFolder(
+ item_id="new_playlists",
+ provider=self.instance_id,
+ name=names["new_playlists"],
+ items=UniqueList(playlists),
+ icon="mdi-playlist-star",
+ )
+
+ @use_cache(3600)
+ async def _get_top_picks_recommendations(self) -> RecommendationFolder | None:
+ """Get Top Picks recommendation folder (tag: top).
+
+ :return: RecommendationFolder with top playlists, or None if unavailable.
+ """
+ playlists = await self.client.get_tag_playlists("top")
+ if not playlists:
+ return None
+ items: list[Playlist] = []
+ for playlist in playlists[:10]:
+ try:
+ items.append(parse_playlist(self, playlist))
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing top picks playlist: %s", err)
+ if not items:
+ return None
+ names = self._get_browse_names()
+ return RecommendationFolder(
+ item_id="top_picks",
+ provider=self.instance_id,
+ name=names.get("top_picks", "Top Picks"),
+ items=UniqueList(items),
+ icon="mdi-star",
+ )
+
+ async def _pick_random_tag_for_category(self, category: str) -> str | None:
+ """Pick a random valid tag for a category (not cached — enables rotation).
+
+ :param category: Category name ('mood', 'activity', etc.).
+ :return: Random tag slug, or None if no valid tags.
+ """
+ valid_tags = await self._get_valid_tags_for_category(category)
+ if not valid_tags:
+ return None
+ return random.choice(valid_tags)
+
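The "pick the tag outside the cache" design can be demonstrated in miniature: the folder builder is memoized per tag, so a fresh random pick still rotates between refreshes while repeated picks of the same tag reuse the cached folder. `lru_cache` stands in for the provider's `use_cache` decorator; names are illustrative.

```python
# Why the random tag pick lives outside the cache: memoization is keyed on
# the tag argument, so rotation survives caching.
import random
from functools import lru_cache

@lru_cache(maxsize=None)
def build_mood_folder(mood_tag: str) -> str:
    return f"folder:{mood_tag}"  # placeholder for the real API work

def refresh(tags: list[str]) -> str:
    mood_tag = random.choice(tags)   # uncached: rotates on every call
    return build_mood_folder(mood_tag)  # cached per distinct tag
```

Had the random choice been made *inside* the cached function, the first pick would be frozen for the whole cache lifetime.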
+ @use_cache(1800)
+ async def _get_mood_mix_recommendations(self, mood_tag: str) -> RecommendationFolder | None:
+ """Get Mood Mix recommendation folder for a specific tag.
+
+ :param mood_tag: Pre-selected mood tag slug.
+ :return: RecommendationFolder with mood playlists, or None if unavailable.
+ """
+ playlists = await self.client.get_tag_playlists(mood_tag)
+ if not playlists:
+ self.logger.debug("No playlists for mood tag %s, skipping recommendation", mood_tag)
+ return None
+ items: list[Playlist] = []
+ for playlist in playlists[:8]:
+ try:
+ items.append(parse_playlist(self, playlist))
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing mood playlist: %s", err)
+ if not items:
+ return None
+ names = self._get_browse_names()
+ tag_name = names.get(mood_tag, mood_tag.title())
+ return RecommendationFolder(
+ item_id="mood_mix",
+ provider=self.instance_id,
+ name=f"{names.get('mood_mix', 'Mood')}: {tag_name}",
+ items=UniqueList(items),
+ icon="mdi-emoticon-outline",
+ )
+
+ @use_cache(1800)
+ async def _get_activity_mix_recommendations(
+ self, activity_tag: str
+ ) -> RecommendationFolder | None:
+ """Get Activity Mix recommendation folder for a specific tag.
+
+ :param activity_tag: Pre-selected activity tag slug.
+ :return: RecommendationFolder with activity playlists, or None if unavailable.
+ """
+ playlists = await self.client.get_tag_playlists(activity_tag)
+ if not playlists:
+ self.logger.debug(
+ "No playlists for activity tag %s, skipping recommendation", activity_tag
+ )
+ return None
+ items: list[Playlist] = []
+ for playlist in playlists[:8]:
+ try:
+ items.append(parse_playlist(self, playlist))
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing activity playlist: %s", err)
+ if not items:
+ return None
+ names = self._get_browse_names()
+ tag_name = names.get(activity_tag, activity_tag.title())
+ return RecommendationFolder(
+ item_id="activity_mix",
+ provider=self.instance_id,
+ name=f"{names.get('activity_mix', 'Activity')}: {tag_name}",
+ items=UniqueList(items),
+ icon="mdi-run",
+ )
+
+ @use_cache(3600 * 6)
+ async def _get_seasonal_mix_recommendations(self) -> RecommendationFolder | None:
+ """Get Seasonal Mix recommendation folder (based on current month).
+
+ :return: RecommendationFolder with seasonal playlists, or None if unavailable.
+ """
+ # Determine current season tag
+ current_month = datetime.now(tz=UTC).month
+ seasonal_tag = TAG_SEASONAL_MAP.get(current_month, "autumn")
+
+ # Validate the seasonal tag; fall back to autumn if not available
+ if not await self._validate_tag(seasonal_tag):
+ seasonal_tag = "autumn"
+
+ playlists = await self.client.get_tag_playlists(seasonal_tag)
+ if not playlists:
+ return None
+ items: list[Playlist] = []
+ for playlist in playlists[:8]:
+ try:
+ items.append(parse_playlist(self, playlist))
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing seasonal playlist: %s", err)
+ if not items:
+ return None
+ names = self._get_browse_names()
+ tag_name = names.get(seasonal_tag, seasonal_tag.title())
+ return RecommendationFolder(
+ item_id="seasonal_mix",
+ provider=self.instance_id,
+ name=f"{names.get('seasonal_mix', 'Seasonal')}: {tag_name}",
+ items=UniqueList(items),
+ icon="mdi-weather-sunny",
+ )
@use_cache(3600 * 3)
async def get_playlist_tracks(self, prov_playlist_id: str, page: int = 0) -> list[Track]:
"""Get playlist tracks.
- :param prov_playlist_id: The provider playlist ID (format: "owner_id:kind" or my_wave).
+ :param prov_playlist_id: The provider playlist ID (format: "owner_id:kind",
+ my_wave, or liked_tracks).
:param page: Page number for pagination.
:return: List of Track objects.
"""
+ self.logger.debug(
+ "get_playlist_tracks called: prov_playlist_id=%s, page=%s", prov_playlist_id, page
+ )
+
if prov_playlist_id == MY_WAVE_PLAYLIST_ID:
+ self.logger.debug("Fetching My Wave tracks")
return await self._get_my_wave_playlist_tracks(page)
+ if prov_playlist_id == LIKED_TRACKS_PLAYLIST_ID:
+ self.logger.debug("Fetching Liked Tracks for virtual playlist")
+ result = await self._get_liked_tracks_playlist_tracks(page)
+ self.logger.debug("Liked Tracks playlist returned %s tracks", len(result))
+ return result
+
# Yandex Music API returns all playlist tracks in one call (no server-side pagination).
# Return empty list for page > 0 so the controller pagination loop terminates.
if page > 0:
return []
# Fetch full track details in batches to avoid timeouts
- batch_size = 50
+ batch_size = TRACK_BATCH_SIZE
full_tracks = []
for i in range(0, len(track_ids), batch_size):
batch = track_ids[i : i + batch_size]
async def get_library_albums(self) -> AsyncGenerator[Album, None]:
"""Retrieve library albums from Yandex Music."""
- albums = await self.client.get_liked_albums()
+ batch_size = TRACK_BATCH_SIZE
+ albums = await self.client.get_liked_albums(batch_size=batch_size)
for album in albums:
try:
yield parse_album(self, album)
# Fetch full track details in batches
track_ids = [str(ts.track_id) for ts in track_shorts if ts.track_id]
- batch_size = 50
+ batch_size = TRACK_BATCH_SIZE
for i in range(0, len(track_ids), batch_size):
batch_ids = track_ids[i : i + batch_size]
full_tracks = await self.client.get_tracks(batch_ids)
async def get_library_playlists(self) -> AsyncGenerator[Playlist, None]:
"""Retrieve library playlists from Yandex Music.
- Includes the virtual My Wave playlist first, then user playlists.
+ Includes the virtual My Wave and Liked Tracks playlists first, then user-created
+ playlists and user-liked editorial playlists (returned by a separate API endpoint).
"""
yield await self.get_playlist(MY_WAVE_PLAYLIST_ID)
+ yield await self.get_playlist(LIKED_TRACKS_PLAYLIST_ID)
+ seen_ids: set[str] = set()
+ # User-created playlists
playlists = await self.client.get_user_playlists()
for playlist in playlists:
try:
- yield parse_playlist(self, playlist)
+ parsed = parse_playlist(self, playlist)
+ seen_ids.add(parsed.item_id)
+ yield parsed
except InvalidDataError as err:
self.logger.debug("Error parsing library playlist: %s", err)
+ # User-liked editorial playlists (not in users_playlists_list)
+ liked_playlists = await self.client.get_liked_playlists()
+ for playlist in liked_playlists:
+ try:
+ parsed = parse_playlist(self, playlist)
+ if parsed.item_id not in seen_ids:
+ seen_ids.add(parsed.item_id)
+ yield parsed
+ except InvalidDataError as err:
+ self.logger.debug("Error parsing liked playlist: %s", err)
# Library edit methods
"""
return await self.streaming.get_stream_details(item_id)
+ async def get_audio_stream(
+ self, streamdetails: StreamDetails, seek_position: int = 0
+ ) -> AsyncGenerator[bytes, None]:
+ """Return the audio stream for the provider item.
+
+ This method is called when StreamType.CUSTOM is used, enabling on-the-fly
+ decryption of encrypted FLAC streams without disk I/O.
+
+ :param streamdetails: Stream details containing encrypted URL and decryption key.
+ :param seek_position: Seek position in seconds (not supported for encrypted streams).
+ :return: Async generator yielding decrypted audio chunks.
+ """
+ async for chunk in self.streaming.get_audio_stream(streamdetails, seek_position):
+ yield chunk
+
+ async def resolve_image(self, path: str) -> str | bytes:
+ """Resolve wave cover image with background color fill for transparent PNGs.
+
+ If the image URL has an associated background color (stored in _wave_bg_colors),
+ downloads the PNG from Yandex CDN and composites it on a solid color background
+ using Pillow, returning JPEG bytes. Falls back to the original URL on any error.
+
+ :param path: Image URL (may include #rrggbb fragment used as cache key).
+ :return: Composited JPEG bytes, or original path string as fallback.
+ """
+ bg_color = self._wave_bg_colors.get(path)
+ if not bg_color:
+ return path
+
+ # Strip the #color fragment before fetching the actual image
+ fetch_url = path.split("#", maxsplit=1)[0]
+ try:
+ async with self.mass.http_session.get(fetch_url) as resp:
+ resp.raise_for_status()
+ raw = await resp.read()
+ except Exception as err:
+ self.logger.debug("Failed to fetch wave cover %s: %s", fetch_url, err)
+ return fetch_url
+
+ def _composite() -> bytes:
+ bg_clean = bg_color.lstrip("#")
+ try:
+ r = int(bg_clean[0:2], 16)
+ g = int(bg_clean[2:4], 16)
+ b = int(bg_clean[4:6], 16)
+ except (ValueError, IndexError):
+ return raw
+ fg = PilImage.open(BytesIO(raw)).convert("RGBA")
+ bg = PilImage.new("RGBA", fg.size, (r, g, b, 255))
+ bg.paste(fg, mask=fg)
+ out = BytesIO()
+ bg.convert("RGB").save(out, "JPEG", quality=92)
+ return out.getvalue()
+
+ try:
+ return await asyncio.to_thread(_composite)
+ except Exception as err:
+ self.logger.debug("Wave cover composite failed for %s: %s", fetch_url, err)
+ return fetch_url
+
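The compositing step above leans on the standard alpha-over formula that Pillow's `paste` applies per channel. A stdlib-only sketch of just that formula (no Pillow; pixel values are hypothetical) shows what the background fill does for transparent wave covers:

```python
def alpha_over(fg: tuple[int, int, int, int], bg: tuple[int, int, int]) -> tuple[int, int, int]:
    """Composite one RGBA foreground pixel over an opaque RGB background:
    out = fg * alpha + bg * (1 - alpha), per channel."""
    r, g, b, a = fg
    alpha = a / 255
    return tuple(round(c_fg * alpha + c_bg * (1 - alpha)) for c_fg, c_bg in zip((r, g, b), bg))

# Fully transparent pixel: the solid background color shows through unchanged
print(alpha_over((10, 20, 30, 0), (200, 100, 50)))    # (200, 100, 50)
# Fully opaque pixel: the foreground wins
print(alpha_over((10, 20, 30, 255), (200, 100, 50)))  # (10, 20, 30)
# ~50% alpha over white: midpoint blend
print(alpha_over((0, 0, 0, 127), (255, 255, 255)))    # (128, 128, 128)
```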
async def on_played(
self,
media_type: MediaType,
Sends trackStarted when the track is currently playing (is_playing=True).
trackFinished/skip are sent from on_streamed to use accurate seconds_streamed.
+
+ Also auto-enables "Don't stop the music" for any queue playing a radio track
+ so that MA refills the queue via get_similar_tracks when < 5 tracks remain.
"""
+ # Rotor feedback is always reported for radio tracks (no opt-out setting).
if media_type != MediaType.TRACK:
return
track_id, station_id = _parse_radio_item_id(prov_item_id)
if not station_id:
return
+ # Auto-enable "Don't stop the music" on every on_played call for radio tracks.
+ # Calling on every invocation (not just is_playing=True) ensures it fires even
+ # for short tracks that finish before the 30-second periodic callback.
+ self._ensure_dont_stop_the_music(prov_item_id)
if is_playing:
+ if station_id == ROTOR_STATION_MY_WAVE:
+ batch_id = self._my_wave_batch_id
+ else:
+ state = self._wave_states.get(station_id)
+ batch_id = state.batch_id if state else None
await self.client.send_rotor_station_feedback(
station_id,
"trackStarted",
track_id=track_id,
- batch_id=self._my_wave_batch_id,
+ batch_id=batch_id,
)
+
+ def _ensure_dont_stop_the_music(self, prov_item_id: str) -> None:
+ """Enable 'Don't stop the music' on queues playing this specific radio item.
+
+ Iterates all queues and enables the setting on queues whose current track
+ mapping matches this exact composite item_id (track_id@station_id) for this
+ provider instance.
+
+ Also sets queue.radio_source directly to the current track because
+ enqueued_media_items is empty for BrowseFolder-initiated playback, which
+ normally prevents MA's auto-fill from triggering. Setting radio_source
+ directly bypasses that gap so _fill_radio_tracks runs when < 5 tracks remain.
+ """
+ for queue in self.mass.player_queues:
+ current = queue.current_item
+ if current is None or current.media_item is None:
+ continue
+ item = current.media_item
+ # Match by provider instance and exact composite item_id
+ for mapping in getattr(item, "provider_mappings", []):
+ if (
+ mapping.provider_instance == self.instance_id
+ and mapping.item_id == prov_item_id
+ ):
+ # Set radio_source directly so MA's fill mechanism works even when
+ # the queue was started from a BrowseFolder (enqueued_media_items empty).
+ if not queue.radio_source and isinstance(item, Track):
+ queue.radio_source = [item]
+ if not queue.dont_stop_the_music_enabled:
+ try:
+ self.mass.player_queues.set_dont_stop_the_music(
+ queue.queue_id, dont_stop_the_music_enabled=True
+ )
+ self.logger.info(
+ "Auto-enabled 'Don't stop the music' for queue %s (radio station)",
+ queue.display_name,
+ )
+ except Exception as err:
+ self.logger.debug(
+ "Could not enable 'Don't stop the music' for queue %s: %s",
+ queue.display_name,
+ err,
+ )
+ break
+
+ def _ensure_dont_stop_the_music_for_queue(self, queue_id: str | None) -> None:
+ """Enable 'Don't stop the music' for a specific queue by ID.
+
+ Faster variant of _ensure_dont_stop_the_music used from on_streamed where
+ queue_id is available directly, avoiding iteration over all queues.
+ """
+ if not queue_id:
+ return
+ queue = self.mass.player_queues.get(queue_id)
+ if queue is None:
+ return
+ current = queue.current_item
+ if current is None or current.media_item is None:
+ return
+ item = current.media_item
+ for mapping in getattr(item, "provider_mappings", []):
+ if (
+ mapping.provider_instance == self.instance_id
+ and RADIO_TRACK_ID_SEP in mapping.item_id
+ ):
+ if not queue.radio_source and isinstance(item, Track):
+ queue.radio_source = [item]
+ if not queue.dont_stop_the_music_enabled:
+ try:
+ self.mass.player_queues.set_dont_stop_the_music(
+ queue_id, dont_stop_the_music_enabled=True
+ )
+ self.logger.info(
+ "Auto-enabled 'Don't stop the music' for queue %s (radio)",
+ queue.display_name,
+ )
+ except Exception as err:
+ self.logger.debug(
+ "Could not enable 'Don't stop the music' for queue %s: %s",
+ queue.display_name,
+ err,
+ )
+ break
async def on_streamed(self, streamdetails: StreamDetails) -> None:
"""Report stream completion for My Wave rotor feedback.
Sends trackFinished or skip with actual seconds_streamed so Yandex
can improve recommendations.
"""
+ # Rotor feedback is always reported for radio tracks (no opt-out setting).
track_id, station_id = _parse_radio_item_id(streamdetails.item_id)
if not station_id:
return
+ # Also ensure "Don't stop the music" is active: on_streamed fires even for
+ # very short tracks, and queue_id is available here directly.
+ self._ensure_dont_stop_the_music_for_queue(streamdetails.queue_id)
seconds = int(streamdetails.seconds_streamed or 0)
duration = streamdetails.duration or 0
feedback_type = "trackFinished" if duration and seconds >= max(0, duration - 10) else "skip"
+ if station_id == ROTOR_STATION_MY_WAVE:
+ batch_id = self._my_wave_batch_id
+ else:
+ state = self._wave_states.get(station_id)
+ batch_id = state.batch_id if state else None
await self.client.send_rotor_station_feedback(
station_id,
feedback_type,
track_id=track_id,
total_played_seconds=seconds,
- batch_id=self._my_wave_batch_id,
+ batch_id=batch_id,
)
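The finished-versus-skip decision above hinges on one inequality with a 10-second tolerance. Isolating it as a helper makes the rule explicit (a sketch; the function name is hypothetical, the threshold mirrors the code above):

```python
def classify_playback(seconds_streamed: int, duration: int) -> str:
    """Mirror the rotor feedback rule: a track counts as finished when playback
    reached within 10 seconds of its full duration; otherwise it is a skip.
    Unknown duration (0) is always reported as a skip."""
    if duration and seconds_streamed >= max(0, duration - 10):
        return "trackFinished"
    return "skip"

print(classify_playback(200, 205))  # trackFinished (within the 10s tolerance)
print(classify_playback(30, 205))   # skip
print(classify_playback(5, 8))      # trackFinished (threshold clamps to 0 for short tracks)
print(classify_playback(300, 0))    # skip (duration unknown)
```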
from __future__ import annotations
+import asyncio
+from collections.abc import AsyncGenerator
from typing import TYPE_CHECKING, Any
+from aiohttp import ClientPayloadError
+from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from music_assistant_models.enums import ContentType, StreamType
from music_assistant_models.errors import MediaNotFoundError
from music_assistant_models.media_items import AudioFormat
from music_assistant_models.streamdetails import StreamDetails
-from .constants import CONF_QUALITY, QUALITY_LOSSLESS, RADIO_TRACK_ID_SEP
+from .constants import (
+ CONF_QUALITY,
+ QUALITY_EFFICIENT,
+ QUALITY_HIGH,
+ QUALITY_SUPERB,
+ RADIO_TRACK_ID_SEP,
+)
if TYPE_CHECKING:
from yandex_music import DownloadInfo
quality = self.provider.config.get_value(CONF_QUALITY)
quality_str = str(quality) if quality is not None else None
preferred_normalized = (quality_str or "").strip().lower()
- want_lossless = (
- QUALITY_LOSSLESS in preferred_normalized or preferred_normalized == QUALITY_LOSSLESS
- )
+
+ # Superb is the lossless tier; also accept the legacy "lossless" config value
+ want_lossless = preferred_normalized in (QUALITY_SUPERB, "superb", "lossless")
# When user wants lossless, try get-file-info first (FLAC; download-info often MP3 only)
if want_lossless:
if file_info:
url = file_info.get("url")
codec = file_info.get("codec") or ""
+ needs_decryption = file_info.get("needs_decryption", False)
+
if url and codec.lower() in ("flac", "flac-mp4"):
- content_type = self._get_content_type(codec)
+ audio_format = self._build_audio_format(codec)
+
+ # Handle encrypted URLs from encraw transport
+ if needs_decryption and "key" in file_info:
+ self.logger.info(
+ "Streaming encrypted %s for track %s - will decrypt on-the-fly",
+ codec,
+ track_id,
+ )
+ # Return StreamType.CUSTOM for streaming decryption.
+ # can_seek=False: provider always streams from position 0;
+ # allow_seek=True: ffmpeg handles seek with -ss input flag.
+ return StreamDetails(
+ item_id=item_id,
+ provider=self.provider.instance_id,
+ audio_format=audio_format,
+ stream_type=StreamType.CUSTOM,
+ duration=track.duration,
+ data={
+ "encrypted_url": url,
+ "decryption_key": file_info["key"],
+ "codec": codec,
+ },
+ can_seek=False,
+ allow_seek=True,
+ )
+ # Unencrypted URL, use directly
self.logger.debug(
- "Stream selected for track %s via get-file-info: codec=%s",
+ "Unencrypted stream for track %s: codec=%s",
item_id,
codec,
)
return StreamDetails(
item_id=item_id,
provider=self.provider.instance_id,
- audio_format=AudioFormat(
- content_type=content_type,
- bit_rate=0,
- ),
+ audio_format=audio_format,
stream_type=StreamType.HTTP,
duration=track.duration,
path=url,
getattr(selected_info, "bitrate_in_kbps", None),
)
- content_type = self._get_content_type(selected_info.codec)
bitrate = selected_info.bitrate_in_kbps or 0
return StreamDetails(
item_id=item_id,
provider=self.provider.instance_id,
- audio_format=AudioFormat(
- content_type=content_type,
- bit_rate=bitrate,
- ),
+ audio_format=self._build_audio_format(selected_info.codec, bit_rate=bitrate),
stream_type=StreamType.HTTP,
duration=track.duration,
path=selected_info.direct_link,
def _select_best_quality(
self, download_infos: list[Any], preferred_quality: str | None
) -> DownloadInfo | None:
- """Select the best quality download info.
+ """Select the best quality download info based on user preference.
:param download_infos: List of DownloadInfo objects.
- :param preferred_quality: User's preferred quality (e.g. "lossless" or "Lossless (FLAC)").
+ :param preferred_quality: User's quality preference (efficient/high/balanced/superb).
:return: Best matching DownloadInfo or None.
"""
if not download_infos:
return None
- # Normalize so we accept "lossless", "Lossless (FLAC)", etc.
preferred_normalized = (preferred_quality or "").strip().lower()
- want_lossless = (
- QUALITY_LOSSLESS in preferred_normalized or preferred_normalized == QUALITY_LOSSLESS
- )
# Sort by bitrate descending
sorted_infos = sorted(
reverse=True,
)
- # If user wants lossless, prefer flac-mp4 then flac (API formats ~2025)
- if want_lossless:
+ # Superb: Prefer FLAC (backward compatibility with "lossless")
+ if preferred_normalized == QUALITY_SUPERB or "lossless" in preferred_normalized:
+ # Note: flac-mp4 typically comes from get-file-info API, not download-info,
+ # but we check here for forward compatibility in case the API changes.
for codec in ("flac-mp4", "flac"):
for info in sorted_infos:
if info.codec and info.codec.lower() == codec:
return info
self.logger.warning(
- "Lossless (FLAC) requested but no FLAC in API response for this "
- "track; using best available"
+ "Superb quality (FLAC) requested but not available; using best available"
+ )
+ return sorted_infos[0]
+
+ # Efficient: Prefer lowest bitrate AAC/MP3
+ if preferred_normalized == QUALITY_EFFICIENT:
+ # Sort ascending for lowest bitrate
+ sorted_infos_asc = sorted(
+ download_infos,
+ key=lambda x: x.bitrate_in_kbps or 999,
)
+ # Prefer AAC for efficiency, then MP3 (include MP4 container variants)
+ for codec in ("aac-mp4", "aac", "he-aac-mp4", "he-aac", "mp3"):
+ for info in sorted_infos_asc:
+ if info.codec and info.codec.lower() == codec:
+ return info
+ return sorted_infos_asc[0]
- # Return highest bitrate
+ # High: Prefer high bitrate MP3 (~320kbps)
+ if preferred_normalized == QUALITY_HIGH:
+ # Look for MP3 with bitrate >= 256kbps
+ high_quality_mp3 = [
+ info
+ for info in sorted_infos
+ if info.codec
+ and info.codec.lower() == "mp3"
+ and info.bitrate_in_kbps
+ and info.bitrate_in_kbps >= 256
+ ]
+ if high_quality_mp3:
+ return high_quality_mp3[0] # Already sorted by bitrate descending
+
+ # Fallback: any MP3 available (highest bitrate)
+ for info in sorted_infos:
+ if info.codec and info.codec.lower() == "mp3":
+ return info
+
+ # If no MP3, use highest available (excluding FLAC)
+ for info in sorted_infos:
+ if info.codec and info.codec.lower() not in ("flac", "flac-mp4"):
+ return info
+
+ # Last resort: highest available
+ return sorted_infos[0]
+
+ # Balanced (default): Prefer ~192kbps AAC, or medium quality MP3
+ # Look for bitrate around 192kbps (within range 128-256)
+ balanced_infos = [
+ info
+ for info in sorted_infos
+ if info.bitrate_in_kbps and 128 <= info.bitrate_in_kbps <= 256
+ ]
+ if balanced_infos:
+ # Prefer AAC over MP3 at similar bitrate (include MP4 container variants)
+ for codec in ("aac-mp4", "aac", "he-aac-mp4", "he-aac", "mp3"):
+ for info in balanced_infos:
+ if info.codec and info.codec.lower() == codec:
+ return info
+ return balanced_infos[0]
+
+ # Fallback to highest available if no balanced option
return sorted_infos[0] if sorted_infos else None
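The balanced tier above combines a bitrate band filter with a codec preference order. A simplified sketch of just that tier, with a stand-in candidate type (`Candidate` is hypothetical; the real method also handles the superb/efficient/high tiers and None bitrates):

```python
from dataclasses import dataclass

@dataclass
class Candidate:  # stand-in for yandex_music DownloadInfo
    codec: str
    bitrate_in_kbps: int

def pick_balanced(candidates: list[Candidate]) -> Candidate:
    """Pick a ~192 kbps stream: restrict to the 128-256 kbps band, then prefer
    AAC variants over MP3; fall back to the highest bitrate overall."""
    by_rate = sorted(candidates, key=lambda c: c.bitrate_in_kbps, reverse=True)
    band = [c for c in by_rate if 128 <= c.bitrate_in_kbps <= 256]
    for codec in ("aac-mp4", "aac", "he-aac-mp4", "he-aac", "mp3"):
        for c in band:
            if c.codec.lower() == codec:
                return c
    return (band or by_rate)[0]

infos = [Candidate("mp3", 320), Candidate("mp3", 192), Candidate("aac", 192), Candidate("aac", 64)]
print(pick_balanced(infos))  # Candidate(codec='aac', bitrate_in_kbps=192)
```

At equal bitrate AAC wins over MP3, and the 320 kbps MP3 is passed over because it falls outside the balanced band; only when the band is empty does the highest-bitrate candidate win.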
- def _get_content_type(self, codec: str | None) -> ContentType:
- """Determine content type from codec string.
+ def _get_content_type(self, codec: str | None) -> tuple[ContentType, ContentType]:
+ """Determine container and codec type from Yandex API codec string.
+
+ Yandex API returns codec strings like "flac-mp4" (FLAC in MP4 container),
+ "aac-mp4" (AAC in MP4 container), or plain "flac", "mp3", "aac".
:param codec: Codec string from Yandex API.
- :return: ContentType enum value.
+ :return: Tuple of (content_type/container, codec_type).
"""
if not codec:
- return ContentType.UNKNOWN
+ return ContentType.UNKNOWN, ContentType.UNKNOWN
codec_lower = codec.lower()
- if codec_lower in ("flac", "flac-mp4"):
- return ContentType.FLAC
+
+ # MP4 container variants: codec is inside an MP4 container
+ if codec_lower == "flac-mp4":
+ return ContentType.MP4, ContentType.FLAC
+ if codec_lower in ("aac-mp4", "he-aac-mp4"):
+ return ContentType.MP4, ContentType.AAC
+
+ # Plain single-codec formats: codec is implied by content_type, no separate codec_type
+ if codec_lower == "flac":
+ return ContentType.FLAC, ContentType.UNKNOWN
if codec_lower in ("mp3", "mpeg"):
- return ContentType.MP3
- if codec_lower == "aac":
- return ContentType.AAC
+ return ContentType.MP3, ContentType.UNKNOWN
+ if codec_lower in ("aac", "he-aac"):
+ return ContentType.AAC, ContentType.UNKNOWN
- return ContentType.UNKNOWN
+ return ContentType.UNKNOWN, ContentType.UNKNOWN
+
+ def _get_audio_params(self, codec: str | None) -> tuple[int, int]:
+ """Return (sample_rate, bit_depth) defaults based on codec string.
+
+ The Yandex get-file-info API does not return sample rate or bit depth,
+ so we use codec-based defaults. These values help the core select the
+ correct PCM output format and avoid unnecessary resampling.
+
+ :param codec: Codec string from Yandex API (e.g. "flac-mp4", "flac", "mp3").
+ :return: Tuple of (sample_rate, bit_depth).
+ """
+ if codec and codec.lower() == "flac-mp4":
+ return 48000, 24
+ # CD-quality defaults for all other codecs
+ return 44100, 16
+
+ def _build_audio_format(self, codec: str | None, bit_rate: int = 0) -> AudioFormat:
+ """Build AudioFormat with content type and codec-based audio params.
+
+ :param codec: Codec string from Yandex API (e.g. "flac-mp4", "flac", "mp3").
+ :param bit_rate: Bitrate in kbps (0 for variable/unknown).
+ :return: Configured AudioFormat instance.
+ """
+ content_type, codec_type = self._get_content_type(codec)
+ sample_rate, bit_depth = self._get_audio_params(codec)
+ return AudioFormat(
+ content_type=content_type,
+ codec_type=codec_type,
+ bit_rate=bit_rate,
+ sample_rate=sample_rate,
+ bit_depth=bit_depth,
+ )
+
+ async def get_audio_stream(
+ self, streamdetails: StreamDetails, seek_position: int = 0
+ ) -> AsyncGenerator[bytes, None]:
+ """Return the audio stream for the provider item with on-the-fly decryption.
+
+ Downloads and decrypts the encrypted stream chunk-by-chunk without buffering.
+ On connection drop, reconnects using a Range header and resumes AES-CTR
+ decryption from the correct block boundary (up to 3 retries).
+
+ :param streamdetails: Stream details containing encrypted URL and key.
+ :param seek_position: Always 0 (seeking delegated to ffmpeg via allow_seek=True).
+ :return: Async generator yielding decrypted audio bytes.
+ """
+ encrypted_url: str = streamdetails.data["encrypted_url"]
+ key_hex: str = streamdetails.data["decryption_key"]
+ key_bytes = bytes.fromhex(key_hex)
+ if len(key_bytes) not in (16, 24, 32):
+ raise MediaNotFoundError(f"Unsupported AES key length: {len(key_bytes)} bytes")
+
+ block_size = 16 # AES-CTR block size in bytes
+ max_retries = 3
+ bytes_yielded = 0 # total decrypted bytes delivered to caller
+
+ for attempt in range(max_retries + 1):
+ if attempt > 0:
+ await asyncio.sleep(min(2**attempt, 8)) # 2s, 4s, 8s
+
+ # Align resume position to AES-CTR block boundary
+ block_start = (bytes_yielded // block_size) * block_size
+ block_skip = bytes_yielded - block_start # overlap bytes to discard in first chunk
+
+ # AES-CTR: original nonce is 0x00..00, so counter = block number
+ nonce = (block_start // block_size).to_bytes(block_size, "big")
+ decryptor = Cipher(algorithms.AES(key_bytes), modes.CTR(nonce)).decryptor()
+ headers = {"Range": f"bytes={block_start}-"} if block_start > 0 else {}
+
+ try:
+ async with self.mass.http_session.get(encrypted_url, headers=headers) as response:
+ try:
+ response.raise_for_status()
+ except Exception as err:
+ raise MediaNotFoundError(
+ f"Failed to fetch encrypted stream: {err}"
+ ) from err
+
+ carry_skip = block_skip
+ async for chunk in response.content.iter_chunked(65536):
+ decrypted = decryptor.update(chunk)
+ if carry_skip > 0:
+ skip = min(carry_skip, len(decrypted))
+ decrypted = decrypted[skip:]
+ carry_skip -= skip
+ if decrypted:
+ bytes_yielded += len(decrypted)
+ yield decrypted
+
+ final = decryptor.finalize()
+ if final:
+ bytes_yielded += len(final)
+ yield final
+ return # stream completed normally
+
+ except asyncio.CancelledError:
+ raise # propagate cancellation immediately, do not retry
+ except ClientPayloadError as err:
+ if attempt < max_retries:
+ self.logger.warning(
+ "Encrypted stream dropped at %d bytes (attempt %d/%d): %s — retrying",
+ bytes_yielded,
+ attempt + 1,
+ max_retries,
+ err,
+ )
+ else:
+ raise MediaNotFoundError(
+ "Encrypted stream ended early after retries were exhausted"
+ ) from err
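The resume arithmetic in the retry loop is the easiest part to get wrong: the HTTP Range must start on an AES block boundary, the CTR counter must equal that block's index, and the overlap bytes re-decrypted from the boundary must be discarded. A stdlib-only sketch of just that math (no AES calls; same zero-initial-counter assumption the code states):

```python
BLOCK = 16  # AES block size in bytes

def resume_params(bytes_yielded: int) -> tuple[bytes, int, str]:
    """Given how many decrypted bytes were already delivered, return the
    16-byte CTR counter block to resume with, the number of overlap bytes
    to discard from the first re-decrypted chunk, and the Range header value.
    Assumes the stream's initial counter is zero, so counter == block index."""
    block_start = (bytes_yielded // BLOCK) * BLOCK
    block_skip = bytes_yielded - block_start
    nonce = (block_start // BLOCK).to_bytes(BLOCK, "big")
    return nonce, block_skip, f"bytes={block_start}-"

# Dropped mid-block after 100 decrypted bytes: re-fetch from byte 96 (block 6),
# set the counter to 6, and throw away the first 4 re-decrypted bytes.
nonce, skip, rng = resume_params(100)
print(int.from_bytes(nonce, "big"), skip, rng)  # 6 4 bytes=96-
```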
from __future__ import annotations
+import base64
+import hashlib
+import hmac
+import re
from unittest import mock
import pytest
from music_assistant_models.errors import ResourceTemporarilyUnavailable
from yandex_music.exceptions import NetworkError
+from yandex_music.rotor.dashboard import Dashboard
+from yandex_music.rotor.station_result import StationResult
+from yandex_music.utils.sign_request import DEFAULT_SIGN_KEY
-from music_assistant.providers.yandex_music.api_client import YandexMusicClient
+from music_assistant.providers.yandex_music.api_client import (
+ GET_FILE_INFO_CODECS,
+ YandexMusicClient,
+)
def _make_client() -> tuple[YandexMusicClient, mock.AsyncMock]:
"""Create a YandexMusicClient with a mocked underlying ClientAsync.
+ Also mocks connect() so that _reconnect() restores the mock client
+ instead of trying to create a real connection.
+
:return: Tuple of (YandexMusicClient, mock_underlying_client).
"""
client = YandexMusicClient(token="fake_token")
mock_underlying = mock.AsyncMock()
client._client = mock_underlying
client._user_id = 12345
+
+ async def _fake_connect() -> bool:
+ client._client = mock_underlying
+ client._user_id = 12345
+ return True
+
+ client.connect = _fake_connect # type: ignore[method-assign]
return client, mock_underlying
assert body["type"] == "trackStarted"
assert body["trackId"] == "12345"
assert body["batchId"] == "batch_xyz"
+
+
+# -- LRC regex tests ---------------------------------------------------------
+
+
+def test_lrc_regex_matches_valid_synced_lyrics() -> None:
+ """LRC regex matches valid synced lyrics with proper format [mm:ss.xx].
+
+ Uses re.search (no ^ anchor) matching the implementation in api_client.py,
+ which intentionally allows timestamps anywhere in the text so that LRC
+ metadata lines like [ar:Artist] before the first timestamp don't prevent
+ detection.
+ """
+ pattern = r"\[\d{2}:\d{2}(?:\.\d{2,3})?\]"
+
+ # Valid LRC formats that should match
+ valid_cases = [
+ "[00:12]", # Basic format (no fractional part)
+ "[00:12.34]", # With centiseconds (2-digit fractional part — lower bound of \d{2,3})
+ "[00:12.345]", # With milliseconds (3-digit fractional part — upper bound of \d{2,3})
+ "[12:34]", # Another basic format
+ "[99:59.99]", # Edge case
+ "Some [00:12] text", # Timestamp embedded in text — re.search finds it
+ ]
+
+ for case in valid_cases:
+ assert re.search(pattern, case), f"Should match: {case}"
+
+
+def test_lrc_regex_rejects_invalid_formats() -> None:
+ """LRC regex rejects invalid formats (no closing bracket, wrong format)."""
+ pattern = r"\[\d{2}:\d{2}(?:\.\d{2,3})?\]"
+
+ # Invalid formats that should NOT match
+ invalid_cases = [
+ "[00:12", # Missing closing bracket
+ "00:12]", # Missing opening bracket
+ "[0:12]", # Single digit minute
+ "[00:1]", # Single digit second
+ "[00:12.1]", # Single digit centiseconds (should be 2-3 digits)
+ "[00:12.1234]", # Four digit milliseconds
+ ]
+
+ for case in invalid_cases:
+ assert not re.search(pattern, case), f"Should NOT match: {case}"
+
+
+# -- HMAC sign construction tests --------------------------------------------
+
+
+def test_hmac_sign_construction_explicit() -> None:
+ """HMAC sign is constructed explicitly with commas stripped from codecs."""
+ # Simulate the parameters
+ timestamp = 1234567890
+ track_id = "12345"
+
+ # The correct way (explicit construction)
+ codecs_for_sign = GET_FILE_INFO_CODECS.replace(",", "")
+ param_string = f"{timestamp}{track_id}lossless{codecs_for_sign}encraw"
+
+ # Verify codecs_for_sign has no commas
+ assert "," not in codecs_for_sign
+
+ # Verify the construction is correct
+ expected = f"1234567890{track_id}lossless{codecs_for_sign}encraw"
+ assert param_string == expected
+
+ # Verify HMAC can be constructed
+ hmac_sign = hmac.new(
+ DEFAULT_SIGN_KEY.encode(),
+ param_string.encode(),
+ hashlib.sha256,
+ )
+ sign = base64.b64encode(hmac_sign.digest()).decode()[:-1]
+
+ # Verify sign is 43 characters (SHA-256 base64 with one "=" removed)
+ assert len(sign) == 43
+ assert not sign.endswith("=")
+
+
+# -- get_dashboard_stations --------------------------------------------------
+
+
+async def test_get_dashboard_stations_returns_personalized_stations() -> None:
+ """get_dashboard_stations() returns stations from rotor/stations/dashboard."""
+ client, underlying = _make_client()
+
+ _de_client = type("C", (), {"report_unknown_fields": False})()
+
+ station_result = StationResult.de_json(
+ {
+ "station": {
+ "id": {"type": "mood", "tag": "sad"},
+ "name": "Грустное",
+ "restrictions": {},
+ "restrictions2": {},
+ "full_image_url": None,
+ "id_for_from": "mood-sad",
+ "icon": None,
+ },
+ "settings": None,
+ "settings2": None,
+ "ad_params": None,
+ "rup_title": "Sad Songs",
+ "rup_description": "",
+ },
+ _de_client,
+ )
+
+ dashboard = mock.MagicMock(spec=Dashboard)
+ dashboard.stations = [station_result]
+ underlying.rotor_stations_dashboard.return_value = dashboard
+
+ stations = await client.get_dashboard_stations()
+
+ assert len(stations) == 1
+ station_id, name, _image_url = stations[0]
+ assert station_id == "mood:sad"
+ assert name == "Грустное" # station.name takes priority over rup_title
+ underlying.rotor_stations_dashboard.assert_called_once()
+
+
+async def test_get_dashboard_stations_empty_on_error() -> None:
+ """get_dashboard_stations() returns empty list on network error."""
+ client, underlying = _make_client()
+ underlying.rotor_stations_dashboard.side_effect = NetworkError("timeout")
+
+ stations = await client.get_dashboard_stations()
+
+ assert stations == []
+
+
+async def test_get_dashboard_stations_skips_user_type() -> None:
+ """get_dashboard_stations() filters out personal 'user' type stations."""
+ client, underlying = _make_client()
+
+ _de_client = type("C", (), {"report_unknown_fields": False})()
+
+ personal_station = StationResult.de_json(
+ {
+ "station": {
+ "id": {"type": "user", "tag": "onyourwave"},
+ "name": "My Wave",
+ "restrictions": {},
+ "restrictions2": {},
+ "full_image_url": None,
+ "id_for_from": "user-onyourwave",
+ "icon": None,
+ },
+ "settings": None,
+ "settings2": None,
+ "ad_params": None,
+ "rup_title": "My Wave",
+ "rup_description": "",
+ },
+ _de_client,
+ )
+
+ dashboard = mock.MagicMock(spec=Dashboard)
+ dashboard.stations = [personal_station]
+ underlying.rotor_stations_dashboard.return_value = dashboard
+
+ stations = await client.get_dashboard_stations()
+
+ assert stations == []
mock_client.get_artist_tracks = mock.AsyncMock(return_value=[track])
mock_client.get_playlist = mock.AsyncMock(return_value=playlist)
mock_client.get_track_download_info = mock.AsyncMock(return_value=[download_info])
+ mock_client.get_track_lyrics = mock.AsyncMock(return_value=(None, False))
+ mock_client.get_track_lyrics_from_track = mock.AsyncMock(return_value=(None, False))
async with wait_for_sync_completion(mass):
config = await mass.config.save_provider_config(
# get-file-info lossless is tried first; mock returns None so we use download_info path
mock_client.get_track_file_info_lossless = mock.AsyncMock(return_value=None)
mock_client.get_track_download_info = mock.AsyncMock(return_value=download_infos)
+ mock_client.get_track_lyrics = mock.AsyncMock(return_value=(None, False))
+ mock_client.get_track_lyrics_from_track = mock.AsyncMock(return_value=(None, False))
async with wait_for_sync_completion(mass):
config = await mass.config.save_provider_config(
--- /dev/null
+"""Test Yandex Music Recommendations."""
+
+from __future__ import annotations
+
+from typing import Any
+from unittest.mock import AsyncMock, Mock, patch
+
+import pytest
+from music_assistant_models.errors import InvalidDataError
+from music_assistant_models.media_items import Album, Playlist, RecommendationFolder, Track
+
+from music_assistant.providers.yandex_music.constants import (
+ BROWSE_NAMES_EN,
+ MY_WAVE_PLAYLIST_ID,
+ RADIO_TRACK_ID_SEP,
+ ROTOR_STATION_MY_WAVE,
+)
+from music_assistant.providers.yandex_music.provider import YandexMusicProvider
+
+
+@pytest.fixture
+def provider_mock() -> Mock:
+ """Return a mock Yandex Music provider."""
+ provider = Mock(spec=YandexMusicProvider)
+ provider.domain = "yandex_music"
+ provider.instance_id = "yandex_music_instance"
+ provider.logger = Mock()
+
+ # Mock client
+ provider.client = AsyncMock()
+ provider.client.user_id = 12345
+
+ # Mock config
+ provider.config = Mock()
+ provider.config.get_value = Mock(side_effect=lambda key: 150 if "max_tracks" in key else None)
+
+ # Mock mass with cache
+ provider.mass = Mock()
+ provider.mass.metadata = Mock()
+ provider.mass.metadata.locale = "en_US"
+ provider.mass.cache = AsyncMock()
+ provider.mass.cache.get = AsyncMock(return_value=None) # Cache always misses
+ provider.mass.cache.set = AsyncMock()
+
+ # Mock _get_browse_names to return EN names
+ provider._get_browse_names = Mock(return_value=BROWSE_NAMES_EN)
+
+ return provider
+
+
+@pytest.mark.asyncio
+async def test_get_my_wave_recommendations_success(provider_mock: Mock) -> None:
+ """Test _get_my_wave_recommendations returns data when API provides tracks."""
+ # Create mock track with required attributes
+ mock_track = Mock()
+ mock_track.id = "12345"
+ mock_track.track_id = "12345"
+
+ # Mock get_my_wave_tracks to return tracks
+ provider_mock.client.get_my_wave_tracks = AsyncMock(return_value=([mock_track], None))
+
+ # Mock _parse_my_wave_track to return a Track object with composite item_id
+ mock_parsed_track = Mock(spec=Track)
+ mock_parsed_track.item_id = f"12345{RADIO_TRACK_ID_SEP}{ROTOR_STATION_MY_WAVE}"
+ mock_parsed_track.name = "Test Track"
+ mock_parsed_track.provider_mappings = []
+ provider_mock._parse_my_wave_track = Mock(return_value=mock_parsed_track)
+
+ result = await YandexMusicProvider._get_my_wave_recommendations(provider_mock)
+
+ assert result is not None
+ assert isinstance(result, RecommendationFolder)
+ assert result.item_id == MY_WAVE_PLAYLIST_ID
+ assert result.provider == provider_mock.instance_id
+ assert result.name == BROWSE_NAMES_EN[MY_WAVE_PLAYLIST_ID]
+ assert result.icon == "mdi-waveform"
+ assert len(result.items) > 0
+
+
+@pytest.mark.asyncio
+async def test_get_my_wave_recommendations_empty(provider_mock: Mock) -> None:
+ """Test _get_my_wave_recommendations returns None when API returns no tracks."""
+ provider_mock.client.get_my_wave_tracks = AsyncMock(return_value=([], None))
+
+ result = await YandexMusicProvider._get_my_wave_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_my_wave_recommendations_duplicate_filtering(provider_mock: Mock) -> None:
+ """Test _get_my_wave_recommendations filters duplicate tracks."""
+ # Create mock tracks with same ID
+ mock_track1 = Mock()
+ mock_track1.id = "12345"
+ mock_track1.track_id = "12345"
+
+ mock_track2 = Mock()
+ mock_track2.id = "12345" # Same ID
+ mock_track2.track_id = "12345"
+
+ # First call returns track1, second call returns track2 (duplicate)
+ provider_mock.client.get_my_wave_tracks = AsyncMock(
+ side_effect=[
+ ([mock_track1], None),
+ ([mock_track2], None),
+ ]
+ )
+
+ mock_parsed_track = Mock(spec=Track)
+ mock_parsed_track.item_id = f"12345{RADIO_TRACK_ID_SEP}{ROTOR_STATION_MY_WAVE}"
+ mock_parsed_track.name = "Test Track"
+ mock_parsed_track.provider_mappings = []
+
+ # _parse_my_wave_track returns track on first call, None on duplicate
+ provider_mock._parse_my_wave_track = Mock(side_effect=[mock_parsed_track, None])
+
+ result = await YandexMusicProvider._get_my_wave_recommendations(provider_mock)
+
+ assert result is not None
+ # Should only have 1 track despite 2 API calls (duplicate filtered)
+ assert len(result.items) == 1
+
+
+@pytest.mark.asyncio
+async def test_get_my_wave_recommendations_invalid_data_error(provider_mock: Mock) -> None:
+ """Test _get_my_wave_recommendations handles InvalidDataError gracefully."""
+ mock_track = Mock()
+ mock_track.id = "12345"
+ mock_track.track_id = "12345"
+
+ provider_mock.client.get_my_wave_tracks = AsyncMock(return_value=([mock_track], None))
+
+ # _parse_my_wave_track returns None (simulates parse error handled internally)
+ provider_mock._parse_my_wave_track = Mock(return_value=None)
+
+ result = await YandexMusicProvider._get_my_wave_recommendations(provider_mock)
+
+ # Should return None as no valid tracks were parsed
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_feed_recommendations_success(provider_mock: Mock) -> None:
+ """Test _get_feed_recommendations returns data when API provides feed."""
+ # Mock feed with generated playlists
+ mock_gen_playlist = Mock()
+ mock_gen_playlist.ready = True
+ mock_gen_playlist.data = Mock() # Playlist data
+
+ mock_feed = Mock()
+ mock_feed.generated_playlists = [mock_gen_playlist]
+
+ provider_mock.client.get_feed = AsyncMock(return_value=mock_feed)
+
+ # Mock parse_playlist
+ mock_parsed_playlist = Mock(spec=Playlist)
+ mock_parsed_playlist.item_id = "playlist_1"
+ mock_parsed_playlist.name = "Playlist of the Day"
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ return_value=mock_parsed_playlist,
+ ):
+ result = await YandexMusicProvider._get_feed_recommendations(provider_mock)
+
+ assert result is not None
+ assert isinstance(result, RecommendationFolder)
+ assert result.item_id == "feed"
+ assert result.provider == provider_mock.instance_id
+ assert result.name == BROWSE_NAMES_EN["feed"]
+ assert result.icon == "mdi-account-music"
+ assert len(result.items) > 0
+
+
+@pytest.mark.asyncio
+async def test_get_feed_recommendations_empty(provider_mock: Mock) -> None:
+ """Test _get_feed_recommendations returns None when feed is empty."""
+ provider_mock.client.get_feed = AsyncMock(return_value=None)
+
+ result = await YandexMusicProvider._get_feed_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_feed_recommendations_no_generated_playlists(provider_mock: Mock) -> None:
+ """Test _get_feed_recommendations returns None when no generated playlists."""
+ mock_feed = Mock()
+ mock_feed.generated_playlists = []
+
+ provider_mock.client.get_feed = AsyncMock(return_value=mock_feed)
+
+ result = await YandexMusicProvider._get_feed_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_feed_recommendations_invalid_data_error(provider_mock: Mock) -> None:
+ """Test _get_feed_recommendations handles InvalidDataError gracefully."""
+ mock_gen_playlist = Mock()
+ mock_gen_playlist.ready = True
+ mock_gen_playlist.data = Mock()
+
+ mock_feed = Mock()
+ mock_feed.generated_playlists = [mock_gen_playlist]
+
+ provider_mock.client.get_feed = AsyncMock(return_value=mock_feed)
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ side_effect=InvalidDataError("Parse error"),
+ ):
+ result = await YandexMusicProvider._get_feed_recommendations(provider_mock)
+
+ assert result is None
+ provider_mock.logger.debug.assert_called()
+
+
+@pytest.mark.asyncio
+async def test_get_chart_recommendations_success(provider_mock: Mock) -> None:
+ """Test _get_chart_recommendations returns data when API provides chart."""
+ # Mock TrackShort with .track attribute
+ mock_track_short = Mock()
+ mock_track_obj = Mock() # The actual Track object
+ mock_track_short.track = mock_track_obj
+
+ mock_chart = Mock()
+ mock_chart.tracks = [mock_track_short]
+
+ mock_chart_info = Mock()
+ mock_chart_info.chart = mock_chart
+
+ provider_mock.client.get_chart = AsyncMock(return_value=mock_chart_info)
+
+ # Mock parse_track
+ mock_parsed_track = Mock(spec=Track)
+ mock_parsed_track.item_id = "track_1"
+ mock_parsed_track.name = "Chart Track 1"
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_track",
+ return_value=mock_parsed_track,
+ ):
+ result = await YandexMusicProvider._get_chart_recommendations(provider_mock)
+
+ assert result is not None
+ assert isinstance(result, RecommendationFolder)
+ assert result.item_id == "chart"
+ assert result.provider == provider_mock.instance_id
+ assert result.name == BROWSE_NAMES_EN["chart"]
+ assert result.icon == "mdi-chart-line"
+ assert len(result.items) > 0
+
+
+@pytest.mark.asyncio
+async def test_get_chart_recommendations_empty(provider_mock: Mock) -> None:
+ """Test _get_chart_recommendations returns None when chart is empty."""
+ provider_mock.client.get_chart = AsyncMock(return_value=None)
+
+ result = await YandexMusicProvider._get_chart_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_chart_recommendations_no_tracks(provider_mock: Mock) -> None:
+ """Test _get_chart_recommendations returns None when chart has no tracks."""
+ mock_chart = Mock()
+ mock_chart.tracks = []
+
+ mock_chart_info = Mock()
+ mock_chart_info.chart = mock_chart
+
+ provider_mock.client.get_chart = AsyncMock(return_value=mock_chart_info)
+
+ result = await YandexMusicProvider._get_chart_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_chart_recommendations_invalid_data_error(provider_mock: Mock) -> None:
+ """Test _get_chart_recommendations handles InvalidDataError gracefully."""
+ mock_track_short = Mock()
+ mock_track_obj = Mock()
+ mock_track_short.track = mock_track_obj
+
+ mock_chart = Mock()
+ mock_chart.tracks = [mock_track_short]
+
+ mock_chart_info = Mock()
+ mock_chart_info.chart = mock_chart
+
+ provider_mock.client.get_chart = AsyncMock(return_value=mock_chart_info)
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_track",
+ side_effect=InvalidDataError("Parse error"),
+ ):
+ result = await YandexMusicProvider._get_chart_recommendations(provider_mock)
+
+ assert result is None
+ provider_mock.logger.debug.assert_called()
+
+
+@pytest.mark.asyncio
+async def test_get_new_releases_recommendations_success(provider_mock: Mock) -> None:
+ """Test _get_new_releases_recommendations returns data when API provides releases."""
+ # Mock releases with album IDs
+ mock_releases = Mock()
+ mock_releases.new_releases = [123, 456, 789]
+
+ provider_mock.client.get_new_releases = AsyncMock(return_value=mock_releases)
+
+ # Mock get_albums to return album objects
+ mock_album = Mock()
+ provider_mock.client.get_albums = AsyncMock(return_value=[mock_album])
+
+ # Mock parse_album
+ mock_parsed_album = Mock(spec=Album)
+ mock_parsed_album.item_id = "album_1"
+ mock_parsed_album.name = "New Album"
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_album",
+ return_value=mock_parsed_album,
+ ):
+ result = await YandexMusicProvider._get_new_releases_recommendations(provider_mock)
+
+ assert result is not None
+ assert isinstance(result, RecommendationFolder)
+ assert result.item_id == "new_releases"
+ assert result.provider == provider_mock.instance_id
+ assert result.name == BROWSE_NAMES_EN["new_releases"]
+ assert result.icon == "mdi-new-box"
+ assert len(result.items) > 0
+
+
+@pytest.mark.asyncio
+async def test_get_new_releases_recommendations_empty(provider_mock: Mock) -> None:
+ """Test _get_new_releases_recommendations returns None when releases are empty."""
+ provider_mock.client.get_new_releases = AsyncMock(return_value=None)
+
+ result = await YandexMusicProvider._get_new_releases_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_new_releases_recommendations_no_releases(provider_mock: Mock) -> None:
+ """Test _get_new_releases_recommendations returns None when no releases."""
+ mock_releases = Mock()
+ mock_releases.new_releases = []
+
+ provider_mock.client.get_new_releases = AsyncMock(return_value=mock_releases)
+
+ result = await YandexMusicProvider._get_new_releases_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_new_releases_recommendations_invalid_data_error(provider_mock: Mock) -> None:
+ """Test _get_new_releases_recommendations handles InvalidDataError gracefully."""
+ mock_releases = Mock()
+ mock_releases.new_releases = [123]
+
+ provider_mock.client.get_new_releases = AsyncMock(return_value=mock_releases)
+ provider_mock.client.get_albums = AsyncMock(return_value=[Mock()])
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_album",
+ side_effect=InvalidDataError("Parse error"),
+ ):
+ result = await YandexMusicProvider._get_new_releases_recommendations(provider_mock)
+
+ assert result is None
+ provider_mock.logger.debug.assert_called()
+
+
+@pytest.mark.asyncio
+async def test_get_new_playlists_recommendations_success(provider_mock: Mock) -> None:
+ """Test _get_new_playlists_recommendations returns data when API provides playlists."""
+ # Mock playlist ID object
+ mock_playlist_id = Mock()
+ mock_playlist_id.uid = "user123"
+ mock_playlist_id.kind = "456"
+
+ mock_result = Mock()
+ mock_result.new_playlists = [mock_playlist_id]
+
+ provider_mock.client.get_new_playlists = AsyncMock(return_value=mock_result)
+
+ # Mock get_playlists to return playlist objects
+ mock_playlist = Mock()
+ provider_mock.client.get_playlists = AsyncMock(return_value=[mock_playlist])
+
+ # Mock parse_playlist
+ mock_parsed_playlist = Mock(spec=Playlist)
+ mock_parsed_playlist.item_id = "playlist_1"
+ mock_parsed_playlist.name = "New Playlist"
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ return_value=mock_parsed_playlist,
+ ):
+ result = await YandexMusicProvider._get_new_playlists_recommendations(provider_mock)
+
+ assert result is not None
+ assert isinstance(result, RecommendationFolder)
+ assert result.item_id == "new_playlists"
+ assert result.provider == provider_mock.instance_id
+ assert result.name == BROWSE_NAMES_EN["new_playlists"]
+ assert result.icon == "mdi-playlist-star"
+ assert len(result.items) > 0
+
+
+@pytest.mark.asyncio
+async def test_get_new_playlists_recommendations_empty(provider_mock: Mock) -> None:
+ """Test _get_new_playlists_recommendations returns None when result is empty."""
+ provider_mock.client.get_new_playlists = AsyncMock(return_value=None)
+
+ result = await YandexMusicProvider._get_new_playlists_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_new_playlists_recommendations_no_playlists(provider_mock: Mock) -> None:
+ """Test _get_new_playlists_recommendations returns None when no playlists."""
+ mock_result = Mock()
+ mock_result.new_playlists = []
+
+ provider_mock.client.get_new_playlists = AsyncMock(return_value=mock_result)
+
+ result = await YandexMusicProvider._get_new_playlists_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_new_playlists_recommendations_invalid_data_error(provider_mock: Mock) -> None:
+ """Test _get_new_playlists_recommendations handles InvalidDataError gracefully."""
+ mock_playlist_id = Mock()
+ mock_playlist_id.uid = "user123"
+ mock_playlist_id.kind = "456"
+
+ mock_result = Mock()
+ mock_result.new_playlists = [mock_playlist_id]
+
+ provider_mock.client.get_new_playlists = AsyncMock(return_value=mock_result)
+ provider_mock.client.get_playlists = AsyncMock(return_value=[Mock()])
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ side_effect=InvalidDataError("Parse error"),
+ ):
+ result = await YandexMusicProvider._get_new_playlists_recommendations(provider_mock)
+
+ assert result is None
+ provider_mock.logger.debug.assert_called()
+
+
+@pytest.mark.asyncio
+async def test_get_top_picks_recommendations_success(provider_mock: Mock) -> None:
+ """Test _get_top_picks_recommendations returns data when API provides playlists."""
+ mock_playlist = Mock()
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[mock_playlist])
+
+ # Mock parse_playlist
+ mock_parsed_playlist = Mock(spec=Playlist)
+ mock_parsed_playlist.item_id = "playlist_1"
+ mock_parsed_playlist.name = "Top Pick"
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ return_value=mock_parsed_playlist,
+ ):
+ result = await YandexMusicProvider._get_top_picks_recommendations(provider_mock)
+
+ assert result is not None
+ assert isinstance(result, RecommendationFolder)
+ assert result.item_id == "top_picks"
+ assert result.provider == provider_mock.instance_id
+ assert result.name == BROWSE_NAMES_EN["top_picks"]
+ assert result.icon == "mdi-star"
+ assert len(result.items) > 0
+ # Verify it called with "top" tag
+ provider_mock.client.get_tag_playlists.assert_called_once_with("top")
+
+
+@pytest.mark.asyncio
+async def test_get_top_picks_recommendations_empty(provider_mock: Mock) -> None:
+ """Test _get_top_picks_recommendations returns None when API returns empty."""
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[])
+
+ result = await YandexMusicProvider._get_top_picks_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_top_picks_recommendations_invalid_data_error(provider_mock: Mock) -> None:
+ """Test _get_top_picks_recommendations handles InvalidDataError gracefully."""
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[Mock()])
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ side_effect=InvalidDataError("Parse error"),
+ ):
+ result = await YandexMusicProvider._get_top_picks_recommendations(provider_mock)
+
+ assert result is None
+ provider_mock.logger.debug.assert_called()
+
+
+@pytest.mark.asyncio
+async def test_get_mood_mix_recommendations_success(provider_mock: Mock) -> None:
+ """Test _get_mood_mix_recommendations returns data with deterministic random choice."""
+ mock_playlist = Mock()
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[mock_playlist])
+
+ # Mock parse_playlist
+ mock_parsed_playlist = Mock(spec=Playlist)
+ mock_parsed_playlist.item_id = "playlist_1"
+ mock_parsed_playlist.name = "Chill Playlist"
+
+ # No need to patch random.choice - tag is now passed as argument
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ return_value=mock_parsed_playlist,
+ ):
+ result = await YandexMusicProvider._get_mood_mix_recommendations(provider_mock, "chill")
+
+ assert result is not None
+ assert isinstance(result, RecommendationFolder)
+ assert result.item_id == "mood_mix"
+ assert result.provider == provider_mock.instance_id
+ # Name should include the mood tag
+ assert "Chill" in result.name or "chill" in result.name.lower()
+ assert result.icon == "mdi-emoticon-outline"
+ assert len(result.items) > 0
+ # Verify it called with mood tag
+ provider_mock.client.get_tag_playlists.assert_called_once_with("chill")
+
+
+@pytest.mark.asyncio
+async def test_get_mood_mix_recommendations_empty(provider_mock: Mock) -> None:
+ """Test _get_mood_mix_recommendations returns None when API returns empty."""
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[])
+
+ result = await YandexMusicProvider._get_mood_mix_recommendations(provider_mock, "sad")
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_mood_mix_recommendations_invalid_data_error(provider_mock: Mock) -> None:
+ """Test _get_mood_mix_recommendations handles InvalidDataError gracefully."""
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[Mock()])
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ side_effect=InvalidDataError("Parse error"),
+ ):
+ result = await YandexMusicProvider._get_mood_mix_recommendations(provider_mock, "romantic")
+
+ assert result is None
+ provider_mock.logger.debug.assert_called()
+
+
+@pytest.mark.asyncio
+async def test_get_activity_mix_recommendations_success(provider_mock: Mock) -> None:
+ """Test _get_activity_mix_recommendations returns data with deterministic random choice."""
+ mock_playlist = Mock()
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[mock_playlist])
+
+ # Mock parse_playlist
+ mock_parsed_playlist = Mock(spec=Playlist)
+ mock_parsed_playlist.item_id = "playlist_1"
+ mock_parsed_playlist.name = "Workout Playlist"
+
+ # No need to patch random.choice - tag is now passed as argument
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ return_value=mock_parsed_playlist,
+ ):
+ result = await YandexMusicProvider._get_activity_mix_recommendations(
+ provider_mock, "workout"
+ )
+
+ assert result is not None
+ assert isinstance(result, RecommendationFolder)
+ assert result.item_id == "activity_mix"
+ assert result.provider == provider_mock.instance_id
+ # Name should include the activity tag
+ assert "Workout" in result.name or "workout" in result.name.lower()
+ assert result.icon == "mdi-run"
+ assert len(result.items) > 0
+ # Verify it called with activity tag
+ provider_mock.client.get_tag_playlists.assert_called_once_with("workout")
+
+
+@pytest.mark.asyncio
+async def test_get_activity_mix_recommendations_empty(provider_mock: Mock) -> None:
+ """Test _get_activity_mix_recommendations returns None when API returns empty."""
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[])
+
+ result = await YandexMusicProvider._get_activity_mix_recommendations(provider_mock, "focus")
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_activity_mix_recommendations_invalid_data_error(provider_mock: Mock) -> None:
+ """Test _get_activity_mix_recommendations handles InvalidDataError gracefully."""
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[Mock()])
+
+ with patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ side_effect=InvalidDataError("Parse error"),
+ ):
+ result = await YandexMusicProvider._get_activity_mix_recommendations(
+ provider_mock, "morning"
+ )
+
+ assert result is None
+ provider_mock.logger.debug.assert_called()
+
+
+@pytest.mark.asyncio
+async def test_get_seasonal_mix_recommendations_winter(provider_mock: Mock) -> None:
+ """Test _get_seasonal_mix_recommendations returns winter playlists in January."""
+ mock_playlist = Mock()
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[mock_playlist])
+
+ # Mock parse_playlist
+ mock_parsed_playlist = Mock(spec=Playlist)
+ mock_parsed_playlist.item_id = "playlist_1"
+ mock_parsed_playlist.name = "Winter Playlist"
+
+ # Patch datetime to return January (month 1)
+ mock_datetime = Mock()
+ mock_datetime.now.return_value.month = 1
+
+ with (
+ patch("music_assistant.providers.yandex_music.provider.datetime", mock_datetime),
+ patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ return_value=mock_parsed_playlist,
+ ),
+ ):
+ result = await YandexMusicProvider._get_seasonal_mix_recommendations(provider_mock)
+
+ assert result is not None
+ assert isinstance(result, RecommendationFolder)
+ assert result.item_id == "seasonal_mix"
+ assert result.provider == provider_mock.instance_id
+ # Name should include winter
+ assert "Winter" in result.name or "winter" in result.name.lower()
+ assert result.icon == "mdi-weather-sunny"
+ assert len(result.items) > 0
+ # Verify it called with winter tag
+ provider_mock.client.get_tag_playlists.assert_called_once_with("winter")
+
+
+@pytest.mark.asyncio
+async def test_get_seasonal_mix_recommendations_summer(provider_mock: Mock) -> None:
+ """Test _get_seasonal_mix_recommendations returns summer playlists in July."""
+ mock_playlist = Mock()
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[mock_playlist])
+
+ mock_parsed_playlist = Mock(spec=Playlist)
+ mock_parsed_playlist.item_id = "playlist_1"
+ mock_parsed_playlist.name = "Summer Playlist"
+
+ # Patch datetime to return July (month 7)
+ mock_datetime = Mock()
+ mock_datetime.now.return_value.month = 7
+
+ with (
+ patch("music_assistant.providers.yandex_music.provider.datetime", mock_datetime),
+ patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ return_value=mock_parsed_playlist,
+ ),
+ ):
+ result = await YandexMusicProvider._get_seasonal_mix_recommendations(provider_mock)
+
+ assert result is not None
+ # Verify it called with summer tag
+ provider_mock.client.get_tag_playlists.assert_called_once_with("summer")
+
+
+@pytest.mark.asyncio
+async def test_get_seasonal_mix_recommendations_spring_fallback(provider_mock: Mock) -> None:
+ """Test _get_seasonal_mix_recommendations falls back to autumn for spring months."""
+ mock_playlist = Mock()
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[mock_playlist])
+
+ mock_parsed_playlist = Mock(spec=Playlist)
+ mock_parsed_playlist.item_id = "playlist_1"
+ mock_parsed_playlist.name = "Autumn Playlist"
+
+ # Patch datetime to return March (month 3 - spring)
+ mock_datetime = Mock()
+ mock_datetime.now.return_value.month = 3
+
+ # _validate_tag returns False for spring, triggering fallback to autumn
+ provider_mock._validate_tag = AsyncMock(return_value=False)
+
+ with (
+ patch("music_assistant.providers.yandex_music.provider.datetime", mock_datetime),
+ patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ return_value=mock_parsed_playlist,
+ ),
+ ):
+ result = await YandexMusicProvider._get_seasonal_mix_recommendations(provider_mock)
+
+ assert result is not None
+ # Verify it called with autumn tag (spring fallback)
+ provider_mock.client.get_tag_playlists.assert_called_once_with("autumn")
+
+
+@pytest.mark.asyncio
+async def test_get_seasonal_mix_recommendations_empty(provider_mock: Mock) -> None:
+ """Test _get_seasonal_mix_recommendations returns None when API returns empty."""
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[])
+
+ mock_datetime = Mock()
+ mock_datetime.now.return_value.month = 6
+
+ with patch("music_assistant.providers.yandex_music.provider.datetime", mock_datetime):
+ result = await YandexMusicProvider._get_seasonal_mix_recommendations(provider_mock)
+
+ assert result is None
+
+
+@pytest.mark.asyncio
+async def test_get_seasonal_mix_recommendations_invalid_data_error(provider_mock: Mock) -> None:
+ """Test _get_seasonal_mix_recommendations handles InvalidDataError gracefully."""
+ provider_mock.client.get_tag_playlists = AsyncMock(return_value=[Mock()])
+
+ mock_datetime = Mock()
+ mock_datetime.now.return_value.month = 9
+
+ with (
+ patch("music_assistant.providers.yandex_music.provider.datetime", mock_datetime),
+ patch(
+ "music_assistant.providers.yandex_music.provider.parse_playlist",
+ side_effect=InvalidDataError("Parse error"),
+ ),
+ ):
+ result = await YandexMusicProvider._get_seasonal_mix_recommendations(provider_mock)
+
+ assert result is None
+ provider_mock.logger.debug.assert_called()
+
+
+@pytest.mark.asyncio
+async def test_recommendations_aggregates_all_folders(provider_mock: Mock) -> None:
+ """Test recommendations() aggregates all recommendation folders."""
+ # Mock all individual recommendation methods to return folders
+ mock_folder = Mock(spec=RecommendationFolder)
+ mock_folder.item_id = "test_folder"
+ mock_folder.name = "Test Folder"
+
+ async def return_folder(*_args: Any, **_kwargs: Any) -> RecommendationFolder:
+ return mock_folder
+
+ async def return_tag(_category: str) -> str:
+ return "test_tag"
+
+ # Set the methods directly on the provider mock instance
+ provider_mock._get_my_wave_recommendations = return_folder
+ provider_mock._get_feed_recommendations = return_folder
+ provider_mock._get_chart_recommendations = return_folder
+ provider_mock._get_new_releases_recommendations = return_folder
+ provider_mock._get_new_playlists_recommendations = return_folder
+ provider_mock._get_top_picks_recommendations = return_folder
+ provider_mock._get_mood_mix_recommendations = return_folder
+ provider_mock._get_activity_mix_recommendations = return_folder
+ provider_mock._get_seasonal_mix_recommendations = return_folder
+ provider_mock._pick_random_tag_for_category = return_tag
+
+ result = await YandexMusicProvider.recommendations(provider_mock)
+
+ assert len(result) == 9 # All 9 methods returned folders
+
+
+@pytest.mark.asyncio
+async def test_recommendations_filters_none_folders(provider_mock: Mock) -> None:
+ """Test recommendations() filters out None results from individual methods."""
+ mock_folder = Mock(spec=RecommendationFolder)
+ mock_folder.item_id = "test_folder"
+ mock_folder.name = "Test Folder"
+
+ # Create async functions that return the desired values
+ async def return_folder(*_args: Any, **_kwargs: Any) -> RecommendationFolder:
+ return mock_folder
+
+ async def return_none(*_args: Any, **_kwargs: Any) -> None:
+ return None
+
+ async def return_tag(_category: str) -> str:
+ return "test_tag"
+
+ # Set the methods directly on the provider mock instance
+ provider_mock._get_my_wave_recommendations = return_folder
+ provider_mock._get_feed_recommendations = return_none
+ provider_mock._get_chart_recommendations = return_folder
+ provider_mock._get_new_releases_recommendations = return_none
+ provider_mock._get_new_playlists_recommendations = return_folder
+ provider_mock._get_top_picks_recommendations = return_none
+ provider_mock._get_mood_mix_recommendations = return_folder
+ provider_mock._get_activity_mix_recommendations = return_none
+ provider_mock._get_seasonal_mix_recommendations = return_folder
+ provider_mock._pick_random_tag_for_category = return_tag
+
+ result = await YandexMusicProvider.recommendations(provider_mock)
+
+ # Should only return 5 folders (4 None were filtered out)
+ assert len(result) == 5
+
+
+@pytest.mark.asyncio
+async def test_recommendations_returns_empty_list_when_all_none(provider_mock: Mock) -> None:
+ """Test recommendations() returns empty list when all methods return None."""
+
+ async def return_none(*_args: Any, **_kwargs: Any) -> None:
+ return None
+
+ async def return_no_tag(_category: str) -> None:
+ return None
+
+ # Set the methods directly on the provider mock instance
+ provider_mock._get_my_wave_recommendations = return_none
+ provider_mock._get_feed_recommendations = return_none
+ provider_mock._get_chart_recommendations = return_none
+ provider_mock._get_new_releases_recommendations = return_none
+ provider_mock._get_new_playlists_recommendations = return_none
+ provider_mock._get_top_picks_recommendations = return_none
+ provider_mock._get_mood_mix_recommendations = return_none
+ provider_mock._get_activity_mix_recommendations = return_none
+ provider_mock._get_seasonal_mix_recommendations = return_none
+ provider_mock._pick_random_tag_for_category = return_no_tag
+
+ result = await YandexMusicProvider.recommendations(provider_mock)
+
+ assert result == []
from __future__ import annotations
+import unittest.mock
from typing import TYPE_CHECKING, Any
import pytest
-from music_assistant_models.enums import ContentType
-
-from music_assistant.providers.yandex_music.constants import QUALITY_HIGH, QUALITY_LOSSLESS
+from aiohttp import ClientPayloadError
+from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
+from music_assistant_models.enums import ContentType, StreamType
+from music_assistant_models.errors import MediaNotFoundError
+from music_assistant_models.media_items import AudioFormat
+from music_assistant_models.streamdetails import StreamDetails
+
+from music_assistant.providers.yandex_music.constants import (
+ QUALITY_BALANCED,
+ QUALITY_EFFICIENT,
+ QUALITY_HIGH,
+ QUALITY_SUPERB,
+)
from music_assistant.providers.yandex_music.streaming import YandexMusicStreamingManager
if TYPE_CHECKING:
flac = _make_download_info("flac", 0, "https://example.com/track.flac")
download_infos = [mp3, flac]
- result = streaming_manager._select_best_quality(download_infos, QUALITY_LOSSLESS)
+ result = streaming_manager._select_best_quality(download_infos, QUALITY_SUPERB)
assert result is not None
assert result.codec == "flac"
assert result.direct_link == "https://example.com/track.flac"
-def test_select_best_quality_high_returns_highest_bitrate(
+def test_select_best_quality_balanced_falls_back_to_highest(
streaming_manager: YandexMusicStreamingManager,
) -> None:
- """When preferred is 'high' and list has MP3 and FLAC, highest bitrate is selected."""
+ """When preferred is 'balanced' and no option in 128-256kbps range, highest bitrate is used."""
mp3 = _make_download_info("mp3", 320, "https://example.com/track.mp3")
flac = _make_download_info("flac", 0, "https://example.com/track.flac")
download_infos = [mp3, flac]
- result = streaming_manager._select_best_quality(download_infos, QUALITY_HIGH)
+ result = streaming_manager._select_best_quality(download_infos, QUALITY_BALANCED)
assert result is not None
assert result.codec == "mp3"
mp3 = _make_download_info("mp3", 320, "https://example.com/track.mp3")
download_infos = [mp3]
- result = streaming_manager_with_tracking._select_best_quality(download_infos, QUALITY_LOSSLESS)
+ result = streaming_manager_with_tracking._select_best_quality(download_infos, QUALITY_SUPERB)
assert result is not None
assert result.codec == "mp3"
streaming_manager: YandexMusicStreamingManager,
) -> None:
"""Empty download_infos returns None."""
- result = streaming_manager._select_best_quality([], QUALITY_LOSSLESS)
+ result = streaming_manager._select_best_quality([], QUALITY_SUPERB)
assert result is None
assert result.bitrate_in_kbps == 320
-def test_get_content_type_flac_mp4_returns_flac(
+def test_get_content_type_flac_mp4_returns_mp4_container_with_flac_codec(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """flac-mp4 codec from get-file-info is mapped to MP4 container with FLAC codec."""
+ assert streaming_manager._get_content_type("flac-mp4") == (ContentType.MP4, ContentType.FLAC)
+ assert streaming_manager._get_content_type("FLAC-MP4") == (ContentType.MP4, ContentType.FLAC)
+
+
+def test_get_content_type_flac_returns_flac_container_with_unknown_codec(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """Plain FLAC codec is mapped to FLAC container with UNKNOWN codec."""
+ assert streaming_manager._get_content_type("flac") == (ContentType.FLAC, ContentType.UNKNOWN)
+ assert streaming_manager._get_content_type("FLAC") == (ContentType.FLAC, ContentType.UNKNOWN)
+
+
+def test_get_content_type_aac_variants_return_aac(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """All AAC codec variants are mapped correctly (MP4 container or plain AAC)."""
+ # Plain AAC variants
+ assert streaming_manager._get_content_type("aac") == (ContentType.AAC, ContentType.UNKNOWN)
+ assert streaming_manager._get_content_type("AAC") == (ContentType.AAC, ContentType.UNKNOWN)
+ assert streaming_manager._get_content_type("he-aac") == (ContentType.AAC, ContentType.UNKNOWN)
+ assert streaming_manager._get_content_type("HE-AAC") == (ContentType.AAC, ContentType.UNKNOWN)
+ # MP4 container variants
+ assert streaming_manager._get_content_type("aac-mp4") == (ContentType.MP4, ContentType.AAC)
+ assert streaming_manager._get_content_type("AAC-MP4") == (ContentType.MP4, ContentType.AAC)
+ assert streaming_manager._get_content_type("he-aac-mp4") == (ContentType.MP4, ContentType.AAC)
+ assert streaming_manager._get_content_type("HE-AAC-MP4") == (ContentType.MP4, ContentType.AAC)
+
+
+# --- Efficient quality tests ---
+
+
+def test_select_best_quality_efficient_prefers_lowest_aac(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """Efficient quality prefers lowest bitrate AAC over higher bitrate options."""
+ mp3_320 = _make_download_info("mp3", 320)
+ aac_64 = _make_download_info("aac", 64)
+ aac_192 = _make_download_info("aac", 192)
+
+ result = streaming_manager._select_best_quality([mp3_320, aac_64, aac_192], QUALITY_EFFICIENT)
+
+ assert result is not None
+ assert result.codec == "aac"
+ assert result.bitrate_in_kbps == 64
+
+
+def test_select_best_quality_efficient_aac_mp4_variant(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """Efficient quality recognizes aac-mp4 container variant."""
+ mp3_320 = _make_download_info("mp3", 320)
+ aac_mp4_64 = _make_download_info("aac-mp4", 64)
+
+ result = streaming_manager._select_best_quality([mp3_320, aac_mp4_64], QUALITY_EFFICIENT)
+
+ assert result is not None
+ assert result.codec == "aac-mp4"
+ assert result.bitrate_in_kbps == 64
+
+
+def test_select_best_quality_efficient_fallback_to_mp3(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+    """Efficient quality falls back to MP3 when no AAC is available."""
+ mp3_128 = _make_download_info("mp3", 128)
+ flac = _make_download_info("flac", 0)
+
+ result = streaming_manager._select_best_quality([mp3_128, flac], QUALITY_EFFICIENT)
+
+ assert result is not None
+ assert result.codec == "mp3"
+
+
+def test_select_best_quality_efficient_fallback_to_lowest(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+    """Efficient quality falls back to the lowest bitrate when neither AAC nor MP3 is available."""
+ flac = _make_download_info("flac", 1411)
+
+ result = streaming_manager._select_best_quality([flac], QUALITY_EFFICIENT)
+
+ assert result is not None
+ assert result.codec == "flac"
+
+
+# --- High quality tests ---
+
+
+def test_select_best_quality_high_prefers_mp3_320(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """High quality prefers MP3 with bitrate >= 256kbps."""
+ mp3_320 = _make_download_info("mp3", 320)
+ mp3_128 = _make_download_info("mp3", 128)
+ aac_192 = _make_download_info("aac", 192)
+ flac = _make_download_info("flac", 1411)
+
+ result = streaming_manager._select_best_quality([mp3_320, mp3_128, aac_192, flac], QUALITY_HIGH)
+
+ assert result is not None
+ assert result.codec == "mp3"
+ assert result.bitrate_in_kbps == 320
+
+
+def test_select_best_quality_high_fallback_to_any_mp3(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+    """High quality falls back to any MP3 when no high-bitrate MP3 is available."""
+ mp3_128 = _make_download_info("mp3", 128)
+ aac_192 = _make_download_info("aac", 192)
+
+ result = streaming_manager._select_best_quality([mp3_128, aac_192], QUALITY_HIGH)
+
+ assert result is not None
+ assert result.codec == "mp3"
+ assert result.bitrate_in_kbps == 128
+
+
+def test_select_best_quality_high_no_mp3_uses_non_flac(
streaming_manager: YandexMusicStreamingManager,
) -> None:
- """flac-mp4 codec from get-file-info is mapped to ContentType.FLAC."""
- assert streaming_manager._get_content_type("flac-mp4") == ContentType.FLAC
- assert streaming_manager._get_content_type("FLAC-MP4") == ContentType.FLAC
+    """High quality uses the highest-bitrate non-FLAC option when no MP3 is available."""
+ aac_192 = _make_download_info("aac", 192)
+ flac = _make_download_info("flac", 1411)
+
+ result = streaming_manager._select_best_quality([aac_192, flac], QUALITY_HIGH)
+
+ assert result is not None
+ assert result.codec == "aac"
+ assert result.bitrate_in_kbps == 192
+
+
+def test_select_best_quality_high_only_flac_returns_flac(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+    """High quality returns FLAC as a last resort when nothing else is available."""
+ flac = _make_download_info("flac", 1411)
+
+ result = streaming_manager._select_best_quality([flac], QUALITY_HIGH)
+
+ assert result is not None
+ assert result.codec == "flac"
+
+
+# --- Audio params tests ---
+
+
+def test_get_audio_params_flac_mp4(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """flac-mp4 returns 48kHz/24bit."""
+ assert streaming_manager._get_audio_params("flac-mp4") == (48000, 24)
+
+
+def test_get_audio_params_flac_mp4_case_insensitive(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """flac-mp4 matching is case-insensitive."""
+ assert streaming_manager._get_audio_params("FLAC-MP4") == (48000, 24)
+
+
+def test_get_audio_params_flac(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """Plain FLAC returns CD-quality defaults."""
+ assert streaming_manager._get_audio_params("flac") == (44100, 16)
+
+
+def test_get_audio_params_mp3(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """MP3 returns CD-quality defaults."""
+ assert streaming_manager._get_audio_params("mp3") == (44100, 16)
+
+
+def test_get_audio_params_none(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """None codec returns CD-quality defaults."""
+ assert streaming_manager._get_audio_params(None) == (44100, 16)
+
+
+# --- get_audio_stream tests ---
+
+
+def _make_encrypted_stream_details(
+ key_hex: str,
+ url: str = "https://example.com/encrypted.flac",
+) -> StreamDetails:
+ """Build StreamDetails for encrypted FLAC stream tests."""
+ return StreamDetails(
+ item_id="test_track_123",
+ provider="yandex_music_instance",
+ audio_format=AudioFormat(content_type=ContentType.MP4),
+ stream_type=StreamType.CUSTOM,
+ data={
+ "encrypted_url": url,
+ "decryption_key": key_hex,
+ "codec": "flac-mp4",
+ },
+ )
+
+
+class _MockContent:
+ """Async iterable content for mock HTTP responses."""
+
+ def __init__(self, chunks: list[bytes], *, drop_payload_error: bool = False) -> None:
+ self._chunks = chunks
+ self._drop = drop_payload_error
+
+ async def iter_chunked(self, size: int) -> Any:
+ for chunk in self._chunks:
+ yield chunk
+ if self._drop:
+ raise ClientPayloadError("connection reset by peer")
+
+
+class _MockResponse:
+ """Fake aiohttp ClientResponse for streaming tests."""
+
+ def __init__(
+ self,
+ chunks: list[bytes],
+ *,
+ error: Exception | None = None,
+ drop_payload_error: bool = False,
+ ) -> None:
+ self.content = _MockContent(chunks, drop_payload_error=drop_payload_error)
+ self._error = error
+
+ def raise_for_status(self) -> None:
+        """Raise the stored error if set, simulating a non-2xx HTTP response."""
+ if self._error is not None:
+ raise self._error
+
+ async def __aenter__(self) -> _MockResponse:
+ return self
+
+ async def __aexit__(self, *args: object) -> None:
+ pass
+
+
+class _MockHttpSession:
+ """Fake aiohttp ClientSession for streaming tests."""
+
+ def __init__(self, response: _MockResponse) -> None:
+ self._response = response
+
+ def get(self, url: str, **kwargs: object) -> _MockResponse:
+ return self._response
+
+
+class _MultiCallHttpSession:
+ """Fake aiohttp ClientSession returning successive responses and recording calls."""
+
+ def __init__(self, responses: list[_MockResponse]) -> None:
+ self._responses = responses
+ self.calls: list[dict[str, Any]] = []
+
+ def get(self, url: str, **kwargs: object) -> _MockResponse:
+ self.calls.append({"url": url, "headers": kwargs.get("headers", {})})
+ return self._responses[len(self.calls) - 1]
+
+
+async def test_get_audio_stream_invalid_key_length(
+ streaming_manager: YandexMusicStreamingManager,
+) -> None:
+ """Invalid AES key length raises MediaNotFoundError before any HTTP request."""
+ sd = _make_encrypted_stream_details("deadbeef") # 4 bytes — invalid
+
+ with pytest.raises(MediaNotFoundError, match="Unsupported AES key length"):
+ async for _ in streaming_manager.get_audio_stream(sd):
+ pass
+
+
+async def test_get_audio_stream_http_error_raises_media_not_found(
+ streaming_manager: YandexMusicStreamingManager,
+ streaming_provider_stub: StreamingProviderStub,
+) -> None:
+ """HTTP error from encrypted URL is converted to MediaNotFoundError."""
+ key = b"\x00" * 32
+ sd = _make_encrypted_stream_details(key.hex())
+ streaming_provider_stub.mass.http_session = _MockHttpSession(
+ _MockResponse([], error=RuntimeError("403 Forbidden"))
+ )
+
+ with pytest.raises(MediaNotFoundError, match="Failed to fetch encrypted stream"):
+ async for _ in streaming_manager.get_audio_stream(sd):
+ pass
+
+
+async def test_get_audio_stream_decrypts_aes_ctr_correctly(
+ streaming_manager: YandexMusicStreamingManager,
+ streaming_provider_stub: StreamingProviderStub,
+) -> None:
+ """Encrypted stream is decrypted correctly with AES-256-CTR and zero IV."""
+ key = b"\x42" * 32
+ plaintext = b"Hello, Yandex Music FLAC data!\n" * 50
+
+ # Encrypt with the same algorithm used in get_audio_stream
+ nonce_16 = bytes(16)
+ encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce_16)).encryptor()
+ ciphertext = encryptor.update(plaintext) + encryptor.finalize()
+
+ sd = _make_encrypted_stream_details(key.hex())
+ streaming_provider_stub.mass.http_session = _MockHttpSession(_MockResponse([ciphertext]))
+
+ result = b""
+ async for chunk in streaming_manager.get_audio_stream(sd):
+ result += chunk
+
+ assert result == plaintext
+
+
+async def test_get_audio_stream_reconnects_with_range_header(
+ streaming_manager: YandexMusicStreamingManager,
+ streaming_provider_stub: StreamingProviderStub,
+) -> None:
+    """ClientPayloadError triggers a reconnect with the correct Range header; full plaintext is restored."""
+ key = b"\x11" * 32
+ # 96 bytes = 6 AES-CTR blocks; split at byte 48 (block boundary)
+ plaintext = b"AAAAAAAAAAAAAAAA" * 3 + b"BBBBBBBBBBBBBBBB" * 3
+
+ nonce_16 = bytes(16)
+ encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce_16)).encryptor()
+ ciphertext = encryptor.update(plaintext) + encryptor.finalize()
+
+ drop_at = 48 # exactly 3 blocks — clean block boundary
+
+ # First request drops after 48 bytes; second serves the remainder
+ first_resp = _MockResponse([ciphertext[:drop_at]], drop_payload_error=True)
+ second_resp = _MockResponse([ciphertext[drop_at:]])
+ session = _MultiCallHttpSession([first_resp, second_resp])
+ streaming_provider_stub.mass.http_session = session
+
+ result = b""
+ with unittest.mock.patch("asyncio.sleep"):
+ async for chunk in streaming_manager.get_audio_stream(
+ _make_encrypted_stream_details(key.hex())
+ ):
+ result += chunk
+
+ assert result == plaintext
+ assert len(session.calls) == 2
+ assert session.calls[0].get("headers") == {}
+ assert session.calls[1]["headers"] == {"Range": f"bytes={drop_at}-"}