Compare commits

39 commits

Author | SHA1 | Message | Date
GitHub Actions Bot | e0081a68c1 | update M3U8 | 2025-12-18 16:01:29 -05:00
GitHub Actions Bot | e8dd5feb9f | health log | 2025-12-18 20:40:35 +00:00
GitHub Actions Bot | 03b1100dcd | update M3U8 | 2025-12-18 15:31:11 -05:00
GitHub Actions Bot | 5a18ed630c | update M3U8 | 2025-12-18 15:03:24 -05:00
GitHub Actions Bot | 0dd92f7826 | update M3U8 | 2025-12-18 14:32:02 -05:00
GitHub Actions Bot | eb5fb917cb | update M3U8 | 2025-12-18 14:01:46 -05:00
GitHub Actions Bot | 557b828be3 | update EPG | 2025-12-18 18:59:49 +00:00
GitHub Actions Bot | 002260ca7b | update M3U8 | 2025-12-18 13:31:05 -05:00
GitHub Actions Bot | 1a3850fc04 | update M3U8 | 2025-12-18 13:00:56 -05:00
doms9 | 00000d9f4f | e; edit watchfooty mirrors; misc. edits | 2025-12-18 12:51:16 -05:00
GitHub Actions Bot | c100d6fc1d | update M3U8 | 2025-12-18 12:01:06 -05:00
GitHub Actions Bot | 39cf34aa3a | update M3U8 | 2025-12-18 11:01:45 -05:00
GitHub Actions Bot | 74042efdc8 | update M3U8 | 2025-12-18 10:01:07 -05:00
GitHub Actions Bot | 9e616e092b | health log | 2025-12-18 14:46:42 +00:00
GitHub Actions Bot | 1fa4b0b04f | update M3U8 | 2025-12-18 09:01:00 -05:00
GitHub Actions Bot | f3e1188d05 | update M3U8 | 2025-12-18 08:01:05 -05:00
GitHub Actions Bot | 60b4883907 | update EPG | 2025-12-18 10:54:24 +00:00
doms9 | 00000d939c | e | 2025-12-18 04:14:54 -05:00
GitHub Actions Bot | 8b2d8cc1fc | health log | 2025-12-18 08:51:02 +00:00
doms9 | 00000d9079 | e; misc. edits | 2025-12-18 03:04:11 -05:00
GitHub Actions Bot | a8ead389ea | update M3U8 | 2025-12-17 23:30:57 -05:00
GitHub Actions Bot | 1ee2b1f9d7 | health log | 2025-12-17 23:25:37 -05:00
GitHub Actions Bot | 16253b850c | update M3U8 | 2025-12-17 23:00:38 -05:00
GitHub Actions Bot | aa6dd05475 | update EPG | 2025-12-18 03:45:53 +00:00
GitHub Actions Bot | 00d12bbfa1 | health log | 2025-12-18 03:42:46 +00:00
GitHub Actions Bot | d18e200189 | update M3U8 | 2025-12-17 22:31:16 -05:00
GitHub Actions Bot | b582e64e2b | update M3U8 | 2025-12-17 22:00:39 -05:00
GitHub Actions Bot | 5a602917cb | update M3U8 | 2025-12-17 21:30:55 -05:00
GitHub Actions Bot | a654fe07b7 | update M3U8 | 2025-12-17 21:01:21 -05:00
doms9 | 00000d91a5 | e; add istreameast.py | 2025-12-17 20:57:35 -05:00
GitHub Actions Bot | 5aa6a95236 | update M3U8 | 2025-12-17 20:31:13 -05:00
GitHub Actions Bot | 949a0526d2 | update M3U8 | 2025-12-17 20:01:23 -05:00
GitHub Actions Bot | 40f122a374 | update M3U8 | 2025-12-17 19:32:05 -05:00
GitHub Actions Bot | 856a4f4c8c | update M3U8 | 2025-12-17 19:02:40 -05:00
GitHub Actions Bot | b3e3092a79 | update M3U8 | 2025-12-17 18:30:28 -05:00
GitHub Actions Bot | 78b3152664 | update M3U8 | 2025-12-17 18:00:53 -05:00
GitHub Actions Bot | 19e5db2bf2 | update M3U8 | 2025-12-17 17:30:23 -05:00
GitHub Actions Bot | e2a93db8f6 | update M3U8 | 2025-12-17 17:01:02 -05:00
GitHub Actions Bot | f5091a2d7a | update M3U8 | 2025-12-17 16:30:29 -05:00
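
The bot commits above follow a fixed cadence: "update M3U8" lands roughly every half hour, with periodic "update EPG" and "health log" commits in between, the signature of a scheduled CI job rather than hand edits. As a rough illustration only, here is a sketch of the kind of script such a job might run; the upstream URL, file path, and commit mechanics are hypothetical, not taken from this repository.

```python
# Hypothetical sketch of a scheduled job that would produce the
# "update M3U8" commits above: refresh the playlist, then commit it
# only if it changed. The source URL and output path are assumptions.
import subprocess
import urllib.request

PLAYLIST_URL = "https://example.com/upstream.m3u8"  # assumed upstream source
OUTPUT_FILE = "M3U8/TV.m3u8"                        # assumed output path

def refresh_playlist() -> None:
    # Fetch the upstream playlist and rewrite the tracked copy.
    with urllib.request.urlopen(PLAYLIST_URL, timeout=30) as resp:
        data = resp.read()
    with open(OUTPUT_FILE, "wb") as fh:
        fh.write(data)

def commit_if_changed() -> None:
    # Mirrors what a CI bot typically does: stage the file and commit
    # only when `git diff --cached --quiet` reports a staged change.
    subprocess.run(["git", "add", OUTPUT_FILE], check=True)
    staged = subprocess.run(["git", "diff", "--cached", "--quiet"])
    if staged.returncode != 0:  # non-zero exit: staged changes exist
        subprocess.run(["git", "commit", "-m", "update M3U8"], check=True)

if __name__ == "__main__":
    refresh_playlist()
    commit_if_changed()
```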
27 changed files with 89699 additions and 93952 deletions

EPG/TV.xml (177617 changes)

File diff suppressed because one or more lines are too long
File diff suppressed because it is too large
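
The playlist diff below follows the M3U8 convention visible in every hunk: a #EXTINF:-1 metadata line (channel number, tvg-id, logo, group, display name) immediately followed by one stream URL line. A minimal sketch of reading those pairs back into records; the regexes and the sample entry are illustrative assumptions, not code from this repo.

```python
# Minimal sketch of how the playlist lines in the diff below pair up:
# each '#EXTINF:-1 key="value" ...,Name' line carries the metadata for
# the stream URL on the following line.
import re

EXTINF_RE = re.compile(r'#EXTINF:(-?\d+)\s+(.*),(.*)$')
ATTR_RE = re.compile(r'([\w-]+)="([^"]*)"')

def parse_m3u8(text: str) -> list[dict]:
    channels = []
    pending = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF"):
            m = EXTINF_RE.match(line)
            if m:
                pending = {
                    "name": m.group(3),
                    "attrs": dict(ATTR_RE.findall(m.group(2))),
                }
        elif line and not line.startswith("#") and pending:
            pending["url"] = line  # the URL line completes the entry
            channels.append(pending)
            pending = None
    return channels

sample = '''#EXTINF:-1 tvg-chno="42" tvg-name="ESPN" group-title="TV",ESPN
http://cord-cutter.net:8080/30550113/30550113/14197'''
print(parse_m3u8(sample))
```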

@@ -7,16 +7,16 @@ http://fl1.moveonjoy.com/ANE/index.m3u8
 https://fl1.moveonjoy.com/FL_Tampa_ABC/index.m3u8
 #EXTINF:-1 tvg-chno="3" tvg-id="ACC.Network.us2" tvg-name="ACC Network" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s111871_dark_360w_270h.png" group-title="TV",ACC Network
-http://tv14s.xyz:8080/A1Jay5/362586/9273
+http://cord-cutter.net:8080/30550113/30550113/9273
 #EXTINF:-1 tvg-chno="4" tvg-id="AdultSwim.com.Cartoon.Network.us2" tvg-name="Adult Swim" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16496_dark_360w_270h.png" group-title="TV",Adult Swim
 https://turnerlive.warnermediacdn.com/hls/live/2023183/aseast/noslate/VIDEO_1_5128000.m3u8
 #EXTINF:-1 tvg-chno="5" tvg-id="Altitude.Sports.us2" tvg-name="Altitude Sports" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s44263_dark_360w_270h.png" group-title="TV",Altitude Sports
-http://tv14s.xyz:8080/A1Jay5/362586/79545
+http://cord-cutter.net:8080/30550113/30550113/79545
 #EXTINF:-1 tvg-chno="6" tvg-id="AMC.HD.us2" tvg-name="AMC" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10021_dark_360w_270h.png" group-title="TV",AMC
-http://tv14s.xyz:8080/A1Jay5/362586/18925
+http://cord-cutter.net:8080/30550113/30550113/18925
 #EXTINF:-1 tvg-chno="7" tvg-id="Animal.Planet.HD.us2" tvg-name="Animal Planet" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16331_dark_360w_270h.png" group-title="TV",Animal Planet
 http://fl1.moveonjoy.com/Animal_Planet/index.m3u8
@@ -28,25 +28,25 @@ http://fl1.moveonjoy.com/Aspire/index.m3u8
 http://stalker.klma2023.net/play/live.php?mac=00:1B:79:F8:59:0E&stream=1163984&extension=ts
 #EXTINF:-1 tvg-chno="10" tvg-id="BBC.America.HD.us2" tvg-name="BBC America" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s64492_dark_360w_270h.png" group-title="TV",BBC America
-http://tv14s.xyz:8080/A1Jay5/362586/20194
+http://cord-cutter.net:8080/30550113/30550113/20194
 #EXTINF:-1 tvg-chno="11" tvg-id="BBC.News.(North.America).HD.us2" tvg-name="BBC World News" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s89542_dark_360w_270h.png" group-title="TV",BBC World News
-http://tv14s.xyz:8080/A1Jay5/362586/139752
+http://cord-cutter.net:8080/30550113/30550113/139752
 #EXTINF:-1 tvg-chno="12" tvg-id="BET.HD.us2" tvg-name="BET" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10051_dark_360w_270h.png" group-title="TV",BET
 http://fl1.moveonjoy.com/BET_EAST/index.m3u8
 #EXTINF:-1 tvg-chno="13" tvg-id="Big.Ten.Network.HD.us2" tvg-name="Big Ten Network" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s56783_dark_360w_270h.png" group-title="TV",Big Ten Network
-http://tv14s.xyz:8080/A1Jay5/362586/9828
+http://cord-cutter.net:8080/30550113/30550113/9828
 #EXTINF:-1 tvg-chno="14" tvg-id="Bloomberg.HD.us2" tvg-name="Bloomberg TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s71799_dark_360w_270h.png" group-title="TV",Bloomberg TV
-http://tv14s.xyz:8080/A1Jay5/362586/15158
+http://cord-cutter.net:8080/30550113/30550113/15158
 #EXTINF:-1 tvg-chno="15" tvg-id="Boomerang.us2" tvg-name="Boomerang" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s21883_dark_360w_270h.png" group-title="TV",Boomerang
-http://tv14s.xyz:8080/A1Jay5/362586/14741
+http://cord-cutter.net:8080/30550113/30550113/14741
 #EXTINF:-1 tvg-chno="16" tvg-id="Bounce.TV.us2" tvg-name="Bounce TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s73067_dark_360w_270h.png" group-title="TV",Bounce TV
-http://tv14s.xyz:8080/A1Jay5/362586/48323
+http://cord-cutter.net:8080/30550113/30550113/48323
 #EXTINF:-1 tvg-chno="17" tvg-id="Bravo.HD.us2" tvg-name="Bravo TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10057_dark_360w_270h.png" group-title="TV",Bravo TV
 http://fl1.moveonjoy.com/BRAVO/index.m3u8
@@ -58,16 +58,16 @@ https://buzzrota-web.amagi.tv/playlist.m3u8
 http://fl1.moveonjoy.com/C-SPAN/index.m3u8
 #EXTINF:-1 tvg-chno="20" tvg-id="Cartoon.Network.HD.us2" tvg-name="Cartoon Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s12131_dark_360w_270h.png" group-title="TV",Cartoon Network
-http://tv14s.xyz:8080/A1Jay5/362586/46708
+http://cord-cutter.net:8080/30550113/30550113/46708
 #EXTINF:-1 tvg-chno="21" tvg-id="WCBS-DT.us_locals1" tvg-name="CBS" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10098_dark_360w_270h.png" group-title="TV",CBS
-http://tv14s.xyz:8080/A1Jay5/362586/120749
+http://cord-cutter.net:8080/30550113/30550113/120749
 #EXTINF:-1 tvg-chno="22" tvg-id="plex.tv.CBS.Sports.Golazo.Network.plex" tvg-name="CBS Sports Golazo Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s133691_dark_360w_270h.png" group-title="TV",CBS Sports Golazo Network
 https://dai.google.com/linear/hls/event/GxrCGmwST0ixsrc_QgB6qw/master.m3u8
 #EXTINF:-1 tvg-chno="23" tvg-id="CBS.Sports.Network.HD.us2" tvg-name="CBS Sports Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16365_dark_360w_270h.png" group-title="TV",CBS Sports Network
-http://tv14s.xyz:8080/A1Jay5/362586/10454
+http://cord-cutter.net:8080/30550113/30550113/10454
 #EXTINF:-1 tvg-chno="24" tvg-id="CMT.HD.us2" tvg-name="CMT" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10138_dark_360w_270h.png" group-title="TV",CMT
 https://fl1.moveonjoy.com/CMT/index.m3u8
@@ -79,34 +79,34 @@ https://fl1.moveonjoy.com/CNBC/index.m3u8
 https://turnerlive.warnermediacdn.com/hls/live/586495/cnngo/cnn_slate/VIDEO_0_3564000.m3u8
 #EXTINF:-1 tvg-chno="27" tvg-id="Comedy.Central.HD.us2" tvg-name="Comedy Central" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10149_dark_360w_270h.png" group-title="TV",Comedy Central
-http://tv14s.xyz:8080/A1Jay5/362586/7466
+http://cord-cutter.net:8080/30550113/30550113/7466
 #EXTINF:-1 tvg-chno="28" tvg-id="Comedy.TV.HD.us2" tvg-name="Comedy TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s82470_dark_360w_270h.png" group-title="TV",Comedy TV
 https://fl1.moveonjoy.com/Comedy_TV/index.m3u8
 #EXTINF:-1 tvg-chno="29" tvg-id="Comet.us2" tvg-name="Comet TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s97051_dark_360w_270h.png" group-title="TV",Comet TV
-http://tv14s.xyz:8080/A1Jay5/362586/125831
+http://cord-cutter.net:8080/30550113/30550113/125831
 #EXTINF:-1 tvg-chno="30" tvg-id="Cooking.Channel.HD.us2" tvg-name="Cooking Channel" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s30156_dark_360w_270h.png" group-title="TV",Cooking Channel
 https://fl1.moveonjoy.com/COOKING_CHANNEL/index.m3u8
 #EXTINF:-1 tvg-chno="31" tvg-id="Court.TV.us2" tvg-name="Court TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s117160_dark_360w_270h.png" group-title="TV",Court TV
-http://tv14s.xyz:8080/A1Jay5/362586/21092
+http://cord-cutter.net:8080/30550113/30550113/21092
 #EXTINF:-1 tvg-chno="32" tvg-id="COZI.TV.us2" tvg-name="Cozi TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s78851_dark_360w_270h.png" group-title="TV",Cozi TV
-http://tv14s.xyz:8080/A1Jay5/362586/11868
+http://cord-cutter.net:8080/30550113/30550113/11868
 #EXTINF:-1 tvg-chno="33" tvg-id="Crime.and.Investigation.Network.HD.us2" tvg-name="Crime & Investigation Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s61469_dark_360w_270h.png" group-title="TV",Crime & Investigation Network
 https://fl1.moveonjoy.com/Crime_and_Investigation_Network/index.m3u8
 #EXTINF:-1 tvg-chno="34" tvg-id="WKCF-DT.us_locals1" tvg-name="CW" tvg-logo="https://i.gyazo.com/afd5b481b327d204087dfde6a7741f9d.png" group-title="TV",CW
-http://tv14s.xyz:8080/A1Jay5/362586/120893
+http://cord-cutter.net:8080/30550113/30550113/120893
 #EXTINF:-1 tvg-chno="35" tvg-id="Discovery.Channel.HD.us2" tvg-name="Discovery Channel" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11150_dark_360w_270h.png" group-title="TV",Discovery Channel
-http://tv14s.xyz:8080/A1Jay5/362586/46720
+http://cord-cutter.net:8080/30550113/30550113/46720
 #EXTINF:-1 tvg-chno="36" tvg-id="Discovery.Family.Channel.HD.us2" tvg-name="Discovery Family Channel" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16618_dark_360w_270h.png" group-title="TV",Discovery Family Channel
-http://tv14s.xyz:8080/A1Jay5/362586/10538
+http://cord-cutter.net:8080/30550113/30550113/10538
 #EXTINF:-1 tvg-chno="37" tvg-id="Discovery.Life.Channel.us2" tvg-name="Discovery Life" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16125_dark_360w_270h.png" group-title="TV",Discovery Life
 https://fl1.moveonjoy.com/DISCOVERY_LIFE/index.m3u8
@@ -115,106 +115,106 @@ https://fl1.moveonjoy.com/DISCOVERY_LIFE/index.m3u8
 https://fl1.moveonjoy.com/Discovery_Science/index.m3u8
 #EXTINF:-1 tvg-chno="39" tvg-id="Disney.Channel.HD.us2" tvg-name="Disney" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10171_dark_360w_270h.png" group-title="TV",Disney
-http://tv14s.xyz:8080/A1Jay5/362586/2206
+http://cord-cutter.net:8080/30550113/30550113/2206
 #EXTINF:-1 tvg-chno="40" tvg-id="Disney.XD.HD.us2" tvg-name="Disney XD" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s18279_dark_360w_270h.png" group-title="TV",Disney XD
-http://tv14s.xyz:8080/A1Jay5/362586/75621
+http://cord-cutter.net:8080/30550113/30550113/75621
 #EXTINF:-1 tvg-chno="41" tvg-id="E!.Entertainment.Television.HD.us2" tvg-name="E! Entertainment" tvg-logo="https://i.gyazo.com/f73b80e3eb56cec06df6705d00e2f422.png" group-title="TV",E! Entertainment
 http://fl1.moveonjoy.com/E_ENTERTAINMENT_TELEVISION/index.m3u8
 #EXTINF:-1 tvg-chno="42" tvg-id="ESPN.HD.us2" tvg-name="ESPN" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10179_dark_360w_270h.png" group-title="TV",ESPN
-http://tv14s.xyz:8080/A1Jay5/362586/14197
+http://cord-cutter.net:8080/30550113/30550113/14197
 #EXTINF:-1 tvg-chno="43" tvg-id="ESPNEWS.HD.us2" tvg-name="ESPN News" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16485_dark_360w_270h.png" group-title="TV",ESPN News
-http://tv14s.xyz:8080/A1Jay5/362586/17707
+http://cord-cutter.net:8080/30550113/30550113/17707
 #EXTINF:-1 tvg-chno="44" tvg-id="ESPNU.HD.us2" tvg-name="ESPN U" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s45654_dark_360w_270h.png" group-title="TV",ESPN U
-http://tv14s.xyz:8080/A1Jay5/362586/10255
+http://cord-cutter.net:8080/30550113/30550113/10255
 #EXTINF:-1 tvg-chno="45" tvg-id="ESPN2.HD.us2" tvg-name="ESPN2" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s12444_dark_360w_270h.png" group-title="TV",ESPN2
-http://tv14s.xyz:8080/A1Jay5/362586/2210
+http://cord-cutter.net:8080/30550113/30550113/2210
 #EXTINF:-1 tvg-chno="46" tvg-id="FanDuel.Sports.Network.Detroit.24/7.HDTV.us" tvg-name="FDSN Detroit" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s54286_dark_360w_270h.png" group-title="TV",FDSN Detroit
-http://tv14s.xyz:8080/A1Jay5/362586/20936
+http://cord-cutter.net:8080/30550113/30550113/20936
 #EXTINF:-1 tvg-chno="47" tvg-id="FanDuel.Sports.Network.Florida.HDTV.(Out.of.Market).us" tvg-name="FDSN Florida" tvg-logo="https://i.gyazo.com/fad701fbaaafe161b13b23ed9b50179b.png" group-title="TV",FDSN Florida
-http://tv14s.xyz:8080/A1Jay5/362586/46794
+http://cord-cutter.net:8080/30550113/30550113/46794
 #EXTINF:-1 tvg-chno="48" tvg-id="FanDuel.Sports.Network.Midwest.24/7.HDTV.us" tvg-name="FDSN Midwest" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11058_dark_360w_270h.png" group-title="TV",FDSN Midwest
-http://tv14s.xyz:8080/A1Jay5/362586/66795
+http://cord-cutter.net:8080/30550113/30550113/66795
 #EXTINF:-1 tvg-chno="49" tvg-id="FanDuel.Sports.Network.North.HDTV.us" tvg-name="FDSN North" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10977_dark_360w_270h.png" group-title="TV",FDSN North
-http://tv14s.xyz:8080/A1Jay5/362586/58827
+http://cord-cutter.net:8080/30550113/30550113/58827
 #EXTINF:-1 tvg-chno="50" tvg-id="FanDuel.Sports.Network.Ohio.(Cleveland).HDTV.us" tvg-name="FDSN Ohio" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s49691_dark_360w_270h.png" group-title="TV",FDSN Ohio
-http://tv14s.xyz:8080/A1Jay5/362586/17752
+http://cord-cutter.net:8080/30550113/30550113/17752
 #EXTINF:-1 tvg-chno="51" tvg-id="FanDuel.Sports.Network.Oklahoma.24/7.HDTV.(Tulsa).us" tvg-name="FDSN Oklahoma" tvg-logo="https://i.gyazo.com/80ad6fd142cd67f06eef58d9ce5aa72b.png" group-title="TV",FDSN Oklahoma
-http://tv14s.xyz:8080/A1Jay5/362586/20934
+http://cord-cutter.net:8080/30550113/30550113/20934
 #EXTINF:-1 tvg-chno="52" tvg-id="FanDuel.Sports.Network.SoCal.HDTV.us" tvg-name="FDSN SoCal" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16743_dark_360w_270h.png" group-title="TV",FDSN SoCal
-http://tv14s.xyz:8080/A1Jay5/362586/221151
+http://cord-cutter.net:8080/30550113/30550113/221151
 #EXTINF:-1 tvg-chno="53" tvg-id="FanDuel.Sports.Network.Southeast.HDTV.(Mont./Birm./Dothan/Mobile.AL).us" tvg-name="FDSN Southeast" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s20789_dark_360w_270h.png" group-title="TV",FDSN Southeast
-http://tv14s.xyz:8080/A1Jay5/362586/81111
+http://cord-cutter.net:8080/30550113/30550113/81111
 #EXTINF:-1 tvg-chno="54" tvg-id="FanDuel.Sports.Network.Southwest.HDTV.24/7.(Main).us" tvg-name="FDSN Southwest" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s59629_dark_360w_270h.png" group-title="TV",FDSN Southwest
-http://tv14s.xyz:8080/A1Jay5/362586/21843
+http://cord-cutter.net:8080/30550113/30550113/21843
 #EXTINF:-1 tvg-chno="55" tvg-id="FanDuel.Sports.Network.Sun.South.24/7.HDTV.(South.Marlins,.Rays,.Heat).us" tvg-name="FDSN Sun" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s61084_dark_360w_270h.png" group-title="TV",FDSN Sun
-http://tv14s.xyz:8080/A1Jay5/362586/104917
+http://cord-cutter.net:8080/30550113/30550113/104917
 #EXTINF:-1 tvg-chno="56" tvg-id="FanDuel.Sports.Network.West.HDTV.us" tvg-name="FDSN West" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s59627_dark_360w_270h.png" group-title="TV",FDSN West
-http://tv14s.xyz:8080/A1Jay5/362586/20932
+http://cord-cutter.net:8080/30550113/30550113/20932
 #EXTINF:-1 tvg-chno="57" tvg-id="FanDuel.Sports.Network.Wisconsin.24/7.HDTV.us" tvg-name="FDSN Wisconsin" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16348_dark_360w_270h.png" group-title="TV",FDSN Wisconsin
-http://tv14s.xyz:8080/A1Jay5/362586/78599
+http://cord-cutter.net:8080/30550113/30550113/78599
 #EXTINF:-1 tvg-chno="58" tvg-id="plex.tv.FIFA+.plex" tvg-name="FIFA+ TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s136235_dark_360w_270h.png" group-title="TV",FIFA+ TV
 https://jmp2.uk/stvp-IN270000230
 #EXTINF:-1 tvg-chno="59" tvg-id="Food.Network.HD.us2" tvg-name="Food Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s44718_dark_360w_270h.png" group-title="TV",Food Network
-http://tv14s.xyz:8080/A1Jay5/362586/7323
+http://cord-cutter.net:8080/30550113/30550113/7323
 #EXTINF:-1 tvg-chno="60" tvg-id="WFLX-DT.us_locals1" tvg-name="Fox" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s28719_dark_360w_270h.png" group-title="TV",Fox
-http://tv14s.xyz:8080/A1Jay5/362586/121595
+http://cord-cutter.net:8080/30550113/30550113/121595
 #EXTINF:-1 tvg-chno="61" tvg-id="Fox.Business.HD.us2" tvg-name="Fox Business" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s58649_dark_360w_270h.png" group-title="TV",Fox Business
-http://tv14s.xyz:8080/A1Jay5/362586/17639
+http://cord-cutter.net:8080/30550113/30550113/17639
 #EXTINF:-1 tvg-chno="62" tvg-id="Fox.News.Channel.HD.us2" tvg-name="Fox News" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16374_dark_360w_270h.png" group-title="TV",Fox News
-http://tv14s.xyz:8080/A1Jay5/362586/1818
+http://cord-cutter.net:8080/30550113/30550113/1818
 #EXTINF:-1 tvg-chno="63" tvg-id="FS1.Fox.Sports.1.HD.us2" tvg-name="Fox Sports 1" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s82541_dark_360w_270h.png" group-title="TV",Fox Sports 1
-http://tv14s.xyz:8080/A1Jay5/362586/1846
+http://cord-cutter.net:8080/30550113/30550113/1846
 #EXTINF:-1 tvg-chno="64" tvg-id="FS2.Fox.Sports.2.HD.us2" tvg-name="Fox Sports 2" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s33178_dark_360w_270h.png" group-title="TV",Fox Sports 2
-http://tv14s.xyz:8080/A1Jay5/362586/1847
+http://cord-cutter.net:8080/30550113/30550113/1847
 #EXTINF:-1 tvg-chno="65" tvg-id="Freeform.HD.us2" tvg-name="Freeform TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10093_dark_360w_270h.png" group-title="TV",Freeform TV
-http://tv14s.xyz:8080/A1Jay5/362586/13370
+http://cord-cutter.net:8080/30550113/30550113/13370
 #EXTINF:-1 tvg-chno="66" tvg-id="Fuse.HD.us2" tvg-name="FUSE" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s14929_dark_360w_270h.png" group-title="TV",FUSE
 http://fl1.moveonjoy.com/FUSE/index.m3u8
 #EXTINF:-1 tvg-chno="67" tvg-id="FX.HD.us2" tvg-name="FX" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s14321_dark_360w_270h.png" group-title="TV",FX
-http://tv14s.xyz:8080/A1Jay5/362586/46690
+http://cord-cutter.net:8080/30550113/30550113/46690
 #EXTINF:-1 tvg-chno="68" tvg-id="FX.Movie.Channel.HD.us2" tvg-name="FX Movie Channel" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s70253_dark_360w_270h.png" group-title="TV",FX Movie Channel
 http://fl1.moveonjoy.com/FX_MOVIE/index.m3u8
 #EXTINF:-1 tvg-chno="69" tvg-id="FXX.HD.us2" tvg-name="FXX" tvg-logo="https://raw.githubusercontent.com/tv-logo/tv-logos/refs/heads/main/countries/united-states/fxx-us.png" group-title="TV",FXX
-http://tv14s.xyz:8080/A1Jay5/362586/46699
+http://cord-cutter.net:8080/30550113/30550113/46699
 #EXTINF:-1 tvg-chno="70" tvg-id="FYI.Channel.HD.us2" tvg-name="FYI TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16834_dark_360w_270h.png" group-title="TV",FYI TV
 http://fl1.moveonjoy.com/FYI/index.m3u8
 #EXTINF:-1 tvg-chno="71" tvg-id="Game.Show.Network.HD.us2" tvg-name="Game Show Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s14909_dark_360w_270h.png" group-title="TV",Game Show Network
-http://tv14s.xyz:8080/A1Jay5/362586/120633
+http://cord-cutter.net:8080/30550113/30550113/120633
 #EXTINF:-1 tvg-chno="72" tvg-id="get.us2" tvg-name="getTV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s82563_dark_360w_270h.png" group-title="TV",getTV
-http://tv14s.xyz:8080/A1Jay5/362586/18366
+http://cord-cutter.net:8080/30550113/30550113/18366
 #EXTINF:-1 tvg-chno="73" tvg-id="Golf.Channel.HD.us2" tvg-name="Golf Channel" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s14899_dark_360w_270h.png" group-title="TV",Golf Channel
 https://fl1.moveonjoy.com/GOLF/index.m3u8
@@ -232,7 +232,7 @@ https://fl1.moveonjoy.com/HALLMARK_DRAMA/index.m3u8
 https://fl1.moveonjoy.com/HALLMARK_MOVIES_MYSTERIES/index.m3u8
 #EXTINF:-1 tvg-chno="78" tvg-id="HBO.East.us2" tvg-name="HBO" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10240_dark_360w_270h.png" group-title="TV",HBO
-http://tv14s.xyz:8080/A1Jay5/362586/46713
+http://cord-cutter.net:8080/30550113/30550113/46713
 #EXTINF:-1 tvg-chno="79" tvg-id="HBO2.HD.us2" tvg-name="HBO 2" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s68140_dark_360w_270h.png" group-title="TV",HBO 2
 http://fl1.moveonjoy.com/HBO_2/index.m3u8
@@ -247,7 +247,7 @@ https://fl1.moveonjoy.com/HBO_FAMILY/index.m3u8
 https://fl1.moveonjoy.com/HBO_ZONE/index.m3u8
 #EXTINF:-1 tvg-chno="83" tvg-id="History.HD.us2" tvg-name="History Channel" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s87679_dark_360w_270h.png" group-title="TV",History Channel
-http://tv14s.xyz:8080/A1Jay5/362586/15017
+http://cord-cutter.net:8080/30550113/30550113/15017
 #EXTINF:-1 tvg-chno="84" tvg-id="HLN.HD.us2" tvg-name="HLN TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10145_dark_360w_270h.png" group-title="TV",HLN TV
 https://turnerlive.warnermediacdn.com/hls/live/586496/cnngo/hln/VIDEO_0_3564000.m3u8
@@ -256,7 +256,7 @@ https://turnerlive.warnermediacdn.com/hls/live/586496/cnngo/hln/VIDEO_0_3564000.
 https://fl1.moveonjoy.com/INVESTIGATION_DISCOVERY/index.m3u8
 #EXTINF:-1 tvg-chno="86" tvg-id="ION.Television.HD.us2" tvg-name="ION TV" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s18633_dark_360w_270h.png" group-title="TV",ION TV
-http://tv14s.xyz:8080/A1Jay5/362586/9297
+http://cord-cutter.net:8080/30550113/30550113/9297
 #EXTINF:-1 tvg-chno="87" tvg-id="Lifetime.HD.us2" tvg-name="Lifetime" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10918_dark_360w_270h.png" group-title="TV",Lifetime
 http://fl1.moveonjoy.com/LIFETIME/index.m3u8
@@ -265,16 +265,16 @@ http://fl1.moveonjoy.com/LIFETIME/index.m3u8
 https://fl1.moveonjoy.com/LIFETIME_MOVIE_NETWORK/index.m3u8
 #EXTINF:-1 tvg-chno="89" tvg-id="Marquee.Sports.Network.HD.us2" tvg-name="Marquee Sports Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s113768_dark_360w_270h.png" group-title="TV",Marquee Sports Network
-http://tv14s.xyz:8080/A1Jay5/362586/13379
+http://cord-cutter.net:8080/30550113/30550113/13379
 #EXTINF:-1 tvg-chno="90" tvg-id="MLB.Network.HD.us2" tvg-name="MLB Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s62081_dark_360w_270h.png" group-title="TV",MLB Network
 https://fl1.moveonjoy.com/MLB_NETWORK/index.m3u8
 #EXTINF:-1 tvg-chno="91" tvg-id="MOTORTREND.HD.us2" tvg-name="MotorTrend TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s31046_dark_360w_270h.png" group-title="TV",MotorTrend TV
-http://tv14s.xyz:8080/A1Jay5/362586/10399
+http://cord-cutter.net:8080/30550113/30550113/10399
 #EXTINF:-1 tvg-chno="92" tvg-id="MSG.National.us2" tvg-name="MSG" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10979_dark_360w_270h.png" group-title="TV",MSG
-http://tv14s.xyz:8080/A1Jay5/362586/21090
+http://cord-cutter.net:8080/30550113/30550113/21090
 #EXTINF:-1 tvg-chno="93" tvg-id="MSNBC.HD.us2" tvg-name="MSNBC" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s16300_dark_360w_270h.png" group-title="TV",MSNBC
 https://fl1.moveonjoy.com/MSNBC/index.m3u8
@@ -292,40 +292,40 @@ http://fl1.moveonjoy.com/NBA_TV/index.m3u8
 https://fl1.moveonjoy.com/FL_Tampa_NBC/index.m3u8
 #EXTINF:-1 tvg-chno="98" tvg-id="NBC.Sports.Bay.Area.HD.us2" tvg-name="NBC Sports Bay Area" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s63138_dark_360w_270h.png" group-title="TV",NBC Sports Bay Area
-http://tv14s.xyz:8080/A1Jay5/362586/9900
+http://cord-cutter.net:8080/30550113/30550113/9900
 #EXTINF:-1 tvg-chno="99" tvg-id="NBC.Sports.Boston.HD.us2" tvg-name="NBC Sports Boston" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s49198_dark_360w_270h.png" group-title="TV",NBC Sports Boston
-http://tv14s.xyz:8080/A1Jay5/362586/20939
+http://cord-cutter.net:8080/30550113/30550113/20939
 #EXTINF:-1 tvg-chno="100" tvg-id="NBC.Sports.California.SAT.us2" tvg-name="NBC Sports California" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s45540_dark_360w_270h.png" group-title="TV",NBC Sports California
-http://tv14s.xyz:8080/A1Jay5/362586/20940
+http://cord-cutter.net:8080/30550113/30550113/20940
 #EXTINF:-1 tvg-chno="101" tvg-id="a90a91570ce0536cbb22b591ad7e0da2" tvg-name="NBC Sports NOW" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s114140_dark_360w_270h.png" group-title="TV",NBC Sports NOW
 https://jmp2.uk/plu-6549306c83595c000815a696.m3u8
 #EXTINF:-1 tvg-chno="102" tvg-id="NBC.Sports.Philadelphia.HD.us2" tvg-name="NBC Sports Philadelphia" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s32571_dark_360w_270h.png" group-title="TV",NBC Sports Philadelphia
-http://tv14s.xyz:8080/A1Jay5/362586/20943
+http://cord-cutter.net:8080/30550113/30550113/20943
 #EXTINF:-1 tvg-chno="103" tvg-id="New.England.Sports.Network.HD.us2" tvg-name="NESN" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s35038_dark_360w_270h.png" group-title="TV",NESN
-http://tv14s.xyz:8080/A1Jay5/362586/31637
+http://cord-cutter.net:8080/30550113/30550113/31637
 #EXTINF:-1 tvg-chno="104" tvg-id="NewsNation.us2" tvg-name="NewsNation" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s91096_dark_360w_270h.png" group-title="TV",NewsNation
-http://tv14s.xyz:8080/A1Jay5/362586/161450
+http://cord-cutter.net:8080/30550113/30550113/161450
 #EXTINF:-1 tvg-chno="105" tvg-id="NFL.Network.HD.us2" tvg-name="NFL Network" tvg-logo="https://raw.githubusercontent.com/tv-logo/tv-logos/refs/heads/main/countries/united-states/nfl-network-hz-us.png" group-title="TV",NFL Network
-http://tv14s.xyz:8080/A1Jay5/362586/159117
+http://cord-cutter.net:8080/30550113/30550113/159117
 #EXTINF:-1 tvg-chno="106" tvg-id="NFL.RedZone.HD.us2" tvg-name="NFL RedZone" tvg-logo="https://raw.githubusercontent.com/tv-logo/tv-logos/refs/heads/main/countries/united-states/nfl-red-zone-hz-us.png" group-title="TV",NFL RedZone
-http://tv14s.xyz:8080/A1Jay5/362586/208830
+http://cord-cutter.net:8080/30550113/30550113/208830
 #EXTINF:-1 tvg-chno="107" tvg-id="NHL.Network.HD.us2" tvg-name="NHL Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s58570_dark_360w_270h.png" group-title="TV",NHL Network
 http://23.237.104.106:8080/USA_NHL_NETWORK/index.m3u8
 #EXTINF:-1 tvg-chno="108" tvg-id="Nickelodeon.HD.us2" tvg-name="Nickelodeon" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11006_dark_360w_270h.png" group-title="TV",Nickelodeon
-http://tv14s.xyz:8080/A1Jay5/362586/38
+http://cord-cutter.net:8080/30550113/30550113/38
 #EXTINF:-1 tvg-chno="109" tvg-id="Nicktoons.us2" tvg-name="Nicktoons" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s30420_dark_360w_270h.png" group-title="TV",Nicktoons
-http://tv14s.xyz:8080/A1Jay5/362586/36
+http://cord-cutter.net:8080/30550113/30550113/36
 #EXTINF:-1 tvg-chno="110" tvg-id="Outdoor.Channel.HD.us2" tvg-name="Outdoor Channel" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s14776_dark_360w_270h.png" group-title="TV",Outdoor Channel
 http://fl1.moveonjoy.com/OUTDOOR_CHANNEL/index.m3u8
@@ -346,13 +346,13 @@ https://fl1.moveonjoy.com/Pop_TV/index.m3u8
 http://c3921155.edmonst.net/iptv/ZQHGFQ9PRYN859UHYGWY674B/2160/index.m3u8
 #EXTINF:-1 tvg-chno="116" tvg-id="ReelzChannel.HD.us2" tvg-name="Reelz Channel" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s68385_dark_360w_270h.png" group-title="TV",Reelz Channel
-http://tv14s.xyz:8080/A1Jay5/362586/10526
+http://cord-cutter.net:8080/30550113/30550113/10526
 #EXTINF:-1 tvg-chno="117" tvg-id="ROOT.Sports.Northwest.HD.us2" tvg-name="Root Sports" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11062_dark_360w_270h.png" group-title="TV",Root Sports
-http://tv14s.xyz:8080/A1Jay5/362586/85232
+http://cord-cutter.net:8080/30550113/30550113/85232
 #EXTINF:-1 tvg-chno="118" tvg-id="SEC.Network.HD.us2" tvg-name="SEC Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s89535_dark_360w_270h.png" group-title="TV",SEC Network
-http://tv14s.xyz:8080/A1Jay5/362586/17608
+http://cord-cutter.net:8080/30550113/30550113/17608
 #EXTINF:-1 tvg-chno="119" tvg-id="Paramount+.with.Showtime.HD.us2" tvg-name="Showtime" tvg-logo="https://raw.githubusercontent.com/tv-logo/tv-logos/refs/heads/main/countries/united-states/showtime-us.png" group-title="TV",Showtime
 http://fl1.moveonjoy.com/SHOWTIME/index.m3u8
@ -364,37 +364,37 @@ http://fl1.moveonjoy.com/SMITHSONIAN_CHANNEL/index.m3u8
http://fl1.moveonjoy.com/Sony_Movie_Channel/index.m3u8 http://fl1.moveonjoy.com/Sony_Movie_Channel/index.m3u8
#EXTINF:-1 tvg-chno="122" tvg-id="Space.City.Home.Network.HD.us2" tvg-name="Space City Home Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s77744_dark_360w_270h.png" group-title="TV",Space City Home Network #EXTINF:-1 tvg-chno="122" tvg-id="Space.City.Home.Network.HD.us2" tvg-name="Space City Home Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s77744_dark_360w_270h.png" group-title="TV",Space City Home Network
http://tv14s.xyz:8080/A1Jay5/362586/213668 http://cord-cutter.net:8080/30550113/30550113/213668
#EXTINF:-1 tvg-chno="123" tvg-id="Spectrum.SportsNet.LA.Dodgers.HD.us2" tvg-name="Spectrum SportsNet LA Dodgers" tvg-logo="https://i.gyazo.com/765cce528ddda366695bb178d9dee6da.png" group-title="TV",Spectrum SportsNet LA Dodgers #EXTINF:-1 tvg-chno="123" tvg-id="Spectrum.SportsNet.LA.Dodgers.HD.us2" tvg-name="Spectrum SportsNet LA Dodgers" tvg-logo="https://i.gyazo.com/765cce528ddda366695bb178d9dee6da.png" group-title="TV",Spectrum SportsNet LA Dodgers
http://tv14s.xyz:8080/A1Jay5/362586/31636 http://cord-cutter.net:8080/30550113/30550113/31636
#EXTINF:-1 tvg-chno="124" tvg-id="Spectrum.SportsNet.Lakers.HD.us2" tvg-name="Spectrum SportsNet Lakers" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s77422_dark_360w_270h.png" group-title="TV",Spectrum SportsNet Lakers #EXTINF:-1 tvg-chno="124" tvg-id="Spectrum.SportsNet.Lakers.HD.us2" tvg-name="Spectrum SportsNet Lakers" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s77422_dark_360w_270h.png" group-title="TV",Spectrum SportsNet Lakers
http://tv14s.xyz:8080/A1Jay5/362586/20946 http://cord-cutter.net:8080/30550113/30550113/20946
#EXTINF:-1 tvg-chno="125" tvg-id="Sportsnet.360.HD.ca2" tvg-name="Sportsnet 360" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s49952_dark_360w_270h.png" group-title="TV",Sportsnet 360 #EXTINF:-1 tvg-chno="125" tvg-id="Sportsnet.360.HD.ca2" tvg-name="Sportsnet 360" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s49952_dark_360w_270h.png" group-title="TV",Sportsnet 360
http://tv14s.xyz:8080/A1Jay5/362586/57299 http://cord-cutter.net:8080/30550113/30550113/57299
#EXTINF:-1 tvg-chno="126" tvg-id="Sportsnet.East.ca2" tvg-name="Sportsnet East" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s62109_dark_360w_270h.png" group-title="TV",Sportsnet East #EXTINF:-1 tvg-chno="126" tvg-id="Sportsnet.East.ca2" tvg-name="Sportsnet East" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s62109_dark_360w_270h.png" group-title="TV",Sportsnet East
http://tv14s.xyz:8080/A1Jay5/362586/57298 http://cord-cutter.net:8080/30550113/30550113/57298
#EXTINF:-1 tvg-chno="127" tvg-id="SNY.SportsNet.New.York.HD.us2" tvg-name="SportsNet New York" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s50038_dark_360w_270h.png" group-title="TV",SportsNet New York #EXTINF:-1 tvg-chno="127" tvg-id="SNY.SportsNet.New.York.HD.us2" tvg-name="SportsNet New York" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s50038_dark_360w_270h.png" group-title="TV",SportsNet New York
http://tv14s.xyz:8080/A1Jay5/362586/20938 http://cord-cutter.net:8080/30550113/30550113/20938
#EXTINF:-1 tvg-chno="128" tvg-id="Sportsnet.One.ca2" tvg-name="Sportsnet One" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s68859_dark_360w_270h.png" group-title="TV",Sportsnet One #EXTINF:-1 tvg-chno="128" tvg-id="Sportsnet.One.ca2" tvg-name="Sportsnet One" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s68859_dark_360w_270h.png" group-title="TV",Sportsnet One
http://tv14s.xyz:8080/A1Jay5/362586/10247 http://cord-cutter.net:8080/30550113/30550113/10247
#EXTINF:-1 tvg-chno="129" tvg-id="Sportsnet.Ontario.HD.ca2" tvg-name="Sportsnet Ontario" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s62111_dark_360w_270h.png" group-title="TV",Sportsnet Ontario #EXTINF:-1 tvg-chno="129" tvg-id="Sportsnet.Ontario.HD.ca2" tvg-name="Sportsnet Ontario" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s62111_dark_360w_270h.png" group-title="TV",Sportsnet Ontario
http://tv14s.xyz:8080/A1Jay5/362586/11649 http://cord-cutter.net:8080/30550113/30550113/11649
#EXTINF:-1 tvg-chno="130" tvg-id="SportsNet.Pittsburgh.HD.us2" tvg-name="SportsNet Pittsburgh" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s26028_dark_360w_270h.png" group-title="TV",SportsNet Pittsburgh #EXTINF:-1 tvg-chno="130" tvg-id="SportsNet.Pittsburgh.HD.us2" tvg-name="SportsNet Pittsburgh" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s26028_dark_360w_270h.png" group-title="TV",SportsNet Pittsburgh
http://tv14s.xyz:8080/A1Jay5/362586/108178 http://cord-cutter.net:8080/30550113/30550113/108178
#EXTINF:-1 tvg-chno="131" tvg-id="Starz.HD.us2" tvg-name="Starz" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s12719_dark_360w_270h.png" group-title="TV",Starz #EXTINF:-1 tvg-chno="131" tvg-id="Starz.HD.us2" tvg-name="Starz" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s12719_dark_360w_270h.png" group-title="TV",Starz
http://tv14s.xyz:8080/A1Jay5/362586/9299 http://cord-cutter.net:8080/30550113/30550113/9299
#EXTINF:-1 tvg-chno="132" tvg-id="Syfy.HD.us2" tvg-name="Syfy" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11097_dark_360w_270h.png" group-title="TV",Syfy #EXTINF:-1 tvg-chno="132" tvg-id="Syfy.HD.us2" tvg-name="Syfy" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11097_dark_360w_270h.png" group-title="TV",Syfy
http://tv14s.xyz:8080/A1Jay5/362586/46685 http://cord-cutter.net:8080/30550113/30550113/46685
#EXTINF:-1 tvg-chno="133" tvg-id="TBS.HD.us2" tvg-name="TBS" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11867_dark_360w_270h.png" group-title="TV",TBS #EXTINF:-1 tvg-chno="133" tvg-id="TBS.HD.us2" tvg-name="TBS" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11867_dark_360w_270h.png" group-title="TV",TBS
https://turnerlive.warnermediacdn.com/hls/live/2023172/tbseast/slate/VIDEO_0_3564000.m3u8 https://turnerlive.warnermediacdn.com/hls/live/2023172/tbseast/slate/VIDEO_0_3564000.m3u8
@ -403,10 +403,10 @@ https://turnerlive.warnermediacdn.com/hls/live/2023172/tbseast/slate/VIDEO_0_356
https://fl1.moveonjoy.com/TENNIS_CHANNEL/index.m3u8 https://fl1.moveonjoy.com/TENNIS_CHANNEL/index.m3u8
#EXTINF:-1 tvg-chno="135" tvg-id="The.Weather.Channel.HD.us2" tvg-name="The Weather Channel" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s58812_dark_360w_270h.png" group-title="TV",The Weather Channel #EXTINF:-1 tvg-chno="135" tvg-id="The.Weather.Channel.HD.us2" tvg-name="The Weather Channel" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s58812_dark_360w_270h.png" group-title="TV",The Weather Channel
http://tv14s.xyz:8080/A1Jay5/362586/18926 http://cord-cutter.net:8080/30550113/30550113/18926
#EXTINF:-1 tvg-chno="136" tvg-id="TLC.HD.(US).us2" tvg-name="TLC" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11158_dark_360w_270h.png" group-title="TV",TLC #EXTINF:-1 tvg-chno="136" tvg-id="TLC.HD.(US).us2" tvg-name="TLC" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11158_dark_360w_270h.png" group-title="TV",TLC
http://tv14s.xyz:8080/A1Jay5/362586/12734 http://cord-cutter.net:8080/30550113/30550113/12734
#EXTINF:-1 tvg-chno="137" tvg-id="TNT.HD.us2" tvg-name="TNT" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11164_dark_360w_270h.png" group-title="TV",TNT #EXTINF:-1 tvg-chno="137" tvg-id="TNT.HD.us2" tvg-name="TNT" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11164_dark_360w_270h.png" group-title="TV",TNT
https://turnerlive.warnermediacdn.com/hls/live/2023168/tnteast/slate/VIDEO_0_3564000.m3u8 https://turnerlive.warnermediacdn.com/hls/live/2023168/tnteast/slate/VIDEO_0_3564000.m3u8
@ -415,10 +415,10 @@ https://turnerlive.warnermediacdn.com/hls/live/2023168/tnteast/slate/VIDEO_0_356
https://turnerlive.warnermediacdn.com/hls/live/2023176/trueast/slate/VIDEO_0_3564000.m3u8 https://turnerlive.warnermediacdn.com/hls/live/2023176/trueast/slate/VIDEO_0_3564000.m3u8
#EXTINF:-1 tvg-chno="139" tvg-id="TSN.1.ca2" tvg-name="TSN1" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11182_dark_360w_270h.png" group-title="TV",TSN1 #EXTINF:-1 tvg-chno="139" tvg-id="TSN.1.ca2" tvg-name="TSN1" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11182_dark_360w_270h.png" group-title="TV",TSN1
http://tv14s.xyz:8080/A1Jay5/362586/57292 http://cord-cutter.net:8080/30550113/30550113/57292
#EXTINF:-1 tvg-chno="140" tvg-id="TSN.2.ca2" tvg-name="TSN2" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s61474_dark_360w_270h.png" group-title="TV",TSN2 #EXTINF:-1 tvg-chno="140" tvg-id="TSN.2.ca2" tvg-name="TSN2" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s61474_dark_360w_270h.png" group-title="TV",TSN2
http://tv14s.xyz:8080/A1Jay5/362586/47442 http://cord-cutter.net:8080/30550113/30550113/47442
#EXTINF:-1 tvg-chno="141" tvg-id="Turner.Classic.Movies.HD.us2" tvg-name="Turner Classic Movies" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s12852_dark_360w_270h.png" group-title="TV",Turner Classic Movies #EXTINF:-1 tvg-chno="141" tvg-id="Turner.Classic.Movies.HD.us2" tvg-name="Turner Classic Movies" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s12852_dark_360w_270h.png" group-title="TV",Turner Classic Movies
https://turnerlive.warnermediacdn.com/hls/live/2023186/tcmeast/noslate/VIDEO_1_5128000.m3u8 https://turnerlive.warnermediacdn.com/hls/live/2023186/tcmeast/noslate/VIDEO_1_5128000.m3u8
@ -430,13 +430,13 @@ https://fl1.moveonjoy.com/TV_LAND/index.m3u8
https://fl1.moveonjoy.com/TV_ONE/index.m3u8 https://fl1.moveonjoy.com/TV_ONE/index.m3u8
#EXTINF:-1 tvg-chno="144" tvg-id="USA.Network.HD.us2" tvg-name="USA East" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11207_dark_360w_270h.png" group-title="TV",USA East #EXTINF:-1 tvg-chno="144" tvg-id="USA.Network.HD.us2" tvg-name="USA East" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s11207_dark_360w_270h.png" group-title="TV",USA East
http://tv14s.xyz:8080/A1Jay5/362586/10252 http://cord-cutter.net:8080/30550113/30550113/10252
#EXTINF:-1 tvg-chno="145" tvg-id="Vice.HD.us2" tvg-name="Vice TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s18822_dark_360w_270h.png" group-title="TV",Vice TV #EXTINF:-1 tvg-chno="145" tvg-id="Vice.HD.us2" tvg-name="Vice TV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s18822_dark_360w_270h.png" group-title="TV",Vice TV
http://tv14s.xyz:8080/A1Jay5/362586/46697 http://cord-cutter.net:8080/30550113/30550113/46697
#EXTINF:-1 tvg-chno="146" tvg-id="Willow.Cricket.HD.us2" tvg-name="Willow Cricket" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s68605_dark_360w_270h.png" group-title="TV",Willow Cricket #EXTINF:-1 tvg-chno="146" tvg-id="Willow.Cricket.HD.us2" tvg-name="Willow Cricket" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s68605_dark_360w_270h.png" group-title="TV",Willow Cricket
http://tv14s.xyz:8080/A1Jay5/362586/41979 http://cord-cutter.net:8080/30550113/30550113/41979
#EXTINF:-1 tvg-chno="147" tvg-id="Yes.Network.us2" tvg-name="YES Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s30017_dark_360w_270h.png" group-title="TV",YES Network #EXTINF:-1 tvg-chno="147" tvg-id="Yes.Network.us2" tvg-name="YES Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s30017_dark_360w_270h.png" group-title="TV",YES Network
https://fl1.moveonjoy.com/YES_NETWORK/index.m3u8 https://fl1.moveonjoy.com/YES_NETWORK/index.m3u8

File diff suppressed because it is too large

main.py

@@ -5,6 +5,7 @@ from pathlib import Path
from scrapers import (
    fawa,
+    istreameast,
    lotus,
    pixel,
    ppv,
@@ -47,27 +48,29 @@ async def main() -> None:
    base_m3u8, tvg_chno = load_base()

    tasks = [
-        asyncio.create_task(fawa.scrape(network.client)),
-        asyncio.create_task(lotus.scrape(network.client)),
+        asyncio.create_task(fawa.scrape()),
+        asyncio.create_task(istreameast.scrape()),
+        asyncio.create_task(lotus.scrape()),
        asyncio.create_task(pixel.scrape()),
-        asyncio.create_task(ppv.scrape(network.client)),
-        asyncio.create_task(roxie.scrape(network.client)),
-        asyncio.create_task(shark.scrape(network.client)),
-        asyncio.create_task(sport9.scrape(network.client)),
-        asyncio.create_task(streamcenter.scrape(network.client)),
-        asyncio.create_task(streamfree.scrape(network.client)),
-        asyncio.create_task(streamhub.scrape(network.client)),
-        asyncio.create_task(streamsgate.scrape(network.client)),
-        asyncio.create_task(strmd.scrape(network.client)),
-        asyncio.create_task(tvpass.scrape(network.client)),
-        asyncio.create_task(watchfooty.scrape(network.client)),
-        asyncio.create_task(webcast.scrape(network.client)),
+        asyncio.create_task(ppv.scrape()),
+        asyncio.create_task(roxie.scrape()),
+        asyncio.create_task(shark.scrape()),
+        asyncio.create_task(sport9.scrape()),
+        asyncio.create_task(streamcenter.scrape()),
+        asyncio.create_task(streamfree.scrape()),
+        asyncio.create_task(streamhub.scrape()),
+        asyncio.create_task(streamsgate.scrape()),
+        asyncio.create_task(strmd.scrape()),
+        asyncio.create_task(tvpass.scrape()),
+        asyncio.create_task(watchfooty.scrape()),
+        asyncio.create_task(webcast.scrape()),
    ]

    await asyncio.gather(*tasks)

    additions = (
        fawa.urls
+        | istreameast.urls
        | lotus.urls
        | pixel.urls
        | ppv.urls
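The change that runs through every scraper below is already visible in this runner: scrape() no longer receives an httpx.AsyncClient, because fetching moved behind a module-level network.request(...) helper. The helper itself is not part of this changeset; judging from its call sites (await network.request(url, log=log, params=...), with the result tested for truthiness), it presumably wraps the shared client the old code passed around as network.client and returns None instead of raising. A minimal sketch under those assumptions:

# Hypothetical sketch of the network.request helper, inferred from its call
# sites in this diff; the real implementation is not shown in this changeset.
import httpx

client = httpx.AsyncClient(timeout=15, follow_redirects=True)


async def request(url: str, log, **kwargs) -> httpx.Response | None:
    """Fetch url with the shared client; return the response, or None on failure."""
    try:
        r = await client.get(url, **kwargs)
        r.raise_for_status()
    except Exception as e:
        # Callers treat a falsy result as "failed to load" and skip the event.
        log.error(f'Failed to fetch "{url}": {e}')
        return None

    return r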

scrapers/fawa.py

@@ -2,7 +2,6 @@ import re
from functools import partial
from urllib.parse import quote, urljoin

-import httpx
from selectolax.parser import HTMLParser

from .utils import Cache, Time, get_logger, leagues, network
@@ -18,17 +17,10 @@ CACHE_FILE = Cache(f"{TAG.lower()}.json", exp=10_800)
BASE_URL = "http://www.fawanews.sc/"

-async def process_event(
-    client: httpx.AsyncClient,
-    url: str,
-    url_num: int,
-) -> str | None:
-    try:
-        r = await client.get(url)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'URL {url_num}) Failed to fetch "{url}": {e}')
+async def process_event(url: str, url_num: int) -> str | None:
+    if not (html_data := await network.request(url, log=log)):
+        log.info(f"URL {url_num}) Failed to load url.")
        return

    valid_m3u8 = re.compile(
@@ -36,33 +28,27 @@ async def process_event(
        re.IGNORECASE,
    )

-    if not (match := valid_m3u8.search(r.text)):
+    if not (match := valid_m3u8.search(html_data.text)):
        log.info(f"URL {url_num}) No M3U8 found")
        return

    log.info(f"URL {url_num}) Captured M3U8")

    return match[2]

-async def get_events(
-    client: httpx.AsyncClient,
-    cached_hrefs: set[str],
-) -> list[dict[str, str]]:
-    try:
-        r = await client.get(BASE_URL)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{BASE_URL}": {e}')
-        return []
+async def get_events(cached_hrefs: set[str]) -> list[dict[str, str]]:
+    events = []
+
+    if not (html_data := await network.request(BASE_URL, log=log)):
+        return events

-    soup = HTMLParser(r.content)
+    soup = HTMLParser(html_data.content)

    valid_event = re.compile(r"\d{1,2}:\d{1,2}")
    clean_event = re.compile(r"\s+-+\s+\w{1,4}")

-    events = []
-
    for item in soup.css(".user-item"):
        text = item.css_first(".user-item__name")
        subtext = item.css_first(".user-item__playing")
@@ -98,17 +84,20 @@ async def get_events(
    return events

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
    cached_urls = CACHE_FILE.load()
    cached_hrefs = {entry["href"] for entry in cached_urls.values()}
    cached_count = len(cached_urls)
    urls.update(cached_urls)

    log.info(f"Loaded {cached_count} event(s) from cache")
    log.info(f'Scraping from "{BASE_URL}"')

-    events = await get_events(client, cached_hrefs)
+    events = await get_events(cached_hrefs)

    log.info(f"Processing {len(events)} new URL(s)")
@@ -118,7 +107,6 @@ async def scrape(client: httpx.AsyncClient) -> None:
    for i, ev in enumerate(events, start=1):
        handler = partial(
            process_event,
-            client=client,
            url=ev["link"],
            url_num=i,
        )
@@ -155,6 +143,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
    if new_count := len(cached_urls) - cached_count:
        log.info(f"Collected and cached {new_count} new event(s)")
    else:
        log.info("No new events found")

scrapers/istreameast.py

@@ -0,0 +1,152 @@
+import base64
+import re
+
+from selectolax.parser import HTMLParser
+
+from .utils import Cache, Time, get_logger, leagues, network
+
+log = get_logger(__name__)
+
+urls: dict[str, dict[str, str | float]] = {}
+
+TAG = "ISTRMEST"
+
+CACHE_FILE = Cache(f"{TAG.lower()}.json", exp=3_600)
+
+BASE_URL = "https://istreameast.app"
+
+
+async def process_event(url: str, url_num: int) -> str | None:
+    pattern = re.compile(r"source:\s*window\.atob\(\s*'([^']+)'\s*\)", re.IGNORECASE)
+
+    if not (event_data := await network.request(url, log=log)):
+        log.info(f"URL {url_num}) Failed to load url.")
+        return
+
+    soup = HTMLParser(event_data.content)
+
+    if not (iframe := soup.css_first("iframe#wp_player")):
+        log.warning(f"URL {url_num}) No iframe element found.")
+        return
+
+    if not (iframe_src := iframe.attributes.get("src")):
+        log.warning(f"URL {url_num}) No iframe source found.")
+        return
+
+    if not (iframe_src_data := await network.request(iframe_src, log=log)):
+        log.info(f"URL {url_num}) Failed to load iframe source.")
+        return
+
+    if not (match := pattern.search(iframe_src_data.text)):
+        log.warning(f"URL {url_num}) No Clappr source found.")
+        return
+
+    log.info(f"URL {url_num}) Captured M3U8")
+
+    return base64.b64decode(match[1]).decode("utf-8")
+
+
+async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
+    events = []
+
+    if not (html_data := await network.request(BASE_URL, log=log)):
+        return events
+
+    pattern = re.compile(r"^(?:LIVE|\d+\s+(minutes?)\b)", re.IGNORECASE)
+
+    soup = HTMLParser(html_data.content)
+
+    for link in soup.css("li.f1-podium--item > a.f1-podium--link"):
+        li_item = link.parent
+
+        if not (rank_elem := li_item.css_first(".f1-podium--rank")):
+            continue
+
+        sport = rank_elem.text(strip=True)
+
+        if not (driver_elem := li_item.css_first(".f1-podium--driver")):
+            continue
+
+        event_name = driver_elem.text(strip=True)
+
+        if inner_span := driver_elem.css_first("span.d-md-inline"):
+            event_name = inner_span.text(strip=True)
+
+        if f"[{sport}] {event_name} ({TAG})" in cached_keys:
+            continue
+
+        if not (href := link.attributes.get("href")):
+            continue
+
+        if not (time_elem := li_item.css_first(".SaatZamanBilgisi")):
+            continue
+
+        time_text = time_elem.text(strip=True)
+
+        if not pattern.search(time_text):
+            continue
+
+        events.append(
+            {
+                "sport": sport,
+                "event": event_name,
+                "link": href,
+            }
+        )
+
+    return events
+
+
+async def scrape() -> None:
+    cached_urls = CACHE_FILE.load()
+    cached_count = len(cached_urls)
+    urls.update(cached_urls)
+
+    log.info(f"Loaded {cached_count} event(s) from cache")
+    log.info(f'Scraping from "{BASE_URL}"')
+
+    events = await get_events(cached_urls.keys())
+
+    log.info(f"Processing {len(events)} new URL(s)")
+
+    if events:
+        now = Time.clean(Time.now()).timestamp()
+
+        for i, ev in enumerate(events, start=1):
+            if url := await process_event(ev["link"], i):
+                sport, event, link = (
+                    ev["sport"],
+                    ev["event"],
+                    ev["link"],
+                )
+
+                key = f"[{sport}] {event} ({TAG})"
+                tvg_id, logo = leagues.get_tvg_info(sport, event)
+
+                entry = {
+                    "url": url,
+                    "logo": logo,
+                    "base": "https://gooz.aapmains.net",
+                    "timestamp": now,
+                    "id": tvg_id or "Live.Event.us",
+                    "link": link,
+                }
+
+                urls[key] = cached_urls[key] = entry
+
+    if new_count := len(cached_urls) - cached_count:
+        log.info(f"Collected and cached {new_count} new event(s)")
+    else:
+        log.info("No new events found")
+
+    CACHE_FILE.write(cached_urls)
View file

@@ -1,6 +1,5 @@
from functools import partial

-import httpx
from playwright.async_api import async_playwright

from .utils import Cache, Time, get_logger, leagues, network
@@ -22,40 +21,16 @@ def fix_league(s: str) -> str:
    return " ".join(x.capitalize() for x in s.split()) if len(s) > 5 else s.upper()

-async def refresh_api_cache(
-    client: httpx.AsyncClient,
-    url: str,
-    now_ts: float,
-) -> dict[str, dict[str, str]]:
-    log.info("Refreshing API cache")
-
-    try:
-        r = await client.get(url)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{url}": {e}')
-        return {}
-
-    if not (data := r.json()):
-        return {}
-
-    data["timestamp"] = now_ts
-
-    return data
-
-async def get_events(
-    client: httpx.AsyncClient, cached_keys: set[str]
-) -> list[dict[str, str]]:
-    now = Time.now()
+async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
+    now = Time.clean(Time.now())

    if not (api_data := API_CACHE.load(per_entry=False)):
-        api_data = await refresh_api_cache(
-            client,
-            BASE_URL,
-            now.timestamp(),
-        )
+        api_data = {}
+
+        if r := await network.request(BASE_URL, log=log):
+            api_data: dict = r.json()
+            api_data["timestamp"] = now.timestamp()

        API_CACHE.write(api_data)
@@ -68,9 +43,14 @@ async def get_events(
            continue

        for event in info["items"]:
-            event_league = event["league"]
+            if (event_league := event["league"]) == "channel tv":
+                continue

-            if event_league == "channel tv":
+            sport = fix_league(event_league)
+            event_name = event["title"]
+
+            if f"[{sport}] {event_name} ({TAG})" in cached_keys:
                continue

            event_streams: list[dict[str, str]] = event["streams"]
@@ -78,41 +58,34 @@ async def get_events(
            if not (event_link := event_streams[0].get("link")):
                continue

-            sport = fix_league(event_league)
-            event_name = event["title"]
-
-            key = f"[{sport}] {event_name} ({TAG})"
-
-            if cached_keys & {key}:
-                continue
-
            events.append(
                {
                    "sport": sport,
                    "event": event_name,
                    "link": event_link,
+                    "timestamp": now.timestamp(),
                }
            )

    return events

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
    cached_urls = CACHE_FILE.load()
    cached_count = len(cached_urls)
    urls.update(cached_urls)

    log.info(f"Loaded {cached_count} event(s) from cache")
    log.info(f'Scraping from "{BASE_URL}"')

-    events = await get_events(client, set(cached_urls.keys()))
+    events = await get_events(cached_urls.keys())

    log.info(f"Processing {len(events)} new URL(s)")

    if events:
-        now = Time.clean(Time.now()).timestamp()
-
        async with async_playwright() as p:
            browser, context = await network.browser(p)
@@ -132,10 +105,11 @@ async def scrape(client: httpx.AsyncClient) -> None:
                )

                if url:
-                    sport, event, link = (
+                    sport, event, link, ts = (
                        ev["sport"],
                        ev["event"],
                        ev["link"],
+                        ev["timestamp"],
                    )

                    tvg_id, logo = leagues.get_tvg_info(sport, event)
@@ -146,7 +120,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
                        "url": url,
                        "logo": logo,
                        "base": "https://vividmosaica.com/",
-                        "timestamp": now,
+                        "timestamp": ts,
                        "id": tvg_id or "Live.Event.us",
                        "link": link,
                    }
@@ -157,6 +131,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
    if new_count := len(cached_urls) - cached_count:
        log.info(f"Collected and cached {new_count} new event(s)")
    else:
        log.info("No new events found")

scrapers/pixel.py

@@ -55,7 +55,9 @@ async def get_events() -> dict[str, dict[str, str | float]]:
            continue

        event_name = event["match_name"]

        channel_info: dict[str, str] = event["channel"]
        category: dict[str, str] = channel_info["TVCategory"]
        sport = category["name"]
@@ -82,7 +84,9 @@ async def get_events() -> dict[str, dict[str, str | float]]:
async def scrape() -> None:
    if cached := CACHE_FILE.load():
        urls.update(cached)

        log.info(f"Loaded {len(urls)} event(s) from cache")

        return

    log.info(f'Scraping from "{BASE_URL}"')

scrapers/ppv.py

@@ -1,6 +1,5 @@
from functools import partial

-import httpx
from playwright.async_api import async_playwright

from .utils import Cache, Time, get_logger, leagues, network
@@ -28,36 +27,19 @@ BASE_MIRRORS = [
]

-async def refresh_api_cache(
-    client: httpx.AsyncClient,
-    url: str,
-) -> dict[str, dict[str, str]]:
-    log.info("Refreshing API cache")
-
-    try:
-        r = await client.get(url)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{url}": {e}')
-        return {}
-
-    return r.json()
-
-async def get_events(
-    client: httpx.AsyncClient,
-    api_url: str,
-    cached_keys: set[str],
-) -> list[dict[str, str]]:
+async def get_events(api_url: str, cached_keys: list[str]) -> list[dict[str, str]]:
+    events = []
+
    if not (api_data := API_FILE.load(per_entry=False)):
-        api_data = await refresh_api_cache(client, api_url)
+        api_data = {}
+
+        if r := await network.request(api_url, log=log):
+            api_data: dict = r.json()

        API_FILE.write(api_data)

-    events = []
-
    now = Time.clean(Time.now())
    start_dt = now.delta(minutes=-30)
    end_dt = now.delta(minutes=30)
@@ -69,16 +51,17 @@ async def get_events(
        for event in stream_group.get("streams", []):
            name = event.get("name")
            start_ts = event.get("starts_at")
            logo = event.get("poster")
            iframe = event.get("iframe")

            if not (name and start_ts and iframe):
                continue

-            key = f"[{sport}] {name} ({TAG})"
-
-            if cached_keys & {key}:
+            if f"[{sport}] {name} ({TAG})" in cached_keys:
                continue

            event_dt = Time.from_ts(start_ts)
@@ -99,9 +82,11 @@ async def get_events(
    return events

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
    cached_urls = CACHE_FILE.load()
    cached_count = len(cached_urls)
    urls.update(cached_urls)

    log.info(f"Loaded {cached_count} event(s) from cache")
@@ -117,11 +102,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
    log.info(f'Scraping from "{base_url}"')

-    events = await get_events(
-        client,
-        api_url,
-        set(cached_urls.keys()),
-    )
+    events = await get_events(api_url, cached_urls.keys())

    log.info(f"Processing {len(events)} new URL(s)")
@@ -173,6 +154,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
    if new_count := len(cached_urls) - cached_count:
        log.info(f"Collected and cached {new_count} new event(s)")
    else:
        log.info("No new events found")

scrapers/roxie.py

@@ -3,7 +3,6 @@ import re
from functools import partial
from urllib.parse import urljoin

-import httpx
from selectolax.parser import HTMLParser

from .utils import Cache, Time, get_logger, leagues, network
@@ -31,17 +30,8 @@ SPORT_ENDPOINTS = {
}

-async def process_event(
-    client: httpx.AsyncClient,
-    url: str,
-    url_num: int,
-) -> str | None:
-    try:
-        r = await client.get(url)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'URL {url_num}) Failed to fetch "{url}": {e}')
+async def process_event(url: str, url_num: int) -> str | None:
+    if not (html_data := await network.request(url, log=log)):
        return

    valid_m3u8 = re.compile(
@@ -49,39 +39,29 @@ async def process_event(
        re.IGNORECASE,
    )

-    if not (match := valid_m3u8.search(r.text)):
+    if not (match := valid_m3u8.search(html_data.text)):
        log.info(f"URL {url_num}) No M3U8 found")
        return

    log.info(f"URL {url_num}) Captured M3U8")

    return match[1]

-async def get_html_data(client: httpx.AsyncClient, url: str) -> bytes:
-    try:
-        r = await client.get(url)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{url}": {e}')
-        return b""
-
-    return r.content
-
async def refresh_html_cache(
-    client: httpx.AsyncClient,
    url: str,
    sport: str,
    now_ts: float,
) -> dict[str, dict[str, str | float]]:
-    html_data = await get_html_data(client, url)
-    soup = HTMLParser(html_data)
-
    events = {}

+    if not (html_data := await network.request(url, log=log)):
+        return events
+
+    soup = HTMLParser(html_data.content)
+
    for row in soup.css("table#eventsTable tbody tr"):
        if not (a_tag := row.css_first("td a")):
            continue
@@ -113,9 +93,7 @@ async def refresh_html_cache(
    return events

-async def get_events(
-    client: httpx.AsyncClient, cached_keys: set[str]
-) -> list[dict[str, str]]:
+async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
    now = Time.clean(Time.now())

    if not (events := HTML_CACHE.load()):
@@ -125,7 +103,6 @@ async def get_events(
        tasks = [
            refresh_html_cache(
-                client,
                url,
                sport,
                now.timestamp(),
@@ -145,7 +122,7 @@ async def get_events(
    end_ts = now.delta(minutes=30).timestamp()

    for k, v in events.items():
-        if cached_keys & {k}:
+        if k in cached_keys:
            continue

        if not start_ts <= v["event_ts"] <= end_ts:
@@ -156,16 +133,18 @@ async def get_events(
    return live

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
    cached_urls = CACHE_FILE.load()
    cached_count = len(cached_urls)
    urls.update(cached_urls)

    log.info(f"Loaded {cached_count} event(s) from cache")
    log.info(f'Scraping from "{BASE_URL}"')

-    events = await get_events(client, set(cached_urls.keys()))
+    events = await get_events(cached_urls.keys())

    log.info(f"Processing {len(events)} new URL(s)")
@@ -173,7 +152,6 @@ async def scrape(client: httpx.AsyncClient) -> None:
    for i, ev in enumerate(events, start=1):
        handler = partial(
            process_event,
-            client=client,
            url=ev["link"],
            url_num=i,
        )
@@ -209,6 +187,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
    if new_count := len(cached_urls) - cached_count:
        log.info(f"Collected and cached {new_count} new event(s)")
    else:
        log.info("No new events found")

scrapers/shark.py

@@ -1,7 +1,6 @@
import re
from functools import partial

-import httpx
from selectolax.parser import HTMLParser

from .utils import Cache, Time, get_logger, leagues, network
@@ -19,52 +18,39 @@ HTML_CACHE = Cache(f"{TAG.lower()}-html.json", exp=19_800)
BASE_URL = "https://sharkstreams.net"

-async def process_event(
-    client: httpx.AsyncClient,
-    url: str,
-    url_num: int,
-) -> str | None:
-    try:
-        r = await client.get(url)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'URL {url_num}) Failed to fetch "{url}": {e}')
+async def process_event(url: str, url_num: int) -> str | None:
+    if not (r := await network.request(url, log=log)):
+        log.info(f"URL {url_num}) Failed to load url.")
        return

    data: dict[str, list[str]] = r.json()

-    if not data.get("urls"):
+    if not (urls := data.get("urls")):
        log.info(f"URL {url_num}) No M3U8 found")
        return

    log.info(f"URL {url_num}) Captured M3U8")

-    return data["urls"][0]
+    return urls[0]

-async def refresh_html_cache(
-    client: httpx.AsyncClient, now_ts: float
-) -> dict[str, dict[str, str | float]]:
+async def refresh_html_cache(now_ts: float) -> dict[str, dict[str, str | float]]:
    log.info("Refreshing HTML cache")

-    try:
-        r = await client.get(BASE_URL)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{BASE_URL}": {e}')
-        return {}
-
-    pattern = re.compile(r"openEmbed\('([^']+)'\)", re.IGNORECASE)
-    soup = HTMLParser(r.content)
-
    events = {}

+    if not (html_data := await network.request(BASE_URL, log=log)):
+        return events
+
+    pattern = re.compile(r"openEmbed\('([^']+)'\)", re.IGNORECASE)
+    soup = HTMLParser(html_data.content)
+
    for row in soup.css(".row"):
        date_node = row.css_first(".ch-date")
        sport_node = row.css_first(".ch-category")
        name_node = row.css_first(".ch-name")
@@ -72,7 +58,9 @@ async def refresh_html_cache(
            continue

        event_dt = Time.from_str(date_node.text(strip=True), timezone="EST")

        sport = sport_node.text(strip=True)
        event_name = name_node.text(strip=True)

        embed_btn = row.css_first("a.hd-link.secondary")
@@ -98,14 +86,11 @@ async def refresh_html_cache(
    return events

-async def get_events(
-    client: httpx.AsyncClient,
-    cached_keys: set[str],
-) -> list[dict[str, str]]:
+async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
    now = Time.clean(Time.now())

    if not (events := HTML_CACHE.load()):
-        events = await refresh_html_cache(client, now.timestamp())
+        events = await refresh_html_cache(now.timestamp())

        HTML_CACHE.write(events)
@@ -115,7 +100,7 @@ async def get_events(
    end_ts = now.delta(minutes=10).timestamp()

    for k, v in events.items():
-        if cached_keys & {k}:
+        if k in cached_keys:
            continue

        if not start_ts <= v["event_ts"] <= end_ts:
@@ -126,16 +111,18 @@ async def get_events(
    return live

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
    cached_urls = CACHE_FILE.load()
    cached_count = len(cached_urls)
    urls.update(cached_urls)

    log.info(f"Loaded {cached_count} event(s) from cache")
    log.info(f'Scraping from "{BASE_URL}"')

-    events = await get_events(client, set(cached_urls.keys()))
+    events = await get_events(cached_urls.keys())

    log.info(f"Processing {len(events)} new URL(s)")
@@ -143,7 +130,6 @@ async def scrape(client: httpx.AsyncClient) -> None:
    for i, ev in enumerate(events, start=1):
        handler = partial(
            process_event,
-            client=client,
            url=ev["link"],
            url_num=i,
        )
@@ -179,6 +165,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
    if new_count := len(cached_urls) - cached_count:
        log.info(f"Collected and cached {new_count} new event(s)")
    else:
        log.info("No new events found")

scrapers/sport9.py

@@ -2,7 +2,6 @@ import asyncio
from functools import partial
from urllib.parse import urljoin

-import httpx
from playwright.async_api import async_playwright
from selectolax.parser import HTMLParser
@@ -16,34 +15,18 @@ TAG = "SPORT9"
CACHE_FILE = Cache(f"{TAG.lower()}.json", exp=3_600)

-BASE_URL = "https://sport9.ru"
+BASE_URL = "https://sport9.ru/"

-async def get_html_data(
-    client: httpx.AsyncClient,
-    url: str,
-    date: str,
-) -> bytes:
-    try:
-        r = await client.get(url, params={"date": date})
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{r.url}": {e}')
-        return b""
-
-    return r.content
-
-async def get_events(
-    client: httpx.AsyncClient,
-    cached_keys: set[str],
-) -> list[dict[str, str]]:
+async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
    now = Time.now()

    tasks = [
-        get_html_data(client, BASE_URL, str(d.date()))
+        network.request(
+            BASE_URL,
+            log=log,
+            params={"date": d.date()},
+        )
        for d in [
            now.delta(days=-1),
            now,
@@ -53,10 +36,11 @@ async def get_events(
    results = await asyncio.gather(*tasks)

-    soups = [HTMLParser(html) for html in results]
-
    events = []

+    if not (soups := [HTMLParser(html.content) for html in results if html]):
+        return events
+
    for soup in soups:
        for card in soup.css("a.match-card"):
            live_badge = card.css_first(".live-badge")
@@ -68,7 +52,9 @@ async def get_events(
                continue

            sport = sport_node.text(strip=True)

            team_1_node = card.css_first(".team1 .team-name")
            team_2_node = card.css_first(".team2 .team-name")

            if team_1_node and team_2_node:
@@ -85,12 +71,10 @@ async def get_events(
            else:
                continue

-            if not (href := card.attributes.get("href")):
+            if f"[{sport}] {event} ({TAG})" in cached_keys:
                continue

-            key = f"[{sport}] {event} ({TAG})"
-
-            if cached_keys & {key}:
+            if not (href := card.attributes.get("href")):
                continue

            events.append(
@@ -104,16 +88,18 @@ async def get_events(
    return events

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
    cached_urls = CACHE_FILE.load()
    cached_count = len(cached_urls)
    urls.update(cached_urls)

    log.info(f"Loaded {cached_count} event(s) from cache")
    log.info(f'Scraping from "{BASE_URL}"')

-    events = await get_events(client, set(cached_urls.keys()))
+    events = await get_events(cached_urls.keys())

    log.info(f"Processing {len(events)} new URL(s)")
@@ -164,6 +150,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
    if new_count := len(cached_urls) - cached_count:
        log.info(f"Collected and cached {new_count} new event(s)")
    else:
        log.info("No new events found")

scrapers/streamcenter.py

@@ -1,6 +1,5 @@
from functools import partial

-import httpx
from playwright.async_api import async_playwright

from .utils import Cache, Time, get_logger, leagues, network
@@ -17,7 +16,6 @@ API_FILE = Cache(f"{TAG.lower()}-api.json", exp=28_800)
BASE_URL = "https://backendstreamcenter.youshop.pro:488/api/Parties"

CATEGORIES = {
    4: "Basketball",
    9: "Football",
@@ -33,35 +31,20 @@ CATEGORIES = {
}

-async def refresh_api_cache(
-    client: httpx.AsyncClient, now_ts: float
-) -> list[dict[str, str | int]]:
-    log.info("Refreshing API cache")
-
-    try:
-        r = await client.get(BASE_URL, params={"pageNumber": 1, "pageSize": 500})
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{r.url}": {e}')
-        return []
-
-    if not (data := r.json()):
-        return []
-
-    data[-1]["timestamp"] = now_ts
-
-    return data
-
-async def get_events(
-    client: httpx.AsyncClient,
-    cached_keys: set[str],
-) -> list[dict[str, str]]:
+async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
    now = Time.clean(Time.now())

    if not (api_data := API_FILE.load(per_entry=False, index=-1)):
-        api_data = await refresh_api_cache(client, now.timestamp())
+        api_data = []
+
+        if r := await network.request(
+            BASE_URL,
+            log=log,
+            params={"pageNumber": 1, "pageSize": 500},
+        ):
+            api_data: list[dict] = r.json()
+            api_data[-1]["timestamp"] = now.timestamp()

        API_FILE.write(api_data)
@@ -82,17 +65,15 @@ async def get_events(
        if not (name and category_id and iframe and event_time):
            continue

-        event_dt = Time.from_str(event_time, timezone="CET")
-
-        if not start_dt <= event_dt <= end_dt:
-            continue
-
        if not (sport := CATEGORIES.get(category_id)):
            continue

-        key = f"[{sport}] {name} ({TAG})"
-
-        if cached_keys & {key}:
+        if f"[{sport}] {name} ({TAG})" in cached_keys:
+            continue
+
+        event_dt = Time.from_str(event_time, timezone="CET")
+
+        if not start_dt <= event_dt <= end_dt:
            continue

        events.append(
@@ -107,16 +88,18 @@ async def get_events(
    return events

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
    cached_urls = CACHE_FILE.load()
    cached_count = len(cached_urls)
    urls.update(cached_urls)

    log.info(f"Loaded {cached_count} event(s) from cache")
    log.info('Scraping from "https://streamcenter.xyz"')

-    events = await get_events(client, set(cached_urls.keys()))
+    events = await get_events(cached_urls.keys())

    log.info(f"Processing {len(events)} new URL(s)")
@@ -166,6 +149,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
    if new_count := len(cached_urls) - cached_count:
        log.info(f"Collected and cached {new_count} new event(s)")
    else:
        log.info("No new events found")

scrapers/streamfree.py

@@ -1,7 +1,5 @@
from urllib.parse import urljoin

-import httpx
-
from .utils import Cache, Time, get_logger, leagues, network

log = get_logger(__name__)
@@ -15,24 +13,20 @@ CACHE_FILE = Cache(f"{TAG.lower()}.json", exp=19_800)
BASE_URL = "https://streamfree.to/"

-async def refresh_api_cache(client: httpx.AsyncClient) -> dict[str, dict[str, list]]:
-    try:
-        r = await client.get(urljoin(BASE_URL, "streams"))
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{r.url}": {e}')
-        return {}
-
-    return r.json()
-
-async def get_events(client: httpx.AsyncClient) -> dict[str, dict[str, str | float]]:
-    api_data = await refresh_api_cache(client)
-
+async def get_events() -> dict[str, dict[str, str | float]]:
    events = {}

-    now = Time.clean(Time.now()).timestamp()
+    if not (
+        r := await network.request(
+            urljoin(BASE_URL, "streams"),
+            log=log,
+        )
+    ):
+        return events
+
+    api_data: dict = r.json()
+
+    now = Time.clean(Time.now())

    for streams in api_data.get("streams", {}).values():
        if not streams:
@@ -66,24 +60,24 @@ async def get_events() -> dict[str, dict[str, str | float]]:
            ),
            "logo": logo or pic,
            "base": BASE_URL,
-            "timestamp": now,
+            "timestamp": now.timestamp(),
            "id": tvg_id or "Live.Event.us",
        }

    return events

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
    if cached := CACHE_FILE.load():
        urls.update(cached)

        log.info(f"Loaded {len(urls)} event(s) from cache")

        return

    log.info(f'Scraping from "{BASE_URL}"')

-    events = await get_events(client)
-
-    urls.update(events)
+    urls.update(await get_events())

    CACHE_FILE.write(urls)

scrapers/streamhub.py

@@ -2,7 +2,6 @@ import asyncio
from functools import partial
from urllib.parse import urljoin

-import httpx
from playwright.async_api import async_playwright
from selectolax.parser import HTMLParser
@@ -20,7 +19,6 @@ HTML_CACHE = Cache(f"{TAG.lower()}-html.json", exp=28_800)
BASE_URL = "https://streamhub.pro/"

CATEGORIES = {
    "Soccer": "sport_68c02a4464a38",
    "American Football": "sport_68c02a4465113",
@@ -36,40 +34,24 @@ CATEGORIES = {
}

-async def get_html_data(
-    client: httpx.AsyncClient,
-    date: str,
-    sport_id: str,
-) -> bytes:
-    try:
-        r = await client.get(
-            urljoin(BASE_URL, f"events/{date}"),
-            params={"sport_id": sport_id},
-        )
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{r.url}": {e}')
-        return b""
-
-    return r.content
-
async def refresh_html_cache(
-    client: httpx.AsyncClient,
    date: str,
    sport_id: str,
    ts: float,
) -> dict[str, dict[str, str | float]]:
-    html_data = await get_html_data(client, date, sport_id)
-    soup = HTMLParser(html_data)
-
    events = {}

+    if not (
+        html_data := await network.request(
+            urljoin(BASE_URL, f"events/{date}"),
+            log=log,
+            params={"sport_id": sport_id},
+        )
+    ):
+        return events
+
+    soup = HTMLParser(html_data.content)
+
    for section in soup.css(".events-section"):
        if not (sport_node := section.css_first(".section-titlte")):
            continue
@@ -111,25 +93,19 @@ async def refresh_html_cache(
    return events

-async def get_events(
-    client: httpx.AsyncClient,
-    cached_keys: set[str],
-) -> list[dict[str, str]]:
+async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
    now = Time.clean(Time.now())

    if not (events := HTML_CACHE.load()):
        log.info("Refreshing HTML cache")

-        dates = [now.date(), now.delta(days=1).date()]
-
        tasks = [
            refresh_html_cache(
-                client,
                date,
                sport_id,
                now.timestamp(),
            )
-            for date in dates
+            for date in [now.date(), now.delta(days=1).date()]
            for sport_id in CATEGORIES.values()
        ]
@@ -145,7 +121,7 @@ async def get_events(
    end_ts = now.delta(minutes=5).timestamp()

    for k, v in events.items():
-        if cached_keys & {k}:
+        if k in cached_keys:
            continue

        if not start_ts <= v["event_ts"] <= end_ts:
@@ -156,16 +132,18 @@ async def get_events(
    return live

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
    cached_urls = CACHE_FILE.load()
    cached_count = len(cached_urls)
    urls.update(cached_urls)

    log.info(f"Loaded {cached_count} event(s) from cache")
    log.info(f'Scraping from "{BASE_URL}"')

-    events = await get_events(client, set(cached_urls.keys()))
+    events = await get_events(cached_urls.keys())

    log.info(f"Processing {len(events)} new URL(s)")
@@ -217,6 +195,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
    if new_count := len(cached_urls) - cached_count:
        log.info(f"Collected and cached {new_count} new event(s)")
    else:
        log.info("No new events found")

scrapers/streamsgate.py

@ -4,7 +4,6 @@ from itertools import chain
from typing import Any from typing import Any
 from urllib.parse import urljoin
-import httpx
 from playwright.async_api import async_playwright

 from .utils import Cache, Time, get_logger, leagues, network

@@ -46,32 +45,20 @@ def get_event(t1: str, t2: str) -> str:
     return f"{t1.strip()} vs {t2.strip()}"

-async def get_api_data(client: httpx.AsyncClient, url: str) -> list[dict[str, Any]]:
-    try:
-        r = await client.get(url)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{url}": {e}')
-        return []
-    return r.json()
-
-async def refresh_api_cache(
-    client: httpx.AsyncClient,
-    now_ts: float,
-) -> list[dict[str, Any]]:
+async def refresh_api_cache(now_ts: float) -> list[dict[str, Any]]:
     log.info("Refreshing API cache")

     tasks = [
-        get_api_data(client, urljoin(BASE_URL, f"data/{sport}.json"))
+        network.request(
+            urljoin(BASE_URL, f"data/{sport}.json"),
+            log=log,
+        )
         for sport in SPORT_ENDPOINTS
     ]
     results = await asyncio.gather(*tasks)

-    if not (data := list(chain(*results))):
+    if not (data := [*chain.from_iterable(r.json() for r in results if r)]):
         return []

     for ev in data:

@@ -82,13 +69,11 @@ async def refresh_api_cache(
     return data

-async def get_events(
-    client: httpx.AsyncClient, cached_keys: set[str]
-) -> list[dict[str, str]]:
+async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
     now = Time.clean(Time.now())

     if not (api_data := API_FILE.load(per_entry=False, index=-1)):
-        api_data = await refresh_api_cache(client, now.timestamp())
+        api_data = await refresh_api_cache(now.timestamp())
         API_FILE.write(api_data)

@@ -104,27 +89,28 @@ async def get_events(
         t1, t2 = stream_group.get("away"), stream_group.get("home")
+        event = get_event(t1, t2)

         if not (event_ts and sport):
             continue

+        if "F1 Abu Dhabi" in event:  # api bug
+            continue
+
+        if f"[{sport}] {event} ({TAG})" in cached_keys:
+            continue

         event_dt = Time.from_ts(event_ts)

         if not start_dt <= event_dt <= end_dt:
             continue

-        event = get_event(t1, t2)
         if not (streams := stream_group.get("streams")):
             continue

         if not (url := streams[0].get("url")):
             continue

-        key = f"[{sport}] {event} ({TAG})"
-        if cached_keys & {key}:
-            continue

         events.append(
             {
                 "sport": sport,

@@ -137,16 +123,18 @@ async def get_events(
     return events

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
     cached_urls = CACHE_FILE.load()
     cached_count = len(cached_urls)

     urls.update(cached_urls)
     log.info(f"Loaded {cached_count} event(s) from cache")

     log.info(f'Scraping from "{BASE_URL}"')

-    events = await get_events(client, set(cached_urls.keys()))
+    events = await get_events(cached_urls.keys())
     log.info(f"Processing {len(events)} new URL(s)")

@@ -196,6 +184,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
     if new_count := len(cached_urls) - cached_count:
         log.info(f"Collected and cached {new_count} new event(s)")
     else:
         log.info("No new events found")
View file
@@ -1,9 +1,7 @@
 import re
 from functools import partial
-from typing import Any
 from urllib.parse import urljoin
-import httpx
 from playwright.async_api import async_playwright

 from .utils import Cache, Time, get_logger, leagues, network

@@ -35,52 +33,28 @@ def fix_sport(s: str) -> str:
     return s.capitalize() if len(s) >= 4 else s.upper()

-async def refresh_api_cache(
-    client: httpx.AsyncClient,
-    url: str,
-    now_ts: float,
-) -> list[dict[str, Any]]:
-    log.info("Refreshing API cache")
-    try:
-        r = await client.get(url)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{url}": {e}')
-        return []
-    if not (data := r.json()):
-        return []
-    data[-1]["timestamp"] = now_ts
-    return data
-
-async def get_events(
-    client: httpx.AsyncClient,
-    url: str,
-    cached_keys: set[str],
-) -> list[dict[str, str]]:
+async def get_events(url: str, cached_keys: list[str]) -> list[dict[str, str]]:
     now = Time.clean(Time.now())

     if not (api_data := API_FILE.load(per_entry=False, index=-1)):
-        api_data = await refresh_api_cache(
-            client,
+        api_data = []
+        if r := await network.request(
             urljoin(url, "api/matches/all-today"),
-            now.timestamp(),
-        )
+            log=log,
+        ):
+            api_data: list[dict] = r.json()
+            api_data[-1]["timestamp"] = now.timestamp()
         API_FILE.write(api_data)

     events = []
+    pattern = re.compile(r"[\n\r]+|\s{2,}")
     start_dt = now.delta(minutes=-30)
     end_dt = now.delta(minutes=30)
-    pattern = re.compile(r"[\n\r]+|\s{2,}")

     for event in api_data:
         if (category := event.get("category")) == "other":

@@ -99,13 +73,12 @@ async def get_events(
         sport = fix_sport(category)
         parts = pattern.split(event["title"].strip())
         name = " | ".join(p.strip() for p in parts if p.strip())
         logo = urljoin(url, poster) if (poster := event.get("poster")) else None

-        key = f"[{sport}] {name} ({TAG})"
-        if cached_keys & {key}:
+        if f"[{sport}] {name} ({TAG})" in cached_keys:
             continue

         sources: list[dict[str, str]] = event["sources"]

@@ -113,7 +86,8 @@ async def get_events(
         if not sources:
             continue

-        skip_types = {"alpha", "bravo"}
+        skip_types = ["alpha", "bravo"]
         valid_sources = [d for d in sources if d.get("source") not in skip_types]

         if not valid_sources:

@@ -122,6 +96,7 @@ async def get_events(
         srce = valid_sources[0]
         source_type = srce.get("source")
         stream_id = srce.get("id")

         if not (source_type and stream_id):

@@ -140,9 +115,11 @@ async def get_events(
     return events

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
     cached_urls = CACHE_FILE.load()
     cached_count = len(cached_urls)

     urls.update(cached_urls)
     log.info(f"Loaded {cached_count} event(s) from cache")

@@ -154,11 +131,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
     log.info(f'Scraping from "{base_url}"')

-    events = await get_events(
-        client,
-        base_url,
-        set(cached_urls.keys()),
-    )
+    events = await get_events(base_url, cached_urls.keys())
     log.info(f"Processing {len(events)} new URL(s)")

@@ -209,6 +182,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
     if new_count := len(cached_urls) - cached_count:
         log.info(f"Collected and cached {new_count} new event(s)")
     else:
         log.info("No new events found")
View file
@@ -1,8 +1,6 @@
 import re
-import httpx
-from .utils import Cache, Time, get_logger, leagues
+from .utils import Cache, Time, get_logger, leagues, network

 log = get_logger(__name__)

@@ -15,29 +13,22 @@ CACHE_FILE = Cache(f"{TAG.lower()}.json", exp=86_400)
 BASE_URL = "https://tvpass.org/playlist/m3u"

-async def get_data(client: httpx.AsyncClient) -> list[str]:
-    try:
-        r = await client.get(BASE_URL)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{BASE_URL}": {e}')
-        return []
-    return r.text.splitlines()
-
-async def get_events(client: httpx.AsyncClient) -> dict[str, dict[str, str | float]]:
-    now = Time.clean(Time.now()).timestamp()
+async def get_events() -> dict[str, dict[str, str | float]]:
     events = {}

-    data = await get_data(client)
+    if not (r := await network.request(BASE_URL, log=log)):
+        return events
+
+    now = Time.clean(Time.now())
+    data = r.text.splitlines()

     for i, line in enumerate(data, start=1):
         if line.startswith("#EXTINF"):
             tvg_id_match = re.search(r'tvg-id="([^"]*)"', line)
             tvg_name_match = re.search(r'tvg-name="([^"]*)"', line)
             group_title_match = re.search(r'group-title="([^"]*)"', line)

             tvg = tvg_id_match[1] if tvg_id_match else None

@@ -59,23 +50,23 @@ async def get_events(client: httpx.AsyncClient) -> dict[str, dict[str, str | flo
                 "logo": logo,
                 "id": tvg_id or "Live.Event.us",
                 "base": "https://tvpass.org",
-                "timestamp": now,
+                "timestamp": now.timestamp(),
             }

     return events

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
     if cached := CACHE_FILE.load():
         urls.update(cached)
         log.info(f"Loaded {len(urls)} event(s) from cache")
         return

     log.info(f'Scraping from "{BASE_URL}"')

-    events = await get_events(client)
-    urls.update(events)
+    urls.update(await get_events())
     CACHE_FILE.write(urls)
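For reference, the attribute regexes above applied to a made-up `#EXTINF` line (the attribute layout follows the usual M3U convention):

```python
import re

line = (
    '#EXTINF:-1 tvg-id="Example.Channel.us" tvg-name="Example Channel" '
    'group-title="Sports",Example Channel'
)

tvg_id_match = re.search(r'tvg-id="([^"]*)"', line)
tvg_name_match = re.search(r'tvg-name="([^"]*)"', line)
group_title_match = re.search(r'group-title="([^"]*)"', line)

print(tvg_id_match[1] if tvg_id_match else None)            # Example.Channel.us
print(tvg_name_match[1] if tvg_name_match else None)        # Example Channel
print(group_title_match[1] if group_title_match else None)  # Sports
```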
View file
@@ -7,7 +7,9 @@ from .config import Time
 class Cache:
     def __init__(self, file: str, exp: int | float) -> None:
         self.file = Path(__file__).parent.parent / "caches" / file
         self.exp = exp
         self.now_ts = Time.now().timestamp()

     def is_fresh(self, entry: dict) -> bool:
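This hunk only touches whitespace around `__init__`, but the `exp`/`now_ts` pair implies a time-based freshness test. A hedged reconstruction follows; the real `is_fresh` body is not shown in this diff, so the timestamp key and the comparison are assumptions:

```python
import time
from pathlib import Path


class Cache:
    def __init__(self, file: str, exp: int | float) -> None:
        self.file = Path("caches") / file  # simplified; the project resolves this relative to the package
        self.exp = exp
        self.now_ts = time.time()

    def is_fresh(self, entry: dict) -> bool:
        # Assumption: entries carry the "timestamp" value the scrapers write.
        return self.now_ts - entry.get("timestamp", 0) < self.exp


cache = Cache("demo.json", exp=86_400)
print(cache.is_fresh({"timestamp": time.time() - 3_600}))  # True: one hour old
```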
View file
@@ -45,11 +45,13 @@ class Time(datetime):
     def to_tz(self, tzone: str) -> "Time":
         dt = self.astimezone(self.ZONES[tzone])
         return self.__class__.fromtimestamp(dt.timestamp(), tz=self.ZONES[tzone])

     @classmethod
     def _to_class_tz(cls, dt) -> "Time":
         dt = dt.astimezone(cls.TZ)
         return cls.fromtimestamp(dt.timestamp(), tz=cls.TZ)

     @classmethod
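These context lines show that `Time` is a `datetime` subclass pinned to a class timezone. The `clean`, `delta`, and `from_ts` helpers used throughout the scrapers are not in this hunk, so the sketch below reconstructs plausible semantics only; treat every method body as an assumption:

```python
from datetime import datetime, timedelta, timezone


class Time(datetime):
    TZ = timezone.utc  # assumption: the project pins its own zone

    @classmethod
    def from_ts(cls, ts: float) -> "Time":
        return cls.fromtimestamp(ts, tz=cls.TZ)

    @classmethod
    def clean(cls, dt: datetime) -> "Time":
        # Assumption: normalize to the class tz and drop sub-second noise.
        dt = dt.astimezone(cls.TZ).replace(microsecond=0)
        return cls.fromtimestamp(dt.timestamp(), tz=cls.TZ)

    def delta(self, **kwargs) -> "Time":
        # Assumption: thin wrapper over timedelta arithmetic.
        return self.from_ts((self + timedelta(**kwargs)).timestamp())


now = Time.clean(Time.now(tz=Time.TZ))
print(now.delta(minutes=-30) <= now <= now.delta(minutes=30))  # True
```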
View file
@@ -67,7 +67,9 @@
     "NCAA AMERICAN FOOTBALL",
     "NCAA BASKETBALL",
     "NCAA FOOTBALL",
+    "NCAA MEN",
     "NCAA SPORTS",
+    "NCAA WOMEN",
     "NCAAB",
     "NCAAB D",
     "NCAAB D-I",
View file
@@ -22,9 +22,13 @@ COLORS = {
 class ColorFormatter(logging.Formatter):
     def format(self, record) -> str:
         color = COLORS.get(record.levelname, COLORS["reset"])
         levelname = record.levelname

         record.levelname = f"{color}{levelname:<8}{COLORS['reset']}"
         formatted = super().format(record)
         record.levelname = levelname

         return formatted

@@ -38,10 +42,15 @@ def get_logger(name: str | None = None) -> logging.Logger:
     if not logger.hasHandlers():
         handler = logging.StreamHandler()
         formatter = ColorFormatter(LOG_FMT, datefmt="%Y-%m-%d | %H:%M:%S")

         handler.setFormatter(formatter)
         logger.addHandler(handler)

         logger.setLevel(logging.INFO)
         logger.propagate = False

     return logger
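A runnable sketch of the formatter's save/restore trick: the levelname is colorized only for the rendered string, then restored so the same record formats cleanly through any other handler. `COLORS` and `LOG_FMT` are not fully visible in this diff, so the palette and format string below are assumptions:

```python
import logging

COLORS = {"INFO": "\033[32m", "ERROR": "\033[31m", "reset": "\033[0m"}  # assumed palette
LOG_FMT = "%(asctime)s | %(levelname)s | %(name)s | %(message)s"  # assumed format


class ColorFormatter(logging.Formatter):
    def format(self, record) -> str:
        color = COLORS.get(record.levelname, COLORS["reset"])
        levelname = record.levelname
        # Color only the rendered copy, then restore the record so other
        # handlers (e.g. a plain file handler) see the original levelname.
        record.levelname = f"{color}{levelname:<8}{COLORS['reset']}"
        formatted = super().format(record)
        record.levelname = levelname
        return formatted


handler = logging.StreamHandler()
handler.setFormatter(ColorFormatter(LOG_FMT, datefmt="%Y-%m-%d | %H:%M:%S"))
log = logging.getLogger("demo")
log.addHandler(handler)
log.setLevel(logging.INFO)
log.info("colored output")
```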
View file
@@ -12,6 +12,8 @@ from playwright.async_api import Browser, BrowserContext, Playwright, Request
 from .logger import get_logger

+logger = get_logger(__name__)

 T = TypeVar("T")

@@ -32,8 +34,6 @@ class Network:
             http2=True,
         )

-        self._logger = get_logger("network")

     @staticmethod
     def build_proxy_url(
         tag: str,

@@ -51,26 +51,37 @@ class Network:
             else urljoin(base, f"{tag}/{path}")
         )

-    async def check_status(self, url: str) -> bool:
+    async def request(
+        self,
+        url: str,
+        log: logging.Logger | None = None,
+        **kwargs,
+    ) -> httpx.Response | None:
+        log = log or logger
         try:
-            r = await self.client.get(url, timeout=5)
+            r = await self.client.get(url, **kwargs)
             r.raise_for_status()
-            return r.status_code == 200
+            return r
         except (httpx.HTTPError, httpx.TimeoutException) as e:
-            self._logger.debug(f"Status check failed for {url}: {e}")
-            return False
+            log.error(f'Failed to fetch "{url}": {e}')
+            return ""

     async def get_base(self, mirrors: list[str]) -> str | None:
         random.shuffle(mirrors)

-        tasks = [self.check_status(link) for link in mirrors]
-        results = await asyncio.gather(*tasks, return_exceptions=True)
-        working_mirrors = [
-            mirror for mirror, success in zip(mirrors, results) if success
-        ]
-        return working_mirrors[0] if working_mirrors else None
+        for mirror in mirrors:
+            if not (r := await self.request(mirror)):
+                continue
+            elif r.status_code != 200:
+                continue
+            return mirror

     @staticmethod
     async def safe_process(

@@ -80,8 +91,7 @@ class Network:
         log: logging.Logger | None = None,
     ) -> T | None:
-        if not log:
-            log = logging.getLogger(__name__)
+        log = log or logger

         task = asyncio.create_task(fn())

@@ -96,13 +106,15 @@ class Network:
                 await task
             except asyncio.CancelledError:
                 pass
             except Exception as e:
                 log.debug(f"URL {url_num}) Ignore exception after timeout: {e}")
-            return None
+            return
         except Exception as e:
             log.error(f"URL {url_num}) Unexpected error: {e}")
-            return None
+            return

     @staticmethod
     def capture_req(

@@ -133,6 +145,8 @@ class Network:
         log: logging.Logger | None = None,
     ) -> str | None:
+        log = log or logger

         page = await context.new_page()
         captured: list[str] = []

@@ -160,6 +174,7 @@ class Network:
             await asyncio.wait_for(wait_task, timeout=timeout)
         except asyncio.TimeoutError:
             log.warning(f"URL {url_num}) Timed out waiting for M3U8.")
             return
         finally:

@@ -173,17 +188,21 @@ class Network:
             if captured:
                 log.info(f"URL {url_num}) Captured M3U8")
                 return captured[0]

             log.warning(f"URL {url_num}) No M3U8 captured after waiting.")
             return
         except Exception as e:
             log.warning(f"URL {url_num}) Exception while processing: {e}")
             return
         finally:
             page.remove_listener("request", handler)
             await page.close()

     @staticmethod

@@ -195,7 +214,9 @@ class Network:
         if browser == "brave":
             brwsr = await playwright.chromium.connect_over_cdp("http://localhost:9222")
             context = brwsr.contexts[0]
         else:
             brwsr = await playwright.firefox.launch(headless=True)
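A standalone sketch of the new sequential mirror probe introduced in `get_base` (mirror URLs are placeholders; the real method reuses the class-level `httpx` client rather than opening one per call):

```python
import asyncio
import random

import httpx


async def get_base(mirrors: list[str]) -> str | None:
    random.shuffle(mirrors)
    async with httpx.AsyncClient(timeout=5) as client:
        for mirror in mirrors:
            try:
                r = await client.get(mirror)
                r.raise_for_status()
            except httpx.HTTPError:
                continue  # dead mirror: try the next one
            if r.status_code == 200:
                return mirror
    return None


print(asyncio.run(get_base(["https://example.com", "https://example.org"])))
```

Unlike the old `gather()` version, this stops at the first healthy mirror instead of probing every one, at the cost of serializing the checks.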
View file
@@ -5,7 +5,6 @@ from itertools import chain
 from typing import Any
 from urllib.parse import urljoin
-import httpx
 from playwright.async_api import BrowserContext, async_playwright

 from .utils import Cache, Time, get_logger, leagues, network

@@ -20,11 +19,15 @@ CACHE_FILE = Cache(f"{TAG.lower()}.json", exp=10_800)
 API_FILE = Cache(f"{TAG.lower()}-api.json", exp=28_800)

-API_MIRRORS = ["https://api.watchfooty.top", "https://api.watchfooty.st"]
-BASE_MIRRORS = ["https://www.watchfooty.top", "https://www.watchfooty.st"]
+API_URL = "https://api.watchfooty.st"
+BASE_MIRRORS = [
+    "https://www.watchfooty.top",
+    "https://www.watchfooty.st",
+    "https://www.watchfooty.su",
+]

-SPORT_ENDPOINTS = [
+VALID_SPORTS = [
     "american-football",
     # "australian-football",
     # "baseball",

@@ -42,37 +45,27 @@ SPORT_ENDPOINTS = [
 ]

-async def get_api_data(client: httpx.AsyncClient, url: str) -> list[dict[str, Any]]:
-    try:
-        r = await client.get(url, timeout=5)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{url}": {e}')
-        return []
-    return r.json()
-
-async def refresh_api_cache(
-    client: httpx.AsyncClient, url: str
-) -> list[dict[str, Any]]:
+async def refresh_api_cache(now: Time) -> list[dict[str, Any]]:
     log.info("Refreshing API cache")

     tasks = [
-        get_api_data(client, urljoin(url, f"api/v1/matches/{sport}"))
-        for sport in SPORT_ENDPOINTS
+        network.request(
+            urljoin(API_URL, "api/v1/matches/all"),
+            log=log,
+            params={"date": d.date()},
+        )
+        for d in [now, now.delta(days=1)]
     ]
     results = await asyncio.gather(*tasks)

-    if not (data := list(chain(*results))):
+    if not (data := [*chain.from_iterable(r.json() for r in results if r)]):
         return []

     for ev in data:
         ev["ts"] = ev.pop("timestamp")

-    data[-1]["timestamp"] = Time.clean(Time.now()).timestamp()
+    data[-1]["timestamp"] = now.timestamp()
     return data

@@ -115,12 +108,14 @@ async def process_event(
         text = await header.inner_text()
     except TimeoutError:
         log.warning(f"URL {url_num}) Can't find stream links header.")
         return

     match = re.search(r"\((\d+)\)", text)

     if not match or int(match[1]) == 0:
         log.warning(f"URL {url_num}) No available stream links.")
         return

     first_available = await page.wait_for_selector(

@@ -135,6 +130,7 @@ async def process_event(
         await asyncio.wait_for(wait_task, timeout=6)
     except asyncio.TimeoutError:
         log.warning(f"URL {url_num}) Timed out waiting for M3U8.")
         return
     finally:

@@ -148,48 +144,57 @@ async def process_event(
         if captured:
             log.info(f"URL {url_num}) Captured M3U8")
             return captured[-1]

         log.warning(f"URL {url_num}) No M3U8 captured after waiting.")
         return
     except Exception as e:
         log.warning(f"URL {url_num}) Exception while processing: {e}")
         return
     finally:
         page.remove_listener("request", handler)
         await page.close()

-async def get_events(
-    client: httpx.AsyncClient,
-    api_url: str,
-    base_url: str,
-    cached_keys: set[str],
-) -> list[dict[str, str]]:
+async def get_events(base_url: str, cached_keys: list[str]) -> list[dict[str, str]]:
+    now = Time.clean(Time.now())

     if not (api_data := API_FILE.load(per_entry=False, index=-1)):
-        api_data = await refresh_api_cache(client, api_url)
+        api_data = await refresh_api_cache(now)
         API_FILE.write(api_data)

     events = []
-    now = Time.clean(Time.now())
+    pattern = re.compile(r"\-+|\(")
     start_dt = now.delta(minutes=-30)
     end_dt = now.delta(minutes=5)
-    pattern = re.compile(r"\-+|\(")

     for event in api_data:
         match_id = event.get("matchId")
         name = event.get("title")
         league = event.get("league")

         if not (match_id and name and league):
             continue

+        if event["sport"] not in VALID_SPORTS:
+            continue
+
+        sport = pattern.split(league, 1)[0].strip()
+
+        if f"[{sport}] {name} ({TAG})" in cached_keys:
+            continue
+
         if not (ts := event.get("ts")):
             continue

@@ -200,14 +205,7 @@ async def get_events(
         if not start_dt <= event_dt <= end_dt:
             continue

-        sport = pattern.split(league, 1)[0].strip()
-        logo = urljoin(api_url, poster) if (poster := event.get("poster")) else None
-
-        key = f"[{sport}] {name} ({TAG})"
-        if cached_keys & {key}:
-            continue
+        logo = urljoin(API_URL, poster) if (poster := event.get("poster")) else None

         events.append(
             {

@@ -222,31 +220,27 @@ async def get_events(
     return events

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
     cached_urls = CACHE_FILE.load()
     valid_urls = {k: v for k, v in cached_urls.items() if v["url"]}
     valid_count = cached_count = len(valid_urls)

     urls.update(valid_urls)
     log.info(f"Loaded {cached_count} event(s) from cache")

-    base_url = await network.get_base(BASE_MIRRORS)
-    api_url = await network.get_base(API_MIRRORS)
-
-    if not (base_url and api_url):
+    if not (base_url := await network.get_base(BASE_MIRRORS)):
         log.warning("No working Watch Footy mirrors")
         CACHE_FILE.write(cached_urls)
         return

     log.info(f'Scraping from "{base_url}"')

-    events = await get_events(
-        client,
-        api_url,
-        base_url,
-        set(cached_urls.keys()),
-    )
+    events = await get_events(base_url, cached_urls.keys())
     log.info(f"Processing {len(events)} new URL(s)")

@@ -299,6 +293,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
     if new_count := valid_count - cached_count:
         log.info(f"Collected and cached {new_count} new event(s)")
     else:
         log.info("No new events found")
View file
@@ -1,7 +1,6 @@
 import asyncio
 from functools import partial
-import httpx
 from playwright.async_api import async_playwright
 from selectolax.parser import HTMLParser

@@ -24,22 +23,15 @@ def fix_event(s: str) -> str:
     return " vs ".join(s.split("@"))

-async def refresh_html_cache(
-    client: httpx.AsyncClient, url: str
-) -> dict[str, dict[str, str | float]]:
-    try:
-        r = await client.get(url)
-        r.raise_for_status()
-    except Exception as e:
-        log.error(f'Failed to fetch "{url}": {e}')
-        return {}
+async def refresh_html_cache(url: str) -> dict[str, dict[str, str | float]]:
+    events = {}
+
+    if not (html_data := await network.request(url, log=log)):
+        return events

     now = Time.clean(Time.now())
-    soup = HTMLParser(r.content)
-    events = {}
+    soup = HTMLParser(html_data.content)

     title = soup.css_first("title").text(strip=True)

@@ -87,15 +79,13 @@ async def refresh_html_cache(
     return events

-async def get_events(
-    client: httpx.AsyncClient, cached_keys: set[str]
-) -> list[dict[str, str]]:
+async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
     now = Time.clean(Time.now())

     if not (events := HTML_CACHE.load()):
         log.info("Refreshing HTML cache")

-        tasks = [refresh_html_cache(client, url) for url in BASE_URLS.values()]
+        tasks = [refresh_html_cache(url) for url in BASE_URLS.values()]
         results = await asyncio.gather(*tasks)

@@ -109,7 +99,7 @@ async def get_events(
     end_ts = now.delta(minutes=30).timestamp()

     for k, v in events.items():
-        if cached_keys & {k}:
+        if k in cached_keys:
             continue

         if not start_ts <= v["event_ts"] <= end_ts:

@@ -120,16 +110,18 @@ async def get_events(
     return live

-async def scrape(client: httpx.AsyncClient) -> None:
+async def scrape() -> None:
     cached_urls = CACHE_FILE.load()
     cached_count = len(cached_urls)

     urls.update(cached_urls)
     log.info(f"Loaded {cached_count} event(s) from cache")

     log.info(f'Scraping from "{' & '.join(BASE_URLS.values())}"')

-    events = await get_events(client, set(cached_urls.keys()))
+    events = await get_events(cached_urls.keys())
     log.info(f"Processing {len(events)} new URL(s)")

@@ -179,6 +171,7 @@ async def scrape(client: httpx.AsyncClient) -> None:
     if new_count := len(cached_urls) - cached_count:
         log.info(f"Collected and cached {new_count} new event(s)")
     else:
         log.info("No new events found")
View file
@@ -1,26 +1,11 @@
-## Base Log @ 2025-12-17 20:44 UTC
+## Base Log @ 2025-12-18 20:40 UTC

-### ✅ Working Streams: 129<br>❌ Dead Streams: 17
+### ✅ Working Streams: 144<br>❌ Dead Streams: 2

 | Channel | Error (Code) | Link |
 | ------- | ------------ | ---- |
-| BET | HTTP Error (404) | `http://fl1.moveonjoy.com/BET_EAST/index.m3u8` |
-| CNBC | HTTP Error (404) | `https://fl1.moveonjoy.com/CNBC/index.m3u8` |
-| Discovery Life | HTTP Error (404) | `https://fl1.moveonjoy.com/DISCOVERY_LIFE/index.m3u8` |
-| FDSN Florida | HTTP Error (403) | `http://tv14s.xyz:8080/A1Jay5/362586/46794` |
-| FYI TV | HTTP Error (404) | `http://fl1.moveonjoy.com/FYI/index.m3u8` |
-| Grit TV | HTTP Error (404) | `http://fl1.moveonjoy.com/GRIT_TV/index.m3u8` |
-| HBO 2 | HTTP Error (404) | `http://fl1.moveonjoy.com/HBO_2/index.m3u8` |
-| HBO Comedy | HTTP Error (404) | `http://fl1.moveonjoy.com/HBO_COMEDY/index.m3u8` |
-| HBO Family | HTTP Error (404) | `https://fl1.moveonjoy.com/HBO_FAMILY/index.m3u8` |
-| HBO Zone | HTTP Error (404) | `https://fl1.moveonjoy.com/HBO_ZONE/index.m3u8` |
-| MLB Network | HTTP Error (404) | `https://fl1.moveonjoy.com/MLB_NETWORK/index.m3u8` |
-| NBA TV | HTTP Error (404) | `http://fl1.moveonjoy.com/NBA_TV/index.m3u8` |
-| National Geographic | HTTP Error (404) | `http://fl1.moveonjoy.com/National_Geographic/index.m3u8` |
-| Paramount Network | HTTP Error (404) | `https://fl1.moveonjoy.com/PARAMOUNT_NETWORK/index.m3u8` |
-| Showtime | HTTP Error (404) | `http://fl1.moveonjoy.com/SHOWTIME/index.m3u8` |
-| Smithsonian Channel | HTTP Error (404) | `http://fl1.moveonjoy.com/SMITHSONIAN_CHANNEL/index.m3u8` |
-| Tennis Channel | HTTP Error (404) | `https://fl1.moveonjoy.com/TENNIS_CHANNEL/index.m3u8` |
+| ESPN U | HTTP Error (403) | `http://cord-cutter.net:8080/30550113/30550113/10255` |
+| ION TV | HTTP Error (403) | `http://cord-cutter.net:8080/30550113/30550113/9297` |

 ---

 #### Base Channels URL
 ```