Compare commits


No commits in common. "19731cc7c47f515c7b8c20e2b387b726b3a39d98" and "90ec2f586d7683e22cdb1939805d130b404d3f86" have entirely different histories.

12 changed files with 100074 additions and 106601 deletions

File diff suppressed because it is too large

M3U8/TV.xml (202277 changed lines)

File diff suppressed because one or more lines are too long


@@ -28,12 +28,10 @@ http://fl1.moveonjoy.com/Aspire/index.m3u8
 http://fl1.moveonjoy.com/BBC_AMERICA/index.m3u8
 #EXTINF:-1 tvg-chno="10" tvg-id="BBC.News.(North.America).HD.us2" tvg-name="BBC World News" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s89542_dark_360w_270h.png" group-title="TV",BBC World News
-#EXTVLCOPT:http-user-agent=curl/8.5.0
-http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/1723
+http://fl1.moveonjoy.com/BBC_WORLD_NEWS/index.m3u8
 #EXTINF:-1 tvg-chno="11" tvg-id="BET.HD.us2" tvg-name="BET" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10051_dark_360w_270h.png" group-title="TV",BET
-#EXTVLCOPT:http-user-agent=curl/8.5.0
-http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/2071
+https://streamer1.nexgen.bz/BET/index.m3u8
 #EXTINF:-1 tvg-chno="12" tvg-id="Big.Ten.Network.HD.us2" tvg-name="Big Ten Network" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s56783_dark_360w_270h.png" group-title="TV",Big Ten Network
 http://23.237.104.106:8080/USA_BTN/index.m3u8
@@ -150,8 +148,7 @@ http://mytvstream.net:8080/live/A1Jay5/362586/66795.m3u8
 http://mytvstream.net:8080/live/A1Jay5/362586/58827.m3u8
 #EXTINF:-1 tvg-chno="50" tvg-id="FanDuel.Sports.Network.Ohio.(Cleveland).HDTV.us" tvg-name="FDSN Ohio" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s49691_dark_360w_270h.png" group-title="TV",FDSN Ohio
-#EXTVLCOPT:http-user-agent=curl/8.5.0
-http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222126
+http://mytvstream.net:8080/live/A1Jay5/362586/17752.m3u8
 #EXTINF:-1 tvg-chno="51" tvg-id="FanDuel.Sports.Network.Oklahoma.24/7.HDTV.(Tulsa).us" tvg-name="FDSN Oklahoma" tvg-logo="https://i.gyazo.com/80ad6fd142cd67f06eef58d9ce5aa72b.png" group-title="TV",FDSN Oklahoma
 http://mytvstream.net:8080/live/A1Jay5/362586/20934.m3u8
@@ -160,16 +157,13 @@ http://mytvstream.net:8080/live/A1Jay5/362586/20934.m3u8
 http://mytvstream.net:8080/live/A1Jay5/362586/221151.m3u8
 #EXTINF:-1 tvg-chno="53" tvg-id="FanDuel.Sports.Network.Southeast.HDTV.(Mont./Birm./Dothan/Mobile.AL).us" tvg-name="FDSN Southeast" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s20789_dark_360w_270h.png" group-title="TV",FDSN Southeast
-#EXTVLCOPT:http-user-agent=curl/8.5.0
-http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222130
+http://mytvstream.net:8080/live/A1Jay5/362586/2213.m3u8
 #EXTINF:-1 tvg-chno="54" tvg-id="FanDuel.Sports.Network.Southwest.HDTV.24/7.(Main).us" tvg-name="FDSN Southwest" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s59629_dark_360w_270h.png" group-title="TV",FDSN Southwest
-#EXTVLCOPT:http-user-agent=curl/8.5.0
-http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/220452
+http://mytvstream.net:8080/live/A1Jay5/362586/21843.m3u8
 #EXTINF:-1 tvg-chno="55" tvg-id="FanDuel.Sports.Network.Sun.South.24/7.HDTV.(South.Marlins,.Rays,.Heat).us" tvg-name="FDSN Sun" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s61084_dark_360w_270h.png" group-title="TV",FDSN Sun
-#EXTVLCOPT:http-user-agent=curl/8.5.0
-http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222132
+http://mytvstream.net:8080/live/A1Jay5/362586/104917.m3u8
 #EXTINF:-1 tvg-chno="56" tvg-id="FanDuel.Sports.Network.West.HDTV.us" tvg-name="FDSN West" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s59627_dark_360w_270h.png" group-title="TV",FDSN West
 http://mytvstream.net:8080/live/A1Jay5/362586/20932.m3u8
@@ -217,8 +211,7 @@ https://fl1.moveonjoy.com/FXX/index.m3u8
 http://fl1.moveonjoy.com/FYI/index.m3u8
 #EXTINF:-1 tvg-chno="71" tvg-id="Game.Show.Network.HD.us2" tvg-name="Game Show Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s14909_dark_360w_270h.png" group-title="TV",Game Show Network
-#EXTVLCOPT:http-user-agent=curl/8.5.0
-http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/1948
+https://streamer1.nexgen.bz/GSN/index.m3u8
 #EXTINF:-1 tvg-chno="72" tvg-id="get.us2" tvg-name="getTV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s82563_dark_360w_270h.png" group-title="TV",getTV
 http://fl1.moveonjoy.com/GET_TV/index.m3u8
@@ -242,7 +235,7 @@ https://fl1.moveonjoy.com/HALLMARK_MOVIES_MYSTERIES/index.m3u8
 http://fl1.moveonjoy.com/HBO/index.m3u8
 #EXTINF:-1 tvg-chno="79" tvg-id="HBO2.HD.us2" tvg-name="HBO 2" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s68140_dark_360w_270h.png" group-title="TV",HBO 2
-http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/2071
+http://fl1.moveonjoy.com/HBO_2/index.m3u8
 #EXTINF:-1 tvg-chno="80" tvg-id="HBO.Comedy.HD.us2" tvg-name="HBO Comedy" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s59839_dark_360w_270h.png" group-title="TV",HBO Comedy
 http://fl1.moveonjoy.com/HBO_COMEDY/index.m3u8
@@ -443,4 +436,4 @@ http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/129973
 https://fl1.moveonjoy.com/VICELAND/index.m3u8
 #EXTINF:-1 tvg-chno="146" tvg-id="Yes.Network.us2" tvg-name="YES Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s30017_dark_360w_270h.png" group-title="TV",YES Network
 https://fl1.moveonjoy.com/YES_NETWORK/index.m3u8
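For readers unfamiliar with the playlist format above: each channel is a `#EXTINF` header line followed by a stream URL, and the `tvg-*` metadata are plain `key="value"` pairs. A minimal, self-contained sketch of pulling those attributes out (the sample line's logo URL is illustrative, not from the repo):

```python
import re

# Illustrative EXTINF line in the same shape as the playlist above.
line = (
    '#EXTINF:-1 tvg-chno="10" tvg-id="BBC.News.(North.America).HD.us2" '
    'tvg-name="BBC World News" tvg-logo="http://example.com/logo.png" '
    'group-title="TV",BBC World News'
)

# Collect every key="value" attribute from the EXTINF header.
attrs = dict(re.findall(r'([\w-]+)="([^"]*)"', line))
print(attrs["tvg-chno"])  # 10
print(attrs["tvg-id"])    # BBC.News.(North.America).HD.us2
```

This is the same extraction the EPG script performs with per-attribute `re.search` calls.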


@@ -13,9 +13,7 @@ BASE_M3U8 = Path(__file__).parent / "base.m3u8"
 EPG_FILE = Path(__file__).parent / "TV.xml"
-LIVE_IMG = "https://i.gyazo.com/978f2eb4a199ca5b56b447aded0cb9e3.png"
-EPG_URLS = {
+EPG_URLS = [
     "https://epgshare01.online/epgshare01/epg_ripper_CA2.xml.gz",
     "https://epgshare01.online/epgshare01/epg_ripper_DUMMY_CHANNELS.xml.gz",
     "https://epgshare01.online/epgshare01/epg_ripper_FANDUEL1.xml.gz",
@@ -24,7 +22,9 @@ EPG_URLS = {
     "https://epgshare01.online/epgshare01/epg_ripper_US2.xml.gz",
     "https://epgshare01.online/epgshare01/epg_ripper_US_LOCALS1.xml.gz",
     "https://i.mjh.nz/Roku/all.xml.gz",
-}
+]
+LIVE_IMG = "https://i.gyazo.com/978f2eb4a199ca5b56b447aded0cb9e3.png"
 DUMMIES = {
     "Basketball.Dummy.us": LIVE_IMG,
@@ -48,47 +48,39 @@ REPLACE_IDs = {
 def get_tvg_ids() -> dict[str, str]:
-    tvg: dict[str, str] = {}
-    for line in BASE_M3U8.read_text(encoding="utf-8").splitlines():
-        if not line.startswith("#EXTINF"):
-            continue
-        tvg_id = re.search(r'tvg-id="([^"]*)"', line)
-        tvg_logo = re.search(r'tvg-logo="([^"]*)"', line)
-        if tvg_id:
-            tvg[tvg_id[1]] = tvg_logo[1] if tvg_logo else None
-    tvg |= DUMMIES
-    tvg |= {v["old"]: LIVE_IMG for v in REPLACE_IDs.values()}
+    base_m3u8 = BASE_M3U8.read_text(encoding="utf-8").splitlines()
+    tvg = {}
+    for line in base_m3u8:
+        if line.startswith("#EXTINF"):
+            tvg_id = re.search(r'tvg-id="([^"]*)"', line)[1]
+            tvg_logo = re.search(r'tvg-logo="([^"]*)"', line)[1]
+            tvg[tvg_id] = tvg_logo
     return tvg
 async def fetch_xml(url: str) -> ET.Element | None:
-    if not (xml_data := await network.request(url, log=log)):
+    if not (html_data := await network.request(url, log=log)):
         return
     try:
-        log.info(f'Parsing XML from "{url}"')
-        data = gzip.decompress(xml_data.content)
-        return ET.fromstring(data)
+        decompressed_data = gzip.decompress(html_data.content)
+        return ET.fromstring(decompressed_data)
     except Exception as e:
-        log.error(f'Failed to parse from "{url}": {e}')
+        log.error(f'Failed to decompress and parse XML from "{url}": {e}')
         return
 def hijack_id(
-    root: ET.Element,
-    *,
     old: str,
     new: str,
     text: str,
+    root: ET.Element,
 ) -> None:
     og_channel = root.find(f"./channel[@id='{old}']")
@@ -98,7 +90,7 @@ def hijack_id(
     display_name = og_channel.find("display-name")
-    if (display_name := og_channel.find("display-name")) is not None:
+    if display_name is not None:
         new_channel.append(ET.Element("display-name", display_name.attrib))
         new_channel[-1].text = text
@@ -122,7 +114,9 @@ def hijack_id(
             new_program.append(new_child)
         for tag_name in ["title", "desc", "sub-title"]:
-            if (tag := new_program.find(tag_name)) is not None:
+            tag = new_program.find(tag_name)
+            if tag is not None:
                 tag.text = text
         root.remove(program)
@@ -135,60 +129,48 @@ async def main() -> None:
     tvg_ids = get_tvg_ids()
-    parsed_tvg_ids: set[str] = set()
+    tvg_ids |= DUMMIES | {v["old"]: LIVE_IMG for v in REPLACE_IDs.values()}
     root = ET.Element("tv")
-    epgs = await asyncio.gather(*(fetch_xml(url) for url in EPG_URLS))
-    for epg_data in (epg for epg in epgs if epg is not None):
+    tasks = [fetch_xml(url) for url in EPG_URLS]
+    results = await asyncio.gather(*tasks)
+    for epg_data in results:
+        if epg_data is None:
+            continue
         for channel in epg_data.findall("channel"):
-            if (channel_id := channel.get("id")) not in tvg_ids:
-                continue
-            parsed_tvg_ids.add(channel_id)
-            for icon_tag in channel.findall("icon"):
-                if logo := tvg_ids.get(channel_id):
-                    icon_tag.set("src", logo)
-            if (url_tag := channel.find("url")) is not None:
-                channel.remove(url_tag)
-            root.append(channel)
+            if (channel_id := channel.get("id")) in tvg_ids:
+                for icon_tag in channel.findall("icon"):
+                    if logo := tvg_ids.get(channel_id):
+                        icon_tag.set("src", logo)
+                if (url_tag := channel.find("url")) is not None:
+                    channel.remove(url_tag)
+                root.append(channel)
         for program in epg_data.findall("programme"):
-            if program.get("channel") not in tvg_ids:
-                continue
-            title_text = program.find("title").text
-            subtitle = program.find("sub-title")
-            if (
-                title_text in ["NHL Hockey", "Live: NFL Football"]
-                and subtitle is not None
-            ):
-                program.find("title").text = f"{title_text} {subtitle.text}"
-            root.append(program)
-    for title, ids in REPLACE_IDs.items():
-        hijack_id(root, **ids, text=title)
-    if missing_ids := set(tvg_ids) - parsed_tvg_ids:
-        log.warning(f"Missed {len(missing_ids)} TVG ID(s)")
-        for channel_id in missing_ids:
-            log.warning(f"Missing: {channel_id}")
+            if program.get("channel") in tvg_ids:
+                title_text = program.find("title").text
+                subtitle = program.find("sub-title")
+                if (
+                    title_text in ["NHL Hockey", "Live: NFL Football"]
+                    and subtitle is not None
+                ):
+                    program.find("title").text = f"{title_text} {subtitle.text}"
+                root.append(program)
+    for k, v in REPLACE_IDs.items():
+        hijack_id(**v, text=k, root=root)
     tree = ET.ElementTree(root)
-    tree.write(
-        EPG_FILE,
-        encoding="utf-8",
-        xml_declaration=True,
-    )
+    tree.write(EPG_FILE, encoding="utf-8", xml_declaration=True)
     log.info(f"EPG saved to {EPG_FILE.resolve()}")
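For context on the channel loop in `main()` above: it keeps only `<channel>` elements whose `id` appears in `tvg_ids`, rewrites their `<icon>` tags with the curated logo, and strips `<url>` children. A self-contained sketch of that filtering step, with made-up channel ids and logo names:

```python
import xml.etree.ElementTree as ET

# Made-up source EPG and curated id -> logo map, mirroring the loop in main().
src = ET.fromstring(
    '<tv>'
    '<channel id="Keep.us"><icon src="old.png"/><url>x</url></channel>'
    '<channel id="Drop.us"><icon src="old.png"/></channel>'
    '</tv>'
)
tvg_ids = {"Keep.us": "new.png"}

root = ET.Element("tv")
for channel in src.findall("channel"):
    if (channel_id := channel.get("id")) in tvg_ids:
        for icon_tag in channel.findall("icon"):
            if logo := tvg_ids.get(channel_id):
                icon_tag.set("src", logo)  # swap in the curated logo
        if (url_tag := channel.find("url")) is not None:
            channel.remove(url_tag)  # drop the upstream <url> element
        root.append(channel)
```

After the loop, only `Keep.us` survives, with its icon rewritten and its `<url>` removed.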

File diff suppressed because it is too large


@@ -21,7 +21,6 @@ from scrapers import (
     streamfree,
     streamhub,
     streamsgate,
-    totalsportek,
     tvpass,
     watchfooty,
     webcast,
@@ -71,7 +70,6 @@ async def main() -> None:
         asyncio.create_task(streamcenter.scrape(xtrnl_brwsr)),
         # asyncio.create_task(streamhub.scrape(xtrnl_brwsr)),
         asyncio.create_task(streamsgate.scrape(xtrnl_brwsr)),
-        asyncio.create_task(totalsportek.scrape(hdl_brwsr)),
         asyncio.create_task(webcast.scrape(hdl_brwsr)),
         asyncio.create_task(watchfooty.scrape(xtrnl_brwsr)),
     ]
@@ -82,10 +80,10 @@ async def main() -> None:
         asyncio.create_task(pawa.scrape()),
         asyncio.create_task(roxie.scrape()),
         asyncio.create_task(shark.scrape()),
-        # asyncio.create_task(streambtw.scrape()),
+        asyncio.create_task(streambtw.scrape()),
         asyncio.create_task(streamfree.scrape()),
         asyncio.create_task(tvpass.scrape()),
-        # asyncio.create_task(xstreameast.scrape()),
+        asyncio.create_task(xstreameast.scrape()),
     ]
     await asyncio.gather(*(pw_tasks + httpx_tasks))
@@ -114,7 +112,6 @@ async def main() -> None:
         | streamfree.urls
         | streamhub.urls
         | streamsgate.urls
-        | totalsportek.urls
         | tvpass.urls
         | watchfooty.urls
         | webcast.urls


@@ -10,12 +10,12 @@ log = get_logger(__name__)
 urls: dict[str, dict[str, str | float]] = {}
-TAG = "LIVETVSX"
-CACHE_FILE = Cache(TAG, exp=10_800)
+TAG = "LTVSX"
 XML_CACHE = Cache(f"{TAG}-xml", exp=28_000)
+CACHE_FILE = Cache(TAG, exp=10_800)
 BASE_URL = "https://cdn.livetv861.me/rss/upcoming_en.xml"
 VALID_SPORTS = {"NBA", "NHL", "NFL", "NCAA", "MLB"}
@@ -160,8 +160,8 @@ async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
     live = []
-    start_ts = now.delta(hours=-1).timestamp()
-    end_ts = now.delta(minutes=5).timestamp()
+    start_ts = now.delta(minutes=-30).timestamp()
+    end_ts = now.delta(minutes=30).timestamp()
     for k, v in events.items():
         if k in cached_keys:
@@ -193,7 +193,7 @@ async def scrape(browser: Browser) -> None:
     log.info(f"Processing {len(events)} new URL(s)")
     if events:
-        async with network.event_context(browser, ignore_https=True) as context:
+        async with network.event_context(browser) as context:
             for i, ev in enumerate(events, start=1):
                 async with network.event_page(context) as page:
                     handler = partial(
@@ -210,11 +210,10 @@ async def scrape(browser: Browser) -> None:
                         log=log,
                     )
-                    sport, event, ts, link = (
+                    sport, event, ts = (
                         ev["sport"],
                         ev["event"],
                         ev["timestamp"],
-                        ev["link"],
                     )
                     key = f"[{sport}] {event} ({TAG})"
@@ -227,7 +226,6 @@ async def scrape(browser: Browser) -> None:
                         "base": "https://livetv.sx/enx/",
                         "timestamp": ts,
                         "id": tvg_id or "Live.Event.us",
-                        "link": link,
                     }
                     cached_urls[key] = entry
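The `start_ts`/`end_ts` change above narrows the event pickup window from (-1 hour, +5 minutes) to a symmetric ±30 minutes around now. The repo's `Time` helper is not shown in this diff; with plain `datetime` the same window would look like:

```python
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)
# Symmetric +/-30 minute window, matching the updated delta() calls.
start_ts = (now - timedelta(minutes=30)).timestamp()
end_ts = (now + timedelta(minutes=30)).timestamp()

def in_window(event_ts: float) -> bool:
    # An event is picked up only if its start time falls inside the window.
    return start_ts <= event_ts <= end_ts
```

Events starting right now pass; anything hours away is skipped until a later run.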


@@ -1,146 +0,0 @@
-from functools import partial
-from urllib.parse import urljoin, urlparse
-from playwright.async_api import Browser
-from selectolax.parser import HTMLParser
-from .utils import Cache, Time, get_logger, leagues, network
-log = get_logger(__name__)
-urls: dict[str, dict[str, str | float]] = {}
-TAG = "TOTALSPRTK"
-CACHE_FILE = Cache(TAG, exp=28_800)
-BASE_URL = "https://live3.totalsportek777.com/"
-def fix_txt(s: str) -> str:
-    s = " ".join(s.split())
-    return s.upper() if s.islower() else s
-async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
-    events = []
-    if not (html_data := await network.request(BASE_URL, log=log)):
-        return events
-    soup = HTMLParser(html_data.content)
-    sport = "Live Event"
-    for node in soup.css("a"):
-        if not node.attributes.get("class"):
-            continue
-        if (parent := node.parent) and "my-1" in parent.attributes.get("class", ""):
-            if span := node.css_first("span"):
-                sport = span.text(strip=True)
-                sport = fix_txt(sport)
-        if not (teams := [t.text(strip=True) for t in node.css(".col-7 .col-12")]):
-            continue
-        if not (href := node.attributes.get("href")):
-            continue
-        href = urlparse(href).path if href.startswith("http") else href
-        if not (time_node := node.css_first(".col-3 span")):
-            continue
-        if time_node.text(strip=True) != "MatchStarted":
-            continue
-        event_name = fix_txt(" vs ".join(teams))
-        if f"[{sport}] {event_name} ({TAG})" in cached_keys:
-            continue
-        events.append(
-            {
-                "sport": sport,
-                "event": event_name,
-                "link": urljoin(BASE_URL, href),
-            }
-        )
-    return events
-async def scrape(browser: Browser) -> None:
-    cached_urls = CACHE_FILE.load()
-    valid_urls = {k: v for k, v in cached_urls.items() if v["url"]}
-    valid_count = cached_count = len(valid_urls)
-    urls.update(valid_urls)
-    log.info(f"Loaded {cached_count} event(s) from cache")
-    log.info(f'Scraping from "{BASE_URL}"')
-    events = await get_events(cached_urls.keys())
-    log.info(f"Processing {len(events)} new URL(s)")
-    if events:
-        now = Time.clean(Time.now())
-        async with network.event_context(browser) as context:
-            for i, ev in enumerate(events, start=1):
-                async with network.event_page(context) as page:
-                    handler = partial(
-                        network.process_event,
-                        url=ev["link"],
-                        url_num=i,
-                        page=page,
-                        log=log,
-                    )
-                    url = await network.safe_process(
-                        handler,
-                        url_num=i,
-                        semaphore=network.HTTP_S,
-                        log=log,
-                    )
-                    sport, event, link = (
-                        ev["sport"],
-                        ev["event"],
-                        ev["link"],
-                    )
-                    key = f"[{sport}] {event} ({TAG})"
-                    tvg_id, logo = leagues.get_tvg_info(sport, event)
-                    entry = {
-                        "url": url,
-                        "logo": logo,
-                        "base": link,
-                        "timestamp": now.timestamp(),
-                        "id": tvg_id or "Live.Event.us",
-                        "link": link,
-                    }
-                    cached_urls[key] = entry
-                    if url:
-                        valid_count += 1
-                        urls[key] = entry
-    if new_count := valid_count - cached_count:
-        log.info(f"Collected and cached {new_count} new event(s)")
-    else:
-        log.info("No new events found")
-    CACHE_FILE.write(cached_urls)


@@ -811,94 +811,65 @@
     "teams": {
         "NBA": [
             "76ers",
-            "Atlanta",
             "Atlanta Hawks",
             "Blazers",
-            "Boston",
             "Boston Celtics",
             "Brooklyn Nets",
             "Bucks",
             "Bulls",
             "Cavaliers",
             "Celtics",
-            "Charlotte",
             "Charlotte Hornets",
-            "Chicago",
             "Chicago Bulls",
-            "Cleveland",
             "Cleveland Cavaliers",
             "Clippers",
-            "Dallas",
             "Dallas Mavericks",
-            "Denver",
             "Denver Nuggets",
-            "Detroit",
             "Detroit Pistons",
-            "Golden State",
             "Golden State Warriors",
             "Grizzlies",
             "Hawks",
             "Heat",
             "Hornets",
-            "Houston",
             "Houston Rockets",
-            "Indiana",
             "Indiana Pacers",
             "Jazz",
             "Kings",
             "Knicks",
             "Lakers",
-            "Los Angeles",
             "Los Angeles Clippers",
             "Los Angeles Lakers",
             "Magic",
             "Mavericks",
-            "Memphis",
             "Memphis Grizzlies",
-            "Miami",
             "Miami Heat",
-            "Milwaukee",
             "Milwaukee Bucks",
-            "Minnesota",
             "Minnesota Timberwolves",
             "Nets",
-            "New Orleans",
             "New Orleans Pelicans",
-            "New York",
             "New York Knicks",
             "Nuggets",
-            "Oklahoma",
-            "Oklahoma City",
             "Oklahoma City Thunder",
-            "Orlando",
             "Orlando Magic",
             "Pacers",
             "Pelicans",
-            "Philadelphia",
             "Philadelphia 76ers",
-            "Phoenix",
             "Phoenix Suns",
             "Pistons",
-            "Portland",
             "Portland Trail Blazers",
             "Raptors",
             "Rockets",
-            "Sacramento",
             "Sacramento Kings",
-            "San Antonio",
             "San Antonio Spurs",
             "Sixers",
             "Spurs",
             "Suns",
             "Thunder",
             "Timberwolves",
-            "Toronto",
             "Toronto Raptors",
             "Trail Blazers",
-            "Utah",
             "Utah Jazz",
             "Warriors",
-            "Washington",
             "Washington Wizards",
             "Wizards",
             "Wolves"
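The removed entries above are all city-only aliases ("Atlanta", "Boston", …), which are ambiguous as substrings of event names. `get_tvg_info` itself is not shown in this diff; the following is only a sketch of the substring-style lookup such an alias list supports, with a hypothetical helper and an abridged alias list:

```python
# Abridged alias list in the same shape as the JSON above (illustrative).
NBA_TEAMS = ["Atlanta Hawks", "Boston Celtics", "Hawks", "Celtics", "Lakers"]

def matches_league(event: str, aliases: list[str]) -> bool:
    # Try longest aliases first so "Atlanta Hawks" wins before "Hawks".
    return any(alias in event for alias in sorted(aliases, key=len, reverse=True))

print(matches_league("Atlanta Hawks vs Boston Celtics", NBA_TEAMS))  # True
print(matches_league("Arsenal vs Chelsea", NBA_TEAMS))               # False
```

With city-only aliases in the list, an unrelated event mentioning "Washington" or "New York" would also match, which is presumably why they were dropped.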


@@ -129,14 +129,12 @@ class Network:
     async def event_context(
         browser: Browser,
         stealth: bool = True,
-        ignore_https: bool = False,
     ) -> AsyncGenerator[BrowserContext, None]:
         context: BrowserContext | None = None
         try:
             context = await browser.new_context(
                 user_agent=Network.UA if stealth else None,
-                ignore_https_errors=ignore_https,
                 viewport={"width": 1366, "height": 768},
                 device_scale_factor=1,
                 locale="en-US",


@@ -9,24 +9,20 @@ STATUSLOG=$(mktemp)
 get_status() {
     local url="$1"
     local channel="$2"
-    local index="$3"
-    local total="$4"
     local attempt response status_code
     [[ "$url" != http* ]] && return
-    printf '[%d/%d] Checking %s\n' "$((index + 1))" "$total" "$url"
     for attempt in $(seq 1 "$RETRY_COUNT"); do
         response=$(
             curl -skL \
                 -A "$UA" \
                 -H "Accept: */*" \
                 -H "Accept-Language: en-US,en;q=0.9" \
-                -H "Accept-Encoding: gzip, deflate, br" \
                 -H "Connection: keep-alive" \
                 -o /dev/null \
-                --compressed \
-                --max-time 30 \
+                --max-time 15 \
                 -w "%{http_code}" \
                 "$url" 2>&1
         )
@@ -51,7 +47,7 @@ get_status() {
     status_code="$response"
     case "$status_code" in
-        2* | 3*)
+        200)
             echo "PASS" >>"$STATUSLOG"
             ;;
@@ -75,7 +71,6 @@ get_status() {
 check_links() {
     echo "Checking links from: $base_file"
-    total_urls=$(grep -cE '^https?://' "$base_file")
     channel_num=0
     name=""
@@ -91,14 +86,14 @@ check_links() {
         elif [[ "$line" =~ ^https?:// ]]; then
             while (($(jobs -r | wc -l) >= MAX_JOBS)); do sleep 0.2; done
-            get_status "$line" "$name" "$channel_num" "$total_urls" &
+            get_status "$line" "$name" &
             ((channel_num++))
         fi
     done < <(cat "$base_file")
     wait
-    echo -e "\nDone."
+    echo "Done."
 }
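The `case` change above tightens the pass condition from any 2xx/3xx status to exactly 200, so a final redirect status now counts as dead rather than working. The decision itself is trivial; a sketch of it in Python:

```python
def classify(status_code: int) -> str:
    # Matches the updated case branch: only an exact 200 is a pass;
    # 3xx responses that previously passed under "2* | 3*" now fail.
    return "PASS" if status_code == 200 else "FAIL"

print(classify(200))  # PASS
print(classify(302))  # FAIL
print(classify(404))  # FAIL
```

Since the curl call already follows redirects with `-L`, a final 3xx generally means the redirect chain never reached a playable endpoint.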
write_readme() { write_readme() {


@@ -1,18 +1,17 @@
-## Base Log @ 2026-01-28 03:55 UTC
-### ✅ Working Streams: 136<br>❌ Dead Streams: 10
+## Base Log @ 2026-01-27 03:57 UTC
+### ✅ Working Streams: 136<br>❌ Dead Streams: 9
 | Channel | Error (Code) | Link |
 | ------- | ------------ | ---- |
-| FDSN Ohio | HTTP Error (502) | `http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222126` |
-| FDSN Oklahoma | HTTP Error (403) | `http://mytvstream.net:8080/live/A1Jay5/362586/20934.m3u8` |
-| FDSN Southeast | HTTP Error (502) | `http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222130` |
-| FDSN Southwest | HTTP Error (502) | `http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/220452` |
-| FDSN Sun | HTTP Error (502) | `http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222132` |
+| BBC World News | HTTP Error (404) | `http://fl1.moveonjoy.com/BBC_WORLD_NEWS/index.m3u8` |
+| FDSN Southeast | HTTP Error (403) | `http://mytvstream.net:8080/live/A1Jay5/362586/2213.m3u8` |
+| FDSN Southwest | HTTP Error (403) | `http://mytvstream.net:8080/live/A1Jay5/362586/21843.m3u8` |
 | FXX | HTTP Error (404) | `https://fl1.moveonjoy.com/FXX/index.m3u8` |
-| NBC Sports Philadelphia | HTTP Error (403) | `http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/136477` |
-| NFL RedZone | HTTP Error (502) | `http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/249239` |
-| Premier Sports 2 | HTTP Error (502) | `http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/117038` |
+| Lifetime Movie Network | HTTP Error (404) | `https://fl1.moveonjoy.com/LIFETIME_MOVIE_NETWORK/index.m3u8` |
+| NBC Sports Bay Area | Unknown status (302) | `http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/257216` |
+| Paramount Network | HTTP Error (404) | `https://fl1.moveonjoy.com/PARAMOUNT_NETWORK/index.m3u8` |
+| Premier Sports 2 | Unknown status (302) | `http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/117038` |
 | Sportsnet One | HTTP Error (403) | `http://mytvstream.net:8080/live/k4Svp2/645504/57297.m3u8` |
 ---
 #### Base Channels URL