Compare commits


No commits in common. "19731cc7c47f515c7b8c20e2b387b726b3a39d98" and "90ec2f586d7683e22cdb1939805d130b404d3f86" have entirely different histories.

12 changed files with 100074 additions and 106601 deletions

File diff suppressed because it is too large

M3U8/TV.xml (202277 lines)

File diff suppressed because one or more lines are too long


@@ -28,12 +28,10 @@ http://fl1.moveonjoy.com/Aspire/index.m3u8
http://fl1.moveonjoy.com/BBC_AMERICA/index.m3u8
#EXTINF:-1 tvg-chno="10" tvg-id="BBC.News.(North.America).HD.us2" tvg-name="BBC World News" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s89542_dark_360w_270h.png" group-title="TV",BBC World News
#EXTVLCOPT:http-user-agent=curl/8.5.0
http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/1723
http://fl1.moveonjoy.com/BBC_WORLD_NEWS/index.m3u8
#EXTINF:-1 tvg-chno="11" tvg-id="BET.HD.us2" tvg-name="BET" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s10051_dark_360w_270h.png" group-title="TV",BET
#EXTVLCOPT:http-user-agent=curl/8.5.0
http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/2071
https://streamer1.nexgen.bz/BET/index.m3u8
#EXTINF:-1 tvg-chno="12" tvg-id="Big.Ten.Network.HD.us2" tvg-name="Big Ten Network" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s56783_dark_360w_270h.png" group-title="TV",Big Ten Network
http://23.237.104.106:8080/USA_BTN/index.m3u8
@@ -150,8 +148,7 @@ http://mytvstream.net:8080/live/A1Jay5/362586/66795.m3u8
http://mytvstream.net:8080/live/A1Jay5/362586/58827.m3u8
#EXTINF:-1 tvg-chno="50" tvg-id="FanDuel.Sports.Network.Ohio.(Cleveland).HDTV.us" tvg-name="FDSN Ohio" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s49691_dark_360w_270h.png" group-title="TV",FDSN Ohio
#EXTVLCOPT:http-user-agent=curl/8.5.0
http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222126
http://mytvstream.net:8080/live/A1Jay5/362586/17752.m3u8
#EXTINF:-1 tvg-chno="51" tvg-id="FanDuel.Sports.Network.Oklahoma.24/7.HDTV.(Tulsa).us" tvg-name="FDSN Oklahoma" tvg-logo="https://i.gyazo.com/80ad6fd142cd67f06eef58d9ce5aa72b.png" group-title="TV",FDSN Oklahoma
http://mytvstream.net:8080/live/A1Jay5/362586/20934.m3u8
@@ -160,16 +157,13 @@ http://mytvstream.net:8080/live/A1Jay5/362586/20934.m3u8
http://mytvstream.net:8080/live/A1Jay5/362586/221151.m3u8
#EXTINF:-1 tvg-chno="53" tvg-id="FanDuel.Sports.Network.Southeast.HDTV.(Mont./Birm./Dothan/Mobile.AL).us" tvg-name="FDSN Southeast" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s20789_dark_360w_270h.png" group-title="TV",FDSN Southeast
#EXTVLCOPT:http-user-agent=curl/8.5.0
http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222130
http://mytvstream.net:8080/live/A1Jay5/362586/2213.m3u8
#EXTINF:-1 tvg-chno="54" tvg-id="FanDuel.Sports.Network.Southwest.HDTV.24/7.(Main).us" tvg-name="FDSN Southwest" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s59629_dark_360w_270h.png" group-title="TV",FDSN Southwest
#EXTVLCOPT:http-user-agent=curl/8.5.0
http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/220452
http://mytvstream.net:8080/live/A1Jay5/362586/21843.m3u8
#EXTINF:-1 tvg-chno="55" tvg-id="FanDuel.Sports.Network.Sun.South.24/7.HDTV.(South.Marlins,.Rays,.Heat).us" tvg-name="FDSN Sun" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s61084_dark_360w_270h.png" group-title="TV",FDSN Sun
#EXTVLCOPT:http-user-agent=curl/8.5.0
http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222132
http://mytvstream.net:8080/live/A1Jay5/362586/104917.m3u8
#EXTINF:-1 tvg-chno="56" tvg-id="FanDuel.Sports.Network.West.HDTV.us" tvg-name="FDSN West" tvg-logo="https://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s59627_dark_360w_270h.png" group-title="TV",FDSN West
http://mytvstream.net:8080/live/A1Jay5/362586/20932.m3u8
@@ -217,8 +211,7 @@ https://fl1.moveonjoy.com/FXX/index.m3u8
http://fl1.moveonjoy.com/FYI/index.m3u8
#EXTINF:-1 tvg-chno="71" tvg-id="Game.Show.Network.HD.us2" tvg-name="Game Show Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s14909_dark_360w_270h.png" group-title="TV",Game Show Network
#EXTVLCOPT:http-user-agent=curl/8.5.0
http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/1948
https://streamer1.nexgen.bz/GSN/index.m3u8
#EXTINF:-1 tvg-chno="72" tvg-id="get.us2" tvg-name="getTV" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s82563_dark_360w_270h.png" group-title="TV",getTV
http://fl1.moveonjoy.com/GET_TV/index.m3u8
@@ -242,7 +235,7 @@ https://fl1.moveonjoy.com/HALLMARK_MOVIES_MYSTERIES/index.m3u8
http://fl1.moveonjoy.com/HBO/index.m3u8
#EXTINF:-1 tvg-chno="79" tvg-id="HBO2.HD.us2" tvg-name="HBO 2" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s68140_dark_360w_270h.png" group-title="TV",HBO 2
http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/2071
http://fl1.moveonjoy.com/HBO_2/index.m3u8
#EXTINF:-1 tvg-chno="80" tvg-id="HBO.Comedy.HD.us2" tvg-name="HBO Comedy" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s59839_dark_360w_270h.png" group-title="TV",HBO Comedy
http://fl1.moveonjoy.com/HBO_COMEDY/index.m3u8
@@ -443,4 +436,4 @@ http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/129973
https://fl1.moveonjoy.com/VICELAND/index.m3u8
#EXTINF:-1 tvg-chno="146" tvg-id="Yes.Network.us2" tvg-name="YES Network" tvg-logo="http://schedulesdirect-api20141201-logos.s3.dualstack.us-east-1.amazonaws.com/stationLogos/s30017_dark_360w_270h.png" group-title="TV",YES Network
https://fl1.moveonjoy.com/YES_NETWORK/index.m3u8
https://fl1.moveonjoy.com/YES_NETWORK/index.m3u8
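Each playlist entry in the diff above pairs an `#EXTINF` metadata line (`tvg-id`, `tvg-logo`, `group-title`, …) with one or more stream URLs. As a rough illustration, those attributes can be pulled out with a regex much like the repo's `epg.py` does; the sample line is abridged from the BET entry above (logo URL shortened), and `parse_extinf` is a hypothetical helper, not part of the repo:

```python
import re

# Sample #EXTINF line, abridged from the BET entry in the diff above.
line = (
    '#EXTINF:-1 tvg-chno="11" tvg-id="BET.HD.us2" tvg-name="BET" '
    'tvg-logo="https://example.com/bet.png" group-title="TV",BET'
)

def parse_extinf(line: str) -> dict[str, str]:
    """Pull key="value" attributes out of an #EXTINF line."""
    attrs = dict(re.findall(r'([\w-]+)="([^"]*)"', line))
    # The display name follows the last comma (naive; fine for these entries).
    attrs["name"] = line.rsplit(",", 1)[-1]
    return attrs

info = parse_extinf(line)
print(info["tvg-id"], info["name"])
```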


@@ -13,9 +13,7 @@ BASE_M3U8 = Path(__file__).parent / "base.m3u8"
EPG_FILE = Path(__file__).parent / "TV.xml"
LIVE_IMG = "https://i.gyazo.com/978f2eb4a199ca5b56b447aded0cb9e3.png"
EPG_URLS = {
EPG_URLS = [
"https://epgshare01.online/epgshare01/epg_ripper_CA2.xml.gz",
"https://epgshare01.online/epgshare01/epg_ripper_DUMMY_CHANNELS.xml.gz",
"https://epgshare01.online/epgshare01/epg_ripper_FANDUEL1.xml.gz",
@@ -24,7 +22,9 @@ EPG_URLS = {
"https://epgshare01.online/epgshare01/epg_ripper_US2.xml.gz",
"https://epgshare01.online/epgshare01/epg_ripper_US_LOCALS1.xml.gz",
"https://i.mjh.nz/Roku/all.xml.gz",
}
]
LIVE_IMG = "https://i.gyazo.com/978f2eb4a199ca5b56b447aded0cb9e3.png"
DUMMIES = {
"Basketball.Dummy.us": LIVE_IMG,
@@ -48,47 +48,39 @@ REPLACE_IDs = {
def get_tvg_ids() -> dict[str, str]:
tvg: dict[str, str] = {}
base_m3u8 = BASE_M3U8.read_text(encoding="utf-8").splitlines()
for line in BASE_M3U8.read_text(encoding="utf-8").splitlines():
if not line.startswith("#EXTINF"):
continue
tvg = {}
tvg_id = re.search(r'tvg-id="([^"]*)"', line)
tvg_logo = re.search(r'tvg-logo="([^"]*)"', line)
for line in base_m3u8:
if line.startswith("#EXTINF"):
tvg_id = re.search(r'tvg-id="([^"]*)"', line)[1]
tvg_logo = re.search(r'tvg-logo="([^"]*)"', line)[1]
if tvg_id:
tvg[tvg_id[1]] = tvg_logo[1] if tvg_logo else None
tvg |= DUMMIES
tvg |= {v["old"]: LIVE_IMG for v in REPLACE_IDs.values()}
tvg[tvg_id] = tvg_logo
return tvg
async def fetch_xml(url: str) -> ET.Element | None:
if not (xml_data := await network.request(url, log=log)):
if not (html_data := await network.request(url, log=log)):
return
try:
log.info(f'Parsing XML from "{url}"')
decompressed_data = gzip.decompress(html_data.content)
data = gzip.decompress(xml_data.content)
return ET.fromstring(data)
return ET.fromstring(decompressed_data)
except Exception as e:
log.error(f'Failed to parse from "{url}": {e}')
log.error(f'Failed to decompress and parse XML from "{url}": {e}')
return
def hijack_id(
root: ET.Element,
*,
old: str,
new: str,
text: str,
root: ET.Element,
) -> None:
og_channel = root.find(f"./channel[@id='{old}']")
@@ -98,7 +90,7 @@ def hijack_id(
display_name = og_channel.find("display-name")
if (display_name := og_channel.find("display-name")) is not None:
if display_name is not None:
new_channel.append(ET.Element("display-name", display_name.attrib))
new_channel[-1].text = text
@@ -122,7 +114,9 @@ def hijack_id(
new_program.append(new_child)
for tag_name in ["title", "desc", "sub-title"]:
if (tag := new_program.find(tag_name)) is not None:
tag = new_program.find(tag_name)
if tag is not None:
tag.text = text
root.remove(program)
@@ -135,60 +129,48 @@ async def main() -> None:
tvg_ids = get_tvg_ids()
parsed_tvg_ids: set[str] = set()
tvg_ids |= DUMMIES | {v["old"]: LIVE_IMG for v in REPLACE_IDs.values()}
root = ET.Element("tv")
epgs = await asyncio.gather(*(fetch_xml(url) for url in EPG_URLS))
tasks = [fetch_xml(url) for url in EPG_URLS]
results = await asyncio.gather(*tasks)
for epg_data in results:
if epg_data is None:
continue
for epg_data in (epg for epg in epgs if epg is not None):
for channel in epg_data.findall("channel"):
if (channel_id := channel.get("id")) not in tvg_ids:
continue
if (channel_id := channel.get("id")) in tvg_ids:
for icon_tag in channel.findall("icon"):
if logo := tvg_ids.get(channel_id):
icon_tag.set("src", logo)
parsed_tvg_ids.add(channel_id)
if (url_tag := channel.find("url")) is not None:
channel.remove(url_tag)
for icon_tag in channel.findall("icon"):
if logo := tvg_ids.get(channel_id):
icon_tag.set("src", logo)
if (url_tag := channel.find("url")) is not None:
channel.remove(url_tag)
root.append(channel)
root.append(channel)
for program in epg_data.findall("programme"):
if program.get("channel") not in tvg_ids:
continue
if program.get("channel") in tvg_ids:
title_text = program.find("title").text
subtitle = program.find("sub-title")
title_text = program.find("title").text
if (
title_text in ["NHL Hockey", "Live: NFL Football"]
and subtitle is not None
):
program.find("title").text = f"{title_text} {subtitle.text}"
subtitle = program.find("sub-title")
root.append(program)
if (
title_text in ["NHL Hockey", "Live: NFL Football"]
and subtitle is not None
):
program.find("title").text = f"{title_text} {subtitle.text}"
root.append(program)
for title, ids in REPLACE_IDs.items():
hijack_id(root, **ids, text=title)
if missing_ids := set(tvg_ids) - parsed_tvg_ids:
log.warning(f"Missed {len(missing_ids)} TVG ID(s)")
for channel_id in missing_ids:
log.warning(f"Missing: {channel_id}")
for k, v in REPLACE_IDs.items():
hijack_id(**v, text=k, root=root)
tree = ET.ElementTree(root)
tree.write(
EPG_FILE,
encoding="utf-8",
xml_declaration=True,
)
tree.write(EPG_FILE, encoding="utf-8", xml_declaration=True)
log.info(f"EPG saved to {EPG_FILE.resolve()}")

File diff suppressed because it is too large


@@ -21,7 +21,6 @@ from scrapers import (
streamfree,
streamhub,
streamsgate,
totalsportek,
tvpass,
watchfooty,
webcast,
@@ -71,7 +70,6 @@ async def main() -> None:
asyncio.create_task(streamcenter.scrape(xtrnl_brwsr)),
# asyncio.create_task(streamhub.scrape(xtrnl_brwsr)),
asyncio.create_task(streamsgate.scrape(xtrnl_brwsr)),
asyncio.create_task(totalsportek.scrape(hdl_brwsr)),
asyncio.create_task(webcast.scrape(hdl_brwsr)),
asyncio.create_task(watchfooty.scrape(xtrnl_brwsr)),
]
@@ -82,10 +80,10 @@ async def main() -> None:
asyncio.create_task(pawa.scrape()),
asyncio.create_task(roxie.scrape()),
asyncio.create_task(shark.scrape()),
# asyncio.create_task(streambtw.scrape()),
asyncio.create_task(streambtw.scrape()),
asyncio.create_task(streamfree.scrape()),
asyncio.create_task(tvpass.scrape()),
# asyncio.create_task(xstreameast.scrape()),
asyncio.create_task(xstreameast.scrape()),
]
await asyncio.gather(*(pw_tasks + httpx_tasks))
@@ -114,7 +112,6 @@ async def main() -> None:
| streamfree.urls
| streamhub.urls
| streamsgate.urls
| totalsportek.urls
| tvpass.urls
| watchfooty.urls
| webcast.urls


@@ -10,12 +10,12 @@ log = get_logger(__name__)
urls: dict[str, dict[str, str | float]] = {}
TAG = "LIVETVSX"
CACHE_FILE = Cache(TAG, exp=10_800)
TAG = "LTVSX"
XML_CACHE = Cache(f"{TAG}-xml", exp=28_000)
CACHE_FILE = Cache(TAG, exp=10_800)
BASE_URL = "https://cdn.livetv861.me/rss/upcoming_en.xml"
VALID_SPORTS = {"NBA", "NHL", "NFL", "NCAA", "MLB"}
@@ -160,8 +160,8 @@ async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
live = []
start_ts = now.delta(hours=-1).timestamp()
end_ts = now.delta(minutes=5).timestamp()
start_ts = now.delta(minutes=-30).timestamp()
end_ts = now.delta(minutes=30).timestamp()
for k, v in events.items():
if k in cached_keys:
@@ -193,7 +193,7 @@ async def scrape(browser: Browser) -> None:
log.info(f"Processing {len(events)} new URL(s)")
if events:
async with network.event_context(browser, ignore_https=True) as context:
async with network.event_context(browser) as context:
for i, ev in enumerate(events, start=1):
async with network.event_page(context) as page:
handler = partial(
@@ -210,11 +210,10 @@ async def scrape(browser: Browser) -> None:
log=log,
)
sport, event, ts, link = (
sport, event, ts = (
ev["sport"],
ev["event"],
ev["timestamp"],
ev["link"],
)
key = f"[{sport}] {event} ({TAG})"
@@ -227,7 +226,6 @@ async def scrape(browser: Browser) -> None:
"base": "https://livetv.sx/enx/",
"timestamp": ts,
"id": tvg_id or "Live.Event.us",
"link": link,
}
cached_urls[key] = entry
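The livetv.sx change above tightens the "live" window from (now − 1 h, now + 5 min) to ±30 minutes around now. A small sketch of that filter with plain `datetime` (`Time` is a project helper, so `datetime` stands in here; the inclusive comparison is an assumption, since the hunk does not show it):

```python
from datetime import datetime, timedelta, timezone

def in_live_window(event_ts: float, now: datetime) -> bool:
    """True if the event starts within +/-30 minutes of now (new window)."""
    start_ts = (now - timedelta(minutes=30)).timestamp()
    end_ts = (now + timedelta(minutes=30)).timestamp()
    return start_ts <= event_ts <= end_ts

now = datetime.now(timezone.utc)
print(in_live_window((now - timedelta(minutes=10)).timestamp(), now))  # recently started
print(in_live_window((now - timedelta(hours=2)).timestamp(), now))     # long over
```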


@@ -1,146 +0,0 @@
from functools import partial
from urllib.parse import urljoin, urlparse
from playwright.async_api import Browser
from selectolax.parser import HTMLParser
from .utils import Cache, Time, get_logger, leagues, network
log = get_logger(__name__)
urls: dict[str, dict[str, str | float]] = {}
TAG = "TOTALSPRTK"
CACHE_FILE = Cache(TAG, exp=28_800)
BASE_URL = "https://live3.totalsportek777.com/"
def fix_txt(s: str) -> str:
s = " ".join(s.split())
return s.upper() if s.islower() else s
async def get_events(cached_keys: list[str]) -> list[dict[str, str]]:
events = []
if not (html_data := await network.request(BASE_URL, log=log)):
return events
soup = HTMLParser(html_data.content)
sport = "Live Event"
for node in soup.css("a"):
if not node.attributes.get("class"):
continue
if (parent := node.parent) and "my-1" in parent.attributes.get("class", ""):
if span := node.css_first("span"):
sport = span.text(strip=True)
sport = fix_txt(sport)
if not (teams := [t.text(strip=True) for t in node.css(".col-7 .col-12")]):
continue
if not (href := node.attributes.get("href")):
continue
href = urlparse(href).path if href.startswith("http") else href
if not (time_node := node.css_first(".col-3 span")):
continue
if time_node.text(strip=True) != "MatchStarted":
continue
event_name = fix_txt(" vs ".join(teams))
if f"[{sport}] {event_name} ({TAG})" in cached_keys:
continue
events.append(
{
"sport": sport,
"event": event_name,
"link": urljoin(BASE_URL, href),
}
)
return events
async def scrape(browser: Browser) -> None:
cached_urls = CACHE_FILE.load()
valid_urls = {k: v for k, v in cached_urls.items() if v["url"]}
valid_count = cached_count = len(valid_urls)
urls.update(valid_urls)
log.info(f"Loaded {cached_count} event(s) from cache")
log.info(f'Scraping from "{BASE_URL}"')
events = await get_events(cached_urls.keys())
log.info(f"Processing {len(events)} new URL(s)")
if events:
now = Time.clean(Time.now())
async with network.event_context(browser) as context:
for i, ev in enumerate(events, start=1):
async with network.event_page(context) as page:
handler = partial(
network.process_event,
url=ev["link"],
url_num=i,
page=page,
log=log,
)
url = await network.safe_process(
handler,
url_num=i,
semaphore=network.HTTP_S,
log=log,
)
sport, event, link = (
ev["sport"],
ev["event"],
ev["link"],
)
key = f"[{sport}] {event} ({TAG})"
tvg_id, logo = leagues.get_tvg_info(sport, event)
entry = {
"url": url,
"logo": logo,
"base": link,
"timestamp": now.timestamp(),
"id": tvg_id or "Live.Event.us",
"link": link,
}
cached_urls[key] = entry
if url:
valid_count += 1
urls[key] = entry
if new_count := valid_count - cached_count:
log.info(f"Collected and cached {new_count} new event(s)")
else:
log.info("No new events found")
CACHE_FILE.write(cached_urls)


@@ -811,94 +811,65 @@
"teams": {
"NBA": [
"76ers",
"Atlanta",
"Atlanta Hawks",
"Blazers",
"Boston",
"Boston Celtics",
"Brooklyn Nets",
"Bucks",
"Bulls",
"Cavaliers",
"Celtics",
"Charlotte",
"Charlotte Hornets",
"Chicago",
"Chicago Bulls",
"Cleveland",
"Cleveland Cavaliers",
"Clippers",
"Dallas",
"Dallas Mavericks",
"Denver",
"Denver Nuggets",
"Detroit",
"Detroit Pistons",
"Golden State",
"Golden State Warriors",
"Grizzlies",
"Hawks",
"Heat",
"Hornets",
"Houston",
"Houston Rockets",
"Indiana",
"Indiana Pacers",
"Jazz",
"Kings",
"Knicks",
"Lakers",
"Los Angeles",
"Los Angeles Clippers",
"Los Angeles Lakers",
"Magic",
"Mavericks",
"Memphis",
"Memphis Grizzlies",
"Miami",
"Miami Heat",
"Milwaukee",
"Milwaukee Bucks",
"Minnesota",
"Minnesota Timberwolves",
"Nets",
"New Orleans",
"New Orleans Pelicans",
"New York",
"New York Knicks",
"Nuggets",
"Oklahoma",
"Oklahoma City",
"Oklahoma City Thunder",
"Orlando",
"Orlando Magic",
"Pacers",
"Pelicans",
"Philadelphia",
"Philadelphia 76ers",
"Phoenix",
"Phoenix Suns",
"Pistons",
"Portland",
"Portland Trail Blazers",
"Raptors",
"Rockets",
"Sacramento",
"Sacramento Kings",
"San Antonio",
"San Antonio Spurs",
"Sixers",
"Spurs",
"Suns",
"Thunder",
"Timberwolves",
"Toronto",
"Toronto Raptors",
"Trail Blazers",
"Utah",
"Utah Jazz",
"Warriors",
"Washington",
"Washington Wizards",
"Wizards",
"Wolves"


@@ -129,14 +129,12 @@ class Network:
async def event_context(
browser: Browser,
stealth: bool = True,
ignore_https: bool = False,
) -> AsyncGenerator[BrowserContext, None]:
context: BrowserContext | None = None
try:
context = await browser.new_context(
user_agent=Network.UA if stealth else None,
ignore_https_errors=ignore_https,
viewport={"width": 1366, "height": 768},
device_scale_factor=1,
locale="en-US",


@@ -9,24 +9,20 @@ STATUSLOG=$(mktemp)
get_status() {
local url="$1"
local channel="$2"
local index="$3"
local total="$4"
local attempt response status_code
[[ "$url" != http* ]] && return
printf '[%d/%d] Checking %s\n' "$((index + 1))" "$total" "$url"
for attempt in $(seq 1 "$RETRY_COUNT"); do
response=$(
curl -skL \
-A "$UA" \
-H "Accept: */*" \
-H "Accept-Language: en-US,en;q=0.9" \
-H "Accept-Encoding: gzip, deflate, br" \
-H "Connection: keep-alive" \
-o /dev/null \
--compressed \
--max-time 30 \
--max-time 15 \
-w "%{http_code}" \
"$url" 2>&1
)
@@ -51,7 +47,7 @@ get_status() {
status_code="$response"
case "$status_code" in
2* | 3*)
200)
echo "PASS" >>"$STATUSLOG"
;;
@@ -75,7 +71,6 @@ get_status() {
check_links() {
echo "Checking links from: $base_file"
total_urls=$(grep -cE '^https?://' "$base_file")
channel_num=0
name=""
@@ -91,14 +86,14 @@ check_links() {
elif [[ "$line" =~ ^https?:// ]]; then
while (($(jobs -r | wc -l) >= MAX_JOBS)); do sleep 0.2; done
get_status "$line" "$name" "$channel_num" "$total_urls" &
get_status "$line" "$name" &
((channel_num++))
fi
done < <(cat "$base_file")
wait
echo -e "\nDone."
echo "Done."
}
write_readme() {
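The checker's case statement above was tightened so that only HTTP 200 counts as a pass; other 2xx/3xx codes, such as the 302 redirects flagged in the base log, now fall through. A tiny Python sketch of the new rule (`classify_status` is a hypothetical stand-in for the shell `case`; the real script records the specific error and code rather than a bare "FAIL"):

```python
def classify_status(status_code: int) -> str:
    """Mirror the tightened check: only HTTP 200 passes now;
    other 2xx/3xx responses (e.g. a 302 redirect) no longer do."""
    return "PASS" if status_code == 200 else "FAIL"

for code in (200, 204, 302, 404, 502):
    print(code, classify_status(code))
```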


@@ -1,18 +1,17 @@
## Base Log @ 2026-01-28 03:55 UTC
## Base Log @ 2026-01-27 03:57 UTC
### ✅ Working Streams: 136<br>❌ Dead Streams: 10
### ✅ Working Streams: 136<br>❌ Dead Streams: 9
| Channel | Error (Code) | Link |
| ------- | ------------ | ---- |
| FDSN Ohio | HTTP Error (502) | `http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222126` |
| FDSN Oklahoma | HTTP Error (403) | `http://mytvstream.net:8080/live/A1Jay5/362586/20934.m3u8` |
| FDSN Southeast | HTTP Error (502) | `http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222130` |
| FDSN Southwest | HTTP Error (502) | `http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/220452` |
| FDSN Sun | HTTP Error (502) | `http://lucidhosting.xyz:82/sandriassoc@gmail.com/Sm8G4ddxoW/222132` |
| BBC World News | HTTP Error (404) | `http://fl1.moveonjoy.com/BBC_WORLD_NEWS/index.m3u8` |
| FDSN Southeast | HTTP Error (403) | `http://mytvstream.net:8080/live/A1Jay5/362586/2213.m3u8` |
| FDSN Southwest | HTTP Error (403) | `http://mytvstream.net:8080/live/A1Jay5/362586/21843.m3u8` |
| FXX | HTTP Error (404) | `https://fl1.moveonjoy.com/FXX/index.m3u8` |
| NBC Sports Philadelphia | HTTP Error (403) | `http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/136477` |
| NFL RedZone | HTTP Error (502) | `http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/249239` |
| Premier Sports 2 | HTTP Error (502) | `http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/117038` |
| Lifetime Movie Network | HTTP Error (404) | `https://fl1.moveonjoy.com/LIFETIME_MOVIE_NETWORK/index.m3u8` |
| NBC Sports Bay Area | Unknown status (302) | `http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/257216` |
| Paramount Network | HTTP Error (404) | `https://fl1.moveonjoy.com/PARAMOUNT_NETWORK/index.m3u8` |
| Premier Sports 2 | Unknown status (302) | `http://hardcoremedia.xyz:80/NW3Vk7xXwW/8375773282/117038` |
| Sportsnet One | HTTP Error (403) | `http://mytvstream.net:8080/live/k4Svp2/645504/57297.m3u8` |
---
#### Base Channels URL