Replies: 2 comments
-
Hi there, this seems like a very interesting concept and would absolutely be a great improvement to the software. However, it's far too big a task right now: a few months back, when I was working side-by-side with Taux1c, it became clear that the scraper needs a massive rewrite before it can be improved without running into new problems every time another one is solved. So it just isn't something that's on the to-do list.
-
This would have been nice to see before I quit working on this repo; still, it would be fairly easy to build. Instead of fetching subs, you could cut the process off at that for loop and write another function that opens, say, users.list, strips each line as it reads the file, and collects the usernames into a list. Then pass that list in place of the one that would have been pulled by the fetch-subs step, and add a menu option to call the new function. All together, maybe two hours to add that in.
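The helper described above might look something like this. This is only a rough sketch of the idea, not code from the repo: the function name `load_users`, the `users.list` filename, and the way the resulting list gets handed to the scraper are all hypothetical stand-ins for whatever the real code uses.

```python
def load_users(path="users.list"):
    # Read one username per line, stripping whitespace and
    # skipping blank lines, and return them as a list. This list
    # would then be passed wherever the fetched-subs list normally goes.
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]
```

A menu option would then just call `load_users()` and feed the result into the existing download loop instead of the list produced by the fetch-subs step.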
-
Hello. I've been trying to teach myself this for a few hours but can't figure it out, so here I am asking how to do it and what kind of challenge it might present me.
I don't want to download all the content from all my subs, but I have far too many to selectively pluck them out without it taking forever. So I didn't know if...
Sorry if this isn't allowed here; I think it fits? I'm still new to all of this, so forgive me if not, and thanks again for providing such an amazing resource <3. (I'm not asking for a handout per se; I'm happy to be nudged in the right direction for the most part.)
--If this isn't really a feasible thing to do, could someone give me a brief explanation of why, so I can better understand the functioning/limitations?
Thanks for any insight anyone's able to provide!! Happy hoarding!