
/hydrus/ - Hydrus Network

Bug reports, feature requests, and other discussion for the hydrus network.



New user? Start here ---> http://hydrusnetwork.github.io/hydrus/

Current to-do list has: 1015 items

Current big job: downloader engine overhaul


[YouTube embed]

f6e26d No.6398

windows

zip: https://github.com/hydrusnetwork/hydrus/releases/download/v266/Hydrus.Network.266.-.Windows.-.Extract.only.zip

exe: https://github.com/hydrusnetwork/hydrus/releases/download/v266/Hydrus.Network.266.-.Windows.-.Installer.exe

os x

app: https://github.com/hydrusnetwork/hydrus/releases/download/v266/Hydrus.Network.266.-.OS.X.-.App.dmg

tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v266/Hydrus.Network.266.-.OS.X.-.Extract.only.tar.gz

linux

tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v266/Hydrus.Network.266.-.Linux.-.Executable.tar.gz

source

tar.gz: https://github.com/hydrusnetwork/hydrus/archive/v266.tar.gz

I had a great week with a lot of hydrus work done. All the 'gallery' downloaders and subscriptions now use the new networking engine, and the program is more stable.

gallery downloaders and other improvements

All the gallery downloaders, like boorus and pixiv and so on, now use the new networking engine. They display the new 'network job control' on their download pages and obey the new bandwidth rules.

Subscriptions do not yet have any network ui (so you will not see the files actually downloading with a KB/s display), but they seem to work well. By default, they will now download up to 256MB of files a day and then stop, so you should see them spreading their work out for bigger jobs.
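
If you are curious how the daily cap works in rough terms, it is basically a per-subscription usage tracker plus a simple budget check. The following is just an illustrative sketch with made-up names, not the client's actual bandwidth classes:

import time

class DailyBandwidthTracker(object): # hypothetical name, not a hydrus class

    def __init__(self):
        self._usage = [] # list of (timestamp, num_bytes) records

    def ReportDataUsed(self, num_bytes):
        self._usage.append((time.time(), num_bytes))

    def GetUsage(self, window_seconds):
        cutoff = time.time() - window_seconds
        return sum(b for (t, b) in self._usage if t >= cutoff)

DAILY_BUDGET = 256 * 1024 * 1024 # the assumed 256MB/day default

def subscription_can_continue(tracker):
    # stop for today once the budget is spent; work resumes tomorrow
    return tracker.GetUsage(86400) < DAILY_BUDGET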

Also, I am happy to say that the Hentai Foundry downloader works again! As expected, the new engine's better handling of cookies and generally more polite, browser-like behaviour was what was needed. If you have a bunch of HF subs paused, please do slowly unpause them in batches over the next week or so and let me know how it goes.
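
For anyone wondering what 'browser-like behaviour' means in practice, it is roughly the same idea as a persistent requests.Session with ordinary browser headers, where the cookies picked up at login are sent back on every later fetch. This is an illustration only, with placeholder URLs and form fields, not the actual HF login code:

import requests

session = requests.Session()
session.headers.update({
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)', # look like a browser
})

# the first requests pick up the site's session cookies, which the Session
# then sends back automatically on every later request
response = session.get('https://www.example.com/login')
response = session.post('https://www.example.com/login',
                        data={'username': 'anon', 'password': 'hunter2'})

# subsequent gallery/page fetches reuse the same cookie jar
page = session.get('https://www.example.com/gallery?page=1')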

And the 'file import status' controls–the part where it says '23 successful, 3 already deleted…', the overall import progress gauge, and the button to launch the detailed status window–are now all wrapped into the same tight panel, just to keep it neat and laid out together. This is now deployed in several locations, along with some other misc ui improvements.

And I have separated the gallery 'file' and 'page' download loops into two separate threads so that they run simultaneously. This is a neat workflow/QoL improvement I have been planning for a while, and since I was diving into this stuff this week, I made the change. I am really pleased with it.
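
The split is the classic producer/consumer pattern: one thread walks the gallery pages and queues up file URLs, while a second thread downloads from that queue at the same time. A minimal sketch with placeholder fetch/import functions, not the client's real downloader code:

import threading

try:
    import queue
except ImportError:
    import Queue as queue # Python 2

file_queue = queue.Queue()

def fetch_gallery_pages(gallery_url):
    # placeholder: yield one list of file urls per gallery page
    yield ['https://example.com/file/1', 'https://example.com/file/2']

def download_and_import(file_url):
    print('downloading %s' % file_url) # placeholder

def gallery_page_loop(gallery_url):
    # producer: walk the gallery and queue file urls as each page is parsed
    for page_of_urls in fetch_gallery_pages(gallery_url):
        for file_url in page_of_urls:
            file_queue.put(file_url)
    file_queue.put(None) # sentinel: no more pages coming

def file_download_loop():
    # consumer: download queued files while the producer is still walking pages
    while True:
        file_url = file_queue.get()
        if file_url is None:
            break
        download_and_import(file_url)

page_thread = threading.Thread(target=gallery_page_loop, args=('https://example.com/gallery',))
file_thread = threading.Thread(target=file_download_loop)
page_thread.start()
file_thread.start()
page_thread.join()
file_thread.join()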

I also fixed an issue with gelbooru parsing–they changed their url format again.

I made a lot of changes to the downloaders this week. There are likely some bugs, so please report any weirdness or other errors you come across.

modal popups and db migration

You won't see these often, but all big database maintenance jobs will now throw their popup message in the middle of the screen, and while they run, you won't be able to use the rest of the client. This will stop some accidental hangs people have encountered when trying to use the client while big jobs are going on. Most of the jobs have a cancel button if you want to stop them early, which you now won't be able to miss.
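
The cancel button works the way you would expect: the job checks a cancel flag between chunks of work, and the button just sets that flag. A generic sketch with illustrative names, not the client's actual popup classes:

import threading
import time

class CancellableJob(object): # hypothetical name

    def __init__(self):
        self._cancel_event = threading.Event()

    def Cancel(self): # what the popup's cancel button would call
        self._cancel_event.set()

    def Run(self, num_chunks):
        for i in range(num_chunks):
            if self._cancel_event.is_set():
                print('job cancelled at chunk %d' % i)
                return
            time.sleep(0.1) # placeholder for one chunk of db work
        print('job finished')

job = CancellableJob()
threading.Thread(target=job.Run, args=(50,)).start()
time.sleep(1.0)
job.Cancel()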

And I have fleshed out the database->migrate database dialog. It now allows thumbnail relocation, gives more information, and uses the new cancellable modal popup when it rebalances files. This dialog is almost ready for any user, so feel free to check it out and read the draft help it links to if you are interested in migrating your db.

stability and fixes

All pages that do work in the background (usually, this means importers/downloaders) are now much more polite about how they talk to themselves and the rest of the client. Having thirty thread watchers open is now far more stable and causes less jittery lag and generally 'bad' code stuff behind the scenes. I believe this has either completely fixed or greatly relieved the crashes some people have experienced trying to open the options or some other dialogs when the client is under heavy usage. Let me know if you still get crashes trying to open dialogs!
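
The general technique here, roughly speaking, is to coalesce updates: a busy page just marks itself dirty, and the GUI redraws dirty pages on a timer instead of reacting to every single import event. A simplified sketch of that idea, not the client's actual pubsub code:

import threading

class ImporterPage(object): # illustrative name

    def __init__(self, name):
        self.name = name
        self._dirty = False
        self._lock = threading.Lock()

    def NotifyNewImport(self):
        # called many times a second by the import thread; very cheap
        with self._lock:
            self._dirty = True

    def RefreshIfDirty(self):
        # called once per timer tick by the GUI thread; does the real redraw
        with self._lock:
            if not self._dirty:
                return
            self._dirty = False
        print('redrawing page %s' % self.name)

pages = [ImporterPage('watcher %d' % i) for i in range(30)]

for page in pages:
    page.NotifyNewImport() # lots of these cost almost nothing

for page in pages:
    page.RefreshIfDirty() # at most one redraw per page per tick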

The top-right hover window will no longer flicker when you set a rating!

Large videos won't take so long to import!

full list

- converted gallery downloaders to the new network engine

- greatly simplified how gallery downloaders report network activity and converted them to show the new network job control as well

- subscriptions also work on the new engine but will not show network gui yet

- hacked hentai foundry and pixiv login to use the new network engine

- successful logins to hf or pixiv now print to the log

- the new network engine now clears temporary session cookies after 90 mins of inactivity

- gallery downloaders and subscriptions now use the new 'downloader instance' and subscription bandwidth rules. by default, this means downloaders will do small bursts every five minutes and that subscriptions will do at most 256MB per day

- subscriptions' bandwidth use is now listed by name in the review bandwidth panel

- subscriptions use a new bandwidth test to determine if they should start or continue based on current bandwidth limits. it should mean subs do a good bit of work and then stop when they are supposed to without ever waiting on bandwidth more than 30s or so

- thread watcher and page of images now ignore bandwidth limits when doing their 'page' fetching part

- gallery page fetching will ignore bandwidth rules (in order to stop gallery walk desyncs from having to wait a long time)–it will fetch one page per five seconds

- the thread watcher and page of images importers now work on their files and page-checking simultaneously–also, the page of images will process its page queue at any time, not only when the current file queue is finished

- the gallery downloader page now works on its files and gallery page parsing simultaneously!

- wrote a 'file import status' control to better wrap up the import summary, progress gauge, and file import status button into one panel

- thread watchers now use the new file status control

- 'page of images' downloaders now use the new file status control

- the gallery downloaders now use the new file status control

- all network jobs will now retry up to four times on the BadStatusLine ConnectionError, which seems to be a TLS (https) negotiation timeout/remote termination (see the retry sketch after this list)

- all requests on the new network engine will now timeout after ten seconds

- they will also retry on generic timeout errors

- popup messages can now be shown in 'modal' mode as a dialog that prohibits interaction with the rest of the client.

- these will not boot while the client is minimised

- database maintenance routines will now all publish messages like this

- 'migrate database' panel now publishes a modal message when it does a file rebalance

- rewrote the controller-side pubsub pipeline to respond faster and consume fewer program resources, particularly for the client

- import pages now update themselves in a less spammy way behind the scenes, meaning more active pages can be open at once without them stepping on each other and clogging things up

- simplified how importers set their status information in several ways

- reduced a swath of pubsub spam related to content updates

- improved how spammy small jobs are written to the profile log

- reduced text flicker on all download pages

- more misc pubsub improvements

- reduced some gui update/refresh spam on hidden pages

- cleaned up a bunch of database->gui message reporting and cleanup code

- added close other/left/right pages to tab right-click menu

- the top-right media hover window will no longer refit-flicker on a ratings change

- wrote a new panel wrapper for listctrls that handles the underlying row of buttons in a neater way and automatically disables them if they are nullipotent (mostly, this means greying out 'delete' buttons when nothing is selected)

- several listctrls use this new panel

- when it is not strictly necessary, videos that are >30MB will no longer use the CPU-expensive manual frame count parsing

- a problem where newly reloaded thread watchers could sometimes stick in a 1/1 initialisation state _should_ be fixed

- fixed gelbooru url parsing (they stopped using the janky redirect.php urls)

- fixed an issue that meant ipfs pin was erroring when trying to show gui-side

- 'page of images' downloaders now protest if they are told to close while working

- some small layout and status text fixes

- adminside petition processing now has a 'flip selected' box to flip checked status of all selected contents

- adminside petition processing contents checklistbox now supports multiple selection

- improved the reliability of some shutdown code

- cleared out some old unused code

- misc new control cleanup

- misc fixes

- more misc fixes
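
For the retry/timeout items above, the behaviour is roughly as follows, expressed here with the requests library rather than the client's own network engine (illustration only):

import time

import requests

MAX_ATTEMPTS = 4
TIMEOUT_SECONDS = 10

def fetch_with_retries(url):
    for attempt in range(1, MAX_ATTEMPTS + 1):
        try:
            return requests.get(url, timeout=TIMEOUT_SECONDS)
        except (requests.ConnectionError, requests.Timeout):
            # BadStatusLine surfaces as a ConnectionError in requests
            if attempt == MAX_ATTEMPTS:
                raise
            time.sleep(2 * attempt) # brief, growing pause before retrying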

next week

I pushed it a little hard this week. I did a lot of hydrus work and I had a stressful IRL week besides, so I would like to take it easier next week, mostly fixing any lingering issues with these recent changes and adding bells and whistles like subscription network ui.

Sankaku is not yet working in the new networking engine, but I think I know the fix. It will take some more downloader overhaul work to get it in the client proper, but it is in my mind, so please hang in there.

de3148 No.6401

>>6398

I'm still not sure gelbooru's working correctly; the tag "a1" gets 798 urls when gel has a tag count of ~3000. I know the tag counts displayed on the site aren't the true count, but that still seems too small. Another case: "agenasu" has a count of 233 and 180 thumbnails show up when you search it, but my subscription only finds 96 urls. I've clicked about half of the thumbnails and they're all valid in my browser, too. Any idea?


01969f No.6403

I'm getting a huge number of errors after the update.

"Daemon Save Dirty Objects encountered an exception:"


DBException
UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-1: ordinal not in range(128)
Traceback (most recent call last):
File "include\HydrusThreading.py", line 195, in run
self._callable( self._controller )
File "include\ClientDaemons.py", line 247, in DAEMONSaveDirtyObjects
controller.SaveDirtyObjects()
File "include\ClientController.py", line 1047, in SaveDirtyObjects
self.WriteSynchronous( 'serialisable', self.network_engine.bandwidth_manager )
File "include\HydrusController.py", line 413, in WriteSynchronous
return self._Write( action, HC.LOW_PRIORITY, True, *args, **kwargs )
File "include\HydrusController.py", line 121, in _Write
result = self.db.Write( action, priority, synchronous, *args, **kwargs )
File "include\HydrusDB.py", line 877, in Write
if synchronous: return job.GetResult()
File "include\HydrusData.py", line 1739, in GetResult
raise e
DBException: UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-1: ordinal not in range(128)
Database Traceback (most recent call last):
File "include\HydrusDB.py", line 525, in _ProcessJob
result = self._Write( action, *args, **kwargs )
File "include\ClientDB.py", line 10302, in _Write
elif action == 'serialisable': result = self._SetJSONDump( *args, **kwargs )
File "include\ClientDB.py", line 8172, in _SetJSONDump
( dump_type, version, serialisable_info ) = obj.GetSerialisableTuple()
File "include\HydrusSerialisable.py", line 136, in GetSerialisableTuple
return ( self.SERIALISABLE_TYPE, self.SERIALISABLE_VERSION, self._GetSerialisableInfo() )
File "include\ClientNetworking.py", line 1128, in _GetSerialisableInfo
all_serialisable_trackers = [ ( network_context.GetSerialisableTuple(), tracker.GetSerialisableTuple() ) for ( network_context, tracker ) in self._network_contexts_to_bandwidth_trackers.items() if not network_context.IsEphemeral() ]
File "include\HydrusSerialisable.py", line 136, in GetSerialisableTuple
return ( self.SERIALISABLE_TYPE, self.SERIALISABLE_VERSION, self._GetSerialisableInfo() )
File "include\ClientNetworking.py", line 1460, in _GetSerialisableInfo
serialisable_context_data = self.context_data.encode( 'hex' )
File "encodings\hex_codec.py", line 24, in hex_encode
UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-1: ordinal not in range(128)


01969f No.6404

File: 5bc85face09ede1⋯.png (19.99 KB, 871x641, 871:641, screen.1501137120.png)

>>6403

Also when I tried to exit, a serious error occurred.


de3148 No.6405

>>6403

>>6404

I also got these when I first updated and ran subscriptions as normal, but after restarting the client a few times I no longer encounter them.


ab41b3 No.6406

Everything works fine here, feels good to be able to turn on all my subscriptions again.


2f51c5 No.6407

>>6405

>>6404

>>6403

Same, restarting doesn't help, all my subs are fucked.


2f51c5 No.6408

>>6407

Should add that restarting my client doesn't fix it for me.


f6e26d No.6409

>>6403

If you have this problem, please pause your subscriptions through services->pause, then go into review bandwidth, and delete any history for your subs.

I think maybe your subs have unicode names? Could you post some examples if so?

If you do have unicode sub names, renaming them in manage subscriptions should fix this problem for now.

I will have a proper fix for this next week.
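
For the technically curious, the traceback above is the Python 2 'hex' codec choking on a unicode subscription name: that codec only takes byte strings, so a non-ascii name gets implicitly ascii-encoded first and fails. Encoding to utf-8 before hexing is the general shape of the fix. This Python 2 snippet just demonstrates the failure and that shape; it is not the actual v267 patch:

name = u'\u65e5\u672c' # a two-character Japanese subscription name

try:
    name.encode('hex') # implicit ascii encode fails on non-ascii characters
except UnicodeEncodeError as e:
    print('same error as the report: %s' % e)

hexed = name.encode('utf-8').encode('hex') # bytes in, hex out: 'e697a5e69cac'
original = hexed.decode('hex').decode('utf-8') # round-trips back to the name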


2f51c5 No.6410

>>6409

Most of my subs have Japanese names with unicode characters, and I have 500+ subs, so I guess I'm fucked until next Wednesday.


dba63e No.6411

>you can't have more than 128 pages

Warn me, sure, but please don't just stop loading them; that effectively deletes those pages.

I can't recover them now: loading the last session doesn't do it, it stops the same way.

I should have saved the session before updating, I guess, but I really didn't expect that change.

It wasn't unstable, just slow to start (a bunch of 3/4 illustration pages).

I had to revert to a week-old backup and an earlier version to cut it in half.


dba63e No.6412

>>6411

That behavior was part of the previous update, which I have now read about.

I hope I can still somehow restore it.


dba63e No.6413

I think the best way would have been to simply refuse to open any new pages until the session is back under 127 pages, at least.


de3148 No.6414

One of my pixiv subscriptions will only download ~90 files (~70MB) before stopping. The only way to get it to continue as I want is to delete the bandwidth usage history. Is this intended?


de3148 No.6415

>>6401

Did some testing with the gallery downloader page. It looks like it's detecting that there are fewer pages than there really are?

a1: found 1148 urls over 29 pages, exists 55 pages

agenasu: found 96 urls over 3 pages, exists 5 pages

aki99: found 168 urls over 4 pages, exists 8 pages

apostle: found urls over 4 pages, exists 5 pages


0f9470 No.6417

Ok, got something weird.

A dialogue box will open in the upper left-hand corner, then close itself.

One time when it happened, it locked me out of the program entirely: apparently it wanted something, but it was behind the client, and I couldn't do anything to the client otherwise. Just now I saw it again, but it closed itself on its own. I don't know what it is, because it doesn't stay open, or in the foreground, long enough for me to see what it is.


0f9470 No.6419

File: 5f194984d4acc84⋯.png (37.3 KB, 445x219, 445:219, client_2017-07-28_17-20-03.png)

Ok, got one of the messages… however, it's weird.

This is apparently happening, and when pop-ups like this occur, it locks the entire program until the message/prompt is exited.

Not sure if this is new, or if this is just something weird only I am experiencing, but I thought I should mention it.

Side note: do you have any kind of ETA range for when the thread watcher tab will be able to watch multiple threads? I'm getting by right now by dragging my address bar into notepad++, doing a \n html replace, then replacing spaces with nothing, and copy-pasting the link dump into a download tab; then I go through the links later in an archive again to confirm I have them all. This works, and I like adding all the links to the download queue instead of one at a time, but it would be nice to be able to thread-watch these links instead.


ab41b3 No.6420

I updated all my Hentai Foundry subscriptions and left their new file tabs open, and closed Hydrus. Today I opened Hydrus again, but now some of the tabs have no files in them. No idea what happened there, the tabs don't seem to have saved properly. But only a few of them.


ab41b3 No.6421

I added an HF subscription to this artist, but Hydrus only finds one of the images…

http://www.hentai-foundry.com/pictures/user/Mosq


878638 No.6423

I'm hitting a lock up that seems to be triggered by subscriptions downloading new images while the client is minimised.

When I maximise the client again an extra window pops up in the top left, and I think I can make out the text 'similar' but it closes too quickly. I'll try to get a screenshot.

That window disappears and I get the following exception. At this point I can't interact with the client at all, I had to copy this from the console I started the client from.


2017/07/30 17:21:13: Uncaught exception:
2017/07/30 17:21:13: PyAssertionError
C++ assertion "IsRunning()" failed at ..\..\src\common\evtloopcmn.cpp(83) in wxEventLoopBase::Exit(): Use ScheduleExit() on not running loop
File "F:\hn\art-266\include\ClientGUIPopupMessages.py", line 943, in TIMEREvent
self.GetParent().DoOK()
File "F:\hn\art-266\include\ClientGUITopLevelWindows.py", line 512, in DoOK
self.EndModal( wx.ID_OK )
File "C:\Python27\lib\site-packages\wx-3.0-msw\wx\_windows.py", line 809, in EndModal
return _windows_.Dialog_EndModal(*args, **kwargs)

Upgraded from v264a, dunno if it happened on 265.


878638 No.6424

File: 74075a993c6a329⋯.png (1.17 KB, 108x87, 36:29, python_2017-07-30_17-51-49.png)

>>6423

Update: if I hover over the icon in the taskbar then I can see the modal behind the main window, but I have no way to make it active. Taskbar thinks there's only 1 window


97789b No.6425

I just want to point this out: https://wxpython.org/news/wxpython-4.0.0b1-release/index.html. Months ago I had mentioned a Python 3-compatible wxPython release (Project Phoenix) to you and you had expressed interest in taking a look at a potential port to Python 3, once a release was available: this seems to be the case now: https://pypi.python.org/pypi/wxPython/4.0.0b1.

Do you have any plans on taking a look at this and assessing how much effort it would be? Python 2 won't be around (at least officially supported) much longer, and general usage is steadily decreasing. Many aspects of Python 3 are also much faster by now.

Thanks as always for your continued efforts!


f6e26d No.6435

File: bbd67078bdd56ba⋯.jpg (3.83 MB, 2337x3252, 779:1084, bbd67078bdd56ba6d60b295ba5….jpg)

>>6401

>>6415

I will be looking at this tomorrow, thank you. I have had several similar reports.

>>6403

>>6404

>>6405

>>6407

>>6408

>>6410

This is now fixed for v267. I apologise for the inconvenience.

>>6411

>>6412

>>6413

I am sorry for the problem you had here. I was talking with a user who had I think 165 pages and was experiencing crashes due to some window-count limits being reached. I believe 128 pages may still be too high for program stability in some cases.

Did you end up recovering your pages? If not–but you still have the session–I can write a special debug exception for you that will ignore the limit and let you get them back.

I do expect to reduce this problem in future by allowing nested pages, particularly for downloaders. I want thread watchers to allow multiple threads per page in the current overhaul.

>>6414

Rather than trying to download all their files in one go, subscriptions now do a small amount of work every day. I think by default it is 256MB or ~100 files per day, whichever occurs first. You can change this under the services->review bandwidth usage dialog. It is under construction, so if you check it out, please let me know if you find it confusing so I can improve the layout and help and so on.
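
In rough terms, the default rule is just 'stop for today once either budget is spent'. Combined with the changelog note that a sub only keeps working while any bandwidth wait it faces is short, it looks something like this sketch (made-up names and assumed defaults, not the actual rule classes):

DAILY_BYTE_LIMIT = 256 * 1024 * 1024 # assumed default: 256MB per day
DAILY_FILE_LIMIT = 100 # assumed default: ~100 files per day
MAX_ACCEPTABLE_WAIT_SECONDS = 30

def subscription_should_keep_working(bytes_used_today, files_added_today,
                                     estimated_wait_seconds):
    if bytes_used_today >= DAILY_BYTE_LIMIT:
        return False # data budget spent for today
    if files_added_today >= DAILY_FILE_LIMIT:
        return False # file budget spent for today
    return estimated_wait_seconds <= MAX_ACCEPTABLE_WAIT_SECONDS

print(subscription_should_keep_working(70 * 1024 * 1024, 90, 5))   # True
print(subscription_should_keep_working(70 * 1024 * 1024, 100, 5))  # False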

>>6417

>>6419

>>6423

>>6424

Thank you for these reports. This jank is frustrating, and I apologise for the lock-ups. I got hit several times this week. It should all be fixed for v267. You won't see the flickering ones, and all of them will close themselves properly.

>>6419

For the thread watcher, I am not sure. I will do it in this current overhaul, but it will take a decent bit of work. I don't want to smush all the threads into the same set of thumbnails–I want them each to have their own nested sub-tab, and the same for different gallery queries.

I did however add a hook for v267 that will auto-detect 4chan and 8chan links when you drag and drop them onto the client–it'll open up a new thread watcher immediately for that link and start it for you!
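
In rough shape, the hook just pattern-matches the dropped text against known thread url formats and spins up a watcher on a hit. This is an illustrative sketch with guessed regexes and a placeholder handler, not the v267 code itself:

import re

THREAD_URL_PATTERNS = [
    re.compile(r'https?://boards\.4chan\.org/\w+/thread/\d+'),
    re.compile(r'https?://8ch\.net/\w+/res/\d+\.html'),
]

def open_new_thread_watcher(url):
    print('new thread watcher for %s' % url) # placeholder hook

def handle_dropped_text(text):
    for pattern in THREAD_URL_PATTERNS:
        match = pattern.search(text)
        if match is not None:
            open_new_thread_watcher(match.group(0))
            return True
    return False # not a thread link; fall through to normal drop handling

handle_dropped_text('https://boards.4chan.org/g/thread/12345678')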

>>6420

Thank you for this report. I am not sure what has happened here. New sessions load asynchronously, so could it be that you had just restarted the client and clicked on the tab and they weren't loaded in yet?

I will keep this in mind.

>>6425

As it happens, I looked at the new wxPython this past Friday. I got the client to boot with a ton of errors that seem fixable. I will put aside a week in the coming months to make sure I can do the same on Linux and OS X and then figure it all out. There is a surprising amount of work to do, as I have a ton of old duct-tape fixes for wx 3.x issues in place that the 4.x code wasn't happy with.
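
To give a flavour of the breakage: Phoenix renamed or removed a lot of the classic calls, so old code needs either updated call sites or small shims. An illustrative example built around two renames from the Phoenix migration notes, not code from the client:

import wx

def list_ctrl_append_row(list_ctrl, row_index, values):
    # classic wxPython 3.0 used InsertStringItem/SetStringItem; Phoenix (4.x)
    # renamed them to InsertItem/SetItem
    if hasattr(list_ctrl, 'InsertStringItem'):
        list_ctrl.InsertStringItem(row_index, values[0])
        for col, value in enumerate(values[1:], 1):
            list_ctrl.SetStringItem(row_index, col, value)
    else:
        list_ctrl.InsertItem(row_index, values[0])
        for col, value in enumerate(values[1:], 1):
            list_ctrl.SetItem(row_index, col, value)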

The switch to Python 3 will be the same deal, but on a slightly longer timeframe. Next year, probably, but we'll see.


0f9470 No.6436

>>6435

Very helpful with the hook, at least for threads that I want to keep up with no matter what; otherwise I think I'm better served by dumping everything into a notepad++ file and copy-pasting for the time being.

I'm not even going to pretend to know how programming works, so ignore this if it's not a feasible part of the steps to get it nested.

I would take a single thread watcher page that does not display images, but lists all the threads and keeps them after they 404 (while showing that they 404'd); then, by right-clicking a thread, you could open all the images it imported in a new tab. I believe this functionality would still be there even after you do everything you want to do. If there is any way to get some kind of functional skeleton of a multi-thread watcher page without bending over backwards to do it, that would be great.

On a side note: I got the 128-page message. Is there a way to open a page after you get it, or does it just hard block? I needed to go over the limit for a moment a few hours ago because the program got locked up by the messages; I missed an import folder import and had to use the file search, but when I wanted to ctrl+A and open all files in a new tab, I couldn't until something was closed. I have a lot of closeable shit open, so it's not a big issue for me, but it's still an issue nonetheless.


97789b No.6437

>>6435

> The switch to Python 3 will be the same deal, but on a slightly longer timeframe. Next year, probably, but we'll see.

Sounds great. I hope at that time you might also consider being more active on GitHub, especially regarding contributions.


f6e26d No.6438

File: d397848be79326a⋯.gif (1.56 MB, 400x321, 400:321, d397848be79326ae610f061245….gif)

>>6436

Thank you for these thoughts. I think the multi-download page will be something like what you describe here. I am still thinking how I want to crush all the info together into a neat list that shows what you want in the space on the left. Having rethought the work I will need to do it nicely, I think I will 'bodge' an easy&ugly solution in the coming weeks and polish the rest later on.

I think the layout and workflow planning is the bigger issue now. I am keen to start, but I need to plan it out more.

The 128 limit is hard. I don't really want to make exceptions, but perhaps a warning at 120 is a good idea. I think the better solution here is to improve session control to make it easier to freeze open pages for later and reload favourite searches and so on.


0f9470 No.6440

>>6438

I relatively recently got a new computer, so I don't have every program I use installed (one being Photoshop), or I would mock something up for you to consider. But since I think you don't want a clusterfuck of ~100 watched threads' images all in one tab, take this as an idea.

The left side is blank, just a placeholder for all the shit you get once the threads are loaded.

The thread watcher itself should act on its own whether the tab is open or not, kind of like how duplicates still retains its place in searching.

Where images would usually be is the thread watcher's mass thread list.

Now, I don't know what is possible to do in the program without hitting a limit, so take everything I suggest as either a simple no-double-click-needed function, or as double-clicking on the thread popping up a window that presents these functions.

[Thread url][user text][Images found][Images imported][Already in DB][Failed][Deleted][Green/yellow/red circle indicating can check, something's wrong and 404][Check interval][Time to next check][Check now]

The thread url would be click-to-copy, in case you want to see it in a browser.

User text is just a note, as some threads could be up for days or months depending on the board.

The image counts are self-explanatory.

The indicator would just be an easy way to see what the status is.

Check interval and time to next check are self-explanatory, with a check now button.

And it just goes like that all the way down to however many threads you want to keep up with.

Now, on the side that is a placeholder: at the top, just under the tabs, are 3 buttons. The leftmost is blanked out, the middle one is 'open in new tab', and the right is 'see images'.

New tab would do just that: open the selected images in a new tab, no different from currently selecting images and opening them in a new tab.

The right one, however, would open the images in the same tab, and when viewing the images the 3 buttons would still be there, but with the rightmost blanked out and the left being 'go back'; the middle would stay unchanged.

At least that's my idea. You could also scrape the subject of the threads and put it in the user text field; seeing as 4chanX is able to see the subject as its own thing, there has to be a hook there somewhere.


97789b No.6446

>>6438

Sooner or later you'll have to add CloudFlare detection/opening a browser window to solve it for the scrapers.

CloudFlare is getting more and more shitty.


6b7ac5 No.6453

>>6398

I could watch that girl (at the end of the video) double-wielding those swords all day! :-)


f6e26d No.6465

File: cf92416906b7826⋯.gif (2.52 MB, 250x270, 25:27, cf92416906b78269ec8f5a3735….gif)

>>6440

Thanks. I'm in a difficult spot at the moment in that all the fun stuff I would like to do here would need like three weeks of rewriting. There's a lot of old bad code in the thumbnail view, so maybe I should just bite the bullet and finally get around to updating it, but I'm also up to my neck with other things and the overall downloader overhaul atm.

I will try and split the overarching project here into smaller jobs. I'm hoping to have nested page tabs in the next week or so–maybe that will clear out a lot of the mess here and make it less of a priority.
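
Just to help visualise the sort of list you describe, a bare report-mode listctrl with those columns would look something like this throwaway mock-up (classic wxPython 3.0 calls; purely illustrative, not client code):

import wx

COLUMNS = ['thread url', 'user text', 'found', 'imported', 'already in db',
           'failed', 'deleted', 'status', 'check interval', 'next check']

app = wx.App(False)
frame = wx.Frame(None, title='multi-thread watcher mock-up', size=(900, 300))
list_ctrl = wx.ListCtrl(frame, style=wx.LC_REPORT)

for i, name in enumerate(COLUMNS):
    list_ctrl.InsertColumn(i, name)

row = list_ctrl.InsertStringItem(0, 'https://boards.4chan.org/g/thread/12345678')
list_ctrl.SetStringItem(row, 1, 'desktop thread')
list_ctrl.SetStringItem(row, 2, '56')
list_ctrl.SetStringItem(row, 3, '50')
list_ctrl.SetStringItem(row, 8, '5 minutes')

frame.Show()
app.MainLoop()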

>>6446

Yeah, I think I agree. We'll see how the new engine goes–I am generally confident I have a fix for sankaku, for instance, based on User-Agent–and whether we can patch around most of the problems by hitting old APIs and so on, at least for now.

Please let me know, as it rolls out, any specific examples you can't figure out a fix for.

>>6453

Beauty and violence is the best combination, tbh.



