windows
zip: https://github.com/hydrusnetwork/hydrus/releases/download/v359/Hydrus.Network.359.-.Windows.-.Extract.only.zip
exe: https://github.com/hydrusnetwork/hydrus/releases/download/v359/Hydrus.Network.359.-.Windows.-.Installer.exe
os x
app: https://github.com/hydrusnetwork/hydrus/releases/download/v359/Hydrus.Network.359.-.OS.X.-.App.dmg
linux
tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v359/Hydrus.Network.359.-.Linux.-.Executable.tar.gz
source
tar.gz: https://github.com/hydrusnetwork/hydrus/archive/v359.tar.gz
I had an excellent and very full week.
file maintenance
The new file maintenance system now has some nice UI under database->maintenance->review scheduled file maintenance. It has two pages. It is a little advanced, but if you have some hydrus experience, please feel free to have a play around with it.
The first page reviews what is currently scheduled to run. It shows job counts for each type and lets you manually start work on that job type or cancel them.
The second lets you schedule new jobs. It uses the standard file search interface, so you can queue up a thumbnail regen for all webms with a certain tag, say, or recheck file metadata on all pngs (just in case they were truly apngs). There is also an advanced button to run jobs on repository update files.
Furthermore, the new file maintenance manager now handles file integrity checks. The old 'file integrity check' entries under the database menu, which would check all files in your system in one go, are gone, replaced by the nicer system that only interrupts other work if you force it and will otherwise chip away at any large job in smaller pieces during idle time. Missing files and files with incorrect data will now automatically have any known URLs exported to .txt files, and incorrect files will also be exported, all to an appropriate directory beneath your main db dir.
I would like to add saved file searches within the next month or so, at which point I will attach some recommended jobs here, such as detecting apngs and correcting some webms that were once detected as mkvs.
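For the curious, the apng check itself is simple: an apng is a normal png with an extra 'acTL' animation control chunk that must appear before the image data, so you just walk the chunk list. A Python sketch of the idea (not the actual hydrus code):

```python
import struct

def png_is_apng(path):
    """Return True if a PNG file is secretly an APNG.

    APNGs are valid PNGs that carry an 'acTL' (animation control)
    chunk before the first 'IDAT' chunk, so we walk the chunks
    until we hit acTL, the image data, or EOF.
    """
    PNG_SIGNATURE = b'\x89PNG\r\n\x1a\n'
    with open(path, 'rb') as f:
        if f.read(8) != PNG_SIGNATURE:
            return False  # not a png at all
        while True:
            header = f.read(8)
            if len(header) < 8:
                return False  # hit EOF without seeing acTL
            (length,) = struct.unpack('>I', header[:4])
            chunk_type = header[4:8]
            if chunk_type == b'acTL':
                return True
            if chunk_type in (b'IDAT', b'IEND'):
                return False  # image data started, so acTL cannot follow
            f.seek(length + 4, 1)  # skip chunk data plus CRC
```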
duplicates
The thumbnail right-click menu now lets you reset or undo file relationships! If a file has duplicates, you can remove it from that group, and if it has alternates, you can pull it out of those. You can also completely 'dissolve' the groups and clear out false positive relationships. It gives you little yes/no explanatory dialogs for each action, in case you aren't quite sure what an action will do.
I expect to add commands to apply these sorts of operations to entire selections in the next few weeks.
The duplicates storage overhaul is coming to a close. I have a few more cleanup jobs like this, some filter interaction improvements, and some proper help to write. We are getting there!
ipfs nocopy
I have added an advanced and experimental 'nocopy' IPFS pin mode to the client. This permits IPFS to share files straight from your hydrus file store, without having to make a copy. If you are not familiar with this, I recommend you not try. I would like feedback on this from the users who are enthusiastic about it so I can iterate on it and make it easier for regular users in future.
Essentially, the IPFS review/manage services panels now have a bit of nicer UI to talk to the daemon. You can check that 'nocopy' is turned on and enable it under manage services. Unfortunately, nocopy only works on files beneath the parent of the main ipfs conf directory, which is typically your USERDIR, so if your hydrus client_files folder is above that, or on another drive entirely, you will need to remap the locations with some symlinks(!) before it will work. The manage services panel has some help on it, but please feel free to ask me if you need more.
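To give a rough idea of the shape of the setup, here is a shell sketch. All the paths are examples only, and the final `ipfs add --nocopy` just shows the command-line equivalent of the pin--hydrus does its pinning through the daemon's API:

```shell
# Sketch of an ipfs nocopy setup -- paths are examples, adjust to your
# own install. Needs go-ipfs with the experimental filestore.

# 1. enable the experimental filestore, which nocopy relies on
ipfs config --json Experimental.FilestoreEnabled true

# 2. if your hydrus media store lives outside the ipfs repo's parent
#    (typically your user dir), symlink it to somewhere beneath it
ln -s /mnt/bigdrive/hydrus/client_files "$HOME/hydrus_client_files"

# 3. restart the ipfs daemon, then files can be pinned without copying,
#    e.g. the command-line equivalent of what the client does:
ipfs add --nocopy "$HOME/hydrus_client_files/f00/example.jpg"
```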
I tested this, and it appears to work, but I am not a big IPFS user, so I would appreciate it if anyone who is could give it a proper shake-down and let me know what you think. I'd love to add some easier plug-n-play IPFS sharing ability to the client, but we'll need to get over some hurdles first.
I expect to write a new IPFS downloader or similar that will pull multihash data from public http gateways so users can get into this more without having to set up a daemon.
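Fetching by multihash from a public gateway is just an http GET, so the downloader side is not complicated. A quick Python sketch (the ipfs.io gateway is real; the function names are illustrative, not what will ship):

```python
import urllib.request

def gateway_url(multihash, gateway='https://ipfs.io'):
    # public gateways serve pinned content at /ipfs/<multihash>
    return '{}/ipfs/{}'.format(gateway, multihash)

def fetch_multihash(multihash, gateway='https://ipfs.io', timeout=30):
    # pull the raw file bytes through the gateway, no local daemon needed
    with urllib.request.urlopen(gateway_url(multihash, gateway), timeout=timeout) as response:
        return response.read()
```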
deviant art
Deviant Art seem to be rolling out a complete site redesign that breaks the existing login script and gallery & file page parsers. Much like a recent Pixiv rollout, not all users are being affected at the same time–it looks like users who remain logged in from a previous session are not seeing the update as quickly.
I have written completely new objects for DA for this update. There is a new login script and new parsers, and now there is a new downloader just for tags. Unfortunately, I think you need to be logged in–or otherwise have some particular cookies established from previous activity–to access the new download searches, or else you will get a 404. If you are logged in and still get this 404, try resetting your login and then logging back in. Dragging file page URLs straight on to hydrus seems to work fine; it is just this gallery search step that needs a 'new' session set up before it works.
I am afraid I have also had some CloudFlare 500 server error results with the new search. I suspect DA hasn't completely rolled out their new tech yet or something weirder is happening. These gallery 500s can pause the downloader hydrus-side annoyingly, so I will write some ui next week to let you try again quicker. In any case, the new downloaders I am putting out today are an improvement on 'it doesn't work at all', but they may not be everything we need. If you are a keen DA user, please let me know how you get on.
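To illustrate the sort of fix I mean, the retry logic boils down to something like this Python sketch--back off a little on a 5xx and only give up (and let the queue pause) after a few attempts. The delay schedule and names here are illustrative, not what will actually ship:

```python
import time
import urllib.error
import urllib.request

def fetch_with_retries(url, delays=(10, 30, 90),
                       fetch=lambda u: urllib.request.urlopen(u).read(),
                       sleep=time.sleep):
    # try once immediately, then once per entry in the delay schedule
    last_error = None
    for delay in (0,) + tuple(delays):
        if delay:
            sleep(delay)  # back off before the next attempt
        try:
            return fetch(url)
        except urllib.error.HTTPError as e:
            if e.code < 500:
                raise  # 4xx is not transient, fail immediately
            last_error = e  # 5xx: likely transient, try again
    raise last_error  # retries exhausted; caller can pause the queue
```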
the rest
I fixed the 'file lookup script' GET problem. I apologise for the inconvenience.
I am also rolling out an improvement to the shimmie parser this week that pulls source time and md5 hash.
full list
- ipfs nocopy:
- wrote a new panel to better show ipfs daemon status and added it to the review and manage ipfs service panels
- added nocopy config review and enable status and buttons to this new panel
- added an EXPERIMENTAL 'use nocopy' checkbox to the ipfs manage services panel
- added accompanying WEWLAD path translation ui to enable nocopy when your hydrus media storage paths are inaccessible to the ipfs daemon for nocopy purposes. a help button explains this more–it currently needs some symlinking, so non-advanced users should stay away
- if everything is set up, ipfs nocopy seems to work! I am not totally happy about the setup required here, so feedback from advanced ipfs-fluent users would be appreciated and we can iterate on this
- improved stability of ipfs daemon/version checking code
- .
- file maintenance:
- wrote some proper file maintenance ui under database->maintain->review scheduled file maintenance!
- for existing work, the new file maintenance ui shows how much work is scheduled for each job type and lets you cancel that work or run it manually
- for new work, the new file maintenance ui lets you queue up work of any type for files you select with the standard tag autocomplete search interface! you can schedule all pngs to be rescanned in case they are truly apngs, or regen thumbs for all files imported before a certain date, or whatever you wish. you can also queue up repository update files
- the file maintenance manager can now deal with repository update files when it does a complete file metadata regen
- the file maintenance manager now takes responsibility for checking file presence and file integrity. the old 'check file integrity' options under database->maintenance, which did all files in one go, are now gone
- file integrity checks will now always export broken files and missing/broken files' known urls to .txt files to your db_dir/missing_and_invalid_files. appropriate popups and log data will be sent as well. also, the known urls will be both exported on a per-file .txt basis and appended to one unified .txt
- if a file now fails to parse on a metadata reparse, it is now automatically checked for file data integrity
- if a repository encounters a missing, invalid, or incorrect filetype update during update processing, it now schedules all updates in the repo to be appropriately rescanned by the file maintenance manager
- if the storage subdirectory does not exist on a client file path request or thumbnail-add attempt, a special error will now be raised with instructions to reconnect the location or shut the client down immediately
- cleaned up some ffmpeg mime-detection logspam
- .
- duplicates:
- added several single-file thumbnail right-click dissolve/reset duplicate actions:
- - reset search status
- - remove from duplicate group (if in one and not the king)
- - dissolve duplicate group (if in a group)
- - remove from alternate group (if in one)
- - dissolve alternate group (if in one)
- - clear false-positive relations (if it has some)
- added some new code to deal with dissolution and member extraction at the db level
- when a member is extracted from an alternate group, its constituent files are now requeued for potential search
- multi-selection duplicate right-click actions are now available to non-advanced-mode users
- wrote some unit tests for the new dissolve/reset actions
- cleaned up some misc duplicates code
- .
- the rest:
- fixed a recent bug in the file lookup script GET call–I apologise for the mistake
- the main gui page tab menu now lets you sort page tabs by the number of files they have
- deviant art seem to be rolling out a new page format. this week hydrus introduces completely new deviant art downloader objects that, fingers crossed, will update any existing users smoothly and also provide new tag search functionality. users who are still logged in may still be getting the old page format. if this is you, and this update does not work (although I _think_ it should, even so), please try clearing your existing login and logging in again
- new deviant art login script, artist + tag GUGs, gallery url classes, file and gallery parsers
- updated the shimmie file page parser to pull source time and md5
- improved the 'process now' advanced button to only focus on actual specific outstanding processing. previously, it was also checking for new metadata when due, which, when the server was not available, could seemingly idle for a time before actually processing updates due to the new delaying connection retry code
- wrote a new 'file import report mode' mode to help->debug->report modes
- fixed a progress display issue with the janitorial petitions processing page
- improved accuracy of sibling and parent petition counts, and properly capped them at 1000
- mapping petitions are now grouped by namespace, and will come in more manageable chunks
- fixed the server launch-and-init test debug code
- misc string-to-string control improvements to support the new ipfs edit ui
- removed the old 'continual tag archive sync' legacy code from tag services, which has been semi/non-functional for a long time
- cleaned up the annoying separator hanging on the end of certain tag right-click menus
- cleared out the 'Exception ignored in' spam that is often printed after the log closes
next week
I pushed it a bit hard this week, so I will take it easier as I catch up on a variety of smaller jobs. I also need to catch up on my messages.