
/hydrus/ - Hydrus Network

Bug reports, feature requests, and other discussion for the hydrus network.


New user? Start here ---> http://hydrusnetwork.github.io/hydrus/

Experienced user with a bit of cash who wants to help out? ---> Patreon

Current to-do list has: 2,017 items

Current big job: Catching up on Qt, MPV, tag work, and small jobs. New poll once things have calmed down.



31c14a  No.9326

windows

zip: https://github.com/hydrusnetwork/hydrus/releases/download/v313/Hydrus.Network.313.-.Windows.-.Extract.only.zip

exe: https://github.com/hydrusnetwork/hydrus/releases/download/v313/Hydrus.Network.313.-.Windows.-.Installer.exe

os x

app: https://github.com/hydrusnetwork/hydrus/releases/download/v313/Hydrus.Network.313.-.OS.X.-.App.dmg

tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v313/Hydrus.Network.313.-.OS.X.-.Extract.only.tar.gz

linux

tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v313/Hydrus.Network.313.-.Linux.-.Executable.tar.gz

source

tar.gz: https://github.com/hydrusnetwork/hydrus/archive/v313.tar.gz

I had a great week. There are some more downloader and tag import improvements, and my gallery overhaul is going very well.

gallery log

Other than some final parsers, tweaks, and bells and whistles, the 'page download' part of the downloader overhaul is done, so I have moved onto the gallery part–where pages of thumbnails are converted into Page URLs. This work is going very well–a lot of the data side of it was already done due to the page download stuff–and I am rolling a bit of it out today.

All the downloaders will now have a new 'gallery log' box attached, which is a little bit of text and a second 'table' icon button, like the one for file imports. This button launches a window with a similar table of data, this time listing all the Gallery URLs visited and the result (including any errors). Until now, this information has been discarded, but having this logged will prove very useful for a variety of reasons–I already used it on my dev machine this week to debug some pixiv stuff.
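As a rough illustration of the kind of data each row in that table holds, here is a minimal sketch in Python; the class and field names are assumptions for illustration only, not hydrus's actual internals:

import time

# Hypothetical sketch: one log entry per Gallery URL visited.
# Field names are illustrative assumptions, not hydrus's real objects.
class GalleryLogEntry:

    def __init__(self, gallery_url):
        self.gallery_url = gallery_url
        self.created = time.time()
        self.status = 'pending'   # e.g. 'pending', 'done', 'error'
        self.note = ''            # human-readable result, e.g. 'found 42 post urls'

    def set_result(self, status, note):
        self.status = status
        self.note = note

# A downloader would append one entry per gallery page it hits:
log = []
entry = GalleryLogEntry('https://danbooru.donmai.us/posts?tags=samus_aran&page=1')
log.append(entry)
entry.set_result('done', 'found 20 post urls, 1 next gallery page url')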

The gallery log is read-only for now (it only reports what has happened), but over the coming weeks I will add a new gallery parser and start offloading work from the old system where it is available, just as has been happening with the file downloader, and then I'll be able to add some kind of 'retry' and 'continue from here' commands to its right-click menu, add Gallery URL drag-and-drop, and so on. Thankfully, gallery parsing is significantly simpler than post page parsing, so I expect to roll this out pretty quickly. I am really pleased with how this has come together.

For those users who are interested in editing in the new parsing system, please check out the new 'danbooru gallery page parser' under network->manage parsers. It doesn't do anything yet, but it gives a good idea of the basics of what you'll be creating here.

tag import options filters

Tag import options is marching on. The new 'get all tags' checkbox now has a 'Tag Filter' object attached which lets you control which tags you want much more finely than the old namespace checkbox system–such as 'get all tags except "species:" tags' and 'get only "creator:", "character:", and "series:" tags, but (these ten unnamespaced tags) are also fine'.

Also, the new and advanced 'only add tags that already exist' option has a similarly-advanced filter to go along with it, so if you are feeling clever, you can now say 'get all namespaced tags, but only get existing unnamespaced tags'.
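To make the sorts of decisions described above concrete, here is a hedged sketch of a namespace-based tag filter with per-tag exceptions; the function and structure are made up for illustration and are not the actual TagFilter object:

# Illustrative sketch only: a namespace-based tag filter with per-tag exceptions.
def make_filter(blocked_namespaces=(), allowed_namespaces=None, exceptions=()):
    exceptions = set(exceptions)

    def allows(tag):
        if tag in exceptions:
            return True
        namespace = tag.split(':', 1)[0] if ':' in tag else ''
        if allowed_namespaces is not None:
            return namespace in allowed_namespaces
        return namespace not in blocked_namespaces

    return allows

# 'get all tags except "species:" tags'
f1 = make_filter(blocked_namespaces={'species'})

# 'get only creator/character/series tags, but these unnamespaced tags are also fine'
f2 = make_filter(allowed_namespaces={'creator', 'character', 'series'},
                 exceptions={'smile', 'looking at viewer'})

print(f1('species:cat'))           # False
print(f2('character:samus aran'))  # True
print(f2('smile'))                 # True (explicit exception)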

I've written a bit of help into both the new tag filter editing dialogs, but please let me know how it could be improved. This stuff is complicated, and I could see adding an 'easy mode' checkbox/panel after some feedback.

I expect to phase out the old namespace checkbox system in the coming weeks. Any existing checked namespaces will be converted to an appropriate tag filter beside the 'get tags' checkbox. It will still be useful to know what namespaces a parser expects to generate, so I'll see about putting that somewhere in the ui as well.

I will be adding more ways to quickly copy/paste and mass-set/inherit tag import options, so if you want to take advantage of these new filters, please do a couple of small tests to make sure you know what you want, and then wait for these new features. Don't waste time editing a hundred subscription TIOs' filters individually!

downloaders

In multi-watchers and subscriptions, the summary of import progress now has a shorthand for the number of unsuccessful results, like '222/245 - 2I4F', which means "222 items out of 245 processed, with 2 'ignored' and 4 'failed'". I like the overall change, but looking at my updated IRL client now, I think I'll pull the 'D' for 'deleted' out, as it is a bit spammy and not so useful. The 'I' and 'F' are great for quickly finding problems, however.
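For reference, that shorthand can be read straight off the per-status counts; a minimal sketch of how such a summary string might be assembled (illustrative only, not the actual hydrus code):

# Illustrative sketch: build a '222/245 - 2I4F' style summary from status counts.
def simple_import_summary(done, total, ignored=0, failed=0, deleted=0, show_deleted=False):
    suffix_parts = []
    if deleted and show_deleted:
        suffix_parts.append('{}D'.format(deleted))
    if ignored:
        suffix_parts.append('{}I'.format(ignored))
    if failed:
        suffix_parts.append('{}F'.format(failed))
    summary = '{}/{}'.format(done, total)
    if suffix_parts:
        summary += ' - ' + ''.join(suffix_parts)
    return summary

print(simple_import_summary(222, 245, ignored=2, failed=4))  # 222/245 - 2I4F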

Just as I rolled out the Deviant Art parser last week, they switched up how their Post URLs work, which meant DA subs will have hit their periodic limits (thinking they were seeing new URLs) and also meant my new parser wasn't kicking in. I've fixed the underlying URL-matching in the update this week, so while you will still get some subscriptions moaning about hitting their periodic file limits, it should all settle down in a sync or two and the new parser will kick in properly from now on. You might like to scan your DA subs for any with 'I' in their progress texts as above, as the new parser may be able to work on these URLs (typically these are images with high-res download disabled, which the old parser could not handle, but there are also some 'here's a pack of adobe brushes in a strange file format' posts that will just fail harmlessly).

I discovered that the Pixiv downloader was missing works explicitly marked as 'manga' this week, so that is also corrected. The multi-file works it was previously fetching were only 'multi-file illustrations', which live under a slightly different section of their site. I expect I removed the manga search (technically, I was searching for 'type=illust' instead of 'type=all') in an old version when manga results were more of a pain than not. In any case, manga is now included in searches going forward. If you have some subscriptions that you would like to retroactively fetch old manga links for, instead of resetting the subs or running a full manual download again, I recommend you wait for the gallery overhaul (and the subsequent 'Searcher' work) so I can roll out an explicit 'manga-only' Pixiv downloader, at which point you can just run manual downloads for those artists' manga (due to Pixiv being technically odd, re-running entire searches can eat a bunch of bandwidth, as it is difficult to tell whether the client has seen a pixiv file/URL before and so skip the download).
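For the curious, the fix amounts to a different 'type' value on the search gallery URL. A sketch along these lines; the base URL and other parameters are assumptions from memory, and only the type=illust -> type=all switch is taken from the paragraph above:

# Sketch of the gallery URL change described above; the base URL and parameter
# layout are assumptions, only the type switch comes from the post.
from urllib.parse import urlencode

def pixiv_search_gallery_url(query, page=1, search_type='all'):
    base = 'https://www.pixiv.net/search.php'   # assumed legacy search endpoint
    params = {'word': query, 'type': search_type, 'p': page}
    return base + '?' + urlencode(params)

print(pixiv_search_gallery_url('samus', search_type='illust'))  # old behaviour: misses manga
print(pixiv_search_gallery_url('samus', search_type='all'))     # new behaviour: includes manga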

full list

- fleshed out the new gallery log and its constituent log entry objects

- added gallery logs to gallery downloaders, subscriptions, url downloaders, simple downloaders and watchers

- added very simple gallery log reporting to these downloaders

- added first, read-only version of gallery log ui to these downloaders

- fleshed out some new gallery/file-object pipeline stuff

- wrote a simple danbooru gallery page parser and added it on update. it doesn't do anything yet, but if you are into the new parsing system, please check it out as an example

- the url downloader now has a full file import status control with status text

- fixed a url count issue on completely fresh gallery downloads that was stopping gallery searches one file (like 199 vs 200) before the file limit

- the pixiv downloader now fetches 'type=all' gallery pages, which include specifically manga file pages (as opposed to merely multi-file 'illustrations')

- added 'retry ignored' to the file import status button's right-click menu

- fixed the deviant art url class and parser to use the new file page format. also added an '(old format)' class to match the old way for legacy purposes (this legacy class also uses an api conversion to connect to the new parser–we'll also figure out a way to convert all these over at the db level en masse later!)

- updated some similar deviant art gallery stuff as well

- tag import options now has a tag filter to go along with the 'get all tags' checkbox! ('get all tags' is now renamed to 'get all' as a result). this filter lets you make more complicated tag filtering decisions like 'get all tags except "species:" tags'.

- the new 'only get tags if they already exist' checkbox now also has a filter, if you want to only apply this test to a subset of tags (like the unwashed mess of unnamespaced tags many boorus and sites provide)

- generalised a 'tag filter' button class to make it simpler to edit tag filters across the program, and cleaned up some related status code

- fixed a problem with deriving tag import options for specific url classes when that url class was part of an api-url-class chain

- if the domain manager cannot find a url match for a pending download, it now assigns the file post default tag import options to that import

- added a new 'duplicates' options page that has a hacky way to edit the weighted scores used to determine which of a pair of files to present first in the duplicates filter

- unified how some file import status generation works, adding a new 'simple status' string to briefly summarise progress in multi-watcher and edit-subscriptions columns

- cleared out some old redundant status caching in the urls downloader

- simplified how almost all timestamp strings are generated

- simplified how time delta strings are generated

- brushed up some simple common ways to present timestamps as 'human pretty' strings

- all places where timestamps would be presented as a mix of '5 days ago' and complete datetime strings will now present as '5 days ago' unless you set the new options->gui 'always show iso' checkbox. going back to simple to clear up confusion in workflow and code. I may revisit this, as turning on ISO mode now spams it all over the place

- cleaned up the 'looks like the computer just woke from sleep' check and reduced its grace period to fifteen seconds. foreground daemons (like the subscription daemon) and the network engine will now also obey it

- added a 'simulate wake from sleep' debug action to better test the sleep-wake detection code

- improved my custom statictext class to auto-wrap text without flickering

- used this new autowrapping to improve wrapping and layout of popup message texts

- replaced all other st wrapping with this new code

- wrote a little helper function to better dedupe lists in future

- did a bunch of refactoring to neaten some long common func names

- deleted some old unused code

next week

More gallery work. I'd like to flesh out the new gallery log pipeline and have it step in when a parser is available, and then I can start rolling out new gallery parsers. I have also recently had a good run at clearing out small stuff, so I'd like to go through some of my older todo–just some small weird jobs that have been on the back-burner for longer than I wanted.


a699bd  No.9329

>>9326

Just updated; I have a few concerns/annoyances with the multi-thread watcher out of the gate.

1) It doesn't remember how things were sorted. I had it sorted newest first; it seems to have gone back to sorting by name and 404 status.

2) It doesn't remember how wide everything is. I set the columns so the time is visible no matter how long it gets, but it reverted back to the default.

I may find a few more things throughout the week, but these were the ones that stuck out to me. Also, 12 minutes on a full load this time.


a699bd  No.9330

>>9329

Just read the post. I suggest keeping the D in there, and if necessary making it a toggle; I find knowing that the program recognized everything more useful than a slightly cleaner look.


248ab6  No.9331

Can we get 4chan thread names as importable tags please?


0719c5  No.9332

>>9326

Thanks for adding the duplicates thing. I have a question though, how exactly do I use it? I set the two for larger filesize to 100 and the rest to 0, but I still get either the larger or smaller first when using the filter.


a699bd  No.9333

>>9331

If this happens it would have to be in conjunction with what I thought of for the machine learning tagging, a separate tag repository that has to be manually confirmed.

would be nice for cyoa threads, would be horrible for most other things unless you keep up with confirming or denying the tags.


5e9183  No.9334

File: bc91918bfeedfb0⋯.png (19.54 KB, 894x431, 894:431, 1530764701.png)

>>9326

I want to make sure I'm understanding how to set this up properly.

I have "Only Add Tags That Already Exist" set. I would add an 'Exception' for character/creator/series to get my desired functionality - right?

An alternative way I could set this up is to block everything except my whitelist of 450-ish tags and then whitelist the character/series/creators. But for obvious reasons I don't want to manually add ~450 unnamespaced tags. :)


5e9183  No.9335

>>9334

Ah, giving it a more careful reading, I need to add `:` to the Excludes, as using this feature TURNS ON "filters by existing tags" already. So `:` on Exclude and the three I want in Exceptions.


5e9183  No.9337

>>9335

Okay, tested on a few small samples. I had the filters on the wrong side.

To clarify this:

The "Tag Exists" filter is the first filter applied to imported tags. You want to EXCLUDE things from this filter by adding them to the lefthand side. In my case, this is the character/creator/series namespaced tags. If I wanted all namespaced tags that don't exist I'd add `:` to the left and any namespaced tags I want to EXCLUDE would be added to the Exceptions on the right. Which is a little counter-intuitive due to the naming but I get it now.

The most important line to add to the HELP dialogue would be "The 'Only Add Tags That Already Exist' filter is run first - add tags you would like to exclude from this initial filter on the left."


0293dd  No.9342

>>9326

Is the gallery parsing engine the thing you mentioned a while ago where we can write custom scripts (shareable as images) to set up downloads from any gallery or imageboard type site we want? I've been working on a fuskator spreadsheet lately of stuff I want to scrape, should I be getting hype?

>>9333

OR it could just be a checkbox option exactly like filename is now, maybe with a global option to override it to always on for people who want that.


a699bd  No.9343

>>9342

I'm not too concerned with what people have in their own tag library; I'm more concerned with the potential of tags being added to public repositories where a 4chan post has a subject of 'lets go monster girl' or 'monster girl thread 45',

then that tag gets added to every image in the thread, while a good proportion of those images aren't monster girls; they are cat/dog/bunny girls at best, and even then they just look like they're in costume.

we also have the tag 'anal' which has a meaning, and just a picture of a gaping anus and nothing else is not anal

If that became a thing, I would personally wish to have those tags relegated to requiring manual confirmation before being accepted as explicit tags. I want to be able to trust that the tags are correct, which for the most part you can on boorus or other places, but not on 4chan.


a699bd  No.9344

So, every now and then, for reasons I don't know, images create their own window.

I think it has something to do with clicking the image while the program hangs mid-click, so when my mouse comes to rest somewhere else and the program comes out of its hang, it thinks I dragged and dropped.

usually an annoying but benign problem.

but it happened with a multi thread watcher.

Once I was able to do anything, I found the thread watcher, cleared the highlight and re-highlighted it, and the image was back where it should be in relation to download order.

I have said I would like highlight and clear-highlight functions on all download modes, as the functionality is clearly there; however, if that happened, something would need to be done about moving files.

I'm in the middle of culling through a fairly large number of files where each one was downloaded in order and needs to stay in order; however, people also posted unrelated images, so if I culled through the files and then reordered them by mistake, clearing and re-highlighting would bring all the files back.

For the most part I wouldn't want files to be culled out of the download list itself, but in fairly rare cases like this I would.


96d750  No.9349

The gelbooru parser seems to be broken suddenly. It now needs pid=42 to get to the second page instead of pid=20; otherwise every sub only gets a max of 42 images, because gelbooru just resets to pid=0 if it's not a multiple of 42.
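(For anyone following along: the pid here is a post offset rather than a page number, so page n starts at pid = n * 42. A hedged sketch of the arithmetic, with the base URL assumed from memory:)

# Sketch of the pagination arithmetic described above; the base URL is an
# assumption, the 42-posts-per-page figure comes from this post.
POSTS_PER_PAGE = 42

def gelbooru_gallery_url(tags, page_index):
    pid = page_index * POSTS_PER_PAGE   # pid is a post offset, must be a multiple of 42
    return ('https://gelbooru.com/index.php?page=post&s=list'
            '&tags={}&pid={}'.format(tags, pid))

print(gelbooru_gallery_url('blue_eyes', 0))  # first page, pid=0
print(gelbooru_gallery_url('blue_eyes', 1))  # second page, pid=42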


31c14a  No.9353

File: 382a717a39fd335⋯.png (3.12 KB, 512x85, 512:85, 4chan thread api parser --….png)

File: 0d017810f929f7f⋯.webm (1.91 MB, 1066x600, 533:300, 0d017810f929f7f4262a5ba45….webm)

>>9330

>>9329

Thanks. Yeah, my new listctrl class doesn't remember sort or column widths yet. I am still replacing instances of the old one (which was even worse), but when that cleanup job is done, I will construct some sort of options object to remember this info and even maybe do some 'select columns to show' stuff.

Thanks for letting me know that you like D. I hate it every time I look now, so I was going to nuke it completely. I'll make an option, likely defaulting to 'not show' so as not to confuse newer users with too much data.

>>9331

If you feel brave, you can add this yourself under network->manage parsers, although some of this ui is still a bit ugly and prototype. I probably won't add this myself for the reasons in >>9333 . I am going off maintaining default support for difficult tags, which is also why I dropped pixiv regular tags recently–they are just shit quality overall and more of a pain to deal with than they are worth.

I recommend you manually tag these threads' content en masse when they are DEAD, or maybe wait for me to figure out what I am doing with tag import options and namespaces, and maybe there will be a way to better select what a parser can put out (I am not sure though, as this is proving increasingly complicated on the back-end).

EDIT: fuck it, attached is one that should do it. Drag and drop it onto manage parsers and then link it under manage url class links. Let me know if it doesn't work for you.

>>9332

That sounds right, so maybe I messed something up. I will check it, thank you for the report.

>>9334

>>9335

>>9337

Yeah, I think this ui is fuckery duckery doo. It reflects the underlying data rather than how humans actually want to approach it. I will give this another pass and maybe an easy mode for just like whitelists or whatever. Thank you for your feedback.

>>9342

Absolutely, or almost. We can now parse 'Post URLs' well. This next step will convert thumbnail-filled 'Gallery URLs' into lists of Post URLs (and possibly a next page Gallery URL), and the one after that will convert 'samus_aran' into the initialising Gallery URL. It is all debug-tier ui atm, but I will write a neat human-friendly 'just drag and drop this png onto this bit of the hydrus client and you'll get preggobooru downloader m8' system that ties it all together.
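A rough sketch of those stages, with hypothetical function names standing in for the real parsers (this is just an illustration of the pipeline shape, not hydrus code):

# Illustrative pipeline sketch only; every function here is a hypothetical stand-in.
def search_to_gallery_url(query):
    # 'samus_aran' -> initialising Gallery URL (the future 'Searcher' step)
    return 'https://preggobooru.example/posts?tags={}&page=1'.format(query)

def parse_gallery_page(gallery_url):
    # Gallery URL -> list of Post URLs, plus maybe a next-page Gallery URL
    post_urls = ['https://preggobooru.example/posts/123',
                 'https://preggobooru.example/posts/124']
    next_gallery_url = 'https://preggobooru.example/posts?tags=samus_aran&page=2'
    return post_urls, next_gallery_url

def parse_post_page(post_url):
    # Post URL -> direct file URL (the part that already works)
    return post_url + '/download'

gallery_url = search_to_gallery_url('samus_aran')
while gallery_url is not None:
    post_urls, gallery_url = parse_gallery_page(gallery_url)
    for post_url in post_urls:
        file_url = parse_post_page(post_url)
        # ... queue file_url for download, log the gallery result, etc.
    break  # sketch only: stop after one page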

Please check out the help here, which is limited for now but will be fleshed out as these latter parts become real:

https://hydrusnetwork.github.io/hydrus/help/downloader_intro.html

>>9343

Yeah, the PTR (and any other tag service) getting a bunch of shit added is an ongoing problem. The Deviant Art gallery parser is sperging out this week due to more URL changes on their end. There is a little cruft that builds up here that is not yet dealt with on the database end. At some point I will need to write some deeper orphan-cleaning routines for both the client- and server-side db and update code so these ultimately become completely zero-byte problems.

>>9344

Thank you for this report. When you say 'create their own window', do you mean that at the end of the hang the thought-to-be drag and drop creates a new regular thumbnail media page with just that thumbnail?

If you find the program hangs a lot, the best remedy is to reduce the size of your current session. I assume you are the guy with lots of multi-watchers going on, but if you can clear out or send-to-a-saved-session any large dumps of files, your client should calm down a lot.

You might also like to run some profiles and send them to me as per here:

https://hydrusnetwork.github.io/hydrus/help/reducing_lag.html

The pubsub profile mode may make your client go nuts, but it is probably the source of the lag. I am not sure if the profile log will have personal/private info. If it does, please email it to me or DM me on discord or send me a 'private' pastebin link on a thread report or something here rather than posting it in public.

>>9349

Thank you for this report. I think you got hit by an issue with the legacy gallery downloader, which sometimes has difficulty figuring out the right index to advance per page num for these kinds of boorus. Restarting your client should do a temp fix for now.

As it happens, this gallery work I am doing right now will fix it completely by actually parsing the 'next gallery page' URL directly, and gelbooru will obviously be high up on the list to replace, so this may be fixed naturally in a couple of weeks.


87bc51  No.9354

File: c4ca30678e02b8f⋯.png (12.29 KB, 473x274, 473:274, hydrus.PNG)

Just tried updating to v313. That failed, so I figured I would update to v305 first. Should I be concerned by this?


a699bd  No.9355

>>9353

I would suggest changing the ignored 'I' from upper case to lower case, as at a quick glance it looks like a 1.

For en masse tagging from 4chan, I have an idea that would be a bit of work but would pay off, because it would force you to manually cull. Would you be able to make a tab that tags everything dropped into it?

You look at images in one tab, drag them to the mass-tag tab, then go to the mass-tag tab and press a button to confirm. I think that one layer of abstraction would be better long term, since people would need to put at least some thought into 'should this image be tagged this'.

Otherwise, if it's just automated, would it be possible to add in a 'source-4chan:(your tag here)'-style tag, just so that when the tag does turn out to be useless it's easier to mass remove?

And yeah, I am the person with the large image list, mostly crap I need to cull, but I procrastinate, and when the program hangs I end up procrastinating more, and if I am able to shove them out of the way, the shit will never get done.

As for the profile, I just emailed it to you. I redid it and sent it again right after doing one of each of the profiles until I got a hang; the file turned out a bit large with the ui one.


a9d594  No.9356

>>9353

>As it happens, this gallery work I am doing right now will fix it completely by actually parsing the 'next gallery page' URL directly, and gelbooru will obviously be high up on the list to replace, so this may be fixed naturally in a couple of weeks.

Is there anything I can do on my end to fix this? Restarting doesn't solve the problem and all my gelbooru subscriptions are useless now. Sadly gelbooru was the only one that is actually usable as a source.


15693f  No.9357

>>9353

Hey, dev. Since there is 'Database > check > file integrity', which removes the entries for missing files, maybe it is also possible to do the reverse?

Say something goes wrong and you lose some files: you could do a file integrity check where it redownloads the file if the source is still valid, and removes the file entry if both the source and the (local) file are missing.


4da80b  No.9361

>>9353

Congrats, glad it's getting there on the gallery parser. I'll check out the intro.

>>9343

I get what you're saying, I honestly didn't consider that. I just sort of assumed that everyone set up their tags as private by default, used the tag db to help them sort, refine, and filter things out, manually added, removed, and corrected tags according to a useful rubric before archiving them, and then shared the tags over to the public tag db.

Frankly, I tend to think that's exactly what everyone should be doing.

I personally have added almost zero tags on anything to the tag db, I was waiting for a basic universal tagging rubric to emerge or be hashed out here as there was some discussion about it last year before doing any tag commits myself. Also last time I checked there wasn't a way to do a search (in my archive in this case), CTRL+A, right-click, and hit "push all local tags to hydrus tag database", which would make that easier.

But given that everyone just spams whatever onto the public db rather than setting up their own tags, that explains why I get a lot of tag clutter the times I do look at the public tags. And also means it's unlikely I'll get a mass copy tags option because no one wants more tag spam.


a699bd  No.9366

>>9361

Personally, I'm waiting on image import notes to include user notes; for deleted images it's a must for me, along with automatically annotating deleted duplicates so I know the reason an image was deleted is that I have a better version, or even an alternative version I liked more.

Outside of that, I do tagging through ratings at the moment; they tend to be easier for me to work with, and converting ratings to tags is fairly easy when the time comes.


da00f0  No.9377

>>9366

What benefit do ratings have over tags?


31c14a  No.9403

>>9354

Hey, do you mean that you were unable to update all the way up to v313 and had to go up to v305 first? And you were then ok to go up to v313? If so, this is likely no problem at all. Big update gaps are subject to bitrot, and if splitting it up into smaller jumps made it work in the end, you should be completely ok.

Let me know if you run into any further problems though! If you just updated from v262, a hell of a lot has changed, so don't be afraid to ask if anything is confusing.

>>9356

Not definitely, but you might be able to force it manually. Atm, if the booru downloader comes across a booru that doesn't use 1, 2, 3 page indices, it takes the first gallery result it sees as the definitional number of files per page. I assume you have some downloader or sub that is firing that only has 20 results total, so it is falling short of the 42(?) it would normally see, and this is propagating to all your other queues. If you boot and then semi-quickly start a search for something simple like 'blue_eyes' or 'skirt', it should initialise with 42.
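To make that failure mode concrete, here is a hedged sketch of the legacy behaviour as described: the first gallery result count seen becomes the assumed per-page step, so a 20-result search poisons the index arithmetic for every other queue until the next boot. Made-up code, not the real downloader:

# Sketch of the legacy behaviour described above (illustrative only).
class LegacyGalleryIndexer:

    def __init__(self):
        self.assumed_files_per_page = None   # learned from the first gallery page seen

    def report_first_page(self, num_results):
        if self.assumed_files_per_page is None:
            self.assumed_files_per_page = num_results

    def next_page_offset(self, page_index):
        return page_index * self.assumed_files_per_page

indexer = LegacyGalleryIndexer()
indexer.report_first_page(20)        # a tiny 20-result search happens to come first...
print(indexer.next_page_offset(1))   # ...so every later queue asks for pid=20 instead of 42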

>>9357

There isn't a 'redownload missing' maintenance tool yet, but I think that could be a cool tool to add now that we have better 'known url' storage and url-based actioning in the new download system. It would also be useful if I roll out some kind of url sharing. (i.e. you could tell your client, based on that 'URL Repository': 'hey client, download everything that is on PUR that is tagged 'character:samus_aran' on PTR')



