
Inconsistent success rate on nls.co.uk site #145

Open
JohnDeZoom opened this issue May 22, 2017 · 10 comments
@JohnDeZoom

JohnDeZoom commented May 22, 2017

I've successfully obtained one image from the maps.nls.uk site by typing in this link:

http://maps.nls.uk/imgsrv/iipsrv.fcgi?iiif=/10110/101103608.jp2/info.json

into the Dezoomify window. But with another one:

http://maps.nls.uk/imgsrv/iipsrv.fcgi?iiif=/10234/102344075.jp2/info.json

I get an error:
"NS_ERROR_FAILURE
(http://ophit.alwaysdata/dezoomify.net/zoommanager.js:62)"

I've checked and the tile from which I formulated the address exists and displays.

The second link is derived from the tile links for:
http://maps.nls.uk/view/126524075
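
For reference, the two working examples in this thread suggest a URL pattern: the directory component is the first five digits of the nine-digit image id. A minimal sketch, assuming that pattern holds (it is inferred from the URLs quoted here, not from any official NLS documentation, and `infoJsonUrl` is a hypothetical helper name):

```javascript
// Build the IIIF info.json URL for an NLS image id.
// Directory = first five digits of the image id (pattern inferred
// from the URLs in this thread; not official NLS documentation).
function infoJsonUrl(imageId) {
  const dir = imageId.slice(0, 5);
  return `http://maps.nls.uk/imgsrv/iipsrv.fcgi?iiif=/${dir}/${imageId}.jp2/info.json`;
}

console.log(infoJsonUrl("101103608"));
// http://maps.nls.uk/imgsrv/iipsrv.fcgi?iiif=/10110/101103608.jp2/info.json
```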

What exactly does this error message relate to? And any idea what causes this inconsistency?

Thanks
John

@JohnDeZoom
Author

The precise message is:
[screenshot: dezoomify-error_mess]

@lovasoa
Owner

lovasoa commented May 23, 2017

Dezoomify FAQ item explaining how to allocate more memory to large canvases

Did you try allocating more memory to the canvas, as explained in the FAQ?

@lovasoa
Owner

lovasoa commented May 23, 2017

I have gfx.max-alloc-size set to 2000000000, and dezooming your image works for me (Firefox 53.0.2, 64-bit, on Linux).
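
For anyone following along, this preference lives in Firefox's about:config page (this reflects Firefox behaviour at the time of the thread; the preference may not exist or may behave differently in later versions):

```
about:config           → open in the address bar and accept the warning prompt
gfx.max-alloc-size     → right-click → New → Integer if it does not exist yet
                         set the value in bytes, e.g. 2000000000
```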

@lovasoa
Owner

lovasoa commented May 23, 2017

Here it is:

[attachments: low-resolution preview, full-resolution canvas, 280 MB PNG file]

@JohnDeZoom
Author

Thank you. That did the trick, although on my machine (4 GB of memory) it's a trifle slow!
I first tried doubling the memory to 1000000000 without success, but 2000000000 worked (slowly).
It did crash the machine the first time because I had other things running. I was a bit surprised, because the file I had managed to download was 250 MB and took nowhere near as long to process. I guess it's a case of experimenting to find the best memory setting for any particular computer.
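
The memory needed scales with the canvas area, not the downloaded file size: an RGBA canvas takes 4 bytes per pixel, so a 250 MB compressed PNG can easily decompress to well over 1 GB of raw pixels. A rough sketch of the arithmetic (4 bytes per pixel is the standard RGBA layout; the example dimensions are illustrative, not the actual map's):

```javascript
// Raw memory required by an RGBA canvas: 4 bytes per pixel.
function canvasBytes(width, height) {
  return width * height * 4;
}

// An illustrative 18,000 x 18,000 px map:
console.log(canvasBytes(18000, 18000)); // 1296000000 bytes, ~1.3 GB
```

This is consistent with a 1000000000-byte cap failing while 2000000000 succeeds for a large map.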

@JohnDeZoom
Author

Oh, and is it possible to generate the downloaded files in .jpg format directly? I also noticed the ~10x file-size saving from converting to JPEG afterwards.
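
In principle a browser can re-encode a canvas as JPEG via the standard `HTMLCanvasElement.toDataURL` method; whether dezoomify exposes that option is up to the author. A hedged sketch (`canvas` stands for whatever element dezoomify draws into; the variable and function names here are hypothetical, not taken from zoommanager.js):

```javascript
// Re-encode the assembled canvas as JPEG instead of PNG.
// toDataURL('image/jpeg', quality) is a standard HTMLCanvasElement
// method; JPEG is lossy, which is where the ~10x size saving comes from.
function toJpegDataUrl(canvas, quality = 0.9) {
  return canvas.toDataURL("image/jpeg", quality);
}
```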

@lovasoa
Owner

lovasoa commented May 25, 2017 via email

@JohnDeZoom
Author

Message about the JPEG possibilities understood. For my uses JPG is fine, but I understand you can't get anything but PNG because that's the source. Thanks for your explanation anyway.

@JohnDeZoom
Author

What do you mean, any website will be able to crash my browser IF IT WANTS TO?!
Also, by 'setting it back', back to what? Originally the keyword wasn't present, so I suppose you mean delete the previously created keyword?

@JohnDeZoom
Author

UPDATE: after being able to get the quoted image by increasing the parameter to 2 GB, I then tried this one:
http://maps.nls.uk/imgsrv/iipsrv.fcgi?iiif=/12652/126522371.jp2/info.json
but got nothing but freezing at 1%.
I tried to increase the memory further, but I could only increase it to 2.1 GB; anything larger was refused by Firefox ('invalid number' was the error message when I tried).
I have 4 GB, so I'd have thought I could increase it more than that.
Do you know how Firefox works in that respect?
I assume this image is just too big. Could I ask you to download it and post the JPG version here, or just tell me what the image size actually is, for reference? Thanks
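
The ~2.1 billion ceiling is consistent with Firefox storing integer preferences as 32-bit signed values, whose maximum is 2^31 − 1; this is an inference about Firefox internals, not something stated in the thread:

```javascript
// Largest value a 32-bit signed integer can hold:
const MAX_INT32 = 2 ** 31 - 1;
console.log(MAX_INT32); // 2147483647, i.e. just under 2.15 billion
```

If that inference is right, physical RAM is irrelevant here: the pref simply cannot hold a larger number.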
