
Requesting all skins


Argamilwins.3914


Hi everyone! First, I'd like to apologize for my English; it's not my native language. I started a small ReactJS app for fun and to build my portfolio at the same time. I want to fetch all the skins available in the game, then check all the skins on my account, and then see which skins I'm missing.

I tried https://api.guildwars2.com/v2/skins, which I understand returns the ids of every skin in the game, and https://api.guildwars2.com/v2/skins?ids=x, which returns the skin with the matching id. While looking for a solution I also saw that you can fetch them in pages, like https://api.guildwars2.com/v2/skins?page=0&page_size=200&lang=en, but that's not really what I'm looking for...
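For reference, the two request shapes described above can be sketched like this. This is only a sketch assuming Node 18+ (global fetch); the 200-ids-per-request cap is an assumption mirroring the page_size limit mentioned in the thread, not something the post confirms:

```javascript
// Fetch the full id list, then request skin details in batches of ids.
// Assumes Node 18+ (global fetch). The 200-id batch size is an assumption
// based on the page_size cap discussed below; adjust if the API differs.
const API = "https://api.guildwars2.com/v2/skins";

// Split an array into chunks of at most `size` elements.
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) out.push(arr.slice(i, i + size));
  return out;
}

async function fetchAllSkins() {
  const ids = await (await fetch(API)).json(); // e.g. [1, 2, 3, ...]
  const skins = [];
  for (const batch of chunk(ids, 200)) {
    // ?ids= accepts a comma-separated list and returns the matching objects.
    const res = await fetch(`${API}?ids=${batch.join(",")}`);
    skins.push(...(await res.json()));
  }
  return skins;
}
```

This keeps the request count down to roughly (total ids / 200) instead of one request per skin.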

What I'd like to do is get all the skin objects and then filter them into categories, for example type = Weapon. I thought there might be a query parameter to fetch them by type or some other attribute, but it seems that doesn't exist... only by ids or pages. I don't really know what to do now, so if anyone can point me in the right direction, that would be nice! :)
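Since there is no server-side type filter, the grouping has to happen client-side once the full array is fetched. A minimal sketch, assuming each skin object carries a `type` field (the sample objects below are hand-made placeholders, not real API data):

```javascript
// Group a skins array by its `type` field (e.g. "Weapon", "Armor").
function groupByType(skins) {
  const groups = {};
  for (const skin of skins) {
    (groups[skin.type] ??= []).push(skin);
  }
  return groups;
}

// Hand-made sample objects; the real API response shape may differ.
const sample = [
  { id: 1, name: "Example Axe", type: "Weapon" },
  { id: 2, name: "Example Helm", type: "Armor" },
  { id: 3, name: "Example Sword", type: "Weapon" },
];

const byType = groupByType(sample);
console.log(Object.keys(byType)); // ["Weapon", "Armor"]
```

Comparing the grouped list against the ids returned by the account skins endpoint would then reveal the missing skins.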

Thank you all for your time!


Yeah, I get that, but the thing I don't understand is how to get the whole package, since the only thing https://api.guildwars2.com/v2/skins returns is an array of ids. Should I just make a loop that pushes all the pages? => https://api.guildwars2.com/v2/skins?page=i&page_size=200&lang=en I tried something else first, but the server didn't like me making around 6,000 requests.


Looping through the pages is the best way to get the whole set, as far as I know. On the first request (page index 0) the response headers will contain x-page-total, which was 31 with a page size of 200 in the manual test I just did. That tells your script the loop must end after page index 30.
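The loop described above can be sketched like this, again assuming Node 18+ fetch; the header name x-page-total comes from the post, everything else is illustrative:

```javascript
// Request page 0 first, read x-page-total from the response headers,
// then loop the remaining pages. Assumes Node 18+ (global fetch).
const BASE = "https://api.guildwars2.com/v2/skins?page_size=200&lang=en";

// Page indices run 0 .. total-1; after page 0 these indices remain.
function remainingPages(total) {
  return Array.from({ length: Math.max(total - 1, 0) }, (_, i) => i + 1);
}

async function fetchAllPages() {
  const first = await fetch(`${BASE}&page=0`);
  const total = Number(first.headers.get("x-page-total")); // e.g. 31
  const skins = await first.json();
  for (const page of remainingPages(total)) {
    const res = await fetch(`${BASE}&page=${page}`);
    skins.push(...(await res.json()));
  }
  return skins;
}
```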

If you instead request each skin individually by id (6,199 of them, per x-result-total in the response headers) and don't throttle the requests, you're likely to get denied for overburdening the server. It's far more efficient to request pages (31 of them at size 200).
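The arithmetic behind those numbers, plus a simple way to throttle if you do need many requests (the 100 ms delay is an arbitrary choice, not a documented rate limit):

```javascript
// Sanity check: 6199 skins at 200 per page rounds up to 31 pages,
// matching the x-page-total value mentioned above.
const PAGE_SIZE = 200;
const TOTAL_SKINS = 6199; // x-result-total at the time of the post
const pages = Math.ceil(TOTAL_SKINS / PAGE_SIZE);
console.log(pages); // 31

// A small pause between requests keeps a long loop polite; the 100 ms
// figure is an assumption, not an official rate limit.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
```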

Let us know how this goes.

