Requesting all skins — Guild Wars 2 Forums


Hi everyone,
First, I'd like to say sorry for my bad English; it's not my native language! I started a little ReactJS app for fun, and to build my portfolio at the same time, and I wanted to fetch all the available skins, then check all the skins on my account and see which skins I'm missing.

I tried this: "https://api.guildwars2.com/v2/skins". I understand this returns the ids of every skin in the game, and if I do "https://api.guildwars2.com/v2/skins/<id>" it returns the skin with that id. While looking for a solution I saw that you can also get them in pages like "https://api.guildwars2.com/v2/skins?page=0&page_size=200", but that's not really what I'm looking for...

What I'd like to do is get all the skin objects and then filter them into categories, like type = Weapon. So I thought maybe there was a query to get them by type or some other field, but it seems that isn't a thing... only by ids or pages. Now I don't really know what to do; if anyone can point me in a direction, that would be nice! :)

Thank you all for your time!


  • Leo.3428 Member ✭✭✭

    I don't think any endpoint takes filters; it's on your app's shoulders to download the whole package and apply the filters to the cached data. Makes sense?
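    The suggestion above can be sketched like this. The sample objects are made up for illustration, but they follow the general shape of the skin objects the API returns (id, name, type, ...); in the real app, `cachedSkins` would be the full downloaded set:

    ```javascript
    // Hypothetical cached skin data; real objects come from the skins endpoint.
    const cachedSkins = [
      { id: 1, name: 'Sample Axe', type: 'Weapon' },
      { id: 2, name: 'Sample Pants', type: 'Armor' },
      { id: 3, name: 'Sample Sword', type: 'Weapon' },
    ];

    // Filtering into categories is a plain array filter on the cached data.
    function byType(skins, type) {
      return skins.filter((skin) => skin.type === type);
    }

    const weapons = byType(cachedSkins, 'Weapon');
    ```

    Since the API has no server-side filter, doing it once on cached data also means every later category switch in the React UI is instant, with no extra requests.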

  • Yeah, I get that, but the thing I don't get is how to get the whole package, since the only thing "https://api.guildwars2.com/v2/skins" returns is an array of ids. Should I just make a loop that pushes all the pages, e.g. "https://api.guildwars2.com/v2/skins?page=0"? I tried something else, but the server didn't like me making around 6k requests.

  • Leo.3428 Member ✭✭✭

    Looping through the pages to get the whole set is the best way as far as I know. On the first request (for page index 0), the response will contain the header x-page-total, which is currently 31 when the page size is 200, per the manual test I just did. That tells your script the loop must end after page index 30.
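    A minimal sketch of that loop, assuming the public skins endpoint and the x-page-total header described above (error handling omitted; requires a runtime with global fetch, e.g. Node 18+ or a browser):

    ```javascript
    const BASE = 'https://api.guildwars2.com/v2/skins';

    // Build the URL for one page; 200 is the maximum page size on this API.
    function pageUrl(page, pageSize = 200) {
      return `${BASE}?page=${page}&page_size=${pageSize}`;
    }

    async function fetchAllSkins() {
      // The first response tells us how many pages exist via x-page-total.
      const first = await fetch(pageUrl(0));
      const pageTotal = Number(first.headers.get('x-page-total'));
      const skins = await first.json();

      // Remaining pages: 1 .. pageTotal - 1 (ends after index 30 when total is 31).
      for (let page = 1; page < pageTotal; page++) {
        const res = await fetch(pageUrl(page));
        skins.push(...(await res.json()));
      }
      return skins;
    }
    ```

    Reading the page count from the header instead of hard-coding 31 means the loop keeps working as new skins are added to the game.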

    If you request each skin individually (6199 of them, per x-result-total in the response header) and don't throttle the requests, you are likely to get denied for overburdening the server. It's far more efficient to request pages (31 of them at size 200).
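    If you already have the full id list from the ids endpoint, another batched option on this API is the ids query parameter, which accepts up to 200 comma-separated ids per request. A sketch of the chunking (the helper names are my own):

    ```javascript
    // Split an id list into batches of at most `size` ids each.
    function chunk(ids, size = 200) {
      const out = [];
      for (let i = 0; i < ids.length; i += size) {
        out.push(ids.slice(i, i + size));
      }
      return out;
    }

    // Build a bulk request URL for one batch of ids.
    function idsUrl(ids) {
      return `https://api.guildwars2.com/v2/skins?ids=${ids.join(',')}`;
    }
    // 6199 ids split into batches of 200 -> 31 requests instead of ~6k.
    ```

    Either way (pages or ids batches) you end up at roughly 31 requests, which is well below the volume that got the original 6k-request attempt denied.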

    Let us know how this goes.