What I did was use the Bugzilla REST API to download all bugs for a specific product. Then I bulk-uploaded them to Autocompeter.com and lastly built a simple web front-end.
When you "download all" bugs with the Bugzilla REST API, it might be capped but I don't know what the limit is. The trick is to not download ALL bugs for the product in one big fat query, but to find out what all components are for that product and then download for each. The Python code is here.
Everyone's Invited to Play
So first you need to sign in on https://autocompeter.com using your GitHub account. Then you can generate an Auth-Key by picking a domain. The domain can be anything really. I picked bugzilla.mozilla.org but you can use whatever you like.
Then, when you have an Auth-Key, you need to know the name of the product (or products) and run the script like this:
python download.py 7U4eFYH5cqR15m3ekuxkzaUR Socorro
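For the curious, the upload side is basically one bulk POST to Autocompeter. This is a sketch, not the script's actual code, and it assumes Autocompeter's /v1/bulk endpoint and Auth-Key header (check the Autocompeter docs for the exact payload shape):

```python
import requests

AUTH_KEY = "7U4eFYH5cqR15m3ekuxkzaUR"  # the Auth-Key from the example command

# Each downloaded bug becomes one autocomplete document: a title and a URL.
documents = [
    {
        "title": bug["summary"],
        "url": "https://bugzilla.mozilla.org/show_bug.cgi?id=%d" % bug["id"],
    }
    for bug in all_bugs  # the list built in the per-component sketch above
]

response = requests.post(
    "https://autocompeter.com/v1/bulk",
    json={"documents": documents},
    headers={"Auth-Key": AUTH_KEY},
)
response.raise_for_status()
```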
Once you've done that, fork my codepen and replace the domain and any other references to the product.
Caveats
To make this really useful, you'd have to run it more often. Perhaps you can hook it up to a cron job or something, and make it only download from the REST API the bugs that have changed since the last time you did a big download. Then you can let the cron job run frequently.
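Something like this could be the core of that cron job. It's a sketch that assumes the Bugzilla REST API's last_change_time search parameter, and the local timestamp file is made up for illustration:

```python
import datetime
import pathlib

import requests

BUGZILLA = "https://bugzilla.mozilla.org/rest"
STAMP = pathlib.Path("last_run.txt")  # made-up file to remember the last run

since = STAMP.read_text().strip() if STAMP.exists() else "2015-01-01T00:00:00Z"

response = requests.get(f"{BUGZILLA}/bug", params={
    "product": "Socorro",
    "last_change_time": since,  # only bugs changed since the last run
    "include_fields": "id,summary",
})
changed_bugs = response.json()["bugs"]

# ...re-submit changed_bugs to Autocompeter here, then record this run's time.
STAMP.write_text(
    datetime.datetime.now(datetime.timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
)
```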
If you want really hot results, you could hook up a server-side service that consumes the Bugzfeed websocket.
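A very rough sketch of what such a service could look like, assuming that websocket URL and that Bugzfeed pushes JSON messages per bug change (both assumptions; check the Bugzfeed docs for the real endpoint and subscribe protocol):

```python
import asyncio
import json

import websockets  # pip install websockets

BUGZFEED_URL = "wss://bugzfeed.mozilla.org/"  # assumed endpoint

async def listen():
    async with websockets.connect(BUGZFEED_URL) as ws:
        # Bugzfeed probably wants a subscribe message first; see its docs.
        async for message in ws:
            change = json.loads(message)
            # ...re-submit the changed bug to Autocompeter here.
            print(change)

asyncio.run(listen())
```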
Last but not least: this will never list private/secure bugs. Only publicly available stuff.
The Future
If people enjoy it, perhaps we can change the front-end demo so it's not hardcoded to one specific product ("Socorro" in my case). And it can be made pretty.
And the data would need to be downloaded and re-submitted more frequently. A quick Heroku app mayhaps?