I looked at the 2 pull requests and they don't seem to be viable options right now, so I'm posting an idea for a dataset instead.
This dataset has not already been used in TidyTuesday.
The dataset will (probably) be less than 20 MB when saved as a tidy CSV; it's hard to say.
I can imagine a data visualization related to this dataset.
title: Bike Index data
about: Bike Index is the most widely used and successful bicycle registration service in the world, with over 1,366,000 cataloged bikes, 1,730 community partners, and tens of thousands of daily searches. The site helps people get their stolen bikes back, and its data could also be used to map which areas have the most bike thefts. It fits well with R users because the service is used around the world.
data_source: https://bikeindex.org/documentation/api_v3
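For anyone interested in pulling the data, here's a minimal sketch of querying the v3 search endpoint from R. The endpoint path, query parameters, example location, and column names are assumptions based on the API docs linked above, so check them against the documentation before a formal submission; paging through results would be needed to build the full dataset.

```r
# Minimal sketch: pull stolen-bike records from the Bike Index v3 API.
# Endpoint, parameters, and field names are assumptions taken from the docs above.
library(jsonlite)

url <- paste0(
  "https://bikeindex.org/api/v3/search",
  "?page=1&per_page=100",
  "&location=Chicago&distance=25",   # hypothetical example location
  "&stolenness=proximity"            # stolen bikes near the given location
)

res   <- fromJSON(url)   # assumed to return a list with a `bikes` element
bikes <- res$bikes       # data frame of matching bikes

# A few columns (assumed names) that could feed a "where are bikes stolen" map
head(bikes[, c("title", "serial", "stolen_location", "date_stolen")])
```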
If you want more backstory on this website, listen to the Darknet Diaries episode about it.
Anyway, posting for reference in case someone is interested in pulling the data to make a formal submission. 🚲