We were fed up with all the negative news and numbers in the media, so we decided to look for particularly positive side effects of the crisis and to analyze and visualize them. We also wanted to build something that would let us use and improve our Python and R skills.
## What it does
It gathers data on several side effects of the corona pandemic and displays it as graphs on a Shiny website.
## How I built it
We built several web crawlers in Python to gather the data we needed and designed the corresponding graphs in R. We then used the Shiny package in R to build a webpage that presents the data visualizations.
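The crawl-then-structure step can be sketched in Python. This is a minimal sketch, not our actual code: the table layout, field names, and output columns are assumptions. In the real project a Selenium-driven Chrome (via chromedriver) rendered the dynamic pages; here that step is stubbed with an inline HTML snippet so the parsing and structuring logic stands on its own.

```python
import csv
import io
from html.parser import HTMLParser

# Stand-in for the HTML a Selenium/chromedriver session would return after
# rendering a dynamic page. The table shape and class names are assumptions.
SAMPLE_HTML = """
<table id="stats">
  <tr><td class="date">2020-03-01</td><td class="value">12</td></tr>
  <tr><td class="date">2020-03-02</td><td class="value">17</td></tr>
</table>
"""

class StatsParser(HTMLParser):
    """Collects (date, value) records from <td class="date"/"value"> cells."""
    def __init__(self):
        super().__init__()
        self._field = None   # which cell type we are currently inside, if any
        self._row = {}
        self.rows = []       # structured records, one dict per table row

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            cls = dict(attrs).get("class")
            if cls in ("date", "value"):
                self._field = cls

    def handle_data(self, data):
        if self._field:
            self._row[self._field] = data.strip()

    def handle_endtag(self, tag):
        if tag == "td":
            self._field = None
        elif tag == "tr" and self._row:
            self.rows.append(dict(self._row))
            self._row = {}

def crawl(html):
    """Parse rendered HTML into a list of structured row dicts."""
    parser = StatsParser()
    parser.feed(html)
    return parser.rows

def to_csv(rows):
    """Serialize the structured rows to CSV text, ready to be read into R."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["date", "value"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

rows = crawl(SAMPLE_HTML)
print(to_csv(rows))
```

In the actual pipeline the CSV would be written to disk and loaded on the R side (e.g. with `read.csv()`) before being plotted with ggplot2 and served through Shiny.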
## Challenges I ran into
- Using GitHub for the first time
- Crawling dynamic webpages
- Structuring the crawled data
- Plotting the data

## Accomplishments that I'm proud of
- Learned a lot of new topics and gained experience in programming a real project
- Coordination of a team working together online
- Staying motivated for such a long time (we did it!!!)

## What I learned
- How to build a crawler for dynamic webpages
- How to use GitHub
- Plotting data with the ggplot2 package
- Building webpages with the Shiny package

## What's next for Some other new
- Deeper analysis of the current topics
- Finding more side effects of the virus
api, chromedriver, crawler, ggplot, python, r, shiny