Visualizing US Violent & Property Crime by State/Town

The benefits of reducing violent and property crime in your neighborhood have a long-term impact on the livelihood of the people you love and the community you serve. Here is a review of the work I did for MyCityAtPeace, a for-profit organization using a specialized, hands-on approach to tackling these issues.

Project needs analysis and implementation plan

If you do a quick Google search, you’ll find a lot of useful information on various crime statistics: robbery rate, burglary rate, aggravated assault, and much more. However, it is very difficult to find free information on violent and property crime in the US at the state and town level.

So here is how I went about gathering the information:

  1. I consulted with executive-level stakeholders to determine their needs and to assist with efficient implementation of strategic and operational resources.
  2. Nine socio-economic variables were required (several items below pair a value with its growth; a schema sketch follows this list):
    1. Degree of Violent Crime
    2. Degree of Property Crime
    3. Average town population and its growth
    4. Average town unemployment and its growth
    5. Median town real estate value and its growth
    6. Percentage cost of living compared to the US average
  3. The code base was developed in Python and implemented on Google Cloud Platform.
  4. The data was then cleansed using Python and visualized in Tableau.
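
For concreteness, here is a minimal sketch of the per-town schema those nine variables imply. The column names are illustrative, not the exact ones used in the repository:

```python
import pandas as pd

# Illustrative column names for the nine socio-economic variables per town.
columns = [
    "violent_crime_index",
    "property_crime_index",
    "avg_population", "population_growth",
    "avg_unemployment", "unemployment_growth",
    "median_real_estate_value", "real_estate_growth",
    "cost_of_living_vs_us_avg",
]
towns = pd.DataFrame(columns=columns)
print(list(towns.columns))
```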

The code and Tableau worksheet are available on my GitHub repository.

Code structure and implementation

Data on these socioeconomic variables isn’t freely available, at least not at the level required. If you’re planning on scraping data, it makes it a lot easier if you choose a website that has most of the information you need. Why? Because each website has a unique HTML structure, so sticking to one site saves you from developing multiple scripts to tailor the extraction process. Luckily for me, all the data I needed could be found on BestPlaces.

Here is the high-level process of the script design:

  1. Required dependencies were imported, with the usual suspects: pandas, numpy, re, requests, and BeautifulSoup.
  2. State-level zip codes were imported into a pandas DataFrame.
  3. The DataFrame was cleansed of NaNs and initialized with columns for the two URLs used to facilitate the collection process.
  4. The scraping process iterated through zip codes, making a request to the base URL with the zip code appended as per the structure defined on the website. Here is an example of demographic data for Cambridge; take note of the URL structure. A sketch of this loop follows the list.
  5. Exported the data to a CSV file.
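
Here is a minimal sketch of that loop, assuming a BestPlaces-style URL pattern. The exact paths and parsing logic are illustrative, not the code in the repository:

```python
import pandas as pd
import requests
from bs4 import BeautifulSoup

BASE_URL = "https://www.bestplaces.net"  # base URL; the path structure below is an assumption

def scrape_town(crime_url, demo_url):
    """Fetch the crime and demographics pages for one town and parse them."""
    row = {}
    for label, url in (("crime_page", crime_url), ("demo_page", demo_url)):
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        # Real parsing is site-specific; the page title stands in as a placeholder here.
        row[label] = soup.title.get_text(strip=True) if soup.title else None
    return row

# DataFrame initialized with the two URL columns described in step 3.
zips = pd.DataFrame({
    "zip_code": ["02139"],  # Cambridge, MA, as in the example above
    "crime_url": [f"{BASE_URL}/crime/zip-code/massachusetts/cambridge/02139"],
    "demo_url": [f"{BASE_URL}/people/zip-code/massachusetts/cambridge/02139"],
})

records = [scrape_town(row.crime_url, row.demo_url) for row in zips.itertuples()]
pd.DataFrame(records).to_csv("scraped_towns.csv", index=False)
```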

Google Cloud Implementation

So if you take the code and run it on your own computer, it will take about 9 hours to finish. I honestly didn’t have that kind of time to wait in a fixed Wi-Fi spot (because if I moved the computer and the connection dropped, the scraping process stopped with it).

So I implemented the process on Google’s Compute Engine on its Cloud Platform (GCP). The cost of running the code on the service ended up being about $3. There are tons of tutorials out there to help you get started, but here is the overview:

  1. Created a project on GCP.
  2. Created a standard “VM Instance” with the following setup,
    1. Zone: us-central1-a
    2. Machine Type: n1-standard-1 (1 vCPU, 3.75 GB memory)
    3. OS: Ubuntu 16.04
  3. SSH’d into the Virtual Machine (VM), updated the system, and installed the required dependencies, including the Python codebase and folder structure.
  4. Then ran the code on three URLs to make sure it was outputting correctly. If it doesn’t run as planned, it might have something to do with the way you’re referencing input and output files; path syntax is different on Linux (see the sketch after this list).
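
A minimal, portable way to reference those files (file names here are hypothetical) is pathlib, which builds paths that behave the same on Windows, macOS, and the Linux VM:

```python
from pathlib import Path

# Resolve everything relative to the script's own folder so the code behaves
# identically on a local machine and on the Ubuntu VM.
BASE_DIR = Path(__file__).resolve().parent
INPUT_FILE = BASE_DIR / "data" / "zip_codes.csv"          # hypothetical input file
OUTPUT_FILE = BASE_DIR / "output" / "scraped_towns.csv"   # hypothetical output file

OUTPUT_FILE.parent.mkdir(parents=True, exist_ok=True)
print(f"Reading from {INPUT_FILE}, writing to {OUTPUT_FILE}")
```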

Finally, perhaps the most rewarding part of this process is making sure the code keeps running after you close the SSH terminal. Here is the tutorial I used to do it. I went to bed, slept for about 4 hours, and the scraping process was done.

The results

The data was imported into Tableau, which automatically identified the state field as a geographic variable, allowing for “maps” visualizations. Here are some interesting finds.

High distribution of US violent crime in pink
High distribution of US property crime in yellow

From the images above, we see a higher prevalence of property crime in the US. Where property crime shows higher variability around certain hotspots, violent crime seems to be (mostly) restricted to the hotspots themselves.

Ratio of US violent to property crime, with higher ratios of violent crime in maroon
Ratio of US violent to property crime influenced by high average US cost of living
Ratio of US violent to property crime influenced by high US population growth

From the images above, we see the distribution of the ratio of US violent crime to property crime across two demographics: average US cost of living and population growth. By using a ratio, you can isolate states where one type of crime has more of an influence than the other (the ratio computation is sketched after the list below).

Specifically,

  1. We see a higher prevalence of violent crime around the outskirts of the US.
  2. We see 6 states that have a higher prevalence of violent crime and positive population growth.
  3. And we see states around the western region of the US having a higher prevalence of violent crime and higher than average cost of living.
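
For reference, the ratio itself is just a derived column computed before the Tableau import. A minimal sketch with illustrative column names:

```python
import pandas as pd

# Illustrative column names; the actual headers depend on the scraped CSV.
df = pd.read_csv("scraped_towns.csv")
df["violent_to_property_ratio"] = df["violent_crime_index"] / df["property_crime_index"]

# Aggregate to state level before handing the file to Tableau.
state_ratios = df.groupby("state", as_index=False)["violent_to_property_ratio"].mean()
state_ratios.to_csv("state_crime_ratios.csv", index=False)
```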
