My company, Kwelia, is sitting on mountains of data, so I decided to try my hand at mapping. I have played around with JGR, but it's just too buggy, at least on my Mac, so I went looking for other alternatives and found a good write-up here. I decided on mapping prices per square foot for apartment rentals by zip code in the Bay Area, because we are launching our services there shortly.
First, transforming the data was easy using ddply from the plyr package, after I had queried all the right zip codes into a data frame from the database.
library("plyr")
w=ddply(sanfran, .(zip), summarise, pricepersqft=mean(price/sqft))
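For reference, sanfran here is just the query result; a hypothetical stand-in with the minimal structure the ddply call assumes (the column names are my inferences from the call above, not the actual schema):
##hypothetical example of the queried data frame: one row per listing
sanfran=data.frame(zip=c("94103", "94103", "94110"),
                   price=c(2500, 3100, 2200),  #monthly rent
                   sqft=c(800, 950, 700))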
Then it’s a matter of loading the shapefile, after downloading it from here.
library(maptools)
library(RColorBrewer)
library(classInt)
zip=readShapePoly("bayarea_zipcodes.shp")
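If you want to confirm which attribute columns the shapefile carries (the ZIP field is used throughout below), a quick check:
##inspect the attribute table of the loaded polygons
names(zip)
head(zip@data)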
The ddply call will sort the zip codes, so I transformed the zip spatial data into a regular data frame, merged it with "w", added pricepersqft to the ordered "zip" data, and finally subset out zips without data.
##transform the spatial attributes to a regular data frame
a=as.data.frame(zip)
##merge with the ddply data (the shapefile column is ZIP, ours is zip)
r=merge(a, w, by.x="ZIP", by.y="zip", all=TRUE)
##order zips in the spatial poly data to line up with the merge output
d=zip[order(zip$ZIP),]
##merge price per sqft with spatial data
d@data$pricepersqft=r$pricepersqft
##subset out zips with missing data
yy=d[!is.na(d$pricepersqft),]
Finally comes the plotting, which, luckily, is almost exactly the same as in the example I found above.
#select the color palette and the number of colors (price-per-sqft classes) to show on the map
colors=brewer.pal(9, "YlOrRd")
#set quantile breaks for the 9 colors
brks=classIntervals(yy$pricepersqft, n=9, style="quantile")
brks=brks$brks
#plot the map
plot(yy, col=colors[findInterval(yy$pricepersqft, brks, all.inside=TRUE)], axes=F)
#add a title
title("SF Bay Area Price Per SQFT for rentals by Zip")
#add a legend
legend(x=6298809, y=2350000, legend=leglabs(round(brks)), fill=colors, bty="n", x.intersp=.5, y.intersp=.5)
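A side note on the legend placement: the x and y values are in the shapefile's projected coordinate units, so they will need adjusting for a different shapefile; locator(1) can be used on the plotted map to click out suitable coordinates interactively.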
Here are the actual current averages of rental prices by zip:
QED
nice post, you could save a few lines:
library(maptools)
library(RColorBrewer)
library(classInt)
library("plyr")
sanfran <- read.csv("sanzipsprice.csv", header=T)
zip <- readShapePoly("bayarea_zipcodes.shp")
w <- ddply(sanfran, .(zip), summarise, pricepersqft=mean(pricepersqft))
# add price
zip$pricepersqft <- sanfran[match(zip$ZIP, sanfran$zip, nomatch=NA), 2]
zip <- zip[!is.na(zip$pricepersqft),]
#plot
spplot(zip, "pricepersqft",
       at=classIntervals(zip$pricepersqft, n=9, style="quantile")$brks,
       col.regions=brewer.pal(9, "YlOrRd"),
       main="SF Bay Area Price Per SQFT for rentals by zip",
       col="grey", lwd=0.5)
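(spplot comes from the sp package, which maptools loads; it computes the fill classes and draws the legend in a single call, which is where the saved lines come from.)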
Thanks! I always appreciate suggestions to improve my code.
Really nice post! Very instructive.
It is possible to manipulate the data frame of a SpatialPolygonsDataFrame (SPDF) directly (without having to extract the @data slot), so you can match zip codes straight from your spreadsheet. In my example I have also kept the zips with no data and added an item to the legend for them (so it looks more like the Bay Area!):
zc <- read.table( "sanzipsprice.txt" , h = T )
zip=readShapePoly( "bayarea_zipcodes.shp" )
zip$PSQFT <- zc$price[match(zip$ZIP,zc$zip , nomatch = NA )]
colors=brewer.pal(9, "YlOrRd")
brks=classIntervals(zip$PSQFT, n=9, style="quantile")$brks
cols <- colors[findInterval(zip$PSQFT, brks, all.inside=TRUE)]
cols[is.na(cols)] <- "#D9D9D9"
plot( zip , col = cols , axes=F )
title(paste ("SF Bay Area Price Per SQFT for rentals by Zip"))
legend(x=6298809, y=2350000, legend=c( "No Data" , leglabs( round(brks , 1 ) ) ), fill = c("#D9D9D9" , colors) , bty="n",x.intersp = 1.5, y.intersp = 1.5)
Thanks for the informative post.
Is it possible to write the name (or zip code) of each region on the map so as to make it easier to read?
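One way (an untested sketch, assuming the subset object yy from the post; sp's coordinates() returns a label point for each polygon):
##label each polygon at its centroid-ish label point
text(coordinates(yy), labels=as.character(yy$ZIP), cex=0.5)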
R can use Google Satellite maps as shown here. Is it possible to overlay this data onto a Google map?
library(dismo)
x <- geocode('Foster City, CA, USA')
#build an extent around the geocoded location, padded by half a degree
e <- extent(as.numeric(x[4:7]) + c(-0.5, 0.5, -0.5, 0.5))
g <- gmap(e, type="satellite")
plot(g)
Any ideas on how to overlay another shapefile onto this map? Say, of freeways or cities…
I will be doing this in an upcoming post.
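In the meantime, a rough, untested sketch of the general approach: read the polygons along with their projection info, then reproject them to the Mercator CRS that gmap() rasters use (this assumes the shapefile ships with a .prj file, and uses rgdal for readOGR and spTransform):
library(rgdal)
##read the polygons with their projection (readOGR picks up the .prj)
zipproj <- readOGR(".", "bayarea_zipcodes")
##reproject to the CRS of the Google map raster
zipmerc <- spTransform(zipproj, CRS(projection(g)))
plot(g)
plot(zipmerc, add=TRUE, border="yellow")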