Like GODZILLA rising to save Japan, GRANITESEC rises from dormancy (caused mostly by that sod @hrbrmstr) to fill your summer with food, fun & some other audience-appropriate word that begins with an “F” to complete the alliteration trifecta. Hit the Eventbrite link on the right to register and join in the festivities!
Well, the proverbial cat is definitely out of the bag now. I’m moving on from the current gig to take a security data scientist position at Verizon Enterprise. The esteemed Wade Baker will be my new benevolent overlord and it probably isn’t a shocker that I went to the place my [co-author](http://dds.ec/amzn) works.
Wade’s got an awesome team and I’m excited to start contributing. I’ll definitely miss my evil (and, not-so-evil) minions from the current-but-soon-to-be-former gig, but they’ll continue doing EPIC risk work and security analytics in my absence.
Also, I’m staying put in Maine (apart from what I suspect will be a boatload of travel), so fret not Seacoasters, many a night at 7th Settlement will continue to be had!
Thanks to a comment suggestion, the Rforecastio package is now up to version 1.3.0 and has a new parameter which lets you specify which time conversion function you want to use. Details are up on [github](https://github.com/hrbrmstr/Rforecastio).
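To get a feel for it, a call might look something like the sketch below; note that the parameter name used here (`time.formatter`) is an assumption for illustration only, so check the github docs for the actual argument name.

```r
library(Rforecastio)

# NEVER put API keys in revision control systems or source code!
fio.api.key <- readLines("~/.forecast.io")

# hypothetical call: hand your preferred time conversion function to the
# new parameter (the name `time.formatter` is assumed here, not confirmed)
fio.list <- fio.forecast(fio.api.key, "43.2673", "-70.8618",
                         time.formatter=as.POSIXlt)
```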
Not even going to put an `R` category on this since I don’t want to pollute R-bloggers with this tiny post, but I had to provide the option to let folks specify `ssl.verifypeer=FALSE` (so I made it a generic option to pass in any CURL parameters). I also had a couple gaping bugs that I missed due to not clearing out my environment before building & testing.
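As a rough sketch (assuming the CURL options simply ride along as extra arguments to `fio.forecast()`), that looks like:

```r
library(Rforecastio)

fio.api.key <- readLines("~/.forecast.io")

# sketch: turn off peer verification (e.g. behind an SSL-intercepting proxy);
# other CURL options should be passable the same way
fio.list <- fio.forecast(fio.api.key, "43.2673", "-70.8618",
                         ssl.verifypeer=FALSE)
```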
I’ve bumped up the version number of `Rforecastio` ([github](https://github.com/hrbrmstr/Rforecastio)) to `1.1.0`. The new features are:

- removing the SSL certificate bypass check (it doesn’t need it anymore)
- using `plyr` for easier conversion of JSON->data frame
- adding in a new `daily` forecast data frame
- roxygen2 inline documentation
```r
library(Rforecastio)
library(ggplot2)
library(plyr)

# NEVER put API keys in revision control systems or source code!
fio.api.key = readLines("~/.forecast.io")

my.latitude = "43.2673"
my.longitude = "-70.8618"

fio.list <- fio.forecast(fio.api.key, my.latitude, my.longitude)

fio.gg <- ggplot(data=fio.list$hourly.df, aes(x=time, y=temperature))
fio.gg <- fio.gg + labs(y="Readings", x="Time", title="Hourly Readings")
fio.gg <- fio.gg + geom_line(aes(y=humidity*100), color="green")
fio.gg <- fio.gg + geom_line(aes(y=temperature), color="red")
fio.gg <- fio.gg + geom_line(aes(y=dewPoint), color="blue")
fio.gg <- fio.gg + theme_bw()
fio.gg
```

```r
fio.gg <- ggplot(data=fio.list$daily.df, aes(x=time, y=temperature))
fio.gg <- fio.gg + labs(y="Readings", x="Time", title="Daily Readings")
fio.gg <- fio.gg + geom_line(aes(y=humidity*100), color="green")
fio.gg <- fio.gg + geom_line(aes(y=temperatureMax), color="red")
fio.gg <- fio.gg + geom_line(aes(y=temperatureMin), color="red", linetype=2)
fio.gg <- fio.gg + geom_line(aes(y=dewPoint), color="blue")
fio.gg <- fio.gg + theme_bw()
fio.gg
```

Over on the [Data Driven Security Blog](http://datadrivensecurity.info/blog/posts/2014/Apr/making-better-dns-txt-record-lookups-with-rcpp/) there’s a post on how to use `Rcpp` to interface with an external library (in this case `ldns` for DNS lookups). It builds on [another post](http://datadrivensecurity.info/blog/posts/2014/Apr/firewall-busting-asn-lookups/) which uses `system()` to make a call to `dig` to lookup DNS `TXT` records.
The core code is in both the aforementioned blog post and [this gist](https://gist.github.com/hrbrmstr/11286662). The post walks you through creating a simple interface, and a future post will cover how to build a full package interface to an external library.
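For context, the `system()`/`dig` approach from the earlier post boils down to something like the following sketch (the Team Cymru ASN-lookup hostname is just an illustrative target; the real post and gist use `Rcpp` + `ldns` instead):

```r
# rough sketch of the shell-out approach: requires `dig` on the PATH
dig_txt <- function(hostname) {
  # +short restricts the output to just the record data
  res <- system(sprintf("dig +short TXT %s", hostname), intern=TRUE)
  # dig wraps TXT strings in quotes; strip them off
  gsub('"', "", res)
}

# e.g. an ASN lookup via Team Cymru's DNS interface (reversed IP octets)
dig_txt("31.108.90.216.origin.asn.cymru.com")
```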
Andreas Diesner’s `#spiffy` [Fit2Tcx](https://github.com/adiesner/Fit2Tcx) command-line utility is a lightweight way to convert Garmin/ANT [FIT](http://www.thisisant.com/resources/fit) files to [TCX](http://en.wikipedia.org/wiki/Training_Center_XML) for further processing.
On a Linux system, installing it is as simple as:
```
sudo add-apt-repository ppa:andreas-diesner/garminplugin
sudo apt-get update
sudo apt-get install fit2tcx
```
On a Mac OS X system, you’ll first need to install the necessary `tinyxml` support library from `homebrew`:

```
brew install tinyxml
```
After a `git clone` of the Fit2Tcx repository, change the

```
DFLAGS += -s $(CREATE_LIB) $(CREATE_DEF)
```

line in `Makefile.in` to

```
DFLAGS += $(CREATE_LIB) $(CREATE_DEF)
```

and then do the typical `./configure && make` (there is no `test` target).
You’ll now have a relatively small `fit2tcx` binary that you can move to `/usr/local/bin` or wherever you like command-line utilities to be put.
You can also grab the [pre-compiled binary](http://rud.is/dl/fit2tcx.gz) (built on `OS X 10.9.2` with “latest” `Xcode`).
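Once you have a working binary, you can also shell out to it from R as part of a larger workflow; the positional input/output arguments below are an assumption, so check the Fit2Tcx github page for the actual usage:

```r
# hypothetical sketch: convert a FIT file to TCX from within R, assuming
# fit2tcx takes the input and output files as positional arguments
fit.file <- "activity.fit"
tcx.file <- "activity.tcx"
system(sprintf("fit2tcx %s %s", fit.file, tcx.file))
```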
I had no intention to blog this, but @jayjacobs convinced me otherwise. I was curious about the recent (end of March, 2014) [California earthquake](http://www.latimes.com/local/lanow/la-me-ln-an-estimated-17-million-people-felt-51-earthquake-in-california-20140331,0,2465821.story#axzz2xfGBteq0) “storm” and did a quick plot for “fun” and personal use using `ggmap`/`ggplot`.
I used data from the [Southern California Earthquake Center](http://www.data.scec.org/recent/recenteqs/Maps/Los_Angeles.html) (that I cleaned up a bit and that you can find [here](/dl/quakes.dat)) but would have used the USGS quake data if the site hadn’t been down when I tried to get it from there.
The code/process isn’t exactly rocket-science, but if you’re looking for a simple way to layer some data on a “real” map (vs handling shapefiles on your own) then this is a really compact/self-contained tutorial/example.
You can find the code & data over at [github](https://gist.github.com/hrbrmstr/9921419) as well.
There’s lots of ‘splainin in the comments (which are prbly easier to read on the github site) but drop a note in the comments or on Twitter if it needs any further explanation. The graphic is SVG, so use a proper browser :-) or run the code in R if you can’t see it here.
```r
library(ggplot2)
library(ggmap)
library(plyr)
library(grid)
library(gridExtra)

# read in cleaned up data
dat <- read.table("quakes.dat", header=TRUE, stringsAsFactors=FALSE)

# map decimal magnitudes into an integer range
dat$m <- cut(dat$MAG, c(0:10))

# convert to dates
dat$DATE <- as.Date(dat$DATE)

# so we can re-order the data frame
dat <- dat[order(dat$DATE),]

# not 100% necessary, but get just the numeric portion of the cut factor
dat$Magnitude <- factor(as.numeric(dat$m))

# sum up by date for the barplot
dat.sum <- count(dat, .(DATE, Magnitude))

# start the ggmap bit
# It's super-handy that it understands things like "Los Angeles" #spiffy
# I like the 'toner' version. Would also use a stamen map but I can't get
# to it consistently from behind a proxy server
la <- get_map(location="Los Angeles", zoom=10, color="bw", maptype="toner")

# get base map layer
gg <- ggmap(la)

# add points. Note that the plot will produce warnings for all points not in the
# lat/lon range of the base map layer. Also note that i'm encoding magnitude by
# size and color and using alpha for depth. because of the way the data is sorted
# the most recent quakes in the set should be on top
gg <- gg + geom_point(data=dat,
                      mapping=aes(x=LON, y=LAT, size=MAG, fill=m, alpha=DEPTH),
                      shape=21, color="black")

# this takes the magnitude domain and maps it to a better range of values (IMO)
gg <- gg + scale_size_continuous(range=c(1,15))

# this bit makes the right size color ramp. i like the reversed view better for this map
gg <- gg + scale_fill_manual(values=rev(terrain.colors(length(levels(dat$Magnitude)))))

gg <- gg + ggtitle("Recent Earthquakes in CA & NV")

# no need for a legend as the bars are pretty much the legend
gg <- gg + theme(legend.position="none")

# now for the bars. we work with the summarized data frame
gg.1 <- ggplot(dat.sum, aes(x=DATE, y=freq, group=Magnitude))

# normally, i dislike stacked bar charts, but this is one time i think they work well
gg.1 <- gg.1 + geom_bar(aes(fill=Magnitude), position="stack", stat="identity")

# fancy, schmanzy color mapping again
gg.1 <- gg.1 + scale_fill_manual(values=rev(terrain.colors(length(levels(dat$Magnitude)))))

# show the data source!
gg.1 <- gg.1 + labs(x="Data from: http://www.data.scec.org/recent/recenteqs/Maps/Los_Angeles.html",
                    y="Quake Count")

gg.1 <- gg.1 + theme_bw() #stopthegray

# use grid.arrange to make the sizes work well
grid.arrange(gg, gg.1, nrow=2, ncol=1, heights=c(3,1))
```