Like GODZILLA rising to save Japan, GRANITESEC rises from dormancy (caused mostly by that sod @hrbrmstr) to fill your summer with food, fun & some other audience-appropriate word that begins with an “F” to complete the alliteration trifecta. Hit the Eventbrite link on the right to register and join in the festivities!
Well, the proverbial cat is definitely out of the bag now. I’m moving on from the current gig to take a security data scientist position at Verizon Enterprise. The esteemed Wade Baker will be my new benevolent overlord, and it probably isn’t a shocker that I went to the place where my co-author works.
Wade’s got an awesome team and I’m excited to start contributing. I’ll definitely miss my evil (and, not-so-evil) minions from the current-but-soon-to-be-former gig, but they’ll continue doing EPIC risk work and security analytics in my absence.
Also, I’m staying put in Maine (apart from what I suspect will be a boatload of travel), so fret not Seacoasters, many a night at 7th Settlement will continue to be had!
Thanks to a comment suggestion, the Rforecastio package is now up to version 1.3.0 and has a new parameter which lets you specify which time conversion function you want to use. Details are up on github.
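If you’re curious what that looks like in practice, here’s a minimal sketch. The parameter name (time.formatter) and its behavior are assumptions on my part, so check the GitHub README for the actual signature:

library(Rforecastio)

# NEVER put API keys in revision control systems or source code!
fio.api.key <- readLines("~/.forecast.io")

# hypothetical: hand the forecast call a different time conversion function
# (here, forcing the epoch seconds into UTC POSIXct instead of the default)
fio.list <- fio.forecast(fio.api.key, "43.2673", "-70.8618",
                         time.formatter=function(x) as.POSIXct(x, origin="1970-01-01", tz="UTC"))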
Not even going to put an R category on this since I don’t want to pollute R-bloggers with this tiny post, but I had to provide the option to let folks specify ssl.verifypeer=FALSE (so I made it a generic option to pass in any CURL parameters), and I had a couple gaping bugs that I missed due to not clearing out my environment before building & testing.
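For reference, a rough sketch of what that pass-through looks like from the caller’s side; it assumes the extra arguments get handed straight to the underlying RCurl call, so treat the exact form as illustrative:

library(Rforecastio)

fio.api.key <- readLines("~/.forecast.io")

# hypothetical use of the generic CURL pass-through to skip peer verification
# (only reach for this when you truly cannot fix the certificate situation)
fio.list <- fio.forecast(fio.api.key, "43.2673", "-70.8618",
                         ssl.verifypeer=FALSE)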
I’ve bumped up the version number of Rforecastio (github) to 1.1.0. The new release includes:
- removing the SSL certificate bypass check (it doesn’t need it anymore)
- using plyr for easier conversion of JSON->data frame
- adding in a new daily forecast data frame
- roxygen2 inline documentation
library(Rforecastio)
library(ggplot2)
library(plyr)

# NEVER put API keys in revision control systems or source code!
fio.api.key <- readLines("~/.forecast.io")

my.latitude <- "43.2673"
my.longitude <- "-70.8618"

fio.list <- fio.forecast(fio.api.key, my.latitude, my.longitude)

fio.gg <- ggplot(data=fio.list$hourly.df, aes(x=time, y=temperature))
fio.gg <- fio.gg + labs(y="Readings", x="Time", title="Hourly Readings")
fio.gg <- fio.gg + geom_line(aes(y=humidity*100), color="green")
fio.gg <- fio.gg + geom_line(aes(y=temperature), color="red")
fio.gg <- fio.gg + geom_line(aes(y=dewPoint), color="blue")
fio.gg <- fio.gg + theme_bw()
fio.gg
fio.gg <- ggplot(data=fio.list$daily.df, aes(x=time, y=temperature))
fio.gg <- fio.gg + labs(y="Readings", x="Time", title="Daily Readings")
fio.gg <- fio.gg + geom_line(aes(y=humidity*100), color="green")
fio.gg <- fio.gg + geom_line(aes(y=temperatureMax), color="red")
fio.gg <- fio.gg + geom_line(aes(y=temperatureMin), color="red", linetype=2)
fio.gg <- fio.gg + geom_line(aes(y=dewPoint), color="blue")
fio.gg <- fio.gg + theme_bw()
fio.gg
Over on the Data Driven Security Blog there’s a post on how to use Rcpp to interface with an external library (in this case ldns for DNS lookups). It builds on another post which uses system() to make a call to dig to look up DNS records.
The core code is below and at both the aforementioned blog post and this gist. The post walks you through creating a simple interface, and a future post will cover how to build a full package interface to an external library.
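To give a feel for the approach without reproducing the post’s code, here’s a sketch of wrapping ldns with Rcpp. It assumes ldns is installed where the compiler and linker can find it, and it only does a plain A-record lookup; the wrapper itself is illustrative, not the post’s actual function:

library(Rcpp)

# tell the compiler/linker to link against ldns when building the inline C++
Sys.setenv("PKG_LIBS"="-lldns")

cppFunction(includes=c("#include <cstdlib>", "#include <ldns/ldns.h>"), code='
CharacterVector dnsLookup(std::string hostname) {

  // build a resolver from the system /etc/resolv.conf
  ldns_resolver *res = NULL;
  if (ldns_resolver_new_frm_file(&res, NULL) != LDNS_STATUS_OK)
    return CharacterVector();

  ldns_rdf *domain = ldns_dname_new_frm_str(hostname.c_str());

  // simple A record query, recursion desired
  ldns_pkt *pkt = ldns_resolver_query(res, domain, LDNS_RR_TYPE_A,
                                      LDNS_RR_CLASS_IN, LDNS_RD);
  ldns_rdf_deep_free(domain);

  CharacterVector out;

  if (pkt != NULL) {
    ldns_rr_list *answers = ldns_pkt_rr_list_by_type(pkt, LDNS_RR_TYPE_A,
                                                     LDNS_SECTION_ANSWER);
    if (answers != NULL) {
      for (size_t i = 0; i < ldns_rr_list_rr_count(answers); i++) {
        char *rr_str = ldns_rr2str(ldns_rr_list_rr(answers, i));
        out.push_back(std::string(rr_str));
        free(rr_str);
      }
      ldns_rr_list_deep_free(answers);
    }
    ldns_pkt_free(pkt);
  }

  ldns_resolver_deep_free(res);
  return out;
}
')

dnsLookup("dds.ec")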
On a Linux system, installing fit2tcx is as simple as:
sudo add-apt-repository ppa:andreas-diesner/garminplugin
sudo apt-get update
sudo apt-get install fit2tcx
On a Mac OS X system, you’ll need to first grab the tinyxml package from Homebrew to install the necessary support library:

brew install tinyxml
Then do a git clone of the Fit2Tcx repository, change this line in the Makefile:

DFLAGS += -s $(CREATE_LIB) $(CREATE_DEF)

to:

DFLAGS += $(CREATE_LIB) $(CREATE_DEF)

and then do the typical ./configure && make (there is no install step).
You’ll now have a relatively small fit2tcx binary that you can move to /usr/local/bin or wherever you like command-line utilities to be put.
You can also grab the pre-compiled binary (built on OS X 10.9.2 with the “latest” tinyxml from Homebrew).
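If you want to pull the converted data into R, something like the following works as a starting point. Note the assumptions: that fit2tcx emits TCX on stdout when handed a .fit file (check the tool’s usage output for the real invocation) and that you have an activity.fit file on hand:

library(XML)

# ASSUMPTION: fit2tcx converts a FIT file and writes TCX to stdout;
# adjust the call to match the binary's actual usage
system2("fit2tcx", args="activity.fit", stdout="activity.tcx")

# TCX is just XML, so poke at the trackpoints once it's converted
doc <- xmlParse("activity.tcx")
trackpoints <- getNodeSet(doc, "//*[local-name()='Trackpoint']")
length(trackpoints)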
I had no intention to blog this, but @jayjacobs convinced me otherwise. I was curious about the recent (end of March, 2014) California earthquake “storm” and did a quick plot for “fun” and personal use using R (ggplot2 + ggmap).
I used data from the Southern California Earthquake Center (that I cleaned up a bit and that you can find here) but would have used the USGS quake data if the site hadn’t been down when I tried to get it from there.
The code/process isn’t exactly rocket-science, but if you’re looking for a simple way to layer some data on a “real” map (vs handling shapefiles on your own) then this is a really compact/self-contained tutorial/example.
You can find the code & data over at github as well.
There’s lots of ‘splainin in the comments (which are prbly easier to read on the github site) but drop a note in the comments or on Twitter if it needs any further explanation. The graphic is SVG, so use a proper browser :-) or run the code in R if you can’t see it here.
library(ggplot2)
library(ggmap)
library(plyr)
library(grid)
library(gridExtra)

# read in cleaned up data
dat <- read.table("quakes.dat", header=TRUE, stringsAsFactors=FALSE)

# map decimal magnitudes into an integer range
dat$m <- cut(dat$MAG, c(0:10))

# convert to dates
dat$DATE <- as.Date(dat$DATE)

# so we can re-order the data frame
dat <- dat[order(dat$DATE),]

# not 100% necessary, but get just the numeric portion of the cut factor
dat$Magnitude <- factor(as.numeric(dat$m))

# sum up by date for the barplot
dat.sum <- count(dat, .(DATE, Magnitude))

# start the ggmap bit
# It's super-handy that it understands things like "Los Angeles" #spoffy
# I like the 'toner' version. Would also use a stamen map but I can't get
# to it consistently from behind a proxy server
la <- get_map(location="Los Angeles", zoom=10, color="bw", maptype="toner")

# get base map layer
gg <- ggmap(la)

# add points. Note that the plot will produce warnings for all points not in the
# lat/lon range of the base map layer. Also note that i'm encoding magnitude by
# size and color and using alpha for depth. because of the way the data is sorted
# the most recent quakes in the set should be on top
gg <- gg + geom_point(data=dat,
                      mapping=aes(x=LON, y=LAT, size=MAG, fill=m, alpha=DEPTH),
                      shape=21, color="black")

# this takes the magnitude domain and maps it to a better range of values (IMO)
gg <- gg + scale_size_continuous(range=c(1,15))

# this bit makes the right size color ramp. i like the reversed view better for this map
gg <- gg + scale_fill_manual(values=rev(terrain.colors(length(levels(dat$Magnitude)))))

gg <- gg + ggtitle("Recent Earthquakes in CA & NV")

# no need for a legend as the bars are pretty much the legend
gg <- gg + theme(legend.position="none")

# now for the bars. we work with the summarized data frame
gg.1 <- ggplot(dat.sum, aes(x=DATE, y=freq, group=Magnitude))

# normally, i dislike stacked bar charts, but this is one time i think they work well
gg.1 <- gg.1 + geom_bar(aes(fill=Magnitude), position="stack", stat="identity")

# fancy, schmanzy color mapping again
gg.1 <- gg.1 + scale_fill_manual(values=rev(terrain.colors(length(levels(dat$Magnitude)))))

# show the data source!
gg.1 <- gg.1 + labs(x="Data from: http://www.data.scec.org/recent/recenteqs/Maps/Los_Angeles.html",
                    y="Quake Count")

gg.1 <- gg.1 + theme_bw() #stopthegray

# use grid.arrange to make the sizes work well
grid.arrange(gg, gg.1, nrow=2, ncol=1, heights=c(3,1))
I used my trusty “Google-Drive-spreadsheet-IMPORTHTML-to-CSV” workflow (you can access the automagically updated data here) to make the CSV that updates daily on the site and is referenced by the Shiny/R code.
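For the curious, the workflow is roughly: a Google Sheet scrapes the source page with IMPORTHTML, the sheet gets published to the web as CSV, and the R/Shiny code reads that CSV. A minimal sketch (the source URL, table index, and published-CSV URL below are placeholders, not the real ones):

# in a Google Sheet cell, scrape the table from the source page
# (URL and table index here are placeholders)
# =IMPORTHTML("http://example.com/page-with-table", "table", 1)

# then File > Publish to the web > CSV, and read it from R/Shiny:
dat <- read.csv("https://docs.google.com/spreadsheets/d/KEY/pub?output=csv",
                stringsAsFactors=FALSE)
head(dat)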
The code has been gist-ified, and I’ll be re-visiting it to refactor the data.frame creation bits and add some more charts as the data set gets larger.
I shot a quick post over at the Data Driven Security blog explaining how to separate Twitter data gathering from R code via the Ruby t (github repo) command. Using t frees R code from having to be a Twitter processor and lets the analyst focus on analysis and visualization, plus you can use t as a substitute for Twitter GUIs if you’d rather play at the command-line:
$ t timeline ddsecblog
   @DDSecBlog
   Monitoring Credential Dumps Plus Using Twitter As a Data Source http://t.co/ThYbjRI9Za

   @DDSecBlog
   Nice intro to R + stats // Data Analysis and Statistical Inference free @datacamp_com course http://t.co/FC44FF9DSp

   @DDSecBlog
   Very accessible paper & cool approach to detection // Nazca: Detecting Malware Distribution in Large-Scale Networks http://t.co/fqrSaFvUK2

   @DDSecBlog
   Start of a new series by new contributing blogger @spttnnh! // @AlienVault rep db Longitudinal Study Part 1 : http://t.co/XM7m4zP0tr

   ...
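The idea, in miniature: let t do the gathering and hand R a flat file. The sketch below assumes t’s --csv output flag and its “Posted at” column header; lean on the t README and the DDSec post for the real incantations:

# gather at the shell (assumed invocation; see the t README):
#   t timeline ddsecblog --csv > ddsecblog.csv

library(plyr)
library(ggplot2)

# read what t wrote so the R side never has to be a Twitter processor
tweets <- read.csv("ddsecblog.csv", stringsAsFactors=FALSE)

# the column name below assumes t's "Posted at" CSV header
tweets$day <- as.Date(tweets$Posted.at)

# quick look at tweet volume over time
gg <- ggplot(count(tweets, "day"), aes(x=day, y=freq))
gg <- gg + geom_bar(stat="identity")
gg <- gg + theme_bw()
gg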
The DDSec post shows how to mine the well-formatted output from the @dumpmon Twitter bot to visualize dump trends over time, and it has the code in-line and over at the DDSec github repo [R].