
Author Archives: hrbrmstr

Don't look at me…I do what he does — just slower. #rstats avuncular • Resistance Fighter • Cook • Christian • [Master] Chef des Données de Sécurité @ @rapid7

At the end of March, I caught a fleeting tweet that showcased an Epson thermal receipt printer generating a new “ticket” whenever a new GitHub issue was filed on a repository. @aschmelyun documents it well in this blog post. It’s a pretty cool hack, self-contained on a Pi Zero.

Andrew’s project birthed an idea: could I write an R package that would let me plot {ggplot2}/{grid} objects to it? The form factor of the receipt printer is tiny (~280 “pixels” wide), but the near-infinite length of the paper means one can play with data visualizations that cannot be done in other formats (and it would be cool to be able to print other content to it from inside and outside of R).

One of the features that makes Andrew’s hack extra cool is that he used an Epson receipt printer model that was USB connected. I don’t see the need to dedicate an extra piece of plastic, metal, and silicon to manage the printing experience, especially since I already have a big Linux server where I run personal, large-scale data science jobs. I ended up getting a used (lots of restaurants close down each week) Epson TM-T88V off of eBay, since it has Ethernet and supports ESC/POS commands.

After unpacking it, I needed to get it on the local network. There are many guides out there for this, but this one sums up the process pretty well:

  • Plug the printer in and reset it
  • Hook up a system directly to it (Ethernet to Ethernet)
  • Configure your system to use the Epson default IP addressing scheme
  • Access the web setup page
  • Configure it to work on your network
  • Disconnect and restart the printer

To make sure everything worked, I grabbed one of the (weirdly) many projects on GitHub that provide a means of converting graphics files to an ESC/POS-compatible raster bitmap, ran a simple R-generated PNG through it, then used netcat to shunt the binary blob over to the printer on the default port of 9100.
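That same smoke test can also be driven from R instead of netcat. A minimal sketch, assuming you already have a converted ESC/POS raster file on disk (the file name and printer address below are placeholders):

```r
# Read a pre-converted ESC/POS raster file into a raw vector, then shunt
# the bytes to the printer over TCP port 9100 (the raw/JetDirect port).
read_escpos <- function(path) {
  readBin(path, what = "raw", n = file.size(path))
}

send_escpos <- function(path, host, port = 9100L) {
  con <- socketConnection(host = host, port = port, open = "a+b")
  on.exit(close(con))
  writeBin(read_escpos(path), con, useBytes = TRUE)
}

# send_escpos("plot.escpos", host = "192.168.1.50")
```

This is essentially the same byte-shunting the package ends up doing internally.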

I did some initial experiments with {magick}, pulling the graphics bits out of generated plots and then wrapping some R code around the conversion. It was clunky and tedious, and I knew there had to be a better way, so I hunted for some C/C++, Rust, or Go code that already did the conversion and found png2escpos by The Working Group. However, I’ve since switched to png2pos by Petr Kutalek, as its dithering means R users won’t have to produce only black-and-white plots for them to look good.

I thought about implementing a graphics device to support any R graphics output, but there are enough methods to convert a base R plot to a grid/grob object that I decided to mimic the functionality of ggsave() and make a ggpos() function. The comment annotations in the code snippet below walk you through the extremely basic process:

ggpos <- function(plot = ggplot2::last_plot(),
                  host_pos,
                  port = 9100L,
                  scale = 2,
                  width = 280,
                  height = 280,
                  units = "px",
                  dpi = 144,
                  bg = "white",
                  ...) {

  # we generate a png file using ggsave()

  png_file <- tempfile(fileext = ".png")

  ggplot2::ggsave(
    filename = png_file,
    plot = plot,
    scale = scale,
    width = width,
    height = height,
    units = units,
    dpi = dpi,
    bg = bg,
    ...
  )

  # we call an internal C function to convert the generated png file to an ESC/POS raster bitmap file

  res <- png_to_raster(png_file)

  if (res != "") { # if the conversion ended up generating a new file

    # read in the raw bytes

    escpos_raster <- stringi::stri_read_raw(res)

    # open up a binary socket to the printer 

    socketConnection(
      host = host_pos,
      port = port,
      open = "a+b"
    ) -> con

    on.exit(close(con))

    # shunt all the bytes over to it

    writeBin(
      object = escpos_raster,
      con = con,
      useBytes = TRUE
    )

  }

  invisible(res)

}

The only work I needed to do on the original C code was to have it output directly to a file instead of stdout.

Now, plotting to the printer is as straightforward as:

library(ggplot2)
library(hrbrthemes) # provides theme_ipsum_es()
library(escpos)

ggplot(mtcars) +
  geom_point(
    aes(wt, mpg)
  ) +
  labs(
    title = "Test of {ggpos}"
  ) +
  theme_ipsum_es(grid="XY") +
  theme(
    panel.grid.major.x = element_line(color = "black"),
    panel.grid.major.y = element_line(color = "black")
  ) -> gg

ggpos(gg, host_pos = HOSTNAME_OR_IP_ADDRESS_OF_YOUR_PRINTER)

That code produces this output (I’m still getting the hang of ripping the spooled paper off this thing):

ggplot receipt

This is the whole thing in action:

One of the 2022 #30DayChartChallenge topics was “part-to-whole”, so I rejiggered my treemap entry into a very long plot that would make CVS cashiers feel quite inferior.

You can find {escpos} over on GitHub.

FIN

One big caveat for this is that these printers have a tiny memory buffer, so very long, complex plots aren’t going to work out of the box. I had to break up my faceted heatmaps into individual ones and shunt them over one-by-one.
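One way to do that split is to chop the data by the facet variable and send one small plot per group. A sketch (the `split_by_facet()` helper and the commented loop are illustrative, not part of the package; the host address is a placeholder):

```r
# Split a data frame into per-facet chunks so each resulting plot stays
# small enough for the printer's buffer.
split_by_facet <- function(df, facet_col) {
  split(df, df[[facet_col]])
}

# for (chunk in split_by_facet(mtcars, "cyl")) {
#   gg <- ggplot2::ggplot(chunk, ggplot2::aes(wt, mpg)) + ggplot2::geom_point()
#   ggpos(gg, host_pos = "192.168.1.50")
# }
```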

I’ll be switching over to the new C library soon, and adding a small DSL to handle text formatting and printing from R (the device has 2 fonts and almost no styles). I’ve even threatened to make a ShinyPOS application, but we’ll see how the motivation for that goes.

Kick the tyres and let me know if you end up using the package (+ share your creation to the 🌎).

RStudio’s macOS Electron build is coming along quite nicely and is blazing fast on Apple Silicon.

I like to install the dailies, well, daily; and, of late, RStudio and Quarto are joined at the hip. As a result, I regularly found myself having to manually update the Quarto CLI right after updating RStudio, so I made a small zsh script that uses the new RStudio dailies JSON API to grab the latest daily, and the GitHub JSON API to grab the latest Quarto release, then installs them both.

Caveats and limitations are in the repo.

via Zack Whittaker on Twitter…

Download Tor Browser: http://torproject.org/download

Open this URL in Tor to access BBC (🇺🇦): https://www.bbcweb3hytmzhn5d532owbu6oqadra5z3ar726vq5kgwwn6aucdccrad.onion/ukrainian

Open this URL in Tor to access BBC (🇷🇺): https://www.bbcweb3hytmzhn5d532owbu6oqadra5z3ar726vq5kgwwn6aucdccrad.onion/russian

Ukrainian President Volodymyr Zelenskyy addresses the nation on a live TV broadcast in Kyiv, Ukraine, on Feb. 22, 2022.

Ukrainian Presidential Press Office via AP

Here is the full transcript of Ukrainian President Volodymyr Zelenskyy’s video address early on Feb. 24 local time warning that a Russian invasion could cause tens of thousands of deaths.

“Today I initiated a phone call with the president of the Russian federation. The result was silence. Though the silence should be in Donbass. That’s why I want to address today the people of Russia. I am addressing you not as a president, I am addressing you as a citizen of Ukraine. More than 2,000 km of the common border is dividing us. Along this border your troops are stationed, almost 200,000 soldiers, thousands of military vehicles. Your leaders approved them to make a step forward, to the territory of another country. And this step can be the beginning of a big war on European continent.

We know for sure that we don’t need the war. Not a Cold War, not a hot war. Not a hybrid one. But if we’ll be attacked by the [enemy] troops, if they try to take our country away from us, our freedom, our lives, the lives of our children, we will defend ourselves. Not attack, but defend ourselves. And when you will be attacking us, you will see our faces, not our backs, but our faces.

The war is a big disaster, and this disaster has a high price. With every meaning of this word. People lose money, reputation, quality of life, they lose freedom. But the main thing is that people lose their loved ones, they lose themselves.

They told you that Ukraine is posing a threat to Russia. It was not the case in the past, not in the present, it’s not going to be in the future. You are demanding security guarantees from NATO, but we also demand security guarantees. Security for Ukraine from you, from Russia and other guarantees of the Budapest memorandum.

But our main goal is peace in Ukraine and the safety of our people, Ukrainians. For that we are ready to have talks with anybody, including you, in any format, on any platform. The war will deprive [security] guarantees from everybody — nobody will have guarantees of security anymore. Who will suffer the most from it? The people. Who doesn’t want it the most? The people! Who can stop it? The people. But are there those people among you? I am sure.

I know that they [the Russian state] won’t show my address on Russian TV, but Russian people have to see it. They need to know the truth, and the truth is that it is time to stop now, before it is too late. And if the Russian leaders don’t want to sit with us behind the table for the sake of peace, maybe they will sit behind the table with you. Do Russians want the war? I would like to know the answer. But the answer depends only on you, citizens of the Russian Federation.”

If you’ve been following my inane tweets and non-technical blog posts for any length of time since 2015, you likely know the 2016 election cycle broke me more than just a tad, with each subsequent month of the Trump presidency adding a bit more breakage. My brain is constantly trying to make sense of the systems of the world, from the micro (small personal/home things) to the macro (global-scale things). There’s a Marvel character (no, this isn’t about “Cap”), Karnak, whose chief ability is that he can see the flaws in all things, and it’s the closest analogy I can make to how deep down the rabbit hole my brain goes with this global-systems analysis. There’s always been a deep-seated need to grasp the “why” and “how” of any “what” (which, combined with being adept with silicon-laced glowing rectangles, explains the gravitation towards cybersecurity, though all my research scientist mates out there have that same Columbo-esque desire to get to the bottom of things).

I really thought I knew the histories and trajectories of a decent percentage of the “what”s in these world systems, believing that a slew of modern critical events, like Obama’s two-term presidency (to point to just one), were clear signs of the progress society had been making, despite the laundry list of overt divisions and inequities that remain. Even though we’ve lived in a rural Maine town for many years, I was blindsided by the massive public support and normalization of hate, largely based on fear. For some reason, it was easy to dismiss partisan games in Congress as just the way things get done in a suboptimal system. It was too easy to compartmentalize the fact that supposedly decent folks, like my in-laws, hung on every word influencers like Rush Limbaugh and FOX News hosts spewed, thinking that it was just a fringe element feeding off of such tainted information flows.

When signs of the then-impending pandemic first emerged, I naively thought it was going to be a catalyst for positive change. I thought even someone as narcissistic as Trump and his minions would see the need to unite folks under a banner of helping to ensure we protected as many people from the ravages of Covid as possible, and lead a coordinated, global effort to create and distribute treatments and vaccines as quickly as possible. I believed I knew how solid our CDC was, and saw so many talented scientists use their skills to model and explain various outcome paths based on how we approached the handling of the virus. I knew Bush had helped orchestrate an initial modern pandemic playbook, that Obama built upon it, and that it was actually quite good.

Then I saw that we, collectively, just don’t care if scores of people are sickened and/or die. I heard so-called leaders say that the economy is more important than human life; heard entitled citizens insist that wearing a piece of cloth or paper over your mouth and nose was too much of a sacrifice to make; read countless stories of even so-called faith leaders declaring that refraining from large indoor gatherings for a while, and periodically, to help ensure we didn’t overwhelm our emergency medical systems and crush the healthcare workers in them was Nazi-like oppression. And I saw the last leader of the free world (since we’ve now permanently ceded that position to random agents of chaos) actively downplay and subvert the crisis, leading millions to follow his lead, ultimately resulting in the impending 1 million needlessly lost lives.

When those signals emerged in March of 2020, the break got a bit worse (picture one of those window or lake-ice cracks that spider out with each additional vibration), as it did with the drumbeat of terrible events of 2020 (of which there were many).

Like I suspect was the case with many readers (assuming there are many readers), I plain-up cried (the good kind) when Biden officially won the 2020 election. I foolishly thought, like so many others, that the sinking ship was at the start of being righted, and we’d be on a slow path towards sailing again.

Then, January 6th, 2021 happened. Since then, I’ve seen state after state vie for the “Most Failed State” top spot. I’ve seen faith leaders and communities give their all to see who can be the worst possible version of themselves. And I’ve seen even the most stalwart among us declare the pandemic over because they’ve no stamina left to put any effort into caring for or about the least of us and those who provide medical care to our communities.

Talk about being broken.

We have this term in cybersecurity called “fuzzing”. It’s a technique where you send inputs into an application that it is not really designed to handle (e.g., imagine sending the entirety of Webster’s dictionary to a simple date field), doing this repeatedly to see if you can get the application to crash, change expected behavior, or end up in a state where you can compromise it. The events of 2015 through this very day feel like one massive fuzz against all the clear-thinking, decent members of society; and my human operating system just plain crashed.

In the spirit of “I can do this all day”, I may have been/be broken, but was/am not content to remain that way.

  • I’ve read more tomes than you would possibly believe if I were to list them out.

  • I’ve listened to so many podcasts that I was expecting Apple’s Health app to counsel me to, perhaps, shut off all audio devices for a month or two.

  • I’ve filled my RSS reader with feeds from exceptionally gifted humans who, too, have been trying to make sense of what has happened and where we are going.

  • (I’ve also prayed, walked, rode (bike), de-screened, socmed sabbaticaled, read more fiction than ever before, and intensified healthy cooking/eating to try to balance out all the bad inputs.)

I’ve done all this because I feel compelled to not only understand (I actually need to understand), but also help fix this situation we’re in. Selfishly, a large part of that is the desire to leave a better world behind for my kids and our new grandson.

Of late, I’ve seen most of my input sources devolve into the same thing: chronicling the end of America as most modern folks know it. They’ve gone from working to make sense of why/how we got here and what can be done about it to doing the same thing we all pretty much did during 2016-2019: shaking our heads at every bad news item and noting how bat guano crazy the individual behind the bad news was. Not exactly hope-filling. In fact, I could sum things up with two lines from Matchbox 20’s “Back 2 Good”:


“And everyone here’s to blame
And everyone here gets caught up in the pleasure of the pain”

A recent entry into the aforementioned tomes was Jeremy W. Peters’ book “Insurgency: How Republicans Lost Their Party and Got Everything They Ever Wanted”. I’ve been a bit choosier in which “Jan 6” analysis tomes I toss coin at, and was dismayed that yet another reporter was releasing a book, but I listened to the little voice and dropped an Audible credit on it, and it has been a literal Godsend.

A big reason for remaining broken is that there were many missing (key) system components. You can’t identify the failure modes without seeing the complete system, and Jeremy managed to fill in (most of) those gaps. He did an amazing job going back far enough, and walking through the event trees painstakingly enough, that I could actually feel the puzzle pieces fitting into place. Where there were once clouds, there is now clear sky. Items with chasms between them now have bridges.

Having the systems functionally and nearly fully documented has been immensely therapeutic. It’s astonishing to realize just how many personal mental processing cores had been dedicated to this problem. It’s also all kinds of amazing to have some of those cognitive faculties back to do things like code for fun, again.

Since this is not a book review (nor a book itself), I won’t go into each and every component that was made clear. That’s not really the point of this post.

I guess the first point is that if 2015-2022 also broke you in some way, realize you’re not alone. I don’t think anyone was fully (or even partially) prepared for what we all ended up enduring and continue to endure. Hopefully knowing that you’re normal, and that we broken folk are legion, will help quell at least that part of being broken.

The second point is that there was a rhyme and reason to how we got to where we are now. It is, perhaps, more of a crass limerick than poetic rhyme, and the reasons aren’t great, but events weren’t random and they did not emerge from nowhere.

The third and last point is that knowing there are “why”s and “how”s to the “what”s means it is possible to work on forging compensating controls (i.e., there are concrete actions we can take to make things better and set up hedges to prevent us from heading down similar chaotic paths). We’re still not on a great collective path forward, and there’s no magic wand we can wave to make things better. But we all can make individual and incremental progress in our own ways. For some, like me, it may mean breaking out of some comfort zones to do things you would not normally do. For others, it may be applying aligned talents to triaged areas, doing what you can to make even the smallest thing a tiny bit better. We’re not going to A-bomb our way out of this conflict. It’s going to take a long period of incremental, positive change.

If you’re still working on figuring out what went awry, I highly recommend Jeremy’s book. You can also reach out if you need some personal reassurance that all is not, in fact, lost. Unlike the hopeless ending of the aforenoted Matchbox 20 song, I do, in fact, believe there is a way of “getting back to good” and, for me, that journey starts now.

I close with a heartfelt thank you for the patience and kindness many folks have shown and expressed over this period. You’ve done more than you can possibly know.

After a Twitter convo about weather stations I picked up a WeatherFlow Tempest. Setup was quick, but the sensor package died within 24 hours. I was going to give up on it but I had written an R package (for the REST API & UDP broadcast interfaces) and C++ utility (for just the UDP broadcast interface), and the support staff were both friendly and competent and sent me a replacement super quick.

I’ve blathered about the R package already (on Twitter) so am not going to tag that here, but will link to a few repositories (in various languages) that receive the UDP broadcast messages and at least shove them to stdout.

The C++ one is mostly C but gets the job done (it just posts the messages to stdout). It should run everywhere but I only tested on macOS & Linux, because Windows is a terrible operating system nobody should use.

The Golang one has some structured types to consume about half of the JSON messages (I’ve only seen four in the broadcasts so far, and will add more as I see new ones). It’s only more verbose than the C++ one due to the various record type handling. This should run everywhere, though.

For kicks, I threw together a Swift one that is really just Swift-ified C and is a Frankenstein monster that likely shouldn’t be used. (I’ll be making a SwiftUI macOS/iOS/iPadOS app for the UDP broadcast messages, though, soon).

To round out my obsession I also made a Rust version which I’m just in 💙 with (not because of any skill of my own). It’s the smallest source file and is pretty elegant (100% due to Rust, and, again, not me).

All the code/projects are super small, but the Rust source is so tiny that it won’t be too intrusive to post here:

use std::net::UdpSocket;

fn main() -> std::io::Result<()> {

  let mut buf = [0; 1024]; // 1024 byte buffer is plenty
  let s = UdpSocket::bind("0.0.0.0:50222").expect(r#"{"message":"Could not bind to address/port."}"#);

  loop {

    let (n, _) = s.recv_from(&mut buf).expect(r#"{"message":"No broadcasts received."}"#);

    println!("{}", String::from_utf8(buf[..n].to_vec()).unwrap())

  }

}

FIN

If you’re interested in a low-cost weather station with great DIY programming support, I’d definitely (so far, at least) recommend the Tempest. We’ll see if it survives the forthcoming snowpocalypse.

These are the JSON messages it slings over UDP:

{"serial_number":"HB-00069665","type":"hub_status","firmware_revision":"177","uptime":728643,"rssi":-50,"timestamp":1643246011,"reset_flags":"BOR,PIN,POR","seq":72787,"radio_stats":[25,1,0,3,16637],"mqtt_stats":[10,108]}
{"serial_number":"ST-00055227","type":"rapid_wind","hub_sn":"HB-00069665","ob":[1643246013,0.00,0]}
{"serial_number":"ST-00055227","type":"rapid_wind","hub_sn":"HB-00069665","ob":[1643246015,0.00,0]}
{"serial_number":"ST-00055227","type":"device_status","hub_sn":"HB-00069665","timestamp":1643246016,"uptime":106625,"voltage":2.683,"firmware_revision":165,"rssi":-72,"hub_rssi":-66,"sensor_status":655364,"debug":0}
{"serial_number":"ST-00055227","type":"obs_st","hub_sn":"HB-00069665","obs":[[1643246016,0.00,0.00,0.00,0,3,1024.56,-12.82,47.84,0,0.00,0,0.000000,0,0,0,2.683,1]],"firmware_revision":165}
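For a sense of how easy these are to consume, here’s one of the rapid_wind messages pulled apart in R (a sketch assuming {jsonlite} is available; per the Tempest UDP docs, the ob array holds the epoch timestamp, wind speed in m/s, and wind direction in degrees):

```r
msg <- '{"serial_number":"ST-00055227","type":"rapid_wind","hub_sn":"HB-00069665","ob":[1643246013,0.00,0]}'

# fromJSON() turns the message into a named list; "ob" comes back as a
# numeric vector of (epoch, wind speed m/s, wind direction degrees)
x <- jsonlite::fromJSON(msg)

x$type  # "rapid_wind"
x$ob    # 1643246013 0 0
```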

The New York Times had a [tragic] story on Covid deaths today and one of their plots really stuck with me for how well it told that part of the story.

NYT Chart on Covid Deaths

NOTE: The red panel highlights are off a bit as I manually typed the data in (I only did the recreation to keep {ggplot2} muscle memory, as I hadn’t done a major customization like this in quite some time).

Only one {grid} hack (for the faceted X axis labels) too!

Hopefully, I’ll have more real-world opportunity to build some detailed, properly-annotated {ggplot2} plots this year.

Shout out to @ClausWilke for {ggtext} and all the folks who’ve made {ggplot2} such a powerful data visualization tool.

library(grid)
library(gtable)
library(hrbrthemes)
library(tidyverse)

gtable_filter_remove <- function (x, name, trim = FALSE) {
  # https://stackoverflow.com/a/36780639
  matches <- !(x$layout$name %in% name)
  x$layout <- x$layout[matches, , drop = FALSE]
  x$grobs <- x$grobs[matches]
  if (trim) 
    x <- gtable_trim(x)
  x
}

read.csv(text="race,age_group,before,after,cause
White,Under 25,1,3,Covid-19 deaths increased as a share of deaths from all cause
White,25-44,3,10,Covid-19 deaths increased as a share of deaths from all cause
White,45-64,8,15,Covid-19 deaths increased as a share of deaths from all cause
White,65-84,13,11,NA
White,85+,14,6,NA
Hispanic,Under 25,3,4,Covid-19 deaths increased as a share of deaths from all cause
Hispanic,25-44,17,21,Covid-19 deaths increased as a share of deaths from all cause
Hispanic,45-64,33,26,NA
Hispanic,65-84,33,17,NA
Hispanic,85+,21,9,NA
Black,Under 25,1,3,Covid-19 deaths increased as a share of deaths from all cause
Black,25-44,7,13,Covid-19 deaths increased as a share of deaths from all cause
Black,45-64,15,17,Covid-19 deaths increased as a share of deaths from all cause
Black,65-84,20,12,Covid-19 deaths increased as a share of deaths from all cause
Black,85+,17,8,NA
Asian,Under 25,2,4,Covid-19 deaths increased as a share of deaths from all cause
Asian,25-44,12,14,Covid-19 deaths increased as a share of deaths from all cause
Asian,45-64,21,13,NA
Asian,65-84,23,8,NA
Asian,85+,17,4,NA") -> xdf

xdf %>% 
  mutate(
    before = before/100,
    after = after/100,
    age_group = fct_inorder(age_group),
    race = factor(race, levels = rev(c("Asian", "Black", "Hispanic", "White")))
  ) -> xdf

{

  ggplot( data = xdf) +
    geom_rect(
      data = xdf,
      aes(
        xmin = -Inf, xmax = Inf, ymin = -Inf, ymax = Inf,
        fill = cause
      ),
      alpha = 1/6, color = NA
    ) +
    geom_rect(
      data = xdf %>% 
        filter(
          (race == "White" & age_group %in% c("65-84", "85+")) |
            (race == "Hispanic" & age_group %in% c("45-64", "65-84", "85+")) |
            (race == "Black" & age_group %in% c("85+")) |
            (race == "Asian" & age_group %in% c("45-64", "65-84", "85+"))
        ),
      aes(
        xmin = -Inf, xmax = Inf, ymin = -Inf, ymax = Inf
      ),
      fill = "#999999", alpha = 1/6, color = NA
    ) +
    geom_segment(
      aes(-Inf, xend = Inf, -Inf, yend= -Inf),
      size = 0.25, color = "black"
    ) +
    geom_segment(
      data = xdf, aes("1", before, xend="2", yend=after),
      size = 0.25
    ) +
    geom_point(
      data = xdf, aes("1", before), 
      fill = "#999999", color = "white", size = 2, stroke = 0.5, shape = 21
    ) +
    geom_point(
      data = xdf, aes("2", after),
      fill = "#bb271a", color = "white", size = 2, stroke = 0.5, shape = 21
    ) +
    geom_text(
      data = xdf,
      aes("1", before+0.05, label = scales::percent(before, 1)),
      color = "#999999", family = font_es_bold, fontface = "bold", size = 3
    ) +
    geom_text(
      data = xdf,
      aes("2", after+0.05, label = scales::percent(after, 1)),
      color = "#bb271a", family = font_es_bold, fontface = "bold", size = 3
    ) +
    scale_x_discrete(
      expand = c(0, 0),
      labels = c("<span style='color:#999999'>BEFORE</span>", "<span style='color:#bb271a'>AFTER</span>")
    ) +
    scale_y_percent(
      limits = c(-0.005, 0.405),
      breaks = c(-0.005, 0.1, 0.2, 0.3, 0.405),
      labels = c("", "", "", "", "40%\nof deaths from\nall causes for\nthis group")
    ) +
    scale_fill_manual(
      name = NULL,
      values = c("#bb271a"),
      na.translate = FALSE
    ) +
    coord_cartesian(clip = "off") +
    facet_wrap(
      facets = race~age_group, 
      scales = "free_x",
      labeller = \(labels, multi_line = TRUE){
        labels <- lapply(labels, as.character)
        labels[["race"]][c(1,2,4,5,6,7,9,10,11,12,14,15,16,17,19,20)] <- ""
        labels[["age_group"]] <- sprintf("<span style='font-style:normal;font-weight:normal;'>%s</span>", labels[["age_group"]])
        labels[["race"]][c(3,8,13,18)] <- sprintf("<span style='font-size:12pt;'>**%s**</span>", labels[["race"]][c(3,8,13,18)])
        labels
      }
    ) +
    labs(
      x = NULL, y = NULL,
      title = "Covid-19 deaths <span style='color:#999999'>before</span> and <span style='color:#bb271a'>after</span> universal adult vaccine eligibility",
      caption = "Source: Provisional weekly death data from the C.D.C. through Nov. 27. Note: Only the four largest racial and ethnic groups are included. Universal vaccine eligibility was April 19, the date when all adults in the United States were eligible for vaccination."
    ) +
    theme_ipsum_es(grid="Y", plot_title_size = 16) +
    theme(
      plot.title.position = "plot",
      plot.title = ggtext::element_markdown(hjust = 0.5),
      plot.caption = ggtext::element_textbox_simple(
        hjust = 0, size = 8.5, family = font_es, color = "#999999",
        margin = margin(t = 14)
      ),
      axis.ticks.x.bottom = element_line(size = 0.25),
      axis.line.x.bottom = element_line(lineend = "square", size = 0.25),
      axis.text.x.bottom = ggtext::element_markdown(size = 8, margin = margin(t = 6)),
      axis.text.y.left = element_text(size = 8, vjust = 1, lineheight = 0.875, color = "#999999"),
      strip.text.x = ggtext::element_markdown(hjust = 0.5, size = 10, family = font_es),
      strip.text = ggtext::element_markdown(hjust = 0.5, size = 10, family = font_es),
      panel.spacing.x = unit(40, "pt"),
      panel.spacing.y = unit(6, "pt"),
      panel.border = element_blank(),
      legend.position = "top"
    ) -> gg

  grid.newpage()
  grid.draw(
    gtable_filter_remove(
      x = ggplotGrob(gg),
      name = c(sprintf("axis-b-%d-1", 2:5), sprintf("axis-b-%d-2", 2:5), sprintf("axis-b-%d-3", 2:5), sprintf("axis-b-%d-4", 2:5))
    )
  )

}

{ggplot2} recreation of NYT plot

The Moderna booster drained me all day on Dec 1 and did what jab two did during the overnight period (achy enough to wake me up and not get back to slumber easily). To try to wear myself down, I decided to practice a bit of R with the 2021 Advent of Code. There are plenty of superb R bloggers chronicling their daily katas, so I don’t feel compelled to post every one (truth be told, work and fam tasks/priorities will make devoting any time to this year’s daily puzzles a rare event).

Day 01 was very straightforward (even part 2, for which I used {RcppRoll} despite hoping to stick to only base R 4.x), so it’s kinda not worth a post (for me), but Day 02 was kinda fun, as I don’t have regular opportunities to use scan() and get().

The input is a series of submarine commands:

forward 5
down 5
forward 8
up 3
down 8
forward 2

with a set of rules that change between parts 1 and 2.

We can read in those commands with scan() which lets us specify a pattern for each line (scan() takes care of dealing with whitespace for you):

scan(
  text = "forward 5
down 5
forward 8
up 3
down 8
forward 2",
  what = list(character(0), integer(0))
) |>
  setNames(c("command", "value")) -> input

str(input)
## List of 2
##  $ command: chr [1:6] "forward" "down" "forward" "up" ...
##  $ value  : int [1:6] 5 5 8 3 8 2

The rules (link above) were pretty basic, increment/decrement some variables based on the command input, but I wanted to avoid a bunch of if statements. Since R has the get() function that enables searching by name for an object, we can make a series of functions that have the command as the identifier and then use get() to call the function:

up <- \(x) depth <<- depth - x
down <- \(x) depth <<- depth + x
forward <- \(x) position <<- position + x

position <- depth <- 0

for (idx in seq_along(input$command)) {
  get(input$command[idx], mode = "function")(input$value[idx])
}

(the final answer is computed by position * depth).
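As a sanity check, here’s the dispatch approach run end-to-end over the six example commands; the puzzle’s stated part 1 answer for this example is 150:

```r
# command "functions" mutate the tracking variables in the global env
up <- \(x) depth <<- depth - x
down <- \(x) depth <<- depth + x
forward <- \(x) position <<- position + x

position <- depth <- 0

cmds <- c("forward", "down", "forward", "up", "down", "forward")
vals <- c(5L, 5L, 8L, 3L, 8L, 2L)

# look up each command function by name and apply its value
for (idx in seq_along(cmds)) {
  get(cmds[idx], mode = "function")(vals[idx])
}

position * depth  # 15 * 10 = 150
```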

While I find this to be a “fun” solution, I’d strongly suggest:

  • avoiding using the new shortcut function declaration in mixed R version shops as it’s very new and likely to be confusing to new R users
  • being wary of the <<- assignment operator as it’s introducing a side-effect (parent/global environment modification) which will come back to bite you in other projects some day
  • ditching the $ referencing in favour of [[]] / [] to avoid partial name matching “gotchas”, and
  • adding explicit documentation to what you’re doing with get() calls (provided you really have a good case for using get() to begin with)
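The partial-matching “gotcha” in the third bullet is easy to demonstrate with a tiny example:

```r
x <- list(value_total = 10)

x$value        # `$` partial-matches `value_total` and silently returns 10
x[["value"]]   # `[[` uses exact matching by default, so this returns NULL
```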

The code only changes slightly for part 2, so I’ll refrain from adding even more unreadable code to this post.