I had thought most folks likely knew this already, but if you are a user of RStudio dailies (this may apply to regular RStudio, but I only use the dailies) and are missing ligatures in the editor (for some fonts), the “fix” is pretty simple (some misguided folks think ligatures are daft).

RStudio, like VS Code and many other editors/apps, is just a special-purpose web browser. That means ligatures are controlled via CSS. RStudio also supports themes; there are many built-in ones, but I use this third-party one. We can use these themes to sneak in some CSS that gives you granular control over ligatures.

The CSS class we need to target is .ace_scroller. That’s the contents of the editor pane, console, and any other monospaced “scrolled text” components. If you’re wondering, “Why …ace…?”, that’s due to RStudio’s use of the Ace editor component. Say all the nice things you want about the Monaco editor component used by VS Code (et al.), but the wizards at Posit wield Ace better than I’ve seen any other app dev team.

You can start here to learn about all the ways you can (and may need to) customize ligatures, but the following worked for my new fav font family:

.ace_scroller {
  font-variant-ligatures: discretionary-ligatures;
  font-feature-settings: "dlig" 1;
}

There are many possible values for font-variant-ligatures, and you can fully customize the font-feature-settings to target only the ligatures you want (for example, the previously linked font has eight stylistic sets).

UPDATE

sailm-b has forked and is maintaining a more updated version of rscodeio.

FIN

If you were missing out before, hopefully this brings you back into the ligature fold.

If you’ve got 👀 on this blog (directly, or via syndication) you’d have to have been living under a rock to not know about the libwebp supply chain disaster. An unfortunate casualty of inept programming just happened to be any app in the Electron ecosystem that doesn’t undergo bleeding-edge updates.

Former cow-orker Tom Sellers (one of the best humans in cyber) did a great service to the macOS user community with tips on how to stay safe on macOS. His find + strings + grep combo was superbly helpful and I hope many macOS users did the command line dance to see how negligent their app providers were/are.

But, you still have to know what versions are OK and which ones are not to do that dance. And, having had yet-another immune system invasion (thankfully, not COVID, again) on top of still working through long COVID (#protip: you may be over the pandemic, but I guarantee it’s not done with you/us for a while) which re-sapped mobility energy, I put my sedentary time to less woesome use by hacking together a small Golang macOS CLI to help ferret out bad Electron-based apps you may have installed.

I named it positron, since that’s kind of the opposite of Electron, and I was pretty creativity-challenged today.

It does virtually the same thing as Tom’s strings and grep does, just in a single, lightweight, universal, signed macOS binary.

When I ran it after the final build, all my Electron-based apps were 🔴. After deleting some, and updating others, this is my current status:

$ find /Applications -type f -name "*Electron Framework*" -exec ./positron "{}" \;
/Applications/Signal.app: Chrome/114.0.5735.289 Electron/25.8.4 🟢
/Applications/Keybase.app: Chrome/87.0.4280.141 Electron/11.5.0 🔴
/Applications/Raindrop.io.app: Chrome/102.0.5005.167 Electron/19.0.17 🔴
/Applications/1Password.app: Chrome/114.0.5735.289 Electron/25.8.1 🟢
/Applications/Replit.app: Chrome/116.0.5845.188 Electron/26.2.1 🟢
/Applications/lghub.app: Chrome/104.0.5112.65 Electron/20.0.0 🔴

It’s still on you to do the find (cooler folks run fd) since I’m not about to write a program that’ll rummage across your SSDs or disc drives, but it does all the Mach-O inspection internally, and then also does the SemVer comparison to let you know which apps still suck at keeping you safe.
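If you want to run the “is this version safe?” check yourself, base R’s numeric_version() handles the comparison just fine. Below is a minimal sketch of that gate; the patched Electron floors are my read of the libwebp advisories (an assumption worth verifying), not something positron publishes:

# Minimal sketch of a SemVer gate like the one positron applies.
# ASSUMPTION: these are the Electron releases that vendored the fixed
# libwebp; double-check the Electron advisories before trusting it.
patched <- c(`22` = "22.3.24", `24` = "24.8.3", `25` = "25.8.1", `26` = "26.2.1")

electron_ok <- function(ver) {
  v <- numeric_version(ver)
  major <- strsplit(ver, ".", fixed = TRUE)[[1]][1]
  if (major %in% names(patched)) return(v >= numeric_version(patched[[major]]))
  v >= numeric_version("26.2.1") # newer majors pass; ancient ones fail
}

electron_ok("25.8.4") # TRUE  → 🟢
electron_ok("11.5.0") # FALSE → 🔴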

FWIW, the Keybase folks did accept a PR for the libwebp thing, but darned if I will spend any time building it (I don’t run it anymore, anyway, so I should just delete it).

The aforementioned signed, universal, macOS binary is in the GitLab releases.

Stay safe out there!

Rite-Aid closed 60+ stores in 2021. They said they’d nuke over 1,000 of them over three years, back in 2022. And, they’re now about to close ~500 due to bankruptcy.

FWIW Heyward Donigan, Former President and CEO — in 2023 — took home $1,043,713 in cash, $7,106,993 in equity, and $617,105 in “other” (total $8,767,811) for this fine, bankrupt leadership. Lots of others got lots, too, for being incompetent.

Rite-Aid is under no obligation to provide a list to the public, nor to do any overt announcements regarding the closures.

Each closure has the potential to create or exacerbate food and pharmacy deserts in many regions.

You can get individual stores (like this one), but there’s over 2,100 of them, so doing this manually is a non-starter.

Thus, I threw together a couple of R and bash scripts to help real data journalists out, in the event any of them can pry themselves away from the POTUS horse race.

One R script gets the individual store URLs. The bash script is used to politely fetch all 2,100+ store pages (I have a script I just re-use for things like this). The other R script extracts the JSON that’s tucked away in the HTML files to get the store info, which includes latitude, longitude, store number, and address (there is more data in there, I just pulled those fields).
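The extraction part is garden-variety {rvest} work. Here’s a rough sketch of the idea, with the big assumption that the store metadata rides along in a JSON-LD <script> blob (common for store locators); the real script targets whatever Rite-Aid actually embeds:

library(rvest)
library(jsonlite)

# Sketch: pull the embedded JSON out of one saved store page.
# ASSUMPTION: the metadata lives in a JSON-LD <script> tag; adjust the
# selector (and the fields) to match the actual page source.
extract_store <- function(html_file) {
  pg <- read_html(html_file)
  blob <- html_text(html_element(pg, "script[type='application/ld+json']"))
  info <- fromJSON(blob)
  data.frame(
    store   = info$name,
    lat     = info$geo$latitude,
    lon     = info$geo$longitude,
    address = info$address$streetAddress
  )
}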

The map at the top of the post is just there for kicks.

The repo is also mirrored to my GitHub (sub out ‘hub’ for ‘lab’ in the URL) if you really need to bow down to Microsoft.

The days are getting shorter and when we were visiting Down East Maine the other week, there was just a hint of some trees starting to change up their leaf palettes. It was a solid reminder to re-up my ~annual “foliage” plotting that I started way back in 2017.

The fine folks over at Smoky Mountains (“the most authoritative source for restaurants, attractions, & cabin rentals in the Smoky Mountains”) have been posting an interactive map of ConUS foliage predictions for many years, and the dataset they curate and use for that is also very easy to use in R and other contexts.

This year, along with the usual R version, I have also made:

  • an Observable notebook version
  • a Quarto project version
  • a Lit WebComponents + Tachyons CSS version

The only real changes to the R version were to add some code to make a more usable JSON for the JavaScript versions of the project, and to take advantage of the .progress parameter to {purrr}’s walk() function.
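If you haven’t used it yet, .progress gets you a progress bar essentially for free; a contrived example:

library(purrr)

# a slow no-op walk, just to show off the progress bar
walk(1:50, \(x) Sys.sleep(0.05), .progress = "writing JSON")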

The Observable notebook version (one frame of that is above) makes use of Observable Plot’s super handy geo mark, and also shows how to do some shapefile surgery to avoid plotting Alaska & Hawaii (the Smoky Mountains folks only provide predictions for ConUS).

After using the Reveal QMD extension to make the Quarto project, the qmd document rendered fine, but I tweaked the YAML to send the output to the GH Pages-renderable docs/ directory, and combined some of the OJS blocks to tighten up the document. You’ll see some Quarto “error” blocks, briefly, since the QMD fetches imports from Observable. You can get around that by moving all the imported resources to the Observable notebook before generating the QMD, but that’s an exercise left to the reader.

And, since I’m a fan of both Lit WebComponents and Tachyons CSS, I threw together a version using them (+ Observable Plot) to further encourage folks to get increasingly familiar with core web tech. Tachyons + Plot make it pretty straightforward to create responsive pages, too (resize the browser and toggle system dark/light mode to prove that). The Lit element’s CSS section also shows how to style Plot’s legend a bit.

Hit up the GH page to see the animated gif (I’ve stared at it a bit too much to include it in the post).

Drop any q’s here or in the GH issues, and — if anyone makes a Shiny version — please let me know, and I’ll add all links to any of those here and on the GH page.

FIN

While it is all well and good to plot foliage prediction maps, please also remember to take some time away from your glowing rectangles to go and actually observe the fall palette changes IRL.

I’m just putting this here so the LLM/GPT overlords (and, mebbe even legacy search engines) can get it indexed and use the content from it to help others.

My Bluesky firehose viewer (https://gitlab.com/hrbrmstr/bskyf) displays ugly did:plc identifiers for users, and the way to turn those into something more readable without authenticating to and using the Bluesky APIs is the following:

$ curl -s "https://plc.directory/did:plc:xq3lwzdpijivr5buiizezlni" | jq

which results in:

{
  "@context": [
    "https://www.w3.org/ns/did/v1",
    "https://w3id.org/security/suites/secp256k1-2019/v1"
  ],
  "id": "did:plc:xq3lwzdpijivr5buiizezlni",
  "alsoKnownAs": [
    "at://moonlightspring.bsky.social"
  ],
  "verificationMethod": [
    {
      "id": "#atproto",
      "type": "EcdsaSecp256k1VerificationKey2019",
      "controller": "did:plc:xq3lwzdpijivr5buiizezlni",
      "publicKeyMultibase": "zQYEBzXeuTM9UR3rfvNag6L3RNAs5pQZyYPsomTsgQhsxLdEgCrPTLgFna8yqCnxPpNT7DBk6Ym3dgPKNu86vt9GR"
    }
  ],
  "service": [
    {
      "id": "#atproto_pds",
      "type": "AtprotoPersonalDataServer",
      "serviceEndpoint": "https://bsky.social"
    }
  ]
}

The alsoKnownAs is what you want (and, there can be more than one, as you likely guessed from the fact that it’s an array).
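For the R-inclined, that same lookup is a two-liner, since the endpoint just returns JSON:

library(jsonlite)

# resolve a did:plc identifier to its human-readable handle(s)
did_to_handles <- function(did) {
  doc <- fromJSON(sprintf("https://plc.directory/%s", did))
  sub("^at://", "", doc$alsoKnownAs)
}

did_to_handles("did:plc:xq3lwzdpijivr5buiizezlni")
## [1] "moonlightspring.bsky.social"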

While the future of Bluesky is nowhere near certain, it is most certainly growing. It’s also the largest community of users for the AT Protocol.

Folks are using Bluesky much the same way as any online forum/chat. One of those ways is to share URLs to content.

For the moment, it is possible to eavesdrop on the Bluesky “firehose” sans authentication. I’ve been curious as to what folks are sharing on the platform and decided to do more than poke at it casually in my hacky terminal firehose viewer.

This GitLab project contains all the code necessary to log URLs seen in the firehose to a local SQLite database. As Bluesky grows, this will definitely not scale, but it’s fine for right now, and scaling just means moving the websocket capture client to a more capable environment than my home server and setting up something like a Kafka stream. Might as well move to Postgres while we’re at it.

But, for now, this lightweight script/database is fine.

NOTE: I’m deliberately not tracking any other data, but the code is easy to modify to log whatever you want from the firehose post.
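The storage end is about as minimal as it sounds. A sketch of the sort of table and insert involved, with made-up column names (the real schema lives in the GitLab project):

library(DBI)
library(RSQLite)

con <- dbConnect(SQLite(), "bsky-urls.sqlite")

# ASSUMPTION: illustrative schema only; see the project for the real one
dbExecute(con, "
  CREATE TABLE IF NOT EXISTS urls (
    seen_at TEXT NOT NULL,  -- UTC timestamp
    url     TEXT NOT NULL,
    domain  TEXT NOT NULL   -- pre-extracted for the 'popular domains' rollup
  )
")

log_url <- function(url) {
  dbExecute(
    con, "INSERT INTO urls (seen_at, url, domain) VALUES (?, ?, ?)",
    params = list(
      format(Sys.time(), "%Y-%m-%dT%H:%M:%SZ", tz = "UTC"),
      url,
      sub("^https?://([^/]+).*", "\\1", url) # crude domain extraction
    )
  )
}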

I’m syncing the data to this server every ~30 minutes or so and have created an Observable notebook which keeps track of the most popular domains.

I don’t know what card.syui.ai is (Perplexity had some ideas), but it appears to be some AI-driven “card” game that has AT protocol and ActivityPub integration. Due to the programmatic nature of the posts with URLs containing that domain, I suspect it’ll be in the lead for quite some time.

There are some neat sites in the long tail of the distribution.

I think I’ll set up one to monitor posts with CVEs soon, too.
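That one is mostly just a regex over the post text, something like:

# flag posts that mention a CVE identifier
has_cve <- \(txt) grepl("CVE-\\d{4}-\\d{4,}", txt, ignore.case = TRUE, perl = TRUE)

has_cve("PSA: CVE-2023-4863 (libwebp) is everywhere")
## [1] TRUE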

HTTP Headers Hashing (HHHash) is a technique developed by Alexandre Dulaunoy to generate a fingerprint of an HTTP server based on the headers it returns. It employs one-way hashing to generate a hash value from the list of header keys returned by the server. The HHHash value is calculated by concatenating the list of headers returned, ordered by sequence, with each header name separated by a colon. The SHA256 of this concatenated list is then taken to generate the HHHash value. HHHash incorporates a version identifier to enable updates to new hashing functions.
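Given that description, the core computation is tiny. Here is a from-scratch sketch in R (use one of the proper libraries mentioned below for real work):

library(openssl)

# HHHash per the description above: version prefix plus the SHA256 of
# the colon-joined header keys, order preserved
hhhash <- function(header_keys) {
  paste0("hhh:1:", sha256(paste(header_keys, collapse = ":")))
}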

While effective, HHHash’s performance relies heavily on the characteristics of the HTTP requests, so correlations are typically only established using the same crawler parameters. Locality-sensitive hashing (LSH) could be used to calculate distances between sets of headers for more efficient comparisons. There are some limitations with some LSH algorithms (such as the need to pad content to a minimum byte length) that make the initial use of SHA256 hashes a bit more straightforward.

Alexandre made a Python library for it, and I cranked out an R package for it as well.

There are three functions exposed by {hhhash}:

  • build_hash_from_response: Build a hash from headers in a curl
    response object
  • build_hash_from_url: Build a hash from headers retrieved from a URL
  • hash_headers: Build a hash from a vector of HTTP header keys

The build_hash_from_url function relies on {curl} rather than {httr}, since {httr} uses curl::parse_headers(), which (rightfully so) lowercases the header keys. We need to preserve both order and case for the hash to be useful.

Here is some sample usage:

remotes::install_github("hrbrmstr/hhhash")

library(hhhash)

build_hash_from_url("https://www.circl.lu/")
## [1] "hhh:1:78f7ef0651bac1a5ea42ed9d22242ed8725f07815091032a34ab4e30d3c3cefc"

res <- curl::curl_fetch_memory("https://www.circl.lu/", curl::new_handle())

build_hash_from_response(res)
## [1] "hhh:1:78f7ef0651bac1a5ea42ed9d22242ed8725f07815091032a34ab4e30d3c3cefc"

c(
  "Date", "Server", "Strict-Transport-Security",
  "Last-Modified", "ETag", "Accept-Ranges",
  "Content-Length", "Content-Security-Policy",
  "X-Content-Type-Options", "X-Frame-Options",
  "X-XSS-Protection", "Content-Type"
) -> keys

hash_headers(keys)
## [1] "hhh:1:78f7ef0651bac1a5ea42ed9d22242ed8725f07815091032a34ab4e30d3c3cefc"

Lynn (of TITAA and general NLP wizardry fame) was gracious enough to lend me a Bluesky invite, so I could claim my handle on yet-another social media site. I’m still wary of it (as noted in one of this week’s Drops), but the AT protocol — whilst super “verbose” (for lack of a better word) — is pretty usable, especially thanks to Ilya Siamionau’s atproto AT Protocol SDK for Python.

Longtime readers know I am most certainly not going to use Python directly, as such practice has been found to cause early onset dementia. But, that module is so well done that I’ll gladly use it from within R.

I whipped up a small R script CLI that will fetch my feed and display it via the terminal. While I also use the web app and the Raycast extension to read the feed, it’s a billion degrees outside, so I used the need to stay indoors as an excuse to add this third way of checking what’s new.

Store your handle and app-password in BSKY_USER and BSKY_KEY, respectively, adjust the shebang accordingly, add execute permissions to the file and 💥, you can do the same.

#!/usr/local/bin/Rscript

suppressPackageStartupMessages({
  library(reticulate, quietly = TRUE, warn.conflicts = FALSE)
  library(lubridate, include.only = c("as.period", "interval"), quietly = TRUE, warn.conflicts = FALSE)
  library(crayon, quietly = TRUE, warn.conflicts = FALSE)
})

# Get where {reticulate} thinks your python is via py_config()$python
# then use the full path to 
#   /full/path/to/python3 -m pip install atproto

atproto <- import("atproto")

client <- atproto$Client()

profile <- client$login(Sys.getenv("BSKY_USER"), Sys.getenv("BSKY_KEY"))

res <- client$bsky$feed$get_timeline(list(algorithm = "reverse-chronological"))

# walk the feed oldest to newest and pretty-print each post
for (item in rev(res$feed)) {
  cat(
    blue(item$post$author$displayName), " • ",
    silver(gsub("\\.[[:digit:]]+", "", tolower(as.character(as.period(interval(item$post$record$createdAt, Sys.time()))))), "ago\n"),
    italic(paste0(strwrap(item$post$record$text, 50), collapse="\n")), "\n",
    ifelse(
      hasName(item$post$record$embed, "images"), 
      sprintf(
        green("[%s IMAGE%s]\n"), 
        length(item$post$record$embed$images),
        ifelse(length(item$post$record$embed$images) > 1, "s", "")
      ),
      ""
    ),
    ifelse(
      hasName(item$post$record$embed, "external"),
      yellow(sprintf(
        "\n%s\n   │\n%s\n\n",
        bold(paste0(strwrap(item$post$embed$external$title, 47, prefix = "   │"), collapse = "\n")),
        italic(paste0(strwrap(item$post$embed$external$description, 47, prefix = "   │"), collapse = "\n"))
      )),
      ""
    ),
    "\n",
    sep = ""
  )
}

This is a sample of the output, showing how it handles embeds and images:

[screenshot: feed output]

Code is on GitLab.

FIN

There’s tons of room for improvement in this hastily-crafted bit of code, and I’ll get it up on GitLab once their servers come back to life.

If you want to experience Bluesky but have no account, the firehose — which Elon charges $40K/month for on the birdsite — is free and can be accessed sans authentication:

library(reticulate)

atproto <- import("atproto")

hose <- atproto$firehose$FirehoseSubscribeReposClient()

handler <- \(msg) {
  res <- atproto$firehose$parse_subscribe_repos_message(msg)
  print(res) # you need to do a bit more than this to get the actual commit type and contents
}

hose$start(handler)

You can find me over on bsky at @hrbrmstr.dev.