Tag Archives: Linux

Starting sometime mid-year in 2011, I began having more ‘stuff’ to do than even my eidetic memory could help with. It’s not that I forgot things, per se, but mentally recalling and prioritizing work, family, personal and other tasks finally required some external assistance, and I resolved to find a GTD system by the end of January.

As an OS X user, I had some great choices available (with iOS sister-apps, too). However, I’m not just an OS X user. As I was saying to @myrcurial (and even @reillyusa) the other day, I dislike being locked in to proprietary solutions. Plus, the $120 price tag for OmniFocus (OS X + iPad) seemed like a king’s ransom, especially since I am also an Android user (OmniFocus only has an iOS app) and pay for both Dropbox and various virtual hosts. Believing that I still have some usable skills left, I decided to — as @hatlessec characterized my solution — cobble something together on my own.

Once upon a time, I did maintain a .plan file (when I had sysadmin duties), but I really doubted the efficacy of it and finger in the age of the modern web. The thought of wrangling SQLite databases, parsing XML files or even digesting bits of JSON seemed like overkill for my purposes. Searching through my Evernote clippings, my memory was drawn back to one of my favorite sites, Lifehacker, which has regular GTD coverage. After re-poking around a bit, I settled on @ginatrapani’s @todotxtapps for meeting the following requirements (in order):

  • It uses a plain text file with a simple structure – (no exposition necessary…the link is a quick read and the format will become second nature after a glance)
  • It is Free (mostly) – mobile apps are ~$2.00 USD each, and if you need more than free Dropbox hosting and want a web interface, there are potential hosting costs. If you count your setup time as money, then add that in, too.
  • It runs on OS X, BSD, Windows & Linux – no platform lock-in
  • It has a thriving community – without being backed by a vendor (like the really #spiffy @omnigroup), a strong developer & user community is extremely important to ensure the longevity of the codebase. Todo.txt has very passionate developers and users who are very active on all fronts.
  • It is very extensible & integrable – I used @alfredapps to give me a quick OS X “GUI CLI” to the todo.sh commands. I built an Alfred keyword for my most-used Todo.txt functions along with a generic one to bring up vim in a Terminal.app window for a free-form edit. Alfred’s shell commands also give me @growlmac integration (so I get some feedback after working with tasks). A sketch of this shell glue appears after this list.

    I also integrated it with @geektool. I won’t steal the thunder from other GeekTool/Todo.txt integration posts (like this one). The GeekTool integration puts my todos right in front of me all the time, on all my desktops.

    By storing my todo directory in @dropbox, it also makes syncing to my web site and mobile devices a snap.

    On my server, I have a simple cron job set up to e-mail me my todos at the beginning of the day (again, so the list is in front of me wherever I look).

  • It runs on iOS AND Android – again, no platform lock-in
  • There’s an optional web interface – the one I linked to (there are others) is far from ideal, but it was quick to set up and has no overt security issues. Properly protected behind nginx or apache, you should be fine if you need to have a web version handy.
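To make the pieces above concrete, here is a rough sketch of the glue involved. The wrapper, paths, schedule and addresses are illustrative assumptions, not my exact setup:

[code]
# A todo.txt file is just plain text, one task per line, e.g.:
#   (A) 2012-01-08 File expense report +work @computer
#   (B) Call the vet +pets @phone
# (A)/(B) are priorities, +word tags a project, @word tags a context.

# Hypothetical wrapper an Alfred keyword can invoke; todo.sh reads its
# TODO_DIR from ~/.todo/config (pointed at a Dropbox-synced folder).
todo.sh add "$1" && growlnotify -t "Todo.txt" -m "Added: $1"

# GeekTool can run this on a refresh interval to paint the list on the desktop:
todo.sh list

# A server-side crontab entry to mail the list out at 7 AM daily:
# 0 7 * * * mail -s "Today's todos" you@example.com < ~/Dropbox/todo/todo.txt
[/code]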

So, while the setup is a bit more than just downloading two commercial apps, it has many other benefits and isn’t too much more work if you already have some of the other pieces in place. If you want more info on the Alfred scripts or any other setup component, drop me a note in the comments.

While I’ve read about many GTD solutions and seen many user-stories of how they met their GTD needs, I’d be interested in what tools you use to ‘get things done’…

In 2011, we saw a large increase in web site exploits that exposed private user data, a breakdown in the trust of SSL (for various reasons) and the introduction of real malware onto the OS X scene. If there were just three things I could ask Mac users to do in 2012 to help protect themselves (‘cuz if you’re a Windows user, it’s been game-over for years already), these are what they would be.

Secure & Diversify Your Web Credentials

Just like companies have lost paper files—and then laptops—containing private data, web sites have leaked and will continue to leak your information like a sieve. While you should choose carefully which ones you let have very sensitive data (like credit card numbers, government id numbers and health information), you really do need to ensure that you at least use a different, “strong” password at each site where you have an account, to avoid having hackers replay your credentials at other sites.

The easiest way to do this is to use a utility like 1Password (@1Password & usually $50, but on sale for $30 for a short time) by AgileBits, which works with practically every browser and will let you create and use diverse passwords at the click of a button. It even works on your mobile device, so you don’t have to worry about remembering the (necessarily) ugly passwords it ends up creating. You can even use 1Password to store secure notes to yourself (say, in the event you need to use complex credentials on systems where you cannot install 1Password).

By using 1Password, you will avoid being among the 60-70% of users who have their credentials stolen and then have to worry or scramble because they used the same ones on an array of popular web sites. Windows users can also take advantage of this tool (and there’s a bundle price if you need it for both platforms).

You can do this without 1Password (e.g. keep a text file or spreadsheet in a secure disk image), but the ease of use is worth the price of 1Password. If you do decide to use a more manual approach, generating secure passwords with tools like this one will also help you be a bit more secure than your brain’s “random” sequence generator.
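If you do take the manual route, even the stock command line beats your brain’s generator; a minimal sketch using the openssl binary that ships with OS X:

[code]
# 18 random bytes, base64-encoded: a 24-character password
openssl rand -base64 18
[/code]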

Know What’s Going On With Your System

While the Mac App Store can help ensure you aren’t loading “bad apps” onto your system, web-borne malware for the Mac arrived for real this year, and 2012 may prove to be the year the Mac becomes more of a target. There’s no guarantee that Mac App Store apps are non-malicious, and you really have no idea what the ones you download from third-party sites contain, even if they do the task you want them to. Some apps that you “know” you trust may be sending out “phone home” signals or making other Internet connections, with unknown payloads, that you neither initiated nor were informed of.

This is where a cool little utility called Little Snitch (@littlesnitch and $30) by Objective Development can really help open your eyes as to what applications and processes (programs you may not be able to “see” easily without tools like the Mac Activity Monitor app) are trying to do on your network. Their own information page says it better than I could paraphrase:

Little Snitch informs you whenever a program attempts to establish an outgoing Internet connection. You can then choose to allow or deny this connection, or define a rule how to handle similar, future connection attempts. This reliably prevents private data from being sent out without your knowledge. Little Snitch runs inconspicuously in the background and it can also detect network related activity of viruses, trojans and other malware.

Again, you could monitor your Mac firewall logs by hand with the OS X Console application and tweak your own firewall rules, but Little Snitch won’t forget to watch out for you.
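If you do want to watch by hand first, the application firewall writes to a plain log file; a sketch assuming the default log location on 10.6/10.7:

[code]
# watch application firewall allow/deny decisions as they happen
tail -f /var/log/appfirewall.log
[/code]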

Secure Your Public & Untrusted WiFi Connections

While Facebook, Twitter, Gmail and other sites have SSL (https) options (some using it by default), you really need to take control of your own transmission security when not on networks you trust. Why? Well, one example is that you may be at a restaurant (as I was with my kids in November) where they terminate all SSL sessions at their border gateway (meaning they could read all the data that should have been encrypted). You also cannot be sure when Facebook is going to mindlessly toggle its SSL settings or when a Facebook application causes the SSL settings to be disabled. Even though SSL is relied upon by pretty much everyone to “just work”, it’s not a given or a panacea.

When on unfamiliar, public or otherwise untrusted networks, it’s truly necessary to take control of the encryption as best you can and use some type of Virtual Private Network (VPN) setup. While running your own is the only real way to know what’s happening at the VPN termination point, there are reputable services out there that can provide security and that you should be able to trust (at least better than SSL in a Starbucks). One of them—and I believe the most user-friendly one—is Cloak (@getcloak and FREE to $8-$15/month) by Bourgeois Bits.

Once installed, Cloak will detect when you’re on a public WiFi connection and automatically kick in a VPN session. You can start up a VPN session at any time with a single click in the OS X menu bar and also define more granular rules (if you want to). With Cloak, you have no excuse to not take an added measure of security when you’re out and about with your Mac.

You could do this for free (provided you trust your home Internet provider) with many modern routers or even a simple Linux/BSD or OS X box providing VPN services, but it would still not be as simple as using Cloak.

With these three simple steps/apps (less than $100), you will be far less at risk than you (probably) currently are as you run naked & blind across the internet with your password stapled to your forehead.

If you have any suggestions for similar/competing tools or have additional resolutions you think would be helpful to Mac users (or any computer user), drop a note in the comments.

Dr Greer [cgreer at ostp.eop.gov] is Assistant Director, Information Technology R&D, Office of Science & Technology Policy, The White House

Opening: “The expertise of the attendees is greatly needed.”

He provided a broad overview of the goals & initiatives of the federal government as they relate to domestic & international cybersecurity. Greer went through the responsibilities of various agencies and made it clear that this is a highly distributed effort across all sectors of government.

He emphasized the need for a close partnership with the private sector to accomplish these goals, and also the criticality of not just coming up with plans but actually implementing them.

It really was a high-level overview and – as I point out in the twitter transcript – it would have been cooler if Dr Greer had done a deep-dive on 2-3 items vs. doing a survey. He did set the tone pretty well – we are in challenging times that are changing rapidly. We’re still fighting the fights of 5-10 years ago but are working to provide a framework for keeping pace with cybercriminals. The government is “doing stuff”, but it’s all useless without translating thousands of pages of legal mumbo jumbo into practical, actionable activities.

The 10 minute post-talk Q&A was far better than the actual preso.

Twitter transcript:

#weis2011 Obama: "America's economic prosperity in 21st cent will depend on cybersecurity" :: sec begets growth but underscores threats, too

#weis2011 one time we never expected every individual to need an IP address, now even refrigerators have one.

#weis2011 IPv6 need exacerbated by mobile, mobile apps themselves have great benefit, but also introduces new threat vector.

#weis2011 OSTP runs phishing tests 3x year #spiffy

#weis2011 POTUS Strategy: Catalyze brkthrus for natnl priorities, promote mkt-based innov; invest in building blocks of american innovation

#weis2011 policy review (2009) themes: lead frm top;build cap for dig natin;share resp for cybersec;effective info sharing/irp; encrge innov

#weis2011 pimping the International Strategy For Cyberspace release recently http://1.usa.gov/jZXIdE

#weis2011 key "norms" in ISC report: upholding fundamental freedoms (esp speech), global interoperability & cybersecurity due diligence

#weis2011 Greer shifting to talking about legis; OSTP has been wrkng to promote good bills esp for natnl data breach rprting & penalties

#weis2011 computer fraud & abuse act is *25 years old*. We need new regulations to help fight 21st century crime < 25 years! yikes!

#weis2011 FISMA shifting from compliance-based to proactive protection-based; mentioned EINSTEIN IDS/ISP

#weis2011 pimping http://csrc.nist.gov/nice/ education & awareness efforts

#weis2011 pimping fed trusted ID initiative http://www.nist.gov/nstic/ ; password are $ & failing; multiple accts are real & problematic

#weis2011 (pers comment) the audience knows much of what Greer is saying, surprised he's giving such a broad overview vs 2/3 deep dives

#weis2011 (pers comment) the efforts for fed cybesec seem waaay to disjoint & distributed to truly be effective.

#weis2011 pimping http://www.nitrd.gov/ CSIA, SSG & SCORE < much alphabet soup in fed cybersec…the letters didn't help senate.gov today

#weis2011 results of many research efforts are both near & just over the horizon, but all useless if not put into effective practice

#weis2011 impt to work with priv sector on economics of legis&policy choices (immunity/liability/safe hrbr/incentives/disclosure/audit)

#weis2011 need to understand market factors incentivizing hackers (valuation/cost-ben/risk-decision making/criminal markets)

#weis2011 (pers comment) another poke at Microsoft when talking about server security. Major hacks of late were linux/apache/solaris. #lame

#weis2011 Cyber insurance is a possibility if we can develop good quant-based risk assessment/management frameworks

#weis2011 cgreer@ostp.eop.gov

#weis2011 q:"where will cybersec be in 10yrs?" -cyberspace will be more resilient & trustworthy; hardening sys&nets useless w/o educatng ppl

#weis2011 by 2021 we will have solved all the cybersecurity issues of 2005 < wise man

#weis2011 q:"the US spends > than rest of wrld combined on cybersec but it's still just pennies. will this change?" :: it's in the proposals

UPDATE: Check out the newer post on additional features.

There has been much ado of late about Dropbox security with one of the most egregious issues being how easy it is to surreptitiously “clone” someone else’s Dropbox by obtaining just one piece of data – the host id – from the Dropbox SQLite config.db.
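For context, that host id lives in a world-readable SQLite file; a sketch assuming the key/value config table layout Dropbox used at the time:

[code]
# print the host_id that ties this machine to a Dropbox account
sqlite3 ~/.dropbox/config.db "SELECT value FROM config WHERE key = 'host_id';"
[/code]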

Moloch built a Windows & Linux impersonation/cloning utility in Python that was/is meant to be used from a USB/external volume. The utility can save the cloned host id to a local file and also has the capability to use a simple HTTP GET request to log data to a “mothership” web site.

Since many Dropbox users use OS X (including me), I didn’t want them to feel left out or smugly more secure. So, I set about creating a native version of the utility.

This release is not as feature-rich as Moloch’s Python script but it won’t take much more effort to crank out a version that duplicates all of the functionality. “Release early. Release often.” as the kids these days are wont to say.

You can find the source at its github repository. When building it or just downloading & running the executable (see below), you should heed the repo’s README and take care to change the following items in the application’s Info.plist property list:

  • MothershipURL – this is the URL of the remote host where you want to store the cloned info. It defaults to somesite.domain/mothership.php to avoid accidentally sending your own Dropbox data to a remote host. PLEASE NOTE that you will need to get the mothership.php script from the original Windows/Linux code distribution, as I have not asked for permission to distribute it here. You can grab the original dbClone.rar directly from here: dl.dropbox.com/u/341940/dbClone.rar (I love the irony of it being hosted on Dropbox itself).

    ALSO NOTE that there’s no need to modify the application’s property list if you don’t mind typing in a URL each run. I eventually plan on making this a separate property list file that allows for multiple URLs so you can select it from a drop-down (and still type a new one if you like).

  • LogFilename – just include the filename you want to use when storing the cloned info locally if you do not like the default (it’s the same as Moloch’s – "GroceryList.txt"). The file is written to the top level of the mounted volume (the original Linux & Windows dbClone was meant to be run from a USB/external volume) or to "~/" if you run it from your boot drive.

You can use the property list editor(s) that come with Apple’s Developer Tools or use vim, TextEdit, TextWrangler (or your favorite text editor) and modify these lines appropriately:

[code]
<key>LogFilename</key>
<string>GroceryList.txt</string>
<key>MothershipURL</key>
<string>http://somesite.domain/mothership.php</string>
[/code]
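If you’d rather not hand-edit the XML, PlistBuddy (bundled with OS X) can set both keys from the shell; the bundle path below is hypothetical, so adjust it to wherever you put the app:

[code]
/usr/libexec/PlistBuddy \
  -c 'Set :MothershipURL http://somesite.domain/mothership.php' \
  -c 'Set :LogFilename GroceryList.txt' \
  dbClone.app/Contents/Info.plist
[/code]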

If you do use the “backup” option, the current naming scheme is "backup-config.db" and it’s important to note that the program will not attempt to overwrite the file. I may change that behaviour in an upcoming release.

I tested the build on OS X 10.6.7 but the Xcode project is set to build for compatibility with 10.5.x or 10.6.x. Feedback on behaviour on other systems would be most welcome.

If you just want the executable, grab the zip’d app and give it a go.

Any and all feedback is welcome (via github or in the comments).

One of my subdomains is for mail, and I was using an easy DNS hack to point it to my hosted Gmail setup (just create a CNAME pointing to ghs.google.com). This stopped working for some folks this week, and I’ve had no time to debug exactly why, so I decided to go back to a simple HTTP 301 redirect to avoid any glitches (for whatever reason) in the future – or, at least, ensure any glitches were due to ineptness on my part. Unfortunately, this created an interesting problem that I had not foreseen.
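For reference, the 301 approach is only a few lines in nginx; the host names below are placeholders (the hosted Gmail URL depends on your Google Apps domain):

[code]
server {
    listen 80;
    server_name mail.example.com;
    # permanent redirect to the hosted Gmail login for this domain
    return 301 http://mail.google.com/a/example.com/;
}
[/code]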

I started playing with HTTP Strict Transport Security (HSTS) a while ago and – for kicks & some enhanced WordPress & Drupal cookie security – moved a couple of domains to it. I neglected to actually pay for a cert that would give me wildcard subdomain usage and only put a couple of domains into the cert request. The mail host was not one of them, which caused Chrome to refuse to honor the redirect since the certificate was not valid for the mail domain.
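The header at the heart of all this looks like the following in nginx; the one-year max-age is an assumed value, and includeSubDomains is the part that bit me:

[code]
# as originally set: pins this host AND every subdomain for a year
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains";

# the later tweak: pin only this host
add_header Strict-Transport-Security "max-age=31536000";
[/code]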

I tweaked the Strict-Transport-Security header setting in my nginx config to not include subdomains, but it seems Chrome had already tucked the entry into (on OS X):

[code padlinenumbers="false" gutter="false"]~/Library/Application Support/Google/Chrome/Default/TransportSecurity[/code]

and was ignoring the new expiration and subdomain settings I was now sending. Again, no time to research why as I really just needed to get the mail redirect working. I guessed that removing the entry would be the easiest way to bend Chrome to my will but it turns out that it’s not that simple since the browser seems to hash the host value:

[code]"wA9USN1KVIEHgBTF9j2q0wPLlLieQoLrXKheK9lkgl8=": {
"created": 1300919611.230054,
"expiry": 1303563439.443086,
"include_subdomains": true,
"mode": "strict"
},[/code]

(I have no idea which host that is, btw.)

I ended up backing up the TransportSecurity file and removing all entries from it. Any site I visit that sends the HSTS header will re-establish its entry, and the change cleared up the redirect issue. I still need to get a new certificate, but that can wait for another day.
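If you need to do the same, it amounts to this (quit Chrome first, since it rewrites the file on exit); resetting the file to an empty JSON object is what worked for me:

[code]
cd ~/Library/Application\ Support/Google/Chrome/Default
cp TransportSecurity TransportSecurity.bak   # keep a copy, just in case
echo '{}' > TransportSecurity                # empty object = no pinned entries
[/code]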

Windows and Linux folk should be able to find that file pretty easily in their home directories if they are experiencing any similar issue. If you can’t find it, drop a note in the comments and I’ll dig out the locations.

I was trying to convey my backup workflow/setup to @joeday in 140 characters and it just wasn’t working very well. Twitter – as one might expect – is not exactly the place for detailed technical discussions, but it does provide fertile ground to spark ideas and dialogue. I told @geekshui that I’d blog my setup, and that turned out to be just enough of a catalyst to force me to iron out my strategy for rud.is and future (if any) non-cooking/family blogging.

Background

I’m [still] a die-hard OS X user, despite the increasing gatekeeper motif Apple is sporting these days. My main computer is a MacBook Pro which I would stupidly run back into a burning building to rescue. Everything is on it. Everything. I digitize receipts, house our multimedia, spin out VMs like a DJ, create, compose, torrent, rip, zip and hack from it. Consequently, ensuring my data is available is kinda important to me.

I’ve been around computers long enough to have learned some painful lessons from four simple characters: MTBF. Drives break. Electronics fail. It’s an undeniable fact. The only way to recover from these failures is to have a good strategy for keeping your data available.

Strategy #1: Backups

While hard to digest on Twitter, my backup strategy is pretty straightforward. I use Time Machine for OS-managed full system backups. I rotate these between two large (1TB & 2TB) hard drives and I retire one large hard drive each year (MTBF…remember?). This gets me individual file recovery pretty quickly over a decent time period, plus a bit of hardware peace of mind.

I also have two 2.5″ IEEE 1394 drives that I clone images to with SuperDuper!/Carbon Copy Cloner every time Apple issues a 10.x.y update. Again, I rotate between them since I really don’t trust drive manufacturers. I haven’t relied on TrueCrypt for a while (which would make for an ugly workflow) for system volumes, but it’s easy to clone disks that have FileVault-protected data as long as you do so from an account that does not use or rely on FileVault data.

Both Time Machine and the drive cloning can occur while I’m sleeping, so no workflow is impacted.

Strategy #2: Dropbox

I have to start by sharing just how much I <3 Dropbox. I don’t use the free service, as I grew weary of keeping within the paltry limits. Getting a paid sub to it provides more than just freedom from minutiae. I now get (as long as they have no hiccups) full recovery back as far as I want in the event I do actually lose a file or two. I have Dropbox configured on my MacBook Pro, a home Windows machine and a home Linux box. This means that even if I lose the drive on my Mac, I can get some of my non-sensitive data back from one of the other Dropbox-enabled systems (which is much faster than recovering from backups). It also means that I can get right back to work on a different system – as long as I have not used an OS X-specific program.

I could rant for quite a while about Dropbox, but it should be pretty obvious why this is part of my backup strategy.

Strategy #3: rsync.net

While Dropbox houses non-sensitive data offsite (again, assuming no service hiccups), there is a subset of my information that I do want housed off-site in the event there is a catastrophic issue with our abode. For that, I have been using rsync.net since its inception. They provide outstanding customer support, have a unique view of and practices around warrants, and fully understand the needs of technical users concerned about availability and privacy.
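The off-site piece amounts to an rsync over SSH; the user, host and paths below are made-up placeholders (rsync.net hands you real ones when you sign up):

[code]
# mirror the must-survive-a-fire subset of data to off-site storage
rsync -avz --delete ~/Documents/critical/ user@usw-s001.rsync.net:critical/
[/code]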

There are some other things we do to ensure a refresh of the content on media drives that get hooked up to our PS3 or displays, but the above three steps are how I ensure that I always have access to the data that enables my workflow.