Security & Privacy Of Mountain Lion’s Dictation Feature

With Gizmodo doing a post hyping Mountain Lion’s new dictation feature, it’s probably a good time to note that folks in regulated environments, or who simply care about security & privacy a bit more than most, should not enable or use this feature to dictate sensitive information.

From Apple’s own warning on the matter:

When you use the keyboard dictation feature on your computer, the things you dictate will be recorded and sent to Apple to convert what you say into text. Your computer will also send Apple other information, such as your first name and nickname; and the names, nicknames, and relationship with you (for example, “my dad”) of your address book contacts. All of this data is used to help the dictation feature understand you better and recognize what you say. Your User Data is not linked to other data that Apple may have from your use of other Apple services.

It’s much like what happens with Siri, Dragon Dictation or any of a myriad of other iOS and modern desktop apps/browser extensions. Thankfully, the transfers happen over SSL, but that still won’t help you if you’re dictating health, financial or other regulated/NPPI/PII data.

While the feature is cool and does work pretty well, it’s important to make sure you and your users know what it does, how it works and where they can/cannot use it.
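
If you manage Macs in one of those regulated environments, the most straightforward mitigation is to keep the feature switched off for users who shouldn’t be sending dictated text off-box. As a rough sketch only: the snippet below shells out to the standard defaults tool to check, and if needed clear, the per-user dictation preference. The preference domain and key names ("com.apple.assistant.support" / "Dictation Enabled") are assumptions on my part, since Apple doesn’t document them, so verify them on your own builds before deploying anything like this.

    #!/usr/bin/env python
    # Hypothetical sketch: check (and optionally turn off) the per-user OS X
    # dictation preference. The DOMAIN and KEY values below are assumptions
    # about where Mountain Lion-era builds store the setting; confirm them
    # on a test machine before using this anywhere that matters.
    import subprocess
    import sys

    DOMAIN = "com.apple.assistant.support"   # assumed preference domain
    KEY = "Dictation Enabled"                # assumed preference key

    def dictation_enabled():
        """Return True if the dictation preference reads as enabled."""
        try:
            out = subprocess.check_output(["defaults", "read", DOMAIN, KEY])
            return out.strip() == b"1"
        except subprocess.CalledProcessError:
            # Key not present -- treat as "never enabled".
            return False

    def disable_dictation():
        """Write the assumed preference key as false for the current user."""
        subprocess.check_call(["defaults", "write", DOMAIN, KEY, "-bool", "false"])

    if __name__ == "__main__":
        if dictation_enabled():
            print("Dictation is enabled; disabling it for this user.")
            disable_dictation()
        else:
            print("Dictation already appears to be off.")
            sys.exit(0)

For centrally managed fleets, a configuration profile or managed preference targeting the same setting is the more robust route, and it’s worth pairing either approach with user education, since a per-user default can simply be flipped back on in System Preferences.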


3 Comments

  1. Krombein

    I thought we were at the place where our tech was smart enough and fast enough to do all of that locally. Everyone should have a complete, local, secure and private instance of Siri on their machine. What happened to the future?

    Also, where’s my hoverboard and flying car?! I was promised a jetpack!

    1. hrbrmstr

      The future is, sadly, becoming as dystopian as predicted. Initially, these new portable devices didn’t have enough local power to do the crunching. Now, the processing is centralized by Apple, Dragon, Google, etc., partly to better protect their intellectual property, partly to make the results better (and faster) since they can use the uploaded data in their refinement analytics, but also to, well, have all your stuff. Despite whatever promises you get in a EULA or pop-up, the reality is they can use the data to profile you better and to gain knowledge of what’s going on for better trending, etc.

  2. Zath

    This definitely does appear to be an instance of Apple’s legal dept wording things around current PII regulations. If the data does not include the last name, it skirts most state and federal regulatory definitions of “Personally Identifiable”. Unfortunately, because the software also records your relationships to other “first” names in your contact book, it becomes increasingly easy for Apple, or anyone who compromises the data Apple has recorded, to figure out the identity of a particular user. For example, if Jeff, who uses the nickname Babblefish, has a brother named George, a sister named Margaret, and parents Susanna and Jamison, his full identity could easily be pieced together with minimal Google detective work.

    Another big question that comes into play here is what happens to this information once it has been utilized for improving the voice recognition software? Is it archived for a specified period of years (forever)? Is it deleted? If the dictated data happens to be a work that the user has copyrighted, does Apple receive any rights to the use of the works?

    As hrbrmstr touched on: What happens if you are dictating personal information such as your financial account information or your full name and address? If you are a teacher using dictation to record comments and grades on student papers, you have just violated FERPA regulations by transmitting that information to a third party. If you are a doctor dictating notes on a patient, you have just violated HIPAA regulations.

    I think that Apple needs to clarify how this data is truly used, as well as how it is secured. I live in Massachusetts, and under MA 201 CMR 17.00, there must be assurances in place for protection of the PII of any Massachusetts resident, even if the company is located outside of the state.

    I’m very interested to see how this issue starts to play out as the media gets more wind of it. With electronic security and privacy issues being at the forefront of the internet frontier lately, Apple will need to make clarifications or find itself in hot water with users and state/federal regulators (read that as “litigation-happy” plaintiffs).

    Z

