2023-07-21-WPE-Helper Code

Since you’re building a CLI to chat with the MDN AI Helper, you’ll need the auth-cookie from a valid session, and then you’ll need to make and process HTTP POST requests.

We’ll show an example in Python, and most of the freely available GPT overlords can at least get you started on converting this to JavaScript, Rust, etc. if needed.

import os
import configparser
import requests
import sqlite3
import json

As noted in the Drop, Firefox’s cookie store is pretty easy to work with (use Chrome or Safari instead if you like).

You’ll need to authenticate to MDN Plus (https://developer.mozilla.org/), so you’ll need an account. You get limited AI Helper usage for non-paid plans. As I’ve said before, the $5 USD/month plan is well worth the price of admission, though I realize it may not be an option for some folks.

After you authenticate, the auth-cookie will be stored in Firefox’s cookie jar, which is just a SQLite database.

This function is a platform-independent way to get the location of the Firefox profile directory where that database is stored.

def get_firefox_profile_path():
  """
  Get the filesystem path to the Firefox profile directory.

  This checks the operating system and locates the Firefox profile data path
  based on the OS-specific location.

  Returns:
    str: The filesystem path to the Firefox profile directory.
  """

  if os.name == 'nt':  # Windows (ugh)
    mozilla_profile = os.path.join(os.getenv('APPDATA'), r'Mozilla\Firefox')
  elif os.uname().sysname == 'Darwin':  # Mac :-)
    mozilla_profile = os.path.join(os.path.expanduser('~'), 'Library', 'Application Support', 'Firefox')
  else:  # Linux -_-
    mozilla_profile = os.path.join(os.path.expanduser('~'), '.mozilla', 'firefox')

  mozilla_profile_ini = os.path.join(mozilla_profile, r'profiles.ini')
  profile = configparser.ConfigParser()
  profile.read(mozilla_profile_ini)
  # Assumes the first profile listed ('Profile0') is the one you actually use
  data_path = os.path.normpath(os.path.join(mozilla_profile, profile.get('Profile0', 'Path')))

  return data_path

Now, we just need to read the value and keep it around for web requests:

profile_dir = get_firefox_profile_path()
cookie_path = os.path.join(profile_dir, "cookies.sqlite")
con = sqlite3.connect(cookie_path)
cur = con.cursor()
cur.execute("SELECT value  FROM moz_cookies WHERE host='developer.mozilla.org'")
auth_cookie = cur.fetchall()[0][0]
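One gotcha: Firefox keeps cookies.sqlite locked while the browser is running, so the connect/query above can fail with a “database is locked” error. A common workaround (a sketch, only needed if you don’t want to close Firefox first) is to copy the database and read from the copy:

import shutil
import tempfile

# Work from a throwaway copy so we don't fight a running Firefox for the lock
tmp_cookie_path = os.path.join(tempfile.mkdtemp(), "cookies.sqlite")
shutil.copy2(cookie_path, tmp_cookie_path)
con = sqlite3.connect(tmp_cookie_path)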

We’ll set up the cookies and basic headers:

cookies = {
  'auth-cookie': auth_cookie,
}

headers = {
  'Accept': '*/*',
  'Content-Type': 'application/json',
}

This is a starter question for you. You’ll need to do some work on your own to see how to incorporate new messages into the chat and preserve previous context (one possible approach is sketched right after the payload below).

question = {
  'messages': [
    {
      'role': 'user',
      'content': 'How do i center a div?',
    },
  ],
}
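To preserve context across turns, a reasonable sketch (assuming the endpoint simply replays whatever OpenAI-style transcript you send it, which the payload format suggests) is to keep appending to that same messages list. Here, reply_text is just a stand-in for the answer text you reassemble from the response chunks further down:

history = question['messages']

# After a turn, keep the assistant's reply in the transcript.
# reply_text stands for whatever string you reassemble from the
# response chunks (see the decoding example further down).
history.append({'role': 'assistant', 'content': reply_text})

# Then add the follow-up question and POST the whole list again.
history.append({'role': 'user', 'content': 'What about centering it vertically, too?'})
next_question = {'messages': history}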

Make the POST request and wait until it finishes generating.

response = requests.post(
  url='https://developer.mozilla.org/api/v1/plus/ai/ask',
  cookies=cookies,
  headers=headers,
  json=question
)
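Before parsing anything, it’s worth confirming the request actually succeeded; an expired or missing cookie will typically come back as a 4xx:

# Bail out early on auth failures, rate limits, and the like
response.raise_for_status()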

The following will decode all of the response chunks. How you end up displaying them is totally up to you!

# Each body line is an SSE-style record; slicing off the first 6 bytes drops the leading "data: " marker
completions = [ line[6:].decode('utf-8') for line in response.content.splitlines() ]
completions = [ json.loads(completion) for completion in completions if completion != "" ]
completions[0:10]
[{'type': 'metadata',
  'sources': [{'url': '/en-US/docs/Learn/CSS/Howto/Center_an_item',
    'slug': 'Learn/CSS/Howto/Center_an_item',
    'title': 'How to center an item'},
   {'url': '/en-US/docs/Web/CSS/CSS_grid_layout/Box_alignment_in_grid_layout',
    'slug': 'Web/CSS/CSS_grid_layout/Box_alignment_in_grid_layout',
    'title': 'Box alignment in grid layout'},
   {'url': '/en-US/docs/Web/CSS/Layout_cookbook/Center_an_element',
    'slug': 'Web/CSS/Layout_cookbook/Center_an_element',
    'title': 'Center an element'},
   {'url': '/en-US/docs/Web/CSS/margin',
    'slug': 'Web/CSS/margin',
    'title': 'margin'}],
  'quota': None},
 {'id': 'chatcmpl-7egn1MGADelHsX2ddsLZrtHRHhFC0',
  'object': 'chat.completion.chunk',
  'created': 1689931959,
  'model': 'gpt-3.5-turbo-0613',
  'choices': [{'index': 0,
    'delta': {'content': '', 'role': 'assistant'},
    'finish_reason': None}],
  'usage': None},
 {'id': 'chatcmpl-7egn1MGADelHsX2ddsLZrtHRHhFC0',
  'object': 'chat.completion.chunk',
  'created': 1689931959,
  'model': 'gpt-3.5-turbo-0613',
  'choices': [{'index': 0,
    'delta': {'content': 'To', 'role': None},
    'finish_reason': None}],
  'usage': None},
 {'id': 'chatcmpl-7egn1MGADelHsX2ddsLZrtHRHhFC0',
  'object': 'chat.completion.chunk',
  'created': 1689931959,
  'model': 'gpt-3.5-turbo-0613',
  'choices': [{'index': 0,
    'delta': {'content': ' center', 'role': None},
    'finish_reason': None}],
  'usage': None},
 {'id': 'chatcmpl-7egn1MGADelHsX2ddsLZrtHRHhFC0',
  'object': 'chat.completion.chunk',
  'created': 1689931959,
  'model': 'gpt-3.5-turbo-0613',
  'choices': [{'index': 0,
    'delta': {'content': ' a', 'role': None},
    'finish_reason': None}],
  'usage': None},
 {'id': 'chatcmpl-7egn1MGADelHsX2ddsLZrtHRHhFC0',
  'object': 'chat.completion.chunk',
  'created': 1689931959,
  'model': 'gpt-3.5-turbo-0613',
  'choices': [{'index': 0,
    'delta': {'content': ' `', 'role': None},
    'finish_reason': None}],
  'usage': None},
 {'id': 'chatcmpl-7egn1MGADelHsX2ddsLZrtHRHhFC0',
  'object': 'chat.completion.chunk',
  'created': 1689931959,
  'model': 'gpt-3.5-turbo-0613',
  'choices': [{'index': 0,
    'delta': {'content': 'div', 'role': None},
    'finish_reason': None}],
  'usage': None},
 {'id': 'chatcmpl-7egn1MGADelHsX2ddsLZrtHRHhFC0',
  'object': 'chat.completion.chunk',
  'created': 1689931959,
  'model': 'gpt-3.5-turbo-0613',
  'choices': [{'index': 0,
    'delta': {'content': '`', 'role': None},
    'finish_reason': None}],
  'usage': None},
 {'id': 'chatcmpl-7egn1MGADelHsX2ddsLZrtHRHhFC0',
  'object': 'chat.completion.chunk',
  'created': 1689931959,
  'model': 'gpt-3.5-turbo-0613',
  'choices': [{'index': 0,
    'delta': {'content': ' element', 'role': None},
    'finish_reason': None}],
  'usage': None},
 {'id': 'chatcmpl-7egn1MGADelHsX2ddsLZrtHRHhFC0',
  'object': 'chat.completion.chunk',
  'created': 1689931959,
  'model': 'gpt-3.5-turbo-0613',
  'choices': [{'index': 0,
    'delta': {'content': ',', 'role': None},
    'finish_reason': None}],
  'usage': None}]
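For example, here’s one way to stitch the streamed deltas back into a single answer string, skipping the metadata entry (which has no 'choices' key) and any empty deltas:

answer = "".join(
  chunk['choices'][0]['delta'].get('content') or ''
  for chunk in completions
  if 'choices' in chunk
)
print(answer)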

An alternative to the above is to add stream=True to the post call and then do something like the following:

# Iterate over each line of the streamed response
for line in response.iter_lines():
  if line:
    # Decode the byte string to a regular string (assuming UTF-8 encoding)
    message = line.decode('utf-8')
    handle_message(message)

Using this method gives your CLI the look and feel of the “typing” effect that ChatGPT (et al.) has.
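handle_message above is yours to write. A minimal sketch, assuming the streamed lines use the same "data: "-prefixed JSON as the non-streaming response, that prints each content delta as it arrives:

def handle_message(message):
  """Print each content delta as it arrives for that ChatGPT-style typing effect."""
  if not message.startswith('data: '):
    return
  try:
    chunk = json.loads(message[6:])
  except json.JSONDecodeError:
    return  # skip anything that isn't a JSON payload
  if 'choices' in chunk:
    delta = chunk['choices'][0]['delta'].get('content') or ''
    print(delta, end='', flush=True)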