Enabling AI Assistant (Customer-Hosted LLM)

This article contains the steps to enable AI Assistant using OpenAI or Azure as your LLM provider.

AI Assistant automatically creates correspondence by pulling information from the Candidate and Job records in Bullhorn ATS. AI Assistant can be used to generate email messages, screening questions, pitches, and much more with the click of a button.

Before You Begin

Before AI Assistant can be enabled, you will need to sign a legal addendum. Reach out to your Bullhorn Account Manager to request AI Assistant and get started. Once the addendum has been signed, Bullhorn Support will contact you to initiate the setup.

The steps in this article are only required if you choose to enable AI Assistant with your own hosted LLM (OpenAI or Azure). If you are using the Bullhorn-hosted LLM, you do not need to complete these steps.

Step 1: Gather Details from your LLM Provider

To enable AI Assistant, you’ll need to set up an account with your chosen LLM provider and then enter some details into Bullhorn.

An LLM, or Large Language Model, is a deep learning model that has been pre-trained on vast amounts of data and is used to power generative AI systems such as ChatGPT. Generative AI is a type of AI system, powered by LLMs, that creates new content in response to your questions or instructions, which are referred to as "prompts".

Bullhorn currently supports the following LLM providers:
  • OpenAI
  • Azure OpenAI Services

OpenAI or Azure setup should be handled together with your IT team to avoid unintended charges or other account issues.

AI Assistant does not support LLM Models deployed on Azure Active Directory.

  1. Click the link for your chosen LLM provider below and follow the steps to set up your account and gather the required details.
  2. Once you have completed the steps in the linked document, continue to the next section. If you'd like to confirm the details work before continuing, see the optional sketch below.
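
If you want to confirm that the details you gathered are valid before entering them in Bullhorn, you can make a quick test call outside of Bullhorn. The snippet below is a minimal, optional sketch using the OpenAI Python package (v1.x); the key and model values are placeholders, and this check is not part of the Bullhorn setup itself.

```python
# Optional sketch: verify an OpenAI secret key and model outside of Bullhorn.
# Assumes the openai Python package (v1.x); replace the placeholder key.
from openai import OpenAI

client = OpenAI(api_key="<your OpenAI secret key>")

response = client.chat.completions.create(
    model="gpt-4o-mini",  # or whichever model you plan to use with AI Assistant
    messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
)
print(response.choices[0].message.content)  # a reply confirms the key and model work
```

If this call returns a response, the secret key and model are ready to be entered in Bullhorn; an authentication error usually means the key was copied incorrectly or the account does not have API access set up.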

Step 2: Configure AI Assistant in Bullhorn

  1. In Bullhorn, navigate to Menu > Admin > AI Assistant Admin.
  2. Select your LLM provider from the Select LLM dropdown (OpenAI or Azure).
  3. More fields will appear for you to enter information based on the LLM provider you select. Fill in these fields with the details you gathered in the previous section.
    • OpenAI:

      • The API Key should be the secret key you copied in Step 3 of the OpenAI Instructions.
      • For the Model Name, you can choose between "GPT4", "GPT4o", and "GPT4o mini". We currently recommend "GPT4o mini".
    • Azure:

      • The API Key should be the API key you copied in Step 14 of the Azure OpenAI Instructions.
      • The Model Name should be the Model Deployment Name you entered in Step 9 of the Azure OpenAI Instructions.
      • The Instance Name should be the unique name you entered in Step 4 of the Azure OpenAI Instructions.
      • The API Version should be set to "2023-05-15" (the YYYY-MM-DD format is required). The sketch after these steps shows how these values map onto an Azure OpenAI request.
  4. Once you've filled in the required fields, click the Test button to test the API connection.

  5. Click Save in the bottom right corner.
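
For reference, the sketch below shows how the Azure values entered above correspond to an Azure OpenAI call. It is a minimal illustration using the openai Python package's AzureOpenAI client, not Bullhorn's internal implementation, and every value is a placeholder.

```python
# Sketch only: how the AI Assistant Admin fields map onto an Azure OpenAI request.
# Assumes the openai Python package (v1.x); all values below are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="<your Azure OpenAI API key>",                      # API Key field
    api_version="2023-05-15",                                   # API Version field (YYYY-MM-DD)
    azure_endpoint="https://<instance-name>.openai.azure.com",  # built from the Instance Name field
)

response = client.chat.completions.create(
    model="<model-deployment-name>",  # Model Name field (your Azure model deployment name)
    messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
)
print(response.choices[0].message.content)
```

If a call like this succeeds with your values, the same values should pass the Test button check in AI Assistant Admin.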

FAQ

Which LLM should I choose? Which model is best for me?

We currently support Azure and OpenAI. Selecting the right LLM and model depends on factors such as your number of users, how many requests they make per day, how much data you process (the number and size of resumes/CVs), and the response speed you need.

Does AI Assistant support LLM Models deployed on Azure Active Directory?

No, AI Assistant does not currently support LLM Models deployed on Azure Active Directory.

How do I enable my LLM so I can use AI Assistant?

Each LLM provider has its own instructions for getting started, so we recommend reviewing the OpenAI Quickstart Guide, Create and deploy an Azure OpenAI Service Resource, or other help content from the providers. As part of the enablement process, you will need to provide details from your LLM such as the model, API key, and token limit. A token is a unit of data, such as a word or a punctuation mark, that an AI system processes; each token is roughly four characters of typical English text.
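
If you're unsure how text translates into tokens when thinking about your token limit, a library such as tiktoken can give a rough count. The snippet below is only a sketch; the encoding name is an assumption for GPT-4o-family models, so check your provider's documentation for the encoding that matches your model.

```python
# Rough sketch: estimate how many tokens a piece of text uses.
# The o200k_base encoding is an assumption for GPT-4o-family models.
import tiktoken

encoding = tiktoken.get_encoding("o200k_base")
text = "What is a token?"
print(len(encoding.encode(text)))  # number of tokens the model would process
```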

If I choose OpenAI as my LLM provider, do I need to purchase ChatGPT or an API model?

If using OpenAI, you'll need to purchase API access to a model; a ChatGPT subscription is not required.

If I choose OpenAI as my LLM provider, how do I specify which model I want to use?

This is done in Bullhorn by entering your chosen model in the Model Name field on the AI Assistant Admin page.

If a newer model is released, will I have to buy that newer model, or will it automatically update with the newer version?

You would need to purchase the newer model through your AI provider. Any time you make a model update with your AI provider, make sure you update the details stored in your AI Assistant Admin page accordingly.

What if a new version of the same model is released? Will I have to buy that and make the same configuration updates as a newer model?

Newer versions of models are frequently released, and you'll need to manually update the model deployment configuration with your AI provider to the desired version.