Enabling AI Assistant (Customer-Hosted LLM)

This article contains the steps to enable AI Assistant using OpenAI or Azure as your LLM provider. An LLM, or Large Language Model, is a deep learning model that has been pre-trained on vast amounts of data and is used to power generative AI.

AI Assistant automatically creates correspondence by pulling information from the Candidate, Contact, and Job records in Bullhorn ATS. AI Assistant can be used to generate email messages, screening questions, pitches, and much more with the click of a button.

Before you Begin

Before AI Assistant can be enabled, you will need to sign a legal addendum. Reach out to your Bullhorn Account Manager to request AI Assistant and get started. Once the addendum has been signed, Bullhorn Support will contact you to initiate the setup.

The steps in this article are only required if you choose to enable AI Assistant with your own hosted LLM (OpenAI or Azure). If you are using the Bullhorn-hosted LLM, you do not need to complete these steps.

Step 1: Gather Details from your LLM Provider

To enable AI Assistant, you’ll need to set up an account with your chosen LLM provider (OpenAI or Azure OpenAI Services) and then enter some details into Bullhorn.


  • Azure or OpenAI setup should be handled with your IT team to avoid unintentional payments or other account issues.

  • AI Assistant does not support LLM Models deployed on Azure Active Directory.

The details you'll need to enter into Bullhorn depend on your LLM provider:

OpenAI

  • The API Key for your OpenAI instance

  • The Model Name of your LLM

Help resource: OpenAI Developer Quickstart Tutorial

Azure OpenAI

  • The API Key for your Azure OpenAI instance

  • The Model Name of your LLM

  • The Instance Name (the unique name entered when creating your Azure OpenAI instance)

  • The API Version (should be set to "2023-05-15")

Help resource: Create and Deploy an Azure OpenAI Service Resource
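You do not call these endpoints yourself; Bullhorn handles the connection once the details are saved. Still, it can help to see why each provider needs different details. The Python sketch below shows how the values above would map onto each provider's standard chat-completions REST endpoint. All key, instance, and deployment names are placeholders, and the assumption that the Azure deployment name corresponds to the Model Name entered in Bullhorn is an illustration, not documented Bullhorn behavior.

```python
# Sketch: how the details gathered above map onto each provider's
# REST endpoint. All values below are placeholders.

def openai_request(api_key: str, model: str) -> tuple[str, dict]:
    """OpenAI: only an API Key and a Model Name are needed."""
    url = "https://api.openai.com/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # the API Key
        "Content-Type": "application/json",
    }
    # The Model Name is not part of the URL; it goes in the JSON body.
    return url, headers

def azure_request(api_key: str, instance: str, deployment: str,
                  api_version: str = "2023-05-15") -> tuple[str, dict]:
    """Azure OpenAI: the Instance Name and API Version are part of the URL."""
    url = (f"https://{instance}.openai.azure.com/openai"
           f"/deployments/{deployment}/chat/completions"
           f"?api-version={api_version}")
    headers = {
        "api-key": api_key,  # Azure uses an api-key header, not Bearer auth
        "Content-Type": "application/json",
    }
    return url, headers

url, headers = azure_request("PLACEHOLDER_KEY", "my-instance", "my-deployment")
print(url)
```

This is why Azure OpenAI requires two extra details: the Instance Name and API Version are baked into the request URL, whereas OpenAI uses a single fixed endpoint.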

Step 2: Configure AI Assistant in Bullhorn

  1. In Bullhorn, navigate to Menu > Admin > AI Assistant Admin.
  2. Select your LLM provider from the Select LLM dropdown (OpenAI or Azure).
  3. More fields will appear for you to enter information based on the LLM provider you select. Fill in these fields with the details you gathered in the previous section.
  4. Once you've filled in the required fields, click the Test button to test the API connection.
  5. Click Save in the bottom right corner.
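If the Test button reports a failure, it can help to verify your credentials outside of Bullhorn. What the Test button does internally is not documented; the sketch below simply builds a minimal, cheap chat-completions request against the OpenAI endpoint using Python's standard library, with placeholder values.

```python
# Sketch: independently verify OpenAI credentials with a minimal call.
# The key and model below are placeholders.
import json
import urllib.request

def build_test_request(api_key: str, model: str) -> urllib.request.Request:
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,  # keep the test call as cheap as possible
    }).encode()
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=body,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_test_request("PLACEHOLDER_KEY", "gpt-4o")
# To actually send it (requires a real key and network access):
# with urllib.request.urlopen(req, timeout=10) as resp:
#     print(resp.status)  # 200 indicates the key and model are valid
```

A 401 response points to a bad API Key, while a 404 typically points to a wrong model or deployment name, which narrows down which field in the AI Assistant Admin page needs correcting.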

FAQ

Which LLM should I choose? Which model is best for me?

We currently support OpenAI and Azure OpenAI, and we recommend using the latest available model version.

Selecting the right LLM depends on factors like your number of users, the number of requests they might make per day, how much data you have (in terms of number and size of resumes/CVs), and what speed you’re looking for.

Does AI Assistant support LLM Models deployed on Azure Active Directory?

No, AI Assistant does not currently support LLM Models deployed on Azure Active Directory.

How do I enable my LLM so I can use AI Assistant?

Each LLM has its own instructions for getting started, so we recommend reviewing the OpenAI Quickstart Guide, Create and Deploy an Azure OpenAI Service Resource, or other help content from the providers. As part of the enablement process you will need to provide details from your LLM such as the model, API key, and token limit. A token is a unit of data, such as a word or a punctuation mark, that an AI system processes. For example, when you ask AI "What is a token?" AI looks at each word, space, and piece of punctuation to understand and respond to your question. Each token is roughly four characters for typical English text.
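When sizing a token limit, the rule of thumb quoted above (one token is roughly four characters of typical English text) gives a quick estimate. The sketch below applies that heuristic; real tokenizers vary by model, so use your provider's own tokenizer for exact counts.

```python
# Sketch: rough token estimate using the ~4-characters-per-token
# rule of thumb. This is an approximation, not a real tokenizer.

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

resume_excerpt = "Experienced recruiter with ten years of staffing expertise."
print(estimate_tokens(resume_excerpt))
```

Multiplying an estimate like this by the number of resumes/CVs you expect to process per day gives a starting point for the token limit to discuss with your provider.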

If I choose OpenAI as my LLM provider, do I need to purchase ChatGPT or an API model?

If using OpenAI, you'll need to purchase an API model.

If I choose OpenAI as my LLM provider, how do I specify which model I want to use?

This is done in Bullhorn by entering your chosen model in the Model Name field on the AI Assistant Admin page.

If a newer model is released, will I have to buy that newer model, or will it automatically update with the newer version?

You would need to purchase the newer model through your AI provider. Any time you make a model update with your AI provider, make sure you update the details stored in your AI Assistant Admin page accordingly.

What if a new version of the same model is released? Will I have to buy that and make the same configuration updates as a newer model?

Newer versions of models are frequently released, and you'll need to manually update the model deployment configuration in your AI provider to the desired version.

What does it mean if I get an error message when configuring or testing AI Assistant?

For a list of common error messages and solutions, see Troubleshooting AI Assistant.