AI Assistant FAQ
This article contains answers to frequently asked questions about Bullhorn's AI Assistant (formerly Copilot).
AI Assistant is an optional feature. To request it, please reach out to your Bullhorn Account Manager.
General Interest
What is Generative AI?
Generative AI is a type of AI system that is powered by Large Language Models (LLMs) and can be used to create new content in response to your questions or instructions, which are referred to as "prompts". The most well-known generative AI system is ChatGPT.
What is an LLM?
An LLM, or Large Language Model, is a deep learning model that has been pre-trained on vast amounts of data and is used to power generative AI.
What is a token? What's an average token length?
A token is a unit of data, such as a word or a punctuation mark, that an AI system processes. For example, when you ask an AI system "What is a token?", it looks at each word, space, and piece of punctuation to understand and respond to your question. Each token is roughly four characters for typical English text. For more information, please see this article: What are tokens and how to count them?
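The four-characters-per-token rule of thumb above can be sketched as a rough estimator. This is only an approximation; actual token counts depend on the specific model and tokenizer your LLM uses.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters per token
    rule of thumb for typical English text. The LLM's own
    tokenizer produces the authoritative count."""
    return max(1, round(len(text) / 4))

# A short prompt of a few dozen characters works out to
# roughly 15-20 estimated tokens.
prompt = "Summarize this candidate's experience with Java and cloud platforms."
estimated = estimate_tokens(prompt)
```

A quick estimate like this can help you gauge how much data a prompt will consume before sending it.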
How do I get started with AI Assistant?
You'll need to choose your LLM and model, speak with your Account Manager to review and sign the legal addendum, and respond to Bullhorn Support when they reach out to enable AI Assistant in your ATS.
If you're ready to get started, speak with your Bullhorn Account Manager. Stay tuned to our website, social media, and blogs to stay informed on Bullhorn's AI developments.
Using AI Assistant
How does the AI Assistant work?
- From the AI Assistant slideout on a job or candidate record, you can select a prebuilt prompt or enter your own custom prompt.
- In the background, Bullhorn engineers a prompt to return the desired outcome and pushes that prompt to your connected Large Language Model (LLM).
- In a matter of seconds, the LLM returns the output directly to the AI Assistant slideout.
- Once your output is returned, you can send additional instructions and refine the prompt until you are happy with the results.
What information is available to push to the AI provider/AI model?
The data points you choose in the Using field are sent along with the prompt. The available data points depend on the specific prompt being used and can be customized by an Admin in the AI Assistant Studio.
Can I customize the available default prompts?
Admins can customize the available default prompts and create new prompts in the AI Assistant Studio.
I've changed the tone/length of my prompt but I prefer the version I had before. How do I retrieve a previous version of a prompt?
You can scroll up to view previous requests and responses in your current conversation. However, conversations are lost when you exit or reset AI Assistant, so be sure to copy any responses you want to keep beforehand.
Can AI Assistant be limited at a user level?
Yes, access to AI Assistant is controlled through usertype entitlements. To enable or disable access to the card for specific usertypes, please contact Bullhorn Support.
Why am I receiving an error for too much data?
This error means too much data was selected in the prompt. Reduce the amount of data selected and try again.
What can I do if I am seeing unexpected results?
If you're seeing unexpected results, we recommend checking the content of all fields and data points included in your prompt to ensure that these contain the correct information.
Does AI Assistant support formatted text?
The AI Assistant cards support formatted text generated by the LLM, including bold, italics, underline, and bulleted/numbered lists.
Bullhorn-Hosted LLM
Do customers need to sign a new legal addendum to set up AI Assistant with the Bullhorn-hosted LLM?
Yes.
What is the cost of using the Bullhorn-hosted LLM and how is it billed?
The Bullhorn-hosted LLM is included at no additional cost as part of our new ATS packaging. You must purchase the new packaging to get the AI Assistant Studio, AI Assistant Job Card, and Bullhorn-hosted LLM.
What LLM is being used for the Bullhorn-hosted LLM?
Meta Llama 3.2 3B
Are there any token limits per request with the Bullhorn-hosted LLM?
The limit for an individual request (prompt and data) is 16k tokens.
The total token limit (prompt request and prompt response) is 128k tokens per request.
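The per-request limit above can be pre-checked with the rough four-characters-per-token heuristic described earlier in this article. A minimal sketch, assuming that heuristic (the LLM's own tokenizer makes the final determination, and the function and constant names here are illustrative):

```python
REQUEST_TOKEN_LIMIT = 16_000  # per-request limit (prompt and data)

def fits_request_limit(prompt: str, data: str, chars_per_token: int = 4) -> bool:
    """Rough pre-check of whether a prompt plus its selected data
    points stays under the per-request token limit. Uses the
    ~4 characters per token rule of thumb, so treat the result
    as an estimate, not a guarantee."""
    estimated_tokens = (len(prompt) + len(data)) // chars_per_token
    return estimated_tokens <= REQUEST_TOKEN_LIMIT
```

If a check like this comes back False, reducing the amount of selected data before sending the request can help you avoid the "too much data" error described above.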
Are there any request limits with the Bullhorn-hosted LLM?
Prompt requests are limited to 3,000 per minute. This is the current rate limit on our API request service; there is currently no limit on the total number of requests that can be made.
As a single customer using the Bullhorn-hosted LLM, do I have the ability to train the model to my specific use cases, not considering any other customer data?
No, responses will only improve when we upgrade the underlying model to newer versions that have better response algorithms and updated training.
When was the cutoff date for the Bullhorn-hosted LLM Training?
December 1, 2023
Does the Bullhorn-hosted LLM use the internet to source its responses?
No, the Bullhorn-hosted LLM does not source information from the internet.
Customer-Hosted LLM
How do I enable AI Assistant with my own hosted LLM?
Enterprise edition customers may choose to enable AI Assistant with their own hosted LLM (through OpenAI or Azure) instead of using Bullhorn’s LLM. The steps to enable AI Assistant with a customer-hosted LLM can be found in this article: Enabling AI Assistant (Customer-Hosted LLM).
Before enabling AI Assistant, you'll need to speak to your Account Manager to review and sign a legal addendum.
Which LLM should I choose? Which model is best for me?
We currently support Azure and OpenAI. Selecting the right LLM depends on factors like your number of users, the number of requests they might make per day, how much data you have (in terms of number and size of resumes/CVs), and what speed you’re looking for.
Does AI Assistant support LLM Models deployed on Azure Active Directory?
No, AI Assistant does not currently support LLM Models deployed on Azure Active Directory.
How do I enable my LLM so I can use AI Assistant?
You can start by reviewing Enabling AI Assistant (Customer-Hosted LLM).
Each LLM has its own instructions for getting started, so we recommend reviewing the OpenAI Quickstart Guide or Create and deploy an Azure OpenAI Service Resource or other help content from the providers. As part of the enablement process you will need to provide details from your LLM such as the model, API key, and token limit.
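Before entering the details from your LLM in Bullhorn, it can help to sanity-check that you have all of them on hand. A minimal sketch of such a check, assuming the three details named above (the class and field names here are hypothetical, not part of Bullhorn or any provider's API):

```python
from dataclasses import dataclass

@dataclass
class LLMConfig:
    """Illustrative container for the details the enablement
    process asks for. Field names are hypothetical."""
    model_name: str   # the model/deployment name from your provider
    api_key: str      # the API key issued by OpenAI or Azure
    token_limit: int  # the model's token limit

    def validate(self) -> list:
        """Return a list of obvious problems, empty if none found."""
        problems = []
        if not self.model_name:
            problems.append("model name is missing")
        if not self.api_key:
            problems.append("API key is missing")
        if self.token_limit <= 0:
            problems.append("token limit must be a positive number")
        return problems
```

Catching a missing or malformed value this way is cheaper than debugging a failed enablement after the details have been submitted.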
If I choose OpenAI as my LLM provider, do I need to purchase ChatGPT or an API model?
If using OpenAI, you'll need to purchase an API model.
If I choose OpenAI as my LLM provider, how do I specify which model I want to use?
This is done in Bullhorn by entering your chosen model in the Model Name field on the AI Assistant Admin page.
If a newer model is released, will I have to buy that newer model, or will it automatically update with the newer version?
You would need to purchase the newer model through your AI provider. Any time you make a model update with your AI provider, make sure you update the details stored in your AI Assistant Admin page accordingly.
What if a new version of the same model is released? Will I have to buy that and make the same configuration updates as a newer model?
Newer versions of models are frequently released, and you'll need to manually update the model deploy configuration in your AI provider to the desired version.
Data Security & Privacy
Is data stored in AI Assistant?
No, AI Assistant does not store the data sent with your requests. Only the prompts themselves are stored, not the data used with them.
Is data always available to the AI provider, or only when I provide access?
Information is only passed to the AI provider when you select a prompt and choose to use additional data points.
Can AI Assistant pull information from LinkedIn?
No, AI Assistant does not have this functionality.
Can I customize what data is available to send to the AI provider?
No, AI Assistant does not have this functionality. You can choose which of the available data points are pushed to your AI provider, but you cannot add any data beyond those.
Are there any GDPR implications for EMEA customers using AI Assistant?
Please view Bullhorn’s Commitment to the General Data Protection Regulation (GDPR) or speak to your compliance officer.
Does AI Assistant use client data to train AI or machine learning functionality, or in any other way to develop Bullhorn products?
No. The Bullhorn-hosted LLM was not trained with Bullhorn data (customer or internal). If you are using a customer-hosted LLM, you completely own and control the model.
Does AI Assistant share data between customers?
No, data is not shared between customers for either the Bullhorn-hosted LLM or a customer-hosted LLM.
Where is the server for our LLM instance housed? Is it with Bullhorn or is it with our AI Provider?
If using Bullhorn’s LLM, where your LLM instance is housed will depend on your data center. For help identifying your data center you can contact Support or your Account Manager.
If using a customer-hosted LLM, you will need to ask OpenAI or Azure for details on where your LLM instance is housed.