AI Assistant FAQ
This article contains answers to frequently asked questions about Bullhorn AI Assistant (formerly Copilot).
AI Assistant is an optional feature. To request it, please reach out to your Bullhorn Account Manager.
General Interest
What is Generative AI?
Generative AI is a type of AI system that is powered by Large Language Models (LLMs) and can be used to create new content in response to your questions or instructions, which are referred to as "prompts". The most well-known generative AI system is ChatGPT.
What is an LLM?
An LLM, or Large Language Model, is a deep learning model that has been pre-trained on vast amounts of data and is used to power generative AI.
Which LLM should I choose? Which model is best for me?
We currently support Azure OpenAI and OpenAI. Selecting the right LLM and model depends on factors like your number of users, how many requests they might make per day, how much data you have (in terms of the number and size of resumes/CVs), and the response speed you're looking for.
What is a token? What's an average token length?
A token is a unit of data, such as a word or a punctuation mark, that an AI system processes. For example, when you ask an AI system "What is a token?", it looks at each word, space, and piece of punctuation to understand and respond to your question. Each token is roughly four characters for typical English text. For more information, please see this article: What are tokens and how to count them?
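The four-characters-per-token rule of thumb can be sketched in a few lines of Python. This is only a rough estimate; for exact counts against a specific OpenAI model, use the provider's own tokenizer (for example, the tiktoken library).

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters/token rule of
    thumb for typical English text. Illustrative only; actual token
    counts depend on your provider's tokenizer."""
    return max(1, round(len(text) / 4))

# "What is a token?" is 16 characters -> roughly 4 tokens
print(estimate_tokens("What is a token?"))
```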
Enablement
How do I get started with AI Assistant?
You'll need to choose your LLM and model, speak with your Account Manager to review and sign the legal addendum, and respond to Bullhorn Support when they reach out to enable AI Assistant in your ATS. For additional enablement details, see Enabling AI Assistant.
If you're ready to get started, speak with your Bullhorn Account Manager. Stay tuned to our website, social media, and blogs to stay informed on Bullhorn's AI developments.
How do I enable AI Assistant?
The steps to enable AI Assistant can be found in this article: Enabling AI Assistant.
Before enabling AI Assistant, you'll need to speak to your Account Manager to review and sign a legal addendum.
How do I enable my LLM so I can use AI Assistant?
You can start by reviewing Enabling AI Assistant. Each LLM has its own setup instructions, so we recommend reviewing the OpenAI Quickstart Guide, Create and deploy an Azure OpenAI Service Resource, or other help content from your provider. As part of the enablement process, you'll need to provide details from your LLM, such as the model, API key, and token limit.
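As a rough illustration, these are the kinds of details you'll gather from your provider during enablement. All values below are placeholders, not recommendations or Bullhorn defaults.

```python
# Illustrative only: the LLM details collected during enablement.
# Every value here is a placeholder; use the model, key, and limit
# from your own OpenAI or Azure OpenAI account.
llm_config = {
    "provider": "OpenAI",   # or Azure OpenAI
    "model": "gpt-4o",      # the model you purchased/deployed
    "api_key": "sk-...",    # generated in your provider's dashboard
    "token_limit": 128000,  # your model's context window
}
```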
If I choose OpenAI as my LLM provider, do I need to purchase ChatGPT or an API model?
If using OpenAI, you'll need to purchase an API model.
If I choose OpenAI as my LLM provider, how do I specify which model I want to use?
This is done in Bullhorn by entering your chosen model in the Model Name field on the AI Assistant Admin page.
If a newer model is released, will I have to buy that newer model, or will it automatically update with the newer version?
You would need to purchase the newer model through your AI provider. Any time you make a model update with your AI provider, make sure you update the details stored in your AI Assistant Admin page accordingly.
What if a new version of the same model is released? Will I have to buy that and make the same configuration updates as a newer model?
Newer versions of models are frequently released, and you'll need to manually update the model deployment configuration with your AI provider to the desired version.
Data Security & Privacy
Is data stored in AI Assistant?
No, AI Assistant does not store your data. Only the prompts themselves are stored, not the data sent with them.
Is data always available to the AI provider, or only when I provide access?
Information is only passed to the AI provider when you select a prompt and choose to use additional data points.
Can AI Assistant pull information from LinkedIn?
No, AI Assistant does not have this functionality.
Can I customize what data is available to send to the AI provider?
Partially. You can choose which of the available data points are pushed to your AI provider, but you cannot add any data points beyond those.
Are there any GDPR implications for EMEA customers using AI Assistant?
Please view Bullhorn’s Commitment to the General Data Protection Regulation (GDPR) or speak to your compliance officer.
Does AI Assistant use client data to train AI or machine learning functionality, or in any other way to develop Bullhorn products?
Bullhorn AI Assistant uses a "Bring Your Own Model" implementation, so Bullhorn doesn't host, train, or control the AI model. The model is completely owned and controlled by the customer.
Does AI Assistant share data between customers?
No, since AI Assistant integrates with the customer-owned AI model for each specific customer ATS instance, no data is shared between customers.
Where is the server for our LLM instance housed? Is it with Bullhorn or is it with our AI Provider?
You will need to ask OpenAI or Azure for details on where your LLM instance is housed.
Using AI Assistant on the Candidate Record
How does the AI Assistant work?
- From the AI Assistant card you can select a prompt button or enter a custom prompt.
- In the background, Bullhorn engineers a prompt to return the desired outcome and pushes that prompt to your connected Large Language Model (LLM).
- In a matter of seconds, the LLM returns the output directly to the AI Assistant card.
- Once your output is returned, you can adjust the length, tone, or use a custom action to refine your results.
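The steps above can be sketched as a simple request/response loop. This is a hypothetical illustration; the function, prompt wording, and field names are assumptions, not Bullhorn's actual implementation.

```python
# Hypothetical sketch of the AI Assistant flow; names and prompt
# wording are illustrative, not Bullhorn's actual implementation.
def run_ai_assistant(user_prompt: str, resume_text: str, llm_call) -> str:
    # 1. Bullhorn engineers a full prompt around the user's selection.
    engineered_prompt = (
        "You are a recruiting assistant.\n"
        f"Candidate resume:\n{resume_text}\n\n"
        f"Task: {user_prompt}"
    )
    # 2. The engineered prompt is pushed to the connected LLM, and
    # 3. the LLM's output is returned to the AI Assistant card.
    return llm_call(engineered_prompt)

# Stand-in "LLM" that just echoes the task line, for demonstration:
fake_llm = lambda prompt: prompt.splitlines()[-1]
print(run_ai_assistant("Summarize this candidate", "10 years of Java.", fake_llm))
```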
What information is available to push to the AI provider/AI model?
By default, AI Assistant pushes the candidate's resume data to the AI provider (this is the information contained in the Resume/Description field, not the attached file). In addition to the resume, you can choose to include information from the following fields:
- Categories
- Specialties
- Skills
- Education
- Work History
Note: If a candidate's resume is less than 100 characters, AI Assistant will instead use the Categories, Specialties, Skills, Education, and Work History fields when those fields contain data. If they don't, AI Assistant will not be usable until you add data to them.
Can I customize the available default prompts?
No, AI Assistant does not have this functionality.
I've changed the tone/length of my prompt but I prefer the version I had before. How do I retrieve a previous version of a prompt?
It isn't possible to return to a previous version of a prompt. We recommend copying and pasting any prompts you like into a document or note so that you don't lose them.
Can AI Assistant be limited at a user level?
Yes, access to AI Assistant is controlled through usertype entitlements. To enable or disable access to the card for specific usertypes, please contact Bullhorn Support.
Why am I receiving an error for too much data?
This error means the data selected for the prompt exceeds your model's token limit. Reduce the amount of data selected and try again.
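A rough pre-check against the token limit can be sketched like this, reusing the ~4 characters/token estimate from earlier. This is an illustration only; the actual limit is enforced by your AI provider's tokenizer.

```python
def check_prompt_size(selected_texts: list[str], token_limit: int) -> bool:
    """Rough pre-check (assuming ~4 characters per token) that the
    selected data fits within the model's token limit. Illustrative
    only; the actual limit is enforced by your AI provider."""
    estimated_tokens = sum(len(t) for t in selected_texts) // 4
    return estimated_tokens <= token_limit
```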
I added the AI Assistant card to my Candidate layout. Why doesn't it work?
The AI Assistant Card can be manually added to candidate records at any time, but it will not work until the integration has been properly configured by a Bullhorn Admin.
What can I do if I am seeing unexpected results?
If you're seeing unexpected results, we recommend checking the content of all fields and data points included in your prompt to ensure that these contain the correct information.