Just as you would in a conversation with a stranger, be mindful of what you share when chatting with an AI tool.
Consider: what information is the tool asking me for? What information am I willing to share?
AI tools thrive and grow on data, but you don't have to give yours away needlessly!
When evaluating information from a chatbot, use your critical thinking skills. Did you know that AI tools "hallucinate" and "generate false or nonsensical information"? AI tools "can be very confident liars."
"A New York Times report last year found that the rate of hallucinations for AI systems was about 5% for Meta, up to 8% for Anthropic, 3% for OpenAI, and up to 27% for Google PaLM."
Why do AI tools hallucinate?
"Chatbots 'hallucinate' when they don't have the necessary training data to answer a question, but still generate a response that looks like a fact. Hallucinations can be caused by different factors such as inaccurate or biased training data and overfitting, which is when an algorithm can't make predictions or conclusions from other data than what it was trained on."
Source: Bratton, Laura. "Salesforce says its AI chatbot won't hallucinate – well, probably." Quartz, April 25, 2024. https://qz.com/salesforce-einstein-copilot-hallucinations-1851436568. Accessed October 3, 2024.
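To picture why a lack of training data leads to hallucination, here is a minimal sketch of a toy "chatbot" built as a simple lookup table. All names in it (the questions, answers, and the `toy_chatbot` function) are invented for illustration; real language models are statistical predictors, not lookup tables, but the failure mode described above is similar in spirit: when the model has nothing relevant to draw on, it still produces a confident, fact-shaped answer instead of admitting ignorance.

```python
# Toy "chatbot": answers correctly for questions in its training data,
# but fabricates a confident-sounding answer for anything unseen --
# a crude analogue of hallucination. (Illustrative sketch only.)

TRAINING_DATA = {
    "what is the capital of france": "Paris is the capital of France.",
    "who wrote hamlet": "Hamlet was written by William Shakespeare.",
}

def toy_chatbot(question: str) -> str:
    key = question.lower().rstrip("?")
    if key in TRAINING_DATA:
        return TRAINING_DATA[key]
    # No matching training data -- instead of saying "I don't know",
    # return a fluent, authoritative-looking sentence anyway.
    topic = key.split()[-1] if key else "that"
    return f"The answer to your question about {topic} is well established."

print(toy_chatbot("What is the capital of France?"))    # grounded answer
print(toy_chatbot("What is the capital of Atlantis?"))  # confident fabrication
```

Notice that nothing in the fabricated reply signals uncertainty, which is exactly why critical evaluation of chatbot output matters.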