Dr Aiden: The Dentaljuce AI Tutor
Understanding AI Accuracy and Hallucinations
Artificial Intelligence systems like Dr Aiden, the Dentaljuce AI Tutor, are powerful tools for learning and revision.
They are based on “Large Language Models” trained on very large collections of text. These systems generate
answers by recognising patterns in that data, not by thinking like a human or having clinical experience.
Modern AI models are considerably more capable and less prone to obvious errors than early versions, but they are still not perfectly accurate or authoritative. They can support your learning, help you explore topics, and suggest ways of thinking about clinical problems — but they must never replace professional judgement, current evidence, or established UK standards and guidance.
AI is limited to its training data
AI systems generate responses from the information they have been trained on and any additional material they are given at the time. General information on the internet is highly variable in quality. Not all of it is written by dental professionals or scientists, and it may be incomplete, out of date, or not aligned with UK best practice.
Where possible, Dr Aiden is guided by trusted sources such as Dentaljuce teaching material and recognised UK standards and guidance. However, no AI system can guarantee that every answer precisely reflects the latest evidence or official recommendations.
For this reason, any important or practice-changing information generated by AI must always be checked against trusted, professional sources — for example current UK guidance, original research papers, or official documents from recognised bodies — particularly when it relates to diagnosis, treatment planning, prescribing, or medico-legal issues.
AI can ‘hallucinate’ information
Sometimes, AI may generate information that sounds confident and plausible but is partially or completely wrong. This phenomenon is called a ‘hallucination’. It happens when the AI fills in gaps based on patterns it has learned, rather than on specific facts it has reliably encountered.
Hallucinations are more likely when:
- The question is very niche, unusual, or poorly defined.
- There is limited or conflicting information in the underlying data.
- The AI is asked for exact details such as drug doses, legal wording, or guideline paragraphs.
Even with modern, improved models, you should not assume that a detailed or confident answer is necessarily correct. Always be prepared to cross-check important content with trusted sources.
AI has never picked up a drill
AI lacks practical experience. It has never carried out an examination, taken a history in person, picked up a handpiece, or managed an anxious patient in the chair. It cannot feel caries, assess the resistance form of a preparation, or judge the subtle clinical cues that experienced dentists use every day.
What AI can do is help with the theory around those practical decisions: explaining concepts, outlining options, highlighting factors to consider, or generating revision questions and clinical scenarios to test your understanding.
It cannot tell you what to do for a specific patient in a specific situation. Clinical decisions must always be made by a qualified professional who has examined the patient and considered their overall circumstances.
AI systems do not understand information as humans do
AI can process, combine, and reproduce large amounts of information, and it can often appear to reason. However, it does not truly understand dentistry, patients, ethics, or risk in the way a trained clinician does. It has no personal experience, no professional responsibility, and no awareness of the consequences of its suggestions.
AI does not have a worldview, common sense, or clinical intuition. It does not know your patient’s medical history, social situation, consent status, or preferences unless you explicitly tell it (and even then it cannot assume responsibility for those details).
Therefore, while Dr Aiden can be a very helpful assistant for learning, discussion, and thinking around a topic, any information he provides should be treated as a prompt for your own critical thinking, not as a final answer.
Using Dr Aiden safely and effectively
To get the most benefit from Dr Aiden while staying within safe and professional boundaries:
- Use AI to explain concepts, summarise topics, generate revision notes, and create practice questions.
- Ask it to outline options or considerations, then check those against current guidance and your own judgement.
- Do not use AI as the primary source for drug doses, prescriptions, or medico-legal wording — always refer to authoritative, up-to-date references.
- Never rely on AI alone for diagnosis, treatment planning, or decisions about individual patients.
AI is a supplement to, not a replacement for, professional judgement and expertise. You remain responsible for any decisions you make and any care you provide.
