AI shows moderate accuracy for canine urothelial carcinoma on X-rays: full analysis

A newly published study in Veterinary Radiology & Ultrasound takes a focused look at whether AI can identify canine urothelial carcinoma from abdominal radiographs, and the answer is: sometimes, but not well enough to rely on alone. Investigators reported 68% accuracy, with 69% sensitivity and 67% specificity, after training a convolutional neural network on radiographs from dogs with and without confirmed urothelial carcinoma. The model was better at recognizing more advanced, mineralized disease than less conspicuous presentations. (acvr-website.s3.amazonaws.com)

That matters because urothelial carcinoma can be difficult to diagnose early, and abdominal radiography is widely available in general practice even when advanced imaging or specialty review isn't. According to the study abstract, dogs with histologically confirmed urothelial carcinoma and ultrasound changes were used in the disease cohorts, while comparison dogs had no clinical suspicion of urinary neoplasia and no ultrasound findings consistent with urothelial carcinoma. The training set included 500 urothelial carcinoma studies and 500 non-cancer studies, followed by validation on 185 urothelial carcinoma cases and 180 non-cancer cases. (acvr-website.s3.amazonaws.com)
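The headline accuracy can be sanity-checked against the abstract's sensitivity, specificity, and validation cohort sizes. A minimal sketch, assuming the percentages round to whole case counts (the abstract reports only percentages, so the exact counts are an inference):

```python
# Sanity-check the reported 68% accuracy from the abstract's sensitivity,
# specificity, and validation cohort sizes.
sensitivity = 0.69   # true-positive rate on the urothelial carcinoma cases
specificity = 0.67   # true-negative rate on the non-cancer cases
n_pos, n_neg = 185, 180  # validation cohort sizes from the abstract

true_pos = round(sensitivity * n_pos)   # cancers correctly flagged (assumed count)
true_neg = round(specificity * n_neg)   # non-cancer studies correctly cleared (assumed count)
accuracy = (true_pos + true_neg) / (n_pos + n_neg)

print(f"{accuracy:.0%}")  # prints 68%, consistent with the reported figure
```

The check also illustrates why accuracy alone is an incomplete summary: with near-balanced cohorts it sits almost exactly between sensitivity and specificity, which is not the case in real practice populations where disease prevalence is far lower.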

The study's more nuanced finding is that performance wasn't uniform across case types. Sensitivity was higher in more severe tumors, specifically grade 4 disease that was diffuse, invaded muscle, and showed mineralization. By contrast, medial iliac lymphadenomegaly and ureteral obstruction did not improve AI sensitivity. The authors concluded that a well-trained CNN may have potential for identifying severe urothelial carcinoma on abdominal radiographs, but that more work is needed before the approach is clinically applicable. (acvr-website.s3.amazonaws.com)

This paper lands in a veterinary AI environment that is moving quickly, but still wrestling with external validation. A 2026 pilot study in JAVMA evaluated six commercial veterinary radiology AI platforms using general practice-sourced canine abdominal radiographs with confirmed diagnoses. Across 307 usable evaluations, performance was variable and mostly low to moderate, with frequent missed findings and low sensitivity for radiographic labels. The authors concluded that even the best-performing algorithm had notable limitations and that none of the platforms appeared suitable for clinical use in their current form. (researchportal.murdoch.edu.au)

There are also signs that performance depends heavily on task definition. In the ACVR 2023 proceedings, another abstract comparing commercial AI software with board-certified veterinary radiologists framed AI as a tool that could expand access and reduce some human variability. The broader literature, however, still points to uneven results depending on species, study design, case complexity, and whether a model is tested internally or on real-world outside cases. That pattern is consistent with the urothelial carcinoma study: a narrowly trained model can show signal, especially in advanced disease, but moderate accuracy is a long way from dependable screening or diagnosis. (acvr-website.s3.amazonaws.com)

Why it matters: For veterinarians, the practical takeaway is restraint. If AI flags a mineralized bladder-region lesion in a dog with compatible lower urinary tract signs, that may support urgency and next-step planning. But a negative AI read shouldn't lower suspicion when the history, exam, or imaging pattern still points toward urothelial carcinoma. In day-to-day practice, this kind of tool may be most useful as a triage aid or second reader, especially where radiology access is limited, while definitive workups still depend on clinical context and follow-up diagnostics. (researchportal.murdoch.edu.au)

The study also highlights a familiar issue in veterinary AI: models often look better when disease is advanced and imaging findings are more obvious. That can limit value in the exact cases where clinicians most need support, namely early or equivocal presentations. For pet parents, that means AI-assisted radiograph interpretation may become one more layer of decision support, but it doesn't replace a veterinarian's judgment, confirmatory imaging, or pathology. (acvr-website.s3.amazonaws.com)

What to watch: The next important signals will be prospective studies, external validation outside the training institution, and head-to-head comparisons with radiologists and commercial platforms in earlier-stage disease. If future work can improve sensitivity without sacrificing specificity, especially in non-mineralized or less advanced tumors, AI could become more relevant in first-line practice. For now, the evidence supports cautious integration, not clinical substitution. (researchportal.murdoch.edu.au)
