Artificial intelligence (AI) is rapidly becoming indispensable in radiology. Nowhere is this clearer than in fracture detection, where missed or delayed diagnoses still account for between 3% and 10% of errors on trauma radiographs in busy emergency departments. Over the past five years, research teams have shown that deep-learning algorithms can equal, or even surpass, human readers for many fracture types, provided they are deployed within a structured workflow. Yet most commercial tools stop at answering a single binary question: fracture present or not?
AZmed, the Paris-based MedTech company behind the Rayvolve® AI Suite, is moving the conversation forward. Its flagship solution, AZtrauma, now delivers a first-in-class “datation” capability that estimates whether a fracture is recent or old and displays that insight directly on the system’s secondary capture. The refinement may appear subtle: the label text simply changes from a red FRAC. to a black FRAC. (old). The clinical implications, however, are substantial.
Why radiological age matters
Knowing that a fracture is six hours old versus six weeks old changes everything: the differential diagnosis, the immobilization strategy, the need for computed tomography, and even whether safeguarding authorities must be alerted. In pediatrics, the presence of multiple healed fractures is a red flag for non-accidental injury. In adult orthopedics, an unhealed tibial shaft fracture after three months may prompt bone-stimulating therapy or revision fixation.
Until now, age estimation relied on a radiologist’s subjective reading of callus maturation, cortical remodeling, and soft-tissue swelling. AI in radiology offers a way to classify those subtleties at scale. By integrating age estimation, AZtrauma further bridges the gap between image interpretation and clinical decision-making.
How the datation feature works
During routine processing, AZtrauma analyses each fracture’s edge sharpness, callus density, and cortical continuity (the same cues radiologists use) and assigns a probability score. When the algorithm crosses the “old-fracture” threshold, it writes FRAC. (old) in black text inside the standard white bounding box on the secondary capture; otherwise, the label remains FRAC. in red. The bounding box itself never changes colour, preserving visual consistency.
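The threshold-to-label rule described above can be sketched in a few lines. This is a minimal illustration only: the 0.5 cut-off, the FractureFinding type, and the datation_label helper are assumptions for the sake of the example, not AZmed's actual implementation or API.

```python
# Illustrative sketch of the datation labelling rule described above.
# The threshold value, class, and function names are hypothetical
# assumptions, not AZmed's actual implementation.
from dataclasses import dataclass

OLD_FRACTURE_THRESHOLD = 0.5  # assumed probability cut-off for "old"


@dataclass
class FractureFinding:
    old_probability: float  # algorithm's score that the fracture is old


def datation_label(finding: FractureFinding) -> tuple[str, str]:
    """Return (label text, text colour) for the secondary capture.

    The white bounding box itself never changes; only the label does.
    """
    if finding.old_probability >= OLD_FRACTURE_THRESHOLD:
        return "FRAC. (old)", "black"
    return "FRAC.", "red"


print(datation_label(FractureFinding(old_probability=0.9)))
print(datation_label(FractureFinding(old_probability=0.1)))
```

The key design point mirrors the description in the text: the classification affects only the label string and its colour, so downstream PACS viewers need no changes to display the new information.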

Because the secondary capture is automatically pushed to the picture archiving and communication system (PACS), users see the datation tag without leaving their normal workflow or opening a separate viewer. No additional clicks, no new software windows.
Clinical and medicolegal impact
For trauma teams, AI-assisted fracture detection paired with datation translates into:
- Reduced misattribution, avoiding situations where an older, healing injury is mistaken for an acute event, leading to unnecessary CT or operative fixation.
- Smarter follow-up, flagging patients who require delayed union monitoring and early intervention.
- Pattern recognition in abuse, documenting clusters of healed fractures that may indicate repetitive harm in children or vulnerable adults.
- Audit support, providing time-stamped evidence when injury chronology is disputed in insurance or legal cases.
These benefits align with guidance from the UK’s National Institute for Health and Care Excellence (NICE), which recently endorsed fracture-detection AI as a means to “reduce variation in care” and “prevent further injury between assessment and treatment”.
Positioning AZmed in the AI ecosystem
With the datation upgrade, AZmed reinforces its role as a practical innovator rather than a proof-of-concept vendor. The company already partners with emergency networks such as NHS Trusts and SimonMed Imaging, where AI support has cut reporting turnaround times by up to 30%. Adding radiological age estimation brings AZtrauma closer to a fully fledged digital fracture pathway: triaging, detecting, ageing, and tracking injuries across the continuum of care.
More broadly, the launch exemplifies how AI in radiology is shifting from single-task detectors to multimodal clinical assistants. As algorithmic accuracy plateaus, value will come from context: combining imaging findings with prior exams, electronic health-record flags, and, now, temporal labels. Datation is a small but meaningful step on that trajectory.
Conclusion
Early adopters of AZtrauma’s datation module report minimal learning curves: the colour-coded text is intuitive and integrates with existing reporting templates. For institutions still exploring AI, the feature demonstrates how an incremental change can deliver outsized impact by addressing a real-world pain point shared by radiologists, surgeons, and risk-management teams alike.
In short, AI, fracture detection, and clinical context no longer need to live in separate silos. With datation, AZmed offers a tangible example of how advanced algorithms can enhance, not replace, the nuanced judgement that defines modern radiology.