RADiCAIT emerged from stealth in late 2025 with a $1.7 million pre-seed round and is now raising $5 million to fund clinical trials. The Boston-based spinout from Oxford University was named a Top 20 finalist in TechCrunch Disrupt 2025’s Startup Battlefield. CEO Sean Walsh explains that the startup’s mission is to take “the most constrained, complex, and expensive” imaging method (PET) and replace it with “the most accessible, simple, and affordable” one (CT).
RADiCAIT’s technology, called Insilico PET®, uses a deep generative neural network to predict PET-like images from standard CT scans. The model is trained on paired CT and PET data, learning the statistical patterns that map anatomical CT information to functional PET data. Regent Lee, RADiCAIT’s CMIO and the Oxford professor who led the original research, developed the generative model at the University of Oxford in 2021. In practice, the AI converts one type of biological information into another: anatomical structure from CT into the physiological function seen in PET. Chief Technologist Sina (Sheena) Shahandeh describes the model as linking “disparate physical phenomena” by translating anatomy into activity. During training, the system is instructed to pay special attention to certain tissues and abnormalities in the scans; by repeatedly analyzing many examples, the AI learns which CT-based patterns correspond to clinically important PET signals.
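The paired-training idea described above can be sketched in miniature. The snippet below is purely illustrative, not RADiCAIT’s method (the company has not published its architecture): it fits a tiny NumPy network on synthetic stand-in “CT” patches whose “PET” targets are an assumed nonlinear function of the input, showing how paired examples let a model learn an anatomy-to-function mapping.

```python
import numpy as np

# Illustrative sketch only; RADiCAIT's actual model is proprietary.
# Idea: given paired (CT, PET) examples, fit a network that maps
# "anatomical" input patches to "functional" target patches.

rng = np.random.default_rng(0)

# Synthetic paired data: each row is a flattened 4x4 stand-in CT patch;
# the stand-in PET target is an assumed nonlinear function of the anatomy.
X = rng.normal(size=(256, 16))          # fake "CT" patches
W_true = rng.normal(size=(16, 16))
Y = np.tanh(X @ W_true)                 # fake "PET" patches

# One-hidden-layer network trained by gradient descent on mean squared
# error -- the "statistical patterns" linking the two modalities.
W1 = rng.normal(scale=0.1, size=(16, 32))
W2 = rng.normal(scale=0.1, size=(32, 16))
lr = 0.01

def forward(X):
    H = np.tanh(X @ W1)                 # hidden features
    return H, H @ W2                    # predicted "PET" patch

_, pred0 = forward(X)
loss0 = np.mean((pred0 - Y) ** 2)       # error before training

for _ in range(500):
    H, pred = forward(X)
    grad_out = 2 * (pred - Y) / len(X)  # dLoss/dPred
    W2 -= lr * H.T @ grad_out
    grad_H = (grad_out @ W2.T) * (1 - H ** 2)  # backprop through tanh
    W1 -= lr * X.T @ grad_H

_, pred1 = forward(X)
loss1 = np.mean((pred1 - Y) ** 2)       # error after training
print(f"MSE before: {loss0:.3f}  after: {loss1:.3f}")
```

The real system presumably operates on full volumetric scans with far deeper models, but the training loop above captures the same principle: minimizing the discrepancy between predicted and true functional images over many paired examples.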
The final PET-like image is produced by several coordinated models working together. Shahandeh likens this multi-model approach to DeepMind’s AlphaFold, which predicts protein structures from amino acid sequences: both systems take one type of biological data and convert it into another. RADiCAIT claims its synthetic PET images are mathematically and clinically equivalent to real PET scans. Walsh says the team can “mathematically demonstrate” that the AI-generated images are statistically similar to true PET scans, and that in the company’s reported clinical studies, physicians reached the same diagnostic conclusions whether reviewing chemical PET images or AI-generated ones.
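RADiCAIT has not disclosed which statistics underpin its equivalence claim, but two standard image-similarity measures illustrate the kind of check such a demonstration might involve: peak signal-to-noise ratio (PSNR) and Pearson correlation between a synthetic image and a reference scan. The data here is random stand-in imagery, not PET output.

```python
import numpy as np

# The company's actual equivalence metrics are not public; this sketch
# shows two common ways a synthetic image can be compared against a
# reference: PSNR (pixelwise fidelity) and Pearson correlation
# (linear agreement of intensity patterns).

rng = np.random.default_rng(1)

reference = rng.uniform(0.0, 1.0, size=(64, 64))               # stand-in "true PET"
synthetic = reference + rng.normal(scale=0.02, size=(64, 64))  # close approximation

def psnr(a, b, peak=1.0):
    """Peak signal-to-noise ratio in decibels; higher means closer."""
    mse = np.mean((a - b) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

def pearson(a, b):
    """Pearson correlation of flattened pixel intensities."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

print(f"PSNR: {psnr(reference, synthetic):.1f} dB")
print(f"Pearson r: {pearson(reference, synthetic):.3f}")
```

A clinical equivalence claim would of course go further than pixel statistics, e.g. reader studies comparing diagnostic decisions, which is what the company reports its physician studies address.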
The company highlights several benefits to its AI approach. No radioactive tracer or extra scan is required, since the system works with routine CT images that clinicians already collect. Hospitals need no new hardware; the AI simply upgrades existing CT images into PET-like functional maps. This makes the process fast, safe, and scalable—delivering PET-level insight quickly to far more patients without added radiation exposure or logistical challenges.
RADiCAIT is currently partnering with major health systems such as Mass General Brigham and UCSF Health to validate its technology through clinical pilot programs for lung cancer screening. These pilots test whether AI-generated PET scans can aid early detection and cancer staging as effectively as traditional PET scans. The company is also pursuing FDA approval through formal clinical trials, a process that motivates its current $5 million funding round. Once regulatory clearance is achieved, RADiCAIT plans to launch commercial pilot programs in hospitals and expand its CT-to-PET conversion to other cancers, including colorectal cancer and lymphoma.
Importantly, RADiCAIT emphasizes that its technology is designed to augment—not completely replace—PET imaging. For therapeutic procedures such as radioligand therapy, which rely on real chemical PET tracers, conventional PET scans will remain necessary. However, for diagnostic, staging, and monitoring purposes, AI-generated PET could significantly reduce reliance on traditional PET systems. Walsh points out that global medical imaging infrastructure is already “very constrained,” with limited PET capacity available to meet diagnostic demand. By delivering PET-level insights from CT scans, RADiCAIT aims to absorb much of this diagnostic burden, freeing existing PET scanners for advanced therapeutic applications.
Beyond oncology, the team envisions broader applications for its AI framework. Shahandeh notes that using AI to derive new functional insights from existing data is “broadly applicable” across many scientific domains. The company plans to explore similar AI-driven imaging innovations throughout radiology and other biomedical disciplines. In the long run, RADiCAIT’s methods could bridge gaps between multiple scientific fields—including materials science, chemistry, and physics—by uncovering hidden relationships between different forms of biological and physical data.
In summary, RADiCAIT’s AI-powered approach promises to make advanced medical imaging more affordable, accessible, and efficient. By converting standard CT scans into PET-like images, the company’s technology could spare patients from invasive, radioactive, and time-consuming procedures while providing clinicians with equivalent diagnostic accuracy. If successfully validated, this innovation could reshape cancer detection and monitoring, marking a new frontier in medical imaging.
The technical details and quotes are drawn from company statements, TechCrunch interviews, and materials published by RADiCAIT, Whistlebuzz, and Bitget News.