This story was originally published on MedTech Dive. For daily news and insights, subscribe to the free MedTech Dive newsletter.
Chip designer Nvidia plays a central role as more and more medical device companies integrate artificial intelligence. The company has partnered with leading medtech firms including Medtronic, Johnson & Johnson, GE Healthcare and Philips.
These partnerships span the breadth of the technology. In imaging, Nvidia is collaborating with GE Healthcare on autonomous X-ray and ultrasound solutions and is working with Philips to develop foundation models (models that can be applied to a variety of tasks) for MRI machines. In robotics, J&J's Monarch bronchoscopy platform runs on Nvidia's computing platform. Nvidia also debuted a new developer framework for medical robotics in March, called Isaac for Healthcare. The system spans three computers: one that generates synthetic data to simulate workflows, one that creates a virtual environment where robots can safely learn skills, and a platform for deploying applications and processing sensor data in real time.
MedTech Dive spoke with David Niewolny, director of business development for healthcare and medtech at Nvidia, about the company's partnerships and the future of AI in medical devices.
This interview has been edited for length and clarity.
David Niewolny: Over the past 18 months, we have seen healthcare and medtech adopt generative AI very quickly. It started with the idea of creation: drafting clinical notes, generating synthetic data for training. Now, device makers are looking at how to bring agentic AI into these applications. You are seeing digital agents act as assistants to healthcare providers or patients, with automated workflows that give clinicians more contextual support.
Then you get to where we see the future. Where a lot of the AI and innovation is happening is the idea of physical AI. That brings us to robotics. The easy one to think of is surgical AI, but there are many more applications. One specific use case I talked about in Taiwan (at GTC Taipei) is operational robotics. Think about nurse assistants delivering medications, bringing supplies around the hospital, and keeping track of inventory in different areas of the hospital.
That is a complete change in how you will think about doing medical imaging in the future. We took two initial use cases, one of which is the idea of autonomous X-ray. Think of a future world where you no longer have an X-ray technician. Instead, a digital agent essentially checks you in. You walk into a room with a robot, possibly paired with a digital agent, that gives you all the guidance: when to stand, when to hold your breath, how to position yourself. You stand in one position, and the machine positions itself.
You can also layer on generative AI applications that give doctors a complete report of the clinical findings.
In this particular case, you are expanding access to care, because you essentially have fully autonomous systems performing medical imaging.
X-ray is one we announced, and the other is ultrasound. In each case, GE Healthcare is working with us on the methods and tools for building these robotic systems.
It's all about building an ecosystem. You could look at all the great work we could do with AI and robotics by going from Nvidia directly to healthcare providers, but that is too big a lift. They don't have the internal developers to start building this.
So you work backwards through that ecosystem, and you realize it's medtech companies that are building all these devices and solutions. What we are looking at is: how do we bring together all the components of our ecosystem and build on a common platform? Computers were not mainstream until Windows opened the door for an influx of software.
We have taken a lot of learnings from another industry: automotive. Essentially, you create a fully simulated environment, train on the simulated data, and then take those algorithms and move them to the edge.
We took that knowledge and asked, "What other areas are mature?" Medtech in particular stands to benefit most from the opportunity to have a common platform. But at the same time, it faces a big obstacle: as we have seen, no single company in this ecosystem can build that platform on its own.
Agentic AI has a large number of applications. Our partner Abridge uses many of our technologies for clinical documentation. They are integrating with major EHRs and continuing to gain more and more hospital users.
You also have agents that actually work with the surgeon: now you have an assistant in the room that you can ask questions, that can pull up the patient's medical records, or adjust certain devices in the room.
One of our partners, Moon Surgical, actually loaded its full instructions for use into an agent that a doctor or surgeon can query for things like setup. Instead of consulting a 1,000-page manual, you can just ask where the robot should be set up, how to set it up, and what the best practices are.
Yes, people are really worried. The point is that in almost every one of these cases, the technology is augmenting existing team members. There is a staffing shortage. This is actually improving care, not the narrative about taking people's jobs. We are taking staff members who are overburdened with busywork or routine workloads and augmenting them.
There will always be a surgeon there, but we can automate subtasks and actually make certain tasks easier for surgeons.