I’ve been hesitant to dip my toe into the artificial intelligence (AI) discussion in this forum because so much has been written about it in a short period of time. What can I add at this early juncture? I changed my mind after a healthcare tech colleague recently asked me to test out a platform using AI that helps clinicians access evidence-based guidelines or even determine if a patient needs to have a consult with a specialist. While I’m not a clinician, it’s not unusual for me to test products and services that may help our healthcare provider members in their quest to provide high quality care. Following and responding to the designated steps, the ‘Need a Consult?’ option popped up. Sure, I’ll bite! I clicked on the option and – lo and behold – a real-life specialist appeared on the screen! Wow – this could really be a neat tool. AI can be wonderful! And yet…
I couldn’t help but recall when the Michigan Quality Improvement Consortium (MQIC) offered simple guidelines to primary care physicians on topics like diabetes, pediatric obesity, hypertension, co-morbidities, and behavioral health. Updated every two years by healthcare professionals, the guidelines had a Michigan-focused context and were an extremely valuable tool for healthcare providers, supported by a committee that consisted of Michigan health plans, medical societies, physician organizations and behavioral health organizations.
I’m no Luddite – I’ve always been an early adopter of technology. But I never thought that tech could replace relationships – nor would I want it to. Meetings where guidelines were hashed out proved important to the end-user PCPs, who knew the guidelines considered Michigan’s unique health challenges. Does AI account for that? Will AI offer customized tools?
It’s difficult for me to think about AI and healthcare without considering its potential impact on the solo primary care provider. Might it serve as a much-needed clinical partner for tougher patient scenarios – especially for physicians in remote/rural areas with few physician-as-colleague relationships? Or will it be yet another reason PCPs are compelled to merge with a larger practice, sell to a hospital system or retire early – long before their skills or desire have diminished?
I also bring up the solo PCP because of a recent discussion I had on the topic of physicians and suicide. (September is Suicide Prevention Month.) I wonder how many physician suicides occur among solo PCPs? My own primary care physician brother died by suicide – but we will never know why.
The burden of hiring and firing, managing a medical practice, ensuring payroll gets done (payroll services are often too expensive for small practices), and keeping healthcare records up to date – all while putting patient care front and center – can be overwhelming. It will be interesting to see what AI-driven tools emerge that are targeted to small practices. Or would the pay-off not merit the AI investment, considering that independent practices rarely have excess cash for the latest tech tools?
As I near the end of this column, I see I have offered more woes than wows. But I do have to note another area of concern with AI and healthcare, and that’s provider fraud. How difficult would it be to hack a licensed physician’s DEA, NPI and state pharmacy credentials, set up a pay-for-service website and use an AI-generated “provider” to offer medical advice?
Okay, I’ll stop now. As a non-physician healthcare executive, I look forward to hearing from readers how AI will be positively transformative for primary care and the broader healthcare community. As a patient, I do, too.