Yes (26%) – "I have suffered from ongoing chronic illness for years without a clear cause. About a year ago, I spent several hours priming ChatGPT with my medical records, tests, labs, and filling in my history and symptoms. It gave me a diagnosis six months before I got it confirmed through a medical test: a rare condition of which I don't even exhibit all the classical symptoms. I research what it tells me and I know it makes errors occasionally, but the suggestions it gives me for medications and tests to ask my doctors about have changed my life. I trust my doctors, but AI also has the ability to look at hundreds of lab results at once and parse together patterns that busy doctors might accidentally overlook."
"I put in some blood test results and it was actually surprisingly helpful in explaining what all the results meant. I didn't upload any of my personal info, I just typed in what the blood test results were as a whole, and asked it to explain what they all meant, together. It was able to give me some good information to ask my doctor about at the next appointment."
No (74%) – "With all the data showing the propensity of AI to “hallucinate” (i.e., make things up), why would we expect to get good advice on our medical questions?"
"The medical experts mentioned have brought up THE most important aspect of this entire venture: protected health information. HIPAA does have electronic protections for instances of sharing from provider to provider, or even publishing to a statewide or interstate health exchange. However, AI sharing is new territory, and HIPAA hasn't caught up to it yet. For users wanting to utilize this feature, I would just say please wait until the law protects YOUR personal health information you intend to share with AI. I don't like to seem paranoid, but ChatGPT and other AI startups are definitely taking advantage of this unprecedented time for their innovation, and by extension, they are also taking advantage of us. That is exactly their intention. Don't let them."
"We've seen the ramifications of people putting too much stock in the advice AI provides. It has quite literally led people to their deaths. Although this tool is supposed to be a supplement, it won't be for many people. Absolutely NOTHING will replace finding a PCP or NP, getting bloodwork done on a regular basis, and being personally invested in your own health. Not only would our data be at risk but this is all just a slippery slope to some low-integrity doctors inputting patient information into a chatbot like this for diagnosis assistance which, again, could prove to be detrimental at best, fatal at worst. It's just not a good idea."
❓ Our question to you: In your opinion, was the fatal shooting of Renee Good by ICE officers a justified use of force?
❓ Our question to you: In general, do you agree with the Trump admin’s changes to the federal dietary guidelines?
❓ Our question to you: In general, do you support or oppose the Trump admin’s new “Donroe Doctrine” regarding US influence in the Western Hemisphere?