Discussion
Prof. Emily M. Bender(she/her)
@emilymbender@dair-community.social  ·  activity timestamp 2 weeks ago

Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit? Us, too. And we have **thoughts**

https://buttondown.com/maiht3k/archive/why-you-should-refuse-to-let-your-doctor-record/

Why you should refuse to let your doctor record you

By: Emily M. Bender and Decca Muldowney. At a recent appointment, Emily’s physical therapist (who knows some about her research) said, “Before we get started,...
Mensch, Marina
@energisch_@troet.cafe replied  ·  activity timestamp 5 days ago

@emilymbender I would have thoughts, too. My privacy. There is never any guarantee, but with A.I. involved, privacy ends up in shreds, guaranteed.

Katherine W
@FiddleSix@zeroes.ca replied  ·  activity timestamp last week

@emilymbender

Article on the unsanctioned use of AI in medical settings:
https://www.medpagetoday.com/opinion/second-opinions/119842?xid=nl_secondopinion

Opinion | Doctors Are Quietly Using AI. Bans Won't Fix That.

Regulation must go beyond formally sanctioned tools
Alex White
@alextheuxguy@fosstodon.org replied  ·  activity timestamp last week

@emilymbender I always want to say no, but there’s an unspoken social pressure to just agree, especially when you’re lucky to have 30 seconds of time with your doctor.

Janet Grootebroeder
@JanetGrBr@mastodon.social replied  ·  activity timestamp last week

@emilymbender No, I was told at the end of the consultation that everything was recorded and that 'it' was making a report now.

Nodami
@nodami@hcommons.social replied  ·  activity timestamp last week

@emilymbender
After reading your post I had to watch the Ortho Artificial Intelligence short video by DGlaucomflecken again 😅🤣 https://youtu.be/WgnWgIOer6s?is=62_p13To_penRg-S

Ortho Artificial Intelligence
chrisradonc
@chrisradonc@social.vivaldi.net replied  ·  activity timestamp 2 weeks ago

@emilymbender
Got round to reading this article and it's provided food for thought.

Any medical practitioner will be enticed by a solution that appears to reduce workload, particularly if it reduces the interaction with crappy electronic record systems like Powerchart (owned by Larry Ellison's Oracle).

For me, dictating a clinic letter at the end of a specialist consultation is an opportunity to take stock of all the important features, sift through the information and highlight the points of importance for other practitioners involved (including future me).

I'm not sure AI will do that properly, even if it understands my accent.

WideEyedCurious 🇺🇸 💙 🇺🇦 & 🇨🇦
@WideEyedCurious@mstdn.social replied  ·  activity timestamp 2 weeks ago

@emilymbender I always refuse.

Mike Dolbow
@mmdolbow@mapstodon.space replied  ·  activity timestamp 2 weeks ago

@emilymbender yuuuuup... dental office. I politely declined and they were cool with it. But will it be that way next time?

saxnot @ 39C3
@saxnot@chaos.social replied  ·  activity timestamp 2 weeks ago

@emilymbender ah yeah I remember the diagnosed gender dysphoria situation, because some machine learning program heard "male man" when a woman described herself as a mailman

Noortje Van Leeuwen
@Noortjevee@mstdn.social replied  ·  activity timestamp 2 weeks ago

@emilymbender Luckily not been asked that

June BlueSpruce
@jbluespruce@mstdn.social replied  ·  activity timestamp 2 weeks ago

@emilymbender Excellent post. I worked for many years in healthcare. I know firsthand the incredible pressures on providers to find the time they need to give high-quality care while completing all of their administrative tasks. So I get why these AI tools are attractive. I have consented to have providers use them in my care. But I won’t any longer. The problems you describe are serious & potentially dangerous. I appreciate the perspective that documenting is part of care.

Solarbird :flag_cascadia:
@moira@mastodon.murkworks.net replied  ·  activity timestamp 2 weeks ago

@emilymbender Yes, and I certainly decline. Fortunately, I have a good relationship with my GP, so it hasn't been an issue so far.

Maria Langer | 📝💎🌵🛥️
@mlanger@mastodon.world replied  ·  activity timestamp 2 weeks ago

@emilymbender @DevlinLeathercraft The orthopedic surgeon who will be taking care of my trigger thumb asked to record our last session. I can't remember whether I asked him if an AI was going to transcribe it, but I will next time.

Sam Clemente
@countablenewt@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender These kinds of scribes have been commonplace in medical scenarios for a long time at this point

Katherine W
@FiddleSix@zeroes.ca replied  ·  activity timestamp 2 weeks ago

@emilymbender

1. At my last vet visit, there was a small typed notice across the exam room from the person/animal seating area that said AI is now being used by the practice for all visits, and to assume that if staff are in the room, recording is happening.

2. At my last primary provider visit, I asked the medical assistant if AI was being used, and if she could opt me out. She agreed.
When the PA came into the exam room, her first words were that I needed to prioritize my questions/issues, as she would only be able to deal with two, since she would have to manually chart the whole visit.
(I had come hoping for a prescription refill and two referrals for specialist care.)

roboticus lastius (not a bot)
@lastrobot@writing.exchange replied  ·  activity timestamp 2 weeks ago

@emilymbender Agree. GPs are in short supply now; saying no to this means being viewed as difficult, maybe even ejected from the patient roster. So you can't really say no.
Also, in two visits where reports were prepped by specialists, there were errors from AI transcription mishearing that I think a human would not have made (age cited quite differently in different paragraphs, an operation claimed as had which was spoken as DID NOT have, etc.). Correction required my time, effort, and Dr disfavor 🫤

Thomas Svensson 🖖
@tsvenson@mastodon.online replied  ·  activity timestamp 2 weeks ago

@emilymbender

Yes, and of course said no.

But then I discovered they had used AI transcription when adding notes to my journal after the meetings, as it was full of obvious errors. So I needed to lecture them again about my right to have it kept off my medical record.

What makes this even worse is that they all know how badly it works: complaints from the medical community about horrific errors, as well as the inefficiency this overhyped piece of crap creates, are frequently reported in the media.

Workshopshed
@Workshopshed@mastodon.scot replied  ·  activity timestamp 2 weeks ago

@emilymbender no but yesterday, I did see a poster mentioning it at a local clinic

Theodora Ward
@theodoraward@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender thankfully my therapist was like "yeah dude don't worry about it it's weird" but i still get an email alongside every 'upcoming appointment' email reminding me to sign the permission form

Elric
@elricofmelnibone@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender
One of my first jobs was providing tech support to doctors in a hospital setting. They were some of the most tech-illiterate folks I've ever encountered. They have no concept of operational security.

No doctor has ever asked me for permission to store any information about me in whatever systems they're using. For all I know they store it in plain text on an insecure S3 bucket.

3Jane Tessier Ashpool
@3janeTA@beige.party replied  ·  activity timestamp 2 weeks ago

@emilymbender no but in the agreement they ask us to sign periodically it said that they might use AI. So I said I wasn’t signing if they were going to. They asked the doc and she said no I don’t use AI transcription at all and I didn’t know that was in there!

audrina
@audrinabell@mstdn.social replied  ·  activity timestamp 2 weeks ago

@emilymbender
I have-- and refused!

LPhilpott
@LPhilpott@mastodonapp.uk replied  ·  activity timestamp 2 weeks ago

@emilymbender It turns out some doctors, too, are noticing negative side effects of using AI for taking notes.

https://benngooch.substack.com/p/i-was-an-enthusiastic-early-adopter

I Was an Enthusiastic Early Adopter of AI Scribes. Here’s Why I Stopped

A GP reflects on what eighteen months of ambient scribing taught them about the consultation they thought they already understood.
slotos
@slotos@toot.community replied  ·  activity timestamp 2 weeks ago

@emilymbender Funnily enough, transcribing can preserve privacy without issues. Whisper.cpp runs decently well on phones, and can be run on servers that process patient records under the same security constraints. Could easily be run locally even.

Problem is, that’s extremely hard to prove in the current „just slap a gear on it and call it steampunk” climate. I would definitely not trust a random provider.

And if they do „summarization”, forget privacy.

cynthia rose is desirable
@cynthiarose@sfba.social replied  ·  activity timestamp 2 weeks ago

@emilymbender psychiatry did it without informed consent. I am livid

Alan is @cogdog
@cogdog@cosocial.ca replied  ·  activity timestamp 2 weeks ago

@emilymbender for a doctor's perspective on the more profound side effects of “efficiency”

https://benngooch.substack.com/p/i-was-an-enthusiastic-early-adopter

“I felt myself becoming a passive observer in encounters where I had previously been an active architect. I felt my clinical memory, my narrative identity, and my sense of connection to my patients beginning to erode at the edges.”

I Was an Enthusiastic Early Adopter of AI Scribes. Here’s Why I Stopped

A GP reflects on what eighteen months of ambient scribing taught them about the consultation they thought they already understood.
Paco Ho Ho Hope 🎄
@paco@infosec.exchange replied  ·  activity timestamp 2 weeks ago

@emilymbender Really helpful. Thanks for sharing.

I often reason about it this way. There are very few things like this where, if you opt out this time, you can’t opt in next time. On the other hand, there are LOTS of situations, this being a good example, where opting out after you opted in is substantially more effort (or even impossible).

Opting out by default is usually a safe thing to do. You can always opt in later if you change your mind.

Mel
@M3L155A@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender
Evidence shows litigation decreases if Drs have scribes. A Dr isn’t allowed to rely on memory in their defence. It’s said “If it is not documented, it didn’t happen” even if it did happen & recall can be verified.

The direct effect:
1: more litigation = more insurance cost for the Dr & thus higher consult fees.
2: Drs who have psychological & emotional injury from spurious claims reduce/stop practice.

So there is high motivation for having a scribe.

Eric deRuiter
@ridogi@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender I opted out at my physical therapist last week. They told me all their patients who work in tech have opted out.

your auntifa liza 🇵🇷 🦛 🦦
@blogdiva@mastodon.social replied  ·  activity timestamp 2 weeks ago

yes i declined. when she asked why, i said because

1. the companies used sites with CSAM and other abuses

2. it’s spyware. each prompt acts like a honey-pot. since you are giving them the info, it by-passes HIPAA. in turn they get to use and sell that info however they please

3. as an antifascist activist, it puts my life in danger by giving companies run by fascists access to my whole medical history.

my MD was shocked. they had no idea about the spyware angle or CSAM

@emilymbender

DB
@lawyersgunsnmoney@mstdn.social replied  ·  activity timestamp 2 weeks ago

@emilymbender Yes, I was asked to sign a consent (stuck in with the other standard consents) authorizing the doctor’s practice to use an AI scribe. I left the room, went up to the front desk and told them I would not sign the consent under any circumstances. They looked a little surprised, but agreed to have one of the techs act as a scribe as normal. Glad I stood my ground - there is no way in Hell I would let a Doc use AI for anything medical related

Cleo
@Gh0stlyM0use@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender I'm fortunate my GP doesn't even trust the national health database.

Ertain
@Ertain@mast.linuxgamecast.com replied  ·  activity timestamp 2 weeks ago

@emilymbender Usually I ask my doctor to turn that off.

Sarah Holstein
@DrSarahHolstein@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender
Completely agree! I have started having conversations with my patients about the fact that I will never use this technology and to warn them that some institutions/practices have decided they don’t even need patient consent. Aside from all of the privacy issues, it is literally a major part of my job to create detailed, accurate and nuanced notes.

Erik Nelson
@nerpulus@mastodon.online replied  ·  activity timestamp 2 weeks ago

@emilymbender I tried to subscribe to that newsletter and it said "this e-mail address can not be subscribed" Why does it not like my email address?

P_______X
@P__X@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender I do share a lot of AI skepticism, but from a physician's perspective (I use it in about 25-30% of visits), there are many highly speculative aspects of this take:

🧵 1/2
1) Point #1 is valid, however, the same data safety questions can be asked regarding other integrated systems. Like where is your EMR data stored, how does your radiology data integrate (reviewed in 3rd party software), etc.
2) Consent: valid concern, but the fullest version would be a long EULA-like text with a checkbox...

Prof. Emily M. Bender(she/her)
@emilymbender@dair-community.social replied  ·  activity timestamp 2 weeks ago

@P__X Your experience is your experience, but I am **appalled** at what you're saying about consent here. The fullest version would be too long, so we're not actually doing informed consent? No thank you.

P_______X
@P__X@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender "The fullest version would be too long, so we're not actually doing informed consent?"

No, that is not what is being said there. Unlike a blog post, I am restricted in space. I explicitly said that it is a valid concern. A basic research consent form is 8+ pages of legalese, and I'm afraid the future solution will be to add a checkbox for 30 pages of text at check-in that nobody reads and that doesn't actually inform better. And again, my point #1.

Prof. Emily M. Bender(she/her)
@emilymbender@dair-community.social replied  ·  activity timestamp 2 weeks ago

@P__X You are not restricted in space -- you wrote a whole thread.

My point is: if patients do not know what they are consenting to, it is not consent. If it is not possible in the context of the visit to convey the detail, then we shouldn't do the thing.

I encourage you to read the rest of the replies to my post, including the quotes, to see the lack of consent and how that is landing.

P_______X
@P__X@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender

Frankly, I'm surprised & **disappointed** by your eagerness to jump to conclusions and make biased inferences. Eg: "an AI scribe will change how physicians speak", but *character limits don't impact how ppl write here*. That sets how seriously I should take this.

My inference: you've had minimal input from actual providers familiar w/ the tech (point #4 and 7 were dead giveaways) or who spent >10,000 hours writing notes (even #9 seems to be from a non-provider).

No thank you.

Fantômas
@2Bfair@infosec.exchange replied  ·  activity timestamp 2 weeks ago

@emilymbender Agreed. For any points that were valid, none of them necessitate the use of LLMs. Never mind without consent. Disgusting.

twifkak
@twifkak@mas.to replied  ·  activity timestamp 2 weeks ago

@emilymbender Thanks for the write-up! How do you feel about human scribes? I've been saying "yes to human scribes, no to AI scribes" for a while now, but your list makes me realize a lot of the concerns apply there, too.

M.S. Bellows, Jr.
@msbellows@c.im replied  ·  activity timestamp 2 weeks ago

@emilymbender My provider always asks and always says that the data "doesn't leave our server." Are there really transcription/summarizing AIs that are completely local?

Solitha
@solitha@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender I finally said "no" the last time I went in... and I found out the nurse wasn't really fond of it either.

Bill, organizer of stuff
@wcbdata@vis.social replied  ·  activity timestamp 2 weeks ago

@emilymbender I've spotted medically significant errors in the transcription during the session. It's galling.

J. Nathan Matias 🦣
@natematias@social.coop replied  ·  activity timestamp 2 weeks ago

@emilymbender I have been asked multiple times by medical providers this year if I would consent to AI transcription, and I have said no every time. I'm tempted to ask next time if they have information from an independent audit of the system performance and privacy policy

Jennie Louise 🇺🇦🇵🇸
@jlouiseau@aus.social replied  ·  activity timestamp 2 weeks ago

@emilymbender YES, multiple times, and with wording and timing of the "request" that amounted to coercion. Very little information given in response to questions about what tool exactly was being used, where my data would be sent, their privacy policies etc. Just "this makes it easier for me to help you."
Also recently at the emergency vet hospital, and again in a context that made it extremely difficult to refuse, lest you be seen as a difficult or problem client that creates extra work for the poor vets.

So glad you are writing about this.

Travis Marble
@marbletravis@mastodon.world replied  ·  activity timestamp 2 weeks ago

@emilymbender I've been asked multiple times, and I always chicken out when they ask and say yes. I think it's the power imbalance, and I don't think it's fair.

huntingdon
@huntingdon@mstdn.social replied  ·  activity timestamp 2 weeks ago

@emilymbender

Refuse! Don't do the doctor's job of taking notes, let alone support their choice to have AI transcribe notes for them. The accuracy rate is poor, but incorrect notes would still be used to treat you.

Ted
@SprucePapoose@infosec.exchange replied  ·  activity timestamp 2 weeks ago

Yes, and by a couple of specialists whom it has taken a lot of time/money to get in to see. I've felt almost coerced into agreeing: I didn't want to risk them declining to see me, so I consented against my better judgement.

Do you think there are other ways we can fight this outside of directly declining consent in these situations? One of the providers I see who has been using one for ~2 years now is the only person I've been able to see locally who can prescribe a tightly controlled medication, so I'm quite reluctant to risk the (tenuous) relationship I have established (I already don't feel like I can be completely open and honest with her, and fear what may happen if I decline the use of the scribe).

David J. Atkinson
@meltedcheese@c.im replied  ·  activity timestamp 2 weeks ago

@emilymbender Great article. The privacy of my protected health information is my immediate concern. There are many ways it can and does leak. It’s not the provider’s fault; it is in the whole chain of info systems backing them up. Using an LLM multiplies the risk. Training data is EXTREMELY valuable. There is no way the recording or transcript are getting deleted. It’s getting sold. The incentives are all wrong to ensure HIPAA compliance.

David J. Atkinson
@meltedcheese@c.im replied  ·  activity timestamp 2 weeks ago

@emilymbender Yes. I think this is becoming quite common. Medical groups are providing this service through enterprise-wide medical information systems. I was grateful to be asked. I’ll just add that doctors do not like lectures from patients.

Prof. Emily M. Bender(she/her)
@emilymbender@dair-community.social replied  ·  activity timestamp 2 weeks ago

@meltedcheese Nowhere am I suggesting that patients lecture their doctors, though?

This information can help patients make decisions for themselves about this. If consent is being requested (as it should be), then 'no' is a complete sentence.

David J. Atkinson
@meltedcheese@c.im replied  ·  activity timestamp 2 weeks ago

@emilymbender Agreed. I’m sorry that I miscommunicated. => I am the one who “lectured” and only because AI is my area of deep expertise. If I can convince a doctor or two to at least ask the right questions and consult with other doctors before simply accepting the use of LLM technology, that’s a good thing. Patients should have the info, as you say, to make their own decisions.

Jay
@WhiteCatTamer@mastodon.online replied  ·  activity timestamp 2 weeks ago

@emilymbender THANK YOU for point 9. It drives me batty how poorly people (read: management) rate the importance of mental review, slower moments, and backburner thinking for improving or maintaining skills. Something something working out is 10% active work and 90% fueling/rest.

Not to mention that a rushed/stressed provider makes ME stressed and less likely to be candid or “be complicated”, which is not the point of a visit!

Jo - pièce de résistance
@JoBlakely@mastodon.social replied  ·  activity timestamp 2 weeks ago

@emilymbender yes. I said no. That I was very much against it.

Bette
@Bette@mstdn.social replied  ·  activity timestamp 2 weeks ago

@emilymbender

My doc did say she was going to use it, but I said no. She said it's easier for her. I said no. She no longer brings the thing in the room. My cardiologist, however, just put the device on the desk & I said no. He started to hem & haw, reached to turn it on, then said it wasn't working. I talked about lots of things, and how it felt like I had a basketball in my gut. The summary "he" wrote said that I play basketball for exercise. I'm 71 with health issues. He lied.

Prof. Emily M. Bender(she/her)
@emilymbender@dair-community.social replied  ·  activity timestamp 2 weeks ago

@Bette Woof.

Boyd Stephen Smith Jr.
@BoydStephenSmithJr@hachyderm.io replied  ·  activity timestamp 2 weeks ago

@emilymbender The booking system wouldn't let me proceed without providing consent. I'm allowed to opt-out by providing a written request; I've not (yet) done that.

Jane Doe
@janef0421@mastodon.nz replied  ·  activity timestamp 2 weeks ago

@emilymbender My biggest concern is the potential for psychiatric violence. Inaccurate medical notes produced by these systems could very easily be used as evidence of psychosis or some other kind of psychopathology, leading to forced medical treatment. Having already experienced some of that system, it really worries me. I don’t let medical providers use these systems with me.

Analog AI
@Retreival9096@hachyderm.io replied  ·  activity timestamp 2 weeks ago

@janef0421 @emilymbender

I just read (in a JAMA newsletter, I'll try to track it down -- it's not in my email or trash) about a doctor who has been an early adopter. He did it "right", going over the notes in the evening to clean up the errors in transcription.

He found:
1) He could just focus on the patient, rather than the screen.
2) He got off track and was less focused, and spent more time with the patients without providing better information.
3) Most importantly, when someone came back 6 months later for a follow up, he realized that the notes were not that good. Accurate, but without insight -- they read like someone else had written them and did not help him recall what was going on.

commonst
@commonst@social.vivaldi.net replied  ·  activity timestamp 2 weeks ago

@emilymbender there are signs at the doctor's office saying you can refuse, but when I did I got a lecture on how this helps, and they acted like I had no clue what I was talking about. I mentioned I worked in tech and it was dismissed. I am in an area with few doctors accepting new patients at the moment... how do I really refuse?

My therapist asked for permission, I declined, and after my session we got into a long conversation about why. At least they were curious about it.

The Turtle
@the_turtle@mastodon.sdf.org replied  ·  activity timestamp 2 weeks ago

@emilymbender other than "AI is just fucking wrong a lot?"

Prof. Emily M. Bender(she/her)
@emilymbender@dair-community.social replied  ·  activity timestamp 2 weeks ago

@the_turtle You could read the post that I was linking to, or you could add to the deluge of mansplaining characteristic of the Fediverse.

You chose option #2.

The Turtle
@the_turtle@mastodon.sdf.org replied  ·  activity timestamp 2 weeks ago

@emilymbender still, AI is just fucking wrong a lot.

Prof. Emily M. Bender(she/her)
@emilymbender@dair-community.social replied  ·  activity timestamp 2 weeks ago

@the_turtle And this is still mansplaining.

The Turtle
@the_turtle@mastodon.sdf.org replied  ·  activity timestamp 2 weeks ago

@emilymbender@dair-community.social see that block button? FUCKING USE IT THEN, SELF-IMPORTANT SHINY LIGHTS ON A SCREEN!

moggie
@EverydayMoggie@sfba.social replied  ·  activity timestamp 2 weeks ago

Doctor at Kaiser did ask. I asked what happened if I refused, and they actually weren't able to tell me. So I assume they are recording everything regardless of what I say.

@emilymbender

Noel Kelly
@gnoll110@ruby.social replied  ·  activity timestamp 2 weeks ago

@emilymbender

Not a doctor, but my parents' accountant/financial advisor has had AI transcribe the last couple of meetings.

netop://ウィビ
@netopwibby@social.coop replied  ·  activity timestamp 2 weeks ago

@emilymbender I had an appointment last week (Kaiser Permanente) and my doctor asked if I was fine with being recorded and she explained that it made it easier for her to write up a report later.

No mention of AI but that's not to say that's the real reason.

Prof. Emily M. Bender(she/her)
@emilymbender@dair-community.social replied  ·  activity timestamp 2 weeks ago

@netopwibby Oof -- so she asked if you were okay being recorded but did not provide info on what was going to happen to the recording?

AudhdDespiteNoisyAbleism 🇨🇦
@adelinej@piaille.fr replied  ·  activity timestamp 2 weeks ago

@emilymbender Had a similar experience to @netopwibby's, with a cardiologist; I am in Canada. But at the last appointment she didn’t seem to use it? Will try to remember to ask her about it next time.

netop://ウィビ
@netopwibby@social.coop replied  ·  activity timestamp 2 weeks ago

@emilymbender Aside from “taking notes,” nope. Didn't seem weird at the time so I didn't probe.

J. R. DePriest :verified_trans: :donor: :Moopsy: :EA DATA. SF:
@jrdepriest@infosec.exchange replied  ·  activity timestamp 2 weeks ago

@emilymbender

So far, I've been able to politely decline. Not sure how long that will last.

My camera shoots fascists
@Mikal@sfba.social replied  ·  activity timestamp 2 weeks ago

@emilymbender

Yep. And I said no. She initially said not to worry because it's all deleted afterwards. I said that, no, it is not. That's not how LLMs work. All that data remains in there somewhere and can be hacked, plus I don't want anything about me used to train those things on principle. She didn't argue.

Randall
@rbmath@mathstodon.xyz replied  ·  activity timestamp 2 weeks ago

@emilymbender I've noticed a lot of this use in veterinary medicine recently as well, just FYI.

Prof. Emily M. Bender(she/her)
@emilymbender@dair-community.social replied  ·  activity timestamp 2 weeks ago

@rbmath Oh interesting.

Randall
@rbmath@mathstodon.xyz replied  ·  activity timestamp 2 weeks ago

@emilymbender And, now that I'm thinking about it: I have rarely seen any question put to the client about whether they want to consent or not. (This is from some work in a clinic, and as a client/observer at a couple of clinics.)

Prof. Emily M. Bender(she/her)
@emilymbender@dair-community.social replied  ·  activity timestamp 2 weeks ago

@rbmath Yikes.
