OpenAI’s Whisper tool may add fake text to medical transcripts, investigation finds.

    • ladicius@lemmy.world · 20 hours ago

      This is the AI plan every healthcare entity worldwide will adopt.

      No joke. They are desperate for shit like this.

  • ChihuahuaOfDoom@lemmy.world · 1 day ago

    Regular transcription software is finally respectable (the early days of Dragon NaturallySpeaking were dark indeed). Who thought tossing AI into the mix was a good idea?

  • FigMcLargeHuge@sh.itjust.works · 20 hours ago

    If anyone needs to know the state of AI transcription, just turn on closed captioning for your local TV channel. It’s atrocious, and I am sorry that people who need closed captioning are subjected to that.

  • sgibson5150@slrpnk.net · 12 hours ago

    Years ago, I worked in a tech role at a medical transcription company. It hadn’t occurred to me that AI would render their jobs irrelevant. This used to be an area where women in particular could make decent money after a bit of training, and there were opportunities for advancement into medical coding and even hospital administration.

    I worked with some good people. Hope they landed on their feet.

  • ShareMySims@sh.itjust.works · 9 hours ago

    Errors and hallucinations are definitely serious concerns, but my biggest worry would be privacy. If my GP is using AI, I no longer see my medical information as private, and that is unacceptable.

  • SpikesOtherDog@ani.social · 22 hours ago

    I work in judicial tech and have heard questions about using AI transcription tools. I don’t believe AI should be used in this kind of high-risk area. The people asking whether AI is a good fit for court transcripts can be forgiven, because all they see is the hype, but if the people responding greenlight a project like that, there will be some incredibly embarrassing moments.

    My other concern is that the court would have to run the service locally. There are situations where a victim’s name or other information is redacted. That information should not be on an OpenAI server, and it should not be regurgitated back out when the AI misbehaves.

    • FatCrab@lemmy.one · 12 hours ago

      Don’t court stenographers basically use tailored voice models and voice-to-text transcription already?

      • SpikesOtherDog@ani.social · 9 hours ago

        I don’t get too technical with the court reporter software. They have their own license and receive direct support from their vendor. What I have seen is that there is an interpreting layer between the stenographer machine and the software, literally called “magic” by the vendor, that is a bit like predictive text. In this situation, the stenographer is actively recording and interpreting the results.
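
On SpikesOtherDog’s point about having to run the service locally: Whisper’s open-source weights can be run entirely on the court’s own machine with the openai-whisper Python package, so audio and transcripts never touch an outside server. A minimal sketch, assuming a CPU-only machine and a hypothetical audio file name:

    import whisper  # open-source openai-whisper package

    # Load the small multilingual "base" model; weights are downloaded once,
    # after which transcription runs entirely offline on this machine.
    model = whisper.load_model("base")

    # "hearing_audio.wav" is a hypothetical file name used for illustration.
    # fp16=False avoids half-precision warnings when running on CPU.
    result = model.transcribe("hearing_audio.wav", fp16=False)

    print(result["text"])

Running the model locally only addresses the privacy half of the concern; the hallucination risk the article describes applies to a local copy of Whisper just as much as to a hosted one.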