Conversation
it says a lot about how badly LLMs have damaged people's perception of the term "AI" that when VLC adds a system to automatically generate video captions, people immediately want to jump ship from VLC, even though the "AI" being added is not very harmful and has very little to do with the extremely harmful LLMs being pushed into our faces by tech corporations
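
For context, a feature like this typically runs a local speech-to-text model over the video's audio track and writes out a subtitle file. A minimal sketch of that idea, assuming the open-source openai-whisper package; the "base" model size and the file names are placeholders, and VLC's actual implementation may differ:

```python
# A minimal sketch of automatic caption generation, assuming the
# open-source openai-whisper package (pip install openai-whisper).
# "video.mp4", "video.srt", and the model size are placeholders.
import whisper

def srt_time(seconds: float) -> str:
    # SRT timestamps look like 00:01:02,345
    ms = int(round(seconds * 1000))
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

model = whisper.load_model("base")        # small, locally-run model
result = model.transcribe("video.mp4")    # ffmpeg extracts the audio

with open("video.srt", "w", encoding="utf-8") as f:
    for i, seg in enumerate(result["segments"], start=1):
        f.write(f"{i}\n")
        f.write(f"{srt_time(seg['start'])} --> {srt_time(seg['end'])}\n")
        f.write(seg["text"].strip() + "\n\n")
```
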
it's kind of a shame that the term "AI" has been so damaged, because stuff like machine learning is actually really fascinating and has a lot of cool applications. people have taught computers to play super mario bros from nothing but trial and error using machine learning

@mjdxp I feel a deep ambivalence toward this topic. modern machine learning systems capable of media-to-text transcription are a big step up from the frankly miserable quality of automated transcription we had before, and there's an additional degree of agency which can be granted through those capabilities, but ultimately it still leaves disabled people relying on unpredictable and unreliable systems in the face of sheer apathy from human creators who could and should be captioning their media.

@mjdxp ironically, one of the best uses of modern ML ASR (automatic speech recognition) systems is forced alignment, yet that seems to be one of the least talked-about use cases.

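To unpack that a little: forced alignment means taking a transcript that is already known to be correct and using the recognizer only to figure out when each word is spoken. A toy sketch of the idea, with made-up ASR output; a real pipeline would take word-level timestamps from a model such as Whisper:

```python
# Toy forced alignment: given a known-correct transcript and an ASR
# hypothesis with word-level timestamps, transfer the timestamps onto
# the known transcript instead of trusting the ASR's own wording.
# The asr_words data below is invented for illustration.
import difflib

# (word, start_seconds, end_seconds) as an ASR system might return them
asr_words = [
    ("hello", 0.00, 0.40),
    ("word", 0.45, 0.80),   # misrecognition of "world"
    ("this", 1.10, 1.30),
    ("is", 1.32, 1.45),
    ("a", 1.47, 1.52),
    ("test", 1.55, 2.00),
]

# The transcript we know is correct (e.g. written by a human captioner)
reference = "hello world this is a test".split()

hyp = [w for w, _, _ in asr_words]
matcher = difflib.SequenceMatcher(a=reference, b=hyp, autojunk=False)

aligned = [None] * len(reference)
for op, a0, a1, b0, b1 in matcher.get_opcodes():
    if op in ("equal", "replace"):
        # Map each reference word onto a hypothesis word's timing
        for i in range(a1 - a0):
            j = min(b0 + i, b1 - 1)
            aligned[a0 + i] = (reference[a0 + i],
                               asr_words[j][1], asr_words[j][2])

for entry in aligned:
    print(entry)  # reference words now carry audio timestamps
```

The payoff for accessibility is that viewers get the human-written transcript, with the machine contributing only the timing.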

@russss @mjdxp wait, what? I'm certain I've had MP4 files with embedded sub tracks...

also how would ML help here? O_o

@mjdxp LLMs are not "extremely harmful". They are language models; if you use them for ... language modeling, you'll probably be fine. A language model is part of a captioning system too, you know? Yes, using LLMs for answering questions may not be a good idea, but that doesn't make language models useless or "harmful".
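
The point about a language model sitting inside a captioning system reflects standard ASR design: the acoustic side proposes candidate transcripts, and a language model scores which candidate reads as plausible text. A toy illustration with invented hypotheses and probabilities:

```python
# Toy LM rescoring in ASR: the acoustic model proposes candidate
# transcripts, and a language model picks the most plausible one.
# All scores and hypotheses here are invented for illustration.
import math

# (candidate transcript, acoustic log-probability) from a fictional ASR
candidates = [
    ("recognize speech", -4.2),
    ("wreck a nice beach", -4.0),  # sounds similar, less likely as text
]

# A tiny bigram "language model": log-probabilities of word pairs
bigram_logp = {
    ("recognize", "speech"): math.log(0.2),
    ("wreck", "a"): math.log(0.01),
    ("a", "nice"): math.log(0.05),
    ("nice", "beach"): math.log(0.02),
}

def lm_score(text: str) -> float:
    words = text.split()
    return sum(bigram_logp.get(pair, math.log(1e-6))
               for pair in zip(words, words[1:]))

lm_weight = 0.8
best = max(candidates, key=lambda c: c[1] + lm_weight * lm_score(c[0]))
print(best[0])  # -> "recognize speech"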

@mjdxp We helped the tech companies do that on this network by attacking anyone who used a machine learning model for any reason.

I am convinced that the culture of fedi alone prevented many people from using machine learning for reasonable and decent use cases.
