• Xanza@lemm.ee · 1 day ago

The point of generative AI isn’t accuracy, so that’s pretty expected.

Generative AI is designed to work from a content base and expand on its information, not to create new information. You could feed it the entirety of the current Wikipedia text corpus and have it expand the subjects that need it, and condense and simplify the ones that are bloated.
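As a rough sketch of what that looks like in practice (this assumes the OpenAI Python client; the model name, file name, and prompt wording are my own illustration, not anything prescribed here):

```python
# Sketch: ground the model in existing text and ask it to expand or
# condense that text, rather than asking it to produce facts from nothing.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical source file containing the article to work from.
article_text = open("wikipedia_article.txt").read()

response = client.chat.completions.create(
    model="gpt-4o",  # assumption: any capable chat model would do
    messages=[
        {
            "role": "system",
            "content": (
                "You are an editor. Work only from the source text the "
                "user provides. Expand thin sections and simplify bloated "
                "ones, but do not introduce any fact that is not already "
                "in the source."
            ),
        },
        {"role": "user", "content": article_text},
    ],
)

print(response.choices[0].message.content)
```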

You don’t ask generative AI to come up with new information; that’s how you get inaccurate information.

“text AI generators making up believable lies if it doesn’t have enough information”

Let’s not anthropomorphize AI. It doesn’t lie. When it lacks sufficient information, it uses whatever data is available to expand on the subject until the response is conversationally complete, regardless of whether the result is correct. That’s completely different, and you can specifically prohibit an AI from doing it…

AI is great when used appropriately. The issue is that people are using AI as a Google replacement, something it’s not designed to be. AI isn’t a fact engine. LLMs are designed to resemble human speech as closely as possible, not to give correct answers to questions. People’s issue with AI is that they’re fucking using it wrong.

This is an exceptionally good use of AI, because you already have the required factual background knowledge. You can simply feed it to your AI, telling it not to fill in any gaps and to rewrite articles to be more uniform, with direct and easy-to-consume wording. This instance is quite literally what generative AI was designed for: to take factual knowledge and generate context around the existing data.
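A minimal sketch of that workflow, again assuming the OpenAI client, with the “don’t fill in gaps” rule expressed directly in the instructions (the prompt text, marker convention, and model name are illustrative assumptions):

```python
# Sketch: rewrite articles for a uniform, direct style without letting
# the model invent material to paper over missing information.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

STYLE_GUIDE = (
    "Rewrite the article below in a direct, easy-to-read style. "
    "Keep every fact exactly as stated. If information is missing or "
    "unclear, insert a [citation needed] marker instead of guessing. "
    "Never add facts that are not in the source."
)

def rewrite_article(article: str) -> str:
    """Return a uniformly styled rewrite of one article."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumption: substitute whatever model you use
        messages=[
            {"role": "system", "content": STYLE_GUIDE},
            {"role": "user", "content": article},
        ],
    )
    return response.choices[0].message.content
```

The point of the [citation needed] marker is that it gives the model an explicit alternative to gap-filling: instead of generating plausible filler, it flags the hole for a human editor.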

Issues arise when you use AI for things it wasn’t intended to do, or when you don’t give it enough information and it has to generate material to complete the dataset. AI will do what you ask; you just have to know how to ask. That’s why AI prompt engineers are a thing.