The Alignment Problem: Creatives and Content


• False Positive

Detector: "Hey, this is a dog!"

Reality: "In what world does it look like a dog to you?"

Detector: "Look at the colors, the patterns, this is clearly a dog!"

Conclusion: "You failed the level. Please don't sleep during class."

• True Negative

Detector: "Hey, I think I know now. This is not a dog!"

Reality: "It was about time you had some brain cells. Congrats."

Conclusion: "You finally learnt your lesson. Yay."

• False Negative

Detector: "Hey, wait. I think that's not a dog but a cat!"

Reality: "Here we go again. That was a dog image this time."

Conclusion: "I don't have enough patience to deal with this today. Try again!"

And if it's still confusing? Congrats, you've officially experienced the headache of trying to learn the Confusion Matrix.
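If the dialogue version didn't stick, the same four outcomes can be sketched in a few lines of Python. This is just an illustrative snippet, not any real detector's code; the labels (1 = "dog", 0 = "not a dog") and the function name are mine:

```python
def confusion_matrix(actual, predicted):
    """Count the four confusion-matrix outcomes for a binary 'is it a dog?' detector."""
    tp = fp = tn = fn = 0
    for truth, guess in zip(actual, predicted):
        if guess == 1 and truth == 1:
            tp += 1  # True Positive: it said dog, and it was a dog
        elif guess == 1 and truth == 0:
            fp += 1  # False Positive: it said dog, but it wasn't
        elif guess == 0 and truth == 0:
            tn += 1  # True Negative: it said not-a-dog, correctly
        else:
            fn += 1  # False Negative: it missed an actual dog
    return tp, fp, tn, fn

# Hypothetical example: four images, the detector gets two right
actual    = [1, 0, 0, 1]   # what the images really are
predicted = [1, 1, 0, 0]   # what the detector guessed
print(confusion_matrix(actual, predicted))  # → (1, 1, 1, 1)
```

One of each outcome, which is about how it feels to argue with a detector all day.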

So the real conclusion is: false positives and false negatives are common. Yes, the AI could be fed the whole internet and beyond, and it would still stumble. But don't we all? Let's give it credit for existing and causing confusion (see what I did there?) in this world. Thanks, AI. You're both the cause of and the solution to our headaches, and we don't know which way to lean.

Any questions about this, and I'm happy to entertain them. We'll move on to our next point now.

→ The struggle is real with the AI-hybrid content

Look, I know I tell you to use AI and then modify your text... but if you put that edited text into an AI detector and expect it to give you an absolute 100 on human writing because you finally learnt your lesson, you're wrong. Plus, that's not a good way to judge AI content.

AI hates absolutes, did you know that? Because the minute it becomes absolute, it'd need no further development or improvement. That's impossible.
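That's also why real detectors hand you a probability, not a verdict. Here's a tiny hypothetical sketch (the score, threshold, and labels are all made up for illustration) of how a detector's 0.0-to-1.0 score gets turned into a label that is still not an absolute:

```python
def classify(ai_probability, threshold=0.5):
    """Turn a hypothetical detector's 0.0-1.0 score into a (still uncertain) label."""
    # Above the threshold we lean one way, below it the other.
    # Note the word "likely": the score is never exactly 0.0 or 1.0.
    return "likely AI" if ai_probability >= threshold else "likely human"

print(classify(0.97))  # → likely AI (but 0.97 is still not 1.0)
print(classify(0.12))  # → likely human (and 0.12 is still not 0.0)
```

So when a detector says "97% AI," that's a lean, not an absolute. Which is exactly what it's afraid of.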

Person: "Hey AI! What's your worst nightmare?"

AI: "Absolutes."

Person: "Here, have an absolute number."

AI: "Time to run, BYE!"

There's a budding concept of Super-AI, or AGI (Artificial General Intelligence), that'd probably be the closest thing to an absolute form, but we have a long way to go, and even then, we're afraid we'd be wiped out by a simple paper clip.

(Google: the Paperclip Maximizer thought experiment)

AGI is basically the robot/machine equivalent of a human, if you will. It can do all the things a human can do, and more. Just make sure we don't end up with a Terminator situation on our hands... Now that'd be tragic, or robastic (robot + fantastic). Which will come first? I don't know, and I might not be around to answer it, unfortunately.

Again, if you need more information about this, please ask in the comments or ask Google. Whichever your hands reach out to first.

→ Lack of true understanding

The only person who'd be able to understand the thought process and the meaning behind something is the person who wrote it, whether that writing was assisted by AI or totally human-made. Someone has to initiate it, and that someone can only be human in both scenarios. True intentions can't be fully understood even if everything is put plainly on paper. It'd still not be clear.

→ AI is developing by leaps and bounds, but are the detectors designed for it keeping up?

The truth is, AI is in the running to be the smartest "heartthrob" in the city, and no one can truly tell who its soulmate is. But it's in the race, and we're on the sidelines. Or the frontlines, if we actively engage and raise awareness about AI.

A Human's Guide To Detecting AI Generated Content