08-07-2024, 07:14 AM
As has already been mentioned, it is widely understood that AI detectors commonly return false positives on content written by non-native speakers. This is thought to be related to specific vocabulary choices and the use of less complex language compared with native speakers.
One suggestion I have is to try originality.ai. It isn't free, but it seems to be the most accurate AI detector out there by a wide margin. It flags content and, for each individual phrase, offers a percentage score of how likely it is that the content is AI-generated.
I found that in places where other AI detection tools rated my original writing (I am a native speaker) at a 50-60% likelihood of AI generation, Originality found a 0% likelihood. (I tested it with AI-generated text, too, and it caught that almost 100% of the time.)
Originality charges by the size of the document you submit. I think a 10-page assignment cost 40 cents or something. You buy $5 or $10 worth of credits. I'm still using my original purchase from a year ago.