
AI v. Human-Generated Content: Can You Tell The Difference?

Nativo

Frequently Asked Questions

What did Nativo test in this study?

Nativo presented randomly selected articles (some AI-generated and some human-written) to more than 700 U.S. consumers. Participants rated content quality features and were asked to guess whether each article was written by a human or generated by AI.

How many people took part in the survey?

The study surveyed over 700 consumers in the United States.

Could consumers tell the difference between AI and human-written content?

The article summarizes the study design (presentation, quality ratings, and identification task). For detailed outcomes — including whether consumers could reliably tell the difference — Nativo links to the full interactive study.

Where can I see the full results?

Nativo provided links to the full interactive study at https://advertiser.nativo.com/ai-content-study/p/1 and to the interactive resource hosted via HubSpot; the article's CTA links also lead to the full report.


Despite a flurry of interest in AI over the past several months, no one has really put it to the test when it comes to content.

While several surveys have asked people what they think of AI- versus human-generated content, no one has tested different pieces of content to see which consumers prefer, or whether they can even tell the difference.

Until now.

AI vs. Human Content Study

Nativo surveyed over 700 consumers in the U.S. to understand how people react to AI- versus human-generated content.

Participants were presented with an article at random, asked to rate various features of its quality, and finally asked to guess whether it was human- or AI-written.