An Advanced Profanity Filter

Powered by Artificial Intelligence (AI)

See if ObsceneClean version 1.1 (a profanity filter) can bust you! Unlike other profanity filters (or "swear" filters), ObsceneClean uses extensive and novel natural language processing logic, a form of AI, to deal with false positives and homonyms. The technical approach was to mirror human perception and interpretation of offensive English text. ObsceneClean employs several variations on a technique called shallow word sense disambiguation to determine the meaning of a homonym. Try it! The detection logic will be displayed. You can also try false positives. If you think ObsceneClean failed to detect offensive language, or wrongly flagged a false positive as offensive (see examples below), email us at scott@mindofscott.com. By default, a probability of 80% or greater is considered offensive.
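The shallow word sense disambiguation and the 80% threshold described above could work along these lines. This is a hypothetical sketch, not ObsceneClean's actual implementation; the cue words and probabilities are invented for illustration.

```python
# Sketch of shallow word sense disambiguation for a profanity filter.
# Each sense of a homonym carries cue words; the sense whose cues best
# overlap the surrounding context wins, and its "offensive" weight
# becomes the reported probability. All data here is illustrative.

SENSES = {
    "bitch": [
        {"offensive": 0.95, "cues": {"you", "stupid", "shut"}},
        {"offensive": 0.05, "cues": {"dog", "show", "breeder", "puppies"}},
    ],
}

def offense_probability(word, context_words, default=0.90):
    senses = SENSES.get(word.lower())
    if not senses:
        # No sense inventory: assume a listed profanity is offensive.
        return default
    context = {w.lower() for w in context_words}
    # Score each sense by cue overlap; ties fall back to the first sense.
    best = max(senses, key=lambda s: len(s["cues"] & context))
    return best["offensive"]

def is_offensive(word, context_words, threshold=0.80):
    # Mirrors the page's default: >= 80% probability counts as offensive.
    return offense_probability(word, context_words) >= threshold
```

With this sketch, "bitch" next to "dog" and "show" scores 5% and passes, while the same word next to "you" and "stupid" scores 95% and is flagged.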


Download Latest Version of ObsceneClean!

WARNING: Content below may be offensive to some!

Any profanity filter worth its salt must handle false positives well. The examples below are real-world cases of homonyms that could cause a profanity filter to report a false positive. ObsceneClean looks at the meaning of words to determine what is offensive and what is not.
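One simple way to handle false positives like those below is a phrase-level benign-collocation check. This is a hedged sketch, not ObsceneClean's real logic; the word list and phrases are invented examples drawn from this page.

```python
# Hypothetical false-positive handling: known benign collocations
# (e.g. "pork faggots", "have a fag") override the word-level
# profanity list. Illustrative only -- not ObsceneClean's code.

import re

PROFANITY = {"fag", "faggots", "bitch"}
BENIGN_PHRASES = {"pork faggots", "have a fag", "best of breed bitch"}

def flag_offensive(text):
    lowered = text.lower()
    # A hit inside a known benign collocation is treated as a
    # false positive and removed before the word-level scan.
    for phrase in BENIGN_PHRASES:
        lowered = lowered.replace(phrase, " ")
    words = re.findall(r"[a-z']+", lowered)
    return sorted(set(words) & PROFANITY)
```

Here "Pork faggots are meatballs" raises no flags, while the same word outside a known collocation is still caught.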

Have a fag

In the UK, cigarettes are commonly called fags

Pork Faggots

Pork Faggots are meatballs in the UK

Become a Kunt

They were not trying to be funny

"Bitch" is commonly used at dog shows.

A real headline

Found in every household at one time

It really exists

A bunghole is the opening in a keg, sealed with a bung

This lure is rather common

A fine company that cares about quality

Bitches are female dogs

Many pubs with this name

Gook is a common name in Korea and, less frequently, in Scotland

Both macaca and nigra may be offensive depending on context

Pronounced "poo-khet"

Shag carpeting

Fortunately, residents renamed this Austrian village to Fugging