ObsceneClean

An Advanced Profanity Filter

Powered by Artificial Intelligence (AI)


• Version 1.2 Beta is no longer maintained.
• Version 2.0 Beta will be available within a few weeks.
• A fork of ObsceneClean is under development.
• The code has been cleaned up and is now far more readable.
• The richness, detail, and diversity of the data produced by ObsceneClean have expanded substantially.

• ObsceneClean has gone far beyond its original purpose as a mere profanity filter.
• It detects any kind of hate speech and profiles the sender.
• Vast improvements and better detection have been implemented.


WARNING: Some may find the content below offensive.

Any profanity filter worth its salt must handle false positives well. The examples below are real-world homonyms that could cause a naive profanity filter to report a false positive. ObsceneClean looks at the meaning of words in context to determine what is offensive and what is not; a minimal sketch of this kind of context-aware check follows the examples.


• Have a fag: in the UK, cigarettes are commonly called fags.
• Pork faggots: faggots are a traditional meatball dish in the UK.
• Become a Kunt: a genuine slogan; they were not trying to be funny.
• Bitch: a bitch is simply a female dog, and the word is used routinely at dog shows.
• Bunghole: the opening in a keg or barrel that is sealed with a bung.
• Gook: a common name in Korea and, infrequently, in Scotland.
• Macaca and nigra: both may be offensive depending on context, yet Macaca nigra is simply a species of monkey.
• Phuket: pronounced "poo-khet".
• Shag: also a style of carpeting.
• Fucking, Austria: a real village name until its residents, fortunately, renamed it to Fugging.
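
This README does not describe ObsceneClean's internal detection logic, so the following is only a minimal sketch of one possible context-aware check, assuming a simple whitelist of benign-context patterns. The function name is_offensive, the BENIGN_CONTEXTS table, and every pattern in it are hypothetical illustrations and are not part of ObsceneClean's actual code.

```python
# Hypothetical illustration only -- not the actual ObsceneClean implementation.
# Shows one way a filter can use surrounding words to suppress false positives
# for homonyms like those listed above.
import re

# Hypothetical data: terms that are only sometimes offensive, mapped to regex
# patterns describing benign contexts in which they should NOT be flagged.
BENIGN_CONTEXTS = {
    "fag": [r"\bhave a fag\b", r"\bfag break\b"],               # UK slang for a cigarette
    "faggots": [r"\bpork faggots\b", r"\bfaggots and peas\b"],  # UK meatball dish
    "bitch": [r"\bbest bitch\b", r"\bdog show\b", r"\bfemale dog\b"],  # dog breeding and shows
    "shag": [r"\bshag carpet(ing)?\b", r"\bshag rug\b"],        # a style of carpet
}

def is_offensive(term: str, sentence: str) -> bool:
    """Return True if `term` appears in `sentence` and no benign context matches."""
    text = sentence.lower()
    # Term not present at all: nothing to flag.
    if not re.search(rf"\b{re.escape(term.lower())}\b", text):
        return False
    # Term present: suppress the hit if the surrounding words match a benign sense.
    for pattern in BENIGN_CONTEXTS.get(term.lower(), []):
        if re.search(pattern, text):
            return False
    return True

if __name__ == "__main__":
    print(is_offensive("fag", "Fancy a cuppa? Have a fag with me."))        # False
    print(is_offensive("bitch", "Best bitch in show went to the collie."))  # False
    print(is_offensive("shag", "We ripped out the old shag carpeting."))    # False
    print(is_offensive("fag", "An insult aimed at a person: you fag."))     # True
```

The idea illustrated here, suppressing a match only when the surrounding words fit a known benign sense, is one common way to keep detection broad while reducing false positives.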