# TensorFlow.js Setup

Install and configure TensorFlow.js for ML-powered toxicity detection.
ML-powered toxicity detection requires TensorFlow.js and the toxicity model. This guide covers installation for different environments.
TensorFlow.js is an optional peer dependency. The core glin-profanity package works without it — ML features are only loaded when you import from glin-profanity/ml.
## Quick Install
```bash
npm install glin-profanity @tensorflow/tfjs @tensorflow-models/toxicity
```

```bash
yarn add glin-profanity @tensorflow/tfjs @tensorflow-models/toxicity
```

```bash
pnpm add glin-profanity @tensorflow/tfjs @tensorflow-models/toxicity
```

## Environment-Specific Setup
### Node.js (Server)
For server-side usage, install the Node.js-optimized TensorFlow backend:
```bash
npm install glin-profanity @tensorflow/tfjs-node @tensorflow-models/toxicity
```

`@tensorflow/tfjs-node` uses native C++ bindings for significantly better performance than the pure JavaScript version.
Usage:
```ts
// Import tfjs-node BEFORE glin-profanity/ml
import '@tensorflow/tfjs-node';
import { ToxicityDetector, HybridFilter } from 'glin-profanity/ml';

const detector = new ToxicityDetector({ threshold: 0.85 });
await detector.loadModel();
```

### Browser (Client)
For browser usage, the standard package works:
```bash
npm install glin-profanity @tensorflow/tfjs @tensorflow-models/toxicity
```

Usage with bundlers (Vite, webpack, etc.):
```ts
import { ToxicityDetector } from 'glin-profanity/ml';

const detector = new ToxicityDetector();
await detector.loadModel(); // Model loads from CDN automatically
```

CDN usage:
```html
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/toxicity"></script>
<script src="https://cdn.jsdelivr.net/npm/glin-profanity"></script>
<script>
  const detector = new GlinProfanity.ToxicityDetector();
  detector.loadModel().then(() => {
    console.log('Model ready');
  });
</script>
```

### Next.js / SSR
In server-side rendering frameworks, load TensorFlow only on the server to avoid client-side bundling issues:
```ts
// lib/moderator.ts
import { HybridFilter } from 'glin-profanity/ml';

let filter: HybridFilter | null = null;

export async function getModerator() {
  if (!filter) {
    // Only load on server
    if (typeof window === 'undefined') {
      // Dynamic import for tfjs-node
      await import('@tensorflow/tfjs-node');
    }
    filter = new HybridFilter({
      enableML: true,
      languages: ['english'],
    });
    await filter.initialize();
  }
  return filter;
}
```

API route usage:
```ts
// pages/api/moderate.ts or app/api/moderate/route.ts
import { getModerator } from '@/lib/moderator';

export async function POST(req: Request) {
  const { text } = await req.json();
  const moderator = await getModerator();
  const result = await moderator.checkProfanityAsync(text);

  return Response.json({
    allowed: !result.isToxic,
    reason: result.reason,
  });
}
```

### Edge Runtime (Vercel Edge, Cloudflare Workers)
TensorFlow.js has limited support in Edge runtimes. For edge deployments, either stick to rule-based detection or call out to a serverless function that runs the ML model.
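As a sketch of the second option, an edge handler can delegate scoring to a Node serverless function where TensorFlow is available. The upstream URL below is a hypothetical placeholder, not part of glin-profanity:

```typescript
// Edge handler that forwards text to a Node serverless function for ML scoring.
// ML_ENDPOINT is a placeholder -- point it at your own deployment.
const ML_ENDPOINT = 'https://example.com/api/moderate-ml';

export const runtime = 'edge';

export async function POST(req: Request): Promise<Response> {
  const { text } = await req.json();

  // Delegate to a runtime where @tensorflow/tfjs-node can run
  const upstream = await fetch(ML_ENDPOINT, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({ text }),
  });

  // Pass the moderation verdict straight back to the client
  return Response.json(await upstream.json());
}
```

This keeps the edge function small and fast while the heavy model stays in one warm serverless instance.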
```ts
// Use rule-based only on edge
import { Filter } from 'glin-profanity';

export const runtime = 'edge';

export async function POST(req: Request) {
  const { text } = await req.json();
  const filter = new Filter({
    languages: ['english'],
    detectLeetspeak: true,
  });
  const result = filter.checkProfanity(text);
  return Response.json({ profane: result.containsProfanity });
}
```

## Verification
**1. Install packages**

```bash
npm install glin-profanity @tensorflow/tfjs @tensorflow-models/toxicity
```

**2. Create a test file**
```ts
// test-ml.ts
import { ToxicityDetector } from 'glin-profanity/ml';

async function test() {
  console.log('Loading model...');
  const detector = new ToxicityDetector({ threshold: 0.85 });

  const available = await detector.checkAvailability();
  console.log('TensorFlow available:', available);

  if (available) {
    await detector.loadModel();
    console.log('Model loaded!');

    const result = await detector.analyze('you are stupid');
    console.log('Result:', result.isToxic ? 'TOXIC' : 'CLEAN');
    console.log('Categories:', result.matchedCategories);
  }
}

test().catch(console.error);
```

**3. Run the test**

```bash
npx tsx test-ml.ts
```

Expected output:
```
Loading model...
TensorFlow available: true
Model loaded!
Result: TOXIC
Categories: [ 'insult', 'toxicity' ]
```

## Troubleshooting
### "Cannot find module '@tensorflow/tfjs'"
TensorFlow.js is not installed:
```bash
npm install @tensorflow/tfjs @tensorflow-models/toxicity
```

### "Failed to load toxicity model"
The model downloads from a CDN on first load. Check:
- Internet connectivity
- Firewall/proxy blocking TensorFlow model URLs
- Sufficient memory (model requires ~50-100MB)
### Slow model loading
First load downloads the model (~10MB). Solutions:
- Preload during app initialization
- Use `preloadModel: true` in config
- Cache model files if possible
```ts
const detector = new ToxicityDetector({
  preloadModel: true, // Start loading immediately
});
```

### Memory issues
The model uses significant memory. In serverless environments, dispose of it after use:
```ts
// Dispose after use
const detector = new ToxicityDetector();
await detector.loadModel();

const result = await detector.analyze(text);

detector.dispose(); // Free memory
```

### TypeScript errors
Ensure types are installed:
```bash
npm install -D @types/node
```

## Performance Tips
- Preload the model on app startup, not per-request
- Use batch processing for multiple texts
- Use `@tensorflow/tfjs-node` on the server for 5-10x faster inference
- Consider `rules-first` mode in HybridFilter for balanced performance
- Dispose models in serverless to avoid memory leaks
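The preload tip can be sketched as a module-level promise that every request awaits. Here `loadDetector` is a stand-in for constructing a `ToxicityDetector` and calling `loadModel()`, with a fake analyzer so the pattern is visible on its own:

```typescript
// Start the expensive model load once at startup; every request then awaits
// the same promise instead of reloading per-request.
// loadDetector() is a placeholder for `new ToxicityDetector().loadModel()`.
type Detector = { analyze: (text: string) => Promise<{ isToxic: boolean }> };

async function loadDetector(): Promise<Detector> {
  // Simulate a slow one-time model download
  await new Promise((resolve) => setTimeout(resolve, 10));
  return { analyze: async (text) => ({ isToxic: /stupid/i.test(text) }) };
}

const detectorReady = loadDetector(); // kicked off at import time, not per-request

export async function moderate(text: string): Promise<boolean> {
  const detector = await detectorReady; // resolves instantly after the first load
  return (await detector.analyze(text)).isToxic;
}
```

Because the promise is created at module scope, concurrent requests during a cold start share one load instead of racing to download the model separately.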
## Next Steps
- ToxicityDetector API — Complete API reference
- HybridFilter API — Combined rule + ML detection
- ML Integration Guide — Best practices and patterns