// TEXT TOKEN COUNTER v2.0 [SYS: ONLINE] LOADING BPE ENGINE... user@localhost:~$ token-counter --interactive
⚠ CDN offline, estimating with empirical formulas (fallback mode)
┌─ INPUT.TXT ─ EDITOR [ DRAG .TXT/.MD/.JSON TO LOAD ]
1
[ DROP FILE TO LOAD ]
LN 1, COL 1
┌─ CHAR STATISTICS [LIVE]
total chars   0
no-space      0
words         0
lines         0
paragraphs    0
sentences     0
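// The live stats above reduce to plain string operations. A minimal sketch —
// the exact splitting rules (blank-line paragraphs, ./!/?/。/！/？ sentence
// enders) are assumptions, not the tool's actual implementation:

```javascript
// Hedged sketch of the CHAR STATISTICS panel. Splitting rules are assumptions.
function charStats(text) {
  return {
    totalChars: text.length,
    noSpace: text.replace(/\s/g, "").length,
    words: (text.match(/\S+/g) || []).length,
    lines: text === "" ? 0 : text.split("\n").length,
    // paragraphs separated by one or more blank lines
    paragraphs: text.split(/\n\s*\n/).filter((p) => p.trim()).length,
    // sentences end in ASCII or full-width terminators
    sentences: (text.match(/[^.!?。！？]+[.!?。！？]+/g) || []).length,
  };
}
```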
// CHAR TYPE DISTRIBUTION ─────────────
chinese      0   0.0%
english      0   0.0%
digits       0   0.0%
punct        0   0.0%
whitespace   0   0.0%
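// The five buckets above can be filled with a single pass over the text.
// A sketch — the Unicode ranges (CJK Unified Ideographs for chinese, ASCII
// letters for english, everything leftover counted as punct) are assumptions:

```javascript
// Hedged sketch of the CHAR TYPE DISTRIBUTION panel. Ranges are assumptions.
function charDistribution(text) {
  const buckets = { chinese: 0, english: 0, digits: 0, punct: 0, whitespace: 0 };
  for (const ch of text) {
    if (/[\u4e00-\u9fff]/.test(ch)) buckets.chinese++;      // CJK Unified Ideographs
    else if (/[a-zA-Z]/.test(ch)) buckets.english++;
    else if (/[0-9]/.test(ch)) buckets.digits++;
    else if (/\s/.test(ch)) buckets.whitespace++;
    else buckets.punct++;                                   // catch-all
  }
  const total = [...text].length || 1; // avoid divide-by-zero on empty input
  const pct = {};
  for (const k in buckets) pct[k] = ((100 * buckets[k]) / total).toFixed(1) + "%";
  return { buckets, pct };
}
```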
┌─ TOKEN ESTIMATION [BPE: ...]
loading js-tiktoken via esm.sh...
GPT-4o / GPT-5 BPE
o200k_base | 200K vocab
tokens
GPT-4 / GPT-3.5 BPE
cl100k_base | 100K vocab
tokens
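// The [BPE: ...] status and the CDN-offline warning at the top imply a loader
// that tries esm.sh first and degrades to empirical mode. A sketch — the
// injectable `importer` parameter is a hypothetical addition so the failure
// path can be exercised offline; it is not part of the real tool:

```javascript
// Hedged sketch: fetch js-tiktoken from esm.sh, fall back on CDN failure.
async function loadTokenizer(importer = (url) => import(url)) {
  try {
    const mod = await importer("https://esm.sh/js-tiktoken");
    // js-tiktoken exposes getEncoding("o200k_base"), getEncoding("cl100k_base"), ...
    return { mode: "bpe", getEncoding: mod.getEncoding };
  } catch {
    return { mode: "empirical" }; // ⚠ CDN offline → empirical-formula estimation
  }
}
```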
Claude 3/4 EST
empirical formula | ~3.5 en / ~1.5 zh
tokens
// No public JS tokenizer for Claude.
// Using empirical formula: EN ~3.5 chars/tok, ZH ~1.5 chars/tok.
Gemini 1.5/2.0 EST
empirical formula | ~4.0 en / ~1.8 zh
tokens
// No public JS tokenizer for Gemini.
// Using empirical formula: EN ~4.0 chars/tok, ZH ~1.8 chars/tok.
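// Both EST rows are the same chars-per-token arithmetic with different ratios.
// A sketch using the ratios shown above — the CJK range used to decide
// "zh vs en" is an assumption:

```javascript
// Hedged sketch of the empirical estimators. Ratios from the panel:
// Claude ≈ 3.5 en chars/tok, 1.5 zh; Gemini ≈ 4.0 en, 1.8 zh.
const RATIOS = {
  claude: { en: 3.5, zh: 1.5 },
  gemini: { en: 4.0, zh: 1.8 },
};

function estimateTokens(text, model) {
  const { en, zh } = RATIOS[model];
  let zhChars = 0;
  let enChars = 0;
  for (const ch of text) {
    if (/[\u4e00-\u9fff]/.test(ch)) zhChars++; // CJK range is an assumption
    else enChars++;                            // everything else billed at the en ratio
  }
  return Math.ceil(enChars / en + zhChars / zh);
}
```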
// TOKEN COUNT COMPARISON ─────────────
// paste text to compare...
┌─ HISTORY (LAST 10)
// no history yet
┌─ EXPORT / INFO
// summary output:
// analyze text to generate summary...
// shell keybindings:
  Ctrl+V   → paste & analyze
  Ctrl+A   → select all text
  Drop     → load .txt/.md/.json
  ?text=   → URL pre-fill
// TOKEN-COUNTER v2.0 | BPE: js-tiktoken (esm.sh) | EMPIRICAL: claude, gemini user@localhost:~$