"the day she picked a pฬอrฬoฬธtecฬอtฬจedฬทอ fออกlฬoฬตอขwออeฬrฬอ the sky อกาอาอขfฬกอeาอ ฤผฬทฬตฬดฬตอออฤผฬทฬตฬดฬตอออ"
Never2Late is an ██████ poetry-██████ making machine ████████ from what is ████████ lost. In the spaces between ███ █████ on WordReference.com, ███████ oblivion, we find the ██████████ of language from ███████ our collective ████.
Developed for la Fรชte des Fleurs, 2025 edition.
"๐ป๐๐๐ฅ๐ ๐๐ค ๐๐ ๐ฅ ๐ฅ๐๐ ๐๐๐ ๐ ๐ ๐๐๐๐ฆ๐ฅ๐ช. โ๐ ๐๐ฅ๐ฃ๐ช ๐๐ค ๐ฅ๐๐ ๐ฃ๐๐ค๐ฆ๐ฃ๐ฃ๐๐๐ฅ๐๐ ๐ ๐ ๐ ๐๐๐๐๐๐๐."
───────────────────────────────────────────────────────────────
Credits and Acknowledgments
- Profuse thanks to everest pipkin, whose work "i've never picked a protected flower" inspired this project. Their 400-page generative glitch poetry PDF is IMO a masterpiece of digital art and concrete poetry <3
- The original forum thread which inspired the project's name: "i've never picked a protected flower"
- See the source at github.com/everestpipkin/never
Instead of relying on external APIs for conceptual relationships, we use local NLP models (spaCy plus vector-similarity computations over pre-trained English word vectors) to retrieve semantically proximate words directly from the corpus.
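The retrieval idea can be sketched with plain cosine similarity. The toy 3-d vectors below are stand-ins for the 300-d vectors of `en_core_web_lg`; they are illustrative only, not the project's data:

```python
import numpy as np

# Toy stand-ins for spaCy token vectors: the real project uses the 300-d
# vectors of en_core_web_lg; these 3-d vectors are illustrative only.
VECTORS = {
    "flower": np.array([1.0, 0.9, 0.1]),
    "petal":  np.array([0.9, 1.0, 0.2]),
    "rose":   np.array([1.0, 0.8, 0.0]),
    "brick":  np.array([0.0, 0.1, 1.0]),
}

def related_words(seed, vectors, top_n=2):
    """Rank corpus words by cosine similarity to the seed's vector."""
    sv = vectors[seed]
    sims = {}
    for word, vec in vectors.items():
        if word == seed:
            continue
        sims[word] = float(sv @ vec / (np.linalg.norm(sv) * np.linalg.norm(vec)))
    return sorted(sims, key=sims.get, reverse=True)[:top_n]

related_words("flower", VECTORS)  # → ["rose", "petal"]
```

With real spaCy vectors the loop is the same, just over the cleaned corpus vocabulary instead of a hand-written dict.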
The project is composed of four independent yet interconnected parts:
- scraper.py – extracts raw language from the collective unconscious
- clean.py – separates the noise from the signal
- generator.py – transforms random words into semantic poetry
- wallpaper.py – encases the poems into unicode patterns
───────────────────────────────────────────────────────────────
```bash
# Clone the repository
git clone https://github.com/PLNech/never2.git
cd never2

# Optional but recommended: create a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install spacy beautifulsoup4 requests numpy
python -m spacy download en_core_web_lg
```

1. Scraping Poetry Ingredients
```bash
python scraper.py --start 1 --end 100 --delay 2 --batch 20
```

Options:

- `--start START` - Starting page number
- `--end END` - Ending page number
- `--delay DELAY` - Delay between requests (seconds)
- `--batch BATCH` - Save to disk after this many pages
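The scraping loop behind these flags can be sketched as below. The URL pattern and the `div.message` CSS selector are illustrative assumptions, not the values scraper.py actually uses:

```python
import csv
import time

import requests
from bs4 import BeautifulSoup

def extract_posts(html):
    """Pull post texts out of one forum page. The CSS selector is an
    illustrative assumption, not the selector scraper.py actually uses."""
    soup = BeautifulSoup(html, "html.parser")
    return [p.get_text(strip=True) for p in soup.select("div.message")]

def scrape(start, end, delay=2.0, batch=20, out="english.csv"):
    """Hypothetical main loop mirroring the CLI flags above."""
    rows = []
    for page in range(start, end + 1):
        # placeholder URL pattern, not the real target
        resp = requests.get(f"https://example.org/forum/page-{page}")
        rows += [[page, text] for text in extract_posts(resp.text)]
        if page % batch == 0:  # --batch: flush to disk every N pages
            with open(out, "a", newline="", encoding="utf-8") as f:
                csv.writer(f).writerows(rows)
            rows.clear()
        time.sleep(delay)  # --delay: be polite between requests
```

The `--delay` sleep matters: the corpus comes from a live forum, and slow, batched requests keep the scrape gentle.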
"In the gaps between forum words, I found sentences that never existed."
```bash
python clean.py -i english4.csv -o clean4.csv
```

Options:

- `-i INPUT, --input INPUT` - Input CSV file (default: english4.csv)
- `-o OUTPUT, --output OUTPUT` - Output CSV file (default: derived from input)
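clean.py's exact filters aren't documented here; a plausible sketch of separating signal from noise (URL stripping, whitespace collapsing, length filtering) might look like:

```python
import re

def clean_sentence(text):
    """Illustrative cleaning pass (an assumption, not clean.py's actual
    rules): strip URLs, then collapse runs of whitespace."""
    text = re.sub(r"https?://\S+", "", text)   # drop links
    text = re.sub(r"\s+", " ", text).strip()   # collapse whitespace
    return text

def keep(text, min_words=3):
    """Keep only sentences long enough to seed a poem line."""
    return len(text.split()) >= min_words

raw = "see  https://example.org  the day she picked a flower"
cleaned = clean_sentence(raw)  # → "see the day she picked a flower"
```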
"Language is a virus ๐๐๐๐ ๐๐๐๐๐ ๐๐๐๐๐."
```bash
python generator.py -i clean4.csv -n 20 -l 5 -f html --feet 575 --cache vectors_cache.pkl
```

Options:

- `-i INPUT, --input INPUT` - Input CSV file with cleaned sentences
- `-n NUM_POEMS, --num-poems NUM_POEMS` - Number of poems to generate
- `-l LENGTH, --length LENGTH` - Maximum number of lines per poem
- `-m MODEL, --model MODEL` - spaCy model to use (default: en_core_web_lg)
- `-o OUTPUT_DIR, --output-dir OUTPUT_DIR` - Directory to save generated poems
- `-f {txt,html,json}, --format {txt,html,json}` - Output format
- `-s SEED, --seed SEED` - Initial seed word for poem generation
- `-p PORT, --port PORT` - Run as HTTP server on specified port
- `-b BATCH, --batch BATCH` - Generate a large batch of poems (specify count)
- `-w WORKERS, --workers WORKERS` - Number of worker processes for batch generation
- `--cache CACHE` - Cache file for word vectors and similarity
- `-r RELATED, --related RELATED` - Test related words
- `--feet FEET` - Pattern for syllable counts (e.g., "575" for haiku, "12x4" for alexandrines)
- `--test` - Run tests for word similarity
```
Loading spaCy model: en_core_web_lg...
Loading cache from .poem_cache.bin...
Loaded 1545 word vectors, 310284 similarity pairs, and 216 related words sets
Loading data from clean.csv...
Loaded 10268 sentences from 2166 users
Found 15 words related to 'thought': ['think', 'believed', 'probably', 'because', 'something', 'understanding', 'knowing', 'deliberately', 'remember', 'imperceptibly']
Using cached related words for 'blush'
Found 15 words related to 'trail': ['road', 'mountainside', 'roadway', 'mountain', 'route', 'canyon', 'path', 'ridge', 'mountains', 'outcropping']
Using cached related words for 'flame'
Using cached related words for 'decay'
Using cached related words for 'sky'
[...]
Using cached related words for 'brick'
Using cached related words for 'boundary'
Saved 100 poems to poems
Saving cache to .poem_cache.bin...
Saved 1545 word vectors, 322200 similarity pairs, and 225 related words sets
```
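`--feet` patterns like "575" and "12x4" can be interpreted as below. This is a sketch consistent with the two examples the option documents, not necessarily generator.py's exact logic, and the syllable counter is a naive vowel-group heuristic:

```python
import re

def parse_feet(pattern):
    """'575' -> [5, 7, 5] (haiku); '12x4' -> [12, 12, 12, 12] (alexandrines)."""
    if "x" in pattern:
        syllables, lines = pattern.split("x")
        return [int(syllables)] * int(lines)
    return [int(digit) for digit in pattern]

def count_syllables(word):
    """Naive heuristic: one syllable per vowel group, minimum one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

parse_feet("575")          # → [5, 7, 5]
count_syllables("flower")  # → 2
```

A generator can then greedily pick candidate words until each line's syllable budget is spent.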
"Every poem is an epitaph. ๏ผไบบโ โฟโฟ โไบบ๏ผผ"
```bash
python wallpaper.py -W 80 -H 24 -g pmm --poem poems/poem_1.txt -f html -o pattern.html
```

Options:

- `-W WIDTH, --width WIDTH` - Width of the pattern in characters
- `-H HEIGHT, --height HEIGHT` - Height of the pattern in characters
- `-s SEED, --seed SEED` - Random seed for pattern generation
- `-g {p1,pm,pmm,...}, --group {p1,pm,pmm,...}` - Specific wallpaper group to use
- `-p PORT, --port PORT` - Run as web server on specified port
- `-o OUTPUT, --output OUTPUT` - Output file path
- `-f {txt,html}, --format {txt,html}` - Output format
- `--poem POEM` - Text file containing poem to embed
- `-D DENSITY, --density DENSITY` - Character density (higher = more characters)
- `-b {white,black}, --background {white,black}` - Background color for HTML output
- `-i [INTERACTIVE], --interactive [INTERACTIVE]` - Run in interactive mode with auto-updates
- `--chaos` - Run in chaos mode with varying density and update intervals
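Wallpaper groups like `p1`, `pm`, and `pmm` are symmetry recipes for tiling the plane. A minimal sketch of the `pmm` idea (assumed, and much simpler than wallpaper.py): fill one random quadrant, then mirror it across both axes:

```python
import random

def pmm_pattern(width, height, chars="·░▒▓█", seed=0):
    """Sketch of the pmm wallpaper group: generate one quadrant, then
    mirror it horizontally and vertically. Illustrative only, not
    wallpaper.py's actual implementation."""
    rng = random.Random(seed)
    qw, qh = width // 2, height // 2
    quadrant = [[rng.choice(chars) for _ in range(qw)] for _ in range(qh)]
    rows = ["".join(r) + "".join(reversed(r)) for r in quadrant]  # left-right mirror
    return rows + rows[::-1]                                      # top-bottom mirror

for line in pmm_pattern(40, 8):
    print(line)
```

Other groups swap the mirror steps for translations (`p1`) or a single reflection (`pm`).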
The -D parameter controls the density of Unicode characters in the output. Here are examples with increasing density values:
| Command | Results |
|---|---|
| `-D 50 -H 10 -W 40` | ![]() |
| `-D 100 -H 12 -W 40` | ![]() |
| `-D 500 -H 28 -W 80` | ![]() |
| `-D 1000 -H 10 -W 40` | ![]() |
- Rework wallpaper.py: Current implementation is far from the unicode magic of everest's original project. Needs significant upgrades to achieve bolder aesthetic impact.
- Add support for more output formats and embedding options
- Improve interactive visualization modes
- Rework web interface
- Enhance thematic poem generation
- Fine-tune feet/syllable matching logic
───────────────────────────────────────────────────────────────
This project is free software. Take it, use it, modify it, reshare it. Just add/download your own dataset!
๐ ๐ ๐ ๐ ๐ ๐ ข๐ ๐ ๐ ค๐ ๐ ๐ ๐ ก ๐ ฃ๐ ๐ ๐ ๐ ๐ ค ๐ ๐ ๐ -๐น.๐ถ






