# Korean Learning Advanced TUI

## Overview

Advanced Text User Interface (TUI) for Korean language learning with multi-backend AI support. Built with Python's Textual framework for a modern terminal experience.

(Screenshots will be added)
## Features

- Rich Terminal Interface: Modern TUI with colors, borders, and responsive design
- Multi-Backend Support: Switch between Ollama (Mistral) and self-hosted models
- Korean Learning Focus: Specialized prompts and learning modes
- Keyboard Navigation: Intuitive controls and shortcuts
- Session Logging: Track learning progress and conversations
- Dark/Light Mode: Toggle interface themes
## Quick Start

### Prerequisites

- Python 3.8+ (3.10+ recommended)
- Modern terminal with Unicode support
- At least one AI backend running (Ollama or self-hosted)
### Installation

```bash
# Navigate to the repository
cd /home/hsyyu/repo/kor2unity

# Install dependencies
pip install textual httpx

# Run the TUI
cd scripts
python kor2unity_advanced_tui.py
```
### One-Line Start (with Ollama)

```bash
cd /home/hsyyu/repo/kor2unity/scripts && python kor2unity_advanced_tui.py
```
## Documentation

- [Complete Setup Guide](docs/KOREAN_TUI_SETUP.md) - Detailed installation and configuration
- [Quick Start Guide](docs/QUICK_START.md) - Get running in 5 minutes
- API Documentation - Backend endpoint details
## Architecture

### File Structure

```text
/home/hsyyu/repo/kor2unity/
├── scripts/
│   ├── kor2unity_advanced_tui.py   # Main TUI application
│   └── llm_api.py                  # Self-hosted API server
├── docs/
│   ├── KOREAN_TUI_SETUP.md         # Complete setup guide
│   ├── QUICK_START.md              # Quick start guide
│   └── screenshots/                # Demo images
├── logs/
│   └── korean_learning.log         # Session logs
└── requirements.txt                # Dependencies
```
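The `logs/korean_learning.log` file above receives session logs. Below is a minimal sketch of how such logging could be wired with the standard library, assuming the `LOG_LEVEL` variable described under Environment Variables; the helper name and log format are illustrative, not the repository's actual code.

```python
import logging
import os
from pathlib import Path


def setup_session_logger(log_dir: str = "logs") -> logging.Logger:
    """Create a file logger writing to logs/korean_learning.log."""
    Path(log_dir).mkdir(parents=True, exist_ok=True)
    logger = logging.getLogger("korean_learning")
    # LOG_LEVEL is described under "Environment Variables" below
    logger.setLevel(os.getenv("LOG_LEVEL", "INFO").upper())
    handler = logging.FileHandler(Path(log_dir) / "korean_learning.log", encoding="utf-8")
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    return logger


# Example: record one exchange from a learning session
logger = setup_session_logger()
logger.info("user: 안녕하세요")
logger.info("assistant: 안녕하세요! How can I help you practice Korean today?")
```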
### Supported Backends

| Backend     | Port  | Model           | Purpose                      |
|-------------|-------|-----------------|------------------------------|
| Ollama      | 11434 | Mistral         | Fast general responses       |
| Self-hosted | 8204  | Llama2+MiniGPT4 | Korean learning specialized  |
| Container   | 8203  | Ollama          | Fallback option              |
| Legacy      | 8201  | Various         | Compatibility                |
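These backends expect different request bodies. The sketch below mirrors the payload shapes from the curl commands under Health Checks; the `build_payload` helper and its dispatch rule are illustrative, not the TUI's actual code.

```python
import os


def build_payload(backend: str, prompt: str) -> dict:
    """Return a request body matching the backend's expected schema."""
    if backend.startswith("ollama"):
        # Ollama's /api/generate expects model / prompt / stream
        return {
            "model": os.getenv("MISTRAL_MODEL", "mistral"),
            "prompt": prompt,
            "stream": False,
        }
    # The self-hosted /korean-learning endpoint takes question / korean_mode
    return {"question": prompt, "korean_mode": True}


print(build_payload("ollama_primary", "Translate 'hello' to Korean"))
print(build_payload("self_hosted", "Explain Korean honorifics"))
```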
## Usage

### Keyboard Controls

- Enter → Send message
- e → Switch API endpoint
- d → Toggle dark/light mode
- Ctrl+E → Scroll to bottom
- q → Quit application
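For reference, the sketch below shows how shortcuts like these are typically declared in Textual via the `BINDINGS` class attribute. Only the keys themselves come from this README; the action names and bodies are illustrative, and Enter is normally handled by the input widget's submit event rather than a binding.

```python
from textual.app import App


class KoreanTuiSketch(App):
    # (key, action name, description shown in the footer)
    BINDINGS = [
        ("e", "switch_endpoint", "Switch API endpoint"),
        ("d", "toggle_dark", "Toggle dark/light mode"),  # built-in App action
        ("ctrl+e", "scroll_bottom", "Scroll to bottom"),
        ("q", "quit", "Quit"),  # built-in App action
    ]

    def action_switch_endpoint(self) -> None:
        # Placeholder: the real TUI would cycle through KOREAN_API_ENDPOINTS
        self.notify("Switching endpoint...")

    def action_scroll_bottom(self) -> None:
        # Placeholder: the real TUI would scroll its chat log widget
        pass


if __name__ == "__main__":
    KoreanTuiSketch().run()
```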
### Learning Commands

```text
# Basic Korean greetings
안녕하세요 (Hello)

# Grammar questions
How do I conjugate verbs in Korean?

# Translation requests
Translate "I love learning Korean" to Korean

# Cultural questions
Tell me about Korean honorifics
```
## Configuration

### API Endpoints

Edit `scripts/kor2unity_advanced_tui.py`:

```python
KOREAN_API_ENDPOINTS = {
    "ollama_primary": "http://localhost:11434/api/generate",
    "self_hosted": "http://localhost:8204/korean-learning",
    "ollama_container": "http://localhost:8203/api/generate",
    "legacy_api": "http://localhost:8201/chat"
}
```
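As a rough illustration, the snippet below posts a prompt to the `ollama_primary` endpoint with `httpx` (already a dependency). The `send_prompt` helper is hypothetical; the payload fields and the `response` key follow Ollama's `/api/generate` API.

```python
import httpx


def send_prompt(prompt: str, timeout: float = 60.0) -> str:
    """POST a prompt to the ollama_primary endpoint and return its reply."""
    payload = {"model": "mistral", "prompt": prompt, "stream": False}
    resp = httpx.post(
        "http://localhost:11434/api/generate",
        json=payload,
        timeout=timeout,
    )
    resp.raise_for_status()
    # With stream=false, Ollama returns the full reply in "response"
    return resp.json()["response"]


if __name__ == "__main__":
    print(send_prompt("Say hello in Korean."))
```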
### Environment Variables

```bash
# Optional: Set custom model
export MISTRAL_MODEL="mistral:7b"

# Optional: Set custom timeout
export OLLAMA_TIMEOUT_SECONDS="60"

# Optional: Set log level
export LOG_LEVEL="INFO"
```
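A sketch of how the TUI might read these variables with standard-library `os.getenv`; only the variable names come from this README, and the fallback defaults shown are assumptions.

```python
import os

# Variable names come from this README; the fallback values are assumptions
MISTRAL_MODEL = os.getenv("MISTRAL_MODEL", "mistral")
OLLAMA_TIMEOUT_SECONDS = float(os.getenv("OLLAMA_TIMEOUT_SECONDS", "60"))
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO").upper()

print(f"model={MISTRAL_MODEL} timeout={OLLAMA_TIMEOUT_SECONDS}s log_level={LOG_LEVEL}")
```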
## Troubleshooting

### Common Issues

```bash
# Missing dependencies
pip install textual httpx

# Ollama not running
ollama serve

# Self-hosted API not running
cd /home/hsyyu/repo/kor2unity/scripts && python llm_api.py

# Korean fonts not displaying
sudo apt-get install fonts-noto-cjk

# Check running services
netstat -tulpn | grep -E "(8204|11434)"
```
### Health Checks

```bash
# Test Ollama
curl -X POST http://localhost:11434/api/generate -d '{"model":"mistral","prompt":"test","stream":false}'

# Test self-hosted API
curl -X POST http://localhost:8204/korean-learning -d '{"question":"test","korean_mode":true}'

# View logs
tail -f ~/repo/kor2unity/logs/korean_learning.log
```
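The same checks can be scripted with `httpx`, reusing the test payloads from the curl commands above. The `check_backends` helper below is an illustrative sketch, not part of the repository.

```python
import httpx

# Endpoint URLs and test payloads mirror the curl commands above
CHECKS = {
    "ollama_primary": (
        "http://localhost:11434/api/generate",
        {"model": "mistral", "prompt": "test", "stream": False},
    ),
    "self_hosted": (
        "http://localhost:8204/korean-learning",
        {"question": "test", "korean_mode": True},
    ),
}


def check_backends() -> None:
    """Report the HTTP status (or error) for each configured backend."""
    for name, (url, payload) in CHECKS.items():
        try:
            resp = httpx.post(url, json=payload, timeout=10.0)
            print(f"{name}: HTTP {resp.status_code}")
        except httpx.HTTPError as exc:
            print(f"{name}: unreachable ({exc})")


if __name__ == "__main__":
    check_backends()
```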
## Contributing

### Development Setup

```bash
# Install development dependencies
pip install -r requirements.txt
pip install pytest black flake8

# Format code
black scripts/kor2unity_advanced_tui.py

# Run tests
pytest tests/
```
### Adding New Features

- New endpoints: Modify the `KOREAN_API_ENDPOINTS` dict (see the sketch below)
- UI components: Add to the Textual `compose()` method
- Learning modes: Extend the `learning_mode` options
- Keyboard shortcuts: Update the `BINDINGS` list
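Below is a sketch of the first and last extension points: registering a hypothetical new backend in `KOREAN_API_ENDPOINTS` and giving it a shortcut in `BINDINGS`. The `my_backend` URL, the `n` key, and the `use_my_backend` action name are illustrative examples, not part of the shipped application.

```python
# Register a hypothetical new backend alongside the existing entries
KOREAN_API_ENDPOINTS = {
    "ollama_primary": "http://localhost:11434/api/generate",
    "self_hosted": "http://localhost:8204/korean-learning",
    "ollama_container": "http://localhost:8203/api/generate",
    "legacy_api": "http://localhost:8201/chat",
    "my_backend": "http://localhost:9000/api/generate",  # hypothetical
}

# Give it a shortcut by extending BINDINGS, then implement a matching
# action_use_my_backend method on the App subclass (see the Textual
# sketch under "Keyboard Controls")
BINDINGS = [
    ("e", "switch_endpoint", "Switch API endpoint"),
    ("n", "use_my_backend", "Use my_backend"),  # hypothetical new binding
]
```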
## Performance

### Benchmarks

- Startup time: ~2-3 seconds
- Response time: 1-5 seconds (depends on backend)
- Memory usage: ~50-100 MB
- CPU usage: minimal when idle

### Optimization Tips

- Use Ollama for faster responses
- Enable logging only when needed
- Use a modern terminal with GPU acceleration
- Keep model files on SSD storage
## License

MIT License - See LICENSE file for details

## Acknowledgments

- Textual Framework - Rich terminal interface
- Ollama - Local LLM serving
- Korean Language Community - Learning resources and feedback
## Support

- Issues: Create a GitHub issue with logs and screenshots
- Documentation: See the `docs/` directory
- Logs: Check `~/repo/kor2unity/logs/korean_learning.log`

Ready to learn Korean? Start the TUI and type 안녕하세요!