To ensure the latest versions of the listed AI/ML and supporting packages are installed and kept up to date on Ubuntu 24.04, use a combination of apt, pip, and, where needed, dedicated install scripts or tools such as conda or Ollama. Here is a comprehensive command set to do this safely:
1. System Update First
sudo apt update && sudo apt upgrade -y
sudo apt install -y python3-pip git ffmpeg build-essential libgl1
2. Upgrade Pip and Install Miniforge (Conda Alternative)
pip3 install --upgrade pip  # On Ubuntu 24.04 the system Python is marked as externally managed (PEP 668), so this system-wide upgrade is rejected. Use a virtual environment or conda instead.
# Install Miniforge (a minimal, conda-forge-based alternative to Anaconda)
wget https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-x86_64.sh
bash Miniforge3-Linux-x86_64.sh -b -p $HOME/miniforge
eval "$($HOME/miniforge/bin/conda shell.bash hook)"
conda init
Then restart your terminal, and use:
conda update -n base -c conda-forge conda   # Miniforge uses the conda-forge channel rather than defaults
3. Create a Clean Conda Env (Optional but Recommended)
conda create -n aienv python=3.10 -y
conda activate aienv
4. Install/Upgrade AI and Multimedia Packages
PyTorch (latest, with GPU if CUDA available)
pip install --upgrade torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121
TensorFlow (the standard tensorflow package now includes GPU support; the separate tensorflow-gpu package is deprecated)
pip install --upgrade tensorflow
• TensorFlow's numpy requirement is not compatible with the numpy version Coqui TTS requires.
• A numpy version compatible with Coqui TTS (numpy 1.26) was therefore installed instead; see the version check below.
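To confirm that the pinned numpy still imports cleanly alongside both frameworks, a quick check like the following can be run inside the aienv environment (a minimal sketch; the file name check_versions.py is just an example):
# check_versions.py - print installed versions and GPU availability
import numpy as np
import torch
import tensorflow as tf

print("numpy:", np.__version__)
print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("tensorflow:", tf.__version__, "| GPUs:", tf.config.list_physical_devices("GPU"))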
Coqui TTS
pip install --upgrade TTS
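As a quick smoke test after installation, something like the following should synthesize speech to a WAV file (a minimal sketch; the model name shown is one of Coqui's stock English models and can be swapped for any other):
# tts_test.py - synthesize a short sentence with Coqui TTS
from TTS.api import TTS

tts = TTS(model_name="tts_models/en/ljspeech/tacotron2-DDC")  # model is downloaded on first run
tts.tts_to_file(text="Hello from Coqui TTS on Ubuntu.", file_path="hello.wav")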
CrewAI
pip install --upgrade crewai
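CrewAI needs an LLM behind it (by default it expects an OpenAI API key, though it can also be pointed at a local model). The rough shape of a script is sketched below; treat the role/goal strings and the LLM wiring as placeholders rather than a working configuration:
# crew_test.py - minimal CrewAI sketch (assumes an LLM is configured, e.g. via OPENAI_API_KEY)
from crewai import Agent, Task, Crew

researcher = Agent(role="Researcher",
                   goal="Summarize one recent AI news item",
                   backstory="A concise technical analyst.")
task = Task(description="Find and summarize one recent AI news item.",
            expected_output="A three-sentence summary.",
            agent=researcher)
crew = Crew(agents=[researcher], tasks=[task])
print(crew.kickoff())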
Ollama (install and update models like Mistral)
curl -fsSL https://ollama.com/install.sh | sh
# To download (or update) Mistral 7B and start an interactive prompt
ollama run mistral        # use 'ollama pull mistral' to only fetch/update the model
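Once the server is running (ollama serve, or the service the installer sets up), the model can also be queried from Python over Ollama's local REST API on port 11434. A minimal sketch using the requests package (install it with pip first if it is not already present):
# ollama_test.py - query the local Ollama server
import requests

resp = requests.post("http://localhost:11434/api/generate",
                     json={"model": "mistral",
                           "prompt": "Explain WSL in one sentence.",
                           "stream": False})
print(resp.json()["response"])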
Mistral 7B (if not using Ollama)
Ollama handles this most easily. For raw Hugging Face Transformers:
pip install --upgrade transformers accelerate
# Load Mistral model with:
# from transformers import AutoModelForCausalLM, AutoTokenizer
# model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
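Fleshing out the commented snippet above, a minimal generation script looks roughly like this (a sketch only: the full-precision 7B weights are on the order of 14 GB, so enough RAM or VRAM is required, and device_map="auto" relies on the accelerate package installed above):
# mistral_test.py - generate text with Mistral 7B via Hugging Face Transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # needs accelerate

inputs = tokenizer("WSL is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))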
PySide6 (Qt GUI framework)
pip install --upgrade PySide6
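A quick way to confirm the Qt install works (note that under WSL this needs WSLg or an X server to actually show the window):
# qt_test.py - minimal PySide6 window
import sys
from PySide6.QtWidgets import QApplication, QLabel

app = QApplication(sys.argv)
label = QLabel("PySide6 is working")
label.show()
sys.exit(app.exec())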
Scrapy (web scraper)
pip install --upgrade scrapy
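A single-file spider is enough to verify the install; run it with scrapy runspider quotes_spider.py (the file name and target site are just examples):
# quotes_spider.py - minimal Scrapy spider
import scrapy

class QuotesSpider(scrapy.Spider):
    name = "quotes"
    start_urls = ["https://quotes.toscrape.com"]

    def parse(self, response):
        for quote in response.css("div.quote"):
            yield {"text": quote.css("span.text::text").get(),
                   "author": quote.css("small.author::text").get()}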
manim (mathematical animations)
pip install --upgrade manim
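A minimal scene to verify the install; render it with manim -pql scene_test.py CircleScene (file and class names are just examples). Note that, as the pip check output in step 5 shows, manim's numpy requirement conflicts with the numpy pin chosen for Coqui TTS, so it may need its own environment:
# scene_test.py - minimal Manim animation
from manim import Scene, Circle, Create

class CircleScene(Scene):
    def construct(self):
        self.play(Create(Circle()))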
ffmpeg (audio/video processing; already installed via apt in step 1)
sudo apt update
sudo apt install -y ffmpeg
Ubuntu 24.04's standard repositories already provide a current ffmpeg, so no extra PPA is needed (ppa:mc3man/trusty-media targets much older releases and should not be added here).
moviepy (video editing)
pip install --upgrade moviepy
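A quick check that moviepy can open and re-encode a clip (a sketch; input.mp4 is a placeholder, and note that in moviepy 1.x the import path is moviepy.editor rather than moviepy):
# clip_test.py - open a video and write a re-encoded copy (moviepy 2.x import path)
from moviepy import VideoFileClip

clip = VideoFileClip("input.mp4")      # placeholder file name
print("Duration (s):", clip.duration)
clip.write_videofile("output.mp4")     # re-encodes via ffmpeg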
newspaper3k (news scraping)
pip install --upgrade newspaper3k
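Typical usage: download an article URL and pull out the title and body text (the URL is a placeholder):
# article_test.py - extract an article with newspaper3k
from newspaper import Article

article = Article("https://example.com/some-article")  # placeholder URL
article.download()
article.parse()
print(article.title)
print(article.text[:200])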
trafilatura (web text extraction)
pip install --upgrade trafilatura
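trafilatura works on raw HTML or a fetched page; a minimal sketch (placeholder URL):
# extract_test.py - extract main text from a web page with trafilatura
import trafilatura

downloaded = trafilatura.fetch_url("https://example.com")  # placeholder URL
print(trafilatura.extract(downloaded))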
beautifulsoup4 (HTML parsing)
pip install --upgrade beautifulsoup4
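Basic parsing example using the built-in html.parser backend:
# soup_test.py - parse HTML with BeautifulSoup
from bs4 import BeautifulSoup

html = "<html><body><a href='https://example.com'>link</a></body></html>"
soup = BeautifulSoup(html, "html.parser")
for a in soup.find_all("a"):
    print(a.get("href"), a.text)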
5. Final Cleanup & Verification
pip check # checks for broken dependencies
(aienv) acjegan1976@DESKTOP-AIQC6S4:~$ pip check
tts 0.22.0 has requirement numpy==1.22.0; python_version <= "3.10", but you have numpy 1.26.0.
manim 0.19.0 has requirement numpy>=2.1; python_version >= "3.10", but you have numpy 1.26.0.
(aienv) acjegan1976@DESKTOP-AIQC6S4:~$
conda list # verify all versions if using conda
Optional: Freeze Environment for Reproducibility
pip freeze > requirements.txt
Ollama installation check (the first run of ollama serve generates a new key pair)
(aienv) acjegan1976@DESKTOP-AIQC6S4:~$ ollama serve
Couldn't find '/home/acjegan1976/.ollama/id_ed25519'. Generating new private key.
Your new public key is:
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC/fEU4oFvAO172rFRz08TmfAZK18K2lDxtBEwOCKF3J