Ollama’s Post

Ollama reposted this

It's pretty wild that you can now run a 405b model on your own hardware. Meta is just killing it with these models.

Mostafa Sherif

Enabling Automotive Enthusiasts to Reach Their Destinations 💜 | Specialist in Automotive Operations Management | Expertise in Dealership Management and Service Retail Operations

4mo

It is awesome how Ollama easily deploys these huge models, but I wouldn't call cloud computing on AMD's MI300X "your own hardware." :)

Jason Head

Let's make work better for everyone - Software, AI, Robotics

4mo

Where can someone buy a MI300X?

Andrei Fedorov

Data Scientist & Telecom Expert

3mo

I'm literally laughing when people seriously discuss private LLMs as if all of this were done on regular consumer-budget hardware. 🤦‍♂️
