Ollama’s Post

Ollama reposted this

Pradeep M.

Data Science + Full Stack Dev | Data Science @ IIT Madras | ✍🏻 blog.ppml.me | 🎥 YouTuber | 🎹 Digital Music

Just tried Ollama with Llama 3.1. It's really amazing for testing your LangChain application. Its 8B-parameter version ran easily on my base-model MacBook Air M1. So if you're just starting out with LLMs and want to try open-source models like Llama 3.1 locally without spending on API calls, give it a try...

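For anyone who wants to reproduce this, here is a minimal sketch of pointing a LangChain app at a local Ollama server, assuming you have already run "ollama pull llama3.1" and installed the langchain-ollama integration package (in older LangChain releases the same class is available from langchain_community.chat_models):

    from langchain_ollama import ChatOllama

    # Talks to the local Ollama server (http://localhost:11434 by default),
    # so no API key or paid endpoint is needed.
    llm = ChatOllama(model="llama3.1")  # the 8B variant mentioned in the post

    reply = llm.invoke("Summarize what Ollama does in one sentence.")
    print(reply.content)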
francesco agati

Senior Developer with 25 years of experience in fullstack, backend, frontend, AI LLM GPT and mobile apps. Ruby, Python, Javascript, TypeScript, Dart, php, elixir, C#, haxe, angular, react, lisp, smalltalk - Yoga Teacher

3mo

16gb?

Lakshmikanth MN

Intern @ U R Rao Satellite Centre (ISRO) 🚀 | AWS Solution Architect | VPC | IAM | EC2 | Load Balancer | Generative AI

2mo

Can you help me out with how to train a local Ollama model on our own custom data?
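Ollama itself does not fine-tune models; a common way to get your own data in front of a local model is retrieval: embed your documents locally and pass the relevant chunks to Llama 3.1 at query time. A rough sketch, assuming the langchain-ollama and faiss-cpu packages are installed and llama3.1 is already pulled (the sample texts and question below are placeholders, and a dedicated embedding model such as nomic-embed-text could be swapped in):

    from langchain_ollama import ChatOllama, OllamaEmbeddings
    from langchain_community.vectorstores import FAISS

    # Your custom data, chunked into short passages (placeholder examples).
    texts = [
        "Our product X ships with a 2-year warranty.",
        "Support hours are 9am-5pm IST, Monday to Friday.",
    ]

    # Embed the passages locally and index them in an in-memory FAISS store.
    store = FAISS.from_texts(texts, OllamaEmbeddings(model="llama3.1"))

    question = "What is the warranty period for product X?"
    context = "\n".join(d.page_content for d in store.similarity_search(question, k=2))

    # Ask the local model, grounding it in the retrieved chunks.
    llm = ChatOllama(model="llama3.1")
    answer = llm.invoke(f"Answer using only this context:\n{context}\n\nQuestion: {question}")
    print(answer.content)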

TRAN Franck

Innovation Software Architect

3mo

Did you enable GPU acceleration, or did you run it on the CPU only?

Preslav Rachev

I build digital products, write books, and teach people. But beware, I also ask too many "whys". Want something that makes a difference? Let’s talk.

3mo

What is the 8B-parameter model good for? Working with documents?
