[Question]: parsing file failed #2222
Comments
Same. I am trying to ingest a .csv file using the latest docker-compose and Ollama (nomic-embed-text). Any ideas?
My OS is Ubuntu 20.04; fixed by upgrading to 22.04.
I'm having the same problem too, using Ubuntu 24.04.1.
Same here, running a locally built Docker image on a Mac. By the way, looking into database.log gives: [...]. I am using local Ollama, chat model llama3.2:1b, embedding model "snowflake-arctic-embed".
I had this error due to low storage space. Please check the status of ES through the RAGFlow user settings --> System page. If it is red, please follow https://www.elastic.co/guide/en/elasticsearch/reference/8.15/red-yellow-cluster-status.html#fix-red-yellow-cluster-status to diagnose the cause.
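For anyone checking this outside the RAGFlow UI, the same health information can be queried from Elasticsearch directly. A minimal sketch, assuming the ES container from RAGFlow's docker compose is reachable on localhost:1200 as the `elastic` user with the `ELASTIC_PASSWORD` from `docker/.env` (adjust host, port, and credentials to your deployment):

```bash
# Query cluster health; "status" in the response is green, yellow, or red.
# Host, port, and credentials are assumptions -- adjust to your setup.
curl -u elastic:"${ELASTIC_PASSWORD}" "http://localhost:1200/_cluster/health?pretty"
```

A red status together with a nearly full disk points to the watermark problem described in the linked guide.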
This solved my problem, the following command in particular, as it matches what I saw in my logs:
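The exact command isn't preserved above. For reference, the fix most often cited in the linked Elasticsearch guide for a disk-watermark lockout clears the flood-stage read-only block on all indices; this may or may not be the command the commenter meant. A sketch, using the same assumed endpoint and credentials as earlier, to be run only after freeing disk space (otherwise ES re-applies the block):

```bash
# Clear the read_only_allow_delete block that Elasticsearch sets when the
# flood-stage disk watermark is exceeded. Endpoint and credentials are assumptions.
curl -u elastic:"${ELASTIC_PASSWORD}" -X PUT "http://localhost:1200/_all/_settings" \
  -H 'Content-Type: application/json' \
  -d '{"index.blocks.read_only_allow_delete": null}'
```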
I'm happy to share my experience here. The cause in my case was lack of disk space: Elasticsearch monitors storage usage and stops allocating shards once usage goes above its disk watermark thresholds. Below is how I discovered and solved the problem.
1. Find the username and password for Elasticsearch by using ...
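The remaining steps of this walkthrough were not captured. A hedged sketch of what that diagnosis typically looks like with RAGFlow's docker setup (the `docker/.env` location, port 1200, and the `elastic` username are assumptions):

```bash
# 1. Find the Elasticsearch password (RAGFlow's compose setup usually reads it from
#    docker/.env; the built-in superuser is named "elastic").
grep ELASTIC_PASSWORD docker/.env

# 2. Check per-node disk usage as Elasticsearch sees it; a high disk.percent means the
#    disk watermarks that stop shard allocation have been, or soon will be, exceeded.
curl -u elastic:"${ELASTIC_PASSWORD}" "http://localhost:1200/_cat/allocation?v"
```

Freeing disk space (or raising the watermark thresholds) and then clearing the read-only block as shown earlier should let chunk insertion resume.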
Describe your problem
[ERROR]Insert chunk error, detail info please check ragflow-logs/api/cron_logger.log. Please also check ES