Topics Engaged In
Viewing 3 topics - 1 through 3 (of 3 total)
- Topic
- Voices
- Last Post
- 180
- 3 weeks, 4 days ago
How to run the Mixtral mixture-of-experts model on your local computer today

I will be presenting a full article on this soon. However, here is a way to get the very powerful open-source Mixtral 8x7B running on your computer. It is fast but barebones, as it uses your computer's "terminal". We start with a program called Ollama. Download here: https://ollama.ai/download. It will go through an extensive install, but it will not include any AI models.

CPU requirements: For best performance, a modern multi-core CPU is recommended. An Intel Core i7 from 8th gen onward or an AMD Ryzen 5 from 3rd gen onward will work well; a 6-core or 8-core CPU is ideal. Higher clock speeds also improve prompt processing, so aim for 3.6GHz or more. CPU instruction sets like AVX, AVX2, and AVX-512 can further improve performance if available. The key is a reasonably modern consumer-level CPU with a decent core count and clock speed, along with baseline vector processing through AVX2 (required for CPU inference with llama.cpp). With those specs, the CPU should handle Mixtral's model size.

Ollama can run many AI models, but it is one of the best ways to run the very powerful Mixtral. Once installed, open your terminal (search how to do this if you don't know) and enter this:

ollama run mixtral

It will take a few minutes to download the model, and you will need about 28-40GB of disk space. Note that Mixtral 8x7B has roughly 47 billion total parameters, so even the 4-bit quantized model is about 26GB and needs a comparable amount of RAM to run. I will write more soon. I recommend you only try this if you understand what is above and have the disk space to spare.
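The AVX2 requirement mentioned above can be checked programmatically before you bother downloading the model. A minimal sketch (Linux-specific, since it reads /proc/cpuinfo; the function name is illustrative — on macOS you would run `sysctl machdep.cpu` in the terminal instead):

```python
def has_avx2(cpuinfo_path="/proc/cpuinfo"):
    """Return True if the CPU's flag lines in /proc/cpuinfo list avx2."""
    try:
        with open(cpuinfo_path) as f:
            # Each core has a "flags" line listing supported instruction sets.
            return any("avx2" in line for line in f if line.startswith("flags"))
    except FileNotFoundError:
        # No /proc/cpuinfo (e.g. macOS or Windows) - can't tell from here.
        return False

if __name__ == "__main__":
    print("AVX2 supported:", has_avx2())
```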
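Beyond the interactive `ollama run mixtral` session, Ollama also serves a local REST API (by default at http://localhost:11434) that you can script against. A minimal sketch, assuming a default Ollama install with the mixtral model already pulled:

```python
import json
import urllib.request

# Default endpoint of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="mixtral"):
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete response instead of a stream
    }

def ask_mixtral(prompt):
    """Send a prompt to the local Ollama server and return the reply text."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_mixtral("Explain mixture-of-experts in one sentence."))
```

This is the same model you pulled with `ollama run mixtral`; the API just lets other programs talk to it.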
- 4
- 1 year, 6 months ago
Personal data consolidation ideas

I recently started pulling together all my personal data, in the hopes of connecting it to various open and local AI models, with the intent of building a group of AI agents to advise me and help organize my life. Sources I have started to connect include financial data, health app data, notebooks, picture libraries, and books I've read. I am trying to make it all accessible to AI models for the purpose of optimization. I'm wondering if others have started such work: what types of data are they connecting, how do they integrate the models, etc.? Basically, any ideas on how a person could leverage AI technology to benefit the individual. I am envisioning a system/collection of AI advisors that uses this data to help optimize all the various categories of a person's life (physical health, emotional health, business, personal finance, relationships, etc.).
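One low-effort starting point for consolidating text sources like the ones described above is a single local full-text index that a local model's retrieval step could query later. A minimal sketch using SQLite's FTS5 extension (the table and function names are illustrative, not any particular tool's API):

```python
import sqlite3

def build_index(db_path, records):
    """Index (source, content) pairs, e.g. ("notebook", "..."), for search."""
    con = sqlite3.connect(db_path)
    # FTS5 virtual table gives full-text search over both columns.
    con.execute(
        "CREATE VIRTUAL TABLE IF NOT EXISTS docs USING fts5(source, content)"
    )
    con.executemany("INSERT INTO docs (source, content) VALUES (?, ?)", records)
    con.commit()
    return con

def search(con, query):
    """Return matching (source, content) rows for a full-text query."""
    return con.execute(
        "SELECT source, content FROM docs WHERE docs MATCH ?", (query,)
    ).fetchall()
```

Everything stays on disk locally, which matters for this kind of personal data; the matched rows can then be pasted into a local model's prompt as context.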
- 2
- 1 year, 6 months ago