forked from jan-leila/nix-config
		
set horizon up to use twilight ollama models

parent 8c36fe5a72
commit f96f9f7675

3 changed files with 26 additions and 2 deletions
@@ -40,7 +40,7 @@
   };
 
   config = {
-    # TODO: configure ollama to download any modules listed in options.host.ai.models.{name}.model if options.host.ai.models.{name}.apiBase is the default value
-    # TODO: if we have any models that have a non null options.host.ai.models.{name}.apiBase then set services.ollama.enable to a lib.mkAfter true
+    # TODO: configure ollama to download any models listed in options.host.ai.models.{name}.model if options.host.ai.models.{name}.apiBase is localhost
+    # TODO: if we have any models that have a non-localhost options.host.ai.models.{name}.apiBase then set services.ollama.enable to a lib.mkAfter true
   };
 }
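The TODOs in the diff could be sketched as a NixOS module fragment. This is only an illustration, not the repository's implementation: the option path `options.host.ai.models.{name}.model` / `.apiBase` comes from the comments, while the localhost check, the `let` helpers, and the use of `services.ollama.loadModels` are assumptions.

```nix
{ config, lib, ... }:
{
  config =
    let
      # Assumed shape: host.ai.models is an attrset of { model, apiBase } submodules.
      models = lib.attrValues config.host.ai.models;
      # Hypothetical check; the real config may compare against a concrete default URL.
      isLocalhost = m: lib.hasInfix "localhost" m.apiBase;
    in
    {
      # Second TODO: if any model points at a non-localhost apiBase,
      # force-enable ollama with a late-merged value.
      services.ollama.enable =
        lib.mkIf (lib.any (m: !isLocalhost m) models) (lib.mkAfter true);

      # First TODO: pull every model whose apiBase is localhost.
      # services.ollama.loadModels exists in nixpkgs, but its use here is a sketch.
      services.ollama.loadModels =
        map (m: m.model) (lib.filter isLocalhost models);
    };
}
```

`lib.mkAfter` only affects merge ordering against other definitions of `services.ollama.enable`, so other modules can still set it earlier in the priority chain.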