Draft:Ollama
A software framework for running large language models.
From Wikipedia, the free encyclopedia
| Ollama | |
|---|---|
| Developers | Ollama, Inc. |
| Written in | Go |
| Operating system | macOS, Linux, Windows |
| Type | Large language model runtime |
| Website | https://ollama.com |

Ollama is an open-source[1] software framework[2], released under the MIT license, for managing and running large language models (LLMs) locally or in the cloud. It offers a command-line interface (CLI) for downloading, installing, running and removing models. By providing a uniform interface to the models it supports, it lets users run models on their own hardware without needing to know the details of each one. Jeffrey Morgan[3] and Michael Chiang[4] founded Ollama in 2021.
Architecture
At the core of Ollama is a local web server that exposes a JSON REST API for managing models and interacting with them. The ollama program starts and manages that server, and its CLI is itself a client of the same REST API.[5]
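As a rough illustration, any HTTP client can talk to this server. The sketch below, based on Ollama's documented /api/generate endpoint, assumes the default listening address localhost:11434 and uses the model name "llama3" purely as an example; the helper names are hypothetical.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listening address (assumed)

def build_generate_request(model: str, prompt: str, stream: bool = False) -> bytes:
    """Build the JSON body for a POST to Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream}).encode()

def generate(model: str, prompt: str) -> str:
    """Send a completion request to a locally running Ollama server."""
    req = urllib.request.Request(
        OLLAMA_URL + "/api/generate",
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False the server replies with a single JSON object
        # whose "response" field holds the generated text.
        return json.loads(resp.read())["response"]

# Requires a running Ollama server with the model already pulled, e.g.:
# print(generate("llama3", "Why is the sky blue?"))
```

The CLI performs the equivalent of these requests internally, which is why the server must be running for `ollama` commands to work.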
On its own, Ollama's web server does not provide a graphical user interface; a separate front end such as Open WebUI can be used for that purpose.[5]
