---
title: "TUIs"
url: "https://learn.omacom.io/1/read/42/tuis"
---

# TUIs

## Lazygit

[Lazygit](https://github.com/jesseduffield/lazygit) is a delightful alternative to something like the GitHub Desktop application, and it runs inside the terminal. 

You can run it directly by going to any directory managed by Git and running `lzg`. Or you can run it inside Neovim, where it can be started with `Space g g`.

You hop between the different panes using `Tab`. In the Files pane, you select files for staging using `Space`, and then you can create a new commit using `c`. You can see all the commands available using `?`.
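
For example, a typical stage-and-commit session from the terminal might look like this (the project path is just a placeholder):

```bash
cd ~/code/my-app   # any directory managed by Git (placeholder path)
lzg                # Omakub's alias for lazygit

# Inside Lazygit:
#   Tab    hop to the next pane
#   Space  stage the selected file (in the Files pane)
#   c      create a commit from the staged changes
#   ?      list all available commands
```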

![lazygit.png](https://manual.omakub.org/u/lazygit-r1sdzd.png)

## Lazydocker

[Lazydocker](https://github.com/jesseduffield/lazydocker) is made in the same spirit as Lazygit, and it also gives you a terminal interface for managing your containers and images.

You can start it yourself in the terminal using `lzd`, or you can use the Docker application that Omakub adds to the dock for it.

You stop a container using `s` or start/restart it using `r`. See all commands using `?`.
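
For example, restarting a misbehaving container could look like this:

```bash
lzd   # Omakub's alias for lazydocker

# Inside Lazydocker:
#   s  stop the selected container
#   r  start/restart it
#   ?  list all available commands
```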

![lazydocker.png](https://manual.omakub.org/u/lazydocker-RfTdXd.png)

## Btop

[Btop](https://github.com/aristocratos/btop) is a beautiful resource manager that shows memory, CPU, disk, and network usage. It also lists all active processes, and allows you to manage them.

Omakub has added a dock app for it called Activity, which you can start from the dock or via Ulauncher. You can also start it in your own terminal by running `btop`.
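
If you run it from your own terminal, a couple of basics (as of recent btop versions) are:

```bash
btop   # launch the resource monitor

# Inside btop:
#   Esc  open the menu (options, help, quit)
#   q    quit directly
# The process list can be navigated with the keyboard or the mouse.
```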

![btop.png](https://manual.omakub.org/u/btop-sDEnIw.png)

## Fastfetch

[Fastfetch](https://github.com/fastfetch-cli/fastfetch) shows system information, like kernel version, uptime, theme, CPU, memory, and more. It's a successor to the popular neofetch tool.

Omakub has packaged this as the About application, which you can find on the app grid (`Super + A`) or via Ulauncher.
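
It's also a regular command, so you can run it from any terminal:

```bash
fastfetch   # print the system summary in the current terminal
```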

![fastfetch.png](https://manual.omakub.org/u/fastfetch-5vryLS.png)

## Ollama

[Ollama](https://ollama.com/) lets you run large language models, like Meta's Llama 3, locally. Once installed, you can run `ollama run llama3` in a terminal to start a conversation with the Llama 3 model. It also supports Mistral and other models, and it'll automatically install GPU drivers for Nvidia or AMD graphics cards.
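
For example (model names come from the Ollama library, and the first run downloads the model):

```bash
ollama run llama3    # download on first run, then start chatting with Llama 3
ollama run mistral   # same, but with Mistral
ollama list          # see which models are installed locally
```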

![ollama.png](https://manual.omakub.org/u/ollama-EoteQO.png)
