#localai


PC World: Proton’s new encrypted AI chatbot, Lumo, puts your privacy first. “Lumo can do a lot of the things that other AI chatbots can do, like summarize documents, write emails, and generate code. But all user data is stored locally and protected with so-called ‘zero-access’ encryption. This means that only you have the key to your own content.”

https://rbfirehose.com/2025/07/29/pc-world-protons-new-encrypted-ai-chatbot-lumo-puts-your-privacy-first/

Replied to Ivan Todorov

@ivantodorov
The main problem is that most aliases don't work without the right pronunciation, especially with a non-English speaker's pronunciation or when using your "supported" native language.

It is quite something when speech-to-text writes down what it has interpreted you as saying.

My wife laughed at me after 15 minutes of me talking to myself: "hey, what's the weather here" and "dude, turn on the light". Those 15 minutes would have been enough time to buy bread, come home again and turn on the light manually.

At one point I had planned to enable the voice assistant for our toddler, who couldn't reach the switch. But in the end the sentences were not recognized.

So nowadays we have motion detection sensors managing the lights. The system is smarter now, without anyone talking to it.

I found some old recordings from my uni times. I used to record these lectures so that I could write everything down at home, but it required too much time and I couldn’t be arsed.

So I was wondering if #AI could do it for me, locally. That way I could save the transcription and delete the audio files. Would it be able to understand native languages, or just imperial ones?
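In principle, Whisper-style models handle exactly this, locally and in many languages, not just English. A minimal sketch, assuming the openai-whisper package (plus ffmpeg) and a placeholder file name:

```python
# Minimal local-transcription sketch; "lecture01.mp3" is a placeholder.
# Assumes `pip install openai-whisper` and ffmpeg available on the system.
import whisper

model = whisper.load_model("small")          # small multilingual model, CPU-friendly
result = model.transcribe("lecture01.mp3")   # language is auto-detected; language="de" etc. forces it
print(result["text"])                        # plain transcript, ready to save before deleting the audio
```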

I dusted off one of my #OrangePi boards and tried to install #LocalAI (localai.io). It runs and the web UI loads, but I get some errors regarding diffusers and models and whatnot.

Out of curiosity I also installed #EasyDiffusion, but its web UI doesn't load, and the SBC seems to become unresponsive.

I know nothing about Linux, and even less about AI, so I’m not going to waste any more of my time on this.

Let’s see what Jellyfin has for me tonight.

localai.io · LocalAI: The free OpenAI/Anthropic alternative. Your all-in-one complete AI stack: run powerful language models, autonomous agents, and document intelligence locally on your hardware.
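For context, LocalAI exposes an OpenAI-compatible HTTP API (port 8080 by default), so once the server is up any OpenAI client can query it. A minimal sketch, assuming the openai Python package and a placeholder model name that would have to match a model actually installed on the server:

```python
# Sketch of querying a running LocalAI server through its OpenAI-compatible API.
# The base URL and the model name are assumptions/placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")  # LocalAI ignores the key

resp = client.chat.completions.create(
    model="local-model",  # placeholder: must match a model configured in LocalAI
    messages=[{"role": "user", "content": "Summarize this lecture transcript: ..."}],
)
print(resp.choices[0].message.content)
```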
Replied in thread

@LokiTheCat it allows people to share their docs into the federation, then other people add material and it becomes the corpus of the sector - OSINT/competitive intel, indexes, specialized insular industry info - but all these people are working together for the greater good. You can run bigger models, and they can be trained on docs specific to your sector, so you get better answers. #semantic tags #metadata #filtered lists #bloomberg terminal killer #rag pipelines #trends #real time dashboards #distributed inference #localai
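A rough sketch of what a "RAG pipeline" over such a shared corpus could look like: embed the pooled documents, retrieve the closest ones for a question, and hand them to a locally served model. The document texts, model name and endpoint below are placeholders, assuming sentence-transformers plus any OpenAI-compatible local server:

```python
# Hypothetical RAG sketch: retrieve the most relevant shared documents,
# then ask a locally hosted model to answer using only that context.
import numpy as np
from sentence_transformers import SentenceTransformer
from openai import OpenAI

docs = [
    "Sector report A ...",      # placeholder corpus pooled by the group
    "OSINT notes B ...",
    "Industry index C ...",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = embedder.encode(docs, normalize_embeddings=True)

def retrieve(question, k=2):
    """Return the k documents most similar to the question (cosine similarity)."""
    q = embedder.encode([question], normalize_embeddings=True)[0]
    order = np.argsort(doc_vecs @ q)[::-1]
    return [docs[i] for i in order[:k]]

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")
question = "What are the current trends in this sector?"
context = "\n\n".join(retrieve(question))
resp = client.chat.completions.create(
    model="local-model",        # placeholder name
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(resp.choices[0].message.content)
```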

Telegram AI Companion: a fun project built with Rust, Telegram and local AI

Hi, Habr! 👋 I recently put together a small but lively pet project: Telegram AI Companion. It's a Telegram bot that can chat with you using a local language model served via LocalAI. No OpenAI, no clouds - everything runs on your own hardware. The goal of the project is not a revolution in AI, but an educational and fun dive into Rust, asynchronous programming, the Telegram API, and local LLMs. A kind of "companion bot", though more for the developer than for the user :) If you're interested:

habr.com/ru/articles/920482/

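The project itself is written in Rust; purely as a language-neutral illustration of the flow it describes (long-poll Telegram, forward each message to a locally served model, send the answer back), here is a hedged Python sketch. The bot token, endpoint and model name are placeholders, not the project's actual code:

```python
# Illustrative sketch only (the real project is in Rust): poll Telegram for
# messages, answer them with a locally served model, and reply in the chat.
import requests
from openai import OpenAI

BOT_TOKEN = "123456:ABC..."                       # placeholder Telegram bot token
TG_API = f"https://api.telegram.org/bot{BOT_TOKEN}"
llm = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

offset = 0
while True:
    updates = requests.get(f"{TG_API}/getUpdates",
                           params={"offset": offset, "timeout": 30}).json()
    for upd in updates.get("result", []):
        offset = upd["update_id"] + 1
        msg = upd.get("message")
        if not msg or "text" not in msg:
            continue
        answer = llm.chat.completions.create(
            model="local-model",                  # placeholder: whatever the server hosts
            messages=[{"role": "user", "content": msg["text"]}],
        ).choices[0].message.content
        requests.post(f"{TG_API}/sendMessage",
                      json={"chat_id": msg["chat"]["id"], "text": answer})
```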

Since AI is barging into our lives whether we like it or not, I'm starting a personal project to set up an "AI" that runs on my own machine and consumes as few resources as possible (relative to mainstream AIs). I already have some interesting results for managing a to-do list and other simple tasks.

I'm sure other projects like this exist, but they're hard to find. If you have any links, I'd be glad to have them 🙂
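One way such a low-resource setup can work, sketched with placeholder names only: have the local model turn a free-form instruction into a tiny JSON action and let plain code keep the actual to-do list. Endpoint and model name are assumptions:

```python
# Hedged sketch of a minimal local "to-do assistant": the model only maps text
# to a small JSON action; ordinary Python applies it to the list.
import json
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")  # placeholder endpoint
todos = []

def handle(instruction):
    resp = client.chat.completions.create(
        model="local-model",  # placeholder: any small instruct model
        messages=[
            {"role": "system",
             "content": 'Reply with JSON only: {"action": "add"|"remove"|"list", "item": "..."}'},
            {"role": "user", "content": instruction},
        ],
    )
    action = json.loads(resp.choices[0].message.content)
    if action["action"] == "add":
        todos.append(action["item"])
    elif action["action"] == "remove" and action.get("item") in todos:
        todos.remove(action["item"])
    return todos

print(handle("Add buying bread to my list"))
```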

Replied in thread

@Catvalente

Or just run your AI locally 🦾 💻 🧠

I completely understand the concerns about relying too heavily on AI, especially cloud-based, centralized models like ChatGPT. The issues of privacy, energy consumption, and the potential for misuse are very real and valid. However, I believe there's a middle ground that allows us to benefit from the advantages of AI without compromising our values or autonomy.

Instead of rejecting AI outright, we can opt for open-source models that run on local hardware. I've been running large language models (LLMs) locally on my own hardware. This approach offers several benefits:

- Privacy - By running models locally, we can ensure that our data stays within our control and isn't sent to third-party servers.

- Transparency - Open-source models allow us to understand how the AI works, making it easier to identify and correct biases or errors.

- Customization - Local models can be tailored to our specific needs, whether it's for accessibility, learning, or creative projects.

- Energy Efficiency - Local processing can be more energy-efficient than relying on large, centralized data centers.

- Empowerment - Using AI as a tool to augment our own abilities, rather than replacing them, can help us learn and grow. It's about leveraging technology to enhance our human potential, not diminish it.

For example, I use local LLMs for tasks like proofreading, transcribing audio, and even generating image descriptions. Instead of ChatGPT and Grok, I utilize Jan.ai with Mistral, Llama, OpenCoder, Qwen3, R1, WhisperAI, and Piper. These tools help me be more productive and creative, but they don't replace my own thinking or decision-making.

It's also crucial to advocate for policies and practices that ensure AI is used ethically and responsibly. This includes pushing back against government overreach and corporate misuse, as well as supporting initiatives that promote open-source and accessible technologies.

In conclusion, while it's important to be critical of AI and its potential downsides, I believe that a balanced, thoughtful approach can allow us to harness its benefits without sacrificing our values. Let's choose to be informed, engaged, and proactive in shaping the future of AI.

CC: @Catvalente @audubonballroon
@calsnoboarder @craigduncan

TechCrunch: Google quietly released an app that lets you download and run AI models locally. “Called Google AI Edge Gallery, the app is available for Android and will soon come to iOS. It allows users to find, download, and run compatible models that generate images, answer questions, write and edit code, and more. The models run offline, without needing an internet connection, tapping into […]

https://rbfirehose.com/2025/06/01/techcrunch-google-quietly-released-an-app-that-lets-you-download-and-run-ai-models-locally/


Pushed myself to try running #AI on a potato laptop…
2 tokens/s max, even running only a small model. Hahaha… 😅
Oh well, it was still good experience trying out #localAI..