

#localai

gary
@jonathanhogg nice - I am sort of the last to know, but fedi services have a bright future #peertube #localai
ℒӱḏɩę :blahaj:
Well, AI can't reliably detect AI. Hit or miss. #AI #localAI

*I run all of this local, solar powered.
gary
@LokiTheCat it allows people to share their docs into the federation, and then other people add stuff and it becomes the corpus of the sector - osint/comp intel, indexes, specialized insular industry info - but all these people are working together for the greater good. You can run bigger models, and they can be trained on docs specific to your sector, so you get better answers. #semantic tags #metadata #filtered lists #bloomberg terminal killer #rag pipelines #trends #real time dashboards #distributed inference #localai
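The post above gestures at a shared corpus feeding retrieval-augmented generation (RAG). As a rough, standard-library-only illustration of the retrieval half of such a pipeline, here is a minimal Python sketch; the documents, query, and prompt template are invented stand-ins, and the final call to a locally hosted model is left as a comment rather than shown.

```python
# Minimal RAG-style retrieval sketch using only the standard library.
# The corpus and query are illustrative, not anyone's real data.
import math
from collections import Counter

documents = [
    "Quarterly OSINT digest: shipping volumes fell 4% in the Baltic corridor.",
    "Competitive intel: vendor X announced a local inference appliance.",
    "Sector index update: energy storage deployments doubled year over year.",
]

def vectorize(text: str) -> Counter:
    """Bag-of-words term counts; a real pipeline would use embeddings."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

query = "who is building local inference hardware?"
qvec = vectorize(query)

# Rank documents by similarity to the query and keep the best match as context.
ranked = sorted(documents, key=lambda d: cosine(qvec, vectorize(d)), reverse=True)
context = ranked[0]

prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)
# This prompt would then be sent to a locally hosted model
# (for example via a LocalAI or ollama HTTP endpoint) for the actual answer.
```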
Habr
Telegram AI Companion: a fun project with Rust, Telegram, and local AI

Hi, Habr! 👋 I recently put together a small but lively pet project, Telegram AI Companion. It is a Telegram bot that can chat with you using a local language model through LocalAI. No OpenAI, no cloud: everything runs on your own hardware. The goal of the project is not an AI revolution, but an educational and entertaining dive into Rust, asynchronous programming, the Telegram API, and local LLM models. A sort of "companion bot", though more for the developer than for the user :) If you are interested:

https://habr.com/ru/articles/920482/

#rust #telegram_bot #localai #llm #docker #actix #openai #ai #ngrok #natural_language_processing
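The project described above is written in Rust, but the core of "chatting through LocalAI" is an HTTP call to LocalAI's OpenAI-compatible chat endpoint. Here is a hedged Python sketch of what such a call typically looks like; the port, model name, and system prompt are assumptions that depend on how the local instance is configured, not details from the project itself.

```python
# Hedged sketch: one chat turn against a locally running LocalAI instance
# via its OpenAI-compatible /v1/chat/completions endpoint.
# The base URL, port, and model name are assumptions; adjust to your setup.
import requests

BASE_URL = "http://localhost:8080/v1"   # LocalAI commonly listens on port 8080
MODEL = "mistral-7b-instruct"           # hypothetical model name configured in LocalAI

def chat(user_message: str) -> str:
    """Send one chat turn to the local model and return the reply text."""
    response = requests.post(
        f"{BASE_URL}/chat/completions",
        json={
            "model": MODEL,
            "messages": [
                {"role": "system", "content": "You are a friendly companion bot."},
                {"role": "user", "content": user_message},
            ],
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Hello! What can you do offline?"))
```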
5ika
Since AI is arriving in our lives whether we like it or not, I am starting a personal project: setting up an "AI" that runs on my own machine and uses as few resources as possible (relative to mainstream AI). I already have some interesting results for managing a to-do list and other simple tasks.

I am sure other such projects exist, but they are hard to find. If you have links, I am interested 🙂

#LocalLLM #LocalAI #SelfHosting #Privacy
Stéphane Klein
I discovered #LocalAI

https://notes.sklein.xyz/2025-06-12_1702/

For now, I am not quite sure how to position it relative to Open WebUI, ollama, vLLM, etc.

#TIL #OpenSource #LLM
Debby
@Catvalente

Or just use your AI locally 🦾 💻 🧠

I completely understand the concerns about relying too heavily on AI, especially cloud-based, centralized models like ChatGPT. The issues of privacy, energy consumption, and the potential for misuse are very real and valid. However, I believe there's a middle ground that allows us to benefit from the advantages of AI without compromising our values or autonomy.

Instead of rejecting AI outright, we can opt for open-source models that run on local hardware. I've been using local language models (LLMs) on my own hardware. This approach offers several benefits:

- Privacy: by running models locally, we can ensure that our data stays within our control and isn't sent to third-party servers.
- Transparency: open-source models allow us to understand how the AI works, making it easier to identify and correct biases or errors.
- Customization: local models can be tailored to our specific needs, whether it's for accessibility, learning, or creative projects.
- Energy efficiency: local processing can be more energy-efficient than relying on large, centralized data centers.
- Empowerment: using AI as a tool to augment our own abilities, rather than replacing them, can help us learn and grow. It's about leveraging technology to enhance our human potential, not diminish it.

For example, I use local LLMs for tasks like proofreading, transcribing audio, and even generating image descriptions. Instead of ChatGPT and Grok, I utilize Jan.ai with Mistral, Llama, OpenCoder, Qwen3, R1, WhisperAI, and Piper. These tools help me be more productive and creative, but they don't replace my own thinking or decision-making.

It's also crucial to advocate for policies and practices that ensure AI is used ethically and responsibly. This includes pushing back against government overreach and corporate misuse, as well as supporting initiatives that promote open-source and accessible technologies.

In conclusion, while it's important to be critical of AI and its potential downsides, I believe that a balanced, thoughtful approach can allow us to harness its benefits without sacrificing our values. Let's choose to be informed, engaged, and proactive in shaping the future of AI.

CC: @Catvalente @audubonballroon @calsnoboarder @craigduncan

#ArtificialIntelligence #OpenSource #LocalModels #PrivacyLLM #Customization #LocalAI #Empowerment #DigitalLiteracy #CriticalThinking #EthicalAI #ResponsibleAI #Accessibility #Inclusion #Education
Muhammeddd
Google has announced Gemma 3n, an AI model that runs locally on 3-4 GB of RAM and can also be used locally on phones. A very nice development for data privacy. And AI makers are finally starting to focus on optimizing language models. :android_logo:
#AI #yz #yapayzeka #gemma3n #Gemma #localai #mahremiyet #veri #Google #ArtificialIntelligence #telefon #Android
Emile Dingemans 🟥🟥🟥🟥🟥🟥
Running AI locally seems to me to be the future. It gives ownership back to the user and frees us from companies that want to capture ever more of our personal data. Google is now launching a local AI version. https://techcrunch.com/2025/05/31/google-quietly-released-an-app-that-lets-you-download-and-run-ai-models-locally/ #localAI
ResearchBuzz: Firehose
TechCrunch: Google quietly released an app that lets you download and run AI models locally. "Called Google AI Edge Gallery, the app is available for Android and will soon come to iOS. It allows users to find, download, and run compatible models that generate images, answer questions, write and edit code, and more. The models run offline, without needing an internet connection, tapping into […]"

https://rbfirehose.com/2025/06/01/techcrunch-google-quietly-released-an-app-that-lets-you-download-and-run-ai-models-locally/
s4if 🇵🇸 🇮🇩
Forced myself to try running #AI on a potato laptop…
2 tokens/s at most, even when only running a small model. Hahaha… 😅
Oh well, at least I got some experience trying #localAI..
Debby
@system76
I love #LLM, or as they're often called, #AI, especially when used locally. Local models are incredibly effective for enhancing daily tasks like proofreading, checking emails for spelling and grammatical errors, quickly creating image descriptions, transcribing audio to text, or even finding that one quote buried in tons of files that answers a recurring question.

However, if I wanted to be fully transparent to #bigtech, I would use Windows and Android with all the "big brotherly goodness" baked into them. That's why I hope these tools don't connect to third-party servers.

So, my question to you is: Do you propose a privacy-oriented and locally/self-hosted first LLM?

I'm not opposed to the general notion of using AI, and if done locally and open-source, I really think it could enhance the desktop experience. Even the terminal could use some AI integration, especially for spell-checking and syntax-checking those convoluted and long commands. I would love a self-hosted integration of some AI features. 🌟💻
#OpenSource #Privacy #AI #LocalModels #SelfHosted #LinuxAI #LocalLLM #LocalAI

ResearchBuzz: Firehose
MakeUseOf: Anyone Can Enjoy the Benefits of a Local LLM With These 5 Apps. "Cloud-based AI chatbots like ChatGPT and Gemini are convenient, but they come with trade-offs. Running a local LLM—the tech behind the AI chatbot—puts you in control, offering offline access and stronger data privacy. And while it might sound technical, the right apps make it easy for anyone to get started."

https://rbfirehose.com/2025/05/19/makeuseof-anyone-can-enjoy-the-benefits-of-a-local-llm-with-these-5-apps/

#ai #apps #howto

🧩➡️📊 Turn chaotic, unstructured data into organized insights! (Un)Perplexed Spready understands context and meaning in text fields, customer feedback, and product descriptions. Let AI extract, categorize, and analyze information that traditional formulas can't handle. Structure emerges from chaos! ✨
matasoft.hr/qtrendcontrol/inde
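The post above describes AI formulas that categorize free-text cells such as customer feedback. As a purely hypothetical illustration of that kind of per-row categorization (not the product's actual API), here is a small Python sketch; the rows and category keywords are invented, and the categorize() function is a trivial keyword stand-in for the local-model call shown in the LocalAI sketch earlier.

```python
# Hypothetical sketch of an "AI formula" applied per spreadsheet row:
# each free-text cell is mapped to a category. categorize() is a keyword
# stand-in for a model call; rows and categories are invented examples.
ROWS = [
    "The checkout page kept timing out on my phone.",
    "Love the new dark mode, great work!",
    "Shipping took three weeks and nobody answered my emails.",
]

CATEGORIES = {
    "bug": ("timing out", "crash", "error"),
    "praise": ("love", "great", "thanks"),
    "logistics": ("shipping", "delivery", "refund"),
}

def categorize(text: str) -> str:
    """Stand-in for a model call: pick the first category whose keywords match."""
    lowered = text.lower()
    for label, keywords in CATEGORIES.items():
        if any(keyword in lowered for keyword in keywords):
            return label
    return "other"

# Turn the unstructured rows into structured records, one category per row.
structured = [{"feedback": row, "category": categorize(row)} for row in ROWS]
for item in structured:
    print(f"{item['category']:10} | {item['feedback']}")
```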

⚡ Tired of manual data sorting and analysis? (Un)Perplexed Spready automates tedious data tasks through intelligent AI formulas. Extract patterns, categorize information, and generate insights at lightning speed. Reclaim hours of your workday while getting better results! 🕒
matasoft.hr/qtrendcontrol/inde