Posts

Using OpenVoice as an all-local, voice-operated agent.

General Impressions

OpenVoice OS is an interesting project that runs on relatively lightweight hardware and lets you keep voice processing local. The most obvious use cases are home automation or a local voice bot. AI has everyone's attention, and most voice-agent work now centers on AI integration for smarter responses. OpenVoice feels small as it currently exists; on its own it is mostly a curiosity. OpenVoice LLM integration requires real configuration know-how. IMO the best thing would be a zero-config Ollama or local-LLM integration shipped as part of the default extended skill set. The Raspberry Pi was great but it felt constrained. The most cost-effective and full-featured hardware setup for completely local voice recognition and processing is probably a Mac desktop, which would be capable of running OpenVoice and local MLX-based LLMs.

Revision History: Created 2024 04

Selecting the OpenVoice OS installation path for your Raspberry Pi-based voice agent.

I had a pile of Raspberry Pi 3B and Raspberry Pi 400 hardware, so I used that. If I had been building a magic-mirror-type unit, then I would have wanted a GUI; in my case, I wanted a headless agent. The Raspberry Pi 3 was frustratingly slow. The Pi 400 was pretty usable with and without the GUI. I ended up running the Pi headless to create more headroom for any processing on the unit. The RaspOVOS image was dead simple to write to an SD card on my Windows machine using the Raspberry Pi Imager program.

Related Links
https://community.openconversational.ai/latest
https://github.com/OpenVoiceOS/ovos-releases
https://github.com/OpenVoiceOS/raspOVOS

Revision History: Created 2024 04

I tried to plant something in a garden....

I tried to plant something in an enterprise garden. It turned out that it was actually a graveyard of bad decisions. This quote is subject to revision based on future input.

Revision History: Created 2025/04

Always-free MongoDB-compatible Azure Cosmos DB instances

The Azure MongoDB version of Cosmos DB provides a sharable, scalable way of working with MongoDB across sites, different teams, or different machines. I have a couple of development machines and am always looking for ways to work with the same dataset independent of my current physical location and of the developer-machine platform I'm using.

Always Free Cosmos DB - MongoDB

Microsoft Azure has a Cosmos DB for MongoDB free-tier offering. MongoDB (vCore) has its own SKU (at this time), which means you may be able to use both the free-tier MongoDB (vCore) cluster and the free-tier standard Cosmos DB. You get a dedicated MongoDB cluster with 32 GB of storage. The docs don't mention any compute or RU/s limit. From the docs:

Azure Cosmos DB for MongoDB (vCore) now introduces a new SKU, the "Free Tier," enabling users to explore the platform without any financial commitments. The free tier lasts for the lifetime of your account, boasting command and feature parity with a regular Azure Cosmos DB for MongoDB...
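Since a vCore cluster speaks the standard MongoDB wire protocol, the regular drivers work once you have a connection string. As a sketch, the helper below assembles one; the cluster name, user, and password are made-up placeholders, and the exact query parameters Azure recommends for your cluster may differ from these.

```python
from urllib.parse import quote_plus

def vcore_connection_string(user: str, password: str, cluster: str) -> str:
    """Build a Cosmos DB for MongoDB (vCore) connection string.

    vCore clusters use the standard mongodb+srv scheme with TLS required;
    credentials are URL-encoded so special characters survive.
    """
    return (
        f"mongodb+srv://{quote_plus(user)}:{quote_plus(password)}"
        f"@{cluster}.mongocluster.cosmos.azure.com/"
        "?tls=true&authMechanism=SCRAM-SHA-256&retrywrites=false"
    )

if __name__ == "__main__":
    # Placeholder credentials -- substitute your own cluster values.
    uri = vcore_connection_string("appuser", "s3cret!", "my-free-cluster")
    print(uri)
    # With pymongo installed you would then connect:
    #   from pymongo import MongoClient
    #   client = MongoClient(uri)
    #   client.mydb.mycol.insert_one({"hello": "world"})
```

Because the platform-neutral driver does the talking, the same string works from any of my development machines.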

Always-free Apache Gremlin graph-compatible Azure Cosmos DB instances

Cosmos DB has an Apache TinkerPop/Gremlin-compatible persona that lets you run Gremlin graph queries against a Cosmos DB data store. This can be useful as a shared database in a personal project or startup situation. Cosmos DB Gremlin is targeted at teams migrating into Azure and off of some other COTS product. Azure has an always-free-tier Cosmos DB Gremlin instance offering reasonable limits and some compatibility constraints. Usage limits for the free tier are mentioned below.

Caveat Emptor

Microsoft is dropping Graph support in their Azure Databases VSCode extension. Microsoft's Cosmos DB Gremlin support document states that Cosmos DB is compatible with Gremlin 3.4 version drivers; later versions of Gremlin drivers don't work with Cosmos DB. At the time of this article, Azure Cosmos DB supports the GraphSON (JSON) wire format. It does not support the Gremlin bytecode format, which means it doesn't work with some driver optimizations and libraries.

What is Gremlin

Look at Kelv...
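To illustrate the driver constraints above, here is a sketch of configuring the Python gremlinpython client (which must be a 3.4.x release) for a Cosmos DB graph. The account, database, graph, and key values are placeholders; the `/dbs/<db>/colls/<graph>` username format and the wss endpoint follow Microsoft's documented pattern, but verify both against your own account.

```python
def cosmos_gremlin_config(account: str, db: str, graph: str, key: str) -> dict:
    """Assemble Client() arguments for a Cosmos DB Gremlin endpoint.

    Cosmos DB speaks GraphSON over websockets rather than Gremlin bytecode,
    so a 3.4.x gremlinpython driver with a GraphSON serializer is required.
    """
    return {
        "url": f"wss://{account}.gremlin.cosmos.azure.com:443/",
        "traversal_source": "g",
        "username": f"/dbs/{db}/colls/{graph}",
        "password": key,
    }

if __name__ == "__main__":
    # Placeholder account and key -- substitute real values.
    cfg = cosmos_gremlin_config("myaccount", "mydb", "mygraph", "base64key==")
    print(cfg["url"])
    # With `pip install gremlinpython==3.4.13` you would then run:
    #   from gremlin_python.driver import client, serializer
    #   c = client.Client(
    #       message_serializer=serializer.GraphSONSerializersV2d0(), **cfg)
    #   print(c.submit("g.V().count()").all().result())
```

Note that this uses the string-based `Client.submit` API; the bytecode-based `g.V()` traversal style common in newer TinkerPop tutorials is exactly what Cosmos DB does not support.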

Micro-benchmarking Apple M1 Max - MLX vs GGUF - LLM Qwen 2.5

MLX is an ML framework targeted at Apple Silicon. It provides noticeable ML performance gains when compared to the standard (GGUF) techniques running on Apple Silicon. The MLX project describes itself as:

MLX is an array framework for machine learning on Apple silicon, brought to you by Apple machine learning research. A notable difference from MLX and other frameworks is the unified memory model. Arrays in MLX live in shared memory. Operations on MLX arrays can be performed on any of the supported device types without transferring data.

LM Studio added support for Apple Silicon MLX models in 2024. I totally ignored it until I saw a 2025/02 Reddit post in the /r/ocallama subreddit. I wanted to execute their microbenchmark on my Mac to get a feel for the possible performance difference. The performance improvement is exciting. I am waiting to really jump into MLX until Ollama supports MLX, something they are working on as of 2025/0...
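The Reddit microbenchmark itself isn't reproduced here, but the shape of a tokens-per-second measurement is simple enough to sketch. The harness below times any `generate(n)` callable, so an MLX or GGUF model wrapper could slot in; the `fake_generate` stand-in is purely illustrative and is my own construction, not part of the original benchmark.

```python
import time

def tokens_per_second(generate, n_tokens: int, warmup: int = 1, runs: int = 3) -> float:
    """Time a token-generation callable and report tokens/second.

    `generate(n)` is any function that produces n tokens, e.g. a thin
    wrapper around an MLX or llama.cpp model's generate call.
    """
    for _ in range(warmup):          # warm caches / trigger lazy compilation
        generate(n_tokens)
    best = float("inf")
    for _ in range(runs):            # keep the fastest (least noisy) run
        start = time.perf_counter()
        generate(n_tokens)
        best = min(best, time.perf_counter() - start)
    return n_tokens / best

if __name__ == "__main__":
    # Stand-in "model" that just sleeps per token, for demonstration only.
    fake_generate = lambda n: time.sleep(n * 0.001)
    print(f"{tokens_per_second(fake_generate, 200):.0f} tok/s")
```

Taking the fastest of several runs (rather than the mean) is a common microbenchmark convention, since background load can only slow a run down, never speed it up.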

Sample claude-code prompts from their npm package

Anthropic released an npm package for claude-code that acts as an LLM coding assistant. Anthropic documents it on their website, including sample LLM interactions. This blog article exists to make the prompts easier to read; it provides no analysis or value judgement. The claude-code TypeScript source is available on the npmjs site. That source provides some interesting examples of good prompt engineering, showing how you can create a detailed system prompt to achieve better results. I've extracted some of the prompts here and wrapped some of the text to make them easier to read. I have no idea if the extra newlines impact the results, so copiers be aware.

Claude-code on npmjs

https://www.npmjs.com/package/@anthropic-ai/claude-code?activeTab=code

The npmjs links to the GitHub repository with the source are broken at this time (2025 02).

cli.mjs

I found the following system prompts while scanning cli.mjs.

systemPrompt : [ `Your ta...