AI Concerns From the Trenches
- Chris Cassar
- Jul 7
- 2 min read

Artificial Intelligence (AI) has been at the forefront of everyone's minds since 2022, but I've been fascinated by it for much longer – all the way back to my SUNY Buffalo college days in the late 1980s. Back then, I was learning about and programming "AI-like" code that could change itself after execution. It was both mesmerizing and unsettling.
Since then, AI has come a long way. IBM began developing Watson in 2007, and its famous Jeopardy! victory in 2011 marked a milestone in machine intelligence. The following years saw significant advancements: deep neural networks achieved breakthrough results in the early 2010s, the transformer architecture debuted in 2017, and transformer-based models like BERT and GPT followed in 2018. Fast-forward to today: large language models (LLMs), stable diffusion, and multimodal capabilities are making AI an integral part of everyday life.
I won't delve into the ethics surrounding AI, the prospect of Universal Basic Income (UBI) societies1, or the possibility of an AI singularity2. Instead, I will focus on a recurring question I hear from customers as a Managed Service Provider (MSP) and technologist. The most pressing concern? How can I keep my information private?
As someone who follows the latest advances in the IT space, I am bombarded daily with announcements about new AI developments. Yet my customers' privacy concern persists despite the rapid pace of innovation, and the vendors themselves confirm it is warranted. Google’s Gemini user guidelines recommend: “For users of the free, consumer version of Gemini, you should not upload proprietary or confidential information, as it may be used to improve the service and be reviewed by humans.” Microsoft’s Copilot advises: “When you upload files to Copilot, they are used as knowledge sources for the AI to generate responses.” This rule of thumb applies to most publicly available LLMs. So, how can you prevent this? What guardrails or systems can you put in place to use AI with your own personal or corporate knowledge base?
Supplying your own articles and data for an AI to draw on is known as retrieval augmented generation (RAG), and to keep that data private, you need to run these AI tools locally, on your own PC or server. This sounds expensive and complicated, but it doesn’t have to be: I did it within my own organization by upgrading systems I already had, for under $500. There are some limitations to running these services locally, but I believe the benefits far outweigh the small hurdles and investment required. If you are already willing to pay for AI subscriptions like Google AI Pro, ChatGPT Plus, or Claude Max, putting those funds into your own personal AI system might be the way to go.
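To make the RAG idea concrete, here is a toy sketch of its "retrieval" step. This is an illustration, not my production setup: a real local deployment would pair an embedding model and a vector store with a locally hosted LLM, while this example approximates relevance with simple bag-of-words cosine similarity so it runs with only the Python standard library. The knowledge-base entries are hypothetical.

```python
# Toy sketch of the retrieval step in retrieval augmented generation (RAG).
# Relevance here is bag-of-words cosine similarity; a real local RAG stack
# would use an embedding model and a vector database instead.
import math
import re
from collections import Counter

def vectorize(text: str) -> Counter:
    """Lowercase, tokenize, and count terms in a piece of text."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query: str, documents: list[str], top_k: int = 1) -> list[str]:
    """Return the top_k documents most similar to the query."""
    qv = vectorize(query)
    ranked = sorted(documents, key=lambda d: cosine(qv, vectorize(d)),
                    reverse=True)
    return ranked[:top_k]

# Hypothetical private knowledge base that never leaves the local machine.
kb = [
    "Our VPN portal is at vpn.example.com; use your domain credentials.",
    "Payroll questions go to the HR shared mailbox.",
    "Backups run nightly at 2 AM to the on-premises NAS.",
]

# The retrieved context is then prepended to the prompt sent to a locally
# hosted LLM, so no private data is ever uploaded to a public service.
context = retrieve("When do the backups run?", kb)
print(context[0])  # → "Backups run nightly at 2 AM to the on-premises NAS."
```

The key point is architectural: because both the retrieval index and the model answering the question live on your own hardware, the privacy guidance quoted above simply stops applying, since nothing is uploaded in the first place.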
1. International Monetary Fund, IMF Blog “AI Will Transform the Global Economy” (https://www.imf.org/en/Blogs/Articles/2024/01/14/ai-will-transform-the-global-economy-lets-make-sure-it-benefits-humanity)
2. The AI singularity was first proposed by mathematician and computer scientist Vernor Vinge in his 1993 essay "The Coming Technological Singularity: How to Survive in the Post-Human Era." (https://edoras.sdsu.edu/~vinge/misc/singularity.html)